Sample records for output files generated

  1. Performance regression manager for large scale systems

    DOEpatents

    Faraj, Daniel A.

    2017-10-17

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
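
    The comparison step can be pictured with a short sketch. The Python below is illustrative only: the patent does not disclose the predefined format, so a simple "name: value" line layout, a relative-change threshold, and the convention that larger values are worse (e.g. runtimes) are all assumptions.

        # Hedged sketch: compare one run's metrics against a baseline run.
        def read_metrics(path):
            """Parse assumed 'name: value' lines into a dict of floats."""
            metrics = {}
            with open(path) as fh:
                for line in fh:
                    name, _, value = line.partition(":")
                    if value.strip():
                        metrics[name.strip()] = float(value)
            return metrics

        def compare_runs(first_file, second_file, tolerance=0.05):
            """Print an indication of how each shared metric changed."""
            first, second = read_metrics(first_file), read_metrics(second_file)
            for name in sorted(first.keys() & second.keys()):
                base = second[name]
                change = (first[name] - base) / base if base else float("inf")
                flag = "REGRESSION" if change > tolerance else "ok"  # larger = worse, by assumption
                print(f"{name}: {base:.3f} -> {first[name]:.3f} ({change:+.1%}) {flag}")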

  2. Performance regression manager for large scale systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faraj, Daniel A.

    Methods comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.

  3. Performance regression manager for large scale systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faraj, Daniel A.

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.

  4. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  5. Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine

    DOE Data Explorer

    Liu, Xiaobing

    2016-09-21

    This calculation engine estimates the technoeconomic feasibility of transported geothermal energy projects. The TGE screening tool (geotool.exe) reads input from an input file (input.txt) and writes results to an output file (output.txt). Both the input and output files reside in the same folder as geotool.exe. To use the tool, prepare an input file containing adequate information about the case in the format explained below and place it in the same folder as geotool.exe. Executing geotool.exe then generates an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
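
    The described workflow is simple enough to automate. A minimal Python driver might look like the following; the install folder is assumed, and the input.txt contents are a placeholder for the documented format.

        # Hypothetical driver for the input.txt -> geotool.exe -> output.txt flow.
        import subprocess
        from pathlib import Path

        tool_dir = Path("C:/TGE")                      # assumed install folder
        (tool_dir / "input.txt").write_text("...case parameters...\n")  # placeholder contents
        subprocess.run([str(tool_dir / "geotool.exe")], cwd=tool_dir, check=True)
        print((tool_dir / "output.txt").read_text())   # key calculation results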

  6. Wrapping Python around MODFLOW/MT3DMS based groundwater models

    NASA Astrophysics Data System (ADS)

    Post, V.

    2008-12-01

    Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data, such as heads, fluxes and concentrations. Typically, all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore, a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
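
    A sketch of the kind of plot the abstract describes, contours with superimposed flow vectors via matplotlib, is shown below. Synthetic arrays stand in for values read from MODFLOW/MT3DMS binary output; the library's own reading routines are not reproduced here.

        # Concentration contours with flow vectors; data are synthetic stand-ins.
        import numpy as np
        import matplotlib.pyplot as plt

        x, y = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
        conc = np.exp(-((x - 40) ** 2 + (y - 60) ** 2) / 500.0)  # stand-in concentration field
        gy, gx = np.gradient(-conc)                              # stand-in down-gradient flow vectors

        fig, ax = plt.subplots()
        cs = ax.contourf(x, y, conc, levels=10, cmap="viridis")
        ax.quiver(x[::4, ::4], y[::4, ::4], gx[::4, ::4], gy[::4, ::4], color="white")
        fig.colorbar(cs, ax=ax, label="concentration")
        plt.show()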

  7. Integrated Geothermal-CO2 Storage Reservoirs: FY1 Final Report

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  8. Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.

    1987-01-01

    This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
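
    GetData's most fundamental capability, extracting selected signals over a time segment into a new file, reads naturally as a few lines of Python. This is an analogue only, assuming the time history is available as CSV with a 'time' column; the real GetData formats are hidden behind its read/write subroutines.

        # Minimal analogue of GetData's extract operation on a CSV time history.
        import pandas as pd

        def extract(infile, outfile, signals, t_start, t_end):
            """Copy the selected signals over a time segment to a new file."""
            data = pd.read_csv(infile)
            window = data[(data["time"] >= t_start) & (data["time"] <= t_end)]
            window[["time"] + signals].to_csv(outfile, index=False)

        extract("flight.csv", "subset.csv", ["altitude", "airspeed"], 10.0, 42.0)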

  9. External-Compression Supersonic Inlet Design Code

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2011-01-01

    A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets include axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from a explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities includes inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.

  10. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
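
    The fixed layout of 102 numbers per record is what makes the file interchangeable between programs. A hedged sketch of reading such a file, assuming 64-bit floats (the original is a FORTRAN-era random access file, so the word size is an assumption):

        # Read a file laid out as fixed records of 102 numbers each.
        import numpy as np

        records = np.fromfile("trajectory.dat", dtype=np.float64).reshape(-1, 102)
        time, altitude = records[:, 0], records[:, 1]  # hypothetical column meanings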

  11. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  12. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2000-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  13. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof, and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI for automated creation of Matlab scripts suitable for analyzing such data with Fourier and wavelet transforms as well as user-defined operations.
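
    The automation idea behind MAGE, expanding a parameter sweep into one configuration file per combination, can be sketched in Python. The template text here is schematic, not a complete valid Mif, and the sweep parameters are invented.

        # Expand a parameter sweep into one (schematic) Mif file per combination.
        from itertools import product
        from string import Template

        template = Template("# MIF 2.1\n# field $field mT, thickness $thick nm\n...\n")

        for field, thick in product([10, 20, 30], [2.0, 3.0]):
            with open(f"sim_field{field}_thick{thick}.mif", "w") as fh:
                fh.write(template.substitute(field=field, thick=thick))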

  14. Framework for Integrating Science Data Processing Algorithms Into Process Control Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.

    2011-01-01

    A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This innovative framework is really the unifying bridge between the execution of a step in the overall processing pipeline, and the available PCS component services as well as the information that they collectively manage.
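
    The lifecycle the abstract describes, stage inputs, execute the PGE, then hand products to file management, can be summarized in a schematic class. Names and methods here are illustrative, not the actual PCS API.

        # Schematic of the wrapper lifecycle; all collaborator APIs are assumed.
        class TaskWrapper:
            """Stage inputs, run the PGE, then ingest its products."""

            def __init__(self, pge, file_manager, crawler):
                self.pge = pge                    # the science algorithm being wrapped
                self.file_manager = file_manager  # delivers files and metadata
                self.crawler = crawler            # ingests products after execution

            def run(self, config):
                inputs = self.file_manager.stage(config["input_files"])
                products = self.pge.execute(inputs, config)
                self.crawler.ingest(products)     # register outputs for downstream processing
                return products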

  15. Effect of Spatial Locality Prefetching on Structural Locality

    DTIC Science & Technology

    1991-12-01

    Pollution module calculates the SLC and CAM cache pollution percentages. And finally, the Generate Reference Frequency List module produces the output...3.2.5 Generate Reference Frequency List 3.2.6 Each program module in the structure chart is mapped into an Ada package. By performing this encapsulation...call routine to generate reference -- frequency list -- end if -- end loop -- close input, output, and reference files end Cache Simulator Figure 3.5

  16. QX MAN: Q and X file manipulation

    NASA Technical Reports Server (NTRS)

    Krein, Mark A.

    1992-01-01

    QX MAN is a grid and solution file manipulation program written primarily for the PARC code and the GRIDGEN family of grid generation codes. QX MAN combines many of the features frequently encountered in grid generation, grid refinement, the setting-up of initial conditions, and post processing. QX MAN allows the user to manipulate single block and multi-block grids (and their accompanying solution files) by splitting, concatenating, rotating, translating, re-scaling, and stripping or adding points. In addition, QX MAN can be used to generate an initial solution file for the PARC code. The code was written to provide several formats for input and output in order for it to be useful in a broad spectrum of applications.

  17. ANLPS. Graphics Driver for PostScript Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.

    1987-09-01

    ANLPS is a PostScript graphics device driver for use with the proprietary CA TELLAGRAF, CUECHART, and DISSPLA products. The driver allows the user to create and send text and graphics output in the Adobe Systems` PostScript page description language, which is accepted by many print devices. The PostScript output can be generated by TELLAGRAF 6.0 and DISSPLA 10.0. The files containing the PostScript output are sent to PostScript laser printers, such as the Apple LaserWriter. It is not necessary to initialize the printer, as the output for each plot is self-contained. All CA fonts are mapped to PostScript fonts, e.g.more » Swiss-Medium is mapped to Helvetica, and the mapping is easily changed. Hardware shading and hardware characters, area fill, and color are included. Auxiliary routines are provided which allow graphics files containing figures, logos, and diagrams to be merged with text files. The user can then position, scale, and rotate the figures on the output page in the reserved area specified.« less

  18. HDF-EOS Dump Tools

    NASA Astrophysics Data System (ADS)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper - metadmp: extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output; it does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper - heosls: displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper - asciidmp: extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human readable, and with minor editing it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper - bindmp: dumps HDF-EOS objects in binary format. This is useful for feeding its output into an existing program that does not understand HDF, for example custom software and COTS products. HDF-EOS User Friendly Metadata - UFM: useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser. HDF-EOS METCHECK - METCHECK: can be invoked from either a Unix or DOS environment with a set of command-line options that direct the tool's inputs and output. METCHECK validates the inventory metadata in a (.met) file using the descriptor file (.desc) as the reference. The tool takes a (.desc) file and a (.met) ODL file as inputs and generates a simple output file containing the results of the checking process.
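
    The metadmp approach, copying ODL blocks verbatim without parsing them, is easy to approximate. The sketch below locates a block by plain text scanning, which is an approximation: real EOS granules store the ODL in HDF attributes (Core, Archive, and Structural Metadata), and the group name is an assumption.

        # Hedged approximation of metadmp: copy an ODL block to standard output.
        def dump_odl(path, group="INVENTORYMETADATA"):
            """Print an ODL metadata block verbatim, without parsing it."""
            text = open(path, "rb").read().decode("ascii", errors="ignore")
            marker = f"END_GROUP={group}"
            start, end = text.find(f"GROUP={group}"), text.find(marker)
            if start != -1 and end != -1:
                print(text[start:end + len(marker)])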

  19. Brady's Geothermal Field - March 2016 Vibroseis SEG-Y Files and UTM Locations

    DOE Data Explorer

    Kurt Feigl

    2016-03-31

    PoroTomo March 2016 (Task 6.4) Updated vibroseis source locations with UTM locations. Supersedes gdr.openei.org/submissions/824. Updated vibroseis source location data for Stages 1-4, PoroTomo March 2016. This revision includes source point locations in UTM format (meters) for all four Stages of active source acquisition. Vibroseis sweep data were collected on a Signature Recorder unit (mfr Seismic Source) mounted in the vibroseis cab during the March 2016 PoroTomo active seismic survey Stages 1 to 4. Each sweep generated a GPS-timed SEG-Y file with 4 input channels and a 20 second record length. Ch1 = pilot sweep, Ch2 = accelerometer output from the vibe's mass, Ch3 = accelerometer output from the baseplate, and Ch4 = weighted sum of the accelerometer outputs. SEG-Y files are available via the links below.

  20. Simulations of Brady's-Type Fault Undergoing CO2 Push-Pull: Pressure-Transient and Sensitivity Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Yoojin; Doughty, Christine

    Input and output files used for fault characterization through numerical simulation using iTOUGH2. The synthetic data for the push period are generated by running a forward simulation (input parameters are provided in iTOUGH2 Brady GF6 Input Parameters.txt [InvExt6i.txt]). In general, the permeability of the fault gouge, damage zone, and matrix are assumed to be unknown. The input and output files are for the inversion scenario where only pressure transients are available at the monitoring well located 200 m above the injection well and only the fault gouge permeability is estimated. The input files are named InvExt6i, INPUT.tpl, FOFT.ins, CO2TAB, and the output files are InvExt6i.out, pest.fof, and pest.sav (names below are display names). The table graphic in the data files below summarizes the inversion results, and indicates the fault gouge permeability can be estimated even if imperfect guesses are used for matrix and damage zone permeabilities, and permeability anisotropy is not taken into account.

  1. User Guide for HUFPrint, A Tabulation and Visualization Utility for the Hydrogeologic-Unit Flow (HUF) Package of MODFLOW

    USGS Publications Warehouse

    Banta, Edward R.; Provost, Alden M.

    2008-01-01

    This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.

  2. User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs

    USGS Publications Warehouse

    Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.

    2008-01-01

    This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are * CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; * DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; * MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and * MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
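
    The array-to-raster step these utilities perform can be sketched by writing a 2-D model array as an ESRI ASCII grid, a text format ArcGIS imports directly. Square, axis-aligned cells are assumed, as the report requires; the coordinates and array below are placeholders.

        # Write a 2-D array as an ESRI ASCII grid (.asc) for GIS import.
        import numpy as np

        def array_to_ascii_grid(array, path, xll, yll, cellsize, nodata=-9999.0):
            nrows, ncols = array.shape
            header = (f"ncols {ncols}\nnrows {nrows}\n"
                      f"xllcorner {xll}\nyllcorner {yll}\n"
                      f"cellsize {cellsize}\nNODATA_value {nodata}\n")
            with open(path, "w") as fh:
                fh.write(header)
                np.savetxt(fh, array, fmt="%.6g")  # row-major, north row first

        array_to_ascii_grid(np.random.rand(10, 12), "head_layer1.asc",
                            500000.0, 4400000.0, 100.0)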

  3. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  4. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  5. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  6. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  7. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  8. Software for Preprocessing Data from Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
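
    The EUGEN conversion step, applying per-channel calibration coefficients to raw voltages, is sketched below. A polynomial calibration and the channel names and coefficient values are assumptions; the actual SSC calibration file format is not described here.

        # Per-channel calibration coefficients c0 + c1*v + c2*v**2 + ... (made-up values).
        calibration = {"PT-101": [0.0, 250.0], "TC-204": [12.5, 48.0, 0.3]}

        def to_engineering_units(channel, volts):
            """Apply the channel's polynomial calibration to a raw voltage."""
            return sum(c * volts ** i for i, c in enumerate(calibration[channel]))

        print(to_engineering_units("PT-101", 3.2))  # e.g. pressure in psi from volts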

  9. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: (1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. (2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot. (3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PVWAVE based plotting software.

  10. Reporting Differences Between Spacecraft Sequence Files

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.

    2010-01-01

    A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file [DKF (wherein DSN signifies Deep Space Network)], spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant when generating command sequence products, though these fields can result in the appearance of many changes to the files, particularly when using the UNIX diff command to inspect file differences. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant pieces of information. In contrast, seq diff suite removes the fields containing the irrelevant pieces of information before processing to extract differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and in particular flagging difference in fields relevant to the sequence command generation and review process.
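
    The approach, normalize away review-irrelevant fields and then diff, is easy to picture in Python. The field patterns below (leading line numbers, a REQUEST_ID tag) are hypothetical stand-ins for the fields seq diff suite actually strips.

        # Strip irrelevant fields, then report only relevant differences.
        import difflib
        import re

        def normalize(lines):
            """Remove fields that are irrelevant to sequence review."""
            cleaned = []
            for line in lines:
                line = re.sub(r"^\s*\d+\s+", "", line)      # leading line numbers
                line = re.sub(r"REQUEST_ID=\S+", "", line)  # hypothetical request-ID field
                cleaned.append(line)
            return cleaned

        with open("old.sasf") as a, open("new.sasf") as b:
            diff = difflib.unified_diff(normalize(a.readlines()), normalize(b.readlines()),
                                        "old.sasf", "new.sasf")
        print("".join(diff))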

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitarka, Arben

    GEN_SRF_4 is a computer program for generating kinematic earthquake rupture models for use in ground motion modeling and simulations of earthquakes. The output is an ASCII SRF-formatted file containing kinematic rupture parameters.

  12. Model input and output files for the simulation of time of arrival of landfill leachate at the water table, Municipal Solid Waste Landfill Facility, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso County, Texas

    USGS Publications Warehouse

    Abeyta, Cynthia G.; Frenzel, Peter F.

    1999-01-01

    This report contains listings of model input and output files for the simulation of the time of arrival of landfill leachate at the water table from the Municipal Solid Waste Landfill Facility (MSWLF), about 10 miles northeast of downtown El Paso, Texas. This simulation was done by the U.S. Geological Survey in cooperation with the U.S. Department of the Army, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso, Texas. The U.S. Environmental Protection Agency-developed Hydrologic Evaluation of Landfill Performance (HELP) and Multimedia Exposure Assessment (MULTIMED) computer models were used to simulate the production of leachate by a landfill and transport of landfill leachate to the water table. Model input data files used with and output files generated by the HELP and MULTIMED models are provided in ASCII format on a 3.5-inch 1.44-megabyte IBM-PC compatible floppy disk.

  13. Program VSMOKE--Users Manual

    Treesearch

    Leonidas G. Lavdas

    1996-01-01

    This is a users manual for VSMOKE, a computer program for predicting the smoke and dry-weather visibility impact of a single prescribed fire at several downwind locations. VSMOKE is a FORTRAN 77 program that reads input from the file VSMOKE.IPT to generate output files compatible with those used by the U.S. Environmental Protection Agency. VSMOKE is uniquely...

  14. Managing Data From Signal-Propagation Experiments

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1989-01-01

    Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.

  15. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
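
    A Python analogue of the script's core loop is sketched below: walk a directory tree to a requested depth, collect per-file ownership and timestamp data, and write a text file a spreadsheet can ingest. Note st_ctime is inode-change time on Unix, so "created" is an approximation of what the original records.

        # Walk a tree and emit file metadata as CSV for spreadsheet analysis.
        import csv
        import os
        import time

        def collect(root, outfile, max_depth=3):
            base_depth = root.rstrip(os.sep).count(os.sep)
            with open(outfile, "w", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow(["path", "uid", "size", "created", "accessed"])
                for dirpath, dirnames, filenames in os.walk(root):
                    if dirpath.count(os.sep) - base_depth >= max_depth:
                        dirnames[:] = []  # stop descending past the requested depth
                        continue
                    for name in filenames:
                        full = os.path.join(dirpath, name)
                        st = os.stat(full)
                        writer.writerow([full, st.st_uid, st.st_size,
                                         time.ctime(st.st_ctime), time.ctime(st.st_atime)])

        collect("/var/www", "tree_report.csv", max_depth=2)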

  16. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC "E" test-stand complex and utilize the SSC file format. The programs are the following: 1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel; 2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris); and 3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.

  17. Washington Play Fairway Analysis Geothermal GIS Data

    DOE Data Explorer

    Corina Forson

    2015-12-15

    This file contains file geodatabases of the Mount St. Helens seismic zone (MSHSZ), Wind River valley (WRV) and Mount Baker (MB) geothermal play-fairway sites in the Washington Cascades. The geodatabases include input data (feature classes) and output rasters (generated from modeling and interpolation) from the geothermal play-fairway in Washington State, USA. These data were gathered and modeled to provide an estimate of the heat and permeability potential within the play-fairways based on: mapped volcanic vents, hot springs and fumaroles, geothermometry, intrusive rocks, temperature-gradient wells, slip tendency, dilation tendency, displacement, displacement gradient, max coulomb shear stress, sigma 3, maximum shear strain rate, and dilational strain rate at 200 m and 3 km depth. In addition, this file contains layer files for each of the output rasters. For details on the areas of interest please see the 'WA_State_Play_Fairway_Phase_1_Technical_Report' in the download package. This submission also includes a file with the geothermal favorability of the Washington Cascade Range based on an earlier statewide assessment. Additionally, within this file there are the maximum shear and dilational strain rate rasters for all of Washington State.

  18. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  19. PROP3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic (Flutter and Forced Response) Analysis of Propellers. Version 1.0

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1996-01-01

    This guide describes the input data required and the output files generated when using PROP3D for steady or unsteady aerodynamic and aeroelastic (flutter and forced response) analysis of propellers. The aerodynamic forces are obtained by solving the three-dimensional unsteady, compressible Euler equations. A normal-mode structural analysis is used to obtain the aeroelastic equations, which are solved using either a time-domain or a frequency-domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis of single- and counter-rotation propellers, and for aeroelastic analysis of a single-rotation propeller.

  20. SARAH 3.2: Dirac gauginos, UFO output, and more

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2013-07-01

    SARAH is a Mathematica package optimized for the fast, efficient and precise study of supersymmetric models beyond the MSSM: a new model can be defined in a short form and all vertices are derived. This allows SARAH to create model files for FeynArts/FormCalc, CalcHep/CompHep and WHIZARD/O'Mega. The newest version of SARAH now provides the possibility to create model files in the UFO format, which is supported by MadGraph 5, MadAnalysis 5, GoSam, and soon by Herwig++. Furthermore, SARAH also calculates the mass matrices, RGEs and 1-loop corrections to the mass spectrum. This information is used to write source code for SPheno in order to create a precision spectrum generator for the given model. This spectrum-generator-generator functionality as well as the output of WHIZARD and CalcHep model files has seen further improvement in this version. Also models including Dirac gauginos are supported with the new version of SARAH, and additional checks for the consistency of the implementation of new models have been created. Program summary: Program title: SARAH. Catalogue identifier: AEIB_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIB_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 322 411. No. of bytes in distributed program, including test data, etc.: 3 629 206. Distribution format: tar.gz. Programming language: Mathematica. Computer: All for which Mathematica is available. Operating system: All for which Mathematica is available. Classification: 11.1, 11.6. Catalogue identifier of previous version: AEIB_v1_0. Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 808. Does the new version supersede the previous version?: Yes, the new version includes all known features of the previous version but also provides the new features mentioned below. Nature of problem: To use MadGraph for new models it is necessary to provide the corresponding model files, which include all information about the interactions of the model. However, deriving the vertices for a given model and putting them into model files that can be used with MadGraph is usually very time consuming. Dirac gauginos are not present in the minimal supersymmetric standard model (MSSM) or many extensions of it. Dirac mass terms for vector superfields lead to new structures in the supersymmetric (SUSY) Lagrangian (a bilinear mass term between gaugino and matter fermion as well as new D-terms) and also modify the SUSY renormalization group equations (RGEs). The Dirac character of gauginos can change the collider phenomenology. In addition, they come with an extended Higgs sector for which a precise calculation of the 1-loop masses has not been performed so far. Solution method: SARAH calculates the complete Lagrangian for a given model whose gauge sector can be any direct product of SU(N) gauge groups. The chiral superfields can transform as any irreducible representation with respect to these gauge groups, and it is possible to handle an arbitrary number of symmetry breakings or particle rotations. The gauge fixing is also added automatically. Using this information, SARAH derives all vertices for a model. These vertices can be exported to model files in the UFO format, which is supported by MadGraph and other codes like GoSam, MadAnalysis or ALOHA. The user can also study models with Dirac gauginos.
In that case SARAH includes all possible terms in the Lagrangian stemming from the new structures and can also calculate the RGEs. The entire impact of these terms is then taken into account in the output of SARAH to UFO, CalcHep, WHIZARD, FeynArts and SPheno. Reasons for new version: SARAH provides, with this version, the possibility of creating model files in the UFO format. The UFO format is supposed to become a standard format for model files which should be supported by many different tools in the future. Also, models with Dirac gauginos were not supported in earlier versions. Summary of revisions: Support of models with Dirac gauginos. Output of model files in the UFO format, speed improvement in the output of WHIZARD model files, CalcHep output supports the internal diagonalization of mass matrices, output of control files for the LHPC spectrum plotter, support of the generalized PDG numbering scheme PDG.IX, improvement of the calculation of the decay widths and branching ratios with SPheno, the calculation of new low-energy observables is added to the SPheno output, the handling of gauge fixing terms has been significantly simplified. Restrictions: SARAH can only derive the Lagrangian in an automatized way for N=1 SUSY models, but not for those with more SUSY generators. Furthermore, SARAH supports only renormalizable operators in the output of model files in the UFO format and also for CalcHep, FeynArts and WHIZARD. Also, color sextets are not yet included in the model files for Monte Carlo tools. Dimension 5 operators are only supported in the calculation of the RGEs and mass matrices. Unusual features: SARAH does not need the Lagrangian of a model as input to calculate the vertices. The gauge structure, particle content, and superpotential, as well as rotations stemming from gauge symmetry breaking, are sufficient. All further information is derived by SARAH on its own. Therefore, the model files are very short and the implementation of new models is fast and easy. In addition, the implementation of a model can be checked for physical and formal consistency. Furthermore, SARAH can generate Fortran code for a full 1-loop analysis of the mass spectrum in the context of Dirac gauginos. Running time: Measured CPU time for the evaluation of the MSSM using a Lenovo Thinkpad X220 with i7 processor (2.53 GHz). Calculating the complete Lagrangian: 9 s. Calculating all vertices: 51 s. Output of the UFO model files: 49 s.

  1. Translator program converts computer printout into braille language

    NASA Technical Reports Server (NTRS)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six-dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  2. UFO - The Universal FEYNRULES Output

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Duhr, Claude; Fuks, Benjamin; Grellscheid, David; Mattelaer, Olivier; Reiter, Thomas

    2012-06-01

    We present a new model format for automatized matrix-element generators, the so-called Universal FEYNRULES Output (UFO). The format is universal in the sense that it features compatibility with more than one single generator and is designed to be flexible, modular and agnostic of any assumption such as the number of particles or the color and Lorentz structures appearing in the interaction vertices. Unlike other model formats where text files need to be parsed, the information on the model is encoded into a PYTHON module that can easily be linked to other computer codes. We then describe an interface for the MATHEMATICA package FEYNRULES that allows for an automatic output of models in the UFO format.

  3. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
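
    One of the simplest checks used when evaluating a generator is a chi-square test of uniformity on binned samples. The sketch below illustrates that kind of statistical evaluation in Python; it is not the report's actual test battery.

        # Chi-square uniformity test on samples assumed to lie in [0, 1).
        import random

        def chi_square_uniform(samples, bins=10):
            counts = [0] * bins
            for u in samples:
                counts[min(int(u * bins), bins - 1)] += 1
            expected = len(samples) / bins
            return sum((c - expected) ** 2 / expected for c in counts)

        stat = chi_square_uniform([random.random() for _ in range(10000)])
        print(f"chi-square statistic: {stat:.2f} (9 degrees of freedom for 10 bins)")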

  4. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    TOUGH2 and iTOUGH2 are powerful models that simulate the heat and fluid flows in porous and fracture media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious, but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.
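
    The mesh-generation step can be sketched as follows: for a rectangular grid, emit one element per cell and one connection per adjacent cell pair, which is the structure TOUGH2 requires. The element naming here is illustrative, not the tool's actual convention.

        # Generate elements and neighbor connections for an nx-by-ny grid.
        def rectangular_mesh(nx, ny):
            elements = [f"E{i:02d}{j:02d}" for j in range(ny) for i in range(nx)]
            connections = []
            for j in range(ny):
                for i in range(nx):
                    if i + 1 < nx:
                        connections.append((f"E{i:02d}{j:02d}", f"E{i+1:02d}{j:02d}"))
                    if j + 1 < ny:
                        connections.append((f"E{i:02d}{j:02d}", f"E{i:02d}{j+1:02d}"))
            return elements, connections

        elems, conns = rectangular_mesh(3, 2)
        print(len(elems), "elements,", len(conns), "connections")  # 6 elements, 7 connections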

  5. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE PAGES

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    2016-04-01

    TOUGH2 and iTOUGH2 are powerful models that simulate the heat and fluid flows in porous and fracture media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious, but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  6. User's Manual for DuctE3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic Analysis of Ducted Fans

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1997-01-01

    The program DuctE3D is used for steady or unsteady aerodynamic and aeroelastic analysis of ducted fans. This guide describes the input data required and the output files generated, in using DuctE3D. The analysis solves three dimensional unsteady, compressible Euler equations to obtain the aerodynamic forces. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either the time domain or the frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis and aeroelastic analysis of an isolated fan row.

  7. Program documentation for the space environment test division post-test data reduction program (GNFLEX)

    NASA Technical Reports Server (NTRS)

    Jones, L. D.

    1979-01-01

    The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.

    The Futility package contains the following: 1) definitions of the sizes of integers and real numbers; 2) a generic unit test harness; 3) definitions for some basic extensions to the Fortran language: arbitrary-length strings, a parameter list construct, exception handlers, a command line processor, and timers; 4) geometry definitions: point, line, plane, box, cylinder, polyhedron; 5) file wrapper functions: standard Fortran input/output files, Fortran binary files, HDF5 files; 6) parallel wrapper functions: MPI and OpenMP abstraction layers, partitioning algorithms; 7) math utilities: BLAS, matrix and vector definitions, linear solver methods and wrappers for other TPLs (PETSc, MKL, etc.), preconditioner classes; 8) miscellaneous: a random number generator, water saturation properties, sorting algorithms.

  9. Software on diffractive optics and computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Doskolovich, Leonid L.; Golub, Michael A.; Kazanskiy, Nikolay L.; Khramov, Alexander G.; Pavelyev, Vladimir S.; Seraphimovich, P. G.; Soifer, Victor A.; Volotovskiy, S. G.

    1995-01-01

    The 'Quick-DOE' software for an IBM PC-compatible computer is aimed at calculating the masks of diffractive optical elements (DOEs) and computer-generated holograms, at computer simulation of DOEs, and at executing a number of auxiliary functions. In particular, the auxiliary functions include file format conversions, mask visualization on a display from a file, implementation of fast Fourier transforms, and the arrangement and preparation of composite images for output on a photoplotter. The software is intended for use by opticians, DOE designers, and programmers developing software for DOE computation.

  10. An integrated software system for geometric correction of LANDSAT MSS imagery

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Esilva, A. J. F. M.; Camara-Neto, G.; Serra, P. R. M.; Desousa, R. C. M.; Mitsuo, Fernando Augusta, II

    1984-01-01

    A system for geometrically correcting LANDSAT MSS imagery includes all phases of processing, from receiving a raw computer compatible tape (CCT) to the generation of a corrected CCT (or UTM mosaic). The system comprises modules for: (1) control of the processing flow; (2) calculation of satellite ephemeris and attitude parameters; (3) generation of uncorrected files from raw CCT data; (4) creation, management, and maintenance of a ground control point library; (5) determination of the image correction equations, using attitude and ephemeris parameters and existing ground control points; (6) generation of corrected LANDSAT files, using the equations determined beforehand; (7) union of LANDSAT scenes to produce a UTM mosaic; and (8) generation of the output tape, in super-structure format.

  11. Spec2Harv: Converting Spectrum output to HARVEST input

    Treesearch

    Eric J. Gustafson; Luke V. Rasmussen; Larry A. Leefers

    2003-01-01

    Spec2Harv was developed to automate the conversion of harvest schedules generated by the Spectrum model into script files that can be used by the HARVEST simulation model to simulate the implementation of the Spectrum schedules in a spatially explicit way.

  12. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing, and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
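
    The kind of repetitive parsing that ExcelAutomat scripts in VBA can be pictured with a short Python analogue (a sketch only: the "SCF Done" pattern matches Gaussian-style output lines, and the folder and file names are invented):

      import csv, glob, re

      # "SCF Done: E(...) = <value>" is the Gaussian-style energy line.
      PATTERN = re.compile(r"SCF Done:\s+E\(\w+\)\s+=\s+(-?\d+\.\d+)")

      def compile_energies(folder, out_csv):
          rows = []
          for path in sorted(glob.glob(f"{folder}/*.log")):
              energy = None
              with open(path) as fh:
                  for line in fh:
                      m = PATTERN.search(line)
                      if m:
                          energy = float(m.group(1))   # keep the last occurrence
              rows.append((path, energy))
          with open(out_csv, "w", newline="") as fh:
              writer = csv.writer(fh)
              writer.writerow(("file", "scf_energy_hartree"))
              writer.writerows(rows)

      compile_energies("outputs", "energies.csv")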

  13. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.

    PubMed

    Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing, and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  14. A user-friendly application for the extraction of kubios hrv output to an optimal format for statistical analysis - biomed 2011.

    PubMed

    Johnsen Lind, Andreas; Helge Johnsen, Bjorn; Hill, Labarron K; Sollers Iii, John J; Thayer, Julian F

    2011-01-01

    The aim of the present manuscript is to present a user-friendly and flexible platform for transforming Kubios HRV output files to the .xls file format used by MS Excel. The program utilizes either native or bundled Java and is platform-independent and mobile. This means that it can run without being installed on a computer. It also has an option for continuous transfer of data, meaning that it can run in the background while Kubios produces output files. The program checks for changes in the file structure and automatically updates the .xls output file.
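
    The continuous-transfer idea can be sketched as a polling loop. The Python below is a minimal illustration, not the program's Java implementation; the folder, file names, and the "name: value" report layout are all assumptions:

      import csv, glob, os, time

      def watch(folder, out_csv, poll_seconds=5.0, cycles=3):
          seen = set()
          for _ in range(cycles):          # bounded here; a daemon would loop forever
              for path in sorted(glob.glob(os.path.join(folder, "*.txt"))):
                  if path in seen:
                      continue
                  seen.add(path)
                  with open(path) as fh, open(out_csv, "a", newline="") as out:
                      writer = csv.writer(out)
                      for line in fh:
                          if ":" in line:  # assumed "name: value" report lines
                              name, value = line.split(":", 1)
                              writer.writerow([os.path.basename(path),
                                               name.strip(), value.strip()])
              time.sleep(poll_seconds)

      watch("kubios_reports", "hrv_summary.csv")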

  15. 76 FR 2892 - City of Ouray; Notice of Application Accepted for Filing, Ready for Environmental Analysis, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... feet of new pipeline to direct water to a new power plant. The power plant will house one Pelton turbine and induction motor generator with a maximum output of 20 kilowatt. g. Location: Ouray, Colorado...

  16. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients. Revised

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2002-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
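
    The tabulation CAP performs can be illustrated with the classic 7-coefficient NASA polynomials for Cp°/R, H°/(RT), and S°/R. This is a sketch only: the current NASA Glenn files actually use a 9-coefficient form with additional 1/T² and 1/T terms, and the coefficients below are placeholders, not real species data.

      import math

      def cp_over_R(a, T):
          return a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4

      def h_over_RT(a, T):
          return (a[0] + a[1]*T/2 + a[2]*T**2/3 + a[3]*T**3/4
                  + a[4]*T**4/5 + a[5]/T)

      def s_over_R(a, T):
          return (a[0]*math.log(T) + a[1]*T + a[2]*T**2/2
                  + a[3]*T**3/3 + a[4]*T**4/4 + a[6])

      a = [3.5, 1e-4, 0.0, 0.0, 0.0, -1000.0, 4.0]    # placeholder coefficients
      for T in (300.0, 500.0, 1000.0):                # user-chosen temperature list
          print(f"T={T:7.1f}  Cp/R={cp_over_R(a, T):.4f}  "
                f"H/RT={h_over_RT(a, T):.4f}  S/R={s_over_R(a, T):.4f}")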

  17. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2001-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.

  18. CIF2Cell: Generating geometries for electronic structure programs

    NASA Astrophysics Data System (ADS)

    Björkman, Torbjörn

    2011-05-01

    The CIF2Cell program generates the geometrical setup for a number of electronic structure programs based on the crystallographic information in a Crystallographic Information Framework (CIF) file. The program will retrieve the space group number, Wyckoff positions, and crystallographic parameters, make a sensible choice for Bravais lattice vectors (primitive or principal cell), and generate all atomic positions. Supercells can be generated and alloys are handled gracefully. The code currently has output interfaces to the electronic structure programs ABINIT, CASTEP, CPMD, Crystal, Elk, Exciting, EMTO, Fleur, RSPt, Siesta, and VASP. Program summary: Program title: CIF2Cell. Catalogue identifier: AEIM_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIM_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU GPL version 3. No. of lines in distributed program, including test data, etc.: 12 691. No. of bytes in distributed program, including test data, etc.: 74 933. Distribution format: tar.gz. Programming language: Python (versions 2.4-2.7). Computer: any computer that can run Python (versions 2.4-2.7). Operating system: any operating system that can run Python (versions 2.4-2.7). Classification: 7.3, 7.8, 8. External routines: PyCIFRW [1]. Nature of problem: generate the geometrical setup of a crystallographic cell for a variety of electronic structure programs from data contained in a CIF file. Solution method: the CIF file is parsed using routines contained in the library PyCIFRW [1], and crystallographic as well as bibliographic information is extracted. The program then generates the principal cell from symmetry information, crystal parameters, space group number, and Wyckoff sites. Reduction to a primitive cell is then performed, and the resulting cell is output to suitably named files along with documentation of the information source generated from any bibliographic information contained in the CIF file. If the space group symmetries are not present in the CIF file, the program falls back on internal tables, so only the minimal input of space group, crystal parameters, and Wyckoff positions is required. Additional key features are the handling of alloys and supercell generation. Additional comments: currently implements support for the following general-purpose electronic structure programs: ABINIT [2,3], CASTEP [4], CPMD [5], Crystal [6], Elk [7], exciting [8], EMTO [9], Fleur [10], RSPt [11], Siesta [12] and VASP [13-16]. Running time: the examples provided in the distribution take only seconds to run.

  19. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  20. MPST Software: MoonKommand

    NASA Technical Reports Server (NTRS)

    Kwok, John H.; Call, Jared A.; Khanampornpan, Teerapat

    2012-01-01

    This software automatically processes Sally Ride Science (SRS) delivered MoonKAM camera control files (ccf) into uplink products for the GRAIL-A and GRAIL-B spacecraft as part of an education and public outreach (EPO) extension to the GRAIL mission. Once the files are properly validated and deemed safe for execution onboard the spacecraft, MoonKommand generates the command products via the Automated Sequence Processor (ASP) and generates uplink (.scmf) files for radiation to the GRAIL-A and/or GRAIL-B spacecraft. Any errors detected along the way are reported back to SRS via email. With MoonKommand, SRS can control their EPO instrument as part of a fully automated process. Inputs are received from SRS as either image capture files (.ccficd) for new image requests, or downlink/delete files (.ccfdl) for requesting image downlink from the instrument and on-board memory management. The MoonKommand outputs are command and file-load (.scmf) files that will be uplinked by the Deep Space Network (DSN). Without MoonKommand software, uplink product generation for the MoonKAM instrument would be a manual process. The software is specific to the MoonKAM instrument on the GRAIL mission. At the time of this writing, the GRAIL mission was making final preparations to begin the science phase, which was scheduled to continue until June 2012.

  1. 3D Visualization of Hydrological Model Outputs For a Better Understanding of Multi-Scale Phenomena

    NASA Astrophysics Data System (ADS)

    Richard, J.; Schertzer, D. J. M.; Tchiguirinskaia, I.

    2014-12-01

    During the last decades, many hydrological models have been created to simulate extreme events or scenarios on catchments. The classical outputs of these models are 2D maps, time series, or graphs, which are easily understood by scientists, but not so much by many stakeholders, e.g., mayors or local authorities, and the general public. One goal of the Blue Green Dream project is to create outputs that are adequate for them. To reach this goal, we decided to convert most of the model outputs into a unique 3D visualization interface that combines all of them. This conversion has to be performed with hydrological thinking to keep the information consistent with the context and the raw outputs. We focus our work on the conversion of the outputs of the Multi-Hydro (MH) model, which is physically based, fully distributed, and has a GIS data interface. MH splits the urban water cycle into four components: rainfall, surface runoff, infiltration, and drainage. To each of them corresponds a modeling module with specific inputs and outputs. The superimposition of all this information will highlight the model outputs and help to verify the quality of the raw input data. For example, the spatial and temporal variability of the rain generated by the rainfall module will be directly visible in 4D (3D + time) before running a full simulation. It is the same with the runoff module: because the result quality depends on the resolution of the rasterized land use, this will confirm (or not) the choice of the cell size. As most of the inputs and outputs are GIS files, two main conversions will be applied to display the results in 3D. First, a conversion from vector files to 3D objects. For example, buildings are defined in 2D inside a GIS vector file; each polygon can be extruded with a height to create volumes. The principle is the same for roads, but an intrusion, instead of an extrusion, is done inside the topography file. The second main conversion is the raster conversion. Several files, such as the topography, the land use, the water depth, etc., are defined by geo-referenced grids. The corresponding grids are converted into a list of triangles to be displayed inside the 3D window. For the water depth, display in pixels will no longer be the only solution: water contours will be created to more easily delineate the flood inside the catchment.

  2. Validation Results for LEWICE 2.0. [Supplement

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Rutkowski, Adam

    1999-01-01

    Two CD-ROMs contain experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes from both experiment and LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.

  3. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    PubMed

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program, available in three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. The command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.
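
    Batch use of arlsumstat over a large series of project files is typically driven by a small wrapper script, for example in an ABC pipeline. A Python sketch follows; the exact arlsumstat command-line arguments are version-dependent and should be checked against the Arlequin manual, and the paths here are invented:

      import glob, subprocess

      def summarize_all(folder, out_file="summary_stats.txt"):
          for i, arp in enumerate(sorted(glob.glob(f"{folder}/*.arp"))):
              append = 1 if i > 0 else 0     # append results after the first file
              header = 0 if i > 0 else 1     # write the header only once
              subprocess.run(                # argument order is illustrative
                  ["./arlsumstat", arp, out_file, str(append), str(header)],
                  check=True)

      summarize_all("simulated_datasets")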

  4. Managing the computational chemistry big data problem: the ioChem-BD platform.

    PubMed

    Álvarez-Moreno, M; de Graaf, C; López, N; Maseras, F; Poblet, J M; Bo, C

    2015-01-26

    We present the ioChem-BD platform ( www.iochem-bd.org ) as a multiheaded tool aimed at managing large volumes of quantum chemistry results from a diverse group of already common simulation packages. The platform has an extensible structure. The key modules manage the main tasks: (i) uploading of output files from common computational chemistry packages, (ii) extraction of meaningful data from the results, and (iii) generation of output summaries in user-friendly formats. Heavy use of the Chemical Markup Language (CML) is made in the intermediate files used by ioChem-BD. From them, and using XSL techniques, we manipulate and transform such chemical data sets to fulfill researchers' needs in the form of HTML5 reports, supporting information, and other research media.

  5. ListingAnalyst: A program for analyzing the main output file from MODFLOW

    USGS Publications Warehouse

    Winston, Richard B.; Paulinski, Scott

    2014-01-01

    ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
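
    The message-gathering part of this task is easy to picture. The Python below is an illustrative sketch only (the real program also parses the listing into a navigable section tree, and the file name is hypothetical):

      def collect_messages(listing_path):
          errors, warnings = [], []
          with open(listing_path) as fh:
              for lineno, line in enumerate(fh, start=1):
                  upper = line.upper()
                  if "ERROR" in upper:
                      errors.append((lineno, line.rstrip()))
                  elif "WARNING" in upper:
                      warnings.append((lineno, line.rstrip()))
          return errors, warnings

      errors, warnings = collect_messages("model.lst")   # hypothetical listing file
      print(f"{len(errors)} errors, {len(warnings)} warnings")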

  6. PeakWorks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-11-30

    The PeakWorks software is designed to assist in the quantitative analysis of atom probe tomography (APT) generated mass spectra. Specifically, through an interactive user interface, mass peaks can be identified automatically (defined by a threshold) and/or identified manually. The software then provides a means to assign specific elemental isotopes (including more than one) to each peak. The software also provides a means for the user to choose background subtraction of each peak based on background fitting functions, the choice of which is left to the user's discretion. Peak ranging (the mass range over which peaks are integrated) is also automated, allowing the user to choose a quantitative range (e.g., full-width half-maximum). The software then integrates all identified peaks, providing a background-subtracted composition, which also includes the deconvolution of peaks (i.e., those peaks that happen to have overlapping isotopic masses). The software is also able to output a 'range file' that can be used in other software packages, such as IVAS. A range file lists the peak identities, the mass range of each identified peak, and a color code for the peak. The software is also able to generate 'dummy' peak ranges within an outputted range file that can be used within IVAS to provide a means for background-subtracted proximity histogram analysis.
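
    The fit-background-then-integrate step can be sketched as follows. This is a hypothetical Python illustration with a linear background and synthetic data, not PeakWorks' actual algorithm or code:

      import numpy as np

      def integrate_peak(mass, counts, lo, hi, bg_width=0.2):
          """Integrate counts in [lo, hi] after subtracting a linear background."""
          left = (mass > lo - bg_width) & (mass < lo)     # shoulder below the peak
          right = (mass > hi) & (mass < hi + bg_width)    # shoulder above the peak
          bg = np.polyfit(mass[left | right], counts[left | right], deg=1)
          window = (mass >= lo) & (mass <= hi)
          signal = counts[window] - np.polyval(bg, mass[window])
          return float(signal.sum())

      mass = np.linspace(26.5, 28.5, 2000)                # synthetic spectrum
      counts = 50 + 1000 * np.exp(-((mass - 27.0) ** 2) / (2 * 0.01 ** 2))
      print(integrate_peak(mass, counts, lo=26.95, hi=27.05))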

  7. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation, and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: surface generation, usually by employing a Computer Aided Design (CAD) system; grid generation, preparing the volume for the simulation; flow solving, producing the results at the specified operational point; and post-processing visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length-scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go into IGES files. The outputs from grid generators and solvers do not really have standards, though there are a couple of file formats that can be used for a subset of the gridding data (i.e., PLOT3D data formats and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of this serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.

  8. IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1994-01-01

    The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.

  9. Simulation of Distributed PV Power Output in Oahu Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lave, Matthew Samuel

    2016-08-01

    Distributed solar photovoltaic (PV) power generation in Oahu has grown rapidly since 2008. For applications such as determining the value of energy storage, it is important to have PV power output timeseries. Since these timeseries are not typically measured, here we produce simulated distributed PV power output for Oahu. Simulated power output is based on (a) satellite-derived solar irradiance, (b) PV permit data by neighborhood, and (c) population data by census block. Permit and population data were used to model locations of distributed PV, and irradiance data were then used to simulate power output. PV power output simulations are presented by sub-neighborhood polygons, neighborhoods, and for the whole island of Oahu. Summary plots of annual PV energy and a sample week timeseries of power output are shown, and the files containing the entire timeseries are described.

  10. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the choice of normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  11. Obscuration Code with Space Station Applications (Manual)

    DTIC Science & Technology

    1985-12-01

    used to perform this DCL-style command parsing; readers are referred to the VMS documentation concerning the Command Definition Utility (CDU). ... FOR0O7.DAT; Input echo file: USERI: [RJM.NASJAN5S1 .LIS;3. The above examples show the operation of the SET OUTPUT command. Note that the printer file is ... opened using the SET OUTPUT command. The output files can be opened and closed using the SET OUTPUT /ECHOING, /PRINTABLE, /PLOTTABLE commands.

  12. GENSURF: A mesh generator for 3D finite element analysis of surface and corner cracks in finite thickness plates subjected to mode-1 loadings

    NASA Technical Reports Server (NTRS)

    Raju, I. S.

    1992-01-01

    A computer program that generates three-dimensional (3D) finite element models for cracked 3D solids was written. This computer program, gensurf, uses minimal input data to generate 3D finite element models for isotropic solids with elliptic or part-elliptic cracks. These models can be used with a 3D finite element program called surf3d. This report documents this mesh generator. In this manual the capabilities, limitations, and organization of gensurf are described. The procedures used to develop 3D finite element models and the input for and the output of gensurf are explained. Several examples are included to illustrate the use of this program. Several input data files are included with this manual so that the users can edit these files to conform to their crack configuration and use them with gensurf.

  13. Integrating SAS and GIS software to improve habitat-use estimates from radiotelemetry data

    USGS Publications Warehouse

    Kenow, K.P.; Wright, R.G.; Samuel, M.D.; Rasmussen, P.W.

    2001-01-01

    Radiotelemetry has been used commonly to remotely determine habitat use by a variety of wildlife species. However, habitat misclassification can occur because the true location of a radiomarked animal can only be estimated. Analytical methods that provide improved estimates of habitat use from radiotelemetry location data using a subsampling approach have been proposed previously. We developed software, based on these methods, to conduct improved habitat-use analyses. A Statistical Analysis System (SAS)-executable file generates a random subsample of points from the error distribution of an estimated animal location and formats the output into ARC/INFO-compatible coordinate and attribute files. An associated ARC/INFO Arc Macro Language (AML) creates a coverage of the random points, determines the habitat type at each random point from an existing habitat coverage, sums the number of subsample points by habitat type for each location, and outputs the results in ASCII format. The proportion and precision of habitat types used are calculated from the subsample of points generated for each radiotelemetry location. We illustrate the method and software by analysis of radiotelemetry data for a female wild turkey (Meleagris gallopavo).
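
    The subsampling approach itself is straightforward to sketch: draw points from the error distribution of one estimated location, classify each point against a habitat grid, and tally proportions. The toy Python version below assumes a bivariate-normal error model and an invented 3x3 raster, purely for illustration:

      from collections import Counter
      import random

      HABITAT = [["forest", "forest", "marsh"],
                 ["forest", "marsh",  "open"],
                 ["open",   "open",   "open"]]   # toy 3x3 habitat raster
      CELL = 100.0                               # raster cell size (m)

      def habitat_at(x, y):
          col = min(max(int(x // CELL), 0), 2)   # clamp to the raster extent
          row = min(max(int(y // CELL), 0), 2)
          return HABITAT[row][col]

      def use_proportions(x_est, y_est, sd, n=500):
          tally = Counter(habitat_at(random.gauss(x_est, sd),
                                     random.gauss(y_est, sd))
                          for _ in range(n))
          return {hab: count / n for hab, count in tally.items()}

      print(use_proportions(x_est=140.0, y_est=120.0, sd=60.0))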

  14. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. The more frequent update of system diagrams can reduce confusion and errors and is likely to uncover systemic problems earlier in the design cycle, thus reducing rework and redesign.

  15. Extracting the Data From the LCM vk4 Formatted Output File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, James G.

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: the vk4 file produced by the Keyence VK software; custom analysis; no off-the-shelf way to read the file; reading the binary data in a vk4 file; various offsets in decimal lines; finding the height image data directly in MATLAB; binary output at the beginning of the height image data; color image information; color image binary data; color image decimal and binary data; MATLAB code to read a vk4 file (choose a file, read the file, compute offsets, read the optical image, laser optical image, read and compute the laser intensity image, read the height image, timing, display the height image, display the laser intensity image, display the RGB laser optical images, display the RGB optical images, display beginning data and save images to the workspace, gamma correction subroutine); reading intensity from the vk4 file; linear in the low range; linear in the high range; gamma correction for vk4 files; computing the gamma intensity correction; observations.
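
    The offset-based binary reading the slides describe translates directly to any language with a struct-unpacking facility. The Python sketch below shows the pattern; the offsets, field order, and field sizes are placeholders, not the actual vk4 layout:

      import struct

      def read_height_image(path, offset_table_pos=12):
          with open(path, "rb") as fh:
              fh.seek(offset_table_pos)                   # jump to the offset table
              (height_offset,) = struct.unpack("<I", fh.read(4))
              fh.seek(height_offset)                      # jump to the height block
              width, height, bitdepth = struct.unpack("<3I", fh.read(12))
              n = width * height
              pixels = struct.unpack(f"<{n}I", fh.read(4 * n))
          return width, height, pixels

      # w, h, px = read_height_image("scan.vk4")   # file name is hypothetical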

  16. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. the Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  17. RM-CLEAN: RM spectra cleaner

    NASA Astrophysics Data System (ADS)

    Heald, George

    2017-08-01

    RM-CLEAN reads in dirty Q and U cubes, generates the RM transfer function (rmtf) based on the frequencies given in an ASCII file, and cleans the RM spectra following the algorithm given by Brentjens (2007). The output cubes contain the clean model components and the CLEANed RM spectra. The input cubes must be reordered with mode=312, and the output cubes will have the same ordering and thus must be reordered after being written to disk. RM-CLEAN runs as a MIRIAD (ascl:1106.007) task and a Python wrapper is included with the code.
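
    The rmtf itself follows the standard RM-synthesis definition, R(phi) = sum_i w_i exp(-2i phi (lambda_i^2 - lambda_0^2)) / sum_i w_i. A compact NumPy sketch follows; this is the textbook formula, not RM-CLEAN's MIRIAD implementation, and the channel grid is an example:

      import numpy as np

      C = 299792458.0                            # speed of light (m/s)

      def rmtf(freqs_hz, phi):
          lam2 = (C / np.asarray(freqs_hz)) ** 2
          lam2_0 = lam2.mean()                   # a common reference-lambda^2 choice
          w = np.ones_like(lam2) / lam2.size     # uniform channel weights
          return np.array([np.sum(w * np.exp(-2j * p * (lam2 - lam2_0)))
                           for p in phi])

      freqs = np.linspace(1.30e9, 1.43e9, 64)    # example channel grid (Hz)
      phi = np.linspace(-500.0, 500.0, 201)      # Faraday depths (rad/m^2)
      print(np.abs(rmtf(freqs, phi)).max())      # unity at phi = 0 by construction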

  18. VizieR Online Data Catalog: Algorithm for correcting CoRoT raw light curves (Mislis+, 2010)

    NASA Astrophysics Data System (ADS)

    Mislis, D.; Schmitt, J. H. M. M.; Carone, L.; Guenther, E. W.; Patzold, M.

    2010-10-01

    Requirements: gfortran (or g77, ifort) compiler. Input files: the input files should be raw CoRoT txt files (http://idoc-corot.ias.u-psud.fr/index.jsp) with names CoRoT*.txt. Run the CDA by typing C>: ./cda.csh (code and data should be in the same directory). Output files: CDA creates one ASCII output file with the name CoRoT*.R.cor for the R filter (2 data files).

  19. VizieR Online Data Catalog: ND2 rotational spectrum (Melosso+,

    NASA Astrophysics Data System (ADS)

    Melosso, M.; Degli Esposti, C.; Dore, L.

    2018-01-01

    files used with the SPFIT/SPCAT program suite. There are 8 files of supplementary material, including a ReadMe, which was created by the AAS data editors. The text files are as follows: 1_Explan.txt = information on the content of the other files. 2ND2.fit = the output file of the fit of spectroscopic data used in the present study. 3ND2.lin = the corresponding line file. 4ND2.par = the corresponding parameter file. 5ND2.cat = the output file of the prediction made with the parameters determined in this study. 6ND2.var = the corresponding parameter file 7ND2.int = the corresponding intensity file (1 data file).

  20. Music 4C, a multi-voiced synthesis program with instruments defined in C

    NASA Astrophysics Data System (ADS)

    Beauchamp, James W.

    2003-04-01

    Music 4C is a program which runs under Unix (including Linux) and provides a means for the synthesis of arbitrary signals as defined by C code. The program is actually a loose translation of an earlier program, Music 4BF [H. S. Howe, Jr., Electronic Music Synthesis (Norton, 1975)]. A set of instrument definitions is driven by a numerical score which consists of a series of "events." Each event gives an instrument name, start time and duration, and a number of parameters (e.g., pitch) which describe the event. Each instrument definition consists of event parameters, performance variables, initializations, and a synthesis algorithm code. Thus, the synthesized signal, no matter how complex, is precisely defined. Moreover, the resulting sounds can be overlaid in any arbitrary pattern. The program serves as a mixer of algorithmically produced sounds or recorded sounds taken from sample files or synthesized from spectrum files. A score file can be entered by hand, generated from a program, translated from a MIDI file, or generated from an alpha-numeric score using an auxiliary program, Notepro. Output sample files are in wav, snd, or aiff format. The program is provided as C source code for download.

  1. File-based data flow in the CMS Filter Farm

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates, etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g., for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
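
    The small-JSON-document bookkeeping pattern is simple to sketch: each producer writes a tiny document per unit of work, and an aggregator merges them. The Python below is illustrative only; the file names and fields are invented, not the CMS DAQ schema:

      import glob, json, os

      def write_doc(folder, process_id, ls, events_in, events_out):
          os.makedirs(folder, exist_ok=True)
          doc = {"ls": ls, "in": events_in, "out": events_out}
          path = os.path.join(folder, f"ls{ls:04d}_pid{process_id}.jsn")
          with open(path, "w") as fh:
              json.dump(doc, fh)

      def aggregate(folder, ls):
          total = {"ls": ls, "in": 0, "out": 0}
          for path in glob.glob(os.path.join(folder, f"ls{ls:04d}_*.jsn")):
              with open(path) as fh:
                  doc = json.load(fh)
              total["in"] += doc["in"]
              total["out"] += doc["out"]
          return total

      write_doc("mon", process_id=1, ls=1, events_in=100, events_out=7)
      write_doc("mon", process_id=2, ls=1, events_in=120, events_out=9)
      print(aggregate("mon", ls=1))              # {'ls': 1, 'in': 220, 'out': 16}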

  2. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    NASA Technical Reports Server (NTRS)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  3. CGDV: a webtool for circular visualization of genomics and transcriptomics data.

    PubMed

    Jha, Vineet; Singh, Gulzar; Kumar, Shiva; Sonawane, Amol; Jere, Abhay; Anamika, Krishanpal

    2017-10-24

    Interpretation of large-scale data is very challenging, and currently there is a scarcity of web tools that support automated visualization of a variety of high-throughput genomics and transcriptomics data, for a wide variety of model organisms, along with user-defined karyotypes. A circular plot provides a holistic visualization of high-throughput large-scale data, but it is very complex and challenging to generate, as most of the available tools need informatics expertise to install and run them. We have developed CGDV (Circos for Genomics and Transcriptomics Data Visualization), a webtool based on Circos, for seamless and automated visualization of a variety of large-scale genomics and transcriptomics data. CGDV takes the output of analyzed genomics or transcriptomics data in different formats, such as vcf, bed, xls, tab-delimited matrix text files, CNVnator raw output, and gene fusion raw output, to plot a circular view of the sample data. CGDV takes care of generating the intermediate files required by Circos. CGDV is freely available at https://cgdv-upload.persistent.co.in/cgdv/ . The circular plot for each data type is tailored to gain the best biological insights into the data. The inter-relationships between data points, homologous sequences, genes involved in fusion events, differential expression patterns, sequencing depth, types and sizes of variations, and enrichment of DNA-binding proteins can be seen using CGDV. CGDV thus helps biologists and bioinformaticians to visualize a variety of genomics and transcriptomics data seamlessly.

  4. Mapping RNA-seq Reads with STAR

    PubMed Central

    Dobin, Alexander; Gingeras, Thomas R.

    2015-01-01

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is Open Source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920

  5. Mapping RNA-seq Reads with STAR.

    PubMed

    Dobin, Alexander; Gingeras, Thomas R

    2015-09-03

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.

  6. A mass spectrometry proteomics data management platform.

    PubMed

    Sharma, Vagisha; Eng, Jimmy K; Maccoss, Michael J; Riffle, Michael

    2012-09-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) file formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.

  7. CHARMM-GUI ligand reader and modeler for CHARMM force field generation of small molecules.

    PubMed

    Kim, Seonghoon; Lee, Jumin; Jo, Sunhwan; Brooks, Charles L; Lee, Hui Sun; Im, Wonpil

    2017-06-05

    Reading ligand structures into any simulation program is often nontrivial and time consuming, especially when the force field parameters and/or structure files of the corresponding molecules are not available. To address this problem, we have developed Ligand Reader & Modeler in CHARMM-GUI. Users can upload ligand structure information in various forms (using PDB ID, ligand ID, SMILES, MOL/MOL2/SDF file, or PDB/mmCIF file), and the uploaded structure is displayed on a sketchpad for verification and further modification. Based on the displayed structure, Ligand Reader & Modeler generates the ligand force field parameters and necessary structure files by searching for the ligand in the CHARMM force field library or using the CHARMM general force field (CGenFF). In addition, users can define chemical substitution sites and draw substituents in each site on the sketchpad to generate a set of combinatorial structure files and corresponding force field parameters for high-throughput or alchemical free energy simulations. Finally, the output from Ligand Reader & Modeler can be used in other CHARMM-GUI modules to build a protein-ligand simulation system for all supported simulation programs, such as CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Ligand Reader & Modeler is available as a functional module of CHARMM-GUI at http://www.charmm-gui.org/input/ligandrm. © 2017 Wiley Periodicals, Inc.

  8. Simulating storage part of application with Simgrid

    NASA Astrophysics Data System (ADS)

    Wang, Cong

    2017-10-01

    We describe the design of a file system simulation and visualization system that uses the SimGrid API and visualization techniques to help users understand and improve the file system portion of their applications. The core of the simulator is the API provided by SimGrid; cluefs tracks and captures the application's I/O operations. Running the simulator on an application generates an output visualization file, which visualizes the proportions of I/O actions and their time series. Users can change the parameters in the configuration file to alter the parameters of the storage system, such as reading and writing bandwidth; they can also adjust the storage strategy and test the resulting performance, making it much easier to obtain reference points for optimizing the storage system. We have tested all aspects of the simulator, and the results suggest that the simulator's performance is credible.

  9. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
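
    The semianalytical scheme has a closed form within each cell: with the face velocities interpolated linearly across the cell, the particle position follows an exponential in time, and the travel time to an exit face can be written down directly. The one-dimensional Python sketch below illustrates the idea (simplified; the full scheme takes the minimum exit time over all three coordinate directions and handles many more special cases):

      import math

      def exit_time_1d(x1, x2, v1, v2, xp):
          """Travel time for a particle at xp to leave the cell [x1, x2]."""
          A = (v2 - v1) / (x2 - x1)            # linear velocity gradient in the cell
          vp = v1 + A * (xp - x1)              # velocity at the particle's position
          if vp == 0.0:
              return math.inf                  # stagnation point: no exit this way
          x_exit = x2 if vp > 0 else x1        # face the particle moves toward
          v_exit = v1 + A * (x_exit - x1)      # velocity at that face
          if A == 0.0:
              return (x_exit - xp) / vp        # uniform velocity: linear motion
          if v_exit * vp <= 0:
              return math.inf                  # flow reverses before the face
          return math.log(v_exit / vp) / A     # closed-form exponential solution

      print(exit_time_1d(x1=0.0, x2=10.0, v1=1.0, v2=2.0, xp=3.0))   # ~4.31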

  10. Risk Assessment Update: Russian Segment

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin

    2012-01-01

    BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. The MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstrating and benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. BUMPER output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production". The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.

  11. Program Predicts Performance of Optical Parametric Oscillators

    NASA Technical Reports Server (NTRS)

    Cross, Patricia L.; Bowers, Mark

    2006-01-01

    A computer program predicts the performances of solid-state lasers that operate at wavelengths from ultraviolet through mid-infrared and that comprise various combinations of stable and unstable resonators, optical parametric oscillators (OPOs), and sum-frequency generators (SFGs), including second-harmonic generators (SHGs). The input to the program describes the signal, idler, and pump beams; the SFG and OPO crystals; and the laser geometry. The program calculates the electric fields of the idler, pump, and output beams at three locations (inside the laser resonator, just outside the input mirror, and just outside the output mirror) as functions of time for the duration of the pump beam. For each beam, the electric field is used to calculate the fluence at the output mirror, plus summary parameters that include the centroid location, the radius of curvature of the wavefront leaving through the output mirror, the location and size of the beam waist, and a quantity known, variously, as a propagation constant or beam-quality factor. The program provides a typical Windows interface for entering data and selecting files. The program can include as many as six plot windows, each containing four graphs.

  12. User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting

    NASA Technical Reports Server (NTRS)

    Murray, J. E.

    1982-01-01

    A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.

  13. Life and dynamic capacity modeling for aircraft transmissions

    NASA Technical Reports Server (NTRS)

    Savage, Michael

    1991-01-01

    A computer program to simulate the dynamic capacity and life of parallel-shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single-plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capacity, ninety-percent reliability, and mean life of each component and of the transmission as a system. Here, the program, its input and output files, and the theory behind its operation are described.

  14. Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data

    NASA Technical Reports Server (NTRS)

    Schairer, Edward T.

    2001-01-01

    'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.
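
    The in-situ calibration step described above can be illustrated with a minimal sketch. This is not Legato's code: it assumes a two-coefficient Stern-Volmer-type relation, I_ref/I = A + B(p/p_ref), fitted at pressure-tap locations, and all names and values below are hypothetical.

        import numpy as np

        def fit_stern_volmer(i_ratio, p_ratio):
            # Least-squares fit of I_ref/I = A + B * (p/p_ref) at the taps.
            B, A = np.polyfit(p_ratio, i_ratio, 1)  # polyfit returns slope, intercept
            return A, B

        def pressure_from_intensity(i_ratio, A, B, p_ref):
            # Invert the calibration to map image intensity ratios to pressure.
            return p_ref * (i_ratio - A) / B

        # Hypothetical tap data: intensity ratios and known tap pressures (kPa).
        i_ratio_taps = np.array([1.02, 1.10, 1.25, 1.41])
        p_taps = np.array([95.0, 101.0, 113.0, 126.0])
        p_ref = 101.0
        A, B = fit_stern_volmer(i_ratio_taps, p_taps / p_ref)
        print(pressure_from_intensity(1.18, A, B, p_ref))

    In a priori calibration, the same relation would be applied with coefficients taken from calibration-chamber data rather than fitted to taps.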

  15. A Mass Spectrometry Proteomics Data Management Platform*

    PubMed Central

    Sharma, Vagisha; Eng, Jimmy K.; MacCoss, Michael J.; Riffle, Michael

    2012-01-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are “organically” distributed across laboratory file systems in an ad hoc manner, (3) file formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/. PMID:22611296

  16. File-Based Data Flow in the CMS Filter Farm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates, etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  17. 76 FR 12957 - City of Tacoma, Washington; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... a total installed capacity of 500 kilowatts (kW); and (2) a station transformer at the powerhouse to... station transformer at the powerhouse to connect the turbine output to a 13.8-kV distribution line owned... turbine/generating units with a total installed capacity of 1,200 kW; and (2) a station transformer at the...

  18. Improving the Taiwan Military’s Disaster Relief Response to Typhoons

    DTIC Science & Technology

    2015-06-01

    circulation, are mostly westbound. When they reach the vicinity of Taiwan or the Philippines, which are always at the edge of the Pacific subtropical high...files from the POM base case model, one set for each design point. To automate the process of running all the GAMS files, a Windows batch file (BAT)...is used to call on GAMS to solve each version of the model. The BAT file creates a new directory for each run to hold output, and one of the outputs

  19. Optimizing Input/Output Using Adaptive File System Policies

    NASA Technical Reports Server (NTRS)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
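
    The classify-then-select loop proposed above can be sketched as follows. This is an illustrative reconstruction with invented pattern classes, thresholds, and policy table, not the authors' framework.

        def classify_accesses(offsets, block=4096):
            # Crude access-pattern classifier over a window of recent file offsets.
            steps = [b - a for a, b in zip(offsets, offsets[1:])]
            if all(s == block for s in steps):
                return "sequential"
            if len(set(steps)) == 1:
                return "strided"
            return "random"

        # Policy table: caching and prefetching choices per detected pattern.
        POLICY = {
            "sequential": {"prefetch_blocks": 8, "cache": "MRU"},
            "strided":    {"prefetch_blocks": 2, "cache": "LRU"},
            "random":     {"prefetch_blocks": 0, "cache": "LRU"},
        }

        recent = [0, 4096, 8192, 12288]
        print(POLICY[classify_accesses(recent)])   # selects the sequential policy

    A performance sensor would then tune, e.g., prefetch_blocks up or down based on observed cache hit rates.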

  20. Role Discovery in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    RolX takes the features from ReFeX or any other feature matrix as input and outputs role assignments (clusters). The output of RolX is a CSV file containing the node-role memberships and a CSV file containing the role-feature definitions.
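
    Role discovery of this kind is often cast as a non-negative matrix factorization of the node-by-feature matrix into node-role and role-feature factors. The sketch below, using scikit-learn's NMF, illustrates that idea and the two CSV outputs; it is an assumed reconstruction, not the RolX implementation.

        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical node-by-feature matrix (e.g., as produced by ReFeX).
        V = np.abs(np.random.rand(100, 12))

        model = NMF(n_components=4, init="nndsvda", random_state=0)
        G = model.fit_transform(V)   # node-role memberships (100 x 4)
        F = model.components_        # role-feature definitions (4 x 12)

        np.savetxt("node_roles.csv", G, delimiter=",")
        np.savetxt("role_features.csv", F, delimiter=",")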

  1. A Geometry Based Infra-Structure for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

    The computational steps traditionally taken for most engineering analysis suites (computational fluid dynamics (CFD), structural analysis, heat transfer, etc.) are: (1) Surface Generation -- usually by employing a Computer Assisted Design (CAD) system; (2) Grid Generation -- preparing the volume for the simulation; (3) Flow Solver -- producing the results at the specified operational point; (4) Post-processing Visualization -- interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. These vendors couple directly to a number of CAD systems and are executed from within the CAD Graphical User Interface (GUI). It should be noted that the structural analysis problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length-scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go to Initial Graphics Exchange Specification (IGES) or Standard Exchange Program (STEP) files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the gridding (i.e., PLOT3D data formats). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Specifically, the problems with this procedure are: (1) File based -- Information flows from one step to the next via data files with formats specified for that procedure. File standards, when they exist, are wholly inadequate. For example, geometry from CAD systems (transmitted via IGES files) is defined as disjoint surfaces and curves (as well as masses of other information of no interest to the grid generator). This is particularly onerous for modern CAD systems based on solid modeling. The part was a proper solid, and in the translation to IGES has lost this important characteristic. STEP is another standard for CAD data that exists and supports the concept of a solid. The problem with STEP is that a solid-modeling geometry kernel is required to query and manipulate the data within this type of file. (2) 'Good' Geometry -- A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. Adroit multi-block methods are not far behind. This means that a million-node steady-state solution can be computed on the order of hours (using current high-performance computers) starting from this 'good' geometry. Unfortunately, the geometry usually transmitted from the CAD system is not 'good' in the grid-generator sense. The grid generator needs smooth, closed, solid geometry. It can take a week (or more) of interaction with the CAD output (sometimes by hand) before the process can begin. (3) One-way Communication -- All information travels on from one phase to the next. This makes procedures like node adaptation difficult when attempting to add or move nodes that sit on bounding surfaces (when the actual surface data has been lost after the grid generation phase).
Until this process can be automated, more complex problems such as multi-disciplinary analysis, or using the above procedure for design, become prohibitive. There is also no way to easily deal with this system in a modular manner. One can only replace the grid generator, for example, if the software reads and writes the same files. Instead of the serial approach to analysis described above, CAPRI takes a geometry-centric approach. This makes the actual geometry (not a discretized version) accessible to all phases of the analysis. The connection to the geometry is made through an Application Programming Interface (API) and NOT a file system. This API isolates the top-level applications (grid generators, solvers, and visualization components) from the geometry engine. It also allows the replacement of one geometry kernel with another without affecting these top-level applications. For example, if UniGraphics is used as the CAD package, then Parasolid (UG's own geometry engine) can be used for all geometric queries so that no solid geometry information is lost in a translation. This is much better than STEP because when the data is queried, the same software is executed as used in the CAD system. Therefore, one analyzes the exact part that is in the CAD system. CAPRI uses the same idea as the commercial structural analysis codes but does not specify control. Software components of the CAD system are used, but the analysis suite, not the CAD operator, controls the software session. This also means that license issues may be minimized and individuals need not know how to operate a CAD system in order to run the suite.

  2. ZOOM Lite: next-generation sequencing data mapping and visualization software

    PubMed Central

    Zhang, Zefeng; Lin, Hao; Ma, Bin

    2010-01-01

    High-throughput next-generation sequencing technologies pose increasing demands on the efficiency, accuracy and usability of data analysis software. In this article, we present ZOOM Lite, a software for efficient reads mapping and result visualization. With a kernel capable of mapping tens of millions of Illumina or AB SOLiD sequencing reads efficiently and accurately, and an intuitive graphical user interface, ZOOM Lite integrates reads mapping and result visualization into an easy-to-use pipeline on a desktop PC. The software handles both single-end and paired-end reads, and can output either the unique mapping result or the top N mapping results for each read. Additionally, the software takes a variety of input file formats and outputs to several commonly used result formats. The software is freely available at http://bioinfor.com/zoom/lite/. PMID:20530531

  3. Command system output bit verification

    NASA Technical Reports Server (NTRS)

    Odd, C. W.; Abbate, S. F.

    1981-01-01

    An automatic test was developed to test the ability of the deep space station (DSS) command subsystem and exciter to generate and radiate, from the exciter, the correct idle bit sequence for a given flight project or to store and radiate received command data elements and files without alteration. This test, called the command system output bit verification test, is an extension of the command system performance test (SPT) and can be selected as an SPT option. The test compares the bit stream radiated from the DSS exciter with reference sequences generated by the SPT software program. The command subsystem and exciter are verified when the bit stream and reference sequences are identical. It is a key element of the acceptance testing conducted on the command processor assembly (CPA) operational program (DMC-0584-OP-G) prior to its transfer from development to operations.

  4. VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)

    NASA Astrophysics Data System (ADS)

    Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.

    2000-02-01

    (1) Primary data files, stages.zz. These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz=06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI), where T is temperature, Ne is electron density and CHI is an abundance multiplier. The files include data for ionisation fractions for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f. (2) Code add.f. This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for names of input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which could be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper. The user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET, averaged over ionisation stages. (3) Files acc.zz. Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if it is required to change the subroutines WEIGHTS or ZETA. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f. (4) Code accfit.f. This code gives radiative accelerations, and some related data, for a stellar model. Methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance multiplier CHI on the output file will generally be finer than that used in the input files acc.zz. The mesh to be used is specified in a file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff=10000 K, LOG10(g)=4.2). The output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f. (5) Code diff.f. This code reads the output file (e.g. acc1000004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of radiative accelerations, the quantity ZETBAR required for calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data. It creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations. (1 data file).

  5. DYNA3D, INGRID, and TAURUS: an integrated, interactive software system for crashworthiness engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, D.J.; Hallquist, J.O.; Stillman, D.W.

    1985-04-01

    Crashworthiness engineering has always been a high priority at Lawrence Livermore National Laboratory because of its role in the safe transport of radioactive material for the nuclear power industry and military. As a result, the authors have developed an integrated, interactive set of finite element programs for crashworthiness analysis. The heart of the system is DYNA3D, an explicit, fully vectorized, large deformation structural dynamics code. DYNA3D has the following four capabilities that are critical for the efficient and accurate analysis of crashes: (1) fully nonlinear solid, shell, and beam elements for representing a structure, (2) a broad range of constitutive models for representing the materials, (3) sophisticated contact algorithms for the impact interactions, and (4) a rigid body capability to represent the bodies away from the impact zones at a greatly reduced cost without sacrificing any accuracy in the momentum calculations. To generate the large and complex data files for DYNA3D, INGRID, a general purpose mesh generator, is used. It runs on everything from IBM PCs to CRAYs, and can generate 1000 nodes/minute on a PC. With its efficient hidden line algorithms and many options for specifying geometry, INGRID also doubles as a geometric modeller. TAURUS, an interactive post processor, is used to display DYNA3D output. In addition to the standard monochrome hidden line display, time history plotting, and contouring, TAURUS generates interactive color displays on 8-color video screens by plotting color bands superimposed on the mesh which indicate the value of the state variables. For higher quality color output, graphic output files may be sent to the DICOMED film recorders. We have found that color is every bit as important as hidden line removal in aiding the analyst in understanding his results. In this paper the basic methodologies of the programs are presented along with several crashworthiness calculations.

  6. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  7. QuakeSim Project Networking

    NASA Astrophysics Data System (ADS)

    Kong, D.; Donnellan, A.; Pierce, M. E.

    2012-12-01

    QuakeSim is an online computational framework focused on using remotely sensed geodetic imaging data to model and understand earthquakes. With the rise in online social networking over the last decade, many tools and concepts have been developed that are useful to research groups. In particular, QuakeSim is interested in the ability for researchers to post, share, and annotate files generated by modeling tools in order to facilitate collaboration. To accomplish this, features were added to the preexisting QuakeSim site that include single sign-on, automated saving of output from modeling tools, and a personal user space to manage sharing permissions on these saved files. These features implement OpenID and Lightweight Directory Access Protocol (LDAP) technologies to manage files across several different servers, including a web server running Drupal and other servers hosting the computational tools themselves.

  8. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284, January 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. [Only report cover matter is available in this record; no abstract.]

  9. Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.

    PubMed

    Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas

    2004-08-01

    The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers who aim to use the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes, while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.

  10. MASTRE trajectory code update to automate flight trajectory design, performance predictions, and vehicle sizing for support of shuttle and shuttle derived vehicles: Programmers manual

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert it to computers other than the VAX. Documentation for each subroutine or function, consisting of the definitions of the variables and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included, which provide a listing of the Root-Sum-Square (RSS) program, a listing of the subroutine names and definitions used in the MASTRE User Friendly Interface (UFI) Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program is used to aid in the performance of dispersion analyses; it reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.

  11. Natural Resource Information System. Volume 2: System operating procedures and instructions

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A total computer software system description is provided for the prototype Natural Resource Information System designed to store, process, and display data of maximum usefulness to land management decision making. Program modules are described, as are the computer file design, file updating methods, digitizing process, and paper tape conversion to magnetic tape. Operating instructions for the system, data output, printed output, and graphic output are also discussed.

  12. Attitude profile design program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user-defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high-level manner. The program then takes these high-level commands and executes the maneuvers, outputting the desired information.

  13. A globally calibrated scheme for generating daily meteorology from monthly statistics: Global-WGEN (GWGEN) v1.0

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-10-01

    While a wide range of Earth system processes occur at daily and even subdaily timescales, many global vegetation and other terrestrial dynamics models historically used monthly meteorological forcing both to reduce computational demand and because global datasets were lacking. Recently, dynamic land surface modeling has moved towards resolving daily and subdaily processes, and global datasets containing daily and subdaily meteorology have become available. These meteorological datasets, however, cover only the instrumental era of the last approximately 120 years at best, are subject to considerable uncertainty, and represent extremely large data files with associated computational costs of data input/output and file transfer. For periods before the recent past or in the future, global meteorological forcing can be provided by climate model output, but the quality of these data at high temporal resolution is low, particularly for daily precipitation frequency and amount. Here, we present GWGEN, a globally applicable statistical weather generator for the temporal downscaling of monthly climatology to daily meteorology. Our weather generator is parameterized using a global meteorological database and simulates daily values of five common variables: minimum and maximum temperature, precipitation, cloud cover, and wind speed. GWGEN is lightweight, modular, and requires a minimal set of monthly mean variables as input. The weather generator may be used in a range of applications, for example, in global vegetation, crop, soil erosion, or hydrological models. While GWGEN does not currently perform spatially autocorrelated multi-point downscaling of daily weather, this additional functionality could be implemented in future versions.
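
    The core of such a weather generator, downscaling a monthly total to days with a first-order Markov chain for wet/dry occurrence and a gamma law for wet-day amounts, can be sketched as below. The transition probabilities and gamma shape are illustrative placeholders, not GWGEN's globally fitted parameters.

        import numpy as np

        rng = np.random.default_rng(42)

        def daily_precip(monthly_total, n_days, p_wd=0.3, p_ww=0.6, shape=0.8):
            # Occurrence: first-order Markov chain (wet-after-dry and wet-after-wet probabilities).
            wet, occ = False, []
            for _ in range(n_days):
                wet = rng.random() < (p_ww if wet else p_wd)
                occ.append(wet)
            # Amounts: gamma-distributed on wet days, rescaled to preserve the monthly total.
            amounts = np.where(occ, rng.gamma(shape, 1.0, n_days), 0.0)
            if amounts.sum() > 0:
                amounts *= monthly_total / amounts.sum()
            return amounts

        print(daily_precip(60.0, 30).round(1))   # thirty daily values summing to 60 mm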

  14. SnopViz, an interactive snow profile visualization tool

    NASA Astrophysics Data System (ADS)

    Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank

    2016-04-01

    SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as to offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs in any modern browser and does not require an active Internet connection. The open-source code is available for download from models.slf.ch, where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well, and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser-based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML-based standard for vector graphics, was chosen because of its easy interaction with JavaScript and good software support (Adobe Illustrator, Inkscape) for manipulating graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time-series input and output. The existing output format for SNOWPACK timelines was retained, while time series are read from SMET files, a file format used in conjunction with the open-source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as a CAAML file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard for exchanging snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).

  15. Preprocessor and postprocessor computer programs for a radial-flow finite-element model

    USGS Publications Warehouse

    Pucci, A.A.; Pope, D.A.

    1987-01-01

    Preprocessing and postprocessing computer programs that enhance the utility of the U.S. Geological Survey radial-flow model have been developed. The preprocessor program: (1) generates a triangular finite element mesh from minimal data input, (2) produces graphical displays and tabulations of data for the mesh, and (3) prepares an input data file to use with the radial-flow model. The postprocessor program is a version of the radial-flow model, which was modified to (1) produce graphical output for simulation and field results, (2) generate a statistic for comparing the simulation results with observed data, and (3) allow hydrologic properties to vary in the simulated region. Examples of the use of the processor programs for a hypothetical aquifer test are presented. Instructions for the data files, format instructions, and a listing of the preprocessor and postprocessor source codes are given in the appendixes. (Author's abstract)

  16. Encryption and decryption using FPGA

    NASA Astrophysics Data System (ADS)

    Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.

    2017-11-01

    In this paper, we perform multiple cryptography methods on a set of data and compare their outputs. Here the AES and RSA algorithms are used. Using the AES algorithm, an 8-bit input (plain text) is encrypted with a cipher key and the result is displayed on Tera Term (serially). For simulation, a 128-bit input is operated on with a 128-bit cipher key to generate the encrypted text. The reverse operations are then performed to obtain the decrypted text. In the RSA algorithm, file handling is used to input the plain text. This text is then operated on to obtain the encrypted and decrypted data, which are stored in a file. Finally, the results of both algorithms are compared.
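
    The encrypt/decrypt round trips compared above can be mirrored in software. Below is a minimal sketch using the PyCryptodome library (an assumption; any equivalent library would do). ECB mode is used only to mirror the simple block operation described, and is not recommended in practice.

        from Crypto.Cipher import AES, PKCS1_OAEP
        from Crypto.PublicKey import RSA
        from Crypto.Random import get_random_bytes

        plain = b"16-byte message!"          # one AES block of plain text

        # AES-128: encrypt and decrypt with the same cipher key.
        key = get_random_bytes(16)
        ct = AES.new(key, AES.MODE_ECB).encrypt(plain)
        pt = AES.new(key, AES.MODE_ECB).decrypt(ct)

        # RSA-2048 with OAEP padding: encrypt with the public key, decrypt with the private.
        rsa = RSA.generate(2048)
        ct2 = PKCS1_OAEP.new(rsa.publickey()).encrypt(plain)
        pt2 = PKCS1_OAEP.new(rsa).decrypt(ct2)

        assert pt == plain and pt2 == plain   # both round trips recover the input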

  17. DelPhi Web Server: A comprehensive online suite for electrostatic calculations of biological macromolecules and their complexes

    PubMed Central

    Sarkar, Subhra; Witham, Shawn; Zhang, Jie; Zhenirovskyy, Maxim; Rocchia, Walter; Alexov, Emil

    2011-01-01

    Here we report a web server, the DelPhi web server, which utilizes the DelPhi program to calculate electrostatic energies and the corresponding electrostatic potential, ionic distributions, and dielectric map. The server provides extra services to fix structural defects, such as missing atoms in the structural file, and allows for generation of missing hydrogen atoms. The hydrogen placement and the corresponding DelPhi calculations can be done with user-selected force field parameters, either Charmm22, Amber98 or OPLS. Upon completion of the calculations, the user is given the option to download the fixed and protonated structural file, together with the parameter and DelPhi output files for further analysis. Utilizing the Jmol viewer, the user can view the corresponding structural file, manipulate it, and change the presentation. In addition, if the potential map is requested to be calculated, the potential can be mapped onto the molecule surface. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver. PMID:24683424

  18. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.

  19. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  20. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  1. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  2. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  3. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  4. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an HTML file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.
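
    The kind of one-page residual summary described can be sketched as follows, assuming a hypothetical whitespace-delimited history file with an iteration column followed by per-grid residual columns (not OVERFLOW's actual output format).

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical history file: column 0 = iteration, columns 1..n = per-grid L2 residuals.
        hist = np.loadtxt("resid_history.dat")
        it, resid = hist[:, 0], hist[:, 1:]

        fig, ax = plt.subplots()
        ax.semilogy(it, resid)                  # one convergence curve per grid
        ax.set_xlabel("iteration")
        ax.set_ylabel("flow-equation residual")
        fig.savefig("convergence_summary.png")  # archival summary plot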

  5. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.

  6. NASA TLA workload analysis support. Volume 2: Metering and spacing studies validation data

    NASA Technical Reports Server (NTRS)

    Sundstrom, J. L.

    1980-01-01

    Four sets of graphic reports--one for each of the metering and spacing scenarios--are presented. The complete data file from which the reports were generated is also given. The data was used to validate the detail task of both the pilot and copilot for four metering and spacing scenarios. The output presents two measures of demand workload and a report showing task length and task interaction.

  7. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.

  8. Documentation of model input and output values for simulation of pumping effects in Paradise Valley, a basin tributary to the Humboldt River, Humboldt County, Nevada

    USGS Publications Warehouse

    Carey, A.E.; Prudic, David E.

    1996-01-01

    Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machine (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) version 5.0 or greater.

  9. Methods for the design and analysis of power optimized finite-state machines using clock gating

    NASA Astrophysics Data System (ADS)

    Chodorowski, Piotr

    2017-11-01

    The paper discusses two methods of designing power-optimized FSMs. Both methods use clock-gating techniques. The main objective of the research was to write a program capable of automatically generating hardware descriptions of finite-state machines in VHDL, together with testbenches to aid power analysis. The creation of the relevant output files is detailed step by step. The program was tested using the LGSynth91 FSM benchmark package. An analysis of the generated circuits shows that the second method presented in this paper leads to a significant reduction in power consumption.

  10. Using Parameters of Dynamic Pulse Function for 3d Modeling in LOD3 Based on Random Textures

    NASA Astrophysics Data System (ADS)

    Alizadehashrafi, B.

    2015-12-01

    The pulse function (PF) is a technique based on a procedural preprocessing system that generates a computerized virtual photo of a façade within a fixed-size square (Alizadehashrafi et al., 2009, Musliman et al., 2010). The Dynamic Pulse Function (DPF) is an enhanced version of PF which creates the final photo proportional to the real geometry. This avoids distortion when projecting the computerized photo onto the generated 3D model (Alizadehashrafi and Rahman, 2013). The challenging issue addressed in this paper is producing the 3D model in LoD3 rather than LoD2. In the DPF-based technique, the geometries of the windows and doors are saved in an XML file schema which has no connection with the 3D model in LoD2 and CityGML format. In this research the parameters of Dynamic Pulse Functions are utilized via the Ruby programming language in SketchUp Trimble to generate the windows and doors (exact position and depth) automatically in LoD3, based on the same concept as DPF. The advantage of this technique is the automatic generation of a large number of similar geometries, e.g. windows, by utilizing the parameters of DPF along with defining entities and window layers. When converting the SKP file to CityGML via FME software or CityGML plugins, the 3D model contains the semantic database about the entities and window layers, which can connect the CityGML to MySQL (Alizadehashrafi and Baig, 2014). The concept behind DPF is to use logical operations to project the texture onto the background image, dynamically proportional to the real geometry. The projection is based on two dynamic pulses, one vertical and one horizontal, starting from the upper-left corner of the background wall and running in the down and right directions, respectively, in the image coordinate system. A logical one or zero at the intersection of the vertical and horizontal dynamic pulses projects, or does not project, the texture onto the background image. It is possible to define a priority for each layer. For instance, the priority of the door layer can be higher than that of the window layer, which means that the window texture cannot be projected onto the door layer. Orthogonal, rectified, perpendicular, symmetric photos of the 3D objects, proportional to the real façade geometry, must be utilized to generate the output frame for DPF. DPF produces very high-quality output image files of small data size, in considerably smaller dimensions, compared with the photorealistic texturing method. The disadvantage of DPF is its preprocessing approach, which generates an output image file rather than generating the texture online within a 3D environment such as CityGML. Furthermore, the result of DPF can be utilized for a 3D model in LoD2 rather than LoD3. In the current work the random textures of the window layers are created based on the parameters of DPF within the Ruby console of SketchUp Trimble, to generate the deeper geometries of the windows and their exact positions on the façade automatically, along with random textures to increase the Level of Realism (LoR) (Scarpino, 2010). As the output frame in DPF is proportional to the real geometry (height and width of the façade), it is possible to query the XML database and convert the values to units such as meters automatically. In this technique, the perpendicular terrestrial photo of the façade is rectified by employing a projective transformation based on a frame that is in constrained proportion to the real geometry.
The rectified photos, which are not suitable for texturing but are necessary for measuring, can be resized in constrained proportion to the real geometry before the measuring process. The heights and widths of windows and doors, and the horizontal and vertical distances between windows from the upper-left corner of the photo, are the parameters that should be measured to run the program as a plugin in SketchUp Trimble. The system can use these parameters, together with texture file names and file paths, to create the façade semi-automatically. To avoid leaning geometry, the textures of windows, doors, etc. should be cropped and rectified from perpendicular photos so that they can be used in the program to create the whole façade along with its geometries. Texture enhancement, such as removing disturbing objects, exposure setting, and left-right and up-down transformations, should be done in advance. In fact, the quality, small data size, scale, and semantic database for each façade are the prominent advantages of this method.
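
    The logical projection described above, placing texture only where a vertical and a horizontal pulse are both one, can be sketched in a few lines. This is an interpretation of the description for illustration, not the authors' Ruby plugin, and all dimensions are invented.

        import numpy as np

        def pulse(length, start, size, period):
            # A 0/1 pulse train: ones inside each repeated window footprint.
            x = np.zeros(length, dtype=bool)
            for s in range(start, length - size + 1, period):
                x[s:s + size] = True
            return x

        # A 400 x 600 px facade; the window grid is defined by two pulse trains.
        rows = pulse(400, start=60, size=80, period=140)   # vertical pulse (downward)
        cols = pulse(600, start=50, size=60, period=120)   # horizontal pulse (rightward)
        mask = np.outer(rows, cols)    # logical AND of the two pulses

        facade = np.zeros((400, 600))
        facade[mask] = 1.0             # 1.0 stands in for the projected window texture

    A layer-priority scheme would then apply, say, the door mask after the window mask wherever both are true.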

  11. Federal Logistics Information System (FLIS) Procedures Manual. General and Administrative Information. Volume 1.

    DTIC Science & Technology

    1996-04-01

    [Fragmentary table extract; no abstract is available for this record. The fragment lists output Document Identifier Codes (DICs) such as KFA (Match Through Association), KFC (File Data Minus Security Classified Characteristics), and KEC (Output Exceeds AUTODIN Limitations).]

  12. Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) Gateway, release 1.0, is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.

  13. A Climate Statistics Tool and Data Repository

    NASA Astrophysics Data System (ADS)

    Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.

    2017-12-01

    Researchers at Argonne National Laboratory and collaborating organizations have generated regional-scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12-km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global-scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600 TB. A condensed 800 GB set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper provides an overview of the project to generate the main and condensed data repositories, describes the Python tool and how to use it, presents the GIS results of the computed examples, and discusses some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
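
    One of the example statistics, days per cell over 90°F, can be sketched with the netCDF4 and NumPy libraries as below. The file and variable names are hypothetical placeholders; the actual WRF output uses its own naming and dimensions.

        import numpy as np
        from netCDF4 import Dataset

        # Hypothetical daily-maximum-temperature file, variable in Kelvin, shaped (day, y, x).
        with Dataset("wrf_daily_tmax.nc") as nc:
            tmax_k = nc.variables["T2MAX"][:]

        tmax_f = (tmax_k - 273.15) * 9.0 / 5.0 + 32.0   # convert K to degrees F
        days_over_90 = (tmax_f > 90.0).sum(axis=0)      # per-cell count of hot days
        np.save("days_over_90F.npy", days_over_90)      # grid ready for export to GIS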

  14. Software for objective comparison of vocal acoustic features over weeks of audio recording: KLFromRecordingDays

    NASA Astrophysics Data System (ADS)

    Soderstrom, Ken; Alalawi, Ali

    KLFromRecordingDays allows measurement of Kullback-Leibler (KL) distances between 2D probability distributions of vocal acoustic features. Greater KL distance measures reflect increased phonological divergence across the vocalizations compared. The software has been used to compare *.wav file recordings made by Sound Analysis Recorder 2011 of songbird vocalizations pre- and post-drug and surgical manipulations. Recordings from individual animals in *.wav format are first organized into subdirectories by recording day and then segmented into individual syllables, whose acoustic features are measured, using Sound Analysis Pro 2011 (SAP). KLFromRecordingDays uses syllable acoustic feature data output by SAP to a MySQL table to generate and compare "template" (typically pre-treatment) and "target" (typically post-treatment) probability distributions. These distributions are a series of virtual 2D plots of the duration of each syllable (as x-axis) against each of 13 other acoustic features measured by SAP for that syllable (as y-axes). Differences between "template" and "target" probability distributions for each acoustic feature are determined by calculating KL distance, a measure of divergence of the target 2D distribution pattern from that of the template. KL distances and the mean KL distance across all acoustic features are calculated for each recording day and output to an Excel spreadsheet. Resulting data for individual subjects may then be pooled across treatment groups, graphically summarized, and used for statistical comparisons. Because SAP-generated MySQL files are accessed directly, data limits associated with spreadsheet output are avoided, and the totality of vocal output over weeks may be objectively analyzed all at once. The software has been useful for measuring drug effects on songbird vocalizations and assessing recovery from damage to regions of vocal motor cortex. It may be useful in studies employing other species, and as part of speech therapies tracking progress in producing distinct speech sounds in isolation.
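
    The underlying measure can be sketched as the KL divergence between two smoothed, normalized 2D histograms of (duration, feature) pairs, as below; the bin count and smoothing constant are invented for the example, not the tool's exact settings.

        import numpy as np

        def kl_2d(p_pairs, q_pairs, bins=50, eps=1e-10):
            # KL(P||Q) over a shared 2D binning of (duration, feature) samples.
            lo0 = min(p_pairs[0].min(), q_pairs[0].min())
            hi0 = max(p_pairs[0].max(), q_pairs[0].max())
            lo1 = min(p_pairs[1].min(), q_pairs[1].min())
            hi1 = max(p_pairs[1].max(), q_pairs[1].max())
            rng = [[lo0, hi0], [lo1, hi1]]
            p, _, _ = np.histogram2d(p_pairs[0], p_pairs[1], bins=bins, range=rng)
            q, _, _ = np.histogram2d(q_pairs[0], q_pairs[1], bins=bins, range=rng)
            p = (p + eps) / (p + eps).sum()   # smooth, then normalize to probabilities
            q = (q + eps) / (q + eps).sum()
            return float(np.sum(p * np.log(p / q)))

    Calling kl_2d(target, template) then measures how far the post-treatment distribution diverges from the pre-treatment one; repeating per acoustic feature and averaging gives the daily mean KL distance.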

  15. Optimized Next-Generation Sequencing Genotype-Haplotype Calling for Genome Variability Analysis

    PubMed Central

    Navarro, Javier; Nevado, Bruno; Hernández, Porfidio; Vera, Gonzalo; Ramos-Onsins, Sebastián E

    2017-01-01

    The accurate estimation of nucleotide variability using next-generation sequencing data is challenged by the high number of sequencing errors produced by new sequencing technologies, especially for nonmodel species, where reference sequences may not be available and the read depth may be low due to limited budgets. The most popular single-nucleotide polymorphism (SNP) callers are designed to obtain a high SNP recovery and low false discovery rate but are not designed to appropriately account for the frequency of the variants. Instead, algorithms designed to account for the frequency of SNPs give precise results for estimating the levels and the patterns of variability. These algorithms are focused on the unbiased estimation of the variability and not on the high recovery of SNPs. Here, we implemented a fast and optimized parallel algorithm that includes the method developed by Roesti et al and Lynch, which estimates the genotype of each individual at each site, considering the possibility of calling both bases of the genotype, a single one, or none. This algorithm does not consider the reference and therefore is independent of biases related to the reference nucleotide specified. The pipeline starts from a BAM file converted to pileup or mpileup format, and the software outputs a FASTA file. The new program not only reduces running times but also, given the improved use of resources, allows its use on both smaller computers and large parallel computers, expanding its benefits to a wider range of researchers. The output file can be analyzed using software for population genetics analysis, such as the R library PopGenome, the software VariScan, and the program mstatspop for analysis considering positions with missing data. PMID:28894353

  16. Development of a Distributed Parallel Computing Framework to Facilitate Regional/Global Gridded Crop Modeling with Various Scenarios

    NASA Astrophysics Data System (ADS)

    Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.

    2017-12-01

    Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. A local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework on Iringa, Tanzania, which comprises 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file, the user-defined number of CPU threads divides the EPIC simulation into jobs. The raw database is formatted by the EPIC input data formatters, and the formatted data moves into the EPIC simulation jobs; 28 EPIC jobs then run simultaneously, and only the result files of interest are parsed and moved into the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a job list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing of the Iringa test case would require 113 hours, while the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
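
    The job-splitting idea can be sketched with Python's standard multiprocessing module. This is a simplified stand-in for the framework's master file; run_epic_cell and the job tuple layout are hypothetical.

        import itertools
        from multiprocessing import Pool

        SLOPES = range(7)        # seven slope classes (illustrative)
        FERT_RATES = range(24)   # twenty-four fertilizer ranges (illustrative)

        def run_epic_cell(job):
            cell_id, slope, fert = job
            # 1. format the raw inputs into EPIC input files for this cell/scenario
            # 2. invoke the EPIC executable (e.g., via subprocess.run)
            # 3. parse only the output files of interest and return a summary
            return (cell_id, slope, fert, 0.0)   # placeholder yield value

        if __name__ == "__main__":
            cells = range(406839)                # Iringa test case grid cells
            jobs = itertools.product(cells, SLOPES, FERT_RATES)
            with Pool(processes=28) as pool:     # 28 threads on the test desktop
                for result in pool.imap_unordered(run_epic_cell, jobs, chunksize=1000):
                    pass                         # hand results to the output analyzers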

  17. Mass spectrometer output file format mzML.

    PubMed

    Deutsch, Eric W

    2010-01-01

    Mass spectrometry is an important technique for analyzing proteins and other biomolecular compounds in biological samples. Each of the vendors of these mass spectrometers uses a different proprietary binary output file format, which has hindered data sharing and the development of open source software for downstream analysis. The solution has been to develop, with the full participation of academic researchers as well as software and hardware vendors, an open XML-based format for encoding mass spectrometer output files, and then to write software to use this format for archiving, sharing, and processing. This chapter presents the various components and information available for this format, mzML. In addition to the XML schema that defines the file structure, a controlled vocabulary provides clear terms and definitions for the spectral metadata, and a semantic validation rules mapping file allows the mzML semantic validator to ensure that an mzML document complies with one of several levels of requirements. Complete documentation and example files ensure that the format may be uniformly implemented. At the time of release, there already existed several implementations of the format, and vendors have committed to supporting the format in their products.
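
    Because mzML is plain XML, even the Python standard library can skim spectrum-level metadata, as in the sketch below; production pipelines would normally use a dedicated reader (e.g., pymzML or pyteomics) that also decodes the binary data arrays.

        import xml.etree.ElementTree as ET

        NS = "{http://psi.hupo.org/ms/mzml}"

        def list_ms_levels(path):
            """Yield (spectrum id, ms level) pairs from an mzML file."""
            for _, elem in ET.iterparse(path):
                if elem.tag == NS + "spectrum":
                    # cvParam accession MS:1000511 is "ms level" in the PSI-MS CV.
                    level = next((cv.get("value") for cv in elem.iter(NS + "cvParam")
                                  if cv.get("accession") == "MS:1000511"), None)
                    yield elem.get("id"), level
                    elem.clear()               # keep memory flat on large files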

  18. VizieR Online Data Catalog: Habitable zones around main-sequence stars (Kopparapu+, 2014)

    NASA Astrophysics Data System (ADS)

    Kopparapu, R. K.; Ramirez, R. M.; Schottelkotte, J.; Kasting, J. F.; Domagal-Goldman, S.; Eymet, V.

    2017-08-01

    Language: Fortran 90 Code tested under the following compilers/operating systems: ifort/CentOS linux Description of input data: No input necessary. Description of output data: Output files: HZs.dat, HZ_coefficients.dat System requirements: No major system requirement. Fortran compiler necessary. Calls to external routines: None. Additional comments: None (1 data file).

  19. Modifications to the accuracy assessment analysis routine SPATL to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    SPATL is an analysis program in the Accuracy Assessment Software System which makes comparisons between ground truth information and dot labeling for an individual segment. In order to facilitate the aggregation of this information, SPATL was modified to produce a disk output file containing the necessary information about each segment.

  20. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
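
    The core step the tools automate can be illustrated as follows; this is a sketch, not the authors' code, and the record layout and tributary-area bookkeeping are assumptions.

        def adhesive_stresses(spring_forces, tributary_areas):
            """spring_forces: {spring_id: (shear_force, peel_force)},
            tributary_areas: {spring_id: adhesive area carried by that spring}.
            Returns (spring_id, shear_stress, peel_stress) rows."""
            rows = [(sid, fs / tributary_areas[sid], fp / tributary_areas[sid])
                    for sid, (fs, fp) in spring_forces.items()]
            # Sort by shear stress, largest first, as trade-off studies need.
            return sorted(rows, key=lambda r: r[1], reverse=True)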

  1. NEAMS-IPL MOOSE Midyear Framework Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Permann, Cody; Alger, Brian; Peterson, John

    The MOOSE Framework is a modular pluggable framework for building complex simulations. The ability to add new objects with custom syntax is a core capability that makes MOOSE a powerful platform for coupling multiple applications together within a single environment. The creation of a new, more standardized JSON syntax output improves the external interfaces for generating graphical components or for validating input file syntax. The design of this interface and the requirements it satisfies are covered in this short report.
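
    As a hypothetical illustration of what a machine-readable syntax dump enables, the sketch below flags unknown parameters in an input block; the JSON layout shown is an assumption for the example, not MOOSE's actual schema.

        import json

        def unknown_params(syntax_json_path, block, params):
            """Return the names in `params` not present in the syntax dump."""
            with open(syntax_json_path) as f:
                syntax = json.load(f)
            known = set(syntax.get(block, {}).get("parameters", {}))
            return [p for p in params if p not in known]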

  2. AMPS/PC - AUTOMATIC MANUFACTURING PROGRAMMING SYSTEM

    NASA Technical Reports Server (NTRS)

    Schroer, B. J.

    1994-01-01

    The AMPS/PC system is a simulation tool designed to aid the user in defining the specifications of a manufacturing environment and then automatically writing code for the target simulation language, GPSS/PC. The domain of problems that AMPS/PC can simulate are manufacturing assembly lines with subassembly lines and manufacturing cells. The user defines the problem domain by responding to the questions from the interface program. Based on the responses, the interface program creates an internal problem specification file. This file includes the manufacturing process network flow and the attributes for all stations, cells, and stock points. AMPS then uses the problem specification file as input for the automatic code generator program to produce a simulation program in the target language GPSS. The output of the generator program is the source code of the corresponding GPSS/PC simulation program. The system runs entirely on an IBM PC running PC DOS Version 2.0 or higher and is written in Turbo Pascal Version 4 requiring 640K memory and one 360K disk drive. To execute the GPSS program, the PC must have resident the GPSS/PC System Version 2.0 from Minuteman Software. The AMPS/PC program was developed in 1988.

  3. Pattern Generator for Bench Test of Digital Boards

    NASA Technical Reports Server (NTRS)

    Berkun, Andrew C.; Chu, Anhua J.

    2012-01-01

    All efforts to develop electronic equipment reach a stage where they need a board test station for each board. The SMAP digital system consists of three board types that interact with each other using interfaces with critical timing. Each board needs to be tested individually before combining into the integrated digital electronics system. Each board needs critical timing signals from the others to be able to operate. A bench test system was developed to support test of each board. The test system produces all the outputs of the control and timing unit, and is delivered much earlier than the timing unit. Timing signals are treated as data. A large file is generated containing the state of every timing signal at any instant. This file is streamed out to an IO card, which is wired directly to the device-under-test (DUT) input pins. This provides a flexible test environment that can be adapted to any of the boards required to test in a standalone configuration. The problem of generating the critical timing signals is then transferred from a hardware problem to a software problem where it is more easily dealt with.

  4. Fallon, Nevada FORGE Thermal-Hydrological-Mechanical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blankenship, Doug; Sonnenthal, Eric

    Archive contains thermal-mechanical simulation input/output files. Included are files which fall into the following categories: (1) spreadsheets with various input parameter calculations; (2) final simulation inputs; (3) native-state thermal-hydrological model input file folders; (4) native-state thermal-hydrological-mechanical model input files; (5) THM model stimulation cases. See 'File Descriptions.xlsx' resource below for additional information on individual files.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myronakis, M; Cai, W; Dhou, S

    Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.

  6. IGES transformer and NURBS in grid generation

    NASA Technical Reports Server (NTRS)

    Yu, Tzu-Yi; Soni, Bharat K.

    1993-01-01

    In the fields of grid generation and CAD/CAM, there are numerous geometry output formats which require the designer to spend a great deal of time manipulating geometrical entities in order to achieve a useful sculptured geometrical description for grid generation. In this process, there is also a danger of losing fidelity of the geometry under consideration. This stresses the importance of a standard geometry definition for the communication link between varying CAD/CAM and grid systems. The IGES (Initial Graphics Exchange Specification) file is a widely used communication format between CAD/CAM and analysis tools. Scientists at NASA research centers - including NASA Ames, NASA Langley, NASA Lewis, and NASA Marshall - have recognized this importance, and in 1992 they therefore formed the 'NASA-IGES' committee, which defines a subset of the standard IGES. This committee stresses the importance of, and encourages the CFD community to use, the standard IGES file for the interface between CAD/CAM and CFD analysis. Two of the IGES entities - the NURBS curve (Entity 126) and the NURBS surface (Entity 128) - have many useful geometric properties, such as the convex hull property, local control, and affine invariance; moreover, widely utilized analytical geometries can be accurately represented using NURBS. This is important in today's grid generation tools because of the emphasis on interactive design. To support geometry transformation between the CAD/CAM system and the grid generation field, CAGI (Computer Aided Geometry Design) was developed, which includes geometry transformation, geometry manipulation, and geometry generation as well as the user interface. This paper presents the successful development of an IGES file transformer and the application of the NURBS definition in grid generation.
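
    The local-control and convex-hull properties mentioned above stem from the B-spline basis underlying the NURBS entities. A minimal sketch of the Cox-de Boor recursion is given below; a real IGES translator would read the knot vectors, weights, and control points from the parameter data of Entities 126/128.

        def basis(i, p, u, knots):
            """B-spline basis function N_{i,p}(u) on the given knot vector."""
            if p == 0:
                return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
            left = right = 0.0
            if knots[i + p] != knots[i]:        # guard repeated knots (0/0 := 0)
                left = ((u - knots[i]) / (knots[i + p] - knots[i])
                        * basis(i, p - 1, u, knots))
            if knots[i + p + 1] != knots[i + 1]:
                right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                         * basis(i + 1, p - 1, u, knots))
            return left + right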

  7. User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.

    1993-01-01

    The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and the output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program or it can be called from within HGUI, a graphical user interface that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and type-in fields in HGUI through which users enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.

  8. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program, originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Before the data are acceptable for archival functions, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.

  9. Equivalent Longitudinal Area Distributions of the B-58 and XB-70-1 Airplanes for Use in Wave Drag and Sonic Boom Calculations

    NASA Technical Reports Server (NTRS)

    Tinetti, Ana F.; Maglieri, Domenic J.; Driver, Cornelius; Bobbitt, Percy J.

    2011-01-01

    A detailed geometric description, in wave drag format, has been developed for the Convair B-58 and North American XB-70-1 delta wing airplanes. These descriptions have been placed on electronic files, the contents of which are described in this paper. They are intended for use in wave drag and sonic boom calculations. Included in the electronic files and in the present paper are photographs and 3-view drawings of the two airplanes, tabulated geometric descriptions of each vehicle and its components, and comparisons of the electronic file outputs with existing data. The comparisons include a pictorial of the two airplanes based on the present geometric descriptions, and cross-sectional area distributions for both the normal Mach cuts and the oblique Mach cuts above and below the vehicles. Good correlation exists between the area distributions generated in the late 1950s and 1960s and the present files. The availability of these electronic files facilitates further validation of sonic boom prediction codes through the use of two existing databases on these airplanes, which were acquired in the 1960s and have not been fully exploited.

  10. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
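
    As an illustration of the kind of statistic such a tool computes, here is a small Python sketch (EFASC itself is Visual Basic for Applications) of an annual 7-day low-flow calculation; the column names and CSV layout are assumptions, and rows are assumed to be in chronological order.

        import csv
        import statistics
        from collections import defaultdict

        def annual_seven_day_low_flow(path):
            """Minimum 7-day mean flow per calendar year from a date,flow CSV."""
            flows = defaultdict(list)
            with open(path, newline="") as f:
                for row in csv.DictReader(f):   # expects 'date' (YYYY-MM-DD), 'flow'
                    flows[row["date"][:4]].append(float(row["flow"]))
            out = {}
            for year, q in flows.items():
                if len(q) >= 7:                 # need at least one full window
                    out[year] = min(statistics.mean(q[i:i + 7])
                                    for i in range(len(q) - 6))
            return out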

  11. ModelArchiver—A program for facilitating the creation of groundwater model archives

    USGS Publications Warehouse

    Winston, Richard B.

    2018-03-01

    ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step-by-step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions. Descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which also is described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.

  12. Computer Aided Design of Polyhedron Solids to Model Air in Com-Geom Descriptions

    DTIC Science & Technology

    1983-08-01

    "The GIFT Code User Manual, Volume I, Introduction and Input Requirements," BRL Report No. 1802, July 1975 (Unclassified). (AD B0060Z7LK 2G...Kuehl, L. Bain and M. Reisinger, "The GIFT Code User Manual, Volume II, The Output Options," BRL Report ARBRL-TR-02189, September 1979...is generated from the GIFT code under option XSECT. This option produces plot files which define cross-sectional views of the COM-GEOM

  13. Development of a Low-Latency, High Data Rate, Differential GPS Relative Positioning System for UAV Formation Flight Control

    DTIC Science & Technology

    2006-09-01

    spiral development cycle involved transporting the software processes from a Windows XP / MATLAB environment to a Linux / C++ environment. This...tested on. Additionally, in the case of the GUMSTIX PC boards, the LINUX operating system is burned into the read-only memory. Lastly, both PC-104 and...both the real-time environment and the post-processed environment. When the system operates in real-time mode, an output file is generated which

  14. Web-based Toolkit for Dynamic Generation of Data Processors

    NASA Astrophysics Data System (ADS)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets, and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows users to define various data structures; in the next step, they can select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists, as shown in the sketch after this paragraph. The application is also designed to store all data structures and mappings defined by a user (an author), and to allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
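
    The mapping idea at the toolkit's core can be sketched compactly: a user-defined mapping drives a generated processor that reads one format and writes another. The mapping schema below is hypothetical, chosen only to illustrate direct mappings alongside a transform function.

        import csv
        import json

        def make_processor(mapping):
            """mapping: {output_field: function(input_record) -> value}."""
            def process(in_path, out_path):
                with open(in_path, newline="") as fin, open(out_path, "w") as fout:
                    records = [{k: fn(row) for k, fn in mapping.items()}
                               for row in csv.DictReader(fin)]
                    json.dump(records, fout, indent=2)   # CSV in, JSON out
            return process

        # Example: a direct mapping plus a unit-conversion transform.
        processor = make_processor({
            "site": lambda r: r["station_id"],
            "flow_cms": lambda r: float(r["flow_cfs"]) * 0.0283168,
        })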

  15. Recognition of Computer Viruses by Detecting Their Gene of Self Replication

    DTIC Science & Technology

    2006-03-01

    ...etection Approach ... 1.4.1 The syntactic analysis m...Therefore a group of instructions acting together in the right order has to be identified for the gene of self-replication to be obvious in a...its first system call NtCreateFile, while the outputs of NtWriteFile become its output arguments. These four blocks form the final structure - The Gene

  16. DOCU-TEXT: A tool before the data dictionary

    NASA Technical Reports Server (NTRS)

    Carter, B.

    1983-01-01

    DOCU-TEXT, a proprietary software package that aids in the production of documentation for a data processing organization and can be installed and operated only on IBM computers, is discussed. In organizing information that will ultimately reside in a data dictionary, DOCU-TEXT proved to be a useful documentation tool for extracting information from existing production jobs, procedure libraries, system catalogs, control data sets and related files. DOCU-TEXT reads these files to derive data that is useful at the system level. The output of DOCU-TEXT is a series of user-selectable reports. These reports can reflect the interactions within a single job stream, a complete system, or all the systems in an installation. Any single report, or group of reports, can be generated in an independent documentation pass.

  17. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses of the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real-time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS timelines (representing resources available for scheduling) and payload/TDRSS acquisition and loss of sight timelines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  18. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    NASA Technical Reports Server (NTRS)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  19. FROST - FREEDOM OPERATIONS SIMULATION TEST VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Deshpande, G. K.

    1994-01-01

    The Space Station Freedom Information System processes and transmits data between the space station and the station controllers and payload operators on the ground. Components of the system include flight hardware, communications satellites, software and ground facilities. FROST simulates operation of the SSF Information System, tracking every data packet from generation to destination for both uplinks and downlinks. This program collects various statistics concerning the SSF Information System operation and provides reports of these at user-specified intervals. Additionally, FROST has graphical display capability to enhance interpretation of these statistics. FROST models each of the components of the SSF Information System as an object, by which packets are generated, received, processed, transmitted, and/or dumped. The user must provide the information system design with specified parameters and inter-connections among objects. To aid this process, FROST supplies an example SSF Information System for simulation, but this example must be copied before it is changed and used for further simulation. Once specified, system architecture and parameters are put into the input file, named the Test Configuration Definition (TCD) file. Alternative system designs can then be simulated simply by editing the TCD file. Within this file the user can define new objects, alter object parameters, redefine paths, redefine generation rates and windows, and redefine object interconnections. At present, FROST does not model every feature of the SSF Information System, but it is capable of simulating many of the system's important functions. To generate data messages, which can come from any object, FROST defines "windows" to specify when, what kind, and how much of that data is generated. All messages are classified by priority as either (1) emergency, (2) quick look, (3) telemetry, or (4) payload data. These messages are processed by all objects according to priority. That is, all priority 1 (emergency) messages are processed and transmitted before priority 2 messages, and so forth. FROST also allows for specification of "pipeline" or "direct" links. Pipeline links are used to broadcast at constant intervals, while direct links transmit messages only when packets are ready for transmission. FROST allows the user substantial flexibility to customize output for a simulation. Output consists of tables and graphs, as specified in the TCD file, to be generated at the specified interval. These tables may be generated at short intervals during the run to produce snapshots as simulation proceeds, or generated after the run to give a summary of the entire run. FROST is written in SIMSCRIPT II.5 (developed by CACI) for DEC VAX series computers running VMS. FROST was developed on a VAX 8700 and is intended to be run on large VAXes with at least 32Mb of memory. The main memory requirement for FROST is dependent on the number of processors used in the simulation and the event time. The standard distribution medium for this package is a 9-track 1600 BPI DEC VAX BACKUP Format Magnetic Tape. An executable is included on the tape in addition to the source code. FROST was developed in 1990 and is a copyrighted work with all copyright vested in NASA. DEC, VAX and VMS are registered trademarks of Digital Equipment Corporation. IBM PC is a trademark of International Business Machines. SIMSCRIPT II.5 is a trademark of CACI.
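
    The strict priority discipline described above maps naturally onto a priority queue; a minimal Python sketch follows (illustrative only, since FROST itself is written in SIMSCRIPT II.5).

        import heapq
        import itertools

        class MessageQueue:
            """Packets dequeue in priority order (1 = emergency first);
            a sequence counter keeps FIFO order within a priority level."""
            def __init__(self):
                self._heap, self._seq = [], itertools.count()

            def put(self, priority, packet):
                heapq.heappush(self._heap, (priority, next(self._seq), packet))

            def get(self):
                return heapq.heappop(self._heap)[2]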

  20. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor-provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm^2 were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from the phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well, with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm^2 were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  1. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs.

    PubMed

    Rodrigues, Anna; Sawkey, Daren; Yin, Fang-Fang; Wu, Qiuwen

    2015-05-01

    To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm(2) were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm(2) were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  2. Construct User Guide

    DTIC Science & Technology

    2012-11-01

    validation using calibrated grounding. In 2007 BRIMS Conference Proceedings, Norfolk, VA. Simon, H. A. (1957). Administrative Behavior: A study of...Construct will write the output to the directory specified by the path name. Users should ensure that if they have opened any output files (e.g., in Excel)...open an input file, it will exit and close. There are times when an error message is not presented to the user in this situation! Users should ensure

  3. Program Aids In Printing FORTRAN-Coded Output

    NASA Technical Reports Server (NTRS)

    Akian, Richard A.

    1993-01-01

    FORPRINT is a computer program that prints FORTRAN-coded output files on most non-PostScript printers, with such extra features as control of fonts for Epson and Hewlett-Packard printers. It rewrites the data to the printer, inserting the correct printer-control codes. Alternative uses include the ability to paginate a data or ASCII file during printing by using editing software to insert a "1" in the first column of each data line that should start a new page. Written in FORTRAN 77.
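
    For context, FORTRAN list output reserves column 1 for carriage control. A sketch of the translation such a program performs is given below (Python, covering only the common control codes; '+' overprint is omitted).

        def translate(lines):
            """Map FORTRAN carriage-control codes to printer bytes:
            '1' = new page, '0' = double space, anything else = single space."""
            for line in lines:
                ctl, text = (line[0], line[1:]) if line else (" ", "")
                if ctl == "1":
                    yield "\f" + text          # form feed starts a new page
                elif ctl == "0":
                    yield "\n" + text          # extra blank line
                else:
                    yield text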

  4. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code to diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.

  5. Conjoin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Gregory

    2010-08-06

    Conjoin is a code for joining multiple exodusII database files sequentially in time. It is used to create a single results or restart file from multiple results or restart files, which typically arise as the result of multiple restarted analyses. The resulting output file will be the union of the input files, with a status variable indicating the status of each element at the various time planes. Applications include combining multiple exodusII files arising from a restarted analysis, or from a finite element analysis with dynamic topology changes.

  6. Shuttle Data Center File-Processing Tool in Java

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Miller, Walter H.

    2006-01-01

    A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples, and then transforms the results into a chosen output format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
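
    The interleaving step amounts to a k-way merge of time-sorted streams, which can be sketched in a few lines (shown in Python for consistency with the other sketches in this compilation; the tuple layout is an assumption).

        import heapq

        def interleave(*streams):
            """Merge time-sorted streams of (timestamp, channel, value)
            tuples into one stream ordered by timestamp."""
            yield from heapq.merge(*streams, key=lambda s: s[0])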

  7. MDplot: Visualise Molecular Dynamics.

    PubMed

    Margreitter, Christian; Oostenbrink, Chris

    2017-05-10

    The MDplot package provides plotting functions to allow for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The graphs that are supported range from the standard, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to the less standard, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing are separated in all cases, i.e. the respective functions can be used independently. Thus, data manipulation and the integration of additional file formats are fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license.

  8. Software for Managing Personal Files.

    ERIC Educational Resources Information Center

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  9. SNPmplexViewer--toward a cost-effective traceability system

    PubMed Central

    2011-01-01

    Background Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using SNaPshot, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the expense of genotyping a panel of SNPs useful for identity control. Findings To further decrease SNaPshot's cost, we introduced the Perl script SNPmplexViewer, which facilitates the analysis of trace files for reactions performed without the use of fluorescent size standards. SNPmplexViewer automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. SNPmplexViewer produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. SNPmplexViewer also outputs aligned images of the two electropherograms together with a difference profile. Conclusions Modified trace files generated by SNPmplexViewer enable genotyping of SNaPshot reactions performed without fluorescent size standards, using common fragment-sizing software packages. SNPmplexViewer's normalised output may also improve the genotyping software's performance. Thus, SNPmplexViewer is a general free tool enabling the reduction of SNaPshot's cost as well as the fast viewing and comparing of trace electropherograms for fragment analysis. SNPmplexViewer is available at http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi. PMID:21600063
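
    The alignment step can be illustrated with a simple cross-correlation sketch (Python rather than the tool's Perl; the normalisation shown is an assumption for the example).

        import numpy as np

        def best_offset(reference, target):
            """Offset that best aligns a target electropherogram trace with
            a reference trace, found by normalized cross-correlation."""
            ref = (reference - reference.mean()) / reference.std()
            tgt = (target - target.mean()) / target.std()
            corr = np.correlate(ref, tgt, mode="full")
            return int(corr.argmax() - (len(tgt) - 1))   # shift to apply to target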

  10. PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR

    NASA Technical Reports Server (NTRS)

    Otte, N. E.

    1994-01-01

    PATSTAGS translates PATRAN finite element model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It supports translation of nodal constraint, nodal, element, force, and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file, and a STAGS pressure data file. The user provides the name of the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.

  11. User’s Guide. To the Federal Insurance Administration’s 1978-1979 Flood Claims File for Computation of Depth-Damage Relationships.

    DTIC Science & Technology

    1981-12-01

    reading a file either saved in a previous session or created as a result of the internal execution save file (described later). LOAD PFN LOADS...command is used to make new data retrievals. READ PFN DIRECT ENTRY FROM A PREVIOUSLY SAVED FILE This command bypasses the conventional terminal entry by...INTERNAL SAVE FILE This command accesses a file created using the internal execution save file output option. Loading a file results in entering the

  12. CARE3MENU- A CARE III USER FRIENDLY INTERFACE

    NASA Technical Reports Server (NTRS)

    Pierce, J. L.

    1994-01-01

    CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, and nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of the number of modules, the minimum number of modules for stage operation, and the critical fault threshold. The fault handling and fault occurrence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical-pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run-time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: 1) a user's guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and 2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.

  13. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  14. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for ceramic and polymer matrix composites (CMCs and PMCs), visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte Carlo simulations to enable probabilistic simulations with minimal manual intervention. This document is formatted to provide MATLAB source files and descriptions of how to utilize them. It is assumed that the user has a basic understanding of how MATLAB scripts work and some MATLAB programming experience.

  15. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application for image processing of electron micrographs. Ordinarily, Eos works only through character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be expert at image processing of electron micrographs and to have some knowledge of computer science as well; however, not everyone who needs Eos is an expert in CUI use. We therefore extended Eos to an OS-independent web system with graphical user interfaces (GUI) by integrating a web browser. The advantage of using a web browser is not only extending Eos with a GUI, but also extending Eos to work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interface through an interface definition file, "OptionControlFile", written in CSV (Comma-Separated Value) format: each command has an "OptionControlFile", which holds the information needed for interface generation and describes its usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic client-side functions were implemented properly and supply auto-generation of web forms with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command and carry out image analysis. Two problems remained, concerning image file formats for visualization and the workspace for analysis: the image file format information is useful for checking whether an input/output file is correct, and a common workspace for analysis is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of Eos's OptionControlFile. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform under development that works in heterogeneous distributed environments. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions, which define workflows of image processing. PIONE runs each image-processing step on a suitable computer, following the defined rules. PIONE has interactive-manipulation capability, so a user can try a command with various setting values. In this situation, we contribute auto-generation of a GUI for a PIONE workflow. As an advanced function, we have developed a module to log user actions. The logs include information such as the setting values used in image processing, the sequence of commands, and so on. Used effectively, these logs offer many advantages: for example, when an expert discovers some know-how of image processing, other users can share the logs containing that know-how, and by analyzing logs we may obtain recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have developed the system infrastructure as well.

  16. Trick Simulation Environment 07

    NASA Technical Reports Server (NTRS)

    Lin, Alexander S.; Penn, John M.

    2012-01-01

    The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto-documentation in XML. The software is also capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging, and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. There is also a framework in place for optimization and solution finding, where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto-documentation. For source code files, XML tags for each function and its calling arguments are stored in the auto-documentation. When a simulation is built, a top-level XML file, which includes all of the header and source code XML auto-documentation files, is created in the simulation directory. Trick 07 provides an XML-to-TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.

  17. Engineering description of the ascent/descent bet product

    NASA Technical Reports Server (NTRS)

    Seacord, A. W., II

    1986-01-01

    The Ascent/Descent output product is produced in the OPIP routine from three files which constitute its input. One of these, OPIP.IN, contains mission specific parameters. Meteorological data, such as atmospheric wind velocities, temperatures, and density, are obtained from the second file, the Corrected Meteorological Data File (METDATA). The third file is the TRJATTDATA file which contains the time-tagged state vectors that combine trajectory information from the Best Estimate of Trajectory (BET) filter, LBRET5, and Best Estimate of Attitude (BEA) derived from IMU telemetry. Each term in the two output data files (BETDATA and the Navigation Block, or NAVBLK) is defined. The description of the BETDATA file includes an outline of the algorithm used to calculate each term. To facilitate describing the algorithms, a nomenclature is defined. The description of the nomenclature includes a definition of the coordinate systems used. The NAVBLK file contains navigation input parameters. Each term in NAVBLK is defined and its source is listed. The production of NAVBLK requires only two computational algorithms. These two algorithms, which compute the terms DELTA and RSUBO, are described. Finally, the distribution of data in the NAVBLK records is listed.

  18. Artificial neural networks for modeling ammonia emissions released from sewage sludge composting

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Dach, J.; Pilarski, K.; Piekarska-Boniecka, H.

    2012-09-01

    The project was designed to develop, test and validate an original Neural Model describing ammonia emissions generated in composting sewage sludge. The composting mix was to include the addition of such selected structural ingredients as cereal straw, sawdust and tree bark. All created neural models contain 7 input variables (chemical and physical parameters of composting) and 1 output (ammonia emission). The α data file was subdivided into three subfiles: the learning file (ZU) containing 330 cases, the validation file (ZW) containing 110 cases and the test file (ZT) containing 110 cases. The standard deviation ratios (for all 4 created networks) ranged from 0.193 to 0.218. For all of the selected models, the correlation coefficient reached the high values of 0.972-0.981. The results show that the predictive neural model describing ammonia emissions from composted sewage sludge is well suited for assessing such emissions. The sensitivity analysis of the model for the input variables of the process in question has shown that the key parameters describing ammonia emissions released in composting sewage sludge are pH and the carbon-to-nitrogen ratio (C:N).
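
    A minimal sketch of the data partition used above (550 cases split 330/110/110 into the ZU, ZW and ZT subfiles), assuming a simple random shuffle; the actual assignment procedure is not stated in the abstract.

      import random

      cases = list(range(550))           # stand-ins for the 550 composting cases
      random.seed(42)                    # reproducible shuffle
      random.shuffle(cases)
      zu, zw, zt = cases[:330], cases[330:440], cases[440:]
      print(len(zu), len(zw), len(zt))   # -> 330 110 110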

  19. eF-seek: prediction of the functional sites of proteins by searching for similar electrostatic potential and molecular surface shape.

    PubMed

    Kinoshita, Kengo; Murakami, Yoichi; Nakamura, Haruki

    2007-07-01

    We have developed a method to predict ligand-binding sites in a new protein structure by searching for similar binding sites in the Protein Data Bank (PDB). The similarities are measured according to the shapes of the molecular surfaces and their electrostatic potentials. A new web server, eF-seek, provides an interface to our search method. It simply requires a coordinate file in the PDB format, and generates a prediction result as a virtual complex structure, with the putative ligands in a PDB format file as the output. In addition, the predicted interacting interface is displayed to facilitate the examination of the virtual complex structure on our own applet viewer with the web browser (URL: http://eF-site.hgc.jp/eF-seek).

  20. Information retrieval and display system

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; King, W. L.

    1977-01-01

    Versatile command-driven data management system offers users, through simplified command language, a means of storing and searching data files, sorting data files into specified orders, performing simple or complex computations, effecting file updates, and printing or displaying output data. Commands are simple to use and flexible enough to meet most data management requirements.

  1. Automatic Residential/Commercial Classification of Parcels with Solar Panel Detections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; Omitaomu, Olufemi A; Kotikot, Susan

    A computational method to automatically detect solar panels on rooftops to aid policy and financial assessment of solar distributed generation. The code automatically classifies parcels containing solar panels in the U.S. as residential or commercial. The code allows the user to specify an input dataset containing parcels and detected solar panels, and then uses information about the parcels and solar panels to automatically classify the rooftops as residential or commercial using machine learning techniques. The zip file containing the code includes sample input and output datasets for the Boston and DC areas.
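
    As a hedged sketch of the classification step, the snippet below trains a scikit-learn classifier on tabular parcel features; the feature columns and toy values are invented for illustration and are not the sample dataset shipped with the code.

      from sklearn.ensemble import RandomForestClassifier

      # Columns (hypothetical): building area (ft^2), stories, panel-array area (m^2).
      X = [[1500, 1, 12.0], [84000, 6, 310.0], [2100, 1, 18.5],
           [56000, 4, 240.0], [1800, 2, 9.0], [120000, 8, 500.0]]
      y = ["residential", "commercial", "residential",
           "commercial", "residential", "commercial"]

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(clf.predict([[2500, 2, 20.0]]))   # -> ['residential'] on these toy data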

  2. DITTY - a computer program for calculating population dose integrated over ten thousand years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.

    The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.

  3. Data handling with SAM and art at the NO vA experiment

    DOE PAGES

    Aurisano, A.; Backhouse, C.; Davies, G. S.; ...

    2015-12-23

    During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this study we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment.
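
    The metadata-propagation idea can be sketched in a few lines of Python; the keys and JSON container below are illustrative stand-ins, not SAM's actual schema or the art framework's file format.

      import json

      def process(config, input_files, output_file):
          meta = dict(config["metadata"])   # propagate metadata embedded in the config
          meta["parents"] = input_files     # record parentage for dataset bookkeeping
          with open(output_file, "w") as f:
              json.dump({"metadata": meta, "events": []}, f)

      config = {"metadata": {"data_tier": "reco", "run": 12345}}
      process(config, ["raw_r12345_s01.art"], "reco_r12345_s01.json")
      with open("reco_r12345_s01.json") as f:
          print(json.load(f)["metadata"]["parents"])   # -> ['raw_r12345_s01.art']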

  4. Persistence of Antibiotic Resistance Plasmids in Biofilms

    DTIC Science & Technology

    2014-10-01

    # ...from a particular file; e.g., here everything from "data1.xls" will have the identifier "Rep1"
    r1 <- collate(ls(patt="*rep1"), TRUE, lastVal="Rep1")
    r2 <- collate(ls(patt="*rep2"), TRUE, lastVal="Rep2")
    # merge the data from the two files
    allOutput <- rbind(r1, r2)
    # at this point allOutput can be used

  5. RCHILD - an R-package for flexible use of the landscape evolution model CHILD

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2014-05-01

    Landscape evolution models provide powerful approaches to numerically assess earth surface processes, to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most-used models of landscape change, particularly in the context of interacting tectonic and geomorphologic processes. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help to use CHILD in a flexible, dynamic and user-friendly way. The included functions allow creating maps, real-time scenes, animations and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported to standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples help to illustrate the great potential of numeric modelling of geomorphologic processes.

  6. “One code to find them all”: a perl tool to conveniently parse RepeatMasker output files

    PubMed Central

    2014-01-01

    Background: Of the different bioinformatic methods used to recover transposable elements (TEs) in genome sequences, one of the most commonly used procedures is the homology-based method proposed by the RepeatMasker program. RepeatMasker generates several output files, including the .out file, which provides annotations for all detected repeats in a query sequence. However, a remaining challenge consists of identifying the different copies of TEs that correspond to the identified hits. This step is essential for any evolutionary/comparative analysis of the different copies within a family. Different possibilities can lead to multiple hits corresponding to a unique copy of an element, such as the presence of large deletions/insertions or undetermined bases, and distinct consensus corresponding to a single full-length sequence (like for long terminal repeat (LTR)-retrotransposons). These possibilities must be taken into account to determine the exact number of TE copies. Results: We have developed a perl tool that parses the RepeatMasker .out file to better determine the number and positions of TE copies in the query sequence, in addition to computing quantitative information for the different families. To determine the accuracy of the program, we tested it on several RepeatMasker .out files corresponding to two organisms (Drosophila melanogaster and Homo sapiens) for which the TE content has already been largely described and which present great differences in genome size, TE content, and TE families. Conclusions: Our tool provides access to detailed information concerning the TE content in a genome at the family level from the .out file of RepeatMasker. This information includes the exact position and orientation of each copy, its proportion in the query sequence, and its quality compared to the reference element. In addition, our tool allows a user to directly retrieve the sequence of each copy and obtain the same detailed information at the family level when a local library with incomplete TE class/subclass information was used with RepeatMasker. We hope that this tool will be helpful for people working on the distribution and evolution of TEs within genomes.
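
    The copy-reconstruction problem can be illustrated with a simplified Python sketch: consecutive hits on the same strand that share a family and are nearly contiguous are merged into one copy. The tuple layout and gap threshold are assumptions for illustration, not the tool's actual rules.

      def merge_hits(hits, max_gap=100):
          # hits: (query_start, query_end, strand, family), processed by query_start
          copies = []
          for hit in sorted(hits):
              if copies:
                  qs, qe, strand, family = copies[-1]
                  if hit[3] == family and hit[2] == strand and hit[0] - qe <= max_gap:
                      copies[-1] = (qs, max(qe, hit[1]), strand, family)  # extend copy
                      continue
              copies.append(hit)
          return copies

      hits = [(100, 480, "+", "LTR5_DM"), (530, 900, "+", "LTR5_DM"),
              (5000, 5600, "-", "jockey")]
      print(merge_hits(hits))   # the first two hits collapse into a single copy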

  7. No3CoGP: non-conserved and conserved coexpressed gene pairs.

    PubMed

    Mal, Chittabrata; Aftabuddin, Md; Kundu, Sudip

    2014-12-08

    Analyzing the microarray data of different conditions, one can identify conserved and condition-specific genes and gene modules, and thus infer the underlying cellular activities. The available tools based on Bioconductor and R packages differ in how they extract differential coexpression and at what level they study it. There is a need for a user-friendly, flexible tool which can start analysis from raw or preprocessed microarray data and can report different levels of useful information. We present a GUI software, No3CoGP: Non-Conserved and Conserved Coexpressed Gene Pairs, which takes Affymetrix microarray data (.CEL files or log2-normalized .txt files) along with an annotation file (.csv file), a Chip Definition File (CDF file) and a probe file as inputs, and utilizes the concept of a network density cut-off and Fisher's z-test to extract biologically relevant information. It can identify four possible types of gene pairs based on their coexpression relationships: (i) gene pairs showing coexpression in one condition but not in the other, (ii) gene pairs positively coexpressed in one condition but negatively coexpressed in the other, and gene pairs (iii) positively or (iv) negatively coexpressed in both conditions. Further, it can generate modules of coexpressed genes. The easy-to-use GUI enables researchers without knowledge of the R language to use No3CoGP. Utilization of one or more CPU cores, depending on availability, speeds up the program. The output files, stored in the respective directories under the user-defined project, allow researchers to unravel condition-specific functionalities of genes, gene sets or modules.
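
    The Fisher z-test at the core of the comparison can be sketched with the standard library alone; the correlations and sample sizes below are invented for illustration.

      import math

      def fisher_z_test(r1, n1, r2, n2):
          z1, z2 = math.atanh(r1), math.atanh(r2)        # Fisher transformation
          se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
          z = (z1 - z2) / se
          p = math.erfc(abs(z) / math.sqrt(2))           # two-sided p-value
          return z, p

      # A gene pair strongly coexpressed in condition 1 but not in condition 2:
      z, p = fisher_z_test(r1=0.85, n1=40, r2=0.10, n2=40)
      print(f"z = {z:.2f}, p = {p:.2e}")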

  8. Modifications of the U.S. Geological Survey modular, finite-difference, ground-water flow model to read and write geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.; McGrath, Timothy S.

    1992-01-01

    This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring data. Programs such as GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps and transfers data directly to and from the ground-water-flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.

  9. BIREFRINGENT FILTER MODEL

    NASA Technical Reports Server (NTRS)

    Cross, P. L.

    1994-01-01

    Birefringent filters are often used as line-narrowing components in solid state lasers. The Birefringent Filter Model program generates a stand-alone model of a birefringent filter for use in designing and analyzing a birefringent filter. It was originally developed to aid in the design of solid state lasers to be used on aircraft or spacecraft to perform remote sensing of the atmosphere. The model is general enough to allow the user to address problems such as temperature stability requirements, manufacturing tolerances, and alignment tolerances. The input parameters for the program are divided into 7 groups: 1) general parameters which refer to all elements of the filter; 2) wavelength related parameters; 3) filter, coating and orientation parameters; 4) input ray parameters; 5) output device specifications; 6) component related parameters; and 7) transmission profile parameters. The program can analyze a birefringent filter with up to 12 different components, and can calculate the transmission and summary parameters for multiple passes as well as a single pass through the filter. The Jones matrix, which is calculated from the input parameters of Groups 1 through 4, is used to calculate the transmission. Output files containing the calculated transmission or the calculated Jones matrix as a function of wavelength can be created. These output files can then be used as inputs for user-written programs, for example, to plot the transmission or to calculate the eigen-transmittances and the corresponding eigen-polarizations of the Jones matrix. The Birefringent Filter Model is written in Microsoft FORTRAN 2.0. The program format is interactive. It was developed on an IBM PC XT equipped with an 8087 math coprocessor, and has a central memory requirement of approximately 154K. Since Microsoft FORTRAN 2.0 does not support complex arithmetic, matrix routines for addition, subtraction, and multiplication of complex, double precision variables are included. The Birefringent Filter Model was written in 1987.
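
    The transmission calculation via the Jones matrix can be illustrated with a short numpy sketch for a single birefringent plate between parallel polarizers. The plate thickness, birefringence (taken as dispersion-free) and orientation are placeholder values, and the sketch ignores the coatings, multiple components and multiple passes handled by the full model.

      import numpy as np

      def rot(theta):
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s], [s, c]])

      def retarder(delta, theta):
          # Jones matrix of a retarder with retardance delta, fast axis at angle theta
          w = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
          return rot(theta) @ w @ rot(-theta)

      polarizer_x = np.array([[1.0, 0.0], [0.0, 0.0]])
      d, dn, theta = 2.0e-3, 0.009, np.pi / 4   # thickness (m), birefringence, angle

      for lam in np.linspace(1.000e-6, 1.010e-6, 5):   # wavelength (m)
          delta = 2 * np.pi * dn * d / lam
          jones = polarizer_x @ retarder(delta, theta) @ polarizer_x
          e_out = jones @ np.array([1.0, 0.0])
          print(f"{lam * 1e9:8.2f} nm   T = {abs(e_out[0]) ** 2:.3f}")   # cos^2(delta/2)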

  10. 77 FR 26791 - Records Schedules; Availability and Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ...-374- 09-7, 1 item, 1 temporary item). Master files of an electronic information system containing...-2012-0006, 1 item, 1 temporary item). Master files of an electronic information system used to document...-0001, 2 items, 2 temporary items). Master files and outputs of an electronic information system used to...

  11. Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual

    NASA Technical Reports Server (NTRS)

    Emmen, R. D.; Tischler, M. B.

    1982-01-01

    The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.

  12. The Invasive Species Forecasting System (ISFS): An iRODS-Based, Cloud-Enabled Decision Support System for Invasive Species Habitat Suitability Modeling

    NASA Technical Reports Server (NTRS)

    Gill, Roger; Schnase, John L.

    2012-01-01

    The Invasive Species Forecasting System (ISFS) is an online decision support system that allows users to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of interest, such as a national park, monument, forest, or refuge. Target customers for ISFS are natural resource managers and decision makers who have a need for scientifically valid, model-based predictions of the habitat suitability of plant species of management concern. In a joint project involving NASA and the Maryland Department of Natural Resources, ISFS has been used to model the potential distribution of Wavyleaf Basketgrass in Maryland's Chesapeake Bay Watershed. Maximum entropy techniques are used to generate predictive maps using predictor datasets derived from remotely sensed data and climate simulation outputs. The workflow to run a model is implemented in an iRODS microservice using a custom ISFS file driver that clips and re-projects data to geographic regions of interest, then shells out to perform MaxEnt processing on the input data. When the model completes, all output files and maps from the model run are registered in iRODS and made accessible to the user. The ISFS user interface is a web browser that uses the iRODS PHP client to interact with the ISFS/iRODS server. ISFS is designed to reside in a VMware virtual machine running SLES 11 and iRODS 3.0. The ISFS virtual machine is hosted in a VMware vSphere private cloud infrastructure to deliver the online service.

  13. C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes

    NASA Astrophysics Data System (ADS)

    Rutter, M. J.

    2018-04-01

    The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis, and manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which would make it easy to extend to work directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats useable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.
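
    The trilinear part of the interpolation applied to volumetric data can be sketched as follows; the periodic wrap reflects how densities on a crystal grid repeat, and the grid here is random stand-in data rather than anything c2x would read.

      import numpy as np

      def trilinear(grid, frac):
          """Interpolate a periodic 3D grid at fractional coordinates frac."""
          n = np.array(grid.shape)
          pos = np.asarray(frac) * n
          i0 = np.floor(pos).astype(int)
          t = pos - i0
          value = 0.0
          for dx in (0, 1):
              for dy in (0, 1):
                  for dz in (0, 1):
                      w = ((t[0] if dx else 1 - t[0]) *
                           (t[1] if dy else 1 - t[1]) *
                           (t[2] if dz else 1 - t[2]))
                      value += w * grid[tuple((i0 + [dx, dy, dz]) % n)]
          return value

      rho = np.random.rand(8, 8, 8)               # stand-in charge-density grid
      print(trilinear(rho, (0.31, 0.62, 0.97)))   # value at a fractional coordinate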

  14. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. Space science models are indispensable tools to facilitate effective use of a wide variety of distributed scientific sources and to place multi-point local measurements into global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly model output analysis library of routines that can be called from any C-supporting language. CCMC is developing data interpolation tools that enable model output to be presented in the same format as observations. CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  15. SSM/OOM - SSM WITH OOM MANIPULATION CODE

    NASA Technical Reports Server (NTRS)

    Goza, S. P.

    1994-01-01

    Creating, animating, and recording solid-shaded and wireframe three-dimensional geometric models can be of great assistance in the research and design phases of product development, in project planning, and in engineering analyses. SSM and OOM are application programs which together allow for interactive construction and manipulation of three-dimensional models of real-world objects as simple as boxes or as complex as Space Station Freedom. The output of SSM, in the form of binary files defining geometric three dimensional models, is used as input to OOM. Animation in OOM is done using 3D models from SSM as well as cameras and light sources. The animated results of OOM can be output to videotape recorders, film recorders, color printers and disk files. SSM and OOM are also available separately as MSC-21914 and MSC-22263, respectively. The Solid Surface Modeler (SSM) is an interactive graphics software application for solid-shaded and wireframe three-dimensional geometric modeling. The program has a versatile user interface that, in many cases, allows mouse input for intuitive operation or keyboard input when accuracy is critical. SSM can be used as a stand-alone model generation and display program and offers high-fidelity still image rendering. Models created in SSM can also be loaded into the Object Orientation Manipulator for animation or engineering simulation. The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray- traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM and SSM are written in C-language for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for each program. The standard distribution medium for this program package is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. These versions of OOM and SSM were released in 1993.

  16. VizieR Online Data Catalog: RefleX : X-ray-tracing code (Paltani+, 2017)

    NASA Astrophysics Data System (ADS)

    Paltani, S.; Ricci, C.

    2017-11-01

    We provide here the RefleX executable, for both Linux and MacOSX, together with the User Manual and an example script file and output file. Running (for instance) reflex_linux will produce the file reflex.out. Note that the results may differ slightly depending on the OS, because of slight differences in some implementations of numerical computations. The differences are scientifically meaningless. (5 data files).

  17. Construct User Guide

    DTIC Science & Technology

    2012-11-01

    interactions in construct: An empirical validation using calibrated grounding. In 2007 BRIMS Conference Proceedings, Norfolk, VA. Simon, H. A...by the path name. Users should ensure that if they have opened any output files (e.g., in Excel to view the files), they should either close the file...stringvars to delimit string variables. Common Gotchas If Construct is unable to open an input file, it will exit and close. There are times when an

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosler, Peter

    Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user's description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a netCDF file is read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. Finally, the linked list is examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage's output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software's ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
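
    The first-stage sector test can be sketched directly from the description above: grid points within a constant geodesic radius of a sector centre are gathered and checked against a storm criterion. The great-circle formula is standard; the vorticity-style threshold and radius are illustrative, not the software's defaults.

      import math

      def geodesic_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dlon = math.radians(lon2 - lon1)
          cos_arc = (math.sin(p1) * math.sin(p2) +
                     math.cos(p1) * math.cos(p2) * math.cos(dlon))
          return r_earth * math.acos(max(-1.0, min(1.0, cos_arc)))

      def sector_max(data, center, radius_km):
          # data: {(lat, lon): value}; maximum value inside the circular sector
          clat, clon = center
          inside = [v for (lat, lon), v in data.items()
                    if geodesic_km(clat, clon, lat, lon) <= radius_km]
          return max(inside, default=None)

      data = {(10.0, 120.0): 3.2e-4, (10.5, 120.5): 8.9e-4, (30.0, 150.0): 1.0e-5}
      peak = sector_max(data, center=(10.0, 120.0), radius_km=500.0)
      print(peak is not None and peak > 5.0e-4)   # storm criterion met -> True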

  19. HBNG: Graph theory based visualization of hydrogen bond networks in protein structures.

    PubMed

    Tiwari, Abhishek; Tiwari, Vivek

    2007-07-09

    HBNG is a graph theory based tool for visualization of hydrogen bond networks in 2D. Digraphs generated by HBNG facilitate visualization of cooperativity and anticooperativity chains and rings in protein structures. HBNG takes hydrogen bond list files (output from HBAT, HBEXPLORE, HBPLUS and STRIDE) as input, generates a DOT language script, and constructs digraphs using the freeware AT&T Graphviz tool. HBNG is useful in the enumeration of favorable topologies of hydrogen bond networks in protein structures and in determining the effect of cooperativity and anticooperativity on protein stability and folding. HBNG can be applied to protein structure comparison and to the identification of secondary structural regions in protein structures. The program is available from the authors for non-commercial purposes.

  20. National RCRA Hazardous Waste Biennial Report Data Files

    EPA Pesticide Factsheets

    The United States Environmental Protection Agency (EPA), in cooperation with the States, biennially collects information regarding the generation, management, and final disposition of hazardous wastes regulated under the Resource Conservation and Recovery Act of 1976 (RCRA), as amended. Collection, validation and verification of the Biennial Report (BR) data is the responsibility of RCRA authorized states and EPA regions. EPA does not modify the data reported by the states or regions. Any questions regarding the information reported for a RCRA handler should be directed to the state agency or region responsible for the BR data collection. BR data are collected every other year (odd-numbered years) and submitted in the following year. The BR data are used to support regulatory activities and provide basic statistics and trends of hazardous waste generation and management. BR data is available to the public through 3 mechanisms. 1. The RCRAInfo website includes data collected from 2001 to present-day (https://rcrainfo.epa.gov/rcrainfoweb/action/main-menu/view). Users of the RCRAInfo website can run queries and output reports for different data collection years at this site. All BR data collected from 2001 to present-day is stored in RCRAInfo, and is accessible through this website. 2. An FTP site allows users to access BR data files collected from 1999 - present day (ftp://ftp.epa.gov/rcrainfodata/). Zip files are available for download directly from this site.

  1. MDplot: Visualise Molecular Dynamics

    PubMed Central

    Margreitter, Christian; Oostenbrink, Chris

    2017-01-01

    The MDplot package provides plotting functions to allow for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where the plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The graphs that are supported range from those which are standard, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to less standard, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing is separated in all cases, i.e. the respective functions can be used independently. Thus, data manipulation and the integration of additional file formats are fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. Availability: The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license. PMID:28845302

  2. GROSS- GAMMA RAY OBSERVATORY ATTITUDE DYNAMICS SIMULATOR

    NASA Technical Reports Server (NTRS)

    Garrick, J.

    1994-01-01

    The Gamma Ray Observatory (GRO) spacecraft will constitute a major advance in gamma ray astronomy by offering the first opportunity for comprehensive observations in the range of 0.1 to 30,000 megaelectronvolts (MeV). The Gamma Ray Observatory Attitude Dynamics Simulator, GROSS, is designed to simulate this mission. The GRO Dynamics Simulator consists of three separate programs: the Standalone Profile Program; the Simulator Program, which contains the Simulation Control Input/Output (SCIO) Subsystem, the Truth Model (TM) Subsystem, and the Onboard Computer (OBC) Subsystem; and the Postprocessor Program. The Standalone Profile Program models the environment of the spacecraft and generates a profile data set for use by the simulator. This data set contains items such as individual external torques; GRO spacecraft, Tracking and Data Relay Satellite (TDRS), and solar and lunar ephemerides; and star data. The Standalone Profile Program is run before a simulation. The SCIO subsystem is the executive driver for the simulator. It accepts user input, initializes parameters, controls simulation, and generates output data files and simulation status display. The TM subsystem models the spacecraft dynamics, sensors, and actuators. It accepts ephemerides, star data, and environmental torques from the Standalone Profile Program. With these and actuator commands from the OBC subsystem, the TM subsystem propagates the current state of the spacecraft and generates sensor data for use by the OBC and SCIO subsystems. The OBC subsystem uses sensor data from the TM subsystem, a Kalman filter (for attitude determination), and control laws to compute actuator commands to the TM subsystem. The OBC subsystem also provides output data to the SCIO subsystem for output to the analysts. The Postprocessor Program is run after simulation is completed. It generates printer and CRT plots and tabular reports of the simulated data at the direction of the user. GROSS is written in FORTRAN 77 and ASSEMBLER and has been implemented on a VAX 11/780 under VMS 4.5. It has a virtual memory requirement of 255k. GROSS was developed in 1986.

  3. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  4. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Kato, Toshio

    Osaka Gas Co., Ltd. constructed the Osaka Gas Technical Information System (OGTIS) in 1979, which stores and retrieves in-house technical information and even provides primary materials by unifying optical disk files, a facsimile system and so on. The major information sources are technical materials, survey materials, planning documents, design materials, research reports and business tour reports, all generated inside the Company. At present the system holds 25,000 items in total, with about 1,000 items added annually. The data file is updated once a month, and the abstract journal OGTIS Report is also output monthly. In 1983 the Company constructed the System for International Exchange of Personal Information (SIP) as a subsystem of OGTIS, in order to compile the SIP database, which covers outlines of exchanges with overseas enterprises and organizations. The database holds 2,600 records in total, with about 500 added annually and monthly data updating.

  5. Translator for Optimizing Fluid-Handling Components

    NASA Technical Reports Server (NTRS)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and- data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation- and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.

  6. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
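
    Remote, standards-based access of the kind described can be sketched in a few lines with the netCDF4 Python library, which opens OPeNDAP URLs directly; the endpoint and variable name below are placeholders, so the snippet runs only against a real server.

      from netCDF4 import Dataset

      url = "http://example.org/thredds/dodsC/model/aggregated_output"  # placeholder
      ds = Dataset(url)                  # OPeNDAP access; no file download needed
      temp = ds.variables["temp"]        # e.g. dimensions (time, depth, lat, lon)
      print(temp.dimensions, temp.shape)
      surface = temp[-1, 0, :, :]        # latest time step, surface layer only
      ds.close()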

  7. Computer aided stress analysis of long bones utilizing computer tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marom, S.A.

    1986-01-01

    A computer aided analysis method, utilizing computed tomography (CT), has been developed which, together with a finite element program, determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of a user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of each pixel within the element borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads determined from existing gait analysis and initial displacements, comprises a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model.
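
    The per-element property assignment can be sketched as below; the linear density calibration and the cubic power law stand in for the empirical relationship the author took from the literature, and all coefficients are placeholders.

      def ct_to_density(hu, a=0.001, b=1.0):
          """Apparent density (g/cm^3) from a CT number, assuming a linear calibration."""
          return a * hu + b

      def density_to_modulus(rho, c=3790.0, p=3.0):
          """Elastic modulus (MPa) from density via a power law E = c * rho**p."""
          return c * rho ** p

      for hu in (200, 800, 1400):   # roughly cancellous -> cortical range
          rho = ct_to_density(hu)
          print(f"HU {hu:5d}: rho = {rho:.2f} g/cm^3, E = {density_to_modulus(rho):9.0f} MPa")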

  8. Broadband Fan Noise Prediction System for Turbofan Engines. Volume 2; BFaNS User's Manual and Developer's Guide

    NASA Technical Reports Server (NTRS)

    Morin, Bruce L.

    2010-01-01

    Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the second volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by step-by-step instructions for installing and running BFaNS. It concludes with technical documentation of the BFaNS computer program.

  9. Broadband Fan Noise Prediction System for Turbofan Engines. Volume 1; Setup_BFaNS User's Manual and Developer's Guide

    NASA Technical Reports Server (NTRS)

    Morin, Bruce L.

    2010-01-01

    Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the first volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by step-by-step instructions for installing and running Setup_BFaNS. It concludes with technical documentation of the Setup_BFaNS computer program.

  10. Broadband Fan Noise Prediction System for Turbofan Engines. Volume 3; Validation and Test Cases

    NASA Technical Reports Server (NTRS)

    Morin, Bruce L.

    2010-01-01

    Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the third volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by validation studies that were done on three fan rigs. It concludes with recommended improvements and additional studies for BFaNS.

  11. DockoMatic: automated peptide analog creation for high throughput virtual screening.

    PubMed

    Jacob, Reed B; Bullock, Casey W; Andersen, Tim; McDougal, Owen M

    2011-10-01

    The purpose of this manuscript is threefold: (1) to describe an update to DockoMatic that allows the user to generate cyclic peptide analog structure files based on Protein Data Bank (PDB) files, (2) to test the accuracy of the peptide analog structure generation utility, and (3) to evaluate the high throughput capacity of DockoMatic. The DockoMatic graphical user interface interfaces with the software program Treepack to create user defined peptide analogs. To validate this approach, DockoMatic produced cyclic peptide analogs were tested for three-dimensional structure consistency and binding affinity against four experimentally determined peptide structure files available in the Research Collaboratory for Structural Bioinformatics database. The peptides used to evaluate this new functionality were alpha-conotoxins ImI, PnIA, and their published analogs. Peptide analogs were generated by DockoMatic and tested for their ability to bind to X-ray crystal structure models of the acetylcholine binding protein originating from Aplysia californica. The results, consisting of more than 300 simulations, demonstrate that DockoMatic predicts the binding energy of peptide structures to within 3.5 kcal mol(-1), and the orientation of bound ligand compares to within 1.8 Å root mean square deviation for ligand structures as compared to experimental data. Evaluation of high throughput virtual screening capacity demonstrated that DockoMatic can collect, evaluate, and summarize the output of 10,000 AutoDock jobs in less than 2 hours of computational time, while 100,000 jobs require approximately 15 hours and 1,000,000 jobs are estimated to take up to a week. Copyright © 2011 Wiley Periodicals, Inc.

  12. RevManHAL: towards automatic text generation in systematic reviews.

    PubMed

    Torres Torres, Mercedes; Adams, Clive E

    2017-02-09

    Systematic reviews are a key part of healthcare evaluation. They involve important painstaking but repetitive work. A major producer of systematic reviews, the Cochrane Collaboration, employs the Review Manager (RevMan) programme, software which assists reviewers and produces XML-structured files. This paper describes an add-on programme (RevManHAL) which helps auto-generate the abstract, results and discussion sections of RevMan-generated reviews in multiple languages. The paper also describes future developments for RevManHAL. RevManHAL was created in Java using NetBeans by a programmer working full time for 2 months. The resulting open-source programme uses editable phrase banks to envelop text/numbers from within the prepared RevMan file in formatted readable text of a chosen language. In this way, considerable parts of the review's 'abstract', 'results' and 'discussion' sections are created and a phrase added to 'acknowledgements'. RevManHAL's output needs to be checked by reviewers, but already, from our experience within the Cochrane Schizophrenia Group (200 maintained reviews, 900 reviewers), RevManHAL has saved much time which is better employed thinking about the meaning of the data rather than restating them. Many more functions will become possible as review writing becomes increasingly automated.
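
    The phrase-bank mechanism can be illustrated with a small Python sketch: numbers extracted from the review file are wrapped in editable, language-specific templates. The template text, keys and languages are invented for illustration, not RevManHAL's actual phrase banks.

      PHRASES = {
          "en": "We included {n_trials} trials with a total of {n_people} participants.",
          "es": "Se incluyeron {n_trials} ensayos con un total de {n_people} participantes.",
      }

      def render(lang, values):
          return PHRASES[lang].format(**values)

      extracted = {"n_trials": 14, "n_people": 1432}   # as if parsed from the RevMan XML
      print(render("en", extracted))
      print(render("es", extracted))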

  13. Interactive digital signal processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.; Wenger, R. M.; Behannon, K. W.; Byrnes, J. B.

    1982-01-01

    The Interactive Digital Signal Processor (IDSP) is examined. It consists of a set of time series analysis operators, each of which operates on an input file to produce an output file. The operators can be executed in any order that makes sense and recursively, if desired. The operators are the various algorithms used in digital time series analysis work. User-written operators can be easily interfaced to the system. The system can be operated both interactively and in batch mode. In IDSP a file can consist of up to n (currently n=8) simultaneous time series. IDSP currently includes over thirty standard operators, ranging from Fourier transform operations, the design and application of digital filters, and eigenvalue analysis, to operators that provide graphical output, allow batch operation, and edit and display information.
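
    The operator model, each operator mapping an input file to an output file so that analyses compose as a chain, can be sketched as follows; the JSON container and the detrend/window operators are toy stand-ins for IDSP's actual files and algorithms.

      import json
      import math

      def apply_operator(op, infile, outfile):
          with open(infile) as f:
              series = json.load(f)
          with open(outfile, "w") as f:
              json.dump(op(series), f)

      def detrend(series):
          mean = sum(series) / len(series)
          return [x - mean for x in series]

      def hann_window(series):
          n = len(series)
          return [x * (0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)))
                  for i, x in enumerate(series)]

      with open("raw.json", "w") as f:
          json.dump([float(i % 7) for i in range(64)], f)       # toy input series

      apply_operator(detrend, "raw.json", "detrended.json")     # operators chain via files
      apply_operator(hann_window, "detrended.json", "windowed.json")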

  14. File Specification for the 7-km GEOS-5 Nature Run, Ganymed Release Non-Hydrostatic 7-km Global Mesoscale Simulation

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo M.; Putman, William; Nattala, J.

    2014-01-01

    This document describes the gridded output files produced by a two-year global, non-hydrostatic mesoscale simulation for the period 2005-2006 produced with the non-hydrostatic version of GEOS-5 Atmospheric Global Climate Model (AGCM). In addition to standard meteorological parameters (wind, temperature, moisture, surface pressure), this simulation includes 15 aerosol tracers (dust, sea-salt, sulfate, black and organic carbon), O3, CO and CO2. This model simulation is driven by prescribed sea-surface temperature and sea-ice, daily volcanic and biomass burning emissions, as well as high-resolution inventories of anthropogenic sources. A description of the GEOS-5 model configuration used for this simulation can be found in Putman et al. (2014). The simulation is performed at a horizontal resolution of 7 km using a cubed-sphere horizontal grid with 72 vertical levels, extending up to 0.01 hPa (approximately 80 km). For user convenience, all data products are generated on two logically rectangular longitude-latitude grids: a full-resolution 0.0625 deg grid that approximately matches the native cubed-sphere resolution, and another 0.5 deg reduced-resolution grid. The majority of the full-resolution data products are instantaneous with some fields being time-averaged. The reduced-resolution datasets are mostly time-averaged, with some fields being instantaneous. Hourly data intervals are used for the reduced-resolution datasets, while 30-minute intervals are used for the full-resolution products. All full-resolution output is on the model's native 72-layer hybrid sigma-pressure vertical grid, while the reduced-resolution output is given on native vertical levels and on 48 pressure surfaces extending up to 0.02 hPa. Section 4 presents additional details on horizontal and vertical grids. Information of the model surface representation can be found in Appendix B. The GEOS-5 product is organized into file collections that are described in detail in Appendix C. Additional details about variables listed in this file specification can be found in a separate document, the GEOS-5 File Specification Variable Definition Glossary. Documentation about the current access methods for products described in this document can be found on the GEOS-5 Nature Run portal: http://gmao.gsfc.nasa.gov/projects/G5NR. Information on the scientific quality of this simulation will appear in a forthcoming NASA Technical Report Series on Global Modeling and Data Assimilation to be available from http://gmao.gsfc.nasa.gov/pubs/tm/.

  15. MPI_XSTAR: MPI-based parallelization of XSTAR program

    NASA Astrophysics Data System (ADS)

    Danehkar, A.

    2017-12-01

    MPI_XSTAR parallelizes execution of multiple XSTAR runs using Message Passing Interface (MPI). XSTAR (ascl:9910.008), part of the HEASARC's HEAsoft (ascl:1408.004) package, calculates the physical conditions and emission spectra of ionized gases. MPI_XSTAR invokes XSTINITABLE from HEASoft to generate a job list of XSTAR commands for given physical parameters. The job list is used to make directories in ascending order, where each individual XSTAR is spawned on each processor and outputs are saved. HEASoft's XSTAR2TABLE program is invoked upon the contents of each directory in order to produce table model FITS files for spectroscopy analysis tools.
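
    The parallelization pattern can be sketched with mpi4py: every rank takes each size-th entry of the job list, runs it in its own numbered directory, and saves the output there. The echo commands are stand-ins for the XSTAR invocations produced by XSTINITABLE.

      from mpi4py import MPI
      import os
      import subprocess

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      jobs = [f"echo xstar run {i}" for i in range(16)]   # stand-in command list

      for i in range(rank, len(jobs), size):              # round-robin assignment
          workdir = f"run{i:04d}"
          os.makedirs(workdir, exist_ok=True)
          with open(os.path.join(workdir, "xout.log"), "w") as log:
              subprocess.run(jobs[i], shell=True, stdout=log, check=True)

      comm.Barrier()   # all runs complete before any table building begins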

  16. Sandia Unstructured Triangle Tabular Interpolation Package v 0.1 beta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2013-09-24

    The software interpolates tabular data, such as for equations of state, provided on an unstructured triangular grid. In particular, interpolation occurs in a two-dimensional space by looking up the triangle in which the desired evaluation point resides and then performing a linear interpolation over the n-tuples associated with the nodes of the chosen triangle. The interface to the interpolation routines allows for automated conversion of units from those tabulated to the desired output units. When multiple tables are included in a data file, new tables may be generated by on-the-fly mixing of the provided tables.
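
    The lookup-and-interpolate step described above amounts to barycentric (linear) interpolation of nodal n-tuples inside one triangle, optionally followed by a unit-conversion factor. A minimal sketch follows; the function names and the `unit_factor` argument are illustrative, not the package's API.

    ```python
    import numpy as np

    def barycentric_weights(p, a, b, c):
        """Barycentric coordinates of point p in triangle (a, b, c)."""
        T = np.array([[a[0] - c[0], b[0] - c[0]],
                      [a[1] - c[1], b[1] - c[1]]])
        w1, w2 = np.linalg.solve(T, np.asarray(p, float) - np.asarray(c, float))
        return w1, w2, 1.0 - w1 - w2

    def interp_triangle(p, verts, values, unit_factor=1.0):
        """Linearly interpolate nodal n-tuples at p; all weights are >= 0
        when p lies inside the triangle found by the lookup stage."""
        w = barycentric_weights(p, *verts)
        return unit_factor * sum(wi * np.asarray(vi) for wi, vi in zip(w, values))

    # Example: interpolate a 2-tuple (e.g., pressure, energy) inside one triangle.
    verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
    values = [np.array([1.0, 10.0]), np.array([2.0, 20.0]), np.array([3.0, 30.0])]
    print(interp_triangle((0.25, 0.25), verts, values))  # -> [1.75, 17.5]
    ```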

  17. Hardware independence checkout software

    NASA Technical Reports Server (NTRS)

    Cameron, Barry W.; Helbig, H. R.

    1990-01-01

    ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that are not defined in any of the parsed source. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
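
    The front end described above (extract call names from C source, then assert them as facts carrying file and line information) can be sketched roughly as follows. The regex approach, the fact template, and the file name `example.c` are all illustrative assumptions; a real parser would also distinguish definitions from calls.

    ```python
    import re

    CALL_RE = re.compile(r'\b([A-Za-z_]\w*)\s*\(')
    C_KEYWORDS = {"if", "for", "while", "switch", "return", "sizeof"}

    def emit_call_facts(path):
        """Yield CLIPS-style facts for every apparent function call in a C file."""
        facts = []
        with open(path) as src:
            for lineno, line in enumerate(src, start=1):
                for name in CALL_RE.findall(line):
                    if name not in C_KEYWORDS:  # crude filter for control keywords
                        facts.append(f'(call (function {name}) '
                                     f'(file "{path}") (line {lineno}))')
        return facts

    for fact in emit_call_facts("example.c"):
        print(f"(assert {fact})")
    ```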

  18. SurvNet: a web server for identifying network-based biomarkers that most correlate with patient survival data.

    PubMed

    Li, Jun; Roebuck, Paul; Grünewald, Stefan; Liang, Han

    2012-07-01

    An important task in biomedical research is identifying biomarkers that correlate with patient clinical data, and these biomarkers then provide a critical foundation for the diagnosis and treatment of disease. Conventionally, such an analysis is based on individual genes, but the results are often noisy and difficult to interpret. Using a biological network as the searching platform, network-based biomarkers are expected to be more robust and provide deep insights into the molecular mechanisms of disease. We have developed a novel bioinformatics web server for identifying network-based biomarkers that most correlate with patient survival data, SurvNet. The web server takes three input files: one biological network file, representing a gene regulatory or protein interaction network; one molecular profiling file, containing any type of gene- or protein-centred high-throughput biological data (e.g. microarray expression data or DNA methylation data); and one patient survival data file (e.g. patients' progression-free survival data). Given user-defined parameters, SurvNet will automatically search for subnetworks that most correlate with the observed patient survival data. As the output, SurvNet will generate a list of network biomarkers and display them through a user-friendly interface. SurvNet can be accessed at http://bioinformatics.mdanderson.org/main/SurvNet.

  19. ALC: automated reduction of rule-based models

    PubMed Central

    Koschorreck, Markus; Gilles, Ernst Dieter

    2008-01-01

    Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous amount of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files. PMID:18973705

  20. Micro-fabricated flexible PZT cantilever using d33 mode for energy harvesting

    NASA Astrophysics Data System (ADS)

    Cho, Hyunok; Park, Jongcheol; Park, Jae Yeong

    2017-12-01

    This paper presents a micro-fabricated flexible and curled PZT [Pb(Zr0.52Ti0.48)O3] cantilever using the d33 piezoelectric mode for vibration-based energy harvesting applications. The proposed cantilever-based energy harvester consists of polyimide, PZT thin film, and inter-digitated IrOx electrodes. The flexible cantilever was formed using bulk-micromachining on a silicon wafer to integrate it with ICs. The d33 piezoelectric mode was applied to achieve a large output voltage by using inter-digitated electrodes, and the PZT thin film on the polyimide layer has a remanent polarization and coercive field of approximately 2Pr = 47.9 μC/cm2 and 2Ec = 78.8 kV/cm, respectively. The relative dielectric constant was 900. The fabricated micro-electromechanical systems energy harvester generated an output voltage of 1.2 V and output power of 117 nW at its optimal resistive load of 6.6 MΩ at its resonant frequency of 97.8 Hz with an acceleration of 5 m/s2.

  1. User's guide for a large signal computer model of the helical traveling wave tube

    NASA Technical Reports Server (NTRS)

    Palmer, Raymond W.

    1992-01-01

    This guide describes the use of a successful large-signal, two-dimensional (axisymmetric), deformable-disk computer model of the helical traveling wave tube amplifier, in an extensively revised and operationally simplified version. We also discuss program input and output and the auxiliary files necessary for operation. Included is a sample problem with its input data and output results. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.

  2. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

    The subject of input/output (I/O) was often neglected in the design of parallel computer systems, although for many problems I/O rates will limit the speedup attainable. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes, and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations is proposed, and implementations using multiple storage devices are suggested. Problem areas are also identified and discussed.

  3. On the possibility of measuring atmospheric OH using intracavity laser spectroscopy

    NASA Technical Reports Server (NTRS)

    Mcmanus, J. Barry; Kolb, C. E.

    1994-01-01

    Intracavity laser spectroscopy (ILS) has been demonstrated to be useful for measuring extremely weak absorption produced by gases in air. ILS is based on the observation that when there are spectrally narrow losses within the cavity of a broadband laser, the laser output has corresponding spectral holes where the laser oscillation is partially quenched. The depth of the laser output dips can be enhanced by a factor of 10(exp 5) over the depth of the initial cavity loss, and absorptivities of 10(exp -8) cm(exp -1) have been measured in lasers only one meter long. With ILS, one can achieve in a compact space a spectral contrast that would otherwise require kilometers of pathlength. ILS systems typically use quasi-continuous wave dye lasers operating close to threshold. The pump laser is modulated from just below to just above the threshold level for the dye laser, and the dye laser output is spectroscopically observed during a well-defined time interval after the onset of lasing (the generation time). The spectral contrast of an intracavity absorber is equivalent to that produced by absorption through a path length equal to the generation time multiplied by the speed of light (assuming the cavity is completely filled with the absorber) up to some limiting time. Thus, if one measures the spectrum after 33 microseconds, the effective path length is 10,000 meters.
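
    The path-length relation in the last sentence is simple enough to check directly; the two-line calculation below, using the values quoted in the text, reproduces the stated figure.

    ```python
    c = 2.998e8   # speed of light, m/s
    t_g = 33e-6   # generation time from the example above, s
    print(f"effective path length = {c * t_g:,.0f} m")  # ~9,900 m, i.e. ~10 km
    ```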

  4. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use.
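
    The wrapper pattern described above (substitute a design-variable value into a text input file, run the analysis executable, pull a response value back out of the text output) can be sketched in a few lines. This is a minimal illustration, not the actual RCOTOOLS code; the file names, the `@TWIST@` placeholder, the `analysis_code` executable, and the `GrossWeight` output label are all hypothetical.

    ```python
    import re
    import subprocess

    def run_case(twist):
        # Substitute the design variable into a templated input file.
        template = open("ndarc_input.template").read()
        with open("case.in", "w") as f:
            f.write(template.replace("@TWIST@", f"{twist:.4f}"))
        # Execute the file-based analysis code.
        subprocess.run(["analysis_code", "case.in"], check=True)
        # Extract e.g. "GrossWeight = 12345.6" from the text output file.
        out = open("case.out").read()
        match = re.search(r"GrossWeight\s*=\s*([-\d.Ee+]+)", out)
        return float(match.group(1))

    print(run_case(-10.0))
    ```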

  5. V&V of MCNP 6.1.1 Beta Against Intermediate and High-Energy Experimental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mashnik, Stepan G

    This report presents a set of validation and verification (V&V) MCNP 6.1.1 beta results calculated in parallel, with MPI, obtained using its event generators at intermediate and high energies, compared against various experimental data. It also contains several examples of results using the models at energies below 150 MeV, down to 10 MeV, where data libraries are normally used. This report can be considered the fourth part of a set of MCNP6 Testing Primers, after its first (LA-UR-11-05129), second (LA-UR-11-05627), and third (LA-UR-26944) publications, but is devoted to V&V with the latest, 1.1 beta version of MCNP6. The MCNP6 test problems discussed here are presented in the /VALIDATION_CEM/ and /VALIDATION_LAQGSM/ subdirectories in the MCNP6/Testing/ directory. README files that contain short descriptions of every input file, the experiment, the quantity of interest that the experiment measures and its description in the MCNP6 output files, and the publication reference of that experiment are presented for every test problem. Templates for plotting the corresponding results with xmgrace, as well as pdf files with figures representing the final results of our V&V efforts, are presented. Several technical “bugs” in MCNP 6.1.1 beta were discovered during our current V&V of MCNP6 while running it in parallel with MPI using its event generators. These “bugs” are to be fixed in the following version of MCNP6. Our results show that MCNP 6.1.1 beta, using its CEM03.03, LAQGSM03.03, Bertini, and INCL+ABLA event generators, describes, as a rule, reasonably well different intermediate- and high-energy measured data. This primer isn’t meant to be read from cover to cover. Readers may skip some sections and go directly to any test problem in which they are interested.

  6. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    2014-10-03

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  7. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    1998-03-01

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  8. 76 FR 63575 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  9. 76 FR 63554 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  10. 77 FR 11394 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... written in FORTRAN and used simple text files for data input and output, MOVES is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables.\\13\\ \\13\\ Some...

  11. Development of a DMT Monitor for Statistical Tracking of Gravitational-Wave Burst Triggers Generated from the Omega Pipeline

    NASA Astrophysics Data System (ADS)

    Li, Jun-Wei; Cao, Jun-Wei

    2010-04-01

    One challenge in large-scale scientific data analysis is to monitor data in real-time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, yielding good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.

  12. Integral processing in beyond-Hartree-Fock calculations

    NASA Technical Reports Server (NTRS)

    Taylor, P. R.

    1986-01-01

    The increasing rate at which improvements in processing capacity outstrip improvements in input/output performance of large computers has led to recent attempts to bypass generation of a disk-based integral file. The direct self-consistent field (SCF) method of Almlof and co-workers represents a very successful implementation of this approach. This paper is concerned with the extension of this general approach to configuration interaction (CI) and multiconfiguration-self-consistent field (MCSCF) calculations. After a discussion of the particular types of molecular orbital (MO) integrals for which -- at least for most current generation machines -- disk-based storage seems unavoidable, it is shown how all the necessary integrals can be obtained as matrix elements of Coulomb and exchange operators that can be calculated using a direct approach. Computational implementations of such a scheme are discussed.

  13. Time-series animation techniques for visualizing urban growth

    USGS Publications Warehouse

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.
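
    The in-betweening and movie-export steps described above can be sketched in a few lines. The example below uses imageio and simple linear cross-fading between two keyframes; the file names are hypothetical, and a real land-use animation would interpolate each theme separately rather than blending raw pixels.

    ```python
    import numpy as np
    import imageio.v2 as imageio

    frame_a = imageio.imread("landuse_1973.png").astype(float)
    frame_b = imageio.imread("landuse_1985.png").astype(float)

    frames = []
    n_between = 10
    for i in range(n_between + 1):
        t = i / n_between
        blend = (1.0 - t) * frame_a + t * frame_b  # linear cross-fade
        frames.append(blend.astype(np.uint8))

    imageio.mimsave("growth.gif", frames, duration=0.2)  # ~5 frames per second
    ```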

  14. The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Schuster, D.; Worley, S. J.

    2013-12-01

    The NCAR Research Data Archive (RDA http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users, the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery-level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPeNDAP and other web-based protocols, primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is ready. External users are provided with RDA server-generated scripts to download the resulting request output. Similarly, they can download native dataset collection files or partial files using Wget or cURL based scripts supplied by the RDA server. Internal users can access the resulting request output or native dataset collection files directly from centralized file systems.

  15. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  16. Open-Source Logic-Based Automated Sleep Scoring Software using Electrophysiological Recordings in Rats

    PubMed Central

    Gross, Brooks A.; Walsh, Christine M.; Turakhia, Apurva A.; Booth, Victoria; Mashour, George; Poe, Gina R.

    2009-01-01

    Manual state scoring of physiological recordings in sleep studies is time-consuming, resulting in a data backlog, research delays and increased personnel costs. We developed MATLAB-based software to automate scoring of sleep/waking states in rats, potentially extendable to other animals, from a variety of recording systems. The software contains two programs, Sleep Scorer and Auto-Scorer, for manual and automated scoring. Auto-Scorer is a logic-based program that displays power spectral densities of an electromyographic signal and σ, δ, and θ frequency bands of an electroencephalographic signal, along with the δ/θ ratio and σ×θ, for every epoch. The user defines thresholds from the training file state definitions which the Auto-Scorer uses with logic to discriminate the state of every epoch in the file. Auto-Scorer was evaluated by comparing its output to manually scored files from 6 rats under 2 experimental conditions by 3 users. Each user generated a training file, set thresholds, and autoscored the 12 files into 4 states (waking, non-REM, transition-to-REM, and REM sleep) in ¼ the time required to manually score the file. Overall performance comparisons between Auto-Scorer and manual scoring resulted in a mean agreement of 80.24 ± 7.87%, comparable to the average agreement among 3 manual scorers (83.03 ± 4.00%). There was no significant difference between user-user and user-Auto-Scorer agreement ratios. These results support the use of our open-source Auto-Scorer, coupled with user review, to rapidly and accurately score sleep/waking states from rat recordings. PMID:19615408
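
    The per-epoch logic described above (compare EMG power, the δ/θ ratio, and σ×θ against user-set thresholds to label each epoch) can be illustrated with a toy rule cascade. The threshold values and rule ordering below are purely hypothetical; the real Auto-Scorer's rules and state definitions are more involved.

    ```python
    def score_epoch(emg, delta, theta, sigma,
                    emg_thr=1.0, dt_thr=2.0, st_thr=0.5):
        """Toy threshold logic assigning one of four states to an epoch."""
        if emg > emg_thr:            # high muscle tone -> awake
            return "waking"
        if delta / theta > dt_thr:   # slow-wave dominance -> non-REM
            return "non-REM"
        if sigma * theta > st_thr:   # spindle/theta mix -> transition
            return "transition-to-REM"
        return "REM"                 # low EMG, theta-dominated -> REM

    print(score_epoch(emg=0.2, delta=1.0, theta=2.0, sigma=0.2))  # -> "REM"
    ```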

  17. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    PubMed

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy-to-use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  18. KMWin – A Convenient Tool for Graphical Presentation of Results from Kaplan-Meier Survival Time Analysis

    PubMed Central

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy-to-use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912

  19. Radar - 449MHz - Forks, WA (FKS) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  20. Radar - 449MHz - North Bend, OR (OTH) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  1. Radar - 449MHz - North Bend, OR (OTH) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  2. Radar - 449MHz - Forks, WA (FKS) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  3. Radar - 449MHz - Astoria, OR (AST) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  4. Radar - 449MHz - Astoria, OR (AST) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats also are included in the zip files.

  5. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Astrophysics Data System (ADS)

    Shen, S.; Ostrenga, D.; Vollmer, B.; Meyer, D. J.

    2017-12-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution, and the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy, or extreme weather events, are likely only interested in daily or hourly model data at a single point or for a small area for a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from NASA Modern-Era Retrospective analysis for Research and Applications Version-2 (MERRA-2), it may take over 10 hours to extract the parameters of interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by reconstructing the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, one-year, or whole-mission. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but also may enable better online analytic services.
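
    The access pattern under discussion can be sketched with xarray. The file and variable names below are hypothetical stand-ins (T2M is a MERRA-2 2-m temperature variable), and the point is the contrast between opening thousands of per-day granules and one pre-concatenated cube.

    ```python
    import xarray as xr

    # Slow path: thousands of one-day-per-file granules must each be opened.
    daily = xr.open_mfdataset("MERRA2_hourly_*.nc4", combine="by_coords")
    series = daily["T2M"].sel(lat=38.9, lon=-77.0, method="nearest").load()

    # Fast path: one restructured "cube" file holding the whole year.
    cube = xr.open_dataset("MERRA2_T2M_2015_cube.nc4")
    series2 = cube["T2M"].sel(lat=38.9, lon=-77.0, method="nearest").load()
    ```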

  6. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Meyer, Dave

    2017-01-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution, and the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy, or extreme weather events, are likely only interested in daily or hourly model data at a single point (or for a small area) for a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from NASA Modern-Era Retrospective analysis for Research and Applications Version-2 (MERRA-2), it may take over 10 hours for the extraction of parameters-of-interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by reconstructing the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, or one-year. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but also may enable better online analysis services.

  7. OFFSET - RAY TRACING OPTICAL ANALYSIS OF OFFSET SOLAR COLLECTOR FOR SPACE STATION SOLAR DYNAMIC POWER SYSTEM

    NASA Technical Reports Server (NTRS)

    Jefferies, K.

    1994-01-01

    OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels each containing 24 triangular, reflective facets. Current research is geared toward optimizing flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but to simulate collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as in receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: 1) equations developed to pseudo-randomly select ray originating sources on the Sun which appear evenly distributed and include solar limb darkening; 2) Cone-optics technique used to add surface specular error to the ray originating sources to determine the apparent ray sources of the reflected sun; 3) choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; 4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at the ten nodal points on each facet; and 5) color contour plots of receiver incident flux distribution generated by PATRAN processing of FORTRAN computer code output. OFFSET output includes a file of input data for confirmation, a PATRAN results file containing the values necessary to plot the flux distribution at the receiver surface, a PATRAN results file containing the intensity distribution on a 40 x 40 cm area of the receiver aperture plane, a data file containing calculated information on the system configuration, a file including the X-Y coordinates of the target points of each collector facet on the aperture opening, and twelve P/PLOT input data files to allow X-Y plotting of various results data. OFFSET is written in FORTRAN (70%) for the IBM VM operating system. The code contains PATRAN statements (12%) and P/PLOT statements (18%) for generating plots. Once the program has been run on VM (or an equivalent system), the PATRAN and P/PLOT files may be transferred to a DEC VAX (or equivalent system) with access to PATRAN for PATRAN post processing. OFFSET was written in 1988 and last updated in 1989. PATRAN is a registered trademark of PDA Engineering. IBM is a registered trademark of International Business Machines Corporation. DEC VAX is a registered trademark of Digital Equipment Corporation.
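
    Feature (1) above, pseudo-random selection of ray origins on the solar disk including limb darkening, can be illustrated with a small rejection-sampling sketch. The linear limb-darkening law and the coefficient value below are illustrative choices, not OFFSET's exact equations.

    ```python
    import math
    import random

    def sample_sun_origin(u=0.6):
        """Return (x, y) on a unit solar disk, accepted with probability
        1 - u*(1 - cos(theta)), where sin(theta) = r is the normalized
        radial position (simple linear limb darkening)."""
        while True:
            r = math.sqrt(random.random())          # uniform over disk area
            mu = math.sqrt(max(0.0, 1.0 - r * r))   # cos(theta) at radius r
            if random.random() < 1.0 - u * (1.0 - mu):
                phi = 2.0 * math.pi * random.random()
                return r * math.cos(phi), r * math.sin(phi)

    print(sample_sun_origin())
    ```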

  8. DICOM to print, 35-mm slides, web, and video projector: tutorial using Adobe Photoshop.

    PubMed

    Gurney, Jud W

    2002-10-01

    Preparing images for publication has traditionally dealt with film and the photographic process. With picture archiving and communications systems, many departments will no longer produce film. This will change how images are produced for publication. DICOM, the file format for radiographic images, has to be converted and then prepared for traditional publication, 35-mm slides, the newest techniques of video projection, and the World Wide Web. Tagged Image File Format (TIFF) is the common format for traditional print publication, whereas Joint Photographic Experts Group (JPEG) is the current file format for the World Wide Web. Each medium has specific requirements that can be met with a common image-editing program such as Adobe Photoshop (Adobe Systems, San Jose, CA). High-resolution images are required for print, a process that requires interpolation. However, the Internet requires images with a small file size for rapid transmission. The resolution of each output differs, and the image resolution must be optimized to match the output of the publishing medium.
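
    Outside of Photoshop, the same conversion pipeline can be scripted. The sketch below uses pydicom and Pillow, with a hypothetical input file and deliberately simplistic window/level handling: read the DICOM pixel data, rescale to 8 bits, then save a lossless TIFF for print and a downsized JPEG for the web.

    ```python
    import numpy as np
    import pydicom
    from PIL import Image

    ds = pydicom.dcmread("chest.dcm")         # hypothetical input file
    px = ds.pixel_array.astype(float)
    px8 = ((px - px.min()) / max(np.ptp(px), 1) * 255).astype(np.uint8)

    img = Image.fromarray(px8)
    img.save("chest_print.tif")               # lossless TIFF for print
    img.resize((img.width // 2, img.height // 2)).save(
        "chest_web.jpg", quality=80)          # smaller JPEG for the web
    ```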

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McSpaden, Alexander Thomas

    Two Python scripts have been written that process the output files of MCNP6 into a format that mimics the list-mode output of Los Alamos National Laboratory’s MC-15 and NPOD neutron detection systems. This report details the methods implemented in these scripts and instructions on their use.

  10. DockoMatic 2.0: high throughput inverse virtual screening and homology modeling.

    PubMed

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T; McDougal, Owen M; Andersen, Timothy L

    2013-08-26

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly graphical user interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to (1) conduct high throughput inverse virtual screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently setup, start, and manage IVS experiments through the DockoMatic GUI by specifying receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by DockoMatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The wizard TIM provides an interface that accesses the basic local alignment search tool (BLAST) and MODELER programs and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education.
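
    The inverse-virtual-screening loop the paragraph describes (one ligand docked against many receptors, with per-target output directories) reduces to a simple batch pattern. This is a rough sketch of how a wrapper might shell out to AutoDock Vina, not DockoMatic's actual code; the receptor/ligand file names and the `grid.conf` config file are hypothetical.

    ```python
    import os
    import subprocess

    receptors = ["kinase1.pdbqt", "kinase2.pdbqt", "protease1.pdbqt"]
    ligand = "conotoxin.pdbqt"

    for rec in receptors:
        outdir = os.path.splitext(rec)[0]      # one output directory per target
        os.makedirs(outdir, exist_ok=True)
        subprocess.run(
            ["vina", "--receptor", rec, "--ligand", ligand,
             "--config", "grid.conf",          # search-box parameters per target
             "--out", os.path.join(outdir, "docked.pdbqt")],
            check=True)
    ```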

  11. The MODIS Vegetation Canopy Water Content product

    NASA Astrophysics Data System (ADS)

    Ustin, S. L.; Riano, D.; Trombetti, M.

    2008-12-01

    Vegetation water stress drives wildfire behavior and risk, having important implications for biogeochemical cycling in natural ecosystems, agriculture, and forestry. Water stress limits plant transpiration and carbon gain. The regulation of photosynthesis creates close linkages between the carbon, water, and energy cycles and, through metabolism, to the nitrogen cycle. We generated systematic weekly canopy water content (CWC) estimates for the USA for 2000-2006. MODIS measures the sunlit reflectance of the vegetation in the visible, near-infrared, and shortwave infrared. Radiative transfer models, such as PROSPECT-SAILH, determine how sunlight interacts with plant and soil materials. These models can be applied over a range of scales and ecosystem types. Artificial Neural Networks (ANN) were used to optimize the inversion of these models to determine vegetation water content. We carried out multi-scale validation of the product using field data, airborne and satellite cross-calibration. An Algorithm Theoretical Basis Document (ATBD) of the product is under evaluation by NASA. The CWC product inputs are: 1) the MODIS Terra/Aqua surface reflectance product (MOD09A1/MYD09A1); 2) the MODIS land cover map product (MOD12Q1), reclassified to grassland, shrub-land and forest canopies; 3) an ANN trained with PROSPECT-SAILH; and 4) a calibration file for each land cover type. The output is an ENVI file with the CWC values. The code is written in the Matlab environment and is being adapted to read not only the 8-day MODIS composites, but also daily surface reflectance data. We plan to incorporate the cloud and snow mask and generate as output a GeoTIFF file. Vegetation water content estimates will help predict linkages between biogeochemical cycles, which will enable further understanding of feedbacks to atmospheric concentrations of greenhouse gases. They will also serve to estimate primary productivity of the biosphere; monitor/assess natural vegetation health related to drought, pollution or diseases; and improve irrigation scheduling by reducing over-watering and under-watering. These estimates will also allow researchers to characterize wildfire behavior and risk, since CWC drives ignition probability and burning efficiency, and to use CWC as an indicator of soil moisture and Leaf Area Index.
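
    The ANN inversion step described above follows a standard pattern: train a small network on (reflectance, CWC) pairs simulated with the radiative-transfer model, then apply it to measured reflectance. A schematic sketch, with random arrays standing in for PROSPECT-SAILH output and MODIS pixels:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X_sim = rng.random((5000, 7))   # simulated reflectance in 7 MODIS bands
    y_sim = rng.random(5000)        # matching simulated CWC values

    # Train the network to approximate the model's inverse mapping.
    ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    ann.fit(X_sim, y_sim)

    X_modis = rng.random((10, 7))   # measured surface reflectance pixels
    cwc = ann.predict(X_modis)      # retrieved canopy water content
    ```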

  12. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the CAGI: Computer Aided Grid Interface system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  13. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface system (CAGI). The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  14. CAGI: Computer Aided Grid Interface. A work in progress

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.; Yu, Tzu-Yi; Vaughn, David

    1992-01-01

    Progress realized in the development of a Computer Aided Grid Interface (CAGI) software system in integrating CAD/CAM geometric system output and/or Initial Graphics Exchange Specification (IGES) files, geometry manipulations associated with grid generation, and robust grid generation methodologies is presented. CAGI is being developed in a modular fashion and will offer fast, efficient and economical response to geometry/grid preparation, allowing the ability to upgrade basic geometry in a step-by-step fashion interactively and under permanent visual control, along with minimizing the differences between the actual hardware surface descriptions and the corresponding numerical analog. The computer code GENIE is used as a basis. The Non-Uniform Rational B-Splines (NURBS) representation of sculptured surfaces is utilized for surface grid redistribution. The computer aided analysis system, PATRAN, is adapted as a CAD/CAM system. The progress realized in NURBS surface grid generation, the development of the IGES transformer, and geometry adaption using PATRAN will be presented along with their applicability to grid generation associated with rocket propulsion applications.

  15. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  16. Java-based Graphical User Interface for MAVERIC-II

    NASA Technical Reports Server (NTRS)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It was written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept, and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification, and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool for evaluating guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework that is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.
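
    That three-step scenario is straightforward to script. The following is a minimal Python sketch of such a driver, assuming a hypothetical executable name (./maveric), a hypothetical input file (vehicle_models.dat) containing an initial_mass parameter, and a whitespace-delimited output file; none of these names are taken from the MAVERIC-II documentation.

        import subprocess
        import matplotlib.pyplot as plt

        # Hypothetical names throughout; MAVERIC-II's real input deck and
        # output layout are defined in its own documentation.
        INPUT_FILE = "vehicle_models.dat"
        OUTPUT_FILE = "trajectory_out.dat"

        # Step 1: edit an existing input data file (patch one parameter).
        with open(INPUT_FILE) as f:
            lines = f.readlines()
        with open(INPUT_FILE, "w") as f:
            for line in lines:
                if line.startswith("initial_mass"):
                    line = "initial_mass = 250000.0\n"
                f.write(line)

        # Step 2: run the simulation as a UNIX console program.
        subprocess.run(["./maveric", INPUT_FILE], check=True)

        # Step 3: plot one output column (altitude vs. time, assumed to be
        # the first two whitespace-delimited columns).
        times, alts = [], []
        with open(OUTPUT_FILE) as f:
            for line in f:
                cols = line.split()
                if len(cols) >= 2:
                    times.append(float(cols[0]))
                    alts.append(float(cols[1]))
        plt.plot(times, alts)
        plt.xlabel("time (s)")
        plt.ylabel("altitude (m)")
        plt.savefig("ascent.png")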

  17. BOREAS TE-19 Ecosystem Carbon Balance Model

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Papagno, Andrea (Editor); Frolking, Steve

    2000-01-01

    The BOREAS TE-19 team developed a model called the Spruce and Moss Model (SPAM), designed to simulate the daily carbon balance of a black spruce/moss boreal forest ecosystem. It is driven by daily weather conditions and consists of four components: (1) soil climate, (2) tree photosynthesis and respiration, (3) moss photosynthesis and respiration, and (4) litter decomposition and associated heterotrophic respiration. The model simulates tree gross and net photosynthesis, wood respiration, live root respiration, moss gross and net photosynthesis, and heterotrophic respiration (decomposition of root litter, young needle and moss litter, and humus). These values can be combined to generate predictions of total site net ecosystem exchange of carbon (NEE), total soil dark respiration (live roots + heterotrophs + live moss), spruce and moss net productivity, and net carbon accumulation in the soil. To date, simulations have been run for the BOREAS NSA-OBS and SSA-OBS tower sites for 1968-95 (except 1990-93). The files include source code and sample input and output files in ASCII format. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
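
    For illustration, here is a toy Python sketch of how component fluxes of this kind can be combined into the summary terms named above; the variable names and the sign convention (positive = carbon uptake) are assumptions, not SPAM's actual source code.

        # Toy combination of SPAM-style component fluxes; names and the
        # sign convention (positive = carbon uptake) are assumptions.
        def soil_dark_respiration(root_resp, heterotroph_resp, moss_dark_resp):
            # live roots + heterotrophs + live moss
            return root_resp + heterotroph_resp + moss_dark_resp

        def net_ecosystem_exchange(tree_net_photo, moss_net_photo,
                                   wood_resp, root_resp, heterotroph_resp):
            # uptake by trees and moss minus respiratory losses
            return (tree_net_photo + moss_net_photo
                    - wood_resp - root_resp - heterotroph_resp)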

  18. As-built design specification for segment map (Sgmap) program

    NASA Technical Reports Server (NTRS)

    Tompkins, M. A. (Principal Investigator)

    1981-01-01

    The segment map program (SGMAP), which is part of the CLASFYT package, is described in detail. This program is designed to output symbolic maps or numerical dumps from LANDSAT cluster/classification files or aircraft ground truth/processed ground truth files which are in 'universal' format.

  19. Automated Breast Ultrasound for Ductal Pattern Reconstruction: Ground Truth File Generation and CADe Evaluation

    NASA Astrophysics Data System (ADS)

    Manousaki, D.; Panagiotopoulou, A.; Bizimi, V.; Haynes, M. S.; Love, S.; Kallergi, M.

    2017-11-01

    The purpose of this study was the generation of ground truth files (GTFs) of the breast ducts from 3D images of the Invenia™ Automated Breast Ultrasound (ABUS) system (GE Healthcare, Little Chalfont, UK) and the application of these GTFs for the optimization of the imaging protocol and the evaluation of a computer aided detection (CADe) algorithm developed for automated duct detection. Six lactating, nursing volunteers were scanned with the ABUS before and right after breastfeeding their infants. An expert in breast ultrasound generated rough outlines of the milk-filled ducts in the transaxial slices of all image volumes, and the final GTFs were created by using thresholding and smoothing tools in ImageJ. In addition, a CADe algorithm automatically segmented duct-like areas and its results were compared to the expert's GTFs by estimating true positive fraction (TPF) or % overlap. The CADe output differed significantly from the expert's, but both detected a smaller-than-expected volume of the ducts due to insufficient contrast (ducts were partially filled with milk), discontinuities, and artifacts. GTFs were used to modify the imaging protocol and improve the CADe method. In conclusion, electronic GTFs provide a valuable tool in the optimization of a tomographic imaging system, the imaging protocol, and the CADe algorithms. Their generation, however, is an extremely time-consuming, strenuous process, particularly for multi-slice examinations, and alternatives based on phantoms or simulations are highly desirable.
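
    As an aside, the true positive fraction (% overlap) used to compare a CADe segmentation against a ground truth file reduces to simple mask arithmetic. A minimal Python sketch, assuming both segmentations are boolean volumes of identical shape:

        import numpy as np

        # True positive fraction (percent overlap) between a CADe
        # segmentation and an expert ground truth file, both given as
        # boolean 3-D masks of the same ABUS volume.
        def true_positive_fraction(cade_mask, gtf_mask):
            overlap = np.logical_and(cade_mask, gtf_mask).sum()
            return float(overlap) / float(gtf_mask.sum())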

  20. Creation of a Machine File and Subsequent Computer-Assisted Production of Publishing Outputs, Including a Translation Journal and an Index.

    ERIC Educational Resources Information Center

    Buckland, Lawrence F.; Weaver, Vance

    Reported are the findings of the Uspekhi experiment in creating a labeled machine file, as well as sample products of this system - an article from a scientific journal and an index page. Production cost tables are presented for the machine file, primary journals, and journal indexes. Comparisons were made between the 1965 predicted costs and the…

  1. Image processing tool for automatic feature recognition and quantification

    DOEpatents

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system reads an input file, preprocesses it while preserving metadata such as scale information, and then detects features of the image. In one version, detection first uses an edge detector, followed by identification of features using a Hough transform. The output of the process is the set of identified elements within the image.
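
    The two-stage detection described (edge detection followed by a Hough transform) can be sketched with standard OpenCV calls. This is a generic illustration of the technique, not the patented implementation, and the threshold values are placeholders.

        import cv2
        import numpy as np

        # Generic edge-detection + Hough-transform pipeline; parameter
        # values are illustrative only.
        img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)
        edges = cv2.Canny(img, 50, 150)

        # Probabilistic Hough transform to identify line features.
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                                minLineLength=30, maxLineGap=5)
        if lines is not None:
            for x1, y1, x2, y2 in lines[:, 0]:
                print("line segment:", (x1, y1), "->", (x2, y2))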

  2. AP-IO: asynchronous pipeline I/O for hiding periodic output cost in CFD simulation.

    PubMed

    Xiaoguang, Ren; Xinhai, Xu

    2014-01-01

    Computational fluid dynamics (CFD) simulation often needs to periodically output intermediate results to files in the form of snapshots, for visualization or restart, which seriously impacts performance. In this paper, we present an asynchronous pipeline I/O (AP-IO) optimization scheme for periodic snapshot output, built on asynchronous I/O and the characteristics of CFD applications. In AP-IO, dedicated background I/O processes or threads are in charge of handling the file writes in pipeline mode, so the write overhead can be hidden behind more of the computation than with classic asynchronous I/O. We design the framework of AP-IO and implement it in OpenFOAM, providing CFD users with a user-friendly interface. Experimental results on the Tianhe-2 supercomputer demonstrate that AP-IO achieves a good optimization effect for periodic snapshot output in CFD applications, and the effect is especially pronounced for massively parallel CFD simulations, where it can reduce total execution time by up to about 40%.

  3. AP-IO: Asynchronous Pipeline I/O for Hiding Periodic Output Cost in CFD Simulation

    PubMed Central

    Xiaoguang, Ren; Xinhai, Xu

    2014-01-01

    Computational fluid dynamics (CFD) simulation often needs to periodically output intermediate results to files in the form of snapshots, for visualization or restart, which seriously impacts performance. In this paper, we present an asynchronous pipeline I/O (AP-IO) optimization scheme for periodic snapshot output, built on asynchronous I/O and the characteristics of CFD applications. In AP-IO, dedicated background I/O processes or threads are in charge of handling the file writes in pipeline mode, so the write overhead can be hidden behind more of the computation than with classic asynchronous I/O. We design the framework of AP-IO and implement it in OpenFOAM, providing CFD users with a user-friendly interface. Experimental results on the Tianhe-2 supercomputer demonstrate that AP-IO achieves a good optimization effect for periodic snapshot output in CFD applications, and the effect is especially pronounced for massively parallel CFD simulations, where it can reduce total execution time by up to about 40%. PMID:24955390
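
    The core idea of delegating periodic snapshot writes to a dedicated background writer, so that I/O overlaps with ongoing computation, can be illustrated in a few lines. The Python sketch below shows the general technique only; it is not the OpenFOAM/AP-IO implementation, and the file names and snapshot interval are arbitrary.

        import queue
        import threading

        # A dedicated writer thread drains a queue of snapshots, so the
        # compute loop never blocks on file I/O.
        snapshots = queue.Queue()

        def writer():
            while True:
                item = snapshots.get()
                if item is None:          # sentinel: shut down
                    break
                path, data = item
                with open(path, "w") as f:
                    f.write(data)
                snapshots.task_done()

        thread = threading.Thread(target=writer, daemon=True)
        thread.start()

        for step in range(100):
            field = f"step {step}: ...solution data...\n"  # stand-in state
            if step % 10 == 0:                             # periodic snapshot
                snapshots.put((f"snapshot_{step:04d}.dat", field))
            # ...continue computing while the writer works in background...

        snapshots.put(None)
        thread.join()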

  4. Seshat: A Web service for accurate annotation, validation, and analysis of TP53 variants generated by conventional and next-generation sequencing.

    PubMed

    Tikkanen, Tuomas; Leroy, Bernard; Fournier, Jean Louis; Risques, Rosa Ana; Malcikova, Jitka; Soussi, Thierry

    2018-07-01

    Accurate annotation of genomic variants in human diseases is essential to allow personalized medicine. Assessment of somatic and germline TP53 alterations has now reached the clinic and is required in several circumstances such as the identification of the most effective cancer therapy for patients with chronic lymphocytic leukemia (CLL). Here, we present Seshat, a Web service for annotating TP53 information derived from sequencing data. A flexible framework allows the use of standard file formats such as Mutation Annotation Format (MAF) or Variant Call Format (VCF), as well as common TXT files. Seshat performs accurate variant annotations using the Human Genome Variation Society (HGVS) nomenclature and the stable TP53 genomic reference provided by the Locus Reference Genomic (LRG). In addition, using the 2017 release of the UMD_TP53 database, Seshat provides multiple statistical information for each TP53 variant including database frequency, functional activity, or pathogenicity. The information is delivered in standardized output tables that minimize errors and facilitate comparison of mutational data across studies. Seshat is a beneficial tool to interpret the ever-growing TP53 sequencing data generated by multiple sequencing platforms and it is freely available via the TP53 Website, http://p53.fr or directly at http://vps338341.ovh.net/. © 2018 Wiley Periodicals, Inc.
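
    At its simplest, annotating a VCF record amounts to keying a lookup table on (chromosome, position, ref, alt). A minimal Python sketch of that step, where the frequency table is a hypothetical stand-in for the UMD_TP53 statistics Seshat actually reports:

        # Toy VCF annotation: key a lookup table on (chrom, pos, ref, alt).
        # The frequency table below is invented for illustration.
        variant_frequency = {("17", 7674220, "G", "A"): 0.031}

        with open("variants.vcf") as vcf:
            for line in vcf:
                if line.startswith("#"):
                    continue                  # skip meta/header lines
                chrom, pos, _vid, ref, alt = line.rstrip("\n").split("\t")[:5]
                freq = variant_frequency.get((chrom, int(pos), ref, alt))
                print(chrom, pos, ref, alt, "frequency:", freq)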

  5. 75 FR 51260 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-19

    ...: CER Generation II, LLC. Description: CER Generation II, LLC submits tariff filing per 35.12: CER.... Applicants: CER Generation, LLC. Description: CER Generation, LLC submits tariff filing per 35.12: CER...

  6. Program Description: Financial Master File Processor-SWRL Financial System.

    ERIC Educational Resources Information Center

    Ideda, Masumi

    Computer routines designed to produce various management and accounting reports required by the Southwest Regional Laboratory's (SWRL) Financial System are described. Input data requirements and output report formats are presented together with a discussion of the Financial Master File updating capabilities of the system. This document should be…

  7. GRIDGEN Version 1.0: a computer program for generating unstructured finite-volume grids

    USGS Publications Warehouse

    Lien, Jyh-Ming; Liu, Gaisheng; Langevin, Christian D.

    2015-01-01

    GRIDGEN is a computer program for creating layered quadtree grids for use with numerical models, such as the MODFLOW–USG program for simulation of groundwater flow. The program begins by reading a three-dimensional base grid, which can have variable row and column widths and spatially variable cell top and bottom elevations. From this base grid, GRIDGEN will continuously divide into four any cell intersecting user-provided refinement features (points, lines, and polygons) until the desired level of refinement is reached. GRIDGEN will then smooth, or balance, the grid so that no two adjacent cells, including overlying and underlying cells, differ by more than a user-specified level tolerance. Once these gridding processes are completed, GRIDGEN saves a tree structure file so that the layered quadtree grid can be quickly reconstructed as needed. Once a tree structure file has been created, GRIDGEN can then be used to (1) export the layered quadtree grid as a shapefile, (2) export grid connectivity and cell information as ASCII text files for use with MODFLOW–USG or other numerical models, and (3) intersect the grid with shapefiles of points, lines, or polygons, and save intersection output as ASCII text files and shapefiles. The GRIDGEN program is demonstrated by creating a layered quadtree grid for the Biscayne aquifer in Miami-Dade County, Florida, using hydrologic features to control where refinement is added.
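
    The refinement step, dividing into four any cell that intersects a refinement feature until the desired level is reached, can be sketched compactly. The Python below is a 2-D toy with point features only; GRIDGEN's layered grids, line/polygon features, and smoothing/balancing are omitted.

        # Toy quadtree refinement: split any cell containing a refinement
        # point into four children until max_level is reached.
        def refine(cell, points, max_level, level=0):
            x0, y0, x1, y1 = cell
            hit = any(x0 <= px <= x1 and y0 <= py <= y1 for px, py in points)
            if not hit or level == max_level:
                return [cell]
            xm, ym = (x0 + x1) / 2.0, (y0 + y1) / 2.0
            children = [(x0, y0, xm, ym), (xm, y0, x1, ym),
                        (x0, ym, xm, y1), (xm, ym, x1, y1)]
            out = []
            for child in children:
                out.extend(refine(child, points, max_level, level + 1))
            return out

        cells = refine((0.0, 0.0, 1.0, 1.0), [(0.3, 0.7)], max_level=3)
        print(len(cells), "leaf cells")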

  8. R.E.D. Server: a web service for deriving RESP and ESP charges and building force field libraries for new molecules and molecular fragments.

    PubMed

    Vanquelef, Enguerran; Simon, Sabrina; Marquant, Gaelle; Garcia, Elodie; Klimerak, Geoffroy; Delepine, Jean Charles; Cieplak, Piotr; Dupradeau, François-Yves

    2011-07-01

    R.E.D. Server is a unique, open web service designed to derive non-polarizable RESP and ESP charges and to build force field libraries for new molecules/molecular fragments. It provides computational biologists with the means to rigorously derive molecular electrostatic potential-based charges embedded in force field libraries that are ready to be used in force field development, charge validation, and molecular dynamics simulations. R.E.D. Server interfaces quantum mechanics programs, the RESP program, and the latest version of the R.E.D. tools. A two-step approach has been developed. The first step consists of preparing P2N file(s) to rigorously define key elements such as atom names, topology, and chemical equivalencing needed when building a force field library. Then, P2N files are used to derive RESP or ESP charges embedded in force field libraries in the Tripos mol2 format. In complex cases an entire set of force field libraries or a force field topology database is generated. Other features developed in R.E.D. Server include help services, a demonstration, tutorials, frequently asked questions, Jmol-based tools useful for constructing PDB input files and parsing R.E.D. Server outputs, as well as a graphical queuing system allowing any user to check the status of R.E.D. Server jobs.

  9. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    The Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements a format designed to facilitate the transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to the TOAD format standard is called a TOAD file. TOAD Editor is an interactive software tool for manipulating the contents of TOAD files. It is commonly used to extract filtered subsets of data for visualization of the results of computation. It also offers such user-oriented features as on-line help, clear English error messages, a startup file, user-defined macroinstructions, command history, user variables, an UNDO feature, and a full complement of mathematical, statistical, and conversion functions. A companion program, TOAD Gateway (LAR-14484), converts data files from a variety of other file formats to that of TOAD. TOAD Editor is written in FORTRAN 77.

  10. SARAH 4: A tool for (not only SUSY) model builders

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2014-06-01

    We present the new version of the Mathematica package SARAH, which provides the same features for a non-supersymmetric model as previous versions did for supersymmetric models. This includes an easy and straightforward definition of the model, and the calculation of all vertices, mass matrices, tadpole equations, and self-energies. The two-loop renormalization group equations for a general gauge theory are now also included and have been validated against the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD, and the UFO format can be written, and source code for SPheno can be generated for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version, and by linking Susyno the handling of Lie groups has been improved and extended.

  11. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  12. A sophisticated cad tool for the creation of complex models for electromagnetic interaction analysis

    NASA Astrophysics Data System (ADS)

    Dion, Marc; Kashyap, Satish; Louie, Aloisius

    1991-06-01

    This report describes the essential features of the MS-DOS version of DIDEC-DREO, an interactive program for creating wire grid, surface patch, and cell models of complex structures for electromagnetic interaction analysis. It uses the device-independent graphics library DIGRAF and the graphics kernel system HALO, and can be executed on systems with various graphics devices. Complicated structures can be created by direct alphanumeric keyboard entry, digitization of blueprints, conversion from existing geometric structure files, and merging of simple geometric shapes. A completed DIDEC geometric file may then be converted to the format required for input to a variety of time-domain and frequency-domain electromagnetic interaction codes. This report gives a detailed description of the program DIDEC-DREO, its installation, and its theoretical background. Each available interactive command is described. The associated program HEDRON, which generates simple geometric shapes, and other programs that extract current amplitude data from electromagnetic interaction code outputs are also discussed.

  13. Applications of Mars Global Reference Atmospheric Model (Mars-GRAM 2005) Supporting Mission Site Selection for Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.

    2008-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM 2005) is an engineering-level atmospheric model widely used for diverse mission applications. An overview is presented of Mars-GRAM 2005 and its new features. One new feature of Mars-GRAM 2005 is the 'auxiliary profile' option. In this option, an input file of temperature and density versus altitude is used to replace mean atmospheric values from Mars-GRAM's conventional (General Circulation Model) climatology. An auxiliary profile can be generated from any source of data or alternate model output. Auxiliary profiles for this study were produced from mesoscale model output (Southwest Research Institute's Mars Regional Atmospheric Modeling System (MRAMS) model and Oregon State University's Mars mesoscale model (MMM5)) and a global Thermal Emission Spectrometer (TES) database. The global TES database has been specifically generated for the purpose of making Mars-GRAM auxiliary profiles. This database contains averages and standard deviations of temperature, density, and thermal wind components, averaged over 5-by-5 degree latitude-longitude bins and 15 degree L(s) bins, for each of three Mars years of TES nadir data. Results are presented using auxiliary profiles produced from the mesoscale model output and TES observed data for candidate Mars Science Laboratory (MSL) landing sites. Input parameters rpscale (for density perturbations) and rwscale (for wind perturbations) can be used to "recalibrate" Mars-GRAM perturbation magnitudes to better replicate observed or mesoscale model variability.
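
    The binning behind such a database reduces to accumulating samples into 5-by-5 degree latitude-longitude cells. A minimal numpy sketch, with invented array names and the 15 degree L(s) dimension omitted for brevity:

        import numpy as np

        # Toy averaging of TES-like temperature samples into 5-by-5 degree
        # latitude-longitude bins; sample values are invented.
        lats = np.array([-12.0, -11.0, 3.0, 4.5])       # latitudes, deg
        lons = np.array([40.0, 42.0, 41.0, 44.0])       # longitudes, deg
        temps = np.array([185.0, 190.0, 210.0, 205.0])  # temperatures, K

        lat_bin = np.floor((lats + 90.0) / 5.0).astype(int)   # 0..35
        lon_bin = np.floor((lons % 360.0) / 5.0).astype(int)  # 0..71

        sums = np.zeros((36, 72))
        counts = np.zeros((36, 72))
        np.add.at(sums, (lat_bin, lon_bin), temps)
        np.add.at(counts, (lat_bin, lon_bin), 1)
        with np.errstate(invalid="ignore", divide="ignore"):
            mean_temp = sums / counts   # NaN where a bin holds no samples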

  14. The Functional Measurement Experiment Builder suite: two Java-based programs to generate and run functional measurement experiments.

    PubMed

    Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter

    2008-05-01

    We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that require randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel-compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.

  15. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this situation, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences, and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
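
    The required input file, unique reads with copy numbers, can be produced by a simple collapsing pass. A minimal Python sketch, assuming one read sequence per line rather than full FASTQ:

        from collections import Counter

        # Collapse raw reads into unique sequences with copy numbers,
        # the input format the abstract describes.
        counts = Counter()
        with open("reads.txt") as f:
            for line in f:
                seq = line.strip()
                if seq:
                    counts[seq] += 1

        with open("unique_reads.txt", "w") as out:
            for seq, n in counts.most_common():
                out.write(f"{seq}\t{n}\n")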

  16. A new insight into the oscillation characteristics of endosonic files used in dentistry.

    PubMed

    Lea, S C; Walmsley, A D; Lumley, P J; Landini, G

    2004-05-21

    The aim of this study was to assess the oscillation characteristics of unconstrained endosonic files using a scanning laser vibrometer (SLV). Factors investigated included file vibration frequency and node/antinode location as well as the variation in file displacement amplitude due to increasing generator power setting. A 30 kHz Mini Piezon generator (Electro-Medical Systems, Switzerland) was used in conjunction with a #15 and #35 K-file. Each file was fixed in position with the long axis of the file perpendicular to the SLV camera head. The laser from the SLV was scanned over the length of the oscillating file for generator power settings 1 to 5 (minimum to half power). Measurements were repeated ten times. The fundamental vibration frequency for both files was 27.50 kHz. Scans of each file showed the positions of nodes/anti-nodes along the file length. The #15 file demonstrated no significant variation in its mean maximum displacement amplitude with increasing generator power, except at power setting 5, where a decrease in displacement amplitude was observed. The #35 file showed a general increase in mean maximum displacement amplitude with increasing power setting, except at power setting 4 where a 65% decrease in displacement amplitude occurred. In conclusion, scanning laser vibrometry is an effective method for assessing endosonic file vibration characteristics. The SLV was able to demonstrate that (unloaded) file vibration displacement amplitude does not increase linearly with increasing generator power. Further work is being performed on a greater variety of files and generators. Vibration characteristics of files under various loads and varying degrees of constraint should also be investigated.

  17. BioVEC: a program for biomolecule visualization with ellipsoidal coarse-graining.

    PubMed

    Abrahamsson, Erik; Plotkin, Steven S

    2009-09-01

    Biomolecule Visualization with Ellipsoidal Coarse-graining (BioVEC) is a tool for visualizing molecular dynamics simulation data while allowing coarse-grained residues to be rendered as ellipsoids. BioVEC reads in configuration files, which may be output from molecular dynamics simulations that include orientation output in either quaternion or ANISOU format, and can render frames of the trajectory in several common image formats for subsequent concatenation into a movie file. The BioVEC program is written in C++, uses the OpenGL API for rendering, and is open source. It is lightweight, allows user-defined display settings such as texture, and runs on either Windows or Linux platforms.

  18. GEOTHERM user guide

    USGS Publications Warehouse

    Swanson, James R.

    1977-01-01

    GEOTHERM is a computerized geothermal resources file developed by the U.S. Geological Survey. The file contains data on geothermal fields, wells, and chemical analyses from United States and international sources. The General Information Processing System (GIPSY) on the IBM 370/155 computer is used to store and retrieve data. The GIPSY retrieval program contains simple commands which can be used to search the file, select a narrowly defined subset, sort the records, and output the data in a variety of forms. Eight commands are listed and explained so that the GEOTHERM file can be accessed directly by geologists. No programming experience is necessary to retrieve data from the file.

  19. EnergyPlus™

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Originally developed in 1999, an updated version 8.8.0 with bug fixes was released on September 30th, 2017. EnergyPlus™ is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption—for heating, cooling, ventilation, lighting and plug and process loads—and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.

  20. Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Arms, S. C.

    2015-12-01

    Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII-based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger-generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine-readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built in. One benefit of translating ASCII data into a machine-readable format that follows open community-driven standards is that the data are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
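
    The kind of translation Rosetta automates can be approximated in a few lines with the netCDF4 Python library: read a CSV time series, then write a netCDF file carrying units and conventions metadata. The variable names, units, and input layout below are illustrative assumptions, not Rosetta's actual output.

        import netCDF4
        import numpy as np

        # Read a toy two-column datalogger CSV: time, temperature.
        times, temps = [], []
        with open("datalogger.csv") as f:
            next(f)                              # skip header row
            for line in f:
                t, temp = line.strip().split(",")
                times.append(float(t))
                temps.append(float(temp))

        # Write a CF-style netCDF file with descriptive metadata.
        nc = netCDF4.Dataset("datalogger.nc", "w")
        nc.Conventions = "CF-1.6"
        nc.featureType = "timeSeries"            # discrete sampling geometry
        nc.createDimension("time", len(times))

        time_var = nc.createVariable("time", "f8", ("time",))
        time_var.units = "seconds since 2015-01-01 00:00:00"
        temp_var = nc.createVariable("air_temperature", "f4", ("time",))
        temp_var.units = "K"

        time_var[:] = np.array(times)
        temp_var[:] = np.array(temps)
        nc.close()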

  1. 75 FR 7577 - Combined Notice of Filings # 1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ... following exempt wholesale generator filings: Docket Numbers: EG10-18-000. Applicants: CER Generation, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of CER Generation, LLC. Filed...Energy, Inc., Nine Mile Point Nuclear Station, LLC, CER Generation II, LLC, Handsome Lake Energy, LLC...

  2. Recursive Feature Extraction in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    ReFeX extracts recursive topological features from graph data. The input is a graph as a CSV file and the output is a CSV file containing feature values for each node in the graph. The features are based on topological counts in the neighborhood of each node, as well as recursive summaries of neighbors' features.
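
    A minimal Python sketch of the recursive idea: start from a local count (degree), then repeatedly append each node's mean and sum of its neighbors' current features. The real ReFeX additionally prunes correlated features, which is omitted here.

        # Toy recursive feature extraction over an adjacency dict.
        def recursive_features(adj, rounds=2):
            features = {v: [float(len(nbrs))] for v, nbrs in adj.items()}
            for _ in range(rounds):
                width = len(next(iter(features.values())))
                new = {}
                for v, nbrs in adj.items():
                    extra = []
                    for i in range(width):
                        vals = [features[u][i] for u in nbrs]
                        mean = sum(vals) / len(vals) if vals else 0.0
                        extra.append(mean)             # neighbor mean
                        extra.append(float(sum(vals))) # neighbor sum
                    new[v] = features[v] + extra
                features = new
            return features

        adj = {0: [1, 2], 1: [0], 2: [0]}
        print(recursive_features(adj, rounds=1))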

  3. ARSRP Signal Processing Software

    DTIC Science & Technology

    1993-02-01

    Hardcopy of plot? Y or N : Y
    Enter name of postscript output file. : wave2.ps
    Postscript file created.
    Another plot? Y or N : N
    The two plots created and stored in wave1.ps and wave2.ps are shown in Figures 4 and 5 with the corresponding MSS real-time plots from the ARSRP Monitoring Support

  4. pmx Webserver: A User Friendly Interface for Alchemistry.

    PubMed

    Gapsys, Vytautas; de Groot, Bert L

    2017-02-27

    With the increase of available computational power and improvements in simulation algorithms, alchemical molecular dynamics based free energy calculations have developed into routine usage. To further facilitate the usability of alchemical methods for amino acid mutations, we have developed a web-based infrastructure for obtaining hybrid protein structures and topologies. The presented webserver allows amino acid mutation selection in five contemporary molecular mechanics force fields. In addition, a complete mutation scan with a user-defined amino acid is supported. The output generated by the webserver is directly compatible with the Gromacs molecular dynamics engine and can be used with any alchemical free energy calculation setup. Furthermore, we present a database of input files and precalculated free energy differences for tripeptides approximating a disordered state of a protein, of particular use for protein stability studies. Finally, the usage of the webserver and its output is exemplified by performing an alanine scan and investigating the thermodynamic stability of the Trp cage mini protein. The webserver is accessible at http://pmx.mpibpc.mpg.de.

  5. Aquatic toxicity information retrieval data base (AQUIRE for non-vms) (1600 bpi). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The purpose of AQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. During 1992 and early 1993, nine data updates were made to the AQUIRE system. AQUIRE now contains 109,338 individual aquatic toxicity test results for 5,159 chemicals, 2,429 organisms, and over 160 endpoints reviewed from 7,517 publications. New features include a data selection option that permits searches that are restricted to data added or modified through any of the eight most recent updates, and a report generation (Full Record Detail) that displays the entire AQUIRE record for each test identified in a search. Selection of the Full Record Detail feature allows the user to peruse all AQUIRE fields for a given test, including the information stored in the remarks section, while the standard AQUIRE output format presents selected data fields in a concise table. The standard report remains an available option for rapid viewing of system output.

  6. Aquatic toxicity information retrieval data base (AQUIRE for non-vms) (6250 bpi). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The purpose of AQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. During 1992 and early 1993, nine data updates were made to the AQUIRE system. AQUIRE now contains 109,338 individual aquatic toxicity test results for 5,159 chemicals, 2,429 organisms, and over 160 endpoints reviewed from 7,517 publications. New features include a data selection option that permits searches that are restricted to data added or modified through any of the eight most recent updates, and a report generation (Full Record Detail) that displays the entire AQUIRE record for each test identified in a search. Selection of the Full Record Detail feature allows the user to peruse all AQUIRE fields for a given test, including the information stored in the remarks section, while the standard AQUIRE output format presents selected data fields in a concise table. The standard report remains an available option for rapid viewing of system output.

  7. 75 FR 35012 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ..., 2010. Take notice that the Commission received the following exempt wholesale generator filings: Docket...: Notice of Self-Certification of Exempt Wholesale Generator Status of Alta Wind I, LLC. Filed Date: 06/08... Exempt Wholesale Generator Status of Alta Wind II, LLC. Filed Date: 06/08/2010. Accession Number...

  8. A user-oriented synthetic workload generator

    NASA Technical Reports Server (NTRS)

    Kao, Wei-Lun

    1991-01-01

    A user-oriented synthetic workload generator that simulates users' file access behavior based on real workload characterization is described. The model for this workload generator is user-oriented and job-specific, represents file I/O operations at the system call level, allows general distributions for the usage measures, and assumes independence in the file I/O operation stream. The workload generator consists of three parts, which handle specification of distributions, creation of an initial file system, and selection and execution of file I/O operations. Experiments on SUN NFS are shown to demonstrate the usage of the workload generator.
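
    A toy Python sketch of the latter two parts, initial file system creation plus an operation stream drawn from a fixed distribution, follows; the operation mix and sizes are invented for illustration.

        import os
        import random

        random.seed(1)
        os.makedirs("scratch", exist_ok=True)

        # Part 2: create an initial file system of small random files.
        files = []
        for i in range(10):
            path = os.path.join("scratch", f"f{i}.dat")
            with open(path, "wb") as f:
                f.write(os.urandom(4096))
            files.append(path)

        # Part 3: issue independent file I/O operations drawn from a
        # 60/30/10 read/write/stat mix (an invented distribution).
        ops = ["read"] * 6 + ["write"] * 3 + ["stat"]
        for _ in range(100):
            op, path = random.choice(ops), random.choice(files)
            if op == "read":
                with open(path, "rb") as f:
                    f.read(1024)
            elif op == "write":
                with open(path, "ab") as f:
                    f.write(os.urandom(1024))
            else:
                os.stat(path)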

  9. User's Manual for Aerofcn: a FORTRAN Program to Compute Aerodynamic Parameters

    NASA Technical Reports Server (NTRS)

    Conley, Joseph L.

    1992-01-01

    The computer program AeroFcn is discussed. AeroFcn is a utility program that computes the following aerodynamic parameters: geopotential altitude, Mach number, true velocity, dynamic pressure, calibrated airspeed, equivalent airspeed, impact pressure, total pressure, total temperature, Reynolds number, speed of sound, static density, static pressure, static temperature, coefficient of dynamic viscosity, kinematic viscosity, geometric altitude, and specific energy for a standard- or a modified standard-day atmosphere using compressible flow and normal shock relations. Any two parameters that define a unique flight condition are selected, and their values are entered interactively. The remaining parameters are computed, and the solutions are stored in an output file. Multiple cases can be run, and the multiple case solutions can be stored in another output file for plotting. Parameter units, the output format, and primary constants in the atmospheric and aerodynamic equations can also be changed.
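
    As an illustration of the underlying relations, the Python sketch below derives a few of these parameters from one defining pair (geopotential altitude and true velocity) using standard-day tropospheric equations. It is a simplified stand-in, not AeroFcn's FORTRAN implementation, and is valid only below 11 km.

        import math

        # Standard-day troposphere: derive Mach number, dynamic pressure,
        # and static conditions from altitude (m) and true velocity (m/s).
        def standard_day(h_m, v_mps):
            T = 288.15 - 0.0065 * h_m                 # static temperature, K
            p = 101325.0 * (T / 288.15) ** 5.2559     # static pressure, Pa
            rho = p / (287.05 * T)                    # static density, kg/m^3
            a = math.sqrt(1.4 * 287.05 * T)           # speed of sound, m/s
            mach = v_mps / a
            q = 0.5 * rho * v_mps ** 2                # dynamic pressure, Pa
            return {"mach": mach, "q_Pa": q, "T_K": T, "p_Pa": p}

        print(standard_day(5000.0, 250.0))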

  10. Characterizing output bottlenecks in a supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Bing; Chase, Jeffrey; Dillow, David A

    2012-01-01

    Supercomputer I/O loads are often dominated by writes. HPC (High Performance Computing) file systems are designed to absorb these bursty outputs at high bandwidth through massive parallelism. However, the delivered write bandwidth often falls well below the peak. This paper characterizes the data absorption behavior of a center-wide shared Lustre parallel file system on the Jaguar supercomputer. We use a statistical methodology to address the challenges of accurately measuring a shared machine under production load and to obtain the distribution of bandwidth across samples of compute nodes, storage targets, and time intervals. We observe and quantify limitations from competing traffic, contention on storage servers and I/O routers, concurrency limitations in the client compute node operating systems, and the impact of variance (stragglers) on coupled output such as striping. We then examine the implications of our results for application performance and the design of I/O middleware systems on shared supercomputers.

  11. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on these systems, and developed as part of the Ultra-High Resolution Climate Modeling Project, allows users of OLCF resources to efficiently share simulated data, often multi-terabyte in volume, as well as the results from the modeling experiments and various synthesized products derived from these simulations. The final objective in the exercise is to ensure that the simulation results and the enhanced understanding will serve the needs of a diverse group of stakeholders across the world, including our research partners in U.S. Department of Energy laboratories & universities, domain scientists, students (K-12 as well as higher education), resource managers, decision makers, and the general public.

  12. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  13. MS/MS Automated Selected Ion Chromatograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monroe, Matthew

    2005-12-12

    This program can be used to read an LC-MS/MS data file from either a Finnigan ion trap mass spectrometer (.Raw file) or an Agilent ion trap mass spectrometer (.MGF and .CDF files) and create a selected ion chromatogram (SIC) for each of the parent ion masses chosen for fragmentation. The largest peak in each SIC is also identified, with reported statistics including peak elution time, height, area, and signal-to-noise ratio. It creates several output files, including a base peak intensity (BPI) chromatogram for the survey scan, a BPI for the fragmentation scans, an XML file containing the SIC data for each parent ion, and a "flat file" (ready for import into a database) containing summaries of the SIC data statistics.
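
    A selected ion chromatogram itself is simple to compute once scan data are in memory: for one parent m/z, sum the intensities within a tolerance in each scan. A minimal Python sketch with an invented in-memory scan layout:

        # Toy SIC: each scan is (time, [m/z values], [intensities]).
        def selected_ion_chromatogram(scans, parent_mz, tol=0.5):
            sic = []
            for time, mzs, intensities in scans:
                total = sum(i for m, i in zip(mzs, intensities)
                            if abs(m - parent_mz) <= tol)
                sic.append((time, total))
            return sic

        scans = [(0.5, [400.1, 500.2], [10.0, 3.0]),
                 (1.0, [500.3, 612.4], [55.0, 7.0]),
                 (1.5, [500.1], [20.0])]
        sic = selected_ion_chromatogram(scans, parent_mz=500.2)
        peak_time, peak_height = max(sic, key=lambda p: p[1])
        print("peak elution time:", peak_time, "height:", peak_height)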

  14. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.

  15. Biological data integration: wrapping data and tools.

    PubMed

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views, mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.

  16. PRince: a web server for structural and physicochemical analysis of protein-RNA interface.

    PubMed

    Barik, Amita; Mishra, Abhishek; Bahadur, Ranjit Prasad

    2012-07-01

    We have developed a web server, PRince, which analyzes the structural features and physicochemical properties of the protein-RNA interface. Users need to submit a PDB file containing the atomic coordinates of both the protein and the RNA molecules in complex form (in '.pdb' format). They should also mention the chain identifiers of the interacting protein and RNA molecules. The size of the protein-RNA interface is estimated by measuring the solvent accessible surface area buried in contact. For a given protein-RNA complex, PRince calculates the structural, physicochemical, and hydration properties of the interacting surfaces. All these parameters generated by the server are presented in a tabular format. The interacting surfaces can also be visualized with software plug-ins such as Jmol. In addition, the output files containing the list of the atomic coordinates of the interacting protein, RNA, and interface water molecules can be downloaded. The parameters generated by PRince are novel, and users can correlate them with the experimentally determined biophysical and biochemical parameters for a better understanding of the specificity of the protein-RNA recognition process. This server will be continuously upgraded to include more parameters. PRince is publicly accessible and free for use. Available at http://www.facweb.iitkgp.ernet.in/~rbahadur/prince/home.html.

  17. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  18. DockoMatic 2.0: High Throughput Inverse Virtual Screening and Homology Modeling

    PubMed Central

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T.; McDougal, Owen M.; Andersen, Timothy L.

    2013-01-01

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly Graphical User Interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to: (1) conduct high throughput Inverse Virtual Screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently setup, start, and manage IVS experiments through the DockoMatic GUI by specifying a receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories, and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by Dockomatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The wizard TIM provides an interface that accesses the basic local alignment search tool (BLAST) and MODELLER programs, and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education. PMID:23808933

  19. 78 FR 55693 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice that the Commission received the following electric corporate filings: Docket Numbers: EC13-143-000. Applicants: Seneca Generation, LLC, Lake Lynn Generation, LLC, All Dams Generation, LLC, PE Hydro Generation...

  20. Tool for Merging Proposals Into DSN Schedules

    NASA Technical Reports Server (NTRS)

    Khanampornpan, Teerapat; Kwok, John; Call, Jared

    2008-01-01

    A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
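
    The conflict question at the heart of this workflow is an interval-overlap test. A toy Python sketch, with the 7da and C1/C2 parsing omitted and intervals reduced to (antenna, start, end) tuples; the antenna names are illustrative:

        # Flag proposal intervals that overlap scheduled use of the same
        # antenna. Times here are plain hours for simplicity.
        def find_conflicts(schedule, proposal):
            conflicts = []
            for ant_p, s_p, e_p in proposal:
                for ant_s, s_s, e_s in schedule:
                    if ant_p == ant_s and s_p < e_s and s_s < e_p:  # overlap
                        conflicts.append(((ant_p, s_p, e_p),
                                          (ant_s, s_s, e_s)))
            return conflicts

        schedule = [("DSS-14", 2.0, 5.0), ("DSS-43", 6.0, 9.0)]
        proposal = [("DSS-14", 4.5, 7.0)]
        print(find_conflicts(schedule, proposal))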

  1. Online data handling and storage at the CMS experiment

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and handle output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily, and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources at an aggregate rate of ∼2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service, and the transfer system.
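
    The file-based bookkeeping pattern described here, where each data file travels with a small JSON document, can be sketched in a few lines. The field names below are invented stand-ins, not the actual CMS document schema:

      import json, os

      def write_with_metadata(path: str, payload: bytes, run: int, lumi: int) -> None:
          """Write a data file plus a small JSON sidecar describing it,
          mimicking the file-based bookkeeping style described above."""
          with open(path, "wb") as f:
              f.write(payload)
          meta = {
              "file": os.path.basename(path),
              "run": run,                # hypothetical field names
              "lumisection": lumi,
              "size_bytes": len(payload),
          }
          with open(path + ".jsn", "w") as f:
              json.dump(meta, f)

      write_with_metadata("run000001_ls0001_streamA.dat", b"\x00" * 1024, run=1, lumi=1)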

  2. Online Data Handling and Storage at the CMS Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J. M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and handle output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily, and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources at an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service, and the transfer system.

  3. Photogrammetric Measurements in Fixed Wing Uav Imagery

    NASA Astrophysics Data System (ADS)

    Gülch, E.

    2012-07-01

    Several flights have been undertaken with PAMS (Photogrammetric Aerial Mapping System) by Germap, Germany, which is briefly introduced. This system is based on the SmartPlane fixed-wing UAV and a CANON IXUS camera system. The plane is equipped with GPS and has an infrared sensor system to estimate attitude values. Software has been developed to link the PAMS output to a standard photogrammetric processing chain built on Trimble INPHO. The linking of the image files and image IDs and the handling of different cases with partly corrupted output have to be solved to generate an INPHO project file. Based on this project file, the software packages MATCH-AT, MATCH-T DSM, OrthoMaster and OrthoVista are applied for digital aerial triangulation, DTM/DSM generation, and finally digital orthomosaic generation. The focus has been on investigating how to adapt the "usual" parameters of the digital aerial triangulation and other software to the UAV flight conditions, which show high overlaps, large kappa angles, and a certain image blur in the case of turbulence. It was found that the selected parameter setup shows quite stable behaviour and can be applied to other flights. A comparison is made to results from other open-source multi-ray matching software to assess handling of the described flight conditions. Flights over the same area at different times have been compared to each other. The major objective here was to see how far the results differ relative to each other without access to ground control data, which would have potential for applications with low requirements on absolute accuracy. The results show that influences of weather and illumination are visible. The "unusual" flight pattern, with large time differences between neighbouring strips, has an influence on the AT and DTM/DSM generation. The results obtained so far indicate problems in the stability of the camera calibration. This clearly calls for the use of GCPs in all projects, independent of the application. The effort is estimated to be even higher than expected, as self-calibration will also be needed to handle a possibly unstable camera calibration. To overcome some of the problems encountered with the very specific features of UAV flights, the software UAVision was developed, based on open-source libraries, to produce input data for bundle adjustment of UAV images acquired by PAMS. The empirical test results show a considerable improvement in the matching of tie points. The results do, however, show that the open-source bundle adjustment was not applicable to this type of imagery. This still leaves the possibility to use the improved tie point correspondences in the commercial AT package.

  4. COMGEN - A PROGRAM FOR GENERATING FINITE ELEMENT MODELS OF COMPOSITE MATERIALS AT THE MICRO LEVEL (SGI IRIS VERSION)

    NASA Technical Reports Server (NTRS)

    Melis, M. E.

    1994-01-01

    A significant percentage of time spent in a typical finite element analysis is taken up in the modeling and assignment of loads and constraints. This process not only requires the analyst to be well-versed in the art of finite element modeling, but also demands familiarity with some sort of preprocessing software in order to complete the task expediently. COMGEN (COmposite Model GENerator) is an interactive FORTRAN program which can be used to create a wide variety of finite element models of continuous fiber composite materials at the micro level. It quickly generates batch or "session files" to be submitted to the finite element pre- and post-processor program PATRAN (PDA Engineering, Costa Mesa, CA). In modeling a composite material, COMGEN assumes that its constituents can be represented by a "unit cell" of a fiber surrounded by matrix material. Two basic cell types are available. The first is a square packing arrangement where the fiber is positioned in the center of a square matrix cell. The second type, hexagonal packing, has the fiber centered in a hexagonal matrix cell. Different models can be created using combinations of square and hexagonal packing schemes. Variations include two- and three-dimensional cases, models with a fiber-matrix interface, and different constructions of unit cells. User inputs include fiber diameter and percent fiber-volume of the composite to be analyzed. In addition, various mesh densities, boundary conditions, and loads can be assigned to the models within COMGEN. The PATRAN program then uses a COMGEN session file to generate finite element models and their associated loads which can then be translated to virtually any finite element analysis code such as NASTRAN or MARC. COMGEN is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and SGI IRIS series workstations under IRIX. If the user has the PATRAN package available, the output can be graphically displayed. Without PATRAN, the output is tabular. The VAX VMS version is available on a 5.25 inch 360K MS-DOS format diskette (standard distribution media) or a 9-track 1600 BPI DEC VAX FILES-11 format magnetic tape, and it requires about 124K of main memory. The standard distribution media for the IRIS version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The memory requirement for the IRIS version is 627K. COMGEN was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. PATRAN is a registered trademark of PDA Engineering. SGI IRIS and IRIX are trademarks of Silicon Graphics, Inc. MS-DOS is a registered trademark of Microsoft Corporation. UNIX is a registered trademark of AT&T.
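
    The unit-cell geometry implies a simple relation between fiber diameter, fiber volume fraction, and cell size. The sketch below (Python rather than COMGEN's FORTRAN, with the relations derived from the stated square and hexagonal cell geometries) computes the fiber-center spacing for both packing options:

      import math

      def cell_size(d_fiber: float, vf: float, packing: str = "square") -> float:
          """Spacing s between fiber centers that yields fiber volume fraction
          vf for a fiber of diameter d, per unit-cell geometry:
            square:    vf = (pi/4) * (d/s)**2
            hexagonal: vf = (pi / (2*sqrt(3))) * (d/s)**2
          """
          k = math.pi / 4 if packing == "square" else math.pi / (2 * math.sqrt(3))
          return d_fiber * math.sqrt(k / vf)

      # a 10 micrometre fiber at 60% fiber volume in both packings
      for p in ("square", "hexagonal"):
          print(p, round(cell_size(10e-6, 0.60, p), 8))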

  5. Coastal Area Tactical-mapping System (CATS)

    DTIC Science & Technology

    2007-09-30

    file (I.STD) file. A direct comparison to the timetag of the scanner index wedge times should then yield the shot number that corresponds to the...output of 4 relevant timing files. They are as follows: A.STD: The time at which the A-Scan Wedge index mark was detected. This is recorded as...a coarse time (seconds) and a fine time (microseconds). B.STD: The time at which the B-Scan Wedge index mark was detected. This is recorded as a

  6. JPLEX: Java Simplex Implementation with Branch-and-Bound Search for Automated Test Assembly

    ERIC Educational Resources Information Center

    Park, Ryoungsun; Kim, Jiseon; Dodd, Barbara G.; Chung, Hyewon

    2011-01-01

    JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed integer linear programming (MILP) solver written in Java. It reads in a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and…
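
    The two ingredients named here, an LP relaxation plus branch-and-bound on fractional variables, can be illustrated generically. This is a minimal sketch of that textbook scheme using scipy's linprog, not JPLEX's Java implementation:

      import math
      from scipy.optimize import linprog

      def branch_and_bound(c, A_ub, b_ub, bounds):
          """Minimise c.x subject to A_ub.x <= b_ub with all x integer.
          Solves the fully relaxed LP first, then branches on fractional vars."""
          best = (math.inf, None)
          stack = [bounds]
          while stack:
              bnds = stack.pop()
              res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bnds)
              if not res.success or res.fun >= best[0]:
                  continue                      # infeasible, or cannot improve
              frac = [(i, x) for i, x in enumerate(res.x)
                      if abs(x - round(x)) > 1e-6]
              if not frac:                      # integral: new incumbent
                  best = (res.fun, res.x)
                  continue
              i, x = frac[0]                    # branch on first fractional var
              lo, hi = bnds[i]
              left, right = list(bnds), list(bnds)
              left[i] = (lo, math.floor(x))     # x_i <= floor(x)
              right[i] = (math.ceil(x), hi)     # x_i >= ceil(x)
              stack += [left, right]
          return best

      # tiny example: maximise x + y  ==  minimise -(x + y)
      print(branch_and_bound([-1, -1], [[2, 3]], [12], [(0, 5), (0, 5)]))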

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.

    The Super Strypi SWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle that includes a subset of the Super Strypi NGC software (guidance, ACS and sequencer). Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters, guidance parameters and Monte-Carlo parameters are defined in input files. Output parameters are saved to a Matlab mat file.

  8. Assessing Homegrown Library Collections: Using Google Analytics to Track Use of Screencasts and Flash-Based Learning Objects

    ERIC Educational Resources Information Center

    Betty, Paul

    2009-01-01

    Increasing use of screencast and Flash authoring software within libraries is resulting in "homegrown" library collections of digital learning objects and multimedia presentations. The author explores the use of Google Analytics to track usage statistics for interactive Shockwave Flash (.swf) files, the common file output for screencast and Flash…

  9. 75 FR 33748 - Amateur Radio Use of the Allocation at 5 MHz

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-15

    ... envelope power (PEP). 3. The existing amateur radio use of the 60 meter band represents a balancing of... Comment Filing System (ECFS), (2) the Federal Government's eRulemaking Portal, or (3) by filing paper... transmitter output power in modern amateur radio transceivers is 100 W PEP, and that the present 50 W PEP...

  10. Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.

  11. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
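
    The technique is straightforward to sketch: concatenate the small files and keep an offset/length table as metadata. A minimal in-memory illustration, with a plain dict standing in for whatever index structure the patented parallel file system actually uses:

      import json

      def aggregate(files: dict) -> tuple:
          """Pack {name: bytes} into one blob plus offset/length metadata."""
          blob, meta, offset = b"", {}, 0
          for name, data in files.items():
              meta[name] = {"offset": offset, "length": len(data)}
              blob += data
              offset += len(data)
          return blob, meta

      def unpack(blob: bytes, meta: dict, name: str) -> bytes:
          """Recover one small file from the aggregated blob via its metadata."""
          m = meta[name]
          return blob[m["offset"]: m["offset"] + m["length"]]

      blob, meta = aggregate({"a.dat": b"alpha", "b.dat": b"bravo!"})
      assert unpack(blob, meta, "b.dat") == b"bravo!"
      print(json.dumps(meta))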

  12. MASCOT HTML and XML parser: an implementation of a novel object model for protein identification data.

    PubMed

    Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L

    2006-11-01

    Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.

  13. Agricultural Baseline (BL0) scenario

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used.
    Date the data set was last modified: 02/12/2016
    How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109).
    Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC
    The quality assurance and quality control that have been applied:
    • Check for negative planted area, harvested area, production, yield, and cost values.
    • Check if harvested area exceeds planted area for annuals.
    • Check FIPS codes.
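
    The listed QA/QC rules map directly onto row-level checks. A minimal sketch, with field names invented as stand-ins for the POLYSYS output columns and the county-code range check simplified:

      def qa_check(row: dict) -> list:
          """Apply the QA/QC rules listed above to one county-year record."""
          problems = []
          for field in ("planted_area", "harvested_area", "production",
                        "yield", "cost"):
              if row[field] < 0:
                  problems.append(f"negative {field}")
          if row["harvested_area"] > row["planted_area"]:
              problems.append("harvested exceeds planted (annual crop)")
          if not (1 <= row["county"] <= 3109):
              problems.append("county code out of range")
          return problems

      row = {"planted_area": 100.0, "harvested_area": 120.0, "production": 900.0,
             "yield": 7.5, "cost": -3.0, "county": 42}
      print(qa_check(row))   # flags the negative cost and the harvested area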

  14. Titan I propulsion system modeling and possible performance improvements

    NASA Astrophysics Data System (ADS)

    Giusti, Oreste

    This thesis features the Titan I propulsion systems and offers data-supported suggestions for improvements to increase performance. The original propulsion systems were modeled both graphically in CAD and via equations. Due to the limited availability of published information, it was necessary to create a more detailed, secondary set of models. Various engineering equations pertinent to rocket engine design were implemented in order to generate the desired extra detail. This study describes how these new models were then imported into the ESI CFD Suite. Various parameters, such as bi-propellant combinations, pressures, temperatures, and mass flow rates, are applied to these imported models as inputs. The results were then processed with ESI VIEW, which is visualization software. The output files were analyzed for forces in the nozzle, and various results were generated, including sea-level thrust and specific impulse (Isp). Experimental data are provided to compare the original engine configuration models to the derived improvement models.

  15. Post-flight BET products for the 2nd discovery entry, STS-19 (51-A)

    NASA Technical Reports Server (NTRS)

    Kelly, G. M.; Mcconnell, J. G.; Heck, M. L.; Troutman, P. A.; Waters, L. A.; Findlay, J. T.

    1985-01-01

    The post-flight products for the second Discovery flight, STS-19 (51-A), are summarized. The inertial best estimate trajectory (BET), BT19D19/UN=169750N, was developed using spacecraft dynamic measurements from Inertial Measurement Unit 2 (IMU2) in conjunction with the best tracking coverage available for any of the earlier Shuttle entries. As a consequence of the latter, an anchor epoch was selected which conforms to an initial altitude of greater than a million feet. The Extended BET, ST19BET/UN=274885C, incorporated the previously mentioned inertial reconstructed state information and the Langley Atmospheric Information Retrieval System (LAIRS) atmosphere, ST19MET/UN=712662N, with some minor exceptions. Primary and back-up AEROBET reels are NK0165 and NK0201, respectively. This product was only developed over the lowermost 360 kft altitude range due to atmosphere problems, but this relates to altitudes well above meaningful signal in the IMUs. Summary results generated from the AEROBET for this flight are presented with meaningful configuration and statistical comparisons from the previous thirteen flights. Modified maximum likelihood estimation (MMLE) files were generated based on IMU2 and the Rate Gyro Assembly/Accelerometer Assembly (RGA/AA), respectively. Appendices attached define spacecraft and physical constants utilized, show plots of the final tracking data residuals from the post-flight fit, list relevant parameters from the BET at a two second spacing, and retain for archival purposes all relevant input and output tapes and files generated.

  16. 76 FR 16400 - Combined Notice of Filings #

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Self-Certification of Exempt Wholesale Generator Status. Filed Date: 03/10/2011. Accession Number... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings Take notice.... Applicants: Ameren Energy Generating Company. Description: Application of Ameren Energy Generating Company to...

  17. MICE data handling on the Grid

    NASA Astrophysics Data System (ADS)

    Martyniak, J.; Mice Collaboration

    2014-06-01

    The international Muon Ionisation Cooling Experiment (MICE) is designed to demonstrate the principle of muon ionisation cooling for the first time, for application to a future Neutrino Factory or Muon Collider. The experiment is currently under construction at the ISIS synchrotron at the Rutherford Appleton Laboratory (RAL), UK. In this paper we present the Raw Data Mover, a system that allows us to store and distribute MICE raw data, and a framework for offline reconstruction and data management. The aim of the Raw Data Mover is to upload raw data files onto safe tape storage as soon as the data have been written out by the DAQ system and marked as ready to be uploaded. Internal integrity of the files is verified and they are uploaded to the RAL Tier-1 Castor Storage Element (SE) and placed on two tapes for redundancy. We also make another copy at a separate disk-based SE at this stage to make it easier for users to access data quickly. Both copies are check-summed and the replicas are registered with an instance of the LCG File Catalog (LFC). On success, a record with basic file properties is added to the MICE Metadata DB. The reconstruction process is triggered by new raw data records filled in by the mover system described above. Off-line reconstruction jobs for new raw files are submitted to RAL Tier-1 and the output is stored on tape. Batch reprocessing is done at multiple MICE-enabled Grid sites and output files are shipped to central tape or disk storage at RAL using a custom File Transfer Controller.
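
    The mover's core loop (checksum, replicate, verify, register) can be sketched compactly. The version below is a hypothetical local analogue: plain directories stand in for the Castor and disk Storage Elements, and a SQLite table stands in for the MICE Metadata DB; the actual Grid transfer and LFC registration calls are not reproduced:

      import hashlib, shutil, sqlite3, tempfile
      from pathlib import Path

      def md5sum(path: Path) -> str:
          h = hashlib.md5()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      def move_raw_file(src, tape_dir, disk_dir, db):
          """Copy a ready raw file to two destinations, verify checksums,
          then record basic file properties (cf. the MICE Metadata DB)."""
          checksum = md5sum(src)
          for dest in (tape_dir, disk_dir):          # stand-ins for the two SEs
              copy = Path(shutil.copy2(src, dest))
              assert md5sum(copy) == checksum        # integrity check per replica
          db.execute("INSERT INTO rawfiles VALUES (?, ?, ?)",
                     (src.name, src.stat().st_size, checksum))
          db.commit()

      work = Path(tempfile.mkdtemp())
      (work / "tape").mkdir(); (work / "disk").mkdir()
      raw = work / "run123.dat"; raw.write_bytes(b"DAQ payload")
      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE rawfiles (name TEXT, size INTEGER, md5 TEXT)")
      move_raw_file(raw, work / "tape", work / "disk", db)
      print(db.execute("SELECT * FROM rawfiles").fetchall())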

  18. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    NASA Astrophysics Data System (ADS)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software-supported procedure for the generation of long-time complex test sequences, suitable for testing instruments for the detection of standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements over the computer-based signal generator presented and described in the previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, the data acquisition card NI 6343, and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of basic signal parameters in the LabVIEW application software is supported using Script files, which allows simple repetition of specific test signals and the combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared to similar solutions for signal generation is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN 50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances are shown, together with logged data obtained from the tested power quality analyzer.
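
    The building block of such a generator is a segment of mains-frequency sine at a scaled amplitude, so that dips, swells, and interruptions can be scripted as sequences of segments. A minimal numpy sketch of that idea (the sample rate, scaling values, and the toy "script" below are illustrative, not the paper's LabVIEW implementation):

      import numpy as np

      FS = 10_000          # sample rate, Hz (illustrative)
      F0 = 50              # mains frequency, Hz
      VRMS = 230.0         # nominal RMS voltage

      def segment(seconds: float, rms_scale: float, t0: float = 0.0) -> np.ndarray:
          """One test-sequence segment: a sine at a scaled RMS amplitude.
          rms_scale < 1 models a dip/sag, > 1 a swell (cf. EN 50160 events).
          Using absolute time t0 keeps the phase continuous across segments."""
          t = t0 + np.arange(int(seconds * FS)) / FS
          return rms_scale * VRMS * np.sqrt(2) * np.sin(2 * np.pi * F0 * t)

      # nominal voltage, a 60% dip for 0.2 s, then recovery: a toy "script"
      test_signal = np.concatenate([
          segment(1.0, 1.0),
          segment(0.2, 0.6, t0=1.0),
          segment(1.0, 1.0, t0=1.2),
      ])
      print(test_signal.shape)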

  19. The CMIP5 Model Documentation Questionnaire: Development of a Metadata Retrieval System for the METAFOR Common Information Model

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry

    2010-05-01

    The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements, and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool. (1) Mind maps are used to capture information requirements from domain experts and build a controlled vocabulary, (2) a Python parser processes the XML files generated by the mind maps, (3) Django (Python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM, (4) Python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML, (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (Python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire. It will also address how the choice of development tools listed above provided a framework that enabled working scientists (whom we would never ordinarily expect to interact with UML and XML) to be part of the iterative development process and to ensure that the CMIP5 model documentation questionnaire reflects what scientists want to know about the models. Keywords: metadata, CMIP5, automatic information capture, tool development

  20. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  1. 76 FR 5572 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-01

    ... filing per 35.13(a)(2)(iii: 2152 Rio Blanco Wind Farm, LLC GIA to be effective 1/ 5/2011. Filed Date: 01..., 2011. Take notice that the Commission received the following exempt wholesale generator filings: Docket...-Certification as an Exempt Wholesale Generator of Mountain View Power Partners IV, LLC. Filed Date: 01/20/2011...

  2. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test, such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks, such as plotting and timeline generation, with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
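
    The central idea (configure with objects, then translate to legacy input lists) can be shown in miniature. The class and field names below are invented for illustration, not CPAS's actual models, and the flattened "legacy" format is likewise hypothetical:

      from dataclasses import dataclass

      @dataclass
      class Parachute:          # hypothetical model of a test article
          name: str
          drag_area_ft2: float
          reef_ratio: float = 1.0

      @dataclass
      class DropTest:
          vehicle_weight_lb: float
          deploy_altitude_ft: float
          stages: list          # ordered Parachute objects

          def to_legacy_input(self) -> list:
              """Flatten the object graph into the positional input list a
              legacy simulation code expects (format here is invented)."""
              lines = [f"WEIGHT {self.vehicle_weight_lb}",
                       f"ALT    {self.deploy_altitude_ft}"]
              for i, chute in enumerate(self.stages, 1):
                  lines.append(f"CHUTE{i} {chute.drag_area_ft2} {chute.reef_ratio}")
              return lines

      test = DropTest(21000, 25000, [Parachute("drogue", 500, 0.6),
                                     Parachute("main", 5500)])
      print("\n".join(test.to_legacy_input()))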

  3. Utilizing Mars Global Reference Atmospheric Model (Mars-GRAM 2005) to Evaluate Entry Probe Mission Sites

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.

    2008-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM 2005) is an engineering-level atmospheric model widely used for diverse mission applications. An overview is presented of Mars-GRAM 2005 and its new features. The "auxiliary profile" option is one new feature of Mars-GRAM 2005. This option uses an input file of temperature and density versus altitude to replace the mean atmospheric values from Mars-GRAM's conventional (General Circulation Model) climatology. Any source of data or alternate model output can be used to generate an auxiliary profile. Auxiliary profiles for this study were produced from mesoscale model output (Southwest Research Institute's Mars Regional Atmospheric Modeling System (MRAMS) model and Oregon State University's Mars mesoscale model (MMM5) model) and a global Thermal Emission Spectrometer (TES) database. The global TES database has been specifically generated for purposes of making Mars-GRAM auxiliary profiles. This database contains averages and standard deviations of temperature, density, and thermal wind components, averaged over 5-by-5 degree latitude-longitude bins and 15 degree Ls bins, for each of three Mars years of TES nadir data. The Mars Science Laboratory (MSL) sites are used as a sample of how Mars-GRAM could be a valuable tool for planning of future Mars entry probe missions. Results are presented using auxiliary profiles produced from the mesoscale model output and TES observed data for candidate MSL landing sites. Input parameters rpscale (for density perturbations) and rwscale (for wind perturbations) can be used to "recalibrate" Mars-GRAM perturbation magnitudes to better replicate observed or mesoscale model variability.

  4. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  5. Application guide for AFINCH (Analysis of Flows in Networks of Channels) described by NHDPlus

    USGS Publications Warehouse

    Holtschlag, David J.

    2009-01-01

    AFINCH (Analysis of Flows in Networks of CHannels) is a computer application that can be used to generate a time series of monthly flows at stream segments (flowlines) and water yields for catchments defined in the National Hydrography Dataset Plus (NHDPlus) value-added attribute system. AFINCH provides a basis for integrating monthly flow data from streamgages, water-use data, monthly climatic data, and land-cover characteristics to estimate natural monthly water yields from catchments by user-defined regression equations. Images of monthly water yields for active streamgages are generated in AFINCH and provide a basis for detecting anomalies in water yields, which may be associated with undocumented flow diversions or augmentations. Water yields are multiplied by the drainage areas of the corresponding catchments to estimate monthly flows. Flows from catchments are accumulated downstream through the streamflow network described by the stream segments. For stream segments where streamgages are active, ratios of measured to accumulated flows are computed. These ratios are applied to upstream water yields to proportionally adjust estimated flows to match measured flows. Flow is conserved through the NHDPlus network. A time series of monthly flows can be generated for stream segments that average about 1 mile in length, or monthly water yields from catchments that average about 1 square mile. Estimated monthly flows can be displayed within AFINCH, examined for nonstationarity, and tested for monotonic trends. Monthly flows also can be used to estimate flow-duration characteristics at stream segments. AFINCH generates output files of monthly flows and water yields that are compatible with ArcMap, a geographical information system analysis and display environment. Choropleth maps of monthly water yield and flow can be generated and analyzed within ArcMap by joining NHDPlus data structures with AFINCH output. Matlab code for the AFINCH application is presented.
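
    The accumulate-then-adjust logic is compact enough to sketch. The Python below (AFINCH itself is Matlab code) accumulates local catchment flows down an invented five-segment network and computes the measured-to-accumulated ratio that AFINCH applies to upstream yields:

      def accumulated_flows(downstream: dict, local: dict) -> dict:
          """Accumulate local catchment flows through the segment network.
          downstream[s] gives the segment below s, or None at the outlet."""
          # count upstream contributors so we can process in topological order
          indeg = {s: 0 for s in local}
          for s, d in downstream.items():
              if d is not None:
                  indeg[d] += 1
          total = dict(local)
          ready = [s for s, n in indeg.items() if n == 0]
          while ready:
              s = ready.pop()
              d = downstream[s]
              if d is not None:
                  total[d] += total[s]
                  indeg[d] -= 1
                  if indeg[d] == 0:
                      ready.append(d)
          return total

      # three headwater segments draining to segment "D", then outlet "E"
      down = {"A": "D", "B": "D", "C": "D", "D": "E", "E": None}
      local = {"A": 1.0, "B": 2.0, "C": 0.5, "D": 0.25, "E": 0.1}
      flows = accumulated_flows(down, local)
      measured_at_D = 5.0                    # hypothetical streamgage value
      ratio = measured_at_D / flows["D"]     # AFINCH-style adjustment ratio
      print(flows, ratio)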

  6. DOE-2 sample run book: Version 2.1E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.

    1993-11-01

    The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single-family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.

  7. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language.

    PubMed

    de Jong, Wibe A; Walker, Andrew M; Hanwell, Marcus D

    2013-05-24

    Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple "Google-style" searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature.
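
    On the consuming side, reading molecular geometry out of a CML document is a small exercise in namespace-aware XML parsing. A minimal sketch using Python's ElementTree, assuming the public CML conventions (an atomArray of atom elements with elementType and x3/y3/z3 attributes); the water coordinates are illustrative only:

      import xml.etree.ElementTree as ET

      CML = "{http://www.xml-cml.org/schema}"

      def read_geometry(path: str):
          """Pull (element, x, y, z) tuples out of a CML atomArray, the kind
          of record an Avogadro-style reader consumes."""
          root = ET.parse(path).getroot()
          return [(atom.get("elementType"),
                   float(atom.get("x3")),
                   float(atom.get("y3")),
                   float(atom.get("z3")))
                  for atom in root.iter(f"{CML}atom")]

      # a tiny CML fragment for testing
      doc = """<molecule xmlns="http://www.xml-cml.org/schema"><atomArray>
        <atom id="a1" elementType="O" x3="0.000" y3="0.000" z3="0.119"/>
        <atom id="a2" elementType="H" x3="0.000" y3="0.763" z3="-0.477"/>
        <atom id="a3" elementType="H" x3="0.000" y3="-0.763" z3="-0.477"/>
      </atomArray></molecule>"""
      with open("water.cml", "w") as f:
          f.write(doc)
      print(read_geometry("water.cml"))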

  8. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language

    PubMed Central

    2013-01-01

    Background: Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization.
    Results: The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro.
    Conclusions: The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple “Google-style” searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature. PMID:23705910

  9. iReport: a generalised Galaxy solution for integrated experimental reporting.

    PubMed

    Hiltemann, Saskia; Hoogstrate, Youri; der Spek, Peter van; Jenster, Guido; Stubbs, Andrew

    2014-01-01

    Galaxy offers a number of visualisation options with components, such as Trackster, Circster and Galaxy Charts, but currently lacks the ability to easily combine outputs from different tools into a single view or report. A number of tools produce HTML reports as output in order to combine the various output files from a single tool; however, this requires programming and knowledge of HTML, and the reports must be custom-made for each new tool. We have developed a generic and flexible reporting tool for Galaxy, iReport, that allows users to create interactive HTML reports directly from the Galaxy UI, with the ability to combine an arbitrary number of outputs from any number of different tools. Content can be organised into different tabs, and interactivity can be added to components. To demonstrate the capability of iReport we provide two publicly available examples: the first is an iReport explaining iReports, created for, and using content from, the recent Galaxy Community Conference 2014. The second is a genetic report based on a trio analysis to determine candidate pathogenic variants, which uses our previously developed Galaxy toolset for whole-genome NGS analysis, CGtag. These reports may be adapted for outputs from any sequencing platform and any results, such as omics data, non-high-throughput results and clinical variables. iReport provides a secure, collaborative, and flexible web-based reporting system that is compatible with Galaxy (and non-Galaxy) generated content. We demonstrate its value with a real-life example of reporting genetic trio-analysis.

  10. Users guide: Steady-state aerodynamic-loads program for shuttle TPS tiles

    NASA Technical Reports Server (NTRS)

    Kerr, P. A.; Petley, D. H.

    1984-01-01

    A user's guide for the computer program that calculates the steady-state aerodynamic loads on the Shuttle thermal-protection tiles is presented. The main element in the program is MITAS-II, the Martin Marietta Interactive Thermal Analysis System. MITAS-II is used to calculate the mass flow in a nine-tile model designed to simulate conditions during a Shuttle flight. The procedures used to execute the program using the MITAS-II software are described. A list of the necessary software and data files along with a brief description of their functions is given. The format of the data file containing the surface pressure data is specified. The interpolation techniques used to calculate the pressure profile over the tile matrix are briefly described. In addition, the output from a sample run is explained. The actual output and the procedure file used to execute the program at NASA Langley Research Center on a CDC CYBER-175 are provided in the appendices.

  11. JaxoDraw: A graphical user interface for drawing Feynman diagrams

    NASA Astrophysics Data System (ADS)

    Binosi, D.; Theußl, L.

    2004-08-01

    JaxoDraw is a Feynman graph plotting tool written in Java. It has a complete graphical user interface that allows all actions to be carried out via mouse click-and-drag operations in a WYSIWYG fashion. Graphs may be exported to PostScript/EPS format and can be saved in XML files to be used for later sessions. One of JaxoDraw's main features is the possibility to create LaTeX code that may be used to generate graphics output, thus combining the powers of LaTeX with those of a modern-day drawing program. With JaxoDraw it becomes possible to draw even complicated Feynman diagrams with just a few mouse clicks, without the knowledge of any programming language.
    Program summary
    Title of program: JaxoDraw
    Catalogue identifier: ADUA
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUA
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Distribution format: tar gzip file
    Operating system: Any Java-enabled platform; tested on Linux, Windows ME, XP, Mac OS X
    Programming language used: Java
    License: GPL
    Nature of problem: Existing methods for drawing Feynman diagrams usually require some 'hard-coding' in one or another programming or scripting language. It is not very convenient, and often time consuming, to generate relatively simple diagrams.
    Method of solution: A program is provided that allows for the interactive drawing of Feynman diagrams with a graphical user interface. The program is easy to learn and use, produces high-quality output in several formats, and runs on any operating system where a Java Runtime Environment is available.
    Number of bytes in distributed program, including test data: 2 117 863
    Number of lines in distributed program, including test data: 60 000
    Restrictions: Certain operations (like internal LaTeX compilation or PostScript preview) require the execution of external commands that might not work on untested operating systems.
    Typical running time: As an interactive program, the running time depends on the complexity of the diagram to be drawn.

  12. A software tool to automatically assure and report daily treatment deliveries by a cobalt‐60 radiation therapy device

    PubMed Central

    Wooten, H. Omar; Green, Olga; Li, Harold H.; Liu, Shi; Li, Xiaoling; Rodriguez, Vivian; Mutic, Sasa; Kashani, Rojano

    2016-01-01

    The aims of this study were to develop a method for automatic and immediate verification of treatment delivery after each treatment fraction in order to detect and correct errors, and to develop a comprehensive daily report which includes delivery verification results, daily image‐guided radiation therapy (IGRT) review, and information for weekly physics reviews. After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a commercial MRI‐guided radiotherapy treatment machine, we designed a procedure to use 1) treatment plan files, 2) delivery log files, and 3) beam output information to verify the accuracy and completeness of each daily treatment delivery. The procedure verifies the correctness of delivered treatment plan parameters including beams, beam segments and, for each segment, the beam‐on time and MLC leaf positions. For each beam, composite primary fluence maps are calculated from the MLC leaf positions and segment beam‐on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. A daily treatment delivery report is designed to include all required information for IGRT and weekly physics reviews including the plan and treatment fraction information, daily beam output information, and the treatment delivery verification results. A computer program was developed to implement the proposed procedure of the automatic delivery verification and daily report generation for an MRI guided radiation therapy system. The program was clinically commissioned. Sensitivity was measured with simulated errors. The final version has been integrated into the commercial version of the treatment delivery system. The method automatically verifies the EBRT treatment deliveries and generates the daily treatment reports. Already in clinical use for over one year, it is useful to facilitate delivery error detection, and to expedite physician daily IGRT review and physicist weekly chart review. PACS number(s): 87.55.km PMID:27167269
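
    The verification step (composing fluence maps from delivered segments and differencing them against the plan) can be illustrated in a few lines. The numpy sketch below grossly simplifies MLC geometry to a binary open/closed mask per segment, and the error-statistic names are invented:

      import numpy as np

      def fluence_map(segments, shape=(40, 40)):
          """Compose a primary fluence map from (leaf_open_mask, beam_on_time)
          segments; a binary mask is a gross simplification of real MLC geometry."""
          fmap = np.zeros(shape)
          for open_mask, seconds in segments:
              fmap += open_mask * seconds
          return fmap

      def delivery_error_stats(plan_segments, log_segments):
          """Difference-map statistics between planned and delivered fluence."""
          diff = fluence_map(log_segments) - fluence_map(plan_segments)
          return {"max_abs": float(np.abs(diff).max()),
                  "rms": float(np.sqrt((diff ** 2).mean()))}

      rect = np.zeros((40, 40)); rect[10:30, 10:30] = 1.0
      plan = [(rect, 12.0)]
      delivered = [(rect, 12.3)]        # e.g. a slightly long beam-on time
      print(delivery_error_stats(plan, delivered))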

  13. Groundwater flow and solute transport modelling from within R: Development of the RMODFLOW and RMT3DMS packages.

    NASA Astrophysics Data System (ADS)

    Rogiers, Bart

    2015-04-01

    In recent years, an increasing number of contributed R packages has become available in the field of hydrology. Hydrological time series analysis packages, lumped conceptual rainfall-runoff models, distributed hydrological models, weather generators, and different calibration and uncertainty estimation methods are all available. A few packages are also available for solving partial differential equations. Subsurface hydrological modelling is, however, still seldom performed in R, or with codes interfaced with R, despite the fact that excellent geostatistical packages, model calibration/inversion options, and state-of-the-art visualization libraries are available. Moreover, other popular scientific programming languages like MATLAB and Python have packages for pre- and post-processing files of MODFLOW (Harbaugh 2005) and MT3DMS (Zheng 2010) models. To fill this gap, we present here the development versions of the RMODFLOW and RMT3DMS packages, which allow pre- and post-processing of MODFLOW and MT3DMS input and output files from within R. File reading and writing functions are currently available for different packages, and plotting functions are foreseen, making use of the ggplot2 package (a plotting system based on the grammar of graphics; Wickham 2009). The S3 generic-function object-oriented programming style is used for this. An example is provided, making modifications to an existing model, and visualizing the model output. References: Harbaugh, A. (2005). MODFLOW-2005: The US Geological Survey Modular Ground-water Model--the Ground-water Flow Process, U.S. Geological Survey Techniques and Methods 6-A16 (p. 253). Wickham, H. (2009). ggplot2: Elegant Graphics for Data Analysis. Springer, New York. Zheng, C. (2010). MT3DMS v5.3, a Modular Three-dimensional Multispecies Transport Model for Simulation of Advection, Dispersion and Chemical Reactions of Contaminants in Groundwater Systems. Supplemental User's Guide (p. 56).

  14. Low-Speed Fingerprint Image Capture System User's Guide, June 1, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitus, B.R.; Goddard, J.S.; Jatko, W.B.

    1993-06-01

    The Low-Speed Fingerprint Image Capture System (LS-FICS) uses a Sun workstation controlling a Lenzar ElectroOptics Opacity 1000 imaging system to digitize fingerprint card images to support the Federal Bureau of Investigation's (FBI's) Automated Fingerprint Identification System (AFIS) program. The system also supports the operations performed by the Oak Ridge National Laboratory (ORNL)-developed Image Transmission Network (ITN) prototype card scanning system. The input to the system is a single FBI fingerprint card of the agreed-upon standard format and a user-specified identification number. The output is a file formatted to be compatible with the National Institute of Standards and Technology (NIST) draft standard for fingerprint data exchange dated June 10, 1992. These NIST-compatible files contain the required print and text images. The LS-FICS is designed to provide the FBI with the capability of scanning fingerprint cards into a digital format. The FBI will replicate the system to generate a database of test images. The Host Workstation contains the image data paths and the compression algorithm. A local area network interface, disk storage, and tape drive are used for the image storage and retrieval, and the Lenzar Opacity 1000 scanner is used to acquire the image. The scanner is capable of resolving 500 pixels/in. in both x and y directions. The print images are maintained in full 8-bit gray scale and compressed with an FBI-approved wavelet-based compression algorithm. The text fields are downsampled to 250 pixels/in. and 2-bit gray scale. The text images are then compressed using a lossless Huffman coding scheme. The text fields retrieved from the output files are easily interpreted when displayed on the screen. Detailed procedures are provided for system calibration and operation. Software tools are provided to verify proper system operation.

  15. Performance evaluation of an automated single-channel sleep–wake detection algorithm

    PubMed Central

    Kaplan, Richard F; Wang, Ying; Loparo, Kenneth A; Kelly, Monica R; Bootzin, Richard R

    2014-01-01

    Background: A need exists, from both a clinical and a research standpoint, for objective sleep measurement systems that are both easy to use and can accurately assess sleep and wake. This study evaluates the output of an automated sleep–wake detection algorithm (Z-ALG) used in the Zmachine (a portable, single-channel, electroencephalographic [EEG] acquisition and analysis system) against laboratory polysomnography (PSG) using a consensus of expert visual scorers.
    Methods: Overnight laboratory PSG studies from 99 subjects (52 females/47 males, 18–60 years, median age 32.7 years), including both normal sleepers and those with a variety of sleep disorders, were assessed. PSG data obtained from the differential mastoids (A1–A2) were assessed by Z-ALG, which determines sleep versus wake every 30 seconds using low-frequency, intermediate-frequency, and high-frequency and time domain EEG features. PSG data were independently scored by two to four certified PSG technologists, using standard Rechtschaffen and Kales guidelines, and these score files were combined on an epoch-by-epoch basis, using a majority voting rule, to generate a single score file per subject to compare against the Z-ALG output. Both epoch-by-epoch and standard sleep indices (eg, total sleep time, sleep efficiency, latency to persistent sleep, and wake after sleep onset) were compared between the Z-ALG output and the technologist consensus score files.
    Results: Overall, the sensitivity and specificity for detecting sleep using the Z-ALG as compared to the technologist consensus are 95.5% and 92.5%, respectively, across all subjects, and the positive predictive value and the negative predictive value for detecting sleep are 98.0% and 84.2%, respectively. Overall κ agreement is 0.85 (approaching the level of agreement observed among sleep technologists). These results persist when the sleep disorder subgroups are analyzed separately.
    Conclusion: This study demonstrates that the Z-ALG automated sleep–wake detection algorithm, using the single A1–A2 EEG channel, has a level of accuracy that is similar to PSG technologists in the scoring of sleep and wake, thereby making it suitable for a variety of in-home monitoring applications, such as in conjunction with the Zmachine system. PMID:25342922
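
    Both the consensus scoring and the epoch-by-epoch agreement metrics are simple to state in code. A minimal sketch with 1 = sleep and 0 = wake (the four epochs and three scorers below are purely illustrative):

      from collections import Counter

      def majority_vote(score_files):
          """Combine per-technologist epoch scores (1=sleep, 0=wake) into a
          single consensus score file, as described for the PSG reference."""
          return [Counter(epoch).most_common(1)[0][0] for epoch in zip(*score_files)]

      def sensitivity_specificity(algorithm, consensus):
          """Epoch-by-epoch agreement, treating sleep as the positive class."""
          tp = sum(a == 1 and c == 1 for a, c in zip(algorithm, consensus))
          tn = sum(a == 0 and c == 0 for a, c in zip(algorithm, consensus))
          return tp / consensus.count(1), tn / consensus.count(0)

      techs = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 0, 0]]
      consensus = majority_vote(techs)          # -> [1, 1, 0, 1]
      print(sensitivity_specificity([1, 1, 1, 1], consensus))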

  16. Terminal Area Simulation System User's Guide - Version 10.0

    NASA Technical Reports Server (NTRS)

    Switzer, George F.; Proctor, Fred H.

    2014-01-01

    The Terminal Area Simulation System (TASS) is a three-dimensional, time-dependent, large eddy simulation model that has been developed for studies of wake vortex and weather hazards to aviation, along with other atmospheric turbulence, and cloud-scale weather phenomenology. This document describes the source code for TASS version 10.0 and provides users with needed documentation to run the model. The source code is programmed in the Fortran language and is formulated to take advantage of vectorization and efficient multi-processor scaling for execution on massively parallel supercomputer clusters. The code contains different initialization modules allowing the study of aircraft wake vortex interaction with the atmosphere and ground, atmospheric turbulence, atmospheric boundary layers, precipitating convective clouds, hail storms, gust fronts, microburst windshear, supercell and mesoscale convective systems, tornadic storms, and ring vortices. The model is able to operate in either two or three dimensions with equations numerically formulated on a Cartesian grid. The primary output from TASS is time-dependent domain fields generated by the prognostic equations and diagnosed variables. This document will enable a user to understand the general logic of TASS, and will show how to configure and initialize the model domain. Also described are the formats of the input and output files, as well as the parameters that control the input and output.

  17. Generation and use of the Goddard trajectory determination system SLP ephemeris files

    NASA Technical Reports Server (NTRS)

    Armstrong, M. G.; Tomaszewski, I. B.

    1973-01-01

    Information is presented to acquaint users of the Goddard Trajectory Determination System Solar/Lunar/Planetary ephemeris files with the details connected with the generation and use of these files. In particular, certain sections constitute a user's manual for the ephemeris files.

  18. 78 FR 62612 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice that the Commission received the following electric corporate filings: Docket Numbers: EC14-3-000...: PE Hydro Generation, LLC Description: Self-Certification of EG of PE Hydro Generation, LLC. Filed...

  19. TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.

    PubMed

    Clark, Lindsay V; Sacks, Erik J

    2016-01-01

    In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously-mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output is in CSV format so that it can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
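
    The barcode-splitting step is easy to sketch. The Python below is a simplification of what barcode_splitter.py does (it does not trim barcodes or handle ambiguous matches), plus an MD5 helper of the kind used for archive deposit; the demo barcodes and sample names are invented:

      import hashlib
      from itertools import islice

      def split_fastq(path: str, barcodes: dict) -> dict:
          """Split a FASTQ file by leading barcode; returns {sample: out_path}.
          Reads whose barcode matches no sample are silently dropped here."""
          outs = {s: open(f"{s}.fastq", "w") for s in barcodes.values()}
          with open(path) as f:
              while True:
                  record = list(islice(f, 4))        # FASTQ = 4 lines per read
                  if len(record) < 4:
                      break
                  for bc, sample in barcodes.items():
                      if record[1].startswith(bc):   # record[1] is the sequence
                          outs[sample].writelines(record)
                          break
          for handle in outs.values():
              handle.close()
          return {s: f"{s}.fastq" for s in barcodes.values()}

      def md5_of(path: str) -> str:
          """Checksum for archive deposit, as barcode_splitter.py generates."""
          with open(path, "rb") as f:
              return hashlib.md5(f.read()).hexdigest()

      with open("demo.fastq", "w") as f:
          f.write("@r1\nACGTTTTT\n+\nIIIIIIII\n@r2\nTTGACCCC\n+\nIIIIIIII\n")
      paths = split_fastq("demo.fastq", {"ACGT": "sampleA", "TTGA": "sampleB"})
      print({s: md5_of(p) for s, p in paths.items()})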

  20. Sanger and Next-Generation Sequencing data for characterization of CTL epitopes in archived HIV-1 proviral DNA.

    PubMed

    Tumiotto, Camille; Riviere, Lionel; Bellecave, Pantxika; Recordon-Pinson, Patricia; Vilain-Parce, Alice; Guidicelli, Gwenda-Line; Fleury, Hervé

    2017-01-01

    One strategy for curing HIV-1 infection is a therapeutic vaccine that stimulates Human Leucocyte Antigen (HLA)-restricted cytotoxic CD8-positive T cells (CTL). The limited efficacy of previous vaccination strategies may be due to the immunogenic peptides used, which can differ from the epitopes of a patient's virus and thus elicit a poor CTL response. To counteract this lack of specificity, conserved epitopes must be targeted. One alternative is to gather as many data as possible from a large number of patients on their HIV-1 proviral archived epitope variants, taking into account their genetic background to select the best presented CTL epitopes. To process the large volumes of data generated by Next-Generation Sequencing (NGS) of the DNA of HIV-infected patients, we have developed a software package called TutuGenetics. This tool takes as input an alignment derived from either Sanger or NGS files, the patient's HLA typing, the target gene, and a CTL epitope list. It allows automatic translation after correction of the alignment obtained between the HxB2 reference and the reads, followed by automatic calculation of the MHC IC50 value for each epitope variant and the HLA allele of the patient using NetMHCpan 3.0, resulting in a CSV output file. We validated this new tool by comparing Sanger and NGS (454, Roche) sequences obtained from the proviral DNA of patients with successful antiretroviral therapy (ART) included in the Provir Latitude 45 study, and showed a 90% correlation between the quantitative results of NGS and Sanger. This automated analysis combined with complementary samples should yield more data regarding the archived CTL epitopes according to the patients' HLA alleles and will be useful for screening epitopes that in theory are presented efficiently to the HLA groove, thus constituting promising immunogenic peptides for a therapeutic vaccine.
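
    A downstream step on such a CSV output might look like the following sketch. The column names are invented, and the 500 nM cutoff is a commonly used affinity convention for "binders", not a threshold taken from this paper.

        import csv

        def strong_binders(csv_path, threshold_nm=500.0):
            # Collect (epitope, allele, IC50) rows below the affinity cutoff.
            binders = []
            with open(csv_path, newline="") as f:
                for row in csv.DictReader(f):
                    ic50 = float(row["ic50_nM"])       # hypothetical column name
                    if ic50 < threshold_nm:
                        binders.append((row["epitope"], row["hla_allele"], ic50))
            return sorted(binders, key=lambda r: r[2])  # best affinity first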

  1. ProMC: Input-output data format for HEP applications using varint encoding

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.

    2014-10-01

    A new data format for Monte Carlo (MC) events, or any structural data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the ProMC library, which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description and random access. Data stored in ProMC files can be written, read and manipulated in a number of programming languages, such as C++, Java, Fortran, and Python.
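
    The variable-size integer ("varint") scheme named above is the base-128 encoding used by Protocol Buffers: each byte carries 7 bits of the integer, and the high bit flags that more bytes follow, so small values occupy a single byte. A minimal Python sketch:

        def encode_varint(value: int) -> bytes:
            # Emit 7 bits per byte, least-significant group first;
            # the 0x80 bit marks that another byte follows.
            out = bytearray()
            while True:
                byte = value & 0x7F
                value >>= 7
                if value:
                    out.append(byte | 0x80)
                else:
                    out.append(byte)
                    return bytes(out)

        def decode_varint(data: bytes) -> int:
            result, shift = 0, 0
            for byte in data:
                result |= (byte & 0x7F) << shift
                if not byte & 0x80:
                    break
                shift += 7
            return result

        assert decode_varint(encode_varint(300)) == 300  # 300 encodes as 0xAC 0x02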

  2. Issues in ATM Support of High-Performance, Geographically Distributed Computing

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.; Dowd, Patrick W.; Srinidhi, Saragur M.; Blade, Eric D. G.

    1995-01-01

    This report experimentally assesses the effect of the underlying network in a cluster-based computing environment. The assessment is quantified by application-level benchmarking, process-level communication, and network file input/output. Two testbeds were considered, one small cluster of Sun workstations and another large cluster composed of 32 high-end IBM RS/6000 platforms. The clusters had Ethernet, fiber distributed data interface (FDDI), Fibre Channel, and asynchronous transfer mode (ATM) network interface cards installed, providing the same processors and operating system for the entire suite of experiments. The primary goal of this report is to assess the suitability of an ATM-based, local-area network to support interprocess communication and remote file input/output systems for distributed computing.

  3. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code, HASC (High Angle of Attack Stability and Control), has been improved in several areas, validated, and documented. The improved code incorporates refined methodologies for increased accuracy and robustness, along with simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  4. Exploiting Efficient Transpacking for One-Sided Communication and MPI-IO

    NASA Astrophysics Data System (ADS)

    Mir, Faisal Ghias; Träff, Jesper Larsson

    Based on a construction of so-called input-output datatypes that define a mapping between non-consecutive input and output buffers, we outline an efficient method for copying structured data. We term this operation transpacking, and show how transpacking can be applied for the MPI implementation of one-sided communication and MPI-IO. For one-sided communication via shared memory, we demonstrate the expected performance improvements of up to a factor of two. For individual MPI-IO, the time to read or write from file dominates the overall time, but even here efficient transpacking can in some scenarios reduce file I/O time considerably. The reported results have been achieved on a single NEC SX-8 vector node.
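
    The core idea, stripped of all MPI machinery, is a single fused gather/scatter between two non-contiguous layouts instead of packing into a contiguous temporary and unpacking again. The numpy sketch below is only an illustration of that idea, not the paper's implementation.

        import numpy as np

        def transpack(src, src_index, dst, dst_index):
            # One fused copy: gather from a non-contiguous source layout
            # directly into a non-contiguous destination layout, with no
            # intermediate contiguous pack buffer.
            dst[dst_index] = src[src_index]

        src = np.arange(20.0)
        dst = np.zeros(20)
        # Copy every 3rd element of src straight into every 2nd slot of dst.
        transpack(src, slice(0, 18, 3), dst, slice(0, 12, 2))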

  5. 76 FR 45789 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG11-109... Generator for Double ``C'' Limited. Filed Date: 07/25/2011. Accession Number: 20110725-5065. Comment Date: 5... Limited. Description: Notice of Self-Certification of Exempt Wholesale Generator for High Sierra Limited...

  6. The Comparison of Point Data Models for the Output of WRF Hydro Model in the IDV

    NASA Astrophysics Data System (ADS)

    Ho, Y.; Weber, J.

    2017-12-01

    WRF Hydro netCDF output files contain streamflow, flow depth, longitude, latitude, altitude and stream order values for each forecast point. However, the data are not CF compliant. The total number of forecast points for the US CONUS is approximately 2.7 million, which poses a significant challenge for any visualization and analysis tool. The IDV point cloud display shows point data as a set of points colored by parameter. This display is very efficient compared to a standard point type display for rendering a large number of points. One remaining problem is that data I/O can become a bottleneck when dealing with a large collection of point input files. In this presentation, we will experiment with different point data models and their APIs to access the same WRF Hydro model output. The results will help us construct a CF compliant netCDF point data format for the community.
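
    For readers unfamiliar with the CF point convention mentioned above, a minimal point-feature layout can be sketched with the netCDF4 Python package. The variable names, units, and values below are illustrative assumptions, not the WRF Hydro schema.

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("points_cf.nc", "w") as nc:
            nc.featureType = "point"          # CF discrete-sampling geometry
            nc.Conventions = "CF-1.6"
            nc.createDimension("obs", 4)      # one entry per forecast point
            lat = nc.createVariable("lat", "f4", ("obs",))
            lon = nc.createVariable("lon", "f4", ("obs",))
            flow = nc.createVariable("streamflow", "f4", ("obs",))
            lat.units, lon.units = "degrees_north", "degrees_east"
            flow.units = "m3 s-1"
            flow.coordinates = "lat lon"      # ties data to its coordinates
            lat[:] = np.array([40.0, 40.1, 40.2, 40.3])
            lon[:] = np.array([-105.0, -105.1, -105.2, -105.3])
            flow[:] = np.array([1.2, 3.4, 0.8, 2.1])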

  7. Planetary image conversion task

    NASA Technical Reports Server (NTRS)

    Martin, M. D.; Stanley, C. L.; Laughlin, G.

    1985-01-01

    The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to the US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to a flat-file text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158, with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives, since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.

  8. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    PubMed

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  9. Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2015-01-01

    HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
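
    HCDstruct's output is a set of Nastran bulk data files. Purely as an illustration of that file format (not HCDstruct's Matlab code), a small-field bulk data card uses fixed 8-character fields, so a GRID entry can be emitted like this; the node numbering and coordinates are invented:

        def grid_card(gid, x, y, z):
            # NASTRAN small-field format: 8-character fields, card name first.
            # Values here are kept short enough to fit the 8-character fields.
            return f"{'GRID':<8}{gid:<8d}{'':8}{x:<8.3f}{y:<8.3f}{z:<8.3f}"

        with open("model.bdf", "w") as bdf:
            for gid, (x, y, z) in enumerate([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], 1):
                bdf.write(grid_card(gid, x, y, z) + "\n")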

  10. Navy Tethered Balloon Measurements Made During the ’Fire’ Marine Stratocu IFO (Intensive Field Operation)

    DTIC Science & Technology

    1989-04-03

    instrument described in the previous section. RH2 = V4 - VB2 (8), where V4 is the output voltage stored in the data file OUTXXYY.SHR and VB2 is the... archive comment. The V option can be followed by a C for a verbose listing with file comments. PKXARC FAST! Archive Extract Utility Version 3.3 10-23-86... Copyright (c) 1986 PKWARE, Inc. All Rights Reserved. PKXARC/h for help. Extracts files from an archive to their original name, size, time, & date

  11. Interactive Visual Simulation of Communication Systems. Volume 2

    DTIC Science & Technology

    1988-04-29

    typedef struct { int non; } PARAM, *PARAMPTR; typedef struct { float x[8], y[8]; FILE *fp; int time; } STATE, *STATEPTR; *psk8_dem(pparam, size, pstate, pstart) ... i*THETA); pstate->y[i] = sin(i*THETA); pstate->fp = fopen("graf.dat", "w"); pstate->time = 0; if (length_output_fifo(0) == maxlength_output_fifo(0) ... time ...) fprintf(pstate->fp, "...

  12. 77 FR 20378 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-04

    ... Generator Status of Ensign Wind, LLC. Filed Date: 3/27/12. Accession Number: 20120327-5056. Comments Due: 5... of Self-Certification of Exempt Wholesale Generator Status of Tuscola Bay Wind, LLC. Filed Date: 3/27.... Applicants: Minco Wind III, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator...

  13. Super Strypi HWIL 6DOF (Hardware-In-Loop six-degree-of-freedom) Rev. 2175

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilkey, Jeff C.; Harl, Nathan R.; Kowalchuk, Scott A.

    2016-02-23

    The Super Strypi HWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle. The simulation is used to test the NGC flight software including the navigation software. Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters are defined in input files. Output parameters are saved to a Matlab mat file.

  14. surf3d: A 3-D finite-element program for the analysis of surface and corner cracks in solids subjected to mode-1 loadings

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Newman, J. C., Jr.

    1993-01-01

    A computer program, surf3d, was developed that uses the 3D finite-element method to calculate the stress-intensity factors for surface, corner, and embedded cracks in finite-thickness plates with and without circular holes. The cracks are assumed to be either elliptic or part-elliptic in shape. The computer program uses eight-noded hexahedral elements to model the solid. The program uses skyline storage and a skyline solver. The stress-intensity factors are evaluated using the force method, the crack-opening displacement method, and the 3-D virtual crack closure method. The manual describes the input to and output of the surf3d program. This manual also demonstrates the use of the program and describes the calculation of the stress-intensity factors. Several examples with sample data files are included with the manual. To facilitate modeling of the user's crack configuration and loading, a companion preprocessor program, gensurf, was also developed to generate the data for surf3d. The gensurf program is a three-dimensional mesh generator that requires minimal input and builds a complete data file for surf3d. The program surf3d is operational on Unix machines such as CRAY Y-MP, CRAY-2, and Convex C-220.

  15. Analyzing endosonic root canal file oscillations: an in vitro evaluation.

    PubMed

    Lea, Simon C; Walmsley, A Damien; Lumley, Philip J

    2010-05-01

    Passive ultrasonic irrigation may be used to improve bacterial reduction within the root canal. The technique relies on a small file being driven to oscillate freely within the canal and activating an irrigant solution through biophysical forces such as microstreaming. There is limited information regarding a file's oscillation patterns when operated while surrounded by fluid, as is the case within a root canal. Files of different sizes (#10 and #30, 27 mm and 31 mm) were connected to an ultrasound generator via a 120 degrees file holder. Files were immersed in a water bath, and a laser vibrometer was set up with measurement lines superimposed over the files. The laser vibrometer was scanned over the oscillating files. Measurements were repeated 10 times for each file/power setting used. File mode shapes consist of a series of nodes and antinodes, with thinner, longer files producing more antinodes. The maximum vibration occurred at the free end of the file. Increasing generator power had no significant effect on this maximum amplitude (p > 0.20). Maximum displacement amplitudes were 17 to 22 microm (#10 file, 27 mm), 15 to 21 microm (#10 file, 31 mm), 6 to 9 microm (#30 file, 27 mm), and 5 to 7 microm (#30 file, 31 mm) for all power settings. Antinodes occurring along the remaining file length were significantly larger at generator power 1 than at powers 2 through 5 (p < 0.03). At higher generator powers, energy delivered to the file is dissipated in unwanted vibration, resulting in reduced vibration displacement amplitudes. This may reduce the occurrence of the biophysical forces necessary to maximize the technique's effectiveness. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  16. User's guide for MODTOOLS: Computer programs for translating data of MODFLOW and MODPATH into geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.

    1997-01-01

    MODTOOLS uses the particle data calculated by MODPATH to construct several types of GIS output. MODTOOLS uses particle information recorded by MODPATH such as the row, column, or layer of the model grid, to generate a set of characteristics associated with each particle. The user can choose from the set of characteristics associated with each particle and use the capabilities of the GIS to selectively trace the movement of water discharging from specific cells in the model grid. MODTOOLS allows the hydrogeologist to utilize the capabilities of the GIS to graphically combine the results of the particle-tracking analysis, which facilitates the analysis and understanding of complex ground-water flow systems.

  17. libprofit: Image creation from luminosity profiles

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D.; Tobar, R.

    2016-12-01

    libprofit is a C++ library for image creation based on different luminosity profiles. It offers fast and accurate two-dimensional integration for a useful number of profiles, including Sersic, Core-Sersic, broken-exponential, Ferrer, Moffat, empirical King, point-source and sky, with a simple mechanism for adding new profiles. libprofit provides a utility to read the model and profile parameters from the command-line and generate the corresponding image. It can output the resulting image as text values, a binary stream, or as a simple FITS file. It also provides a shared library exposing an API that can be used by any third-party application. R and Python interfaces are available: ProFit (ascl:1612.004) and PyProfit (ascl:1612.005).
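
    The Sersic profile that such libraries integrate has a closed form, I(r) = Ie exp(-b_n ((r/Re)^(1/n) - 1)), where b_n is fixed by requiring Re to enclose half the light. The stand-alone numpy sketch below evaluates it on a pixel grid using the common Ciotti & Bertin approximation for b_n; it is an illustration, not libprofit's integration scheme.

        import numpy as np

        def sersic_image(nx, ny, x0, y0, Ie, Re, n):
            # b_n from the Ciotti & Bertin series approximation.
            b_n = 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)
            y, x = np.mgrid[0:ny, 0:nx]
            r = np.hypot(x - x0, y - y0)        # radial distance per pixel
            return Ie * np.exp(-b_n * ((r / Re) ** (1.0 / n) - 1.0))

        # de Vaucouleurs-like profile (n = 4) centered on a 64x64 grid.
        img = sersic_image(64, 64, 31.5, 31.5, Ie=10.0, Re=8.0, n=4.0)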

  18. EOS MLS Level 1B Data Processing, Version 2.2

    NASA Technical Reports Server (NTRS)

    Perun, Vincent; Jarnot, Robert; Pickett, Herbert; Cofield, Richard; Schwartz, Michael; Wagner, Paul

    2009-01-01

    A computer program performs level-1B processing (the term 1B is explained below) of data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS), which is an instrument aboard the Aura spacecraft. This software accepts, as input, the raw EOS MLS scientific and engineering data and the Aura spacecraft ephemeris and attitude data. Its output consists of calibrated instrument radiances and associated engineering and diagnostic data. [This software is one of several computer programs, denoted product generation executives (PGEs), for processing EOS MLS data. Starting from level 0 (representing the aforementioned raw data), the PGEs and their data products are denoted by alphanumeric labels (e.g., 1B and 2) that signify the successive stages of processing.] At the time of this reporting, this software is at version 2.2 and incorporates improvements over a prior version that make the code more robust, improve calibration, provide more diagnostic outputs, improve the interface with the Level 2 PGE, and effect a 15-percent reduction in file sizes by use of data compression.

  19. SAMSA2: a standalone metatranscriptome analysis pipeline.

    PubMed

    Westreich, Samuel T; Treiber, Michelle L; Mills, David A; Korf, Ian; Lemay, Danielle G

    2018-05-21

    Complex microbial communities are an area of growing interest in biology. Metatranscriptomics allows researchers to quantify microbial gene expression in an environmental sample via high-throughput sequencing. Metatranscriptomic experiments are computationally intensive because the experiments generate a large volume of sequence data and each sequence must be compared with reference sequences from thousands of organisms. SAMSA2 is an upgrade to the original Simple Annotation of Metatranscriptomes by Sequence Analysis (SAMSA) pipeline that has been redesigned for standalone use on a supercomputing cluster. SAMSA2 is faster due to the use of the DIAMOND aligner, and more flexible and reproducible because it uses local databases. SAMSA2 is available with detailed documentation, and example input and output files along with examples of master scripts for full pipeline execution. SAMSA2 is a rapid and efficient metatranscriptome pipeline for analyzing large RNA-seq datasets in a supercomputing cluster environment. SAMSA2 provides simplified output that can be examined directly or used for further analyses, and its reference databases may be upgraded, altered or customized to fit the needs of any experiment.
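
    As a hedged sketch of the kind of aggregation step such a pipeline performs downstream of DIAMOND: DIAMOND emits BLAST-style tabular hits (query ID and subject ID in the first two columns), and per-annotation counts can be tallied by keeping only the top-ranked hit per read. This assumes hits for each query are grouped with the best hit first, as DIAMOND normally orders them; it is not SAMSA2's code.

        from collections import Counter

        def count_best_hits(diamond_tsv):
            counts, seen = Counter(), set()
            with open(diamond_tsv) as f:
                for line in f:
                    if not line.strip():
                        continue
                    qseqid, sseqid = line.split("\t")[:2]
                    if qseqid not in seen:      # keep only the top-ranked hit
                        seen.add(qseqid)
                        counts[sseqid] += 1
            return counts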

  20. Morphometric analysis of root canal cleaning after rotary instrumentation with or without laser irradiation

    NASA Astrophysics Data System (ADS)

    Marchesan, Melissa A.; Geurisoli, Danilo M. Z.; Brugnera, Aldo, Jr.; Barbin, Eduardo L.; Pecora, Jesus D.

    2002-06-01

    The present study examined root canal cleaning, using an optical microscope, after rotary instrumentation with ProFile .04 instruments with or without laser irradiation at different output energies. Cleaning and shaping can be accomplished manually, with ultrasonic and subsonic devices, or with rotary instruments; recently, developments in laser irradiation have shown promising results for disinfection and smear layer removal. In this study, 30 palatal maxillary molar roots were examined using an optical microscope after rotary instrumentation with ProFile .04 with or without Er:YAG laser application (KaVo KeyLaser II, Germany) at different output energies (2940 nm, 15 Hz, 300 pulses, 500 ms pulse duration, 42 J; 140 mJ shown on the display (input) with 61 mJ at the fiberoptic tip (output), and 140 mJ shown on the display (input) with 51 mJ at the fiberoptic tip (output)). Statistical analysis showed no statistical differences between the tested treatments (ANOVA, p > 0.05). ANOVA also showed a statistically significant difference (p < 0.01) between the root canal thirds, indicating that the middle third had less debris than the apical third. We conclude that: 1) none of the tested treatments led to totally cleaned root canals; 2) all treatments removed debris similarly; 3) the middle third had less debris than the apical third; 4) variation in output energy did not increase cleaning.

  1. 75 FR 80484 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... exempt wholesale generator status re Hatchet Ridge Wind 2010-B. Filed Date: 12/14/2010. Accession Number... of exempt wholesale generator status re Hatchet Ridge Wind 2010-A. Filed Date: 12/14/2010. Accession... Mountain Wind Farm LGIA to be effective 11/24/ 2010. Filed Date: 12/14/2010. Accession Number: 20101214...

  2. 78 FR 62345 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG14-3-000. Applicants: Miami Wind I LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Miami Wind I LLC. Filed Date:...

  3. MATLAB software for viewing and processing u-channel and discrete sample paleomagnetic data: UPmag and DPmag

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Channell, J. E.

    2009-12-01

    With the increasing efficiency of acquiring paleomagnetic data from u-channel or discrete samples, large volumes of data can be accumulated within a short time period. It is often critical to visualize and process these data in “real time” as measurements proceed, so that the measurement plan can be adapted accordingly. New MATLAB™ programs, UPmag and DPmag, are introduced for easy and rapid analysis of natural remanent magnetization (NRM) and laboratory-induced remanent magnetization data for u-channel and discrete samples, respectively. UPmag comprises three MATLAB™ graphical user interfaces: UVIEW, UDIR, and UINT. UVIEW allows users to open and check through measurement data from the magnetometer as well as to correct detected flux-jumps in the data, and to export files for further treatment. UDIR reads the *.dir file generated by UVIEW, automatically calculates component directions using selectable demagnetization range(s) with anchored or free origin, and displays orthogonal projections and stepwise intensity plots for any position along the u-channel sample. UDIR can also display data on equal area stereographic projections and draw virtual geomagnetic poles (VGP) on various map projections. UINT provides a convenient platform to evaluate relative paleointensity estimates using the *.int files that can be exported from UVIEW. DPmag comprises two MATLAB™ graphical user interfaces: DDIR and DFISHER. DDIR reads output files from the discrete sample magnetometer measurement system. DDIR allows users to calculate component directions for each discrete sample, to plot the demagnetization data on orthogonal projections and equal area projections, as well as to show the stepwise intensity data. DFISHER reads the *.pca file exported from DDIR, calculates VGP and Fisher statistics for data from selected groups of samples, and plots the results on equal area projections and as VGPs on a range of map projections. Data and plots from UPmag and DPmag can be exported to various file formats.

  4. DC to DC power converters and methods of controlling the same

    DOEpatents

    Steigerwald, Robert Louis; Elasser, Ahmed; Sabate, Juan Antonio; Todorovic, Maja Harfman; Agamy, Mohammed

    2012-12-11

    A power generation system configured to provide direct current (DC) power to a DC link is described. The system includes a first power generation unit configured to output DC power. The system also includes a first DC to DC converter comprising an input section and an output section. The output section of the first DC to DC converter is coupled in series with the first power generation unit. The first DC to DC converter is configured to process a first portion of the DC power output by the first power generation unit and to provide an unprocessed second portion of the DC power output of the first power generation unit to the output section.

  5. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  6. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.

    PubMed

    Ahmed, Zeeshan; Dandekar, Thomas

    2015-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge for implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product-line-architecture-based bioinformatics tool, 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML and image files. Hence, MSL is an easy to install and use analysis tool to interpret published scientific literature in PDF format.

  7. Enhancements to the SSME transfer function modeling code

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Mitchell, Jerrel R.; Bartholomew, David L.; Glenn, Russell D.

    1995-01-01

    This report details the results of a one-year effort by Ohio University to apply the transfer function modeling and analysis tools developed under NASA Grant NAG8-167 (Irwin, 1992), (Bartholomew, 1992) to attempt the generation of Space Shuttle Main Engine High Pressure Turbopump transfer functions from time domain data. In addition, new enhancements to the transfer function modeling codes which enhance the code functionality are presented, along with some ideas for improved modeling methods and future work. Section 2 contains a review of the analytical background used to generate transfer functions with the SSME transfer function modeling software. Section 2.1 presents the 'ratio method' developed for obtaining models of systems that are subject to single unmeasured excitation sources and have two or more measured output signals. Since most of the models developed during the investigation use the Eigensystem Realization Algorithm (ERA) for model generation, Section 2.2 presents an introduction to ERA, and Section 2.3 describes how it can be used to model spectral quantities. Section 2.4 details the Residue Identification Algorithm (RID) including the use of Constrained Least Squares (CLS) and Total Least Squares (TLS). Most of this information can be found in the report (and is repeated for convenience). Section 3 chronicles the effort of applying the SSME transfer function modeling codes to the a51p394.dat and a51p1294.dat time data files to generate transfer functions from the unmeasured input to the 129.4 degree sensor output. Included are transfer function modeling attempts using five methods. The first method is a direct application of the SSME codes to the data files, and the second method uses the underlying trends in the spectral density estimates to form transfer function models with less clustering of poles and zeros than the models obtained by the direct method. In the third approach, the time data is low-pass filtered prior to the modeling process in an effort to filter out high-frequency characteristics. The fourth method removes the presumed system excitation and its harmonics in order to investigate the effects of the excitation on the modeling process. The fifth method is an attempt to apply constrained RID to obtain better transfer functions through more accurate modeling over certain frequency ranges. Section 4 presents some new C main files which were created to round out the functionality of the existing SSME transfer function modeling code. It is now possible to go from time data to transfer function models using only the C codes; it is not necessary to rely on external software. The new C main files and instructions for their use are included. Section 5 presents current and future enhancements to the XPLOT graphics program which was delivered with the initial software. Several new features which have been added to the program are detailed in the first part of this section. The remainder of Section 5 then lists some possible features which may be added in the future. Section 6 contains the conclusion section of this report. Section 6.1 is an overview of the work, including a summary and observations relating to finding transfer functions with the SSME code. Section 6.2 contains information relating to future work on the project.

  8. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line-based software that requires tedious formatting of inputs and the writing of Python scripts, which many users are not comfortable with. Visualization of the output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow depicts its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation, and makes pasting in the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the Protein Data Bank files created by MODELLER. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.
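
    The kind of Python script that SWIFT MODELLER generates behind the scenes is, per MODELLER's documented automodel interface, roughly the following; the alignment file and the template/target code names here are placeholders, and this sketch is not SWIFT MODELLER's own code.

        from modeller import environ
        from modeller.automodel import automodel

        env = environ()
        a = automodel(env,
                      alnfile='target-template.ali',  # hypothetical alignment file
                      knowns='template_pdb',          # template structure code
                      sequence='target_seq')          # target sequence code
        a.starting_model = 1
        a.ending_model = 5                            # build five candidate models
        a.make()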

  9. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe370mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  10. Broadband Heating Rate Profile Project (BBHRP) - SGP 1bbhrpripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  11. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurst, Aaron M.

    A data structure based on an eXtensible Markup Language (XML) hierarchy according to experimental nuclear structure data in the Evaluated Nuclear Structure Data File (ENSDF) is presented. A Python-coded translator has been developed to interpret the standard one-card records of the ENSDF datasets, together with their associated quantities defined according to field position, and generate corresponding representative XML output. The quantities belonging to this mixed-record format are described in the ENSDF manual. Of the 16 ENSDF records in total, XML output has been successfully generated for 15 records. An XML-translation for the Comment Record is yet to be implemented; this will be considered in a separate phase of the overall translation effort. Continuation records, not yet implemented, will also be treated in a future phase of this work. Several examples are presented in this document to illustrate the XML schema and methods for handling the various ENSDF data types. However, the proposed nomenclature for the XML elements and attributes need not necessarily be considered as a fixed set of constructs. Indeed, better conventions may be suggested and a consensus can be achieved amongst the various groups of people interested in this project. The main purpose here is to present an initial phase of the translation effort to demonstrate the feasibility of interpreting ENSDF datasets and creating a representative XML-structured hierarchy for data storage.
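
    A simplified illustration of the translation idea: interpret an 80-column ENSDF card by field position and emit XML. The column layout used below (nuclide ID in columns 1-5, record type in column 8, data in columns 10-80) is a simplification of the full definition in the ENSDF manual, and the element names are invented for the example.

        import xml.etree.ElementTree as ET

        def card_to_xml(card):
            # Interpret one 80-character card image by field position.
            record = ET.Element("record")
            record.set("nucid", card[0:5].strip())
            record.set("type", card[7:8].strip() or "IDENTIFICATION")
            ET.SubElement(record, "data").text = card[9:80].rstrip()
            return record

        dataset = ET.Element("dataset")
        dataset.append(card_to_xml(" 60CO    ADOPTED LEVELS, GAMMAS".ljust(80)))
        print(ET.tostring(dataset, encoding="unicode"))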

  13. Flight design system level C requirements. Solid rocket booster and external tank impact prediction processors. [space transportation system

    NASA Technical Reports Server (NTRS)

    Seale, R. H.

    1979-01-01

    The prediction of the SRB and ET impact areas requires six separate processors. The SRB impact prediction processor computes the impact areas and related trajectory data for each SRB element. Output from this processor is stored on a secure file accessible by the SRB impact plot processor, which generates the required plots. Similarly, the ET RTLS impact prediction processor and the ET RTLS impact plot processor generate the ET impact footprints for return-to-launch-site (RTLS) profiles. The ET nominal/AOA/ATO impact prediction processor and the ET nominal/AOA/ATO impact plot processor generate the ET impact footprints for non-RTLS profiles. The SRB and ET impact processors compute the size and shape of the impact footprints by tabular lookup in a stored footprint dispersion data base. The location of each footprint is determined by simulating a reference trajectory and computing the reference impact point location. To ensure consistency among all flight design system (FDS) users, much of the input required by these processors will be obtained from the FDS master data base.

  14. MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.

    PubMed

    Alic, Andy S; Blanquer, Ignacio

    2016-09-01

    Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run on any software or hardware environment, in command line or graphically, and in browser or standalone. It presents information such as average length, base distribution, quality scores distribution, k-mer histogram, and homopolymers analysis. MuffinInfo improves upon the existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable parameters involved in the processing, all in one software. At the moment, the extractor works with all base space technologies such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for mild intensive tasks encountered in bioinformatics.
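
    Two of the statistics MuffinInfo reports, base composition and a k-mer histogram, can be sketched in plain Python as follows for an in-memory list of reads. This is only an illustration; MuffinInfo's actual FastQ/Fasta/SAM parsing and HTML5 reporting are omitted.

        from collections import Counter

        def base_distribution(reads):
            # Fraction of each base across all reads.
            counts = Counter()
            for read in reads:
                counts.update(read)
            total = sum(counts.values())
            return {base: n / total for base, n in counts.items()}

        def kmer_histogram(reads, k=4):
            # Count every length-k substring in every read.
            kmers = Counter()
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmers[read[i:i + k]] += 1
            return kmers

        reads = ["ACGTACGT", "ACGTTTTT"]
        print(base_distribution(reads))
        print(kmer_histogram(reads, k=4).most_common(3))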

  15. IRFK2D: a computer program for simulating intrinsic random functions of order k

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Dowd, Peter A.

    2003-07-01

    IRFK2D is an ANSI Fortran-77 program that generates realizations of an intrinsic random function of order k (with k equal to 0, 1 or 2) with a permissible polynomial generalized covariance model. The realizations may be non-conditional or conditioned to the experimental data. The turning bands method is used to generate realizations in 2D and 3D from simulations of an intrinsic random function of order k along lines that span the 2D or 3D space. The program generates two output files, the first containing the simulated values and the second containing the theoretical generalized variogram for different directions together with the theoretical model. The experimental variogram is calculated from the simulated values, while the theoretical variogram is the specified generalized covariance model. The generalized variogram is used to assess the quality of the simulation, as measured by the extent to which the generalized covariance is reproduced by the simulation. The examples given in this paper indicate that IRFK2D is an efficient implementation of the methodology.
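
    The polynomial generalized covariance family used for IRF-k models is commonly written as a nugget plus odd powers of distance, K(h) = c0*delta(h) + sum over s of (-1)^(s+1) * b_s * |h|^(2s+1) for s = 0..k. The numpy sketch below simply evaluates this form; the permissibility constraints on the coefficients are not checked here and should be taken from the geostatistics literature.

        import numpy as np

        def generalized_covariance(h, c0, b, k):
            # Evaluate K(h) for an IRF of order k (k = 0, 1 or 2).
            h = np.asarray(h, dtype=float)
            K = np.where(h == 0.0, c0, 0.0)          # nugget term at h = 0
            for s in range(k + 1):
                K = K + (-1.0) ** (s + 1) * b[s] * np.abs(h) ** (2 * s + 1)
            return K

        h = np.linspace(0.0, 10.0, 5)
        print(generalized_covariance(h, c0=0.1, b=[1.0, 0.05, 0.001], k=2))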

  16. NASA GES DISC On-line Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.; Berrick, S.; Rui, H.; Liu, Z.; Zhu, T.; Teng, W.; Shen, S.; Qin, J.

    2005-01-01

    The ability to use data stored in the current NASA Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data: the GES-DISC Interactive Online Visualization and Analysis Infrastructure, or "Giovanni." Giovanni provides interactive, online analysis tools to facilitate users' research. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and ocean color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates output on screen in a matter of seconds. The currently available output options are: area plots, averaged or accumulated over any available data period for any rectangular area; time plots, i.e., time series averaged over any rectangular area; Hovmoller plots, i.e., image views of any longitude-time or latitude-time cross section; ASCII output for all plot types; and image animation for area plots. Another analysis suite deals with parameter intercomparison: scatter plots, temporal correlation maps, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development, and processing tasks that are redundantly incurred by an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS) and provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. We use this approach to read pre-processed binary files and/or to read and extract the needed parts directly from HDF or HDF-EOS files. These subsets then serve as inputs into GrADS analysis scripts. The system can be used in a wide variety of Earth science applications, including climate and weather event study and monitoring, and modeling, and it can be easily configured for new applications.

  17. MIMS for TRIM

    EPA Pesticide Factsheets

    MIMS supports complex computational studies that use multiple interrelated models/programs, such as the modules within TRIM. MIMS is used by TRIM to run various models in sequence, while sharing input and output files.

  18. TH-C-18A-12: Evaluation of the Impact of Body Size and Tube Output Limits in the Optimization of Fast Scanning with High-Pitch Dual Source CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramirez Giraldo, J; Mileto, A.; Hurwitz, L.

    2014-06-15

    Purpose: To evaluate the impact of body size and tube power limits in the optimization of fast scanning with high-pitch dual source CT (DSCT). Methods: A previously validated MERCURY phantom, made of polyethylene, with circular cross-section of diameters 16, 23, 30 and 37cm, and connected through tapered sections, was scanned using a second generation DSCT system. The DSCT operates with two independently controlled x-ray tube generators offering up to 200 kW power reserve (100 kW per tube). The entire length of the phantom (42cm) was scanned with two protocols using: A) a standard single-source CT (SSCT) protocol with pitch of 0.8, and B) a DSCT protocol with high-pitch values ranging from 1.6 to 3.2 (0.2 steps). All scans used 120 kVp with 150 quality reference mAs using automatic exposure control. Scanner radiation output (CTDIvol) and effective mAs values were extracted retrospectively from DICOM files for each slice. Image noise was recorded. All variables were assessed relative to phantom diameter. Results: With standard-pitch SSCT, the scanner radiation output (and tube-current) were progressively adapted with increasing size, from 6 mGy (120 mAs) up to 15 mGy (270 mAs) from the thinnest (16cm) to the thickest diameter (37 cm), respectively. By comparison, using high-pitch (3.2), the scanner output was bounded at about 8 mGy (140 mAs), independent of phantom diameter. Although relative to standard-pitch, the high-pitch led to lower radiation output for the same scan, the image noise was higher, particularly for larger diameters. To match the radiation output adaptation of standard-pitch, a high-pitch mode of 1.6 was needed, with the advantage of scanning twice as fast. Conclusion: To maximize the benefits of fast scanning with high-pitch DSCT, the body size and tube power limits of the system need to be considered such that a good balance between speed of acquisition and image quality are warranted. JCRG is an employee of Siemens Medical Solutions USA Inc.

  19. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
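
    Purely as an illustration of the idea of resolution-varied replicas (not the patented implementation), a data array can be replicated at progressively lower resolutions by keeping every 2^level-th element and narrowing the numeric precision:

        import numpy as np

        def make_replicas(data, levels=3, narrow_dtype=np.float32):
            # Each replica keeps a coarser sub-set of elements and may
            # use fewer bits per element than the full-resolution file.
            replicas = []
            for level in range(1, levels + 1):
                replicas.append(data[::2 ** level].astype(narrow_dtype))
            return replicas

        full = np.random.rand(1_000_000)        # stand-in for a checkpoint file
        for i, r in enumerate(make_replicas(full), start=1):
            print(f"level {i}: {r.size} elements, {r.nbytes} bytes")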

  20. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  1. TAILSIM Users Guide

    NASA Technical Reports Server (NTRS)

    Hiltner, Dale W.

    2000-01-01

    The TAILSIM program uses a 4th order Runge-Kutta method to integrate the standard aircraft equations-of-motion (EOM). The EOM determine three translational and three rotational accelerations about the aircraft's body axis reference system. The forces and moments that drive the EOM are determined from aerodynamic coefficients, dynamic derivatives, and control inputs. Values for these terms are determined from linear interpolation of tables that are a function of parameters such as angle-of-attack and surface deflections. Buildup equations combine these terms and dimensionalize them to generate the driving total forces and moments. Features that make TAILSIM applicable to studies of tailplane stall include modeling of the reversible control system, modeling of the pilot performing a load factor and/or airspeed command task, and modeling of vertical gusts. The reversible control system dynamics can be described as two hinged masses connected by a spring, resulting in a fifth-order system. The pilot model is a standard form of lead-lag with a time delay applied to an integrated pitch rate and/or airspeed error feedback. The time delay is implemented by a Padé approximation, while the commanded pitch rate is determined by a commanded load factor. Vertical gust inputs include a single 1-cosine gust and a continuous NASA Dryden gust model. These dynamic models, coupled with the use of a nonlinear database, allow the tailplane stall characteristics, elevator response, and resulting aircraft response to be modeled. A useful output capability of the TAILSIM program is the ability to display multiple post-run plot pages to allow a quick assessment of the time history response. There are 16 plot pages currently available to the user. Each plot page displays 9 parameters. Each parameter can also be displayed individually, in a one-plot-per-page format. For a more refined display of the results, the program can also create files of tabulated data, which can then be used by other plotting programs. The TAILSIM program was written straightforwardly, assuming the user would want to change the database tables, the buildup equations, the output parameters, and the pilot model parameters. A separate database file and input file are automatically read in by the program. The use of an include file to set up all common blocks facilitates easy changing of parameter names and array sizes.
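
    For reference, the classic 4th-order Runge-Kutta step named above takes this generic form; here f(t, x) stands in for the equations-of-motion returning the state derivative, and TAILSIM's actual force/moment buildup is not reproduced.

        import numpy as np

        def rk4_step(f, t, x, dt):
            # Four derivative evaluations per step, combined with
            # weights 1/6, 2/6, 2/6, 1/6.
            k1 = f(t, x)
            k2 = f(t + dt / 2.0, x + dt / 2.0 * k1)
            k3 = f(t + dt / 2.0, x + dt / 2.0 * k2)
            k4 = f(t + dt, x + dt * k3)
            return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

        # Toy usage: exponential decay, x' = -x, integrated to t = 1.
        f = lambda t, x: -x
        x = np.array([1.0])
        for step in range(10):
            x = rk4_step(f, step * 0.1, x, 0.1)
        print(x)  # approximately exp(-1.0)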

  2. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring high accuracy of sequencing data, usually have to face the troubles caused by unavoidable sequencing errors. Several tools have been proposed to profile the sequencing quality, but few of them can quantify or correct the sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and it furthermore provides a novel function to correct erroneous bases in the overlapping regions. Another new feature is to detect and visualise sequencing bubbles, which can be commonly found on the flowcell lanes and may raise sequencing errors. Besides normal per cycle quality and base content plotting, AfterQC also provides features like polyX (a long sub-sequence of a same base X) filtering, automatic trimming and K-MER based strand bias profiling. For each single or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates sequencer's bubble effects, trims reads at front and tail, detects the sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support, taking a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder whose FastQ files are all processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC can help to eliminate the sequencing errors for pair-end sequencing data to provide much cleaner outputs, and consequently help to reduce the false-positive variants, especially for the low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all the options automatically and require no argument in most cases.
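
    The pair-overlap idea AfterQC builds on can be sketched as follows: slide the reverse complement of read 2 against read 1 and report the offset with the fewest mismatches. This is only an illustration of the concept; AfterQC's actual scoring, adapter cutting and base-correction logic is more involved.

        COMP = str.maketrans("ACGT", "TGCA")

        def revcomp(seq):
            return seq.translate(COMP)[::-1]

        def best_overlap(read1, read2, min_len=10):
            # Return (offset, mismatches) for the best alignment of the
            # reverse-complemented mate against read1.
            r2 = revcomp(read2)
            best = (None, float("inf"))
            for off in range(len(read1) - min_len + 1):
                span = min(len(read1) - off, len(r2))
                mism = sum(a != b for a, b in zip(read1[off:off + span], r2[:span]))
                if mism < best[1]:
                    best = (off, mism)
            return best

        print(best_overlap("ACGTACGTACGTAAAA", "GTACGTACGT"))  # -> (0, 0)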

  3. Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing

    NASA Astrophysics Data System (ADS)

    Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson

    2014-07-01

    As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, and operators who can create master calibration products and produce standardized calibrated data, with a short turn-around time. Upon completion, the data are ingested into the archive and portal, and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products all while allowing the user to monitor the process status, and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.

  4. Automation of the CFD Process on Distributed Computing Systems

    NASA Technical Reports Server (NTRS)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
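
    As a sketch of the ad-hoc scheduling the abstract mentions, the following Python fragment shows one way a script system can emulate a first-in-first-out queue on a host with no queueing software. The class and its limits are assumptions for illustration; the original system was written in UNIX shell and Perl.

    ```python
    import subprocess
    from collections import deque

    class FifoQueue:
        """Minimal first-in-first-out job queue of the kind the script
        system emulated on hosts without queueing software."""
        def __init__(self, max_running=1):
            self.pending = deque()
            self.running = []
            self.max_running = max_running

        def submit(self, cmd):
            # cmd is an argv list, e.g. ["flow_solver", "case001.inp"]
            self.pending.append(cmd)

        def poll(self):
            # Reap finished jobs, then launch pending ones up to the limit.
            self.running = [p for p in self.running if p.poll() is None]
            while self.pending and len(self.running) < self.max_running:
                self.running.append(subprocess.Popen(self.pending.popleft()))
    ```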

  5. Automated system for generation of soil moisture products for agricultural drought assessment

    NASA Astrophysics Data System (ADS)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are used globally for forecasting / early warning of drought and for monitoring its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources like space based data, ground data and collateral data in short intervals of time, where there may be limitations in processing power, availability of domain expertise, and specialized models & tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model is widely used to arrive at soil moisture products, owing to its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open source libraries for the best possible automation, to fulfill the need for a standard procedure of preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for generation of soil moisture products, facilitating users to concentrate on further enhancements and implementation of these parameters in related areas of research, without re-discovering the established models. Emphasis of the architecture is mainly on available open source libraries for GIS and raster IO operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for every-day generation of products at specified intervals. The operational software has inbuilt capabilities to automatically download requisite input parameters like rainfall and Potential Evapotranspiration (PET) from the respective servers. It can import file formats like .grd, .hdf, .img, generic binary etc., perform geometric correction and re-project the files to the native projection system. The software takes into account the weather, crop and soil parameters to run the designed soil water balance model. It also has additional features like time compositing of outputs to generate weekly and fortnightly profiles for further analysis, and a tool to generate "Area Favorable for Crop Sowing" from the daily soil moisture, with a highly customizable parameters interface. A whole-of-India analysis now takes a mere 20 seconds to generate soil moisture products, which would normally take one hour per day using commercial software.
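
    The bucket model named above is a textbook scheme, so a one-layer daily step can be sketched compactly. The structure below is the generic form (capacity-limited storage, wetness-limited evapotranspiration); the capacity and forcing values are illustrative assumptions, not the operational model's crop and soil parameterization.

    ```python
    def bucket_step(sm, rain, pet, capacity):
        """One daily step of a single-layer bucket water balance (mm)."""
        sm = sm + rain
        runoff = max(0.0, sm - capacity)   # excess above the bucket capacity
        sm -= runoff
        aet = pet * (sm / capacity)        # ET throttled by relative wetness
        sm = max(0.0, sm - aet)
        return sm, aet, runoff

    # One synthetic week with a 150 mm bucket:
    sm = 75.0
    for rain, pet in [(0, 5), (20, 4), (0, 6), (1, 5), (0, 5), (12, 4), (0, 5)]:
        sm, aet, runoff = bucket_step(sm, rain, pet, 150.0)
    ```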

  6. ThermalTracker Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.

  7. Method of operating a thermoelectric generator

    DOEpatents

    Reynolds, Michael G; Cowgill, Joshua D

    2013-11-05

    A method for operating a thermoelectric generator supplying a variable-load component includes commanding the variable-load component to operate at a first output and determining a first load current and a first load voltage to the variable-load component while operating at the commanded first output. The method also includes commanding the variable-load component to operate at a second output and determining a second load current and a second load voltage to the variable-load component while operating at the commanded second output. The method includes calculating a maximum power output of the thermoelectric generator from the determined first load current and voltage and the determined second load current and voltage, and commanding the variable-load component to operate at a third output. The commanded third output is configured to draw the calculated maximum power output from the thermoelectric generator.
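
    One way to read the two-point procedure: if the generator is modeled as an ideal source behind an internal resistance, two load measurements fix both unknowns, and the matched-load maximum power follows in closed form. The linear source model and the function below are assumptions for illustration, not text from the patent.

    ```python
    def teg_max_power(v1, i1, v2, i2):
        """Maximum power estimate assuming a linear source V = Voc - R*I."""
        r_int = (v1 - v2) / (i2 - i1)      # internal resistance from two points
        voc = v1 + i1 * r_int              # extrapolated open-circuit voltage
        return voc ** 2 / (4.0 * r_int)    # power delivered into a matched load

    # e.g. 4 V at 1 A and 3 V at 2 A give R = 1 ohm, Voc = 5 V, Pmax = 6.25 W
    print(teg_max_power(4.0, 1.0, 3.0, 2.0))
    ```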

  8. A convertor and user interface to import CAD files into worldtoolkit virtual reality systems

    NASA Technical Reports Server (NTRS)

    Wang, Peter Hor-Ching

    1996-01-01

    Virtual Reality (VR) is a rapidly developing human-to-computer interface technology. VR can be considered as a three-dimensional computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate the objects interactively, and render the VW at real-time accordingly. The user is totally immersed in the virtual world and feel the sense of transforming into that VW. NASA/MSFC Computer Application Virtual Environments (CAVE) has been developing the space-related VR applications since 1990. The VR systems in CAVE lab are based on VPL RB2 system which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak polhemus sensor, two Fastrak polhemus sensors, a folk of Bird sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide the network communications as well as VR programming environment. The RB2 Swivel 3D is used as the modelling program to construct the VW's. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts the files to a maximum of 1020 objects and doesn't have the advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for user to add new sensors and C language interface. Recently, NASA/MSFC CAVE lab provides VR systems built on Sense8 WorldToolKit (WTK) which is a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as Sense8 Neutral File Format, AutoCAD DXF and 3D Studio file format, Wave Front OBJ file format, VideoScape GEO file format, Intergraph EMS stereolithographics and CATIA Stereolithographics STL file formats. WTK functions are object-oriented in their naming convention, are grouped into classes, and provide easy C language interface. Using a CAD or modelling program to build a VW for WTK VR applications, we typically construct the stationary universe with all the geometric objects except the dynamic objects, and create each dynamic object in an individual file.

  9. Power generation systems and methods

    NASA Technical Reports Server (NTRS)

    Jones, Jack A. (Inventor); Chao, Yi (Inventor)

    2011-01-01

    A power generation system includes a plurality of submerged mechanical devices. Each device includes a pump that can be powered, in operation, by mechanical energy to output a pressurized output liquid flow in a conduit. Main output conduits are connected with the device conduits to combine pressurized output flows output from the submerged mechanical devices into a lower number of pressurized flows. These flows are delivered to a location remote of the submerged mechanical devices for power generation.

  10. Program to convert SUDS2ASC files to a single binary SEGY file

    USGS Publications Warehouse

    Goldman, Mark

    2000-01-01

    This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60 into a single SEGY file. SUDS2ASC has been used previously to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program, SUDS2SEGY, combines these recording station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can subsequently be input to a plotting package.
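
    The time normalization described can be sketched in a few lines: shift each trace onto a common start time, then zero-pad or truncate to a fixed sample count. The NumPy function below is an illustrative reconstruction from the abstract, not the program's source.

    ```python
    import numpy as np

    def normalize_trace(data, t0, dt, t_start, n_samples):
        """Shift a trace so it starts at t_start with exactly n_samples,
        zero-padding or truncating as needed."""
        out = np.zeros(n_samples, dtype=data.dtype)
        shift = int(round((t0 - t_start) / dt))   # sample offset of this trace
        src0, dst0 = max(0, -shift), max(0, shift)
        n = min(len(data) - src0, n_samples - dst0)
        if n > 0:
            out[dst0:dst0 + n] = data[src0:src0 + n]
        return out
    ```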

  11. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  12. Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied:
    • Check for negative planted area, harvested area, production, yield and cost values.
    • Check if harvested area exceeds planted area for annuals.
    • Check FIPS codes.

  13. User's manual for the generalized computer program system. Open-channel flow and sedimentation, TABS-2. Main text

    NASA Astrophysics Data System (ADS)

    Thomas, W. A.; McAnally, W. H., Jr.

    1985-07-01

    TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.

  14. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework

    PubMed Central

    2012-01-01

    Background: For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. Results: We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. Conclusion: The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources. PMID:23216909

  15. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    PubMed

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
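
    The shape of such a search maps naturally onto MapReduce: the map phase keys each spectrum by a precursor-mass bin, and the reduce phase scores co-binned spectra against candidate peptides. The self-contained Python toy below shows only that data flow; the peptide table, bin width, and placeholder scorer are invented for illustration and stand in for the real K-score on Hadoop.

    ```python
    from collections import defaultdict

    # Toy stand-ins: candidate peptides pre-indexed by mass bin, and a
    # placeholder scorer in place of the K-score.
    PEPTIDES = {1: ["PEPTIDE", "PROTEIN"], 2: ["SEQUENCE"]}

    def mass_bin(mz, width=100.0):
        return int(mz // width)

    def score(peaks, peptide):
        return len(peptide) * len(peaks)   # placeholder, NOT the K-score

    def map_phase(spectra):
        # map: key each spectrum by its precursor-mass bin
        for sid, mz, peaks in spectra:
            yield mass_bin(mz), (sid, peaks)

    def reduce_phase(pairs):
        # shuffle/reduce: group by bin, score spectra against co-binned peptides
        bins = defaultdict(list)
        for key, val in pairs:
            bins[key].append(val)
        for key, items in bins.items():
            for sid, peaks in items:
                for pep in PEPTIDES.get(key, []):
                    yield sid, pep, score(peaks, pep)

    spectra = [("s1", 150.0, [101.1, 120.3]), ("s2", 250.0, [88.0])]
    for hit in reduce_phase(map_phase(spectra)):
        print(hit)
    ```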

  16. Datacomputer Project

    DTIC Science & Technology

    1975-06-30

    assigned small integers called Job File Numbers or JFNs with which future references are made. Since the name of the device on which a file... what could reasonably be called the "Datacomputer proper", and are the primary output of the Datacomputer project. They are conceptually and func... sections, each of which is broken into 512 word blocks called pages. When the Request Handler...

  17. Method for data compression by associating complex numbers with files of data values

    DOEpatents

    Feo, J.T.; Hanks, D.C.; Kraay, T.A.

    1998-02-10

    A method for compressing data for storage or transmission is disclosed. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file. 4 figs.

  18. Method for data compression by associating complex numbers with files of data values

    DOEpatents

    Feo, John Thomas; Hanks, David Carlton; Kraay, Thomas Arthur

    1998-02-10

    A method for compressing data for storage or transmission. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file.
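
    Both records describe the same decompression idea: each point of the complex plane is pulled to a polynomial root by an iterative root finder, and the value attached to that root becomes the reconstructed entry. The sketch below uses Newton's method on a made-up cubic with a made-up value map, purely to illustrate the mapping step.

    ```python
    import numpy as np

    coeffs = [1.0, 0.0, 0.0, -1.0]            # p(z) = z**3 - 1 (illustrative)
    roots = np.roots(coeffs)
    value_map = {0: 7, 1: 42, 2: 99}           # value assigned to each root

    def decode_entry(z, iters=50):
        """Map a complex sample point to a root via Newton's method and
        return the value stored for that root."""
        p = np.poly1d(coeffs)
        dp = p.deriv()
        for _ in range(iters):
            z = z - p(z) / dp(z)               # Newton iteration
        return value_map[int(np.argmin(np.abs(roots - z)))]

    print(decode_entry(0.9 + 0.1j))            # converges to the root at 1+0j
    ```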

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepard, Kenneth L.; Sturcken, Noah Andrew

    Power controller includes an output terminal having an output voltage, at least one clock generator to generate a plurality of clock signals and a plurality of hardware phases. Each hardware phase is coupled to the at least one clock generator and the output terminal and includes a comparator. Each hardware phase is configured to receive a corresponding one of the plurality of clock signals and a reference voltage, combine the corresponding clock signal and the reference voltage to produce a reference input, generate a feedback voltage based on the output voltage, compare the reference input and the feedback voltage using the comparator and provide a comparator output to the output terminal, whereby the comparator output determines a duty cycle of the power controller. An integrated circuit including the power controller is also provided.

  20. 76 FR 69264 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-08

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice that the Commission received the following electric rate filings: Docket Numbers: ER10-3069-002; ER10-3070-002. Applicants: Alcoa Power Generating Inc. Description: Alcoa Power Generating Inc. and Alcoa...

  1. 77 FR 27042 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-08

    ... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG12-62-000. Applicants: Canadian Hills Wind, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Canadian Hills Wind, LLC. Filed Date: 4/26/12. Accession Number: 20120426-5240...

  2. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
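
    As an illustration of the handshaking protocol described (not the GPM interface specification itself), a data notice can be reduced to a small structure: file identity, data start/stop times, and a checksum the receiver can verify before issuing its receipt. The field names and the MD5 choice below are assumptions.

    ```python
    import hashlib
    import json

    def make_data_notice(path, start_utc, stop_utc):
        """Build a data notice of the general kind the abstract describes."""
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return json.dumps({"file": path, "start": start_utc,
                           "stop": stop_utc, "checksum": h.hexdigest()})

    # e.g. make_data_notice("obs_20130101.dat",
    #                       "2013-001T00:00:00", "2013-001T23:59:59")
    ```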

  3. Selected micrometeorological and soil-moisture data at Amargosa Desert Research Site, an arid site near Beatty, Nye County, Nevada, 1998-2000

    USGS Publications Warehouse

    Johnson, Michael J.; Mayers, Charles J.; Andraski, Brian J.

    2002-01-01

    Selected micrometeorological and soil-moisture data were collected at the Amargosa Desert Research Site adjacent to a low-level radioactive waste and hazardous chemical waste facility near Beatty, Nev., 1998-2000. Data were collected in support of ongoing research studies to improve the understanding of hydrologic and contaminant-transport processes in arid environments. Micrometeorological data include precipitation, air temperature, solar radiation, net radiation, relative humidity, ambient vapor pressure, wind speed and direction, barometric pressure, soil temperature, and soil-heat flux. All micrometeorological data were collected using a 10-second sampling interval by data loggers that output daily mean, maximum, and minimum values, and hourly mean values. For precipitation, data output consisted of daily, hourly, and 5-minute totals. Soil-moisture data included periodic measurements of soil-water content at nine neutron-probe access tubes with measurable depths ranging from 5.25 to 29.75 meters. The computer data files included in this report contain the complete micrometeorological and soil-moisture data sets. The computer data consists of seven files with about 14 megabytes of information. The seven files are in tabular format: (1) one file lists daily mean, maximum, and minimum micrometeorological data and daily total precipitation; (2) three files list hourly mean micrometeorological data and hourly precipitation for each year (1998-2000); (3) one file lists 5-minute precipitation data; (4) one file lists mean soil-water content by date and depth at four experimental sites; and (5) one file lists soil-water content by date and depth for each neutron-probe access tube. This report highlights selected data contained in the computer data files using figures, tables, and brief discussions. Instrumentation used for data collection also is described. Water-content profiles are shown to demonstrate variability of water content with depth. Time-series data are plotted to illustrate temporal variations in micrometeorological and soil-water content data. Substantial precipitation at the end of an El Niño cycle in early 1998 resulted in measurable water penetration to a depth of 1.25 meters at one of the four experimental soil-monitoring sites.

  4. NREL: SMARTS - About SMARTS

    Science.gov Websites

    To use SMARTS, users construct text files of 20-30 lines of simple text, and output consists of spreadsheet-compatible American Standard Code for Information Interchange (ASCII) text.

  5. Charge control microcomputer device for vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morishita, M.; Kouge, S.

    1986-08-26

    A charge control microcomputer device is described for a vehicle, comprising: an AC generator driven by an engine for generating an output current, the generator having armature coils and a field coil; a battery charged by a rectified output of the generator and generating a terminal voltage; a voltage regulator for controlling a current flowing in the field coil, to control an output voltage of the generator to a predetermined value; an engine controlling microcomputer for receiving engine parameter data from the engine, to control the operation of the engine; a charge control microcomputer for processing input data including data on at least one engine parameter output from the engine controlling microcomputer, and charge system data including at least one of battery terminal voltage data, generator voltage data and generator output current data, to provide a reference voltage for the voltage regulator.

  6. Mixed Linear/Square-Root Encoded Single Slope Ramp Provides a Fast, Low Noise Analog to Digital Converter with Very High Linearity for Focal Plane Arrays

    NASA Technical Reports Server (NTRS)

    Wrigley, Christopher James (Inventor); Hancock, Bruce R. (Inventor); Cunningham, Thomas J. (Inventor); Newton, Kenneth W. (Inventor)

    2014-01-01

    An analog-to-digital converter (ADC) converts pixel voltages from a CMOS image into a digital output. A voltage ramp generator generates a voltage ramp that has a linear first portion and a non-linear second portion. A digital output generator generates a digital output based on the voltage ramp, the pixel voltages, and comparator output from an array of comparators that compare the voltage ramp to the pixel voltages. A return lookup table linearizes the digital output values.
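
    The ramp/lookup pairing can be made concrete with a toy model: codes in the linear first portion map one-to-one, codes in the second portion grow quadratically in voltage (so the encoded signal is effectively square-rooted), and a return lookup table undoes the encoding. The breakpoint and scale factors below are invented for illustration.

    ```python
    import numpy as np

    N, SPLIT = 1024, 512
    codes = np.arange(N)
    # Toy mixed ramp: linear below the breakpoint, quadratic in code above it.
    ramp = np.where(codes < SPLIT,
                    codes.astype(float),
                    SPLIT + (codes - SPLIT) ** 2 / 4.0)

    def convert(pixel_v):
        """Digitize one pixel: the comparator trips at the first ramp step
        above the pixel voltage; the return lookup table linearizes it."""
        code = min(int(np.searchsorted(ramp, pixel_v)), N - 1)
        return code, ramp[code]    # raw code and linearized value

    print(convert(300.0))    # linear region: code tracks voltage directly
    print(convert(5000.0))   # encoded region: lookup restores linear units
    ```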

  7. Detection with Enhanced Energy Windowing Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bass, David A.; Enders, Alexander L.

    2016-12-01

    This document reviews the progress of Phase I of the Detection with Enhanced Energy Windowing (DEEW) project. The DEEW project is the implementation of software incorporating an algorithm which reviews data generated by radiation portal monitors and utilizes advanced and novel techniques for detecting radiological and fissile material while not alarming on Naturally Occurring Radioactive Material. Independent testing indicated that the Enhanced Energy Windowing algorithm showed promise at reducing the probability of alarm in the stream of commerce compared to existing algorithms and other developmental algorithms, while still maintaining adequate sensitivity to threats. This document contains a brief description of the project, instructions for setting up and running the applications, and guidance to help make reviewing the output files and source code easier.

  8. High Output Piezo/Triboelectric Hybrid Generator

    PubMed Central

    Jung, Woo-Suk; Kang, Min-Gyu; Moon, Hi Gyu; Baek, Seung-Hyub; Yoon, Seok-Jin; Wang, Zhong-Lin; Kim, Sang-Woo; Kang, Chong-Yun

    2015-01-01

    Recently, piezoelectric and triboelectric energy harvesting devices have been developed to convert mechanical energy into electrical energy. In particular, it is well known that triboelectric nanogenerators have a simple structure and a high output voltage. However, whereas nanostructures improve the output of triboelectric generators, their fabrication is still complicated and unfavorable in terms of large-scale production and the long-term durability of the device. Here, we demonstrate a hybrid generator which does not use nanostructures but generates much higher output power from a small mechanical force, integrating a piezoelectric generator into a triboelectric generator and deriving from the simultaneous use of piezoelectric and triboelectric mechanisms in one press-and-release cycle. This hybrid generator combines a high piezoelectric output current and a high triboelectric output voltage, producing a peak output voltage of ~370 V, a current density of ~12 μA·cm−2, and an average power density of ~4.44 mW·cm−2. The output power successfully lit up 600 LED bulbs under the application of a 0.2 N mechanical force, and it charged a 10 μF capacitor to 10 V in 25 s. Beyond energy harvesting, this work will provide new opportunities for developing a small, built-in power source for self-powered electronics such as mobile electronics. PMID:25791299

  9. 77 FR 40875 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... following exempt wholesale generator filings: Docket Numbers: EG12-81-000. Applicants: Blue Sky East, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Blue Sky East, LLC. Filed.... Docket Numbers: ER12-2146-000 Applicants: Interstate Power and Light Company Description: Cancellation of...

  10. 76 FR 44318 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-25

    ... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG11-107-000. Applicants: Trinity Hills Wind Farm LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Trinity Hills Wind Farm LLC. File Date: 07/15/2011. Accession Number: 20110715...

  11. 76 FR 32182 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG11-87-000. Applicants: Sherbino II Wind Farm LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Sherbino II Wind Farm LLC. Filed Date: 05/26/2011. Accession Number: 20110526...

  12. 78 FR 67354 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    .... Description: Simon Solar, LLC submits Supplement Record in Pending Filing to be effective 10/1/2013. Filed.... Applicants: All Dams Generation, LLC, Arlington Valley Solar Energy II, LLC, Bluegrass Generation Company, L.L.C., Calhoun Power Company, LLC, Centinela Solar Energy, LLC, Cherokee County Cogeneration Partners...

  13. Automated ISS Flight Utilities

    NASA Technical Reports Server (NTRS)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). As I worked on a number of projects, I have written short sections below to give a description for each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since while outside the ISS the astronauts will have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all this work, reading in EVADOSE output file(s) along with a plaintext file input by the user providing input parameters. GEnEVADOSE will output a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX. New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the EVADOSE output. As mentioned above, GEnEVADOSE makes extensive use of ROOT version 6, the data analysis framework developed at the European Organization for Nuclear Research (CERN), and the code is written to the C++11 standard (as are the other projects). My second project is the Automated Mission Reference Exposure Utility (AMREU). Unlike GEnEVADOSE, AMREU is a combination of three frameworks written in both Python and C++, also making use of ROOT (and PyROOT). Run as a combination of daily and weekly cron jobs, these macros query the SRAG database system to determine the active ISS missions, and query minute-by-minute radiation dose information from ISS-TEPC (Tissue Equivalent Proportional Counter), one of the radiation detectors onboard the ISS. Using this information, AMREU creates a corrected data set of daily radiation doses, addressing situations where TEPC may be offline or locked up by correcting doses for days with less than 95% live time (the total amount of time the instrument acquires data) by averaging the past 7 days. As not all errors may be automatically detectable, AMREU also allows for manual corrections, checking an updated plaintext file each time it runs. With the corrected data, AMREU generates cumulative dose plots for each mission, and uses a Python script to generate a flight note file (.docx format) containing these plots, as well as information sections to be filled in and modified by the space weather environment officers with information specific to the week. 
AMREU is set up to run without requiring any user input, and it automatically archives old flight notes and information files for missions that are no longer active. My other projects involve cleaning up a large data set from the Charged Particle Directional Spectrometer (CPDS), joining together many different data sets in order to clean up information in SRAG SQL databases, and developing other automated utilities for displaying information on active solar regions, that may be used by the space weather environment officers to monitor solar activity. I consulted my mentor Dr. Ryan Rios and Dr. Kerry Lee for project requirements and added features, and ROOT developer Edmond Offermann for advice on using the ROOT library. I also received advice and feedback from Dr. Janet Barzilla of SRAG, who tested my code. Besides these inputs, I worked independently, writing all of the code by myself. The code for all these projects is documented throughout, and I have attempted to write it in a modular format. Assuming that ROOT is updated accordingly, these codes are also Y2038-compliant (and Y10K-compliant). This allows the code to be easily referenced, modified and possibly repurposed for non-ISS missions in the future, should the necessary inputs exist. These projects have taught me a lot about coding and software design - I have become a much more skilled C++ programmer and ROOT user, and I also learned to code in Python and PyROOT (and its advantages and disadvantages compared to C++/ ROOT). Furthermore, I have learned about space radiation and radiation modeling, topics that greatly interest me as I pursue a degree in physics. Working alongside experimental physicists like Dr. Rios, I have developed a greater understanding and appreciation for experimental science, something I have always leaned towards but to which I lacked significant exposure. My work in SRAG has also given me the invaluable opportunity to witness the work environment for physicists at NASA, and what a career in academia may look like at a government laboratory such as NASA Johnson Space Center. As I continue my studies and look forward to graduate school and a future career, this experience at NASA has given me a meaningful and enjoyable opportunity to put my skills to use and see what my future career path might hold.

  14. Computerized Integrated Inventory Control for an Air Force Base-Level Supply System.

    DTIC Science & Technology

    1980-06-01

    [Extraction residue from the original report: tables of federal stock class number ranges and of peripheral I/O buffer allocations (disc files, printer, tape, card reader); the running text is not recoverable.]

  15. A program to compute three-dimensional subsonic unsteady aerodynamic characteristics using the doublet lattice method, L216 (DUBFLX). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Richard, M.; Harrison, B. A.

    1979-01-01

    The program input presented consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.

  16. NOSS altimeter algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Forsythe, R. G.; Mcmillan, J. D.

    1982-01-01

    A description of all algorithms required for altimeter processing is given. Each description includes title, description, inputs/outputs, general algebraic sequences and data volume. All required input/output data files are described and the computer resources required for the entire altimeter processing system were estimated. The majority of the data processing requirements for any radar altimeter of the Seasat-1 type are scoped. Additions and deletions could be made for the specific altimeter products required by other projects.

  17. GPFA-AB_Phase1ReservoirTask2DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-10-22

    This submission to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin. The files included in this zip file contain all data pertinent to the methods and results of this task's output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, the Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in the zip file). A shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.

  18. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    PubMed

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented as a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.

  19. File formats commonly used in mass spectrometry proteomics.

    PubMed

    Deutsch, Eric W

    2012-12-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics.

  20. Delay correlation analysis and representation for vital complaint VHDL models

    DOEpatents

    Rich, Marvin J.; Misra, Ashutosh

    2004-11-09

    A method and system unbind a rise/fall tuple of a VHDL generic variable and create rise time and fall time generics of each generic variable that are independent of each other. Then, according to a predetermined correlation policy, the method and system collect delay values in a VHDL standard delay file, sort the delay values, remove duplicate delay values, group the delay values into correlation sets, and output an analysis file. The correlation policy may include collecting all generic variables in a VHDL standard delay file, selecting each generic variable, and performing reductions on the set of delay values associated with each selected generic variable.
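
    A compact way to picture the correlation policy: after the rise/fall tuples are unbound into independent generics, delay values are collected, sorted, and de-duplicated, and generics sharing a value fall into one correlation set. The Python sketch below mirrors that recipe; the input format and set representation are assumptions, not the patented system.

    ```python
    from collections import defaultdict

    def build_correlation_sets(generic_delays):
        """Group generics that share a delay value into correlation sets."""
        by_value = defaultdict(list)
        for generic, value in generic_delays:
            by_value[value].append(generic)
        # one correlation set per distinct (sorted, de-duplicated) delay value
        return [(v, sorted(by_value[v])) for v in sorted(by_value)]

    # rise/fall tuples already unbound into independent generics:
    print(build_correlation_sets([("u1_rise", 0.12), ("u2_rise", 0.12),
                                  ("u1_fall", 0.30)]))
    # [(0.12, ['u1_rise', 'u2_rise']), (0.3, ['u1_fall'])]
    ```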

  1. Size reduction techniques for vital compliant VHDL simulation models

    DOEpatents

    Rich, Marvin J.; Misra, Ashutosh

    2006-08-01

    A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. Then the system collects all the delay values of the selected instance and builds super generics for the rise-time and the fall-time of the selected instance. Then, the system repeats this process for every delay value in the standard delay file (310) that correspond to every instance of every logic gate in the logic model. The system then outputs a reduced size standard delay file (314) containing the super generics for every instance of every logic gate in the logic model.

  2. Complementary power output characteristics of electromagnetic generators and triboelectric generators.

    PubMed

    Fan, Feng-Ru; Tang, Wei; Yao, Yan; Luo, Jianjun; Zhang, Chi; Wang, Zhong Lin

    2014-04-04

    Recently, a triboelectric generator (TEG) has been invented to convert mechanical energy into electricity by a conjunction of triboelectrification and electrostatic induction. Compared to the traditional electromagnetic generator (EMG) that produces a high output current but low voltage, the TEG has different output characteristics of low output current but high output voltage. In this paper, we present a comparative study regarding the fundamentals of TEGs and EMGs. The power output performances of the EMG and the TEG have a special complementary relationship, with the EMG being a voltage source and the TEG a current source. Utilizing a power transformed and managed (PTM) system, the current output of a TEG can reach as high as ∼3 mA, which can be coupled with the output signal of an EMG to enhance the output power. We also demonstrate a design to integrate a TEG and an EMG into a single device for simultaneously harvesting mechanical energy. In addition, the integrated NGs can independently output a high voltage and a high current to meet special needs.

  3. 78 FR 34361 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG13-36-000. Applicants: Catalina Solar Lessee, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Catalina Solar...

  4. 76 FR 25324 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG11-80-000. Applicants: Bayonne Energy Center, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator, Status of Bayonne Energy...

  5. 77 FR 27041 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-08

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following electric rate filings: Docket Numbers: ER11-4266-003. Applicants: Richland-Stryker Generation LLC. Description: Supplemental to Notice of Non-Material Change in Status of Richland-Stryker Generation LLC....

  6. 77 FR 105 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG12-22-000. Applicants: Perrin Ranch Wind, LLC. Description: Notice of Self-Certification of Exempt Wholesale Generator Status of Perrin Ranch Wind, LLC...

  7. 76 FR 39089 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG11-80-000. Applicants: Bayonne Energy Center, LLC. Description: Amendment to Notice of Self-Certification of Exempt Wholesale Generator Status of...

  8. 77 FR 33208 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-05

    .... Comments Due: 5 p.m. ET 6/12/12. Docket Numbers: ER12-1832-000. Applicants: Lucky Corridor, LLC... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG12-69... Generator Status of Shooting Star Wind Project, LLC. Filed Date: 5/22/12. Accession Number: 20120522-5163...

  9. 77 FR 46428 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-03

    ... Generator Status of Spinning Spur Wind LLC. Filed Date: 7/27/12. Accession Number: 20120727-5038. Comments... II Wind Farm LLC, Fowler Ridge III Wind Farm LLC, Fowler Ridge Wind Farm LLC, Goshen Phase II, LLC... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG12-93...

  10. 78 FR 9682 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... Wholesale Generator Status of Niagara Wind Power, LLC. Filed Date: 1/31/13. Accession Number: 20130131-5139... Bay Wind, LLC, Vasco Winds, LLC, Victory Garden Phase IV, LLC, Waymart Wind Farm, L.P., Wessington... that the Commission received the following exempt wholesale generator filings: Docket Numbers: EG13-15...

  11. 75 FR 13529 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 March 15..., 2010. Docket Numbers: ER09-1099-003; ER07-412-004. Applicants: Empire Generating Co, LLC; ECP Energy I, LLC. Description: Quarterly Report of ECP Energy I, LLC, and Empire Generating Co, LLC. Filed Date: 02...

  12. 78 FR 61942 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-07

    ... filings: Docket Numbers: EG13-63-000. Applicants: Mountain Wind Power, LLC. Description: Notice Of Self-Certification Of Exempt Wholesale Generator Status Of Mountain Wind Power, LLC. Filed Date: 9/26/13. Accession...: Mountain Wind Power, LLC. Description: Notice Of Self-Certification Of Exempt Wholesale Generator Status Of...

  13. 75 FR 8685 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ...: EC10-47-000. Applicants: Ashtabula Wind II, LLC. Description: Ashtabula Wind II, LLC et al. requests... Service Company submits supplement to the Triennial Market Power Analysis filed on 7/31/09. Filed Date: 02... Corporation; Hot Spring Power Company, LLC; Mt. Tom Generating Company; Choctaw Gas Generation, LLC; Hopewell...

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, Jack

    The TSPM receives and processes ASIC signals and transmits processed data over to the PC using an Ethernet cable. The data is given a location and a time stamp. This is the heart of the device as it gathers and stamps the timing and location of events on each of the ASICs. The five files for the TSPM are needed to manufacture PET scanners that are based on the RatCAP (Rat Conscious Animal PET). They include a TSPM schematic, a raw data file to build the RatCAP TSPM, an output file that along with the assay file is used by an assembly house to build the RatCAP TSPM, an assay file that provides the part list and XY location for the components that go on the RatCAP TSPM, firmware that includes the source code to program the FPGA, and a realized program on the TSPM based on the firmware.

  15. GIDEP Batching Tool

    NASA Technical Reports Server (NTRS)

    Fong, Danny; Odell,Dorice; Barry, Peter; Abrahamian, Tomik

    2008-01-01

    This software provides internal, automated search mechanics for GIDEP (Government-Industry Data Exchange Program) Alert data imported from the GIDEP government Web site. The batching tool allows the import of a single parts list in tab-delimited text format into the local JPL GIDEP database. Delimiters are removed from every part number, and both the original part numbers with delimiters and the newly generated list without delimiters are compared. The two lists are run against the GIDEP imports, and any matches are output. This feature only works with Netscape 2.0 or greater, or Internet Explorer 4.0 or greater. The user selects the browser button to choose a text file to import. When the submit button is pressed, this script imports alerts from the text file into the local JPL GIDEP database. This batch tool provides complete in-house control over exported material and data for automated batch matching. The batching tool can match the parts list against the tables, and yields results that aid further research and analysis. This provides more control over GIDEP information for metrics, and reports information not provided by the government site. This software yields results quickly and gives more control over external data from the government site, in order to generate other reports not available from the external source. There is enough space to store years of data. The program relates to risk identification and management with regard to projects and GIDEP alert information encompassing flight parts for space exploration.
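
    The matching step the description outlines (compare both the raw and the delimiter-stripped part numbers against the imported alert data) is small enough to sketch. The delimiter set and function names below are assumptions, not the JPL script itself.

    ```python
    import re

    def strip_delims(pn):
        # Drop common delimiters so "123-456/A" and "123456A" compare equal.
        return re.sub(r"[-_/ .]", "", pn.upper())

    def match_alerts(parts_list, alert_parts):
        """Run both the original and the delimiter-stripped part numbers
        against the alert data and return any matches."""
        alerts = set(alert_parts) | {strip_delims(a) for a in alert_parts}
        return [p for p in parts_list
                if p in alerts or strip_delims(p) in alerts]

    print(match_alerts(["123-456/A", "999.000"], ["123456A"]))  # ['123-456/A']
    ```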

  16. Extraction of Vertical Profiles of Atmospheric Variables from Gridded Binary, Edition 2 (GRIB2) Model Output Files

    DTIC Science & Technology

    2018-01-18

    processing. Specifically, the method described herein uses wgrib2 commands along with a Python script or program to produce tabular text files that in... It makes use of software that is readily available and can be implemented on many computer systems combined with relatively modest additional... example), extracts appropriate information, and lists the extracted information in a readable tabular form. The Python script used here is described in
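
    As a rough illustration of the wgrib2-plus-Python approach (the report's actual script is not reproduced here), the sketch below shells out to wgrib2 and parses its CSV output. The -match and -csv options and the CSV column order are taken from the wgrib2 documentation and should be verified against your installation.

        import csv
        import subprocess

        def extract_variable(grib2_path, variable="TMP", csv_path="profile.csv"):
            # -match filters records by name; -csv writes rows of the form
            # time0,time1,variable,level,longitude,latitude,value
            subprocess.run(
                ["wgrib2", grib2_path, "-match", f":{variable}:", "-csv", csv_path],
                check=True)
            profile = []
            with open(csv_path, newline="") as fh:
                for row in csv.reader(fh):
                    profile.append((row[3], float(row[6])))  # (level, value)
            return profile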

  17. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  18. Up-to-date state of storage techniques used for large numerical data files

    NASA Technical Reports Server (NTRS)

    Chlouba, V.

    1975-01-01

    Methods for data storage and output in data banks and memory files are discussed along with a survey of equipment available for this. Topics discussed include magnetic tapes, magnetic disks, Terabit magnetic tape memory, Unicon 690 laser memory, IBM 1360 photostore, microfilm recording equipment, holographic recording, film readers, optical character readers, digital data storage techniques, and photographic recording. The individual types of equipment are summarized in tables giving the basic technical parameters.

  19. Navy Occupational Health Information Management System (NOHIMS). Environmental Exposure Module. Users’ Manual

    DTIC Science & Technology

    1987-01-16

    menus, controls user and device access to the system, manages the security features associated with menus, devices, and users, provides... in the files, or the number of files in the system. 3.0 MODULE INPUT PROCESSES 3.1 Summary of Input Processes: The EE module contains many menu... Output Processes: The EE module contains many menu options which enable the user to obtain needed information from the module. These options can be

  20. Modeling and Optimization of Coordinative Operation of Hydro-wind-photovoltaic Considering Power Generation and Output Fluctuation

    NASA Astrophysics Data System (ADS)

    Wang, Xianxun; Mei, Yadong

    2017-04-01

    Coordinative operation of hydro, wind, and photovoltaic power is a way to reconcile power generation with the output fluctuation of new energy sources and to overcome a bottleneck in their development. Research on the coordination mechanism has been hampered by shortcomings in characterizing output fluctuation, representing grid topology, and handling curtailed power. In this paper, a multi-objective, multi-hierarchy model of coordinative hydro-wind-photovoltaic operation is built with the aims of maximizing power generation and minimizing output fluctuation, subject to constraints on grid topology and the balanced disposal of curtailed power. In the case study, uncoordinated and coordinated operation are compared in terms of power generation, curtailed power, and output fluctuation in order to examine the coordination mechanism. Compared with running each source alone, coordinative operation of hydro, wind, and photovoltaic power gains compensation benefits: peak-alternation operation, through the compensating regulation of hydropower, significantly reduces curtailment and makes resource utilization more effective. The Pareto frontier of power generation versus output fluctuation, obtained through multi-objective optimization, clarifies the trade-off between the two objectives: under coordinative operation, output fluctuation can be markedly reduced at the cost of a slight decline in power generation, while curtailment also drops sharply compared with separate operation.
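
    A toy sketch of how such a two-objective trade-off can be traced by sweeping the weight of a scalarized objective; the candidate schedules and objective values are hypothetical, and the paper's actual model is far richer (hydraulic constraints, grid topology, curtailment balancing).

        import numpy as np

        def pareto_frontier(generation, fluctuation):
            # generation: total power produced by each candidate schedule (maximize)
            # fluctuation: output-fluctuation measure of each schedule (minimize)
            pts = np.column_stack([generation, fluctuation])
            frontier = []
            for w in np.linspace(0.0, 1.0, 21):
                # Scalarize: reward generation, penalize fluctuation.
                score = w * pts[:, 0] - (1.0 - w) * pts[:, 1]
                frontier.append(tuple(pts[np.argmax(score)]))
            return sorted(set(frontier))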

  1. Performance of the Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  2. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    PubMed Central

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
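
    For readers without MATLAB, here is a minimal Python analogue of the COPAquant-style filter-and-summarize step; the tab-delimited layout and the column names TOF, EXT and Green are assumptions about the Biosort export, not the published code.

        import pandas as pd

        def copas_summary(path, tof_min=100, ext_min=50):
            df = pd.read_csv(path, sep="\t")
            # Keep objects above size (TOF) and extinction (EXT) thresholds,
            # then summarize a normalized fluorescence ratio per object.
            kept = df[(df["TOF"] >= tof_min) & (df["EXT"] >= ext_min)].copy()
            kept["green_per_tof"] = kept["Green"] / kept["TOF"]
            return kept["green_per_tof"].describe()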

  3. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.

  4. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
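
    The claim can be paraphrased in a few lines of Python: data portions are packed into a hole-free log, each with an index entry (logical offset, physical offset, length), and the holes reappear as zeros on read. This is a sketch of the idea, not the PLFS implementation.

        from dataclasses import dataclass

        @dataclass
        class IndexEntry:
            logical_offset: int   # where the portion sits in the sparse file
            physical_offset: int  # where it sits in the packed log
            length: int

        def write_portion(log, index, logical_offset, data):
            # Append at the logical end of the packed log; no hole is stored.
            index.append(IndexEntry(logical_offset, len(log), len(data)))
            log.extend(data)

        def read_sparse(log, index, size):
            out = bytearray(size)  # zero-filled: the restored holes
            for e in index:
                out[e.logical_offset:e.logical_offset + e.length] = \
                    log[e.physical_offset:e.physical_offset + e.length]
            return bytes(out)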

  5. Sprinting for the Win; Distribution of Power Output in Women's Professional Cycling.

    PubMed

    Peiffer, Jeremiah J; Abbiss, Chris R; Haakonssen, Eric C; Menaspà, Paolo

    2018-04-24

    This study examined the power output distribution and sprint characteristics of professional female road cyclists. 31 race files, representing top-five finishes, were collected from seven professional female cyclists. Files were analysed for sprint characteristics including mean and peak power output, velocity and duration. The final 20 min before the sprint was analysed to determine the mean maximal power output (MMP) for 5, 15, 30, 60, 240 and 600 s durations. Throughout the race, the number of efforts for each duration exceeding 80% of its corresponding final 20-min MMP (MMP80) was determined, as was the number of 15 s efforts exceeding 80% of the mean final sprint power output (MSP80). Sprint finishes lasted 21.8 ± 6.7 s with a mean and peak power output of 679 ± 101 W and 886 ± 91 W, respectively. Throughout the race, more 5, 15, and 30 s efforts above MMP80 were completed in the 5th compared with the 1st-4th quintiles of the race. 60 s efforts were greater during the 5th compared with the 1st, 2nd, and 4th quintiles, and during the 3rd compared with the 4th quintile. More 240 s efforts were recorded during the 5th compared with the 1st and 4th quintiles. 82% of 15 s efforts above MSP80 were completed in the 2nd, 3rd and 5th quintiles of the race. These data demonstrate the variable nature of women's professional cycling and the physical demands necessary for success, providing information that could enhance in-race decision-making and the development of race-specific training programs.
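
    A sketch of the mean-maximal-power computation used in such analyses, assuming a 1 Hz power series taken from the race file (a power-meter export); this is illustrative, not the authors' code.

        import pandas as pd

        def mmp(power, durations=(5, 15, 30, 60, 240, 600)):
            # power: 1 Hz pandas Series of watts. For each duration, MMP is the
            # highest mean power sustained over any window of that length.
            return {d: power.rolling(d).mean().max() for d in durations}

        def efforts_above(power, duration, threshold):
            # Count non-overlapping efforts whose mean power exceeds a threshold
            # (e.g., 80% of the final 20-min MMP for that duration).
            rolling = power.rolling(duration).mean()
            count, i = 0, duration
            while i <= len(power):
                if rolling.iloc[i - 1] > threshold:
                    count, i = count + 1, i + duration
                else:
                    i += 1
            return count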

  6. Format( )MEDIC( )Input

    NASA Astrophysics Data System (ADS)

    Foster, K.

    1994-09-01

    This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.

  7. Designing and Implementing a Family of Intrusion Detection Systems

    DTIC Science & Technology

    2004-11-01

    configure (train), generates many false alarms – Misuse detection (signature analysis) (NFR, Emerald, Snort, STAT) • Generates few false alarms • Detects... to create .rhosts file in world-writable ftp home directory – rlogin using bogus .rhosts file S0 create_file read_rhosts S3 S2 login S1 STAT... ftp-write in STATL use ustat

  8. The Electrolyte Genome project: A big data approach in battery materials discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qu, Xiaohui; Jain, Anubhav; Rajput, Nav Nidhi

    2015-06-01

    We present a high-throughput infrastructure for the automated calculation of molecular properties with a focus on battery electrolytes. The infrastructure is largely open-source and handles both practical aspects (input file generation, output file parsing, and information management) as well as more complex problems (structure matching, salt complex generation, and failure recovery). Using this infrastructure, we have computed the ionization potential (IP) and electron affinities (EA) of 4830 molecules relevant to battery electrolytes (encompassing almost 55,000 quantum mechanics calculations) at the B3LYP/6-31+G* level. We describe automated workflows for computing redox potential, dissociation constant, and salt-molecule binding complex structure generation. We present routines for automatic recovery from calculation errors, which bring the failure rate from 9.2% to 0.8% for the QChem DFT code. Automated algorithms to check duplication between two arbitrary molecules and structures are described. We present benchmark data on basis sets and functionals on the G2-97 test set; one finding is that an IP/EA calculation method that combines PBE geometry optimization and B3LYP energy evaluation requires less computational cost and yields nearly identical results compared to a full B3LYP calculation, and could be suitable for the calculation of large molecules. Our data indicate that among the 8 functionals tested, XYGJ-OS and B3LYP are the two best functionals to predict IP/EA, with RMSEs of 0.12 and 0.27 eV, respectively. Application of our automated workflow to a large set of quinoxaline derivative molecules shows that functional-group and substitution-position effects can be separated for the IP/EA of quinoxaline derivatives, and the most sensitive position is different for IP and EA. Published by Elsevier B.V.
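
    The two headline quantities reduce to energy differences between the neutral molecule and its ions; a minimal sketch with hypothetical inputs (vertical values, hartree-to-eV factor rounded):

        HARTREE_TO_EV = 27.2114

        def ip_ea(e_neutral, e_cation, e_anion):
            # Ionization potential and electron affinity (eV) from the total
            # energies (hartree) of the neutral molecule and its ions.
            ip = (e_cation - e_neutral) * HARTREE_TO_EV
            ea = (e_neutral - e_anion) * HARTREE_TO_EV
            return ip, ea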

  9. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user’s imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  10. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

    The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
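
    The simulate-once-then-weight idea can be expressed compactly. This is a sketch assuming each compartment's projection is stored as a NumPy array and its pharmacokinetic activity at the acquisition time is known; the names are hypothetical, not TestDose's API.

        import numpy as np

        def aggregate_projection(projections, activities):
            # Each compartment is simulated only once; its stored projection is
            # weighted by the compartment activity at the requested time-point
            # and the weighted projections are summed into the final image.
            image = np.zeros_like(next(iter(projections.values())))
            for compartment, proj in projections.items():
                image += activities[compartment] * proj
            return image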

  11. Selected micrometeorological, soil-moisture, and evapotranspiration data at Amargosa Desert Research Site in Nye County near Beatty, Nevada, 2001-05

    USGS Publications Warehouse

    Johnson, Michael J.; Mayers, C. Justin; Garcia, C. Amanda; Andraski, Brian J.

    2007-01-01

    Selected micrometeorological and soil-moisture data were collected at the Amargosa Desert Research Site adjacent to a low-level radioactive waste and hazardous chemical waste facility near Beatty, Nevada, 2001-05. Evapotranspiration data were collected from February 2002 through the end of December 2005. Data were collected in support of ongoing research to improve the understanding of hydrologic and contaminant-transport processes in arid environments. Micrometeorological data include solar radiation, net radiation, air temperature, relative humidity, saturated and ambient vapor pressure, wind speed and direction, barometric pressure, precipitation, near-surface soil temperature, soil-heat flux and soil-water content. All micrometeorological data were collected using a 10-second sampling interval by data loggers that output daily and hourly mean values. Daily maximum and minimum values are based on hourly mean values. Precipitation data output includes daily and hourly totals. Selected soil-moisture profiles at depth include periodic measurements of soil volumetric water content at nine neutron-probe access tubes to depths ranging from 5.25 to 29.25 meters. Evapotranspiration data include measurement of daily evapotranspiration and 15-minute fluxes of the four principal energy budget components of latent-heat flux, sensible-heat flux, soil-heat flux, and net radiation. Other data collected and used in equations to determine evapotranspiration include temperature and water content of soil, temperature and vapor pressure of air, and covariance values. Evapotranspiration and flux estimates during 15-minute intervals were calculated at a 0.1-second execution interval using the eddy covariance method. Data files included in this report contain the complete micrometeorological, soil-moisture, and evapotranspiration field data sets. These data files are presented in tabular Excel spreadsheet format. This report highlights selected data contained in the computer-generated data files using figures, tables, and brief discussions. Instrumentation used for data collection also is described. Water-content profiles are shown to demonstrate variability of water content with depth. Time-series data are plotted to illustrate temporal variations in micrometeorological, soil-water content, and evapotranspiration data.

  12. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity for any point located on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The software tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. Changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available for the scientific community.
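
    The direct problem is the rigid-plate relation v = ω × r; a compact sketch of it follows (spherical Earth assumed, pole given as latitude/longitude and a rotation rate in degrees per Myr). This illustrates the computation, not PEM2's MATLAB code.

        import numpy as np

        def plate_velocity(lat, lon, pole_lat, pole_lon, rate_deg_per_myr):
            def unit(la, lo):
                la, lo = np.radians(la), np.radians(lo)
                return np.array([np.cos(la) * np.cos(lo),
                                 np.cos(la) * np.sin(lo),
                                 np.sin(la)])
            # Angular velocity vector (rad/yr) along the Euler pole direction.
            omega = np.radians(rate_deg_per_myr) / 1e6 * unit(pole_lat, pole_lon)
            r_mm = 6.371e9 * unit(lat, lon)   # Earth radius in millimetres
            return np.cross(omega, r_mm)      # velocity vector in mm/yr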

  13. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format

    PubMed Central

    Ahmed, Zeeshan; Dandekar, Thomas

    2018-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medicinal imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge for implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product-line-architecture-based bioinformatics tool, ‘Mining Scientific Literature (MSL)’, which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system’s output in different formats including text, PDF, XML and image files. Hence, MSL is an easy-to-install and easy-to-use analysis tool for interpreting published scientific literature in PDF format. PMID:29721305

  14. TileDCS web system

    NASA Astrophysics Data System (ADS)

    Maidantchik, C.; Ferreira, F.; Grael, F.; Atlas Tile Calorimeter Community

    2010-04-01

    The web system described here provides features to monitor data acquired by the ATLAS Detector Control System (DCS). The DCS is responsible for overseeing the coherent and safe operation of the ATLAS experiment hardware. In the context of the Hadronic Tile Calorimeter Detector (TileCal), it controls the power supplies of the readout electronics, acquiring voltage, current, temperature and coolant pressure measurements. Physics data taking requires stable operation of the power sources. The TileDCS Web System automatically retrieves data and extracts statistics for given periods of time. The mean and standard deviation outcomes are stored as XML files and compared to preset thresholds. Further, a graphical representation of the TileCal cylinders indicates the state of the supply system of each detector drawer, with a color designated for each kind of state. In this way problems are easier to find and the collaboration members can focus on them. The user selects a module and the system presents detailed information. It is possible to verify the statistics and generate charts of the parameters over time. The TileDCS Web System also presents information about the latest status of the power supplies: a wedge is colored green whenever the system is on, and red otherwise. Furthermore, it is possible to perform customized analysis. The system provides search interfaces where the user can set the module, parameters, and time period of interest, and produces the retrieved data as charts, XML files, CSV and ROOT files according to the user's choice.
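
    A sketch of the reduce-and-compare step (period statistics against preset thresholds mapped to a state color); the limits and field names are illustrative, not the TileDCS schema.

        import numpy as np

        def drawer_state(samples, mean_limits, std_limit):
            mean, std = float(np.mean(samples)), float(np.std(samples))
            ok = mean_limits[0] <= mean <= mean_limits[1] and std <= std_limit
            # Green when the supply statistics sit inside the thresholds,
            # red otherwise, matching the cylinder-view color coding.
            return {"mean": mean, "std": std, "state": "green" if ok else "red"}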

  15. preAssemble: a tool for automatic sequencer trace data processing.

    PubMed

    Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V

    2006-01-17

    Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands of files. Sequence quality assessment on various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and the Staden Package, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small to large scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and LINUX operating systems. It is available for download and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently preAssemble can be used as efficiently for just several trace files as for large scale sequence processing.
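
    A sketch of driving base calling from Python in the same spirit (the pipeline itself uses Pregap4 and Phred internally); the phred command-line options shown are assumptions from its documentation and should be checked against the installed version.

        import subprocess

        def base_call(chromat_dir, fasta_out="seqs.fasta", qual_out="seqs.qual"):
            # Run Phred over a directory of trace files, writing the called
            # sequences and the per-base quality values.
            subprocess.run(
                ["phred", "-id", chromat_dir, "-sa", fasta_out, "-qa", qual_out],
                check=True)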

  16. Method and apparatus for measuring response time

    DOEpatents

    Johanson, Edward W.; August, Charles

    1985-01-01

    A method of measuring the response time of an electrical instrument which generates an output signal in response to the application of a specified input, wherein the output signal varies as a function of time and when subjected to a step input approaches a steady-state value, comprises the steps of: (a) applying a step input of predetermined value to the electrical instrument to generate an output signal; (b) simultaneously starting a timer; (c) comparing the output signal to a reference signal to generate a stop signal when the output signal is substantially equal to the reference signal, the reference signal being a specified percentage of the steady-state value of the output signal corresponding to the predetermined value of the step input; and (d) applying the stop signal when generated to stop the timer.
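
    In software form the method is a timer bracketed by a step input and a comparator; a sketch with hypothetical apply_step() and read_output() callables standing in for the instrument interface:

        import time

        def response_time(apply_step, read_output, steady_state, fraction=0.632):
            # Apply the step, start the timer, and stop when the output first
            # reaches the reference (a specified percentage of steady state;
            # 0.632 corresponds to one time constant of a first-order system).
            reference = fraction * steady_state
            apply_step()
            start = time.perf_counter()
            while read_output() < reference:
                pass  # poll until the comparator would fire
            return time.perf_counter() - start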

  17. Method and apparatus for measuring response time

    DOEpatents

    Johanson, E.W.; August, C.

    1983-08-11

    A method of measuring the response time of an electrical instrument which generates an output signal in response to the application of a specified input, wherein the output signal varies as a function of time and when subjected to a step input approaches a steady-state value, comprises the steps of: (a) applying a step input of predetermined value to the electrical instrument to generate an output signal; (b) simultaneously starting a timer; (c) comparing the output signal to a reference signal to generate a stop signal when the output signal is substantially equal to the reference signal, the reference signal being a specified percentage of the steady-state value of the output signal corresponding to the predetermined value of the step input; and (d) applying the stop signal when generated to stop the timer.

  18. LiGRO: a graphical user interface for protein-ligand molecular dynamics.

    PubMed

    Kagami, Luciano Porto; das Neves, Gustavo Machado; da Silva, Alan Wilter Sousa; Caceres, Rafael Andrade; Kawano, Daniel Fábio; Eifler-Lima, Vera Lucia

    2017-10-04

    To speed up the drug-discovery process, molecular dynamics (MD) calculations performed in GROMACS can be coupled to docking simulations for the post-screening analyses of large compound libraries. This requires generating the topology of the ligands in different software, some basic knowledge of Linux command lines, and a certain familiarity in handling the output files. LiGRO, the Python-based graphical interface introduced here, was designed to overcome these protein-ligand parameterization challenges by allowing the graphical (non-command-line-based) control of GROMACS (MD and analysis), ACPYPE (ligand topology builder) and PLIP (protein-binder interactions monitor), programs that can be used together to fully perform and analyze the outputs of complex MD simulations (including energy minimization and NVT/NPT equilibration). By allowing the calculation of linear interaction energies in a simple and quick fashion, LiGRO can be used in the drug-discovery pipeline to select compounds with a better protein-binding interaction profile. The design of LiGRO allows researchers to freely download and modify the software, with the source code being available under the terms of a GPLv3 license from http://www.ufrgs.br/lasomfarmacia/ligro/.
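
    The linear interaction energy estimate mentioned above is conventionally a weighted sum of ensemble-averaged interaction-energy differences; a one-function sketch with the usual alpha/beta (and optional gamma) coefficients, whose values are system-dependent assumptions:

        def lie_binding_energy(dE_vdw, dE_elec, alpha=0.18, beta=0.5, gamma=0.0):
            # dE_vdw / dE_elec: bound-minus-free ensemble averages of the
            # ligand's van der Waals and electrostatic interaction energies.
            # alpha/beta are empirical LIE coefficients; gamma is an offset.
            return alpha * dE_vdw + beta * dE_elec + gamma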

  19. SIGACE Code for Generating High-Temperature ACE Files; Validation and Benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Amit R.; Ganesan, S.; Trkov, A.

    2005-05-24

    A code named SIGACE has been developed as a tool for MCNP users within the scope of a research contract awarded by the Nuclear Data Section of the International Atomic Energy Agency (IAEA) (Ref: 302-F4-IND-11566 B5-IND-29641). A new recipe has been evolved for generating high-temperature ACE files for use with the MCNP code. Under this scheme the low-temperature ACE file is first converted to an ENDF formatted file using the ACELST code and then Doppler broadened, essentially limited to the data in the resolved resonance region, to any desired higher temperature using SIGMA1. The SIGACE code then generates a high-temperature ACE file for use with the MCNP code. A thinning routine has also been introduced in the SIGACE code for reducing the size of the ACE files. The SIGACE code and the recipe for generating ACE files at higher temperatures has been applied to the SEFOR fast reactor benchmark problem (sodium-cooled fast reactor benchmark described in ENDF-202/BNL-19302, 1974 document). The calculated Doppler coefficient is in good agreement with the experimental value. A similar calculation using ACE files generated directly with the NJOY system also agrees with our SIGACE computed results. The SIGACE code and the recipe is further applied to study the numerical benchmark configuration of selected idealized PWR pin cell configurations with five different fuel enrichments as reported by Mosteller and Eisenhart. The SIGACE code that has been tested with several FENDL/MC files will be available, free of cost, upon request, from the Nuclear Data Section of the IAEA.

  20. Pilot climate data system user's guide

    NASA Technical Reports Server (NTRS)

    Reph, M. G.; Treinish, L. A.; Bloch, L.

    1984-01-01

    Instructions for using the Pilot Climate Data System (PCDS), an interactive, scientific data management system for locating, obtaining, manipulating, and displaying climate-research data are presented. The PCDS currently provides this support for approximately twenty data sets. Figures that illustrate the terminal displays which a user sees when he/she runs the PCDS and some examples of the output from this system are included. The capabilities which are described in detail allow a user to perform the following: (1) obtain comprehensive descriptions of a number of climate parameter data sets and the associated sensor measurements from which they were derived; (2) obtain detailed information about the temporal coverage and data volume of data sets which are readily accessible via the PCDS; (3) extract portions of a data set using criteria such as time range and geographic location, and output the data to tape, user terminal, system printer, or online disk files in a special data-set-independent format; (4) access and manipulate the data in these data-set-independent files, performing such functions as combining the data, subsetting the data, and averaging the data; and (5) create various graphical representations of the data stored in the data-set-independent files.

  1. Network Visualization Project (NVP)

    DTIC Science & Technology

    2016-07-01

    network visualization, network traffic analysis, network forensics... Dshell is a command-line framework used for network forensic analysis. Dshell processes existing pcap files and filters output information based on

  2. Addendum I, BIOPLUME III Graphics Conversion to SURFER Format

    EPA Pesticide Factsheets

    This procedure can be used to create a SURFER® compatible grid file from Bioplume III input and output graphics. The input data and results from Bioplume III can be contoured and printed directly from SURFER.

  3. Overview of the ASA24® Researcher Website

    Cancer.gov

    The ASA24 Researcher Website allows researchers, clinicians, and teachers to register to use the ASA24 system for research, clinical practice, or teaching; to manage logistics of data collection; and to obtain analytic output files.

  4. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  5. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and transfer of output files to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from 2010 official productions are reported.

  6. VizieR Online Data Catalog: Habitable zone code (Valle+, 2014)

    NASA Astrophysics Data System (ADS)

    Valle, G.; Dell'Omodarme, M.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2014-06-01

    A C computation code that provides as output the distance dm (in AU) for which the duration of habitability is longest, the corresponding duration tm (in Gyr), the width W (in AU) of the zone for which the habitability lasts tm/2, and the inner (Ri) and outer (Ro) boundaries of the 4 Gyr continuously habitable zone. The code reads the input file HZ-input.dat, containing in each row the mass of the host star (range: 0.70-1.10 M⊙), its metallicity (either Z (range: 0.005-0.004) or [Fe/H]), the helium-to-metal enrichment ratio (range: 1-3, standard value = 2), the equilibrium temperature for habitable zone outer boundary computation (range: 169-203 K) and the planet Bond Albedo (range: 0.0-1.0, Earth = 0.3). The output is printed on-screen. Compilation: just use your favorite C compiler: gcc hz.c -lm -o HZ (2 data files).
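
    A small reader for the input format just described, with the stated ranges enforced; written in Python for illustration rather than the distributed C code.

        def read_hz_input(path="HZ-input.dat"):
            rows = []
            with open(path) as fh:
                for line in fh:
                    if not line.strip():
                        continue
                    # mass, metallicity (Z or [Fe/H]), Y/Z ratio, T_eq, albedo
                    mass, met, yz, teq, albedo = map(float, line.split())
                    assert 0.70 <= mass <= 1.10, "host-star mass out of range"
                    assert 0.0 <= albedo <= 1.0, "Bond albedo out of range"
                    rows.append((mass, met, yz, teq, albedo))
            return rows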

  7. Parallel file system with metadata distributed across partitioned key-value store

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
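
    The key idea, metadata for one shared file hashed across per-node partitions, fits in a few lines. This sketch keeps the partitions in local dictionaries, whereas an MDHIM deployment would spread them over compute nodes and exchange records via MPI.

        class PartitionedMetadataStore:
            def __init__(self, n_partitions, stripe=1 << 20):
                self.partitions = [dict() for _ in range(n_partitions)]
                self.stripe = stripe

            def _partition(self, logical_offset):
                # Map a sub-file's logical position to one metadata partition.
                return (logical_offset // self.stripe) % len(self.partitions)

            def put(self, logical_offset, record):
                p = self._partition(logical_offset)
                self.partitions[p][logical_offset] = record

            def get(self, logical_offset):
                p = self._partition(logical_offset)
                return self.partitions[p].get(logical_offset)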

  8. Inertial Manifolds for Navier-Stokes Equations and Related Dynamical Systems

    DTIC Science & Technology

    1991-05-31

    Graphics IRIS (SGI). The RLE files for the animation are loaded to an Abekas and recorded to tape by Betacam. This computational work was done by using the... scripts and comments, are loaded to the Abekas-A60 digital image storage device, and then recorded to the Betacam BVW-75 analog tape recorder. Static... interfacing, huge data files are output to the Data Vault in parallel with little cost. In addition to the SGIs, Abekas, Betacam and Solitaire, the

  9. User's manual for SYNC: A FORTRAN program for merging and time-synchronizing data

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

    The FORTRAN 77 computer program SYNC for merging and time synchronizing data is described. The program SYNC reads one or more input files which contain either synchronous data frames or time-tagged data points, which can be compressed. The program decompresses and time synchronizes the data, correcting for any channel time skews. Interpolation and hold last value synchronization algorithms are available. The output from SYNC is a file of time synchronized data frames at any requested sample rate.
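
    A sketch of the two synchronization algorithms named above (linear interpolation and hold-last-value), with an optional channel time-skew correction; NumPy stands in for the original FORTRAN 77, and the input times are assumed to be sorted.

        import numpy as np

        def synchronize(t_out, t_in, values, mode="interpolate", skew=0.0):
            t = np.asarray(t_in, dtype=float) - skew   # correct channel skew
            v = np.asarray(values, dtype=float)
            if mode == "interpolate":
                return np.interp(t_out, t, v)
            # Hold last value: most recent sample at or before each output time.
            idx = np.searchsorted(t, t_out, side="right") - 1
            return v[np.clip(idx, 0, len(v) - 1)]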

  10. A Prototype Model for Automating Nursing Diagnosis, Nurse Care Planning and Patient Classification.

    DTIC Science & Technology

    1986-03-01

    Each diagnosis has an assessment level. Assessment levels are defining characteristics observed by the nurse or subjectively stated by the patient... characteristics of this order line. Select IV Order [Figure 4.1.1.1e] is the first screen of a series of three. Select IV Order has up to 10 selections... patient orders. Input Files Used: IVC.Scr and Procfile.Prg. Output Files Used: None. Calling Routine: IUB.Prg. Routine Called: None

  11. Conversion of the Forces Mobilization Model (FORCEMOB) from FORTRAN to C

    DTIC Science & Technology

    2015-08-01

    ...the C version of FORCEMOB is ready for operational use... without a graphical user interface (GUI): once run, FORCEMOB reads user-created input files, performs mathematical operations upon them, and outputs text

  12. Federal Logistics Information System (FLIS) Procedures Manual, Volume 1, Change 1

    DTIC Science & Technology

    1996-07-01

    KAT Add FLIS Data Base Data, Security Classified Characteristics; KDZ Delete Logistics Transfer Data; KFA Match Through Association; KFC File... a Cancelled NSN/PSCN, Related Generic or Reference Number FSC... (2) the segment Z data pertains to an NSN... KEC Output Exceeds AUTODIN Limitations; KFA Match Through Association; KFC File Data Minus Security Classified Characteristics

  13. OASIS - ORBIT ANALYSIS AND SIMULATION SOFTWARE

    NASA Technical Reports Server (NTRS)

    Wu, S. C.

    1994-01-01

    The Orbit Analysis and Simulation Software, OASIS, is a software system developed for covariance and simulation analyses of problems involving earth satellites, especially the Global Positioning System (GPS). It provides a flexible, versatile and efficient accuracy analysis tool for earth satellite navigation and GPS-based geodetic studies. To make future modifications and enhancements easy, the system is modular, with five major modules: PATH/VARY, REGRES, PMOD, FILTER/SMOOTHER, and OUTPUT PROCESSOR. PATH/VARY generates satellite trajectories. Among the factors taken into consideration are: 1) the gravitational effects of the planets, moon and sun; 2) space vehicle orientation and shapes; 3) solar pressure; 4) solar radiation reflected from the surface of the earth; 5) atmospheric drag; and 6) space vehicle gas leaks. The REGRES module reads the user's input, then determines if a measurement should be made based on geometry and time. PMOD modifies a previously generated REGRES file to facilitate various analysis needs. FILTER/SMOOTHER is especially suited to a multi-satellite precise orbit determination and geodetic-type problems. It can be used for any situation where parameters are simultaneously estimated from measurements and a priori information. Examples of nonspacecraft areas of potential application might be Very Long Baseline Interferometry (VLBI) geodesy and radio source catalogue studies. OUTPUT PROCESSOR translates covariance analysis results generated by FILTER/SMOOTHER into user-desired easy-to-read quantities, performs mapping of orbit covariances and simulated solutions, transforms results into different coordinate systems, and computes post-fit residuals. The OASIS program was developed in 1986. It is designed to be implemented on a DEC VAX 11/780 computer using VAX VMS 3.7 or higher. It can also be implemented on a Micro VAX II provided sufficient disk space is available.

  14. 77 FR 9224 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... following exempt wholesale generator filings: Docket Numbers: EG12-31-000. Applicants: Quantum Choctaw Power, LLC. Description: Quantum Choctaw Power Notice of Self-certification of Exempt Wholesale Generator...

  15. GMI-IPS: Python Processing Software for Aircraft Campaigns

    NASA Technical Reports Server (NTRS)

    Damon, M. R.; Strode, S. A.; Steenrod, S. D.; Prather, M. J.

    2018-01-01

    NASA's Atmospheric Tomography Mission (ATom) seeks to understand the impact of anthropogenic air pollution on gases in the Earth's atmosphere. Four flight campaigns are being deployed on a seasonal basis to establish a continuous global-scale data set intended to improve the representation of chemically reactive gases in global atmospheric chemistry models. The Global Modeling Initiative (GMI) is creating chemical transport simulations on a global scale for each of the ATom flight campaigns. To meet the computational demands required to translate the GMI simulation data to grids associated with the flights from the ATom campaigns, the GMI ICARTT Processing Software (GMI-IPS) has been developed and provides key functionality for data processing and analysis in this ongoing effort. The GMI-IPS is written in Python and provides computational kernels for data interpolation and visualization tasks on GMI simulation data. A key feature of the GMI-IPS is its ability to read ICARTT files, a text-based file format for airborne instrument data, and extract the required flight information that defines regional and temporal grid parameters associated with an ATom flight. Perhaps most importantly, the GMI-IPS creates ICARTT files containing GMI simulated data, which are used in collaboration with ATom instrument teams and other modeling groups. The initial main task of the GMI-IPS is to interpolate GMI model data to the finer temporal resolution (1-10 seconds) of a given flight. The model data includes basic fields such as temperature and pressure, but the main focus of this effort is to provide species concentrations of chemical gases for ATom flights. The software, which uses parallel computation techniques for data-intensive tasks, linearly interpolates each of the model fields to the time resolution of the flight. The temporally interpolated data is then saved to disk and is used to create additional derived quantities. In order to translate the GMI model data to the spatial grid of the flight path as defined by the pressure, latitude, and longitude points at each flight time record, a weighted average is then calculated from the nearest neighbors in two dimensions (latitude, longitude). Using SciPy's RegularGridInterpolator, interpolation functions are generated for the GMI model grid and the calculated weighted averages. The flight path points are then extracted from the ATom ICARTT instrument file and are sent to the multi-dimensional interpolating functions to generate GMI field quantities along the spatial path of the flight. The interpolated field quantities are then written to an ICARTT data file, which is stored for further manipulation. The GMI-IPS is aware of a generic ATom ICARTT header format, containing basic information for all flight campaigns. The GMI-IPS includes logic to edit metadata for the derived field quantities, as well as to modify the generic header data such as processing dates and associated instrument files. The ICARTT interpolated data is then appended to the modified header data, and the ICARTT processing is complete for the given flight and ready for collaboration. The output ICARTT data adheres to the ICARTT file format standards V1.1. The visualization component of the GMI-IPS uses Matplotlib extensively and has several functions ranging in complexity. First, it creates a model background curtain for the flight (time versus model eta levels) with the interpolated flight data superimposed on the curtain. Secondly, it creates a time-series plot of the interpolated flight data. Lastly, the visualization component creates averaged 2D model slices (longitude versus latitude) with overlaid flight track circles at key pressure levels. The GMI-IPS consists of a handful of classes and supporting functionality that have been generalized to be compatible with any ICARTT file that adheres to the base class definition. The base class represents a generic ICARTT entry, defining only a single time entry and 3D spatial positioning parameters. Other classes inherit from this base class: several classes for input ICARTT instrument files, which contain the necessary flight positioning information as a basis for data processing, as well as other classes for output ICARTT files, which contain the interpolated model data. Utility classes provide functionality for routine procedures such as comparing field names among ICARTT files, reading ICARTT entries from a data file and storing them in data structures, and returning a reduced spatial grid based on a collection of ICARTT entries. Although the GMI-IPS is compatible with GMI model data, it can be adapted with reasonable effort for any simulation that creates Hierarchical Data Format (HDF) files. The same can be said of its adaptability to ICARTT files outside of the context of the ATom mission. The GMI-IPS contains just under 30,000 lines of code, eight classes, and a dozen drivers and utility programs. It is maintained with Git source code management and has been used to deliver processed GMI model data for the ATom campaigns that have taken place to date.
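
    A stripped-down sketch of the RegularGridInterpolator step described above, for the 2-D (latitude, longitude) case only; the real pipeline also handles time and pressure and writes ICARTT output.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        def sample_along_track(lats, lons, field, track_lat, track_lon):
            # Build an interpolating function on the model's regular grid and
            # evaluate it at each flight-track point (NaN outside the grid).
            f = RegularGridInterpolator((lats, lons), field,
                                        bounds_error=False, fill_value=np.nan)
            return f(np.column_stack([track_lat, track_lon]))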

  16. In Situ Surface Characterization

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.; Leger, Patrick C.; Yanovsky, Igor

    2011-01-01

    Operation of in situ space assets, such as rovers and landers, requires operators to acquire a thorough understanding of the environment surrounding the spacecraft. The following programs help with that understanding by providing higher-level information characterizing the surface, which is not immediately obvious by just looking at the XYZ terrain data. This software suite covers three primary programs: marsuvw, marsrough, and marsslope, and two secondary programs, which together use XYZ data derived from in situ stereo imagery to characterize the surface by determining surface normal, surface roughness, and various aspects of local slope, respectively. These programs all use the Planetary Image Geometry (PIG) library to read mission-specific data files. The programs themselves are completely multimission; all mission dependencies are handled by PIG. The input data consists of images containing XYZ locations as derived by, e.g., marsxyz. The marsuvw program determines surface normals from XYZ data by gathering XYZ points from an area around each pixel and fitting a plane to those points. Outliers are rejected, and various consistency checks are applied. The result shows the orientation of the local surface at each point as a unit vector. The program can be run in two modes: standard, which is typically used for in situ arm work, and slope, which is typically used for rover mobility. The difference is primarily due to optimizations necessary for the larger patch sizes in the slope case. The marsrough program determines surface roughness in a small area around each pixel, which is defined as the maximum peak-to-peak deviation from the plane perpendicular to the surface normal at that pixel. The marsslope program takes a surface normal file as input and derives one of several slope-like outputs from it. The outputs include slope, slope rover direction (a measure of slope radially away from the rover), slope heading, slope magnitude, northerly tilt, and solar energy (compares the slope with the Sun's location at local noon). The marsuvwproj program projects a surface normal onto an arbitrary plane in space, resulting in a normalized 3D vector, which is constrained to lie in the plane. The marsuvwrot program rotates the vectors in a surface normal file, generating a new surface normal file. It also can change coordinate systems for an existing surface normal file. While the algorithms behind this suite are not particularly unique, what makes the programs useful is their integration into the larger in situ image processing system via the PIG library. They work directly with space in situ data, understanding the appropriate image metadata fields and updating them properly. The secondary programs (marsuvwproj, marsuvwrot) were originally developed to deal with anomalous situations on Opportunity and Spirit, respectively, but may have more general applicability.
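
    The plane-fitting core of marsuvw reduces to a small SVD problem; this is a sketch of that step only, without the outlier rejection and consistency checks the real program applies.

        import numpy as np

        def surface_normal(xyz_points):
            # xyz_points: (N, 3) array of XYZ samples gathered around a pixel.
            pts = np.asarray(xyz_points, dtype=float)
            centered = pts - pts.mean(axis=0)
            # The right singular vector with the smallest singular value is
            # normal to the least-squares plane through the centroid.
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            n = vt[-1]
            return n / np.linalg.norm(n)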

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamlet, Jason R.; Mayo, Jackson R.

    Embodiments of the invention describe a Boolean circuit having a voter circuit and a plurality of approximate circuits each based, at least in part, on a reference circuit. The approximate circuits are each to generate one or more output signals based on values of received input signals. The voter circuit is to receive the one or more output signals generated by each of the approximate circuits, and is to output one or more signals corresponding to a majority value of the received signals. At least some of the approximate circuits are to generate an output value different than the reference circuit for one or more input signal values; however, for each possible input signal value, the majority values of the one or more output signals generated by the approximate circuits and received by the voter circuit correspond to output signal result values of the reference circuit.
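
    For one-bit outputs the voter is just a majority function; a sketch with three approximate replicas, as in a triple-modular arrangement (the patent allows other replica counts and multi-bit signals):

        def majority_vote(bits):
            # Output the value held by the majority of the approximate circuits.
            return 1 if 2 * sum(bits) > len(bits) else 0

        # One replica may diverge from the reference on some input; the
        # majority still reproduces the reference circuit's output value.
        assert majority_vote([1, 1, 0]) == 1
        assert majority_vote([0, 1, 0]) == 0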

  18. iTOUGH2 Universal Optimization Using the PEST Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2's capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that, for the application model: (1) input is provided on one or more ASCII text input files; (2) output is returned to one or more ASCII text output files; (3) the model is run using a system command (executable or script/batch file); and (4) the model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
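
    The four-point protocol above maps naturally onto a small driver loop. The sketch below assumes hypothetical file names (model.in.tpl, model.in, model.out), an @name@ placeholder syntax, and a ./model.exe executable; none of these are iTOUGH2-PEST's actual conventions, it merely illustrates one forward run of a file-coupled 'black-box' model:

        import subprocess
        from pathlib import Path

        def forward_run(params, workdir="."):
            """One forward run of a file-coupled 'black-box' model:
            write parameters into an ASCII input file, run the model as a
            system command, then read observations from its text output."""
            template = Path(workdir, "model.in.tpl").read_text()
            for name, value in params.items():
                # Replace each @name@ placeholder with the current value.
                template = template.replace(f"@{name}@", f"{value:.6e}")
            Path(workdir, "model.in").write_text(template)

            # The model must run to completion without user intervention.
            subprocess.run(["./model.exe", "model.in"], cwd=workdir, check=True)

            # Extract the selected output variables, one number per token.
            return [float(tok) for tok in Path(workdir, "model.out").read_text().split()]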

  19. MEPSA: minimum energy pathway analysis for energy landscapes.

    PubMed

    Marcos-Alcalde, Iñigo; Setoain, Javier; Mendieta-Moreno, Jesús I; Mendieta, Jesús; Gómez-Puertas, Paulino

    2015-12-01

    From conformational studies to atomistic descriptions of enzymatic reactions, potential and free energy landscapes can be used to describe biomolecular systems in detail. However, extracting the relevant data from complex 3D energy surfaces can sometimes be laborious. In this article, we present MEPSA (Minimum Energy Path Surface Analysis), a cross-platform, user-friendly tool for the analysis of energy landscapes from a transition state theory perspective. Some of its most relevant features are: identification of all the barriers and minima of the landscape at once, description of maxima edge profiles, detection of the lowest energy path connecting two minima, and generation of transition state theory diagrams along these paths. In addition to a built-in plotting system, MEPSA can save most of the generated data into easily parseable text files, allowing more versatile uses of MEPSA's output, such as the generation of molecular dynamics restraints from a calculated path. MEPSA is freely available (under the GPLv3 license) at http://bioweb.cbm.uam.es/software/MEPSA/. Contact: pagomez@cbm.csic.es. Supplementary data are available at Bioinformatics online.
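
    Finding the lowest energy path between two minima can be cast as a minimax search: among all paths, pick the one whose highest energy point is lowest. The sketch below, a generic Dijkstra-style variant on a 2D grid assuming NumPy, illustrates the idea; it is not MEPSA's actual algorithm:

        import heapq
        import numpy as np

        def lowest_barrier_path(energy, start, goal):
            """Grid path from start to goal minimizing the highest energy
            crossed (a minimax variant of Dijkstra's algorithm).

            energy: 2-D array; start, goal: (row, col) tuples."""
            rows, cols = energy.shape
            best = np.full((rows, cols), np.inf)
            best[start] = energy[start]
            prev = {}
            heap = [(energy[start], start)]
            while heap:
                barrier, (r, c) = heapq.heappop(heap)
                if (r, c) == goal:
                    break
                if barrier > best[r, c]:
                    continue  # stale heap entry
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        cand = max(barrier, energy[nr, nc])
                        if cand < best[nr, nc]:
                            best[nr, nc] = cand
                            prev[(nr, nc)] = (r, c)
                            heapq.heappush(heap, (cand, (nr, nc)))
            path, node = [goal], goal
            while node != start:
                node = prev[node]
                path.append(node)
            return path[::-1]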

  20. 300 mW of coherent light at 488 nm using a generic approach

    NASA Astrophysics Data System (ADS)

    Karamehmedović, Emir; Pedersen, Christian; Andersen, Martin T.; Tidemand-Lichtenberg, Peter

    2008-02-01

    We present a generic approach for efficient generation of CW light with a predetermined wavelength within the visible or UV spectrum. Based on sum-frequency generation (SFG), the circulating intra-cavity field of a high-finesse diode pumped CW solid-state laser (DPSSL) and the output from a tapered, single-frequency external cavity diode laser (ECDL) are mixed inside a 10 mm periodically poled KTP crystal (pp-KTP). The pp-KTP is situated inside the DPSSL cavity to enhance the conversion efficiency of the nonlinear mixing process. This approach combines different solid-state technologies: the tuneability of ECDLs, the high intra-cavity field of DPSSLs, and flexible quasi-phase matching in periodically poled crystals. Here, a tapered ECDL with a center wavelength of 766 nm is used in combination with a high-finesse Nd:YVO4 laser at 1342 nm. Up to 308 mW of light at 488 nm was measured in our experiments. The conversion of the ECDL beam was up to 47% after it was transmitted through a PM fiber, and up to 32% without fiber coupling. Replacing the seed laser and the nonlinear crystal makes it possible to generate light at virtually any desired wavelength within the visible spectrum.
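
    Since energy conservation in sum-frequency generation adds the optical frequencies, the output wavelength follows from 1/lambda_SFG = 1/lambda_1 + 1/lambda_2. A two-line check (a sketch, not from the paper) reproduces the reported output line from the two pump wavelengths:

        def sfg_wavelength(lambda1_nm, lambda2_nm):
            """Sum-frequency output wavelength: 1/l3 = 1/l1 + 1/l2."""
            return 1.0 / (1.0 / lambda1_nm + 1.0 / lambda2_nm)

        print(sfg_wavelength(766, 1342))  # ~487.7 nm, the 488 nm line above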

  1. Software Template for Instruction in Mathematics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Moebes, Travis A.; Beall, Anna

    2005-01-01

    Intelligent Math Tutor (IMT) is a software system that serves as a template for creating software for teaching mathematics. IMT can be easily connected to artificial-intelligence software and other analysis software through input and output of files. IMT provides an easy-to-use interface for generating courses that include tests containing both multiple-choice and fill-in-the-blank questions, and enables tracking of test scores. IMT makes it easy to generate software for Web-based courses or to manufacture compact disks containing executable course software. IMT can also function as a Web-based application program, with features that run quickly on the Web, while retaining the intelligence of a high-level language application program with many graphics. IMT can be used to write application programs in text, graphics, and/or sound, so that the programs can be tailored to the needs of most handicapped persons. The course software generated by IMT follows a "back to basics" approach of teaching mathematics by inducing the student to apply creative mathematical techniques in the process of learning. Students thereby discover mathematical fundamentals and come to understand mathematics more deeply than they could through simple memorization.

  2. 76 FR 75540 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-02

    .... Applicants: Choctaw Gas Generation, LLC, Quantum Choctaw Power, LLC. Description: Application For... For Expedited Action of Choctaw Gas Generation, LLC and Quantum Choctaw Power, LLC. Filed Date: 11/22...

  3. File Formats Commonly Used in Mass Spectrometry Proteomics*

    PubMed Central

    Deutsch, Eric W.

    2012-01-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instructions for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics. PMID:22956731

  4. SWITCH user's manual

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The planning program, SWITCH, and its surrounding changed-goal-replanning program, Runaround, are described. The evolution of SWITCH and Runaround from an earlier planner, DEVISER, is recounted. SWITCH's plan representation, and its process of building a plan by backward chaining with strict chronological backtracking, are described. A guide for writing knowledge base files is provided, as are narrative guides for installing the program, running it, and interacting with it while it is running. Some utility functions are documented. For the sake of completeness, a narrative guide to the experimental discrepancy-replanning feature is provided. Appendices contain knowledge base files for a blocksworld domain, and a DRIBBLE file illustrating the output from, and user interaction with, the program in that domain.
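
    As a sketch of the control strategy described above, backward chaining with strict chronological backtracking, the following toy planner illustrates the idea. It is illustrative only: the operators are invented blocksworld examples, effects are add-only, and SWITCH's actual plan representation is far richer:

        def plan(goals, state, operators, depth=20):
            """Backward chaining with strict chronological backtracking.

            Take the first open goal, try each operator that achieves it in
            order, recurse on the operator's preconditions, and on failure
            fall back to the next alternative.
            Returns (action_list, resulting_state) or None."""
            if depth == 0:
                return None
            open_goals = [g for g in goals if g not in state]
            if not open_goals:
                return [], state
            goal = open_goals[0]
            for name, pre, eff in operators:
                if goal in eff:
                    sub = plan(pre, state, operators, depth - 1)
                    if sub is None:
                        continue  # chronological backtracking
                    actions, mid_state = sub
                    rest = plan(goals, mid_state | eff, operators, depth - 1)
                    if rest is not None:
                        more, final_state = rest
                        return actions + [name] + more, final_state
            return None

        # Toy blocksworld-flavored operators: (name, preconditions, effects).
        ops = [
            ("pickup(A)", frozenset({"clear(A)", "handempty"}), frozenset({"holding(A)"})),
            ("stack(A,B)", frozenset({"holding(A)", "clear(B)"}), frozenset({"on(A,B)"})),
        ]
        actions, _ = plan(frozenset({"on(A,B)"}),
                          frozenset({"clear(A)", "clear(B)", "handempty"}), ops)
        print(actions)  # ['pickup(A)', 'stack(A,B)']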

  5. Distributive On-line Processing, Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Berrick, S.; Liu, Z.; Pham, L.; Rui, H.; Shen, S.; Teng, W.; Zhu, T.

    2004-01-01

    The ability to use data stored in the current Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions, for example, when preparing data for input into modeling systems. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data: the GES-DISC Interactive Online Visualization and Analysis Infrastructure, or "Giovanni." Giovanni provides interactive, online analysis tools for data users to facilitate their research. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and ocean color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates output on screen in a matter of seconds. The currently available output options are: area plots, averaged or accumulated over any available data period for any rectangular area; time plots, i.e., time series averaged over any rectangular area; image views of any longitude-time and latitude-time cross sections; ASCII output for all plot types; and image animation for area plots. In the future, we will add correlation plots, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development, and processing tasks that are otherwise redundantly incurred across an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), a stable, secure data server that provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. In our case, we use this approach to read pre-processed binary files and/or to read and extract the needed parts from HDF or HDF-EOS files. These subsets then serve as inputs to GrADS processing and analysis scripts. The system can be used in a wide variety of Earth science applications, such as climate and weather event study and monitoring, and modeling, and can be easily configured for new applications.
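
    An area-averaged time series of the kind Giovanni returns reduces to a weighted spatial mean over a latitude-longitude box at each time step. The sketch below is illustrative only; the array layout, names, and cos-latitude weighting are assumptions, not Giovanni's implementation:

        import numpy as np

        def area_time_series(data, lats, lons, lat_bounds, lon_bounds):
            """Spatial mean over a lat/lon box at each time step.

            data: array shaped (time, lat, lon); lats, lons: 1-D coordinates.
            Cells are weighted by cos(latitude) so equal-angle grid cells
            contribute according to their true area."""
            lat_mask = (lats >= lat_bounds[0]) & (lats <= lat_bounds[1])
            lon_mask = (lons >= lon_bounds[0]) & (lons <= lon_bounds[1])
            box = data[:, lat_mask][:, :, lon_mask]
            w = np.cos(np.radians(lats[lat_mask]))[None, :, None]
            return (box * w).sum(axis=(1, 2)) / (w.sum() * box.shape[2])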

  6. High-Speed Numeric Function Generator Using Piecewise Quadratic Approximations

    DTIC Science & Technology

    2007-09-01

    [Abstract not available in this record; the excerpt consists of fragments of the accompanying software: MATLAB comment lines indicating that the user specifies the function to approximate and that the program converts it into an inline function, followed by a build-configuration template listing PRIMARY, SECONDARY, and CHIP2 file entries.]

  7. Generation of OH Radical by Ultrasonic Irradiation in Batch and Circulatory Reactor

    NASA Astrophysics Data System (ADS)

    Fang, Yu; Shimizu, Sayaka; Yamamoto, Takuya; Komarov, Sergey

    2018-03-01

    Ultrasonic technology has been widely investigated in the past as one of the advanced oxidation processes for treating wastewater. In this process, acoustic cavitation causes the generation of OH radicals, which play a vital role in improving treatment efficiency. In this study, the OH radical formation rate was measured in batch and circulatory reactors using the Weissler reaction at various ultrasound output powers. It is found that the generation rate in the batch reactor is higher than that in the circulatory reactor at the same output power. The generation rate tended to be slower when the output power exceeded 137 W. The optimum condition for the circulatory reactor was found to be 137 W output power and a 4 L/min flow rate. Results of an aluminum foil erosion test revealed a strong dependence of the cavitation zone length on the ultrasound output power. This is assumed to be one of the reasons why the generation rate of OH radicals becomes slower at higher output power in the circulatory reactor.

  8. SU-F-T-367: Using PRIMO, a PENELOPE-Based Software, to Improve the Small Field Dosimetry of Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benmakhlouf, H; Andreo, P; Brualla, L

    2016-06-15

    Purpose: To calculate output correction factors for Varian Clinac 2100iX beams for seven small field detectors and use the values to determine the small field output factors for the linacs at Karolinska University Hospital. Methods: Phase space files (psf) for square fields between 0.25 cm and 10 cm were calculated using the PENELOPE-based PRIMO software. The linac MC model was tuned by comparing PRIMO-estimated and experimentally determined depth doses and lateral dose profiles for 40 cm x 40 cm fields. The calculated psf were used as radiation sources to calculate the correction factors of IBA and PTW detectors with the code penEasy/PENELOPE. Results: The optimal tuning parameters of the MC linac model in PRIMO were 5.4 MeV incident electron energy and zero energy spread, focal spot size, and beam divergence. Correction factors obtained for the liquid ion chamber (PTW-T31018) are within 1% down to 0.5 cm fields. For unshielded diodes (IBA-EFD, IBA-SFD, PTW-T60017, and PTW-T60018) the corrections are up to 2% at intermediate fields (>1 cm side), falling to -11% for fields smaller than 1 cm. The shielded diode (IBA-PFD and PTW-T60016) corrections vary with field size from 0 to -4%. Volume-averaging effects are found for most detectors in the presence of 0.25 cm fields. Conclusion: Good agreement was found between correction factors based on PRIMO-generated psf and those from other publications. The calculated factors will be implemented in output factor measurements (using several detectors) in the clinic. PRIMO is a user-friendly general-purpose code capable of generating small field psf and can be used without having to code one's own linac geometries. It can therefore be used to improve clinical dosimetry, especially in the commissioning of linear accelerators. Important dosimetry data, such as dose profiles and output factors, can be determined more accurately for a specific machine, geometry, and setup by using PRIMO together with an MC model of the detector used.
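
    For context (stated here from the standard small-field formalism of Alfonso et al., since the abstract itself does not reproduce it), such detector-specific corrections enter the field output factor by relating the ratio of detector readings M in the clinical field f_clin and the machine-specific reference field f_msr to the dose ratio:

        \Omega_{Q_{\mathrm{clin}},Q_{\mathrm{msr}}}^{f_{\mathrm{clin}},f_{\mathrm{msr}}}
          = \frac{D_{w,Q_{\mathrm{clin}}}^{f_{\mathrm{clin}}}}{D_{w,Q_{\mathrm{msr}}}^{f_{\mathrm{msr}}}}
          = \frac{M_{Q_{\mathrm{clin}}}^{f_{\mathrm{clin}}}}{M_{Q_{\mathrm{msr}}}^{f_{\mathrm{msr}}}}
            \; k_{Q_{\mathrm{clin}},Q_{\mathrm{msr}}}^{f_{\mathrm{clin}},f_{\mathrm{msr}}}

    where k is the output correction factor of the kind computed here from the phase-space simulations.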

  9. Distributed Data Collection for the ATLAS EventIndex

    NASA Astrophysics Data System (ADS)

    Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.

    2015-12-01

    The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
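
    A compact in-process analogue of the Producer/Consumer pattern described above, using Python's standard library in place of ActiveMQ and Hadoop; all names and record fields are illustrative, not the EventIndex's actual schema:

        import json
        import queue
        import threading

        broker = queue.Queue()  # in-process stand-in for the message broker

        def producer(job_id, events):
            """Pack one snippet of event records per output file and send it."""
            snippet = [{"guid": f"FILE-{job_id}", "event": e, "pointer": i}
                       for i, e in enumerate(events)]
            broker.put(json.dumps(snippet))

        def consumer(n_messages, catalogue):
            """Unpack, sort, and reformat records for storage."""
            for _ in range(n_messages):
                records = json.loads(broker.get())
                catalogue.extend(sorted(records, key=lambda r: r["event"]))

        catalogue = []
        jobs = [threading.Thread(target=producer, args=(j, [3, 1, 2]))
                for j in range(2)]
        for t in jobs:
            t.start()
        for t in jobs:
            t.join()
        consumer(len(jobs), catalogue)
        print(len(catalogue))  # 6 catalogued event records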

  10. 78 FR 5508 - Vogtle Electric Generating Plant, Units 3 and 4; Application and Amendment to Combined Licenses...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... before the issuance of any amendment. IV. Electronic Submissions (E-Filing) All documents filed in NRC... be filed in accordance with the NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing... the procedural requirements of E-Filing, at least ten (10) days prior to the filing deadline, the...

  11. The GPRIME approach to finite element modeling

    NASA Technical Reports Server (NTRS)

    Wallace, D. R.; Mckee, J. H.; Hurwitz, M. M.

    1983-01-01

    GPRIME, an interactive modeling system, runs on the CDC 6000 computers and the DEC VAX 11/780 minicomputer. This system includes three components: (1) GPRIME, a user-friendly geometric language and a processor to translate that language into geometric entities; (2) GGEN, an interactive data generator for 2-D models; and (3) SOLIDGEN, a 3-D solid modeling program. Each component has a computer user interface with an extensive command set. All of these programs make use of a comprehensive B-spline mathematics subroutine library, which can be used for a wide variety of interpolation problems and other geometric calculations. Many other user aids, such as automatic saving of the geometric and finite element data bases and hidden line removal, are available. This interactive finite element modeling capability can produce a complete finite element model, producing an output file of grid and element data.
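
    The workhorse of such a B-spline library is an evaluation routine. The textbook de Boor recursion below (a generic sketch in Python, not GPRIME's FORTRAN code) evaluates a degree-p spline from its knot vector and control points:

        def deboor(k, x, t, c, p):
            """Evaluate a B-spline at x using de Boor's algorithm.

            k: knot span index with t[k] <= x < t[k+1]; t: knot vector;
            c: control points; p: spline degree."""
            d = [c[j + k - p] for j in range(p + 1)]
            for r in range(1, p + 1):
                for j in range(p, r - 1, -1):
                    alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
                    d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
            return d[p]

        # Quadratic spline (p=2) on a clamped knot vector, evaluated at x=1.5
        # (knot span k=3, since t[3] <= 1.5 < t[4]).
        t = [0, 0, 0, 1, 2, 3, 3, 3]
        c = [0.0, 1.0, 3.0, 2.0, 0.0]
        print(deboor(3, 1.5, t, c, 2))  # 2.625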

  12. Slave finite element for non-linear analysis of engine structures. Volume 2: Programmer's manual and user's manual

    NASA Technical Reports Server (NTRS)

    Witkop, D. L.; Dale, B. J.; Gellin, S.

    1991-01-01

    The programming aspects of SFENES are described in the User's Manual. The information presented is provided for the installation programmer and is sufficient to fully describe the general program logic and required peripheral storage. All element-generated data are stored externally to reduce required memory allocation. A separate section is devoted to the description of these files, thereby permitting the optimization of Input/Output (I/O) time through efficient buffer descriptions. Individual subroutine descriptions are presented along with the complete Fortran source listings. A short description of the major control, computation, and I/O phases is included to aid in obtaining an overall familiarity with the program's components. Finally, a discussion of the suggested overlay structure, which allows the program to execute with a reasonable amount of memory allocation, is presented.

  13. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tools (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, time series, and color-fill images.
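
    The three statistics named above have standard definitions; the package's exact conventions are not given in this record, so the NumPy sketch below simply uses the common ones (mean absolute error, mean relative error, root mean square error):

        import numpy as np

        def error_stats(obs, sat):
            """Mean absolute error, mean relative error, and RMSE of
            satellite values against matched observations."""
            obs = np.asarray(obs, dtype=float)
            sat = np.asarray(sat, dtype=float)
            diff = sat - obs
            return {
                "absolute_error": np.mean(np.abs(diff)),
                "relative_error": np.mean(np.abs(diff) / np.abs(obs)),
                "rmse": np.sqrt(np.mean(diff ** 2)),
            }

        print(error_stats([2.0, 4.0, 5.0], [2.5, 3.5, 5.5]))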

  14. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui

    2016-11-02

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students who are more familiar with the APBS framework.
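
    For reference (standard Debye-Hueckel theory, not quoted from the paper), the linearized Poisson-Boltzmann equation that PB-AM solves takes the following form in the solvent region:

        \nabla^2 \phi(\mathbf{r}) = \kappa^2 \phi(\mathbf{r})

    where \phi is the electrostatic potential and \kappa is the inverse Debye screening length; inside each molecular cavity the potential instead satisfies Poisson's equation for the fixed partial charges, with continuity conditions at the dielectric boundary.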

  15. Calculation Package for the Analysis of Performance of Cells 1-6, with Underdrain, of the Environmental Management Waste Management Facility Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzales D.

    2010-03-30

    This calculation package presents the results of an assessment of the performance of the six-cell design of the Environmental Management Waste Management Facility (EMWMF). The calculations show that the new cell 6 design at the EMWMF meets the current WAC requirement. QA/QC steps were taken to verify the input/output data for the risk model and the data transfer from modeling output files to tables and calculations.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  17. AFT Program Description Navigation/Strike Tasks. Phase II,

    DTIC Science & Technology

    1972-09-01

    [Abstract not available in this record; the excerpt consists of fragments of the report's table of contents and symbol glossary: Data Input/Output subroutines (PMSG, LPMSG); keyboard-prompt messages such as "INPUT STUDENT FILE DATA", "CRASH", and "DEPRESS THE RESET-TO-ZERO CONSOLE BUTTON"; and flight variables including GOFLAG (exercise start flag), PAD (roll rate, degrees/second), PHIS (bank angle, degrees), PSI (F-4 heading, degrees), PSIAFT (desired AFT heading), and RCIS (average rate-of-climb).]

  18. Generation new MP3 data set after compression

    NASA Astrophysics Data System (ADS)

    Atoum, Mohammed Salem; Almahameed, Mohammad

    2016-02-01

    The success of audio steganography techniques depends on ensuring the imperceptibility of the embedded secret message in the stego file and on withstanding any intentional or unintentional degradation of the secret message (robustness). Crucial to this is the use of a digital audio format such as MP3, which comes in different compression rates; research studies have shown that performing steganography in the MP3 format after compression is the most suitable approach. Unfortunately, until now researchers could not test and implement their algorithms because no standard data set of MP3 files after compression had been generated. This paper therefore focuses on generating a standard data set with different compression ratios and different genres to help researchers implement their algorithms.
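
    One plausible way to produce such a data set is to re-encode each source file at several MP3 bitrates. The sketch below is written under assumptions (ffmpeg with the libmp3lame encoder, a corpus/ directory of WAV sources, and an arbitrary bitrate list); the paper does not specify its tooling:

        import subprocess
        from pathlib import Path

        BITRATES = ["64k", "128k", "192k", "320k"]  # hypothetical choices

        for wav in Path("corpus").glob("*.wav"):
            for rate in BITRATES:
                out = Path("dataset") / f"{wav.stem}_{rate}.mp3"
                out.parent.mkdir(parents=True, exist_ok=True)
                subprocess.run(
                    ["ffmpeg", "-y", "-i", str(wav),
                     "-codec:a", "libmp3lame", "-b:a", rate, str(out)],
                    check=True)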

  19. 78 FR 40469 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... Docket Numbers: ER10-3300-004. Applicants: La Paloma Generating Company, LLC. Description: Triennial Updated Market Power Analysis for the Southwest Region of La Paloma Generating Company, LLC. Filed Date: 6...

  20. 75 FR 23752 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-04

    ... Time on Monday, May 17, 2010. Docket Numbers: ER10-662-001. Applicants: CER Generation, LLC. Description: Amendment to Application of CER Generation, LLC. Filed Date: 04/26/2010. Accession Numbers...
