Sample records for single input file

  1. Life and dynamic capacity modeling for aircraft transmissions

    NASA Technical Reports Server (NTRS)

    Savage, Michael

    1991-01-01

    A computer program to simulate the dynamic capacity and life of parallel shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capability, ninety percent reliability, and mean life of each component and of the transmission as a system. Here, the program, its input and output files, and the theory behind the operation of the program are described.
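
    The prefix-naming convention described above can be pictured with a short sketch; this is a hypothetical Python illustration (the report does not give the actual file suffixes, so ".inp" and ".out" are assumptions):

      def run_from_prefix(prefix):
          # Read the ASCII input file and write results to a second
          # ASCII file sharing the same prefix; suffixes are assumed.
          with open(prefix + ".inp") as fin:
              records = fin.read().splitlines()
          with open(prefix + ".out", "w") as fout:
              fout.write("Transmission analysis for prefix %s\n" % prefix)
              fout.write("%d input records read\n" % len(records))

      if __name__ == "__main__":
          run_from_prefix(input("Data file prefix name: "))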

  2. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.

  3. Conjoin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Gregory

    2010-08-06

    Conjoin is a code for joining, sequentially in time, multiple exodusII database files. It is used to create a single results or restart file from multiple results or restart files, which typically arise as the result of multiple restarted analyses. The resulting output file will be the union of the input files, with a status variable indicating the status of each element at the various time planes. Typical applications are combining multiple exodusII files arising from a restarted analysis, or combining multiple exodusII files arising from a finite element analysis with dynamic topology changes.

  4. DOE-2 sample run book: Version 2.1E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.

    1993-11-01

    The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single-family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.
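
    The ##include command mentioned above behaves like a textual preprocessor directive: stored ECONOMICS fragments, for example, could be pulled into many inputs by name. A minimal Python sketch of that idea (the real DOE-2 Input Macro syntax is richer than what is shown):

      def expand_includes(path):
          # Recursively splice in files named by ##include directives,
          # mimicking the DOE-2 Input Macro feature in spirit only.
          lines = []
          with open(path) as f:
              for line in f:
                  stripped = line.strip()
                  if stripped.startswith("##include"):
                      lines.extend(expand_includes(stripped.split(None, 1)[1]))
                  else:
                      lines.append(line.rstrip("\n"))
          return lines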

  5. PROP3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic (Flutter and Forced Response) Analysis of Propellers. Version 1.0

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1996-01-01

    This guide describes the input data required for steady or unsteady aerodynamic and aeroelastic analysis of propellers using PROP3D, and the output files generated. The aerodynamic forces are obtained by solving the three-dimensional unsteady, compressible Euler equations. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either a time-domain or a frequency-domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis of single- and counter-rotation propellers, and for aeroelastic analysis of a single-rotation propeller.

  6. Program to convert SUDS2ASC files to a single binary SEGY file

    USGS Publications Warehouse

    Goldman, Mark

    2000-01-01

    This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60 into a single SEGY file. SUDS2ASC has been used previously to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading in ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program - SUDS2SEGY - combines these recording station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can be subsequently input to a plotting package.
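
    The normalization step (every trace starts at a given time and holds a fixed number of samples) amounts to shifting, trimming, and zero-padding; a schematic numpy version, with all parameter names invented for illustration:

      import numpy as np

      def normalize_trace(samples, t0, dt, start_time, n_out):
          # Shift the trace so it begins at start_time, then trim or
          # zero-pad so it holds exactly n_out samples.
          shift = int(round((start_time - t0) / dt))
          if shift >= 0:
              trace = samples[shift:]
          else:
              trace = np.concatenate([np.zeros(-shift), samples])
          out = np.zeros(n_out)
          n = min(n_out, len(trace))
          out[:n] = trace[:n]
          return out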

  7. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe370mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  8. Broadband Heating Rate Profile Project (BBHRP) - SGP 1bbhrpripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  9. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  10. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    PubMed

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by two reciprocating single-file systems: WaveOne and Reciproc. A conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) that the reciprocating single-file systems extrude more debris than a conventional multi-file rotary system and (ii) that the reciprocating single-file systems extrude similar amounts of dentin debris. After solid selection criteria, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to evaluate the collection of apically extruded debris was a typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of debris extruded between the two reciprocating systems. In contrast, the conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups. The hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocating single-file systems, inasmuch as they showed improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups caused by the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  11. Fallon, Nevada FORGE Thermal-Hydrological-Mechanical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blankenship, Doug; Sonnenthal, Eric

    Archive contains thermal-mechanical simulation input/output files. Included are files which fall into the following categories: (1) Spreadsheets with various input parameter calculations; (2) Final Simulation Inputs; (3) Native-State Thermal-Hydrological Model Input File Folders; (4) Native-State Thermal-Hydrological-Mechanical Model Input Files; (5) THM Model Stimulation Cases. See the 'File Descriptions.xlsx' resource below for additional information on individual files.

  12. Assessment of apically extruded debris produced by the single-file ProTaper F2 technique under reciprocating movement.

    PubMed

    De-Deus, Gustavo; Brandão, Maria Claudia; Barino, Bianca; Di Giorgi, Karina; Fidel, Rivail Antonio Sergio; Luna, Aderval Severino

    2010-09-01

    This study was designed to quantitatively evaluate the amount of dentin debris extruded from the apical foramen by comparing the conventional sequence of the ProTaper Universal nickel-titanium (NiTi) files with the single-file ProTaper F2 technique. Thirty mesial roots of lower molars were selected, and the use of different instrumentation techniques resulted in 3 groups (n=10 each). In G1, a crown-down hand-file technique was used, and in G2 the conventional ProTaper Universal technique was used. In G3, the ProTaper F2 file was used in a reciprocating motion. The apical finish preparation was equivalent to ISO size 25. An apparatus was used to evaluate the apically extruded debris. Statistical analysis was performed using 1-way analysis of variance and Tukey multiple comparisons. No significant difference was found in the amount of debris extruded between the conventional sequence of the ProTaper Universal NiTi files and the single-file ProTaper F2 technique (P>.05). In contrast, the hand instrumentation group extruded significantly more debris than both NiTi groups (P<.05). The present results yielded favorable input for the F2 single-file technique in terms of apically extruded debris, inasmuch as it is the most simple and cost-effective instrumentation approach. Copyright (c) 2010 Mosby, Inc. All rights reserved.

  13. Effective seat-to-head transmissibility in whole-body vibration: Effects of posture and arm position

    NASA Astrophysics Data System (ADS)

    Rahmatalla, Salam; DeShaw, Jonathan

    2011-12-01

    Seat-to-head transmissibility is a biomechanical measure that has been widely used for many decades to evaluate seat dynamics and human response to vibration. Traditionally, transmissibility has been used to correlate single-input or multiple-input with single-output motion; it has not been effectively used for multiple-input and multiple-output scenarios due to the complexity of dealing with the coupled motions caused by the cross-axis effect. This work presents a novel approach to use transmissibility effectively for single- and multiple-input and multiple-output whole-body vibrations. In this regard, the full transmissibility matrix is transformed into a single graph, such as those for single-input and single-output motions. Singular value decomposition and maximum distortion energy theory were used to achieve the latter goal. Seat-to-head transmissibility matrices for single-input/multiple-output in the fore-aft direction, single-input/multiple-output in the vertical direction, and multiple-input/multiple-output directions are investigated in this work. A total of ten subjects participated in this study. Discrete frequencies of 0.5-16 Hz were used for the fore-aft direction using supported and unsupported back postures. Random ride files from a dozer machine were used for the vertical and multiple-axis scenarios considering two arm postures: using the armrests or grasping the steering wheel. For single-input/multiple-output, the results showed that the proposed method was very effective in showing the frequencies where the transmissibility is most sensitive for the two sitting postures and two arm positions. For multiple-input/multiple-output, the results showed that the proposed effective transmissibility indicated higher values for the armrest-supported posture than for the steering-wheel-supported posture.

  14. Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine

    DOE Data Explorer

    Liu, Xiaobing

    2016-09-21

    This calculation engine estimates the technoeconomic feasibility of transported geothermal energy projects. The TGE screening tool (geotool.exe) takes its input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files are in the same folder as geotool.exe. To use the tool, prepare the input file containing adequate information on the case in the format explained below, and put it in the same folder as geotool.exe. Then geotool.exe can be executed, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
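
    The same-folder convention described above is easy to script; a hedged sketch using only the Python standard library (the install path is whatever the user chooses):

      import subprocess
      from pathlib import Path

      def run_case(tool_dir, case_text):
          # Write input.txt beside geotool.exe, execute the tool from
          # that folder, then read output.txt back.
          tool_dir = Path(tool_dir)
          (tool_dir / "input.txt").write_text(case_text)
          subprocess.run([str(tool_dir / "geotool.exe")], cwd=tool_dir, check=True)
          return (tool_dir / "output.txt").read_text()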

  15. MissileLab User’s Guide

    DTIC Science & Technology

    2009-02-01

    extension) that contain the airframe geometry specific to a single configuration. Results from a MissileLab run will be stored in a directory...re)created and contain all APE results and associated input files. C. Background In the early stages of missile system design, it is necessary to...Copying the AeroEngine Files After installation, the subdirectories in the “AeroEngine” directory contain contact information on how to obtain valid

  16. An approach for coupled-code multiphysics core simulations from a common input

    DOE PAGES

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...

    2014-12-10

    This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and setup the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
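
    The preprocessing step can be pictured as a fan-out from one common description to fully consistent per-code inputs; a toy Python illustration in which the section names and output formats are invented, not the actual VERAIn schema:

      import json

      def fan_out(common_path):
          # One common problem description drives every physics code.
          with open(common_path) as f:
              common = json.load(f)
          # Hypothetical neutronics input file.
          with open("neutronics.inp", "w") as f:
              for key, value in common["core"].items():
                  f.write(f"{key} = {value}\n")
          # Hypothetical thermal-hydraulics input file; consistency with
          # the neutronics input is guaranteed because both derive from
          # the same source.
          with open("th.inp", "w") as f:
              f.write(f"power {common['core']['power']}\n")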

  17. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
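
    The merge-and-pad step at the heart of the tool is a small operation; a schematic Python version (the event tuples and padding convention are assumptions, not the tool's actual data model):

      from datetime import timedelta

      def merge_events(mro_events, mer_events, pad_minutes=0):
          # Events are (time_in_ett, description) pairs already converted
          # to Earth Transmit Time. MRO events are widened into keep-out
          # windows by a user-chosen pad, then both lists are merged into
          # one time-ordered listing.
          pad = timedelta(minutes=pad_minutes)
          merged = [(t - pad, t + pad, "MRO " + d) for t, d in mro_events]
          merged += [(t, t, "MER " + d) for t, d in mer_events]
          return sorted(merged, key=lambda event: event[0])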

  18. VizieR Online Data Catalog: Planetary atmosphere radiative transport code (Garcia Munoz+ 2015)

    NASA Astrophysics Data System (ADS)

    Garcia Munoz, A.; Mills, F. P.

    2014-08-01

    Files are: * readme.txt * Input files: INPUT_hazeL.txt, INPUT_L13.txt, INPUT_L60.txt; they contain explanations of the input parameters. Copy INPUT_XXXX.txt into INPUT.dat to execute some of the examples described in the reference. * Files with scattering matrix properties: phF_hazeL.txt, phF_L13.txt, phF_L60.txt * Script for compilation in GFortran (myscript) (10 data files).

  19. QX MAN: Q and X file manipulation

    NASA Technical Reports Server (NTRS)

    Krein, Mark A.

    1992-01-01

    QX MAN is a grid and solution file manipulation program written primarily for the PARC code and the GRIDGEN family of grid generation codes. QX MAN combines many of the features frequently encountered in grid generation, grid refinement, the setting-up of initial conditions, and post processing. QX MAN allows the user to manipulate single block and multi-block grids (and their accompanying solution files) by splitting, concatenating, rotating, translating, re-scaling, and stripping or adding points. In addition, QX MAN can be used to generate an initial solution file for the PARC code. The code was written to provide several formats for input and output in order for it to be useful in a broad spectrum of applications.

  20. A Network Flow Approach to the Initial Skills Training Scheduling Problem

    DTIC Science & Technology

    2007-12-01

    include (but are not limited to) queuing theory, stochastic analysis and simulation. After the demand schedule has been estimated, it can be ...software package has already been purchased and is in use by AFPC, AFPC has requested that the new algorithm be programmed in this language as well ...the discussed outputs from those schedules. Required Inputs A single input file details the students to be scheduled as well as the courses

  1. Exploiting Efficient Transpacking for One-Sided Communication and MPI-IO

    NASA Astrophysics Data System (ADS)

    Mir, Faisal Ghias; Träff, Jesper Larsson

    Based on a construction of so-called input-output datatypes that define a mapping between non-consecutive input and output buffers, we outline an efficient method for copying structured data. We term this operation transpacking, and show how transpacking can be applied in the MPI implementation of one-sided communication and MPI-IO. For one-sided communication via shared memory, we demonstrate the expected performance improvements of up to a factor of two. For individual MPI-IO, the time to read or write from file dominates the overall time, but even here efficient transpacking can in some scenarios reduce file I/O time considerably. The reported results were achieved on a single NEC SX-8 vector node.
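
    The idea is that a pack-then-unpack sequence through a contiguous scratch buffer can be fused into a single copy; a simplified Python rendering, with flat offset lists standing in for MPI datatypes:

      def transpack(src, dst, in_offsets, out_offsets):
          # Copy directly from a non-consecutive input layout to a
          # non-consecutive output layout, skipping the intermediate
          # contiguous buffer that pack-then-unpack would use.
          for i, o in zip(in_offsets, out_offsets):
              dst[o] = src[i]

      buf_in = list(range(10))
      buf_out = [0] * 10
      transpack(buf_in, buf_out, in_offsets=[0, 2, 4], out_offsets=[1, 2, 3])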

  2. External-Compression Supersonic Inlet Design Code

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2011-01-01

    A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets includes axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from an explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities include inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.

  3. Similarities and Differences in Patterns and Geolocation of SSH Attack Data

    DTIC Science & Technology

    2015-09-01

    failed inputs. Figure 7. Latest “passwd” commands entered by...also has fake file contents to allow an attacker to “cat” files like /etc/passwd [12]. Kippo saves all downloaded files for later inspection. The...overall post-compromise activity, human activity inside the honeypot, top 10 inputs (overall), top 10 successful inputs, top 10 failed inputs, passwd

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao

    Sparx, a new environment for Cryo-EM image processing. Keywords: Cryo-EM, single particle reconstruction, principal component analysis. Hardware requirements: PC, Mac, supercomputer, mainframe, multiplatform, workstation. Software requirements: Unix operating system; C++ compiler. File types: source code, object library, executable modules, compilation instructions, sample problem input data. Location/transmission: http://sparx-em.org; user manual & paper: http://sparx-em.org

  5. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  6. PATSTAGS: PATRAN-To-STAGSC-1 Translator

    NASA Technical Reports Server (NTRS)

    Otte, Neil

    1993-01-01

    PATSTAGS computer program translates data from PATRAN finite-element mathematical model into STAGS input records used for engineering analysis. Reads data from PATRAN neutral file and writes STAGS input records into STAGS input file and UPRESS data file. Supports translations of nodal constraints, and of nodal, element, force, and pressure data. Written in FORTRAN 77.

  7. Task-parallel message passing interface implementation of Autodock4 for docking of very large databases of compounds using high-performance super-computers.

    PubMed

    Collignon, Barbara; Schulz, Roland; Smith, Jeremy C; Baudry, Jerome

    2011-04-30

    A message passing interface (MPI)-based implementation (Autodock4.lga.MPI) of the grid-based docking program Autodock4 has been developed to allow simultaneous and independent docking of multiple compounds on up to thousands of central processing units (CPUs) using the Lamarckian genetic algorithm. The MPI version reads a single binary file containing precalculated grids that represent the protein-ligand interactions, i.e., van der Waals, electrostatic, and desolvation potentials, and needs only two input parameter files for the entire docking run. In comparison, the serial version of Autodock4 reads ASCII grid files and requires one parameter file per compound. The modifications performed result in significantly reduced input/output activity compared with the serial version. Autodock4.lga.MPI scales up to 8192 CPUs with a maximal overhead of 16.3%, of which two thirds is due to input/output operations and one third originates from MPI operations. The optimal docking strategy, which minimizes docking CPU time without lowering the quality of the database enrichments, comprises the docking of ligands preordered from the most to the least flexible and the assignment of the number of energy evaluations as a function of the number of rotatable bonds. In 24 h, on 8192 high-performance computing CPUs, the present MPI version would allow docking to a rigid protein of about 300K small flexible compounds or 11 million rigid compounds.
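
    The scheduling heuristic in the last sentences (most flexible ligands first, evaluation budget scaled with rotatable-bond count) can be sketched directly; the field names and budget constants below are invented for illustration:

      def schedule(ligands, base_evals=250_000, evals_per_bond=50_000):
          # Sort most-flexible-first so the longest dockings start early,
          # and grow each ligand's energy-evaluation budget with its
          # number of rotatable bonds.
          ordered = sorted(ligands, key=lambda lig: lig["rotatable_bonds"],
                           reverse=True)
          return [(lig["name"],
                   base_evals + evals_per_bond * lig["rotatable_bonds"])
                  for lig in ordered]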

  8. The present status and problems in document retrieval system : document input type retrieval system

    NASA Astrophysics Data System (ADS)

    Inagaki, Hirohito

    The spread of office automation (OA) has brought many changes: many documents are now maintained in electronic filing systems, so efficient document retrieval systems are needed to extract useful information from them. Current document retrieval systems use simple word matching, syntactic matching, or semantic matching to obtain high retrieval efficiency. On the other hand, document retrieval systems using special hardware devices, such as ISSP, were developed to achieve high-speed retrieval. Since these systems accept only a single sentence or keywords as input, it is difficult to express a searcher's full request. We demonstrate a document-input-type retrieval system, which can directly accept a document as input and search for similar documents in a document database.

  9. SnopViz, an interactive snow profile visualization tool

    NASA Astrophysics Data System (ADS)

    Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank

    2016-04-01

    SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML based standard for vector graphics, was chosen because of its easy interaction with JS and a good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time series input and output. The actual output format for SNOWPACK timelines was retained while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as CAAML-file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard to exchange snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).

  10. User's Guide for the Updated EST/BEST Software System

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2003-01-01

    This User's Guide describes the structure of the IPACS input file, which reflects the modular organization of the program. The structured format helps the user locate specific input data and manually enter or edit it. The IPACS input file can have any user-specified filename but must have a DAT extension. The input file may consist of up to six input data blocks; the data blocks must be separated by delimiters beginning with the $ character. If multiple sections are desired, they must be arranged in the order listed.

  11. Wrapping Python around MODFLOW/MT3DMS based groundwater models

    NASA Astrophysics Data System (ADS)

    Post, V.

    2008-12-01

    Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data such as heads, fluxes and concentrations. Typically all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
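
    As a flavor of the approach, a short sketch in the same spirit (the headerless record layout is illustrative only; real MODFLOW binary head files carry per-record headers that a library's readers must handle):

      import numpy as np

      def read_array(path, nrow, ncol, dtype="<f4"):
          # Read one spatially distributed array (e.g., heads for a
          # single layer) from a headerless binary file.
          return np.fromfile(path, dtype=dtype).reshape(nrow, ncol)

      if __name__ == "__main__":
          np.arange(6, dtype="<f4").tofile("demo.bin")  # synthetic data
          print(read_array("demo.bin", nrow=2, ncol=3))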

  12. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    PubMed

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
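
    For orientation, the traditional single-machine multiway merge used as a benchmark reduces to a k-way merge on (chromosome, position) keys; a minimal Python sketch with simplified record parsing (it assumes chromosome names sort the same way in every file):

      import heapq

      def vcf_records(path):
          # Yield sortable (chrom, pos, line) keys from one sorted VCF,
          # skipping header lines.
          with open(path) as f:
              for line in f:
                  if line.startswith("#"):
                      continue
                  chrom, pos = line.split("\t", 2)[:2]
                  yield (chrom, int(pos), line)

      def merge_vcfs(paths, out_path):
          # heapq.merge streams the k inputs in sorted order without
          # loading them into memory.
          with open(out_path, "w") as out:
              for _, _, line in heapq.merge(*(vcf_records(p) for p in paths)):
                  out.write(line)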

  13. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files

    PubMed Central

    Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng

    2018-01-01

    Background: Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. Findings: In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)–based high-performance computing (HPC) implementation, and the popular VCFTools. Conclusions: Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems. PMID:29762754

  14. PLEXOS Input Data Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data that is in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.

  15. CFEST Coupled Flow, Energy & Solute Transport Version CFEST005 User’s Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Chen, Yousu; Gilca, Alex

    2006-07-20

    The CFEST (Coupled Flow, Energy, and Solute Transport) simulator described in this User’s Guide is a three-dimensional finite-element model used to evaluate groundwater flow and solute mass transport. Confined and unconfined aquifer systems, as well as constant and variable density fluid flows, can be represented with CFEST. For unconfined aquifers, the model uses a moving boundary for the water table, deforming the numerical mesh so that the uppermost nodes are always at the water table. For solute transport, changes in concentration of a single dissolved chemical constituent are computed for advective and hydrodynamic transport, linear sorption represented by a retardation factor, and radioactive decay. Although several thermal parameters described in this User’s Guide are required inputs, thermal transport has not yet been fully implemented in the simulator. Once fully implemented, transport of thermal energy in the groundwater and solid matrix of the aquifer can also be used to model aquifer thermal regimes. The CFEST simulator is written in the FORTRAN 77 language, following American National Standards Institute (ANSI) standards. Execution of the CFEST simulator is controlled through three required text input files. These input files use a structured format of associated groups of input data. Example input data lines are presented for each file type, as well as a description of the structured FORTRAN data format. Detailed descriptions of all input requirements, output options, and program structure and execution are provided in this User’s Guide. Required inputs for auxiliary CFEST utilities that aid in post-processing data are also described. Global variables are defined for those with access to the source code. Although CFEST is a proprietary code (CFEST, Inc., Irvine, CA), the Pacific Northwest National Laboratory retains permission to maintain its own source, and to distribute executables to Hanford subcontractors.

  16. A computer program (MACPUMP) for interactive aquifer-test analysis

    USGS Publications Warehouse

    Day-Lewis, F. D.; Person, M.A.; Konikow, Leonard F.

    1995-01-01

    This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.

  17. Format( )MEDIC( )Input

    NASA Astrophysics Data System (ADS)

    Foster, K.

    1994-09-01

    This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.

  18. Simulations of Brady's-Type Fault Undergoing CO2 Push-Pull: Pressure-Transient and Sensitivity Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Yoojin; Doughty, Christine

    Input and output files used for fault characterization through numerical simulation using iTOUGH2. The synthetic data for the push period are generated by running a forward simulation (input parameters are provided in iTOUGH2 Brady GF6 Input Parameters.txt [InvExt6i.txt]). In general, the permeability of the fault gouge, damage zone, and matrix are assumed to be unknown. The input and output files are for the inversion scenario where only pressure transients are available at the monitoring well located 200 m above the injection well and only the fault gouge permeability is estimated. The input files are named InvExt6i, INPUT.tpl, FOFT.ins, CO2TAB, and the output files are InvExt6i.out, pest.fof, and pest.sav (names below are display names). The table graphic in the data files below summarizes the inversion results, and indicates the fault gouge permeability can be estimated even if imperfect guesses are used for matrix and damage zone permeabilities, and permeability anisotropy is not taken into account.

  19. Replacing Fortran Namelists with JSON

    NASA Astrophysics Data System (ADS)

    Robinson, T. E., Jr.

    2017-12-01

    Maintaining a log of input parameters for a climate model is very important to understanding potential causes for answer changes during the development stages. Additionally, since modern Fortran is now interoperable with C, a more modern approach to software infrastructure to include code written in C is necessary. Merging these two separate facets of climate modeling requires a quality control for monitoring changes to input parameters and model defaults that can work with both Fortran and C. JSON will soon replace namelists as the preferred key/value pair input in the GFDL model. By adding a JSON parser written in C into the model, the input can be used by all functions and subroutines in the model, errors can be handled by the model instead of by the internal namelist parser, and the values can be output into a single file that is easily parsable by readily available tools. Input JSON files can handle all of the functionality of a namelist while being portable between C and Fortran. Fortran wrappers using unlimited polymorphism are crucial to allow for simple and compact code which avoids the need for many subroutines contained in an interface. Errors can be handled with more detail by providing information about location of syntax errors or typos. The output JSON provides a ground truth for values that the model actually uses by providing not only the values loaded through the input JSON, but also any default values that were not included. This kind of quality control on model input is crucial for maintaining reproducibility and understanding any answer changes resulting from changes in the input.
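
    A tiny sketch of the quality-control idea (user input merged over model defaults, with every value the model actually uses echoed to a single parsable file); the group and variable names here are invented:

      import json

      DEFAULTS = {"physics": {"dt": 300.0, "use_clouds": True}}

      def load_input(path):
          # Merge user-supplied values over the defaults, then record the
          # ground truth: every value the model will actually use.
          merged = {group: dict(vals) for group, vals in DEFAULTS.items()}
          with open(path) as f:
              for group, vals in json.load(f).items():
                  merged.setdefault(group, {}).update(vals)
          with open("used_values.json", "w") as f:
              json.dump(merged, f, indent=2)
          return merged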

  20. PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR

    NASA Technical Reports Server (NTRS)

    Otte, N. E.

    1994-01-01

    PATSTAGS translates PATRAN finite-element model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It is able to support translations of nodal constraints and of nodal, element, force, and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file, and a STAGS pressure data file. The user provides the name of the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.

  1. Image processing tool for automatic feature recognition and quantification

    DOEpatents

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system includes reading of an input file, preprocessing the input file while preserving metadata such as scale information and then detecting features of the input file. In one version the detection first uses an edge detector followed by identification of features using a Hough transform. The output of the process is identified elements within the image.
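
    The edge-detection-then-Hough pipeline named above maps onto standard image-processing calls; a hedged sketch using OpenCV with arbitrary parameter values (this illustrates the general technique, not the patent's actual implementation):

      import numpy as np
      import cv2

      def detect_lines(path):
          # Edge detection first, then a Hough transform on the edge map.
          img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
          edges = cv2.Canny(img, 50, 150)
          lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                  threshold=80, minLineLength=30, maxLineGap=5)
          return [] if lines is None else [l[0] for l in lines]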

  2. VizieR Online Data Catalog: Algorithm for correcting CoRoT raw light curves (Mislis+, 2010)

    NASA Astrophysics Data System (ADS)

    Mislis, D.; Schmitt, J. H. M. M.; Carone, L.; Guenther, E. W.; Patzold, M.

    2010-10-01

    Requirements: gfortran (or g77, ifort) compiler. Input files: the input files should be raw CoRoT txt files (http://idoc-corot.ias.u-psud.fr/index.jsp) with names CoRoT*.txt. Run the CDA by typing C>: ./cda.csh (code and data should be in the same directory). Output files: CDA creates one ASCII output file with name CoRoT*.R.cor for the R filter (2 data files).

  3. Highly parallel reconfigurable computer architecture for robotic computation having plural processor cells each having right and left ensembles of plural processors

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Bejczy, Antal K. (Inventor)

    1994-01-01

    In a computer having a large number of single-instruction multiple data (SIMD) processors, each of the SIMD processors has two sets of three individual processor elements controlled by a master control unit and interconnected among a plurality of register file units where data is stored. The register files input and output data in synchronism with a minor cycle clock under control of two slave control units controlling the register file units connected to respective ones of the two sets of processor elements. Depending upon which ones of the register file units are enabled to store or transmit data during a particular minor clock cycle, the processor elements within an SIMD processor are connected in rings or in pipeline arrays, and may exchange data with the internal bus or with neighboring SIMD processors through interface units controlled by respective ones of the two slave control units.

  4. Profex: a graphical user interface for the Rietveld refinement program BGMN.

    PubMed

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-10-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.

  5. Profex: a graphical user interface for the Rietveld refinement program BGMN

    PubMed Central

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-01-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems. PMID:26500466

  6. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

    EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time-consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
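
    The CSV-to-idf fan-out can be pictured as template substitution over one row of high-level parameters per building; the column names and template fragment below are illustrative, not the NREL Preprocessor's actual schema:

      import csv
      from string import Template

      IDF_TEMPLATE = Template(
          "Building,\n  $name,        !- Name\n  $north_axis;  !- North Axis\n")

      def generate_idfs(csv_path):
          # One row of high-level parameters per building, one idf each.
          with open(csv_path, newline="") as f:
              for row in csv.DictReader(f):
                  with open(row["name"] + ".idf", "w") as out:
                      out.write(IDF_TEMPLATE.substitute(row))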

  7. Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.

    1987-01-01

    This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
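    The extraction step at the heart of such a utility is compact. The sketch below is an illustrative analogue only, operating on in-memory NumPy arrays rather than GetData's actual file formats and FORTRAN subroutine interfaces:

      import numpy as np

      # Select named signals and a time segment from a time-history table,
      # interpolating to common output times (two of the capabilities above).
      def extract(times, signals, names, wanted, t_start, t_stop, t_out):
          mask = (times >= t_start) & (times <= t_stop)
          return {
              name: np.interp(t_out, times[mask], signals[mask, names.index(name)])
              for name in wanted
          }

      times = np.linspace(0.0, 10.0, 101)
      signals = np.column_stack([np.sin(times), np.cos(times)])
      out = extract(times, signals, ["alpha", "beta"], ["alpha"],
                    2.0, 5.0, np.arange(2.0, 5.0, 0.5))
      print(out["alpha"])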

  8. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    USGS Publications Warehouse

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.

  9. Application Program Interface for the Orion Aerodynamics Database

    NASA Technical Reports Server (NTRS)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
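    A self-contained C API of this kind can be driven from almost any host environment. The sketch below is purely hypothetical: the shared-library name and both function names and signatures are invented for illustration and are not the real CEV API:

      import ctypes

      # Hypothetical binding; every name here is invented for illustration.
      aero = ctypes.CDLL("./libaerodb.so")
      aero.aero_init.argtypes = [ctypes.c_char_p]
      aero.aero_eval.argtypes = [ctypes.c_double, ctypes.c_double]
      aero.aero_eval.restype = ctypes.c_double

      aero.aero_init(b"cev_tables.dat")   # read the aero data tables
      cd = aero.aero_eval(2.5, 10.0)      # e.g. Mach and angle of attack (deg)
      print("drag coefficient:", cd)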

  10. Python Processing and Version Control using VisTrails for the Netherlands Hydrological Instrument (Invited)

    NASA Astrophysics Data System (ADS)

    Verkaik, J.

    2013-12-01

    The Netherlands Hydrological Instrument (NHI) model predicts water demands in periods of drought, supporting the Dutch decision makers in taking operational as well as long-term decisions with respect to the water supply. Other applications of NHI are predicting fresh-salt interaction, nutrient loadings, and agriculture change. The NHI model consists of several coupled models: a saturated groundwater model (MODFLOW), an unsaturated groundwater model (MetaSWAP), a sub-catchment surface water model (MOZART), and a distribution network of surface waters model (DM/SOBEK). Each of these models requires specific, usually large, input data that may be the result of sophisticated schematization workflows. Input data can also be dependent on each other, for example, the precipitation data is input for the unsaturated zone model (cells) as well as for the surface water models (polygons). For efficient data management, we developed several Python tools such that the modeler or stakeholder can use the model in a user-friendly manner, and data is managed in a consistent, transparent and reproducible way. Two open source Python tools are presented here: the data version control module for the workflow manager VisTrails called FileSync, and the NHI model control script that uses FileSync. VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Since VisTrails does not directly support version control, we developed a version control module called FileSync. With this generic module, the user can synchronize data from and to their workflow through a dialog window. The FileSync dialog calls the FileSync script that is command-line based and performs the actual data synchronization. This script allows the user to easily create a model repository, upload and download data, create releases and define scenarios. The data synchronization approach applied here differs from systems such as Subversion or Git, since these systems do not perform well for large (binary) model data files. For this reason, a new concept of parameterization and data splitting has been implemented. Each file, or set of files, is uniquely labeled as a parameter, and for this parameter metadata is maintained by Subversion. The metadata contains file hashes to identify data content and the location where the actual bulk data are stored, which can be reached by FTP. The NHI model control script is a command-line driven Python script for pre-processing, running, and post-processing the NHI model and uses one single configuration file for all computational kernels. This configuration file is an easy-to-use, keyword-driven, Windows INI-file, having separate sections for all the kernels. It also includes a FileSync data section where the user can specify version controlled model data to be used as input. The NHI control script keeps all the data consistent during the pre-processing. Furthermore, this script is able to do model state handling when the NHI model is used for ensemble forecasting.

  11. Integrated Geothermal-CO2 Storage Reservoirs: FY1 Final Report

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    The purpose of Phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  12. A computer program for obtaining airplane configuration plots from digital Datcom input data

    NASA Technical Reports Server (NTRS)

    Roy, M. L.; Sliwa, S. M.

    1983-01-01

    A computer program is described which reads the input file for the Stability and Control Digital Datcom program and generates plots from the aircraft configuration data. These plots can be used to verify the geometric input data to the Digital Datcom program. The program described interfaces with utilities available for plotting aircraft configurations by creating a file from the Digital Datcom input data.

  13. NLEdit: A generic graphical user interface for Fortran programs

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    1994-01-01

    NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.

  14. NIF Ignition Target 3D Point Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, O; Marinak, M; Milovich, J

    2008-11-05

    We have developed an input file for running 3D NIF hohlraums that is optimized such that it can be run in 1-2 days on parallel computers. We have incorporated increasing levels of automation into the 3D input file: (1) Configuration controlled input files; (2) Common file for 2D and 3D, different types of capsules (symcap, etc.); and (3) Can obtain target dimensions, laser pulse, and diagnostics settings automatically from NIF Campaign Management Tool. We are using 3D Hydra calculations to investigate different problems: (1) Intrinsic 3D asymmetry; (2) Tolerance to nonideal 3D effects (e.g. laser power balance, pointing errors); and (3) Synthetic diagnostics.

  15. Optimizing Input/Output Using Adaptive File System Policies

    NASA Technical Reports Server (NTRS)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
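    As a toy illustration of classification-based policy selection (the thresholds and policy names are invented; the paper's framework is far more general), an access-pattern classifier might infer sequential versus random access from recent file offsets and pick a prefetching policy accordingly:

      # Classify a trace of recent read offsets, then select a policy.
      def classify(offsets, block=4096):
          strides = [b - a for a, b in zip(offsets, offsets[1:])]
          sequential = sum(1 for s in strides if 0 < s <= block)
          return "sequential" if sequential >= 0.8 * len(strides) else "random"

      def choose_policy(pattern):
          # Performance sensors would then tune the chosen policy's parameters.
          return {"sequential": "aggressive-prefetch",
                  "random": "demand-paging"}[pattern]

      trace = [0, 4096, 8192, 12288, 16384]
      print(choose_policy(classify(trace)))   # sequential -> aggressive-prefetch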

  16. Data handling with SAM and art at the NOvA experiment

    DOE PAGES

    Aurisano, A.; Backhouse, C.; Davies, G. S.; ...

    2015-12-23

    During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this study we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment.
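    A minimal sketch of the parentage idea, with invented key names rather than SAM's actual metadata schema: each output file records the files it was derived from, so a complete processing history can be reconstructed later:

      import json

      # Build metadata for an output file from the metadata of its inputs.
      def make_output_metadata(input_metas, stage):
          return {
              "file.type": "art",
              "processing.stage": stage,
              "parents": [m["file.name"] for m in input_metas],
          }

      parents = [{"file.name": "raw_001.root"}, {"file.name": "raw_002.root"}]
      print(json.dumps(make_output_metadata(parents, "calibration"), indent=2))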

  17. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of Phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  18. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2000-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of Phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  19. A system for verifying models and classification maps by extraction of information from a variety of data sources

    NASA Technical Reports Server (NTRS)

    Norikane, L.; Freeman, A.; Way, J.; Okonek, S.; Casey, R.

    1992-01-01

    Recent updates to a geographical information system (GIS) called VICAR (Video Image Communication and Retrieval)/IBIS are described. The system is designed to handle data from many different formats (vector, raster, tabular) and many different sources (models, radar images, ground truth surveys, optical images). All the data are referenced to a single georeference plane, and average or typical values for parameters defined within a polygonal region are stored in a tabular file, called an info file. The info file format allows tracking of data in time, maintenance of links between component data sets and the georeference image, conversion of pixel values to 'actual' values (e.g., radar cross-section, luminance, temperature), graph plotting, data manipulation, generation of training vectors for classification algorithms, and comparison between actual measurements and model predictions (with ground truth data as input).

  20. ESCHER: An interactive mesh-generating editor for preparing finite-element input

    NASA Technical Reports Server (NTRS)

    Oakes, W. R., Jr.

    1984-01-01

    ESCHER is an interactive mesh generation and editing program designed to help the user create a finite-element mesh, create additional input for finite-element analysis, including initial conditions, boundary conditions, and slidelines, and generate a NEUTRAL FILE that can be postprocessed for input into several finite-element codes, including ADINA, ADINAT, DYNA, NIKE, TSAAS, and ABAQUS. Two important ESCHER capabilities, interactive geometry creation and mesh archival storage, are described in detail. Also described is the interactive command language and the use of interactive graphics. The archival storage and restart file is a modular, entity-based mesh data file. Modules of this file correspond to separate editing modes in the mesh editor, with data definition syntax preserved between the interactive commands and the archival storage file. Because ESCHER was expected to be highly interactive, extensive user documentation was provided in the form of an interactive HELP package.

  1. HomSI: a homozygous stretch identifier from next-generation sequencing data.

    PubMed

    Görmez, Zeliha; Bakir-Gungor, Burcu; Sagiroglu, Mahmut Samil

    2014-02-01

    In consanguineous families, as a result of inheriting the same genomic segments through both parents, the individuals have stretches of their genomes that are homozygous. This situation leads to the prevalence of recessive diseases among the members of these families. Homozygosity mapping is based on this observation, and in consanguineous families, several recessive disease genes have been discovered with the help of this technique. Researchers typically use single nucleotide polymorphism arrays to determine the homozygous regions and then search for the disease gene by sequencing the genes within the candidate disease loci. Recently, the advent of next-generation sequencing has enabled the concurrent identification of homozygous regions and the detection of mutations relevant for diagnosis, using data from a single sequencing experiment. In this respect, we have developed a novel tool that identifies homozygous regions using deep sequence data. Using *.vcf (variant call format) files as input, our program identifies the majority of homozygous regions found by microarray single nucleotide polymorphism genotype data. HomSI software is freely available at www.igbam.bilgem.tubitak.gov.tr/softwares/HomSI, with an online manual.
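    The core of such a scan is straightforward. The sketch below is a simplification (real tools filter on call quality and tolerate occasional heterozygous calls inside a run); it reports runs of consecutive homozygous genotypes from one chromosome of an already-parsed VCF:

      # records: sorted (position, genotype) pairs for a single chromosome.
      def homozygous_stretches(records, min_sites=25):
          runs, start, last, count = [], None, None, 0

          def flush():
              if start is not None and count >= min_sites:
                  runs.append((start, last, count))

          for pos, genotype in records:
              if genotype in ("0/0", "1/1"):   # homozygous call
                  if start is None:
                      start = pos
                  last, count = pos, count + 1
              else:                            # heterozygous call ends the run
                  flush()
                  start, count = None, 0
          flush()
          return runs

      calls = [(i * 1000, "1/1") for i in range(30)] + [(31000, "0/1")]
      print(homozygous_stretches(calls))   # [(0, 29000, 30)]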

  2. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
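    In one dimension, the semianalytical scheme reduces to a few lines. With velocity interpolated linearly between the two cell faces and steady flow toward the downstream face (the simplifying assumptions of this sketch, not MODPATH's full 3-D implementation), the particle's exit time has a closed form and needs no time stepping:

      import math

      # Exit time through the downstream face of a cell [x1, x2], given
      # face velocities v1, v2 > 0 and the particle position xp.
      def exit_time(x1, x2, v1, v2, xp):
          A = (v2 - v1) / (x2 - x1)      # linear velocity gradient in the cell
          vp = v1 + A * (xp - x1)        # interpolated velocity at the particle
          if abs(A) < 1e-12:             # uniform velocity: simple linear travel
              return (x2 - xp) / vp
          return math.log(v2 / vp) / A   # closed-form (Pollock-style) exit time

      # Particle at x = 25 in a 100-unit cell with velocity rising from 1 to 2.
      print(exit_time(0.0, 100.0, 1.0, 2.0, 25.0))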

  3. Aircraft signal definition for flight safety system monitoring system

    NASA Technical Reports Server (NTRS)

    Gibbs, Michael (Inventor); Omen, Debi Van (Inventor)

    2003-01-01

    A system and method compares combinations of vehicle variable values against known combinations of potentially dangerous vehicle input signal values. Alarms and error messages are selectively generated based on such comparisons. An aircraft signal definition is provided to enable definition and monitoring of sets of aircraft input signals to customize such signals for different aircraft. The input signals are compared against known combinations of potentially dangerous values by operational software and hardware of a monitoring function. The aircraft signal definition is created using a text editor or custom application. A compiler receives the aircraft signal definition to generate a binary file that comprises the definition of all the input signals used by the monitoring function. The binary file also contains logic that specifies how the inputs are to be interpreted. The file is then loaded into the monitor function, where it is validated and used to continuously monitor the condition of the aircraft.

  4. xLPR Sim Editor 1.0 User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mariner, Paul E.

    2017-03-01

    The United States Nuclear Regulatory Commission in cooperation with the Electric Power Research Institute contracted Sandia National Laboratories to develop the framework of a probabilistic fracture mechanics assessment code called xLPR (Extremely Low Probability of Rupture) Version 2.0. The purpose of xLPR is to evaluate degradation mechanisms in piping systems at nuclear power plants and to predict the probability of rupture. This report is a user's guide for xLPR Sim Editor 1.0, a graphical user interface for creating and editing the xLPR Version 2.0 input file and for creating, editing, and using the xLPR Version 2.0 database files. The xLPR Sim Editor provides a user-friendly way for users to change simulation options and input values, select input datasets from xLPR databases, identify inputs needed for a simulation, and create and modify an input file for xLPR.

  5. Using NJOY to Create MCNP ACE Files and Visualize Nuclear Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahler, Albert Comstock

    We provide lecture materials that describe the input requirements to create various MCNP ACE files (Fast, Thermal, Dosimetry, Photo-nuclear and Photo-atomic) with the NJOY Nuclear Data Processing code system. Input instructions to visualize nuclear data with NJOY are also provided.

  6. TIM Version 3.0 beta Technical Description and User Guide - Appendix B - Example input file for TIMv3.0

    EPA Pesticide Factsheets

    The Terrestrial Investigation Model (TIM) user guide has several appendices. This appendix includes an example input file in its preserved format; both the parameters and the comments defining them are included.

  7. NEAMS-IPL MOOSE Midyear Framework Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Permann, Cody; Alger, Brian; Peterson, John

    The MOOSE Framework is a modular pluggable framework for building complex simulations. The ability to add new objects with custom syntax is a core capability that makes MOOSE a powerful platform for coupling multiple applications together within a single environment. The creation of a new, more standardized JSON syntax output improves the external interfaces for generating graphical components or for validating input file syntax. The design of this interface and the requirements it satisfies are covered in this short report.

  8. Incorporating uncertainty in RADTRAN 6.0 input files.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John

    Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provides installation instructions as well as a description and user guide for the uncertainty engine.
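    The sampling-to-batch-file step might look like the sketch below. The parameter names, distributions, and "&CASE" block format are placeholders, not actual RADTRAN input syntax:

      import random

      # Distributions for two illustrative input parameters.
      PARAMETERS = {
          "wind_speed": lambda: random.triangular(0.5, 10.0, 3.0),
          "release_fraction": lambda: random.uniform(1e-4, 1e-2),
      }

      # Emit one input block per sample for a 100-case batch run.
      with open("batch_input.txt", "w") as out:
          for case in range(100):
              out.write(f"&CASE id={case}\n")
              for name, sample in PARAMETERS.items():
                  out.write(f"  {name} = {sample():.6g}\n")
              out.write("/\n")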

  9. Flow prediction for propfan engine installation effects on transport aircraft at transonic speeds

    NASA Technical Reports Server (NTRS)

    Samant, S. S.; Yu, N. J.

    1986-01-01

    An Euler-based method for aerodynamic analysis of turboprop transport aircraft at transonic speeds has been developed. In this method, inviscid Euler equations are solved over surface-fitted grids constructed about aircraft configurations. Propeller effects are simulated by specifying sources of momentum and energy on an actuator disc located in place of the propeller. A stripwise boundary layer procedure is included to account for the viscous effects. A preliminary version of an approach to embed the exhaust plume within the global Euler solution has also been developed for more accurate treatment of the exhaust flow. The resulting system of programs is capable of handling wing-body-nacelle-propeller configurations. The propeller disks may be tractors or pushers and may represent single or counterrotation propellers. Results from analyses of three test cases of interest (a wing alone, a wing-body-nacelle model, and a wing-nacelle-endplate model) are presented. A user's manual for executing the system of computer programs with formats of various input files, sample job decks, and sample input files is provided in appendices.

  10. Development of climate data input files for the Mechanistic-Empirical Pavement Design Guide (MEPDG).

    DOT National Transportation Integrated Search

    2011-06-30

    Prior to this effort, Mississippi's MEPDG climate files were limited to 12 weather stations in only 10 counties, and only seven weather stations had over 8 years (100 months) of data. Hence, building MEPDG climate input datasets improves modeling accuracy.

  11. Turbomachinery Forced Response Prediction System (FREPS): User's Manual

    NASA Technical Reports Server (NTRS)

    Morel, M. R.; Murthy, D. V.

    1994-01-01

    The turbomachinery forced response prediction system (FREPS), version 1.2, is capable of predicting the aeroelastic behavior of axial-flow turbomachinery blades. This document is meant to serve as a guide in the use of the FREPS code with specific emphasis on its use at NASA Lewis Research Center (LeRC). A detailed explanation of the aeroelastic analysis and its development is beyond the scope of this document, and may be found in the references. FREPS has been developed by the NASA LeRC Structural Dynamics Branch. The manual is divided into three major parts: an introduction, the preparation of input, and the procedure to execute FREPS. Part 1 includes a brief background on the necessity of FREPS, a description of the FREPS system, the steps needed to be taken before FREPS is executed, an example input file with instructions, presentation of the geometric conventions used, and the input/output files employed and produced by FREPS. Part 2 contains a detailed description of the command names needed to create the primary input file that is required to execute the FREPS code. Also, Part 2 has an example data file to aid the user in creating their own input files. Part 3 explains the procedures required to execute the FREPS code on the Cray Y-MP, a computer system available at the NASA LeRC.

  12. An Efficient Method for Verifying Gyrokinetic Microstability Codes

    NASA Astrophysics Data System (ADS)

    Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.

    2009-11-01

    Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.

  13. Program Description: EDIT Program and Vendor Master Update, SWRL Financial System.

    ERIC Educational Resources Information Center

    Ikeda, Masumi

    Computer routines to edit input data for the Southwest Regional Laboratory's (SWRL) Financial System are described. The program is responsible for validating input records, generating records for further system processing, and updating the Vendor Master File--a file containing the information necessary to support the accounts payable and…

  14. Plotit: a method of interactively plotting input data for the VORLAX computer program [computerized aircraft configuration design]

    NASA Technical Reports Server (NTRS)

    Denn, F. M.

    1978-01-01

    Geometric input plotting to the VORLAX computer program by means of an interactive remote terminal is reported. The software consists of a procedure file and two programs. The programs and procedure file are described and a sample execution is presented.

  15. Finite difference time domain grid generation from AMC helicopter models

    NASA Technical Reports Server (NTRS)

    Cravey, Robin L.

    1992-01-01

    A simple technique is presented which forms a cubic grid model of a helicopter from an Aircraft Modeling Code (AMC) input file. The AMC input file defines the helicopter fuselage as a series of polygonal cross sections. The cubic grid model is used as an input to a Finite Difference Time Domain (FDTD) code to obtain predictions of antenna performance on a generic helicopter model. The predictions compare reasonably well with measured data.

  16. Text File Comparator

    NASA Technical Reports Server (NTRS)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.

  17. Input data requirements for special processors in the computation system containing the VENTURE neutronics code [LMFBR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1979-07-01

    User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.
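    A toy version of such a processor is sketched below: formatted (text) user data in, a binary interface file out. The record layout is invented for the sketch; the real interface files follow fixed standards of the era, such as the CCCC formats:

      import struct

      # Read whitespace-separated numbers from a formatted text file and
      # write one binary record: a length word followed by the payload.
      def text_to_interface(text_path, bin_path):
          with open(text_path) as f:
              values = [float(tok) for line in f for tok in line.split()]
          with open(bin_path, "wb") as out:
              out.write(struct.pack("<i", len(values)))
              out.write(struct.pack("<%dd" % len(values), *values))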

  18. BIREFRINGENT FILTER MODEL

    NASA Technical Reports Server (NTRS)

    Cross, P. L.

    1994-01-01

    Birefringent filters are often used as line-narrowing components in solid state lasers. The Birefringent Filter Model program generates a stand-alone model of a birefringent filter for use in designing and analyzing a birefringent filter. It was originally developed to aid in the design of solid state lasers to be used on aircraft or spacecraft to perform remote sensing of the atmosphere. The model is general enough to allow the user to address problems such as temperature stability requirements, manufacturing tolerances, and alignment tolerances. The input parameters for the program are divided into 7 groups: 1) general parameters which refer to all elements of the filter; 2) wavelength related parameters; 3) filter, coating and orientation parameters; 4) input ray parameters; 5) output device specifications; 6) component related parameters; and 7) transmission profile parameters. The program can analyze a birefringent filter with up to 12 different components, and can calculate the transmission and summary parameters for multiple passes as well as a single pass through the filter. The Jones matrix, which is calculated from the input parameters of Groups 1 through 4, is used to calculate the transmission. Output files containing the calculated transmission or the calculated Jones matrix as a function of wavelength can be created. These output files can then be used as inputs for user-written programs; for example, to plot the transmission or to calculate the eigen-transmittances and the corresponding eigen-polarizations of the Jones matrix, the appropriate data are written to a file. The Birefringent Filter Model is written in Microsoft FORTRAN 2.0. The program format is interactive. It was developed on an IBM PC XT equipped with an 8087 math coprocessor, and has a central memory requirement of approximately 154K. Since Microsoft FORTRAN 2.0 does not support complex arithmetic, matrix routines for addition, subtraction, and multiplication of complex, double precision variables are included. The Birefringent Filter Model was written in 1987.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goltz, G.; Kaiser, L.M.; Weiner, H.

    A major mission of the U.S. Coast Guard is the task of providing and maintaining Maritime Aids to Navigation. These aids are located on and near the coastline and inland waters of the United States and its possessions. A computer program, Design Synthesis and Performance Analysis (DSPA), has been developed by the Jet Propulsion Laboratory to demonstrate the feasibility of low-cost solar array/battery power systems for use on flashing lamp buoys. To provide detailed, realistic temperature, wind, and solar insolation data for analysis of the flashing lamp buoy power systems, the two DSPA support computer program sets, MERGE and STAT, were developed. A general description of these two packages is presented in this program summary report. The MERGE program set will enable the Coast Guard to combine temperature and wind velocity data (NOAA TDF-14 tapes) with solar insolation data (NOAA DECK-280 tapes) onto a single sequential MERGE file containing up to 12 years of hourly observations. This MERGE file can then be used as direct input to the DSPA program. The STAT program set will enable a statistical analysis to be performed of the MERGE data and produce high or low or mean profiles of the data and/or do a worst case analysis. The STAT output file consists of a one-year set of hourly statistical weather data which can be used as input to the DSPA program.

  20. Documentation of model input and output values for simulation of pumping effects in Paradise Valley, a basin tributary to the Humboldt River, Humboldt County, Nevada

    USGS Publications Warehouse

    Carey, A.E.; Prudic, David E.

    1996-01-01

    Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American International Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machine (IBM)- compatible microcomputer using the MicroSoft Disk Operating System (MS-DOS) operating system version 5.0 or greater.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphrey, Walter R.

    CMS is a Windows application for tracking chemical inventories. Partners will use this application to record chemicals that are stored on their site and to perform periodic inventories of those chemicals. The application records information about stored chemicals from user input via the keyboard and barcode readers and stores that information into a single-file database (SQLite). A simple user login mechanism is used to control access to functions in the application. A user interface is provided that allows users to search the database and update data in the database.

  2. Converting from DDOR SASF to APF

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.

    2008-01-01

    A computer program called ddor_sasf2apf converts a delta-DOR (delta differential one-way range) request from an SASF (spacecraft activity sequence file) format to an APF (apgen plan file) format for use in the Mars Reconnaissance Orbiter (MRO) mission-planning-and-sequencing process. The APF is used as an input to APGEN/AUTOGEN in the MRO activity-planning and command-sequence-generating process to sequence the delta-DOR (DDOR) activity. The DDOR activity is a spacecraft tracking technique for determining spacecraft location. The input to ddor_sasf2apf is an input request SASF provided by an observation team that utilizes DDOR. ddor_sasf2apf parses this DDOR SASF input, rearranging parameters and reformatting the request to produce an APF file for use in AUTOGEN and/or APGEN. The benefit afforded by ddor_sasf2apf is to enable the use of the DDOR SASF file earlier in the planning stage of the command-sequence-generating process and to produce sequences, optimized for DDOR operations, that are more accurate and more robust than would otherwise be possible.

  3. WASHINGTON DAIRIES

    EPA Science Inventory

    The dairy_wa.zip file is a zip file containing an Arc/Info export file and a text document. Note the DISCLAIM.TXT file as these data are not verified. Map extent: statewide. Input Source: Address database obtained from Wa Dept of Agriculture. Data was originally developed und...

  4. PipeOnline 2.0: automated EST processing and functional data sorting.

    PubMed

    Ayoubi, Patricia; Jin, Xiaojing; Leite, Saul; Liu, Xianghui; Martajaja, Jeson; Abduraham, Abdurashid; Wan, Qiaolan; Yan, Wei; Misawa, Eduardo; Prade, Rolf A

    2002-11-01

    Expressed sequence tags (ESTs) are generated and deposited in the public domain, as redundant, unannotated, single-pass reactions, with virtually no biological content. PipeOnline automatically analyses and transforms large collections of raw DNA-sequence data from chromatograms or FASTA files by calling the quality of bases, screening and removing vector sequences, assembling and rewriting consensus sequences of redundant input files into a unigene EST data set and finally through translation, amino acid sequence similarity searches, annotation of public databases and functional data. PipeOnline generates an annotated database, retaining the processed unigene sequence, clone/file history, alignments with similar sequences, and proposed functional classification, if available. Functional annotation is automatic and based on a novel method that relies on homology of amino acid sequence multiplicity within GenBank records. Records are examined through a function ordered browser or keyword queries with automated export of results. PipeOnline offers customization for individual projects (MyPipeOnline), automated updating and alert service. PipeOnline is available at http://stress-genomics.org.

  5. AIP1OGREN: Aerosol Observing Station Intensive Properties Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koontz, Annette; Flynn, Connor

    The aip1ogren value-added product (VAP) computes several aerosol intensive properties. It requires as input calibrated, corrected aerosol extensive properties (scattering and absorption coefficients, primarily) from the Aerosol Observing Station (AOS). Aerosol extensive properties depend on both the nature of the aerosol and the amount of the aerosol. We compute several properties as relationships between the various extensive properties. These intensive properties are independent of aerosol amount and instead relate to intrinsic properties of the aerosol itself. Along with the original extensive properties we report aerosol single-scattering albedo, hemispheric backscatter fraction, asymmetry parameter, and Ångström exponent for scattering and absorption with one-minute averaging. An hourly averaged file is produced from the 1-minute files that includes all extensive and intensive properties as well as submicron scattering and submicron absorption fractions. Finally, in both the minutely and hourly files the aerosol radiative forcing efficiency is provided.

  6. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
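    The scheme can be sketched in a few lines (the metadata layout here is illustrative, not the patented implementation):

      import json

      # Pack many small files into one aggregated file, recording each
      # member's offset and length so it can be unpacked individually.
      def aggregate(paths, data_path, meta_path):
          meta, offset = {}, 0
          with open(data_path, "wb") as out:
              for path in paths:
                  with open(path, "rb") as f:
                      blob = f.read()
                  out.write(blob)
                  meta[path] = {"offset": offset, "length": len(blob)}
                  offset += len(blob)
          with open(meta_path, "w") as f:
              json.dump(meta, f)

      # Recover a single member using the offset/length metadata.
      def unpack(name, data_path, meta_path):
          with open(meta_path) as f:
              entry = json.load(f)[name]
          with open(data_path, "rb") as f:
              f.seek(entry["offset"])
              return f.read(entry["length"])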

  7. VizieR Online Data Catalog: Habitable zones around main-sequence stars (Kopparapu+, 2014)

    NASA Astrophysics Data System (ADS)

    Kopparapu, R. K.; Ramirez, R. M.; Schottelkotte, J.; Kasting, J. F.; Domagal-Goldman, S.; Eymet, V.

    2017-08-01

    Language: Fortran 90. Code tested under the following compilers/operating systems: ifort/CentOS Linux. Description of input data: no input necessary. Description of output data: output files HZs.dat, HZ_coefficients.dat. System requirements: no major system requirement; a Fortran compiler is necessary. Calls to external routines: none. Additional comments: none (1 data file).
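    The underlying calculation is compact. Kopparapu et al. parameterize the effective stellar flux at each habitable-zone boundary as a polynomial in T* = Teff - 5780 K; the sketch below uses that functional form with placeholder coefficients standing in for the values distributed in HZ_coefficients.dat:

      import math

      # Seff = Seff0 + a*T + b*T^2 + c*T^3 + d*T^4, with T = Teff - 5780 K.
      def s_eff(t_eff, seff0, a, b, c, d):
          t = t_eff - 5780.0
          return seff0 + a * t + b * t**2 + c * t**3 + d * t**4

      # Boundary distance in AU: d = sqrt((L/Lsun)/Seff).
      def hz_distance_au(lum_solar, seff):
          return math.sqrt(lum_solar / seff)

      # Placeholder coefficients only; use HZ_coefficients.dat for real work.
      seff = s_eff(5780.0, 1.014, 8.2e-5, 1.7e-9, -4.3e-12, -6.7e-16)
      print(hz_distance_au(1.0, seff))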

  8. FEQinput—An editor for the full equations (FEQ) hydraulic modeling system

    USGS Publications Warehouse

    Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.

    2017-10-30

    IntroductionThe Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow.Input files for FEQ are composed of text files that contain large amounts of parameters, data, and instructions that are written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in order to understand the specific format and language of the files.FEQinput provides a set of tools to help a new user overcome the steep learning curve associated with creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).

  9. Bluues: a program for the analysis of the electrostatic properties of proteins based on generalized Born radii

    PubMed Central

    2012-01-01

    Background The Poisson-Boltzmann (PB) equation and its linear approximation have been widely used to describe biomolecular electrostatics. Generalized Born (GB) models offer a convenient computational approximation for the more fundamental approach based on the Poisson-Boltzmann equation, and allow estimation of pairwise contributions to electrostatic effects in the molecular context. Results We have implemented in a single program most common analyses of the electrostatic properties of proteins. The program first computes generalized Born radii via a surface integral and then uses these radii (with a finite-radius test particle) to perform electrostatic analyses. Depending on the user's requirements, the output of the program includes: 1) the generalized Born radius of each atom; 2) the electrostatic solvation free energy; 3) the electrostatic forces on each atom (currently in a developmental stage); 4) the pH-dependent properties (total charge and pH-dependent free energy of folding in the pH range -2 to 18); 5) the pKa of all ionizable groups; 6) the electrostatic potential at the surface of the molecule; 7) the electrostatic potential in a volume surrounding the molecule. Conclusions Although at the expense of limited flexibility, the program provides the most common analyses with the requirement of a single input file in PQR format. The results obtained are comparable to those obtained using state-of-the-art Poisson-Boltzmann solvers. A Linux executable with example input and output files is provided as supplementary material. PMID:22536964
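    For orientation, the pairwise generalized Born solvation energy in its widely used form (Still et al.) can be evaluated as below. This is a generic sketch, not Bluues' exact expressions; Coulomb's constant is set to 1, and eps_in/eps_out are the solute and solvent dielectric constants:

      import math

      # Still et al. interpolation between the Coulomb and Born limits.
      def f_gb(r, ri, rj):
          return math.sqrt(r * r + ri * rj * math.exp(-r * r / (4.0 * ri * rj)))

      # Double sum over all atom pairs, including the i == j self terms.
      def gb_solvation_energy(charges, radii, dist, eps_in=1.0, eps_out=78.5):
          pref = -0.5 * (1.0 / eps_in - 1.0 / eps_out)
          n = len(charges)
          return sum(
              pref * charges[i] * charges[j]
              / f_gb(0.0 if i == j else dist[i][j], radii[i], radii[j])
              for i in range(n) for j in range(n)
          )

      # Two opposite unit charges 3 A apart, each with a 2 A Born radius.
      print(gb_solvation_energy([1.0, -1.0], [2.0, 2.0], [[0.0, 3.0], [3.0, 0.0]]))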

  10. iTOUGH2 Universal Optimization Using the PEST Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) Input is provided on one or more ASCII text input files; (2) Output is returned to one or more ASCII text output files; (3) The model is run using a system command (executable or script/batch file); and (4) The model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
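    A minimal sketch of the protocol's two halves, parameter substitution into a text input file and result extraction from a text output file, is given below. The "@name@" marker convention, file names, and output pattern are invented for the sketch; PEST template and instruction files define their own syntax:

      import re
      import subprocess

      # Overwrite marked parameters in a template copy of the input file.
      def write_input(template_path, input_path, params):
          text = open(template_path).read()
          for name, value in params.items():
              text = text.replace(f"@{name}@", f"{value:.6g}")
          open(input_path, "w").write(text)

      # Extract one observation from the application's text output.
      def read_output(output_path, pattern=r"final head\s*=\s*([-\d.eE+]+)"):
          match = re.search(pattern, open(output_path).read())
          return float(match.group(1))

      write_input("model.tpl", "model.inp", {"permeability": 1.3e-14})
      subprocess.run(["./model", "model.inp"], check=True)   # black-box run
      print(read_output("model.out"))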

  11. User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs

    USGS Publications Warehouse

    Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.

    2008-01-01

    This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are * CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; * DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; * MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and * MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.

  12. FlaME: Flash Molecular Editor - a 2D structure input tool for the web.

    PubMed

    Dallakian, Pavel; Haider, Norbert

    2011-02-01

    So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.

  13. EOS MLS Level 2 Data Processing Software Version 3

    NASA Technical Reports Server (NTRS)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; Wang, Shuhui; Manney, Gloria L.

    2011-01-01

    This software accepts the EOS MLS calibrated measurements of microwave radiances and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.

  14. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
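
    The file-level coupling described above (manipulate input files, run the model, interpret the listing file) can be pictured with a short, heavily hedged sketch; the template token, file names, executable, and listing-file marker below are all hypothetical, not GWM-VI's actual conventions.

```python
# Hedged sketch of version-independent coupling: write a candidate stress
# value into a MODFLOW input file, run the model, then read a simulated
# value back out of the listing file.
import subprocess

def simulate(pumping_rate):
    template = open("wells.template").read()
    open("model.wel", "w").write(template.replace("@RATE@", f"{pumping_rate:12.4e}"))
    subprocess.run(["mf2005", "model.nam"], check=True)   # any compliant MODFLOW version
    for line in open("model.lst"):
        if "HEAD AT OBSERVATION CELL" in line:            # hypothetical marker text
            return float(line.split()[-1])
    raise RuntimeError("observation not found in listing file")
```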

  15. User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.

    1993-01-01

    The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program or it can be called from within a graphical user interface HGUI that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and typein fields in HGUI for users to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.

  16. Lessons learned from comparing molecular dynamics engines on the SAMPL5 dataset.

    PubMed

    Shirts, Michael R; Klein, Christoph; Swails, Jason M; Yin, Jian; Gilson, Michael K; Mobley, David L; Case, David A; Zhong, Ellen D

    2017-01-01

    We describe our efforts to prepare common starting structures and models for the SAMPL5 blind prediction challenge. We generated the starting input files and single configuration potential energies for the host-guest systems in the SAMPL5 blind prediction challenge for the GROMACS, AMBER, LAMMPS, DESMOND and CHARMM molecular simulation programs. All conversions were fully automated from the originally prepared AMBER input files using a combination of the ParmEd and InterMol conversion programs. We find that the energy calculations for all molecular dynamics engines for this molecular set agree to better than 0.1 % relative absolute energy for all energy components, and in most cases an order of magnitude better, when reasonable choices are made for different cutoff parameters. However, there are some surprising sources of statistically significant differences. Most importantly, different choices of Coulomb's constant between programs are one of the largest sources of discrepancies in energies. We discuss the measures required to get good agreement in the energies for equivalent starting configurations between the simulation programs, and the energy differences that occur when simulations are run with program-specific default simulation parameter values. Finally, we discuss what was required to automate this conversion and comparison.
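
    The comparison metric the paper describes reduces to a per-component relative absolute difference; a minimal sketch with illustrative component names and values follows.

```python
# Relative absolute difference of each energy component between two engines,
# for one configuration. All numbers here are made up for illustration.
def relative_abs_diff(e1, e2):
    return abs(e1 - e2) / max(abs(e1), abs(e2))

gromacs = {"bond": 1523.4, "angle": 812.9, "coulomb": -10234.7}
amber   = {"bond": 1523.5, "angle": 812.8, "coulomb": -10232.1}

for term in gromacs:
    d = relative_abs_diff(gromacs[term], amber[term])
    flag = "OK" if d < 1e-3 else "CHECK"   # 0.1 % threshold from the paper
    print(f"{term:8s} {d:.2e} {flag}")
```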

  17. Lessons learned from comparing molecular dynamics engines on the SAMPL5 dataset

    PubMed Central

    Shirts, Michael R.; Klein, Christoph; Swails, Jason M.; Yin, Jian; Gilson, Michael K.; Mobley, David L.; Case, David A.; Zhong, Ellen D.

    2017-01-01

    We describe our efforts to prepare common starting structures and models for the SAMPL5 blind prediction challenge. We generated the starting input files and single configuration potential energies for the host-guest systems in the SAMPL5 blind prediction challenge for the GROMACS, AMBER, LAMMPS, DESMOND and CHARMM molecular simulation programs. All conversions were fully automated from the originally prepared AMBER input files using a combination of the ParmEd and InterMol conversion programs. We find that the energy calculations for all molecular dynamics engines for this molecular set agree to better than 0.1% relative absolute energy for all energy components, and in most cases an order of magnitude better, when reasonable choices are made for different cutoff parameters. However, there are some surprising sources of statistically significant differences. Most importantly, different choices of Coulomb’s constant between programs are one of the largest sources of discrepancies in energies. We discuss the measures required to get good agreement in the energies for equivalent starting configurations between the simulation programs, and the energy differences that occur when simulations are run with program-specific default simulation parameter values. Finally, we discuss what was required to automate this conversion and comparison. PMID:27787702

  18. Lessons learned from comparing molecular dynamics engines on the SAMPL5 dataset

    NASA Astrophysics Data System (ADS)

    Shirts, Michael R.; Klein, Christoph; Swails, Jason M.; Yin, Jian; Gilson, Michael K.; Mobley, David L.; Case, David A.; Zhong, Ellen D.

    2017-01-01

    We describe our efforts to prepare common starting structures and models for the SAMPL5 blind prediction challenge. We generated the starting input files and single configuration potential energies for the host-guest systems in the SAMPL5 blind prediction challenge for the GROMACS, AMBER, LAMMPS, DESMOND and CHARMM molecular simulation programs. All conversions were fully automated from the originally prepared AMBER input files using a combination of the ParmEd and InterMol conversion programs. We find that the energy calculations for all molecular dynamics engines for this molecular set agree to better than 0.1 % relative absolute energy for all energy components, and in most cases an order of magnitude better, when reasonable choices are made for different cutoff parameters. However, there are some surprising sources of statistically significant differences. Most importantly, different choices of Coulomb's constant between programs are one of the largest sources of discrepancies in energies. We discuss the measures required to get good agreement in the energies for equivalent starting configurations between the simulation programs, and the energy differences that occur when simulations are run with program-specific default simulation parameter values. Finally, we discuss what was required to automate this conversion and comparison.

  19. Factors Affecting Energy Absorption of a Plate during Shock Wave Impact Using a Damage Material Model

    DTIC Science & Technology

    2010-08-07

    [Indexed excerpt fragments; the full abstract is not included in this record. Recoverable content: the thesis describes an Abaqus/Explicit VDLOAD subroutine used to apply the shock-wave pressures in all of the simulations, and a Python script to convert the Abaqus input file to an LS-DYNA input file.]

  20. CABS-flex: server for fast simulation of protein structure fluctuations

    PubMed Central

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2013-01-01

    The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements a CABS-model-based protocol for fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics—a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions. PMID:23658222

  1. CABS-flex: Server for fast simulation of protein structure fluctuations.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2013-07-01

    The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements a CABS-model-based protocol for fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics--a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions.

  2. Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-05

    Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.
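
    The automatically captured fields listed above (user, current directory, local hostname, date and time of send) can all be gathered with the Python standard library; a minimal sketch follows, with the UI and database layer omitted.

```python
# Gather the metadata fields the abstract says Co-PylotDB captures
# automatically, using only the Python standard library.
import datetime
import getpass
import os
import socket

record = {
    "user": getpass.getuser(),
    "cwd": os.getcwd(),
    "hostname": socket.gethostname(),
    "sent_at": datetime.datetime.now().isoformat(timespec="seconds"),
}
print(record)   # the real tool would send this to a database table instead
```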

  3. Thrust Chamber Modeling Using Navier-Stokes Equations: Code Documentation and Listings. Volume 2

    NASA Technical Reports Server (NTRS)

    Daley, P. L.; Owens, S. F.

    1988-01-01

    Copies of the PHOENICS input files and FORTRAN code developed for the modeling of thrust chambers are given in Appendices A through E. Appendix A describes the input statements relevant to thrust chamber modeling as well as the FORTRAN code developed for the Satellite program. Appendix B describes the FORTRAN code developed for the Ground program. Appendices C through E contain copies of the Q1 (input) file, the Satellite program, and the Ground program, respectively.

  4. Evaluation of Doped Phthalocyanines and a Chemically-Sensitive Field Effect Transistor for Detecting Nitrogen Dioxide

    DTIC Science & Technology

    1989-12-04

    [Indexed excerpt fragments; the full abstract is not included in this record. Recoverable content: a discussion of the cytochrome-c heme group, which is derived from a porphyrin ring system with iron at its center, and listings of BASIC data-acquisition code that prompts for input files of reference-amplitude data and computes signal envelopes.]

  5. User's guide for a large signal computer model of the helical traveling wave tube

    NASA Technical Reports Server (NTRS)

    Palmer, Raymond W.

    1992-01-01

    The use of a successful large-signal, two-dimensional (axisymmetric), deformable-disk computer model of the helical traveling wave tube amplifier is described; this is an extensively revised and operationally simplified version. We also discuss program input and output and the auxiliary files necessary for operation. Included is a sample problem with its input data and output results. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.

  6. ICEG2D: An Integrated Software Package for Automated Prediction of Flow Fields for Single-Element Airfoils with Ice Accretion

    NASA Technical Reports Server (NTRS)

    Thompson, David S.; Soni, Bharat K.

    2000-01-01

    An integrated software package, ICEG2D, was developed to automate computational fluid dynamics (CFD) simulations for single-element airfoils with ice accretion. ICEG2D is designed to automatically perform three primary functions: (1) generating a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generating a high-quality grid using the generated surface point distribution, and (3) generating the input and restart files needed to run the general purpose CFD solver NPARC. ICEG2D can be executed in batch mode using a script file or in an interactive mode by entering directives from a command line. This report summarizes activities completed in the first year of a three-year research and development program to address issues related to CFD simulations for aircraft components with ice accretion. Specifically, this document describes the technology employed in the software, the installation procedure, and a description of the operation of the software package. Validation of the geometry and grid generation modules of ICEG2D is also discussed.

  7. A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)

    NASA Technical Reports Server (NTRS)

    Kelly, J. J.; Abu-Khajeel, H.

    1997-01-01

    This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines:
    1. Single channel impedance calculation - linear version (SCIC)
    2. Single channel impedance calculation - nonlinear version (SCICNL)
    3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML)
    4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL)
    Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files, and output files.
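
    The "standard matrix techniques" mentioned above are transfer-matrix algebra: each element contributes a 2x2 matrix, the matrices multiply in series, and the composite input impedance follows from the product. The sketch below uses the generic lossless-duct matrix, not ZKTL's actual semi-empirical elements, and all channel properties are illustrative.

```python
# Generic transfer-matrix cascade for composite impedance.
import numpy as np

def segment_matrix(k, Z0, L):
    # Standard lossless transmission-line matrix for one channel segment.
    kL = k * L
    return np.array([[np.cos(kL), 1j * Z0 * np.sin(kL)],
                     [1j * np.sin(kL) / Z0, np.cos(kL)]])

def input_impedance(segments, Z_term):
    M = np.eye(2, dtype=complex)
    for seg in segments:            # cascade the elements in series
        M = M @ seg
    A, B, C, D = M.ravel()
    return (A * Z_term + B) / (C * Z_term + D)

segments = [segment_matrix(k=18.0, Z0=415.0, L=0.02),
            segment_matrix(k=18.0, Z0=830.0, L=0.01)]
print(input_impedance(segments, Z_term=1e9))   # large Z_term ~ rigid backing
```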

  8. Smart command recognizer (SCR) - For development, test, and implementation of speech commands

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.; Bunnell, John W.; Krones, Robert R.

    1988-01-01

    The SCR, a rapid prototyping system for the development, testing, and implementation of speech commands in a flight simulator or test aircraft, is described. A single unit performs all functions needed during these three phases of system development, while the use of common software and speech command data structure files greatly reduces the preparation time for successive development phases. As a smart peripheral to a simulation or flight host computer, the SCR interprets the pilot's spoken input and passes command codes to the simulation or flight computer.

  9. DefEX: Hands-On Cyber Defense Exercise for Undergraduate Students

    DTIC Science & Technology

    2011-07-01

    [Indexed excerpt fragments; the full abstract is not included in this record. Recoverable content: students identified web vulnerabilities (including SQL Injection and File Upload), patched the associated flawed Perl and PHP Hypertext Preprocessor (PHP) code, and retested it. The Zora XSS vulnerability existed in a PHP file that echoed unfiltered user input back to the screen; to eliminate it, students filtered the input using the PHP htmlentities function, which translates certain ambiguous characters.]

  10. Computer program for analysis of high speed, single row, angular contact, spherical roller bearing, SASHBEAN. Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Aggarwal, Arun K.

    1993-01-01

    The computer program SASHBEAN (Sikorsky Aircraft Spherical Roller High Speed Bearing Analysis) analyzes and predicts the operating characteristics of a Single Row, Angular Contact, Spherical Roller Bearing (SRACSRB). The program runs on an IBM or IBM compatible personal computer, and for a given set of input data analyzes the bearing design for its ring deflections (axial and radial), roller deflections, contact areas and stresses, induced axial thrust, rolling element and cage rotation speeds, lubrication parameters, fatigue lives, and amount of heat generated in the bearing. The dynamic loading of rollers due to centrifugal forces and gyroscopic moments, which becomes quite significant at high speeds, is fully considered in this analysis. For a known application and its parameters, the program is also capable of performing steady-state and time-transient thermal analyses of the bearing system. The steady-state analysis capability allows the user to estimate the expected steady-state temperature map in and around the bearing under normal operating conditions. On the other hand, the transient analysis feature provides the user a means to simulate the 'lost lubricant' condition and predict a time-temperature history of various critical points in the system. The bearing's 'time-to-failure' estimate may also be made from this (transient) analysis by considering the bearing as failed when a certain temperature limit is reached in the bearing components. The program is fully interactive and allows the user to get started and access most of its features with minimal training. For the most part, the program is menu driven, and adequate help messages are provided to guide a new user through various menu options and data input screens. All input data, both for mechanical and thermal analyses, are read through graphical input screens, thereby eliminating any need of a separate text editor/word processor to edit/create data files. Provision is also available to select and view the contents of output files on the monitor screen if no paper printouts are required. A separate volume (Volume-2) of this documentation describes, in detail, the underlying mathematical formulations, assumptions, and solution algorithms of this program.
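
    The transient "time-to-failure" idea above can be pictured with a toy lumped-parameter model: integrate one node's temperature under constant heat input until it crosses a failure limit. The real program solves a full nodal network; all numbers below are made up.

```python
# Toy transient thermal analysis: explicit Euler integration of a single
# lumped node until its temperature reaches a failure limit.
def time_to_limit(Q=800.0, h=2.0, C=2500.0, T_amb=80.0, T_limit=300.0, dt=1.0):
    T, t = T_amb, 0.0
    while T < T_limit:
        T += dt * (Q - h * (T - T_amb)) / C   # heat in minus convective loss
        t += dt
        if t > 1e6:                           # no failure within the horizon
            return None
    return t

print(time_to_limit())   # time (s) until the node reaches the limit in this toy model
```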

  11. 75 FR 27335 - Combined Notice of Filings # 1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... Electric Company submits updated market power study. Filed Date: 04/23/2010. Accession Number: 20100427...: ER10-1179-000. Applicants: American Electric Power Service Corporation. Description: Request of American Electric Power Service Corporation to Update Depreciation Expense Inputs in Formula Rate. Filed...

  12. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  13. CARE3MENU- A CARE III USER FRIENDLY INTERFACE

    NASA Technical Reports Server (NTRS)

    Pierce, J. L.

    1994-01-01

    CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of number of modules, minimum number of modules for stage operation, and critical fault threshold. The fault handling and fault occurrence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: 1) a user's guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and 2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.

  14. User's Manual for LINER: FORTRAN Code for the Numerical Simulation of Plane Wave Propagation in a Lined Two-Dimensional Channel

    NASA Technical Reports Server (NTRS)

    Reichert, R. S.; Biringen, S.; Howard, J. E.

    1999-01-01

    LINER is a system of Fortran 77 codes which performs a 2D analysis of acoustic wave propagation and noise suppression in a rectangular channel with a continuous liner at the top wall. This new implementation is designed to streamline the usage of the several codes making up LINER, resulting in a useful design tool. Major input parameters are placed in two main data files, input.inc and num.prm. Output data appear in the form of ASCII files as well as a choice of GNUPLOT graphs. Section 2 briefly describes the physical model. Section 3 discusses the numerical methods; Section 4 gives a detailed account of program usage, including input formats and graphical options. A sample run is also provided. Finally, Section 5 briefly describes the individual program files.

  15. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use.
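
    The wrapper pattern described above (push a design-variable value into a text input file, run the code, pull a response back out of the output file) can be sketched in a few lines of Python; the placeholder token, file names, executable, and search string below are all hypothetical, not RCOTOOLS's actual conventions.

```python
# Hedged sketch of a file-based analysis wrapper.
import subprocess

def evaluate(rotor_radius):
    template = open("rotor.template").read()
    open("rotor.inp", "w").write(template.replace("{ROTOR_RADIUS}", f"{rotor_radius:.3f}"))
    subprocess.run(["analysis_code", "rotor.inp"], check=True)
    for line in open("rotor.out"):
        if line.startswith("GROSS WEIGHT"):   # a text string found via the GUI browser
            return float(line.split()[-1])
    raise RuntimeError("response variable not found in output file")
```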

  16. SG2PS (structural geology to postscript converter) - A graphical solution for brittle structural data evaluation and paleostress calculation

    NASA Astrophysics Data System (ADS)

    Sasvári, Ágoston; Baharev, Ali

    2014-05-01

    The aim of this work was to create an open source cross platform application to process brittle structural geological data with seven paleostress inversion algorithms published by different authors and formerly not available within a single desktop application. The tool facilitates separate processing and plotting of different localities, data types and user-made groups, using the same single input file. Simplified data input is supported, requiring as small an amount of data as possible. Data rotation to correct for bedding tilting, rotation with paleomagnetic declination and k-means clustering are available. RUP and ANG stress estimator calculation and visualization, resolved shear direction display and Mohr circle stress visualization are available. RGB-colored vector graphical outputs are automatically generated in Encapsulated PostScript and Portable Document Format. Stereographical displays on great circle or pole point plot, equal area or equal angle net and upper or lower hemisphere projections are implemented. Rose plots displaying dip direction or strike, with dip angle distribution of the input data set, are available. This tool is ideal for preliminary data interpretation in the field (quick processing and visualization in seconds); the implemented methods can be regularly used in daily academic and industrial work as well. The authors' goal was to create an open source and self-contained desktop application that does not require any additional third party framework (such as .NET) or the Java Virtual Machine. The software has a clear and highly modular structure enabling good code portability, easy maintainability, reusability and extensibility. A Windows installer is publicly available and the program is also fully functional on Linux. The Mac OS X port should be feasible with minimal effort. The install file with test and demo data sets, detailed manual, and links to the GitHub repositories are available on the regularly updated website www.sg2ps.eu.

  17. XML-Based Generator of C++ Code for Integration With GUIs

    NASA Technical Reports Server (NTRS)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit for XML is storing input data in a structured manner. More importantly, XML allows not just storing of data but also describing what each of the data items are. That XML file contains information useful for rendering the data by other applications. It also then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: 1. As an executable, it generates the corresponding C++ classes and 2. As a library, it automatically fills the objects with the input data values.
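
    The approach described above can be illustrated in miniature: the input data are described once in XML, and matching C++ source text is generated from that description. The XML schema and type names below are invented for the example; the real tool's schema is not given in the abstract.

```python
# Tiny XML-to-C++ generator: parse a parameter description and emit a
# C++ struct with defaults. Schema and names are invented for illustration.
import xml.etree.ElementTree as ET

xml_src = """<input name="SimulationInput">
  <param name="num_steps" type="int"    default="1000"/>
  <param name="dt"        type="double" default="0.01"/>
</input>"""

root = ET.fromstring(xml_src)
lines = [f"struct {root.get('name')} {{"]
for p in root.findall("param"):
    lines.append(f"  {p.get('type')} {p.get('name')} = {p.get('default')};")
lines.append("};")
print("\n".join(lines))   # C++ text ready to be written to a header file
```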

  18. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity for any point located on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The software tool is developed in the Matlab® framework and, as with the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. New changes in PEM2 include (1) some bugs fixed, (2) improvements in the code, (3) improvements in statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available for the scientific community.
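
    The direct problem mentioned above has a compact closed form, v = ω × r, on a spherical Earth; a minimal sketch with illustrative pole and site coordinates follows (this is the textbook formula, not PEM2's actual code).

```python
# Predicted velocity at a site from an Euler pole (lat, lon in degrees;
# rotation rate in deg/Myr), via v = omega x r on a spherical Earth.
import numpy as np

R_EARTH = 6.371e6   # meters

def unit_vector(lat, lon):
    lat, lon = np.radians([lat, lon])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity(pole_lat, pole_lon, rate_deg_myr, site_lat, site_lon):
    omega = np.radians(rate_deg_myr) / 1e6 * unit_vector(pole_lat, pole_lon)  # rad/yr
    r = R_EARTH * unit_vector(site_lat, site_lon)
    return np.cross(omega, r) * 1e3   # ECEF velocity components in mm/yr

print(plate_velocity(48.8, -106.5, 0.21, 37.5, 15.0))   # illustrative values
```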

  19. FlaME: Flash Molecular Editor - a 2D structure input tool for the web

    PubMed Central

    2011-01-01

    Background: So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. Implementation: The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. Conclusion: A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions. PMID:21284863

  20. Model input and output files for the simulation of time of arrival of landfill leachate at the water table, Municipal Solid Waste Landfill Facility, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso County, Texas

    USGS Publications Warehouse

    Abeyta, Cynthia G.; Frenzel, Peter F.

    1999-01-01

    This report contains listings of model input and output files for the simulation of the time of arrival of landfill leachate at the water table from the Municipal Solid Waste Landfill Facility (MSWLF), about 10 miles northeast of downtown El Paso, Texas. This simulation was done by the U.S. Geological Survey in cooperation with the U.S. Department of the Army, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso, Texas. The U.S. Environmental Protection Agency-developed Hydrologic Evaluation of Landfill Performance (HELP) and Multimedia Exposure Assessment (MULTIMED) computer models were used to simulate the production of leachate by a landfill and transport of landfill leachate to the water table. Model input data files used with and output files generated by the HELP and MULTIMED models are provided in ASCII format on a 3.5-inch 1.44-megabyte IBM-PC compatible floppy disk.

  1. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
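
    As a flavor of the statistics involved, the sketch below computes a mean daily flow and a rolling 7-day minimum from a short flow series; the input values are illustrative, and EFASC itself reads files and writes worksheets rather than in-memory lists.

```python
# Mean daily flow and 7-day minimum flow from a daily streamflow series.
def seven_day_min(flows):
    # Minimum of all 7-day moving averages in the series.
    return min(sum(flows[i:i + 7]) / 7.0 for i in range(len(flows) - 6))

flows = [12.0, 11.5, 10.8, 9.9, 9.5, 9.7, 10.2, 11.0, 13.4, 12.1]  # cfs, 10 days
print("mean flow:", sum(flows) / len(flows))
print("7-day min:", seven_day_min(flows))
```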

  2. Documentation of a daily mean stream temperature module—An enhancement to the Precipitation-Runoff Modeling System

    USGS Publications Warehouse

    Sanders, Michael J.; Markstrom, Steven L.; Regan, R. Steven; Atkinson, R. Dwight

    2017-09-15

    A module for simulation of daily mean water temperature in a network of stream segments has been developed as an enhancement to the U.S. Geological Survey Precipitation Runoff Modeling System (PRMS). This new module is based on the U.S. Fish and Wildlife Service Stream Network Temperature model, a mechanistic, one-dimensional heat transport model. The new module is integrated in PRMS. Stream-water temperature simulation is activated by selection of the appropriate input flags in the PRMS Control File and by providing the necessary additional inputs in standard PRMS input files. This report includes a comprehensive discussion of the methods relevant to the stream temperature calculations and detailed instructions for model input preparation.

  3. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. More frequent updates of system diagrams can reduce confusion and errors, and are likely to uncover systemic problems earlier in the design cycle, thus reducing rework and redesign.
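
    A minimal sketch of the auto-draw idea, under stated assumptions: connection pairs are read from an Excel sheet (hypothetical layout: columns A and B hold "from" and "to" node names, with a header row) and a Graphviz .dot file is emitted, so the diagram can be regenerated whenever the workbook changes. Requires the openpyxl package; the real tool's workbook layout is not specified in the abstract.

```python
# Read (from, to) pairs from a workbook and emit a Graphviz dot file.
from openpyxl import load_workbook

wb = load_workbook("design.xlsx", data_only=True)   # hypothetical workbook
ws = wb.active
with open("design.dot", "w") as f:
    f.write("digraph design {\n")
    for a, b in ws.iter_rows(min_row=2, max_col=2, values_only=True):
        if a and b:
            f.write(f'  "{a}" -> "{b}";\n')
    f.write("}\n")
# render with: dot -Tpng design.dot -o design.png
```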

  4. Engineering description of the ascent/descent bet product

    NASA Technical Reports Server (NTRS)

    Seacord, A. W., II

    1986-01-01

    The Ascent/Descent output product is produced in the OPIP routine from three files which constitute its input. One of these, OPIP.IN, contains mission specific parameters. Meteorological data, such as atmospheric wind velocities, temperatures, and density, are obtained from the second file, the Corrected Meteorological Data File (METDATA). The third file is the TRJATTDATA file, which contains the time-tagged state vectors that combine trajectory information from the Best Estimate of Trajectory (BET) filter, LBRET5, and Best Estimate of Attitude (BEA) derived from IMU telemetry. Each term in the two output data files (BETDATA and the Navigation Block, or NAVBLK) is defined. The description of the BETDATA file includes an outline of the algorithm used to calculate each term. To facilitate describing the algorithms, a nomenclature is defined. The description of the nomenclature includes a definition of the coordinate systems used. The NAVBLK file contains navigation input parameters. Each term in NAVBLK is defined and its source is listed. The production of NAVBLK requires only two computational algorithms. These two algorithms, which compute the terms DELTA and RSUBO, are described. Finally, the distribution of data in the NAVBLK records is listed.

  5. Calibration Software for Use with Jurassicprok

    NASA Technical Reports Server (NTRS)

    Chapin, Elaine; Hensley, Scott; Siqueira, Paul

    2004-01-01

    The Jurassicprok Interferometric Calibration Software (also called "Calibration Processor" or simply "CP") estimates the calibration parameters of an airborne synthetic-aperture-radar (SAR) system, the raw measurement data of which are processed by the Jurassicprok software described in the preceding article. Calibration parameters estimated by CP include time delays, baseline offsets, phase screens, and radiometric offsets. CP examines raw radar-pulse data, single-look complex image data, and digital elevation map data. For each type of data, CP compares the actual values with values expected on the basis of ground-truth data. CP then converts the differences between the actual and expected values into updates for the calibration parameters in an interferometric calibration file (ICF) and a radiometric calibration file (RCF) for the particular SAR system. The updated ICF and RCF are used as inputs to both Jurassicprok and to the companion Motion Measurement Processor software (described in the following article) for use in generating calibrated digital elevation maps.

  6. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and transfer of output files to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from the 2010 official productions are reported.

  7. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  8. EnergyPlus™

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EnergyPlus™ was originally developed in 1999; an updated version, 8.8.0, with bug fixes was released on September 30, 2017. EnergyPlus™ is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption—for heating, cooling, ventilation, lighting and plug and process loads—and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.
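
    Because EnergyPlus is console-based, batch runs are easy to script. A minimal sketch follows; the flag spellings are given as the author recalls them from the 8.x command line and should be checked against `energyplus --help`, and the weather and input file names are placeholders.

```python
# Run one EnergyPlus simulation from a script.
import subprocess

subprocess.run(
    ["energyplus",
     "-w", "USA_IL_Chicago.epw",   # weather file (placeholder name)
     "-d", "out",                  # output directory
     "in.idf"],                    # input file (placeholder name)
    check=True,
)
# results land in ./out (eplusout.csv, eplusout.err, ...)
```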

  9. Input Files and Procedures for Analysis of SMA Hybrid Composite Beams in MSC.Nastran and ABAQUS

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2005-01-01

    A thermoelastic constitutive model for shape memory alloys (SMAs) and SMA hybrid composites (SMAHCs) was recently implemented in the commercial codes MSC.Nastran and ABAQUS. The model is implemented and supported within the core of the commercial codes, so no user subroutines or external calculations are necessary. The model and resulting structural analysis has been previously demonstrated and experimentally verified for thermoelastic, vibration and acoustic, and structural shape control applications. The commercial implementations are described in related documents cited in the references, where various results are also shown that validate the commercial implementations relative to a research code. This paper is a companion to those documents in that it provides additional detail on the actual input files and solution procedures and serves as a repository for ASCII text versions of the input files necessary for duplication of the available results.

  10. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) 2-dimensional computational fluid dynamic code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  11. User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes

    USGS Publications Warehouse

    Klein, Fred W.

    2002-01-01

    Hypoinverse is a computer program that processes files of seismic station data for an earthquake (like P-wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and it supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of Hypoinverse includes all of the previous capabilities, but adds Y2000 formats to those defined earlier. In most cases, the new formats add 2 digits to the year field to accommodate the century. Other fields are sometimes rearranged or expanded to accommodate a better field order. The Y2000 formats are invoked with the “200” command. When the Y2000 flag is turned on, all files are read and written in the new format and there is no mixing of format types in a single run. Some formats without a date field, like station files, have not changed. A separate program called 2000CONV has been written to convert old formats to new. Other new features, like expanded station names, calculating amplitude magnitudes from a variety of digital seismometers, station history files, interactive earthquake processing, and locations from CUSP (Caltech USGS Seismic Processing) binary files have been added. General features. Hypoinverse will locate any number of events in an input file, which can be in one of several different formats. Any or all of printout, summary or archive output may be produced. Hypoinverse is driven by user commands. The various commands define input and output files, set adjustable parameters, and solve for locations of a file of earthquake data using the parameters and files currently set. It is both interactive and "batch" in that commands may be executed either from the keyboard or from a file. You execute the commands in a file by typing @filename at the Hypoinverse prompt.
Users may either supply parameters on the command line, or omit them and be prompted interactively. The current parameter values are displayed and may be taken as defaults by pressing just the RETURN key after the prompt. This makes the program very easy to use, provided you can remember the names of the commands. Combining commands with and without their required parameters into a command file permits a variety of customized procedures such as automatic input of crustal model and station data, but prompting for a different phase file each time. All commands are 3 letters long and most require one or more parameters or file names. If they appear on a line with a command, character strings such as filenames must be enclosed in apostrophes (single quotes). Appendix 1 gives this and other free-format rules for supplying parameters, which are parsed in Fortran. When several parameters are required following a command, any of them may be omitted by replacing them with null fields (see appendix 1). A null field leaves that parameter unchanged from its current or default value. When you start HYPOINVERSE, default values are in effect for all parameters except file names. Hypoinverse is a complicated program with many features and options. Many of these "advanced" or seldom-used features are documented here, but are more detailed than a typical user needs to read about when first starting with the program. I have put some of this material in smaller type so that a first-time user can concentrate on the more important information.
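
    Putting these conventions together, a command file (run with @filename) might look like the sketch below: the first line turns on Y2000 formats with the “200” command, the next lines name station, crust-model, phase, and summary files as quoted strings, and LOC locates the events. The command names are given as the author recalls them from the Hypoinverse manual and should be treated as assumptions to verify against the current documentation.

```
200 T 1900 0
STA 'allsta.sta'
CRH 1 'model1.crh'
PHS 'quakes.phs'
SUM 'quakes.sum'
LOC
```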

  12. A program to generate a Fortran interface for a C++ library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Lee

    Shroud is a utility to create a Fortran and C interface for a C++ library. An existing C++ library API is described in an input file. Shroud reads the file and creates source files which can be compiled to provide a Fortran API for the library.

  13. Tool for Merging Proposals Into DSN Schedules

    NASA Technical Reports Server (NTRS)

    Khanampornpan, Teerapat; Kwok, John; Call, Jared

    2008-01-01

    A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.

  14. Structural/aerodynamic Blade Analyzer (SAB) User's Guide, Version 1.0

    NASA Technical Reports Server (NTRS)

    Morel, M. R.

    1994-01-01

    The structural/aerodynamic blade (SAB) analyzer provides an automated tool for the static-deflection analysis of turbomachinery blades with aerodynamic and rotational loads. A structural code calculates a deflected blade shape using aerodynamic loads input. An aerodynamic solver computes aerodynamic loads using deflected blade shape input. The two programs are iterated automatically until deflections converge. Currently, SAB version 1.0 is interfaced with MSC/NASTRAN to perform the structural analysis and PROP3D to perform the aerodynamic analysis. This document serves as a guide for the operation of the SAB system with specific emphasis on its use at NASA Lewis Research Center (LeRC). This guide consists of six chapters: an introduction which gives a summary of SAB; SAB's methodology, component files, links, and interfaces; input/output file structure; setup and execution of the SAB files on the Cray computers; hints and tips to advise the user; and an example problem demonstrating the SAB process. In addition, four appendices are presented to define the different computer programs used within the SAB analyzer and describe the required input decks.
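
    The coupling loop described above is a fixed-point iteration between two black-box solvers; a schematic version is sketched below, with toy stand-ins for the MSC/NASTRAN and PROP3D runs.

```python
# Schematic of SAB's coupling loop: alternate an aerodynamic solve
# (shape -> loads) and a structural solve (loads -> shape) until the
# deflected shape stops changing.
def converge(aero_solve, struct_solve, shape0, tol=1e-4, max_iter=25):
    shape = shape0
    for i in range(max_iter):
        loads = aero_solve(shape)
        new_shape = struct_solve(loads)
        if max(abs(a - b) for a, b in zip(new_shape, shape)) < tol:
            return new_shape, i + 1
        shape = new_shape
    raise RuntimeError("no convergence")

# Toy linear stand-ins; the iteration contracts and converges quickly.
shape, iters = converge(lambda s: [0.5 * x + 1.0 for x in s],
                        lambda L: [0.8 * x for x in L],
                        shape0=[0.0, 0.0])
print(shape, iters)
```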

  15. HomozygosityMapper2012--bridging the gap between homozygosity mapping and deep sequencing.

    PubMed

    Seelow, Dominik; Schuelke, Markus

    2012-07-01

    Homozygosity mapping is a common method to map recessive traits in consanguineous families. To facilitate these analyses, we have developed HomozygosityMapper, a web-based approach to homozygosity mapping. HomozygosityMapper allows researchers to directly upload the genotype files produced by the major genotyping platforms as well as deep sequencing data. It detects stretches of homozygosity shared by the affected individuals and displays them graphically. Users can interactively inspect the underlying genotypes, manually refine these regions and eventually submit them to our candidate gene search engine GeneDistiller to identify the most promising candidate genes. Here, we present the new version of HomozygosityMapper. The most striking new feature is the support of Next Generation Sequencing *.vcf files as input. Upon users' requests, we have implemented the analysis of common experimental rodents as well as of important farm animals. Furthermore, we have extended the options for single families and loss of heterozygosity studies. Another new feature is the export of *.bed files for targeted enrichment of the potential disease regions for deep sequencing strategies. HomozygosityMapper also generates files for conventional linkage analyses which are already restricted to the possible disease regions, hence superseding CPU-intensive genome-wide analyses. HomozygosityMapper is freely available at http://www.homozygositymapper.org/.
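
    The core detection step can be shown in miniature: scan one individual's genotype calls for stretches of consecutive homozygous markers. Real inputs come from genotyping platforms or *.vcf files; the genotype strings below are invented.

```python
# Find runs of consecutive homozygous genotype calls ('AA', 'BB', ...).
def homozygous_runs(genotypes, min_len=10):
    runs, start = [], None
    for i, g in enumerate(genotypes + ["AB"]):      # sentinel ends any open run
        if g[0] == g[1] and start is None:
            start = i                               # run begins
        elif g[0] != g[1] and start is not None:
            if i - start >= min_len:
                runs.append((start, i - 1))         # run ends; keep if long enough
            start = None
    return runs

calls = ["AA"] * 12 + ["AB"] + ["BB"] * 15 + ["AB"]
print(homozygous_runs(calls, min_len=10))           # [(0, 11), (13, 27)]
```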

  16. Description of SHARC-2, the Strategic High-Altitude Atmospheric Radiance Code.

    DTIC Science & Technology

    1991-03-22

Fragmentary snippet from the report's table of contents and body. Recoverable content: rules for reaction cards and for auxiliary information cards; the SHARC CO molecular states input file (Table 8), whose entries contain three numbers identifying the particular vibrational state using the standard AFGL convention; and the note that the ion pair production rate is obtained from the energy deposition rate by assuming that 35 eV are required to produce an ion pair, as in AARC.

  17. Damage Tolerance Predictions for Spar Web Cracking in a Diminishing Stress Field

    DTIC Science & Technology

    2011-12-01

Fragmentary snippet from the report's figure list and body. Recoverable content: NASGRO material file inputs for 7075-T6 aluminum (Fig. 28); AFGROW analyses; 2024-T3511 aluminum end caps riveted to stiffened 7075-T6 sheet-metal aluminum webs, with the cap-to-web attachment consisting of a double row of MS20470D8 rivets; and an analysis case in which cracks are assumed to grow while the section stress is held constant.

  18. Navy Occupational Health Information Management System (NOHIMS). Environmental Exposure Module. Users’ Manual

    DTIC Science & Technology

    1987-01-16

Fragmentary snippet. Recoverable content: the system manages menus, controls user and device access, and manages the security features associated with menus, devices, and users; and the Environmental Exposure (EE) module contains many menu options, covering both input processes (Section 3.0) and output processes, which enable the user to enter data and obtain needed information from the module.

  19. 76 FR 12155 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63969; File No. SR-BATS-2011-007] Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change by BATS Exchange, Inc. to Adopt BATS Rule 11.21, entitled ``Input of Accurate Information...

  20. nem_spread Ver. 5.10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HENNIGAN, GARY; SHADID, JOHN; SJAARDEMA, GREGORY

    2009-06-08

Nem_spread reads its input command file (default name nem_spread.inp), takes the named ExodusII geometry definition, and spreads the geometry (and optionally results) contained in that file out to a parallel disk system. The decomposition is taken from a scalar Nemesis load balance file generated by the companion utility nem_slice.

  1. Haplotype-Based Genotyping in Polyploids.

    PubMed

    Clevenger, Josh P; Korani, Walid; Ozias-Akins, Peggy; Jackson, Scott

    2018-01-01

Accurate identification of polymorphisms from sequence data is crucial to unlocking the potential of high throughput sequencing for genomics. Single nucleotide polymorphisms (SNPs) are difficult to identify accurately in polyploid crops because the duplicative nature of polyploid genomes leads to low confidence in the true alignment of short reads. Implementing a haplotype-based method that contrasts subgenome-specific sequences leads to higher accuracy of SNP identification in polyploids. To test this method, a large-scale 48K SNP array (Axiom Arachis2) was developed for Arachis hypogaea (peanut), an allotetraploid, in which 1,674 haplotype-based SNPs were included. Results of the array show that 74% of the haplotype-based SNP markers could be validated, which is considerably higher than previous methods used for peanut. The haplotype method has been implemented in a standalone program, HAPLOSWEEP, which takes BAM files and a VCF file as input and identifies haplotype-based markers. Haplotype discovery can be made within single reads or across paired reads, and can leverage long read technology by targeting any length of haplotype. Haplotype-based genotyping is applicable in all allopolyploid genomes and provides confidence in marker identification and in silico-based genotyping for polyploid genomics.
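
    The haplotype idea can be illustrated with simplified read records (start position plus base string, standing in for real BAM entries): a marker is the pair of bases one read shows at two nearby sites, which ties the read to a single subgenome.

    from collections import Counter

    def haplotypes(reads, pos1, pos2):
        """Count two-site haplotypes observed on individual reads."""
        counts = Counter()
        for start, seq in reads:
            if start <= pos1 and pos2 < start + len(seq):  # read spans both sites
                counts[seq[pos1 - start] + seq[pos2 - start]] += 1
        return counts

    reads = [(100, "ACGTACGT"), (102, "GTACGTAA"), (101, "CGTACGTA")]
    print(haplotypes(reads, 103, 106))  # Counter({'TG': 3})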

  2. Data Processing Aspects of MEDLARS

    PubMed Central

    Austin, Charles J.

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287

  4. Goddard Institute for Space Studies (GISS) 3-Dimensional (3-D) Global Tracer Transport Model (DB1006)

    DOE Data Explorer

    Fung, I.

    1993-01-01

    This directory contains the input files used in simulations of atmospheric CO2 using the GISS 3-D global tracer transport model. The directory contains 16 files including a help file (CO2FUNG.HLP), 12 files containing monthly exchanges with vegetation and soils (CO2VEG.JAN - DEC), 1 file containing releases of CO2 from fossil fuel burning (CO2FOS.MRL), 1 file containing releases of CO2 from land transformations (CO2DEF.HOU), and 1 file containing the patterns of CO2 exchange with the oceans (CO2OCN.TAK).

  5. C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes

    NASA Astrophysics Data System (ADS)

    Rutter, M. J.

    2018-04-01

The c2x code fills two distinct roles. Its first role is to act as a converter between the binary-format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis, and manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which makes it easy to extend to work directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats usable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so it is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.

  6. STS-9 BET products

    NASA Technical Reports Server (NTRS)

    Findlay, J. T.; Kelly, G. M.; Heck, M. L.; Mcconnell, J. G.; Henry, M. W.

    1984-01-01

The final products generated for STS-9, which landed on December 8, 1983, are reported. The trajectory reconstruction utilized an anchor epoch (GMT) corresponding to an initial altitude of h = 356 kft, selected in view of the limited tracking coverage available. The final state utilized IMU2 measurements and was based on processing radar tracking from six C-band stations and a single S-band station, plus six photo-theodolite cameras in the vicinity of Runway 17 at Edwards Air Force Base. The final atmosphere (FLAIR9/UN=581199C) was based on a composite of the remotely measured data and the 1978 Air Force Reference Atmosphere model. The Extended BET is available as STS9BET/UN=274885C. The AEROBET and MMLE input files created are discussed. Plots of the more relevant parameters from the AEROBET (reel number NL0624) are included. Input parameters, final residual plots, a trajectory listing, and data archival information are defined.

  7. Construct User Guide

    DTIC Science & Technology

    2012-11-01

Fragmentary snippet. Recoverable content: a cited reference ('...interactions in Construct: an empirical validation using calibrated grounding,' 2007 BRIMS Conference Proceedings, Norfolk, VA; Simon, H. A...); output files are identified by path name; users who have opened any output files (e.g., in Excel to view them) should close the files before running; stringvars delimit string variables; and a 'Common Gotchas' note stating that if Construct is unable to open an input file, it will exit and close.

  8. Building accurate historic and future climate MEPDG input files for Louisiana DOTD : tech summary.

    DOT National Transportation Integrated Search

    2017-02-01

The new pavement design process (originally MEPDG, then DARWin-ME, and now Pavement ME Design) requires two types of inputs to influence the prediction of pavement distress for a selected set of pavement materials and structure. One input is tra...

  9. AN ADA NAMELIST PACKAGE

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist Package reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the non-generic opening portion. The opening portion declares a variety of user-accessible constants, variables, and subprograms. The subprograms include procedures for initializing namelists for reading and for reading and writing strings, as well as functions for analyzing the content of the current dataset and diagnosing errors. Two nested generic packages follow the opening portion. The first generic package contains procedures that read and write objects of scalar type. The second contains subprograms that read and write one- and two-dimensional arrays whose components are of scalar type and whose indices are of either of the two discrete types (integer or enumeration). Subprograms in the second package also read and write vector and matrix slices. The Ada Namelist ASCII text files are available on a 360k 5.25" floppy disk written on an IBM PC/AT running under the PC DOS operating system. The largest subprogram in the package requires 150k of memory. The package was developed using VAX Ada v. 1.5 under DEC VMS v. 4.5. It should be portable to any validated Ada compiler. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  10. User Guide for HUFPrint, A Tabulation and Visualization Utility for the Hydrogeologic-Unit Flow (HUF) Package of MODFLOW

    USGS Publications Warehouse

    Banta, Edward R.; Provost, Alden M.

    2008-01-01

    This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.

  11. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  12. GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System

    DOE Data Explorer

    James Menart

    2013-06-07

This file contains a zipped archive of the files required to run GEO2D, a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring; GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures, and heat pump performance. In addition, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system or a comparable gas, oil, or propane heating system with a vapor compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software ENERGYPLUS; this is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user-friendly program that uses a graphical user interface for inputs and outputs, which makes entering data simple and produces many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If MATLAB is not available on your computer, you can download the 64-bit version of MCRInstaller.exe from the MATLAB website or from this geothermal depository; this free download will enable you to run GEO2D.
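
    As an illustration of the kind of computation involved, here is a single explicit finite-difference step of 2-D heat conduction in the ground; the grid size, properties, and fixed borehole node are placeholders, not GEO2D's actual model.

    import numpy as np

    def step(T, alpha, dx, dt):
        """Explicit FTCS update of the temperature field (interior nodes)."""
        Tn = T.copy()
        Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * dt / dx**2 * (
            T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
            - 4.0 * T[1:-1, 1:-1])
        return Tn

    T = np.full((50, 50), 283.0)   # undisturbed ground temperature, K
    for _ in range(1000):
        T[25, 25] = 300.0          # hold the borehole node at a fixed temperature
        T = step(T, alpha=1e-6, dx=0.05, dt=600.0)  # dt <= dx**2/(4*alpha) for stability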

  13. Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data

    NASA Technical Reports Server (NTRS)

    Schairer, Edward T.

    2001-01-01

    'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.
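
    For in situ calibration, the usual radiometric PSP relation is a linear fit of pressure against intensity ratio, p = A + B * (Iref/I), at the pressure-tap locations; a sketch with made-up tap values:

    import numpy as np

    i_ratio = np.array([0.95, 1.00, 1.08, 1.15])     # Iref/I at tap locations
    p_taps = np.array([96.0, 101.0, 109.5, 116.0])   # tap pressures, kPa

    B, A = np.polyfit(i_ratio, p_taps, 1)            # least-squares line
    pressure = lambda ratio_image: A + B * ratio_image  # apply to the whole image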

  14. TAILSIM Users Guide

    NASA Technical Reports Server (NTRS)

    Hiltner, Dale W.

    2000-01-01

The TAILSIM program uses a 4th order Runge-Kutta method to integrate the standard aircraft equations-of-motion (EOM). The EOM determine three translational and three rotational accelerations about the aircraft's body axis reference system. The forces and moments that drive the EOM are determined from aerodynamic coefficients, dynamic derivatives, and control inputs. Values for these terms are determined from linear interpolation of tables that are a function of parameters such as angle-of-attack and surface deflections. Buildup equations combine these terms and dimensionalize them to generate the driving total forces and moments. Features that make TAILSIM applicable to studies of tailplane stall include modeling of the reversible control system, modeling of the pilot performing a load factor and/or airspeed command task, and modeling of vertical gusts. The reversible control system dynamics can be described as two hinged masses connected by a spring, resulting in a fifth order system. The pilot model is a standard form of lead-lag with a time delay applied to an integrated pitch rate and/or airspeed error feedback. The time delay is implemented by a Pade approximation, while the commanded pitch rate is determined by a commanded load factor. Vertical gust inputs include a single 1-cosine gust and a continuous NASA Dryden gust model. These dynamic models, coupled with the use of a nonlinear database, allow the tailplane stall characteristics, elevator response, and resulting aircraft response to be modeled. A useful output capability of the TAILSIM program is the ability to display multiple post-run plot pages to allow a quick assessment of the time history response. There are 16 plot pages currently available to the user. Each plot page displays 9 parameters. Each parameter can also be displayed individually, in a one-plot-per-page format. For a more refined display of the results the program can also create files of tabulated data, which can then be used by other plotting programs. The TAILSIM program was written straightforwardly, assuming the user would want to change the database tables, the buildup equations, the output parameters, and the pilot model parameters. A separate database file and input file are automatically read in by the program. The use of an include file to set up all common blocks facilitates easy changing of parameter names and array sizes.
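
    A generic sketch of the 4th-order Runge-Kutta step used for the integration; f(t, x), returning the state derivative, stands in for the full force and moment buildup.

    def rk4_step(f, t, x, dt):
        k1 = f(t, x)
        k2 = f(t + dt / 2, [xi + dt / 2 * ki for xi, ki in zip(x, k1)])
        k3 = f(t + dt / 2, [xi + dt / 2 * ki for xi, ki in zip(x, k2)])
        k4 = f(t + dt, [xi + dt * ki for xi, ki in zip(x, k3)])
        return [xi + dt / 6 * (a + 2 * b + 2 * c + d)
                for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

    # Example: damped pitch oscillation with state x = [theta, q]
    f = lambda t, x: [x[1], -4.0 * x[0] - 0.5 * x[1]]
    x, t, dt = [0.1, 0.0], 0.0, 0.01
    for _ in range(500):
        x = rk4_step(f, t, x, dt)
        t += dt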

  15. Encryption and decryption using FPGA

    NASA Astrophysics Data System (ADS)

    Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.

    2017-11-01

In this paper, we perform multiple cryptography methods on a set of data and compare their outputs; the AES and RSA algorithms are used. Using the AES algorithm, an 8-bit input (plain text) is encrypted with a cipher key and the result is displayed on Tera Term (serially). For simulation, a 128-bit input is operated on with a 128-bit cipher key to generate encrypted text; the reverse operations are then performed to obtain the decrypted text. In the RSA algorithm, file handling is used to input the plain text, which is then operated on to produce the encrypted and decrypted data, both of which are stored in a file. Finally, the results of the two algorithms are compared.
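
    The RSA half of the comparison reduces to modular exponentiation (square-and-multiply); a toy sketch with a deliberately tiny key, purely for illustration:

    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)    # n = 3233, phi = 3120
    e = 17                               # public exponent, coprime with phi
    d = pow(e, -1, phi)                  # private exponent: 2753 (Python 3.8+)

    message = 65
    cipher = pow(message, e, n)          # encrypt: c = m^e mod n
    assert pow(cipher, d, n) == message  # decrypt: m = c^d mod n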

  16. Banning standard cell engineering notebook

    NASA Technical Reports Server (NTRS)

    1976-01-01

A family of standardized thick-oxide P-MOS building blocks (standard cells) is described. The information is presented in a form useful for systems design, logic design, and the preparation of inputs to both sets of Design Automation programs for array design and analysis. A data sheet is provided for each cell and gives the cell name, the cell number, its logic symbol, Boolean equation, truth table, circuit schematic, circuit composite, input-output capacitances, and revision date. The circuit type file, also given for each cell, together with the logic drawing contained on the data sheet, provides all the information required to prepare input data files for the Design Automation Systems. A detailed description of the electrical design procedure is included.

  17. Standard interface files and procedures for reactor physics codes, version III

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, B.M.

    Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized. (15 references) (auth)

  18. Blade loss transient dynamics analysis. Volume 3: User's manual for TETRA program

    NASA Technical Reports Server (NTRS)

    Black, G. R.; Gallardo, V. C.; Storace, A. S.; Sagendorph, F.

    1981-01-01

The user's manual for TETRA contains program logic, flow charts, error messages, input sheets, modeling instructions, option descriptions, input variable descriptions, and demonstration problems. The process of obtaining a NASTRAN 17.5 generated modal input file for TETRA is also described with a worked sample.

  19. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  20. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.

  1. Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2003-01-01

This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.

  2. Detection and segmentation of multiple touching product inspection items

    NASA Astrophysics Data System (ADS)

    Casasent, David P.; Talukder, Ashit; Cox, Westley; Chang, Hsuan-Ting; Weber, David

    1996-12-01

X-ray images of pistachio nuts on conveyor trays for product inspection are considered. The first step in such a processor is to locate each individual item and place it in a separate file for input to a classifier that determines the quality of each nut. This paper considers new techniques to: detect each item (each nut can be in any orientation; we employ new rotation-invariant filters to locate each item independent of its orientation); produce separate image files for each item (a new blob coloring algorithm provides this for isolated, non-touching input items); segment touching or overlapping input items into separate image files (we use a morphological watershed transform to achieve this, as sketched below); and apply morphological processing to remove the shell and produce an image of only the nutmeat. Each of these operations and algorithms is detailed, and quantitative data for each are presented for the x-ray nut inspection problem noted. These techniques are of general use in many different product inspection problems in agriculture and other areas.
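
    A sketch of the touching-item split using SciPy and scikit-image as stand-ins for the paper's custom processing: a distance transform seeds a marker-based watershed that separates merged blobs.

    import numpy as np
    from scipy import ndimage
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def split_touching(binary):
        """Label touching objects in a boolean mask via watershed."""
        distance = ndimage.distance_transform_edt(binary)
        peaks = peak_local_max(distance, min_distance=5, labels=binary)
        markers = np.zeros(binary.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)  # one seed per item
        return watershed(-distance, markers, mask=binary)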

  3. EPA FRS Facilities State Single File CSV Download

    EPA Pesticide Factsheets

    This page provides state comma separated value (CSV) files containing key information of all facilities and sites within the Facility Registry System (FRS). Each state zip file contains a single CSV file of key facility-level information.

  4. Quantitative evaluation of apically extruded debris with different single-file systems: Reciproc, F360 and OneShape versus Mtwo.

    PubMed

    Bürklein, S; Benten, S; Schäfer, E

    2014-05-01

Aim: to assess in a laboratory setting the amount of apically extruded debris associated with different single-file nickel-titanium instrumentation systems compared to one multiple-file rotary system. Eighty human mandibular central incisors were randomly assigned to four groups (n = 20 teeth per group). The root canals were instrumented according to the manufacturers' instructions using the reciprocating single-file system Reciproc, the single-file rotary systems F360 and OneShape, and the multiple-file rotary Mtwo instruments. The apically extruded debris was collected and dried in pre-weighed glass vials. The amount of debris was assessed with a microbalance and statistically analysed using ANOVA and the post hoc Student-Newman-Keuls test. The time required to prepare the canals with the different instruments was also recorded. Reciproc produced significantly more debris compared to all other systems (P < 0.05). No significant difference was noted between the two single-file rotary systems and the multiple-file rotary system (P > 0.05). Instrumentation with the three single-file systems was significantly faster than with Mtwo (P < 0.05). Under the conditions of this study, all systems caused apical debris extrusion. Rotary instrumentation was associated with less debris extrusion compared to reciprocating instrumentation. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  5. FMC: a one-liner Python program to manage, classify and plot focal mechanisms

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, José A.

    2014-05-01

The analysis of earthquake focal mechanisms (or seismic moment tensor, SMT) is a key tool in seismotectonics research. Each focal mechanism is characterized by several location parameters of the earthquake hypocenter, the earthquake size (magnitude and scalar moment tensor), and some geometrical characteristics of the rupture (nodal plane orientations, SMT components, and/or SMT main axes orientations). The aim of FMC is to provide a simple but powerful tool to manage focal mechanism data. The data should be input to the program in one of two of the focal mechanism formatting options of the GMT (Generic Mapping Tools) package (Wessel and Smith, 1998): the Harvard CMT convention and the single nodal plane Aki and Richards (1980) convention. The former is an SMT format that can be downloaded directly from the Global CMT site (http://www.globalcmt.org/), while the latter is the simplest way to describe earthquake rupture data. FMC is programmed in the Python language, which is distributed as Open Source GPL-compatible and can therefore be used to develop Free Software. Python runs on almost any machine and has wide support and presence in any operating system. The program has been conceived with the modularity and versatility of the classical UNIX-like tools. It is called from the command line and can be easily integrated into shell scripts (*NIX systems) or batch files (DOS/Windows systems). The program input and outputs can be done by means of ASCII files or using standard input (or redirection "<"), standard output (screen or redirection ">"), and pipes ("|"). By default FMC will read the input and write the output as a Harvard CMT (psmeca-formatted) ASCII file, although other formats can be used. Optionally FMC will produce a classification diagram representing the rupture type of the focal mechanisms processed. In order to provide a detailed classification of the focal mechanisms, I decided to classify them into a series of fields that include the oblique-slip regimes. This approximation is similar to the Johnston et al. (1994) classification, with 7 classes of earthquakes: 1) Normal; 2) Normal - Strike-slip; 3) Strike-slip - Normal; 4) Strike-slip; 5) Strike-slip - Reverse; 6) Reverse - Strike-slip; and 7) Reverse. FMC uses this classification by default in the resulting diagram, based on the Kaverina et al. (1996) projection, which improves the Frohlich and Apperson (1992) ternary diagram.
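
    A much-simplified illustration of rupture-type classification, here from the rake angle of one nodal plane only; FMC itself classifies using the positions of the P, T, and B axes on the Kaverina et al. projection.

    def rupture_class(rake):
        """Crude faulting style from rake in degrees (Aki & Richards sign)."""
        r = (rake + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        if -120.0 <= r <= -60.0:
            return "Normal"
        if 60.0 <= r <= 120.0:
            return "Reverse"
        if abs(r) <= 30.0 or abs(r) >= 150.0:
            return "Strike-slip"
        return "Oblique"

    print(rupture_class(-85.0))  # Normal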

  6. Improvement of Michigan climatic files in pavement ME design.

    DOT National Transportation Integrated Search

    2015-10-01

Climatic inputs have a great influence on Mechanistic-Empirical design results of flexible and rigid pavements. Currently the state of Michigan has 24 climatic files embedded in Pavement ME Design (PMED), but several limitations have been identif...

  7. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

The subject of input/output (I/O) was often neglected in the design of parallel computer systems, although for many problems I/O rates will limit the speedup attainable. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes, and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations is proposed, and implementations using multiple storage devices are suggested. Problem areas are also identified and discussed.

  8. MF2KtoMF05UC, a Program To Convert MODFLOW-2000 Files to MODFLOW-2005 and UCODE_2005 Files

    USGS Publications Warehouse

    Harbaugh, Arlen W.

    2007-01-01

    The program MF2KtoMF05UC has been developed to convert MODFLOW-2000 input files for use by MODFLOW-2005 and UCODE_2005. MF2KtoMF05UC was written in the Fortran 90 computer language. This report documents the use of MF2KtoMF05UC.

  9. NIH Seeks Input on In-patient Clinical Research Areas | Division of Cancer Prevention

    Cancer.gov

    [[{"fid":"2476","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of the National Institutes of Health Clinical Center (Building 10) in Bethesda, Maryland.","field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of

  10. Structural tailoring of advanced turboprops (STAT): User's manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1991-01-01

    This user's manual describes the Structural Tailoring of Advanced Turboprops program. It contains instructions to prepare the input for optimization, blade geometry and analysis, geometry generation, and finite element program control. In addition, a sample input file is provided as well as a section describing special applications (i.e., non-standard input).

  11. Advanced Technology Multiple Criteria Decision Model.

    DTIC Science & Technology

    1981-11-01

Fragmentary snippet. Recoverable content: data files include ratings of the system parameters and (3) HEADER, which contains information on the structure of the problem and titles; two supporting programs develop these files, with formats given in Section V.2; a data structure tables section describes the data files used in the system selection model program; one file is output by the supporting program PPP and serves as an input file to UPPP and SSMP (Figure 13 shows the structure of this file); and UPP is the user's preference package.

  12. Role Discovery in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    RolX takes the features from Re-FeX or any other feature matrix as input and outputs role assignments (clusters). The output of RolX is a csv file containing the node-role memberships and a csv file containing the role-feature definitions.

  13. Exposure Related Dose Estimating Model

    EPA Science Inventory

    ERDEM is a physiologically based pharmacokinetic (PBPK) modeling system consisting of a general model and an associated front end. An actual model is defined when the user prepares an input command file. Such a command file defines the chemicals, compartments and processes that...

  14. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    PubMed

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

The aim of this study is to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A - Rotary: A1 - Neolix and A2 - OneShape, and Group B - Reciprocating: B1 - WaveOne and B2 - Reciproc. Preoperative scans of each were taken, followed by conventional access cavity preparation and working length determination with a size 10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, which was measured in four surfaces, i.e., facial, lingual, mesial, and distal, for all the groups in the two obtained scans. There was no significant difference found between the rotary single-file systems and the reciprocating single-file systems in their effect on PCD, but in Group B2 there was the most significant loss of tooth structure in the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc single-file system removes more PCD as compared to the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  15. User Requirements Analyzer (URA) User’s Manual H6180/Multics/Version 3.3.

    DTIC Science & Technology

    1978-07-01

Fragmentary table-of-contents snippet. Recoverable content: sections on entering data into an input file (4.3), using NAME-GEN (4.4), using PUNCH files (4.5), and receiving output from URA commands (5); listed reports and commands include CONSISTENCY REPORT, KWIC INDEX, LIST-CHANGES report, NAME-GEN, NAME-LIST, PICTURE, PROCESS CHAIN, FREQUENCY, HELP, INPUT-PSL, and INTERVAL-CONSISTENCY.

  16. Addendum I, BIOPLUME III Graphics Conversion to SURFER Format

    EPA Pesticide Factsheets

    This procedure can be used to create a SURFER® compatible grid file from Bioplume III input and output graphics. The input data and results from Bioplume III can be contoured and printed directly from SURFER.

  17. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    USGS Publications Warehouse

    Bradley, D. Nathan

    2012-01-01

The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an "indirect measurement" because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a "direct" measurement of discharge during the flood, where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command-line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform "wrapper" application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data, develops a plan-view plot, a water-surface profile, and cross-section plots, and builds the SAC input file. The SAC GUI also develops HEC-2 files that can be imported into HEC-RAS.
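
    For a single cross section, the slope-area estimate reduces to Manning's equation with conveyance K, Q = K * S^(1/2) (US customary units); SAC itself balances energy across multiple sections and iterates on velocity head. A hedged sketch with illustrative numbers:

    def conveyance(area_ft2, wetted_perim_ft, n):
        r = area_ft2 / wetted_perim_ft               # hydraulic radius, ft
        return (1.486 / n) * area_ft2 * r ** (2.0 / 3.0)

    def slope_area_discharge(area_ft2, wetted_perim_ft, n, slope):
        return conveyance(area_ft2, wetted_perim_ft, n) * slope ** 0.5

    q = slope_area_discharge(480.0, 110.0, n=0.035, slope=0.002)  # ~2400 cfs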

  18. A Massively Parallel Computational Method of Reading Index Files for SOAPsnv.

    PubMed

    Zhu, Xiaoqian; Peng, Shaoliang; Liu, Shaojie; Cui, Yingbo; Gu, Xiang; Gao, Ming; Fang, Lin; Fang, Xiaodong

    2015-12-01

SOAPsnv is the software used for identifying single nucleotide variations in cancer genes. However, its performance is yet to match the massive amount of data to be processed. Experiments reveal that the main performance bottleneck of the SOAPsnv software is the pileup algorithm: the original pileup algorithm's I/O process is time-consuming and reads input files inefficiently, and its scalability is also poor. Therefore, we designed a new algorithm, named BamPileup, aimed at improving sequential-read performance; the new pileup algorithm implements a parallel read mode based on an index. Using this method, each thread can directly read the data starting from a specific position. The results of experiments on the Tianhe-2 supercomputer show that, when reading data in a multi-threaded parallel I/O way, the processing time of the algorithm is reduced to 3.9 s and the application program can achieve a speedup of up to 100×. Moreover, the scalability of the new algorithm is also satisfactory.
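
    The indexed parallel-read idea can be sketched with threads that each seek to a precomputed byte offset and read only their slice of the file; in the real tool the offsets would come from the BAM index rather than even spacing.

    import threading

    def read_slice(path, start, size, results, i):
        with open(path, "rb") as f:
            f.seek(start)              # jump straight to this thread's region
            results[i] = f.read(size)

    def parallel_read(path, file_size, n_threads=4):
        chunk = file_size // n_threads
        results = [None] * n_threads
        threads = [threading.Thread(
                       target=read_slice,
                       args=(path, i * chunk,
                             chunk if i < n_threads - 1 else file_size - i * chunk,
                             results, i))
                   for i in range(n_threads)]
        for t in threads: t.start()
        for t in threads: t.join()
        return b"".join(results)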

  19. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    2014-10-03

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  20. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    1998-03-01

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  1. MISR Level 3 Radiance Versioning

    Atmospheric Science Data Center

    2016-11-04

Fragmentary product-table snippet. Recoverable content: ESDT product file name prefixes MIL3DRD, MIL3MRD, MIL3QRD, and MIL3YRD with their current quality designations; the Data Product Specification Rev K (PDF); and a version note for F02_0007, updated to work with the new format of the input PGE 1 files.

  2. HDF-EOS Dump Tools

    NASA Astrophysics Data System (ADS)

    Prasad, U.; Rahabi, A.

    2001-05-01

The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input.
    - metadmp (HDF-EOS Metadata Dumper) extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output and does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata).
    - heosls (HDF-EOS Contents Dumper) displays the contents of HDF-EOS files, providing detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects.
    - asciidmp (HDF-EOS ASCII Dumper) extracts fields from EOS data granules into plain ASCII text. The output of asciidmp should be easily human readable, and with minor editing it can be made ingestible by any application with ASCII import capabilities.
    - bindmp (HDF-EOS Binary Dumper) dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, for example custom software and COTS products.
    - UFM (HDF-EOS User Friendly Metadata) is useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display using a web browser.
    - METCHECK can be invoked from either a Unix or DOS environment with a set of command-line options that a user might use to direct the tool's inputs and output. METCHECK validates the inventory metadata in a .met file, using the descriptor file (.desc) as the reference. The tool takes a .desc file and a .met ODL file as inputs and generates a simple output file containing the results of the checking process.

  3. Attitude profile design program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high level manner. The program then takes these high level commands and executes the maneuvers, outputting the desired information.

  4. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    PubMed

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.

  5. Single file diffusion into a semi-infinite tube.

    PubMed

    Farrell, Spencer G; Brown, Aidan I; Rutenberg, Andrew D

    2015-11-23

    We investigate single file diffusion (SFD) of large particles entering a semi-infinite tube, such as luminal diffusion of proteins into microtubules or flagella. While single-file effects have no impact on the evolution of particle density, we report significant single-file effects for individually tracked tracer particle motion. Both exact and approximate ordering statistics of particles entering semi-infinite tubes agree well with our stochastic simulations. Considering initially empty semi-infinite tubes, with particles entering at one end starting from an initial time t = 0, tracked particles are initially super-diffusive after entering the system, but asymptotically diffusive at later times. For finite time intervals, the ratio of the net displacement of individual single-file particles to the average displacement of untracked particles is reduced at early times and enhanced at later times. When each particle is numbered, from the first to enter (n = 1) to the most recent (n = N), we find good scaling collapse of this distance ratio for all n. Experimental techniques that track individual particles, or local groups of particles, such as photo-activation or photobleaching of fluorescently tagged proteins, should be able to observe these single-file effects. However, biological phenomena that depend on local concentration, such as flagellar extension or luminal enzymatic activity, should not exhibit single-file effects.
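
    A bare-bones version of the stochastic simulation described: particles enter a semi-infinite 1-D lattice at the mouth and hop symmetrically but never pass one another. Entry and hop probabilities are illustrative only.

    import random

    def simulate_sfd(steps, entry_prob=0.2, hop_prob=0.5):
        sites = []  # particle positions in increasing depth order
        for _ in range(steps):
            if random.random() < entry_prob and (not sites or sites[0] > 0):
                sites.insert(0, 0)  # a new particle enters at the tube mouth
            for i in range(len(sites)):
                if random.random() < hop_prob:
                    new = sites[i] + random.choice((-1, 1))
                    blocked = (new < 0
                               or (i > 0 and new <= sites[i - 1])
                               or (i + 1 < len(sites) and new >= sites[i + 1]))
                    if not blocked:
                        sites[i] = new  # hop only if the move preserves the ordering
            # displacements of individually tracked particles can be recorded here
        return sites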

  6. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

TOUGH2 and iTOUGH2 are powerful models that simulate the heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis, and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of rectangular computational meshes, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  8. ICEG2D (v2.0) - An Integrated Software Package for Automated Prediction of Flow Fields for Single-Element Airfoils With Ice Accretion

    NASA Technical Reports Server (NTRS)

    Thompson, David S.; Soni, Bharat K.

    2001-01-01

    An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured grid CFD solver NPARC or the generalized grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated either in a batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.

  9. Parallel sort with a ranged, partitioned key-value store in a high performance computing environment

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.

    2016-01-26

    Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
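
    The patent's pipeline (local sort per reader, range partitioning, per-range merge, concatenation) can be illustrated in a single process; the sketch below is a plain-Python stand-in with no threads or range servers, and the helper names are ours.

        # Sketch of a ranged, partitioned parallel sort: each "reader" locally
        # sorts its file, splits it by key range, each "range server" merges
        # its subsets, and the sorted ranges concatenate into a global result.
        import bisect, heapq, itertools

        def parallel_range_sort(input_files, boundaries):
            """boundaries [b0, b1, ...] define ranges (-inf,b0), [b0,b1), ..., [bn,inf)."""
            subsets = [[] for _ in range(len(boundaries) + 1)]  # sorted runs per range
            for records in input_files:            # one "reader thread" per input file
                run = sorted(records)              # local sort of unsorted key data
                start = 0
                for r, b in enumerate(boundaries): # split the run into ranged subsets
                    end = bisect.bisect_left(run, b, start)
                    subsets[r].append(run[start:end])
                    start = end
                subsets[-1].append(run[start:])
            # each "range server" merges its already-sorted subsets
            ranges = [list(heapq.merge(*runs)) for runs in subsets]
            return list(itertools.chain.from_iterable(ranges))  # concatenate ranges

        print(parallel_range_sort([[5, 1, 42], [17, 3, 99]], boundaries=[10, 50]))
        # -> [1, 3, 5, 17, 42, 99]

    The key property the patent exploits is that each range can be merged independently, so the final concatenation needs no further comparisons.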

  10. Ada Namelist Package

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Ada Namelist Package, developed for Ada programming language, enables calling program to read and write FORTRAN-style namelist files. Features are: handling of any combination of types defined by user; ability to read vectors, matrices, and slices of vectors and matrices; handling of mismatches between variables in namelist file and those in programmed list of namelist variables; and ability to avoid searching entire input file for each variable. Principal benefits derived by user: ability to read and write namelist-readable files, ability to detect most file errors in initialization phase, and organization keeping number of instantiated units to few packages rather than to many subprograms.
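
    For readers unfamiliar with the format, a FORTRAN namelist group looks like &GROUP NAME=VALUE, ... /. The sketch below is a deliberately minimal Python stand-in for the package's purpose (not its Ada API): it handles only scalar NAME=VALUE pairs and omits vectors, matrices, and slices.

        # Minimal sketch of reading one FORTRAN-style namelist group.
        import re

        def read_namelist(text, group):
            m = re.search(rf"&{group}\b(.*?)/", text, re.IGNORECASE | re.DOTALL)
            if m is None:
                raise KeyError(f"namelist group {group} not found")
            values = {}
            for name, raw in re.findall(r"(\w+)\s*=\s*([^,/\n]+)", m.group(1)):
                raw = raw.strip()
                try:   # numeric values: float if it looks real, else integer
                    values[name.upper()] = (float(raw) if "." in raw or "e" in raw.lower()
                                            else int(raw))
                except ValueError:
                    values[name.upper()] = raw.strip("'\"")   # quoted string value
            return values

        sample = "&INPUT  N = 3, DT = 0.25, LABEL = 'run1' /"
        print(read_namelist(sample, "INPUT"))   # {'N': 3, 'DT': 0.25, 'LABEL': 'run1'}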

  11. West Flank Coso, CA FORGE 3D geologic model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    This is an x,y,z file of the West Flank FORGE 3D geologic model. The model was created in Earthvision by Dynamic Graphics Inc. and was constructed with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with X,Y,Z grid points. All the relevant information is in the file header (the spatial reference, the projection, etc.). In addition, all the fields in the data file are identified in the header.

  12. Fallon FORGE 3D Geologic Model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    An x,y,z scattered data file for the 3D geologic model of the Fallon FORGE site. The model was created in Earthvision by Dynamic Graphics Inc. and was constructed with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with X,Y,Z grid points. All the relevant information is in the file header (the spatial reference, the projection, etc.). In addition, all the fields in the data file are identified in the header.

  13. 76 FR 63575 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  14. 76 FR 63554 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  15. 77 FR 11394 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... written in FORTRAN and used simple text files for data input and output, MOVES is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables.\\13\\ \\13\\ Some...

  16. UPEML Version 3.0: A machine-portable CDC update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Haill, T.A.

    1992-04-01

    UPEML is a machine-portable program that emulates a subset of the functions of the standard CDC Update. Machine-portability has been achieved by conforming to ANSI standards for Fortran-77. UPEML is compact and fairly efficient; however, it only allows a restricted syntax as compared with the CDC Update. This program was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX/VMS mainframes and UNIX workstations. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both UNICOS and CTSS operating systems, and on Sun, HP, Stardent and IBM workstations. UPEML was originally released with the ITS electron/photon Monte Carlo transport package, which was developed on a CDC-7600 and makes extensive use of conditional file structure to combine several problem geometry and machine options into a single program file. UPEML 3.0 is an enhanced version of the original code and is being independently released for use at any installation or with any code package. Version 3.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the complete file. Version 3.0 also checks for overlapping corrections, allows processing of nested calls to common decks, and allows the use of alternate files in READ and ADDFILE commands. Finally, UPEML Version 3.0 allows the assignment of input and output files at runtime on the control line.

  17. UPEML Version 3.0: A machine-portable CDC update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Haill, T.A.

    1992-04-01

    UPEML is a machine-portable program that emulates a subset of the functions of the standard CDC Update. Machine-portability has been achieved by conforming to ANSI standards for Fortran-77. UPEML is compact and fairly efficient; however, it only allows a restricted syntax as compared with the CDC Update. This program was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX/VMS mainframes and UNIX workstations. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both UNICOS and CTSS operating systems, and on Sun, HP, Stardent and IBM workstations. UPEML was originally released with the ITS electron/photon Monte Carlo transport package, which was developed on a CDC-7600 and makes extensive use of conditional file structure to combine several problem geometry and machine options into a single program file. UPEML 3.0 is an enhanced version of the original code and is being independently released for use at any installation or with any code package. Version 3.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the complete file. Version 3.0 also checks for overlapping corrections, allows processing of nested calls to common decks, and allows the use of alternate files in READ and ADDFILE commands. Finally, UPEML Version 3.0 allows the assignment of input and output files at runtime on the control line.

  18. SutraPrep, a pre-processor for SUTRA, a model for ground-water flow with solute or energy transport

    USGS Publications Warehouse

    Provost, Alden M.

    2002-01-01

    SutraPrep facilitates the creation of three-dimensional (3D) input datasets for the USGS ground-water flow and transport model SUTRA Version 2D3D.1. It is most useful for applications in which the geometry of the 3D model domain and the spatial distribution of physical properties and boundary conditions are relatively simple. SutraPrep can be used to create a SUTRA main input (.inp) file, an initial conditions (.ics) file, and a 3D plot of the finite-element mesh in Virtual Reality Modeling Language (VRML) format. Input and output are text-based. The code can be run on any platform that has a standard FORTRAN-90 compiler. Executable code is available for Microsoft Windows.

  19. Developing a Complete and Effective ACT-R Architecture

    DTIC Science & Technology

    2008-01-01

    of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an

  20. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
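
    The pattern ExcelAutomat automates (scan many output files, parse one quantity from each, compile a table) can be sketched outside VBA as well. In the Python stand-in below, the "SCF Done:" marker is a real Gaussian log convention; the file pattern, output name, and everything else are illustrative assumptions.

        # Sketch: parse the final SCF energy from each Gaussian-style log file
        # in a folder and compile the results into one CSV table.
        import csv, glob, re

        def compile_energies(pattern="*.log", out_csv="energies.csv"):
            rows = []
            for path in sorted(glob.glob(pattern)):
                energy = None
                with open(path) as fh:
                    for line in fh:
                        m = re.search(r"SCF Done:\s+E\(\S+\)\s*=\s*(-?\d+\.\d+)", line)
                        if m:
                            energy = float(m.group(1))   # keep the last occurrence
                rows.append((path, energy))
            with open(out_csv, "w", newline="") as fh:
                csv.writer(fh).writerows([("file", "scf_energy_hartree")] + rows)

        compile_energies()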

  1. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.

    PubMed

    Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  2. Sam2bam: High-Performance Framework for NGS Data Preprocessing Tools

    PubMed Central

    Cheng, Yinhe; Tzeng, Tzy-Hwa Kathy

    2016-01-01

    This paper introduces a high-throughput software tool framework called sam2bam that enables users to significantly speed up pre-processing for next-generation sequencing data. The sam2bam is especially efficient on single-node multi-core large-memory systems. It can reduce the runtime of data pre-processing in marking duplicate reads on a single node system by 156–186x compared with de facto standard tools. The sam2bam consists of parallel software components that can fully utilize multiple processors, available memory, high-bandwidth storage, and hardware compression accelerators, if available. The sam2bam provides file format conversion between well-known genome file formats, from SAM to BAM, as a basic feature. Additional features such as analyzing, filtering, and converting input data are provided by using plug-in tools, e.g., duplicate marking, which can be attached to sam2bam at runtime. We demonstrated that sam2bam could significantly reduce the runtime of next generation sequencing (NGS) data pre-processing from about two hours to about one minute for a whole-exome data set on a 16-core single-node system using up to 130 GB of memory. The sam2bam could reduce the runtime of NGS data pre-processing from about 20 hours to about nine minutes for a whole-genome sequencing data set on the same system using up to 711 GB of memory. PMID:27861637
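
    The basic SAM-to-BAM conversion that sam2bam parallelizes can be expressed in a few lines with the third-party pysam library (assumed installed; file names are placeholders). This single-threaded sketch is the baseline sam2bam accelerates by fanning the work out across cores and accelerators.

        # Minimal single-threaded SAM -> BAM conversion using pysam.
        import pysam

        def sam_to_bam(sam_path, bam_path):
            with pysam.AlignmentFile(sam_path, "r") as sam, \
                 pysam.AlignmentFile(bam_path, "wb", template=sam) as bam:
                for read in sam:          # sam2bam would parallelize this loop
                    bam.write(read)

        sam_to_bam("sample.sam", "sample.bam")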

  3. Capturing and Understanding Experiment Provenance using NiNaC

    NASA Astrophysics Data System (ADS)

    Rosati, C.

    2017-12-01

    A problem the model development team faces at the GFDL is determining climate model experiment provenance. Each experiment is configured with at least one configuration file which may reference other files. The experiment then passes through three phases before completion. Configuration files or other input files may be modified between phases. Finding the modifications later is tedious due to the expanse of the experiment input and duplication across phases. Determining provenance may be impossible if any file has been changed or deleted. To reduce these efforts and address these problems, we propose a new toolset, NiNaC, for archiving experiment provenance from the beginning of the experiment to the end and every phase in-between. Each of the three phases, check-out, build, and run, of the experiment depends on the previous phase. We use a graph to model the phase dependencies. Let each phase be represented by a node. Let each edge correspond to a dependency between phases where the node incident with the tail depends on the node incident with the head. It follows that the dependency graph is a tree. We reduce the problem to finding the lowest common ancestor and diffing the successor nodes. All files related to input for a phase are assigned a checksum. A new file is created to aggregate the checksums. Then each phase is assigned a checksum of aforementioned file as an identifier. Any change to part of a phase configuration will create unique checksums in all subsequent phases. Finding differences between experiments with this toolset is as simple as diffing two files containing checksums found by traversing the tree. One new benefit is that this toolset now allows differences in source code to be found after experiments are run, which was previously impossible for executables that cannot be linked to a known version controlled source code. Knowing that these changes exist allows us to give priority to help desk tickets concerning unmodified supported experiment releases, and minimize effort spent on unsupported experiments. It is also possible that a change is made, either by mistake or by system error. NiNaC would find the exact file in the precise phase with the change. In this way, NiNaC makes provenance tracking less tedious and solves problems where tracking provenance may previously have been impossible to do.
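
    The phase-checksum idea above reduces to: hash every input file of a phase, aggregate the hashes into a manifest, chain in the parent phase's identifier, and identify the phase by the manifest's own hash, so any upstream change propagates to every downstream identifier. The sketch below makes our reading of that scheme concrete; the details are assumptions, not NiNaC's actual implementation.

        # Sketch of chained phase identifiers built from file checksums.
        import hashlib

        def file_checksum(path):
            h = hashlib.sha256()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def phase_id(input_paths, parent_id=""):
            manifest = "\n".join(f"{p} {file_checksum(p)}" for p in sorted(input_paths))
            manifest = parent_id + "\n" + manifest     # chain to the previous phase
            return hashlib.sha256(manifest.encode()).hexdigest(), manifest

        # Diffing two experiments then reduces to finding the first phase whose
        # identifiers differ and comparing that phase's manifest lines.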

  4. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  5. MOVES2014 for Experienced Users, September 2014 Webinar Slides

    EPA Pesticide Factsheets

    This webinar assumes a basic knowledge of past versions of the MOtor Vehicle Emission Simulator (MOVES) and includes a demonstration of the conversion of MOVES2010b input files to MOVES2014 format, changes to the MOVES GUI, and new input options.

  6. Building accurate historic and future climate MEPDG input files for Louisiana DOTD.

    DOT National Transportation Integrated Search

    2017-02-01

    The pavement design process (originally MEPDG, then DARWin-ME, and now Pavement ME Design) requires a multi-year set of hourly : climate input data that influence pavement material properties. In Louisiana, the software provides nine locations with c...

  7. TLIFE: a Program for Spur, Helical and Spiral Bevel Transmission Life and Reliability Modeling

    NASA Technical Reports Server (NTRS)

    Savage, M.; Prasanna, M. G.; Rubadeux, K. L.

    1994-01-01

    This report describes a computer program, 'TLIFE', which models the service life of a transmission. The program is written in ANSI standard Fortran 77 and has an executable size of about 157 K bytes for use on a personal computer running DOS. It can also be compiled and executed in UNIX. The computer program can analyze any one of eleven unit transmissions either singly or in a series combination of up to twenty-five unit transmissions. Metric or English unit calculations are performed with the same routines using consistent input data and a units flag. Primary outputs are the dynamic capacity of the transmission and the mean lives of the transmission and of the sum of its components. The program uses a modular approach to separate the load analyses from the system life calculations. The program and its input and output data files are described herein. Three examples illustrate its use. A development of the theory behind the analysis in the program is included after the examples.

  8. Online handwritten mathematical expression recognition

    NASA Astrophysics Data System (ADS)

    Büyükbayrak, Hakan; Yanikoglu, Berrin; Erçil, Aytül

    2007-01-01

    We describe a system for recognizing online, handwritten mathematical expressions. The system is designed with a user interface for writing scientific articles, supporting the recognition of basic mathematical expressions as well as integrals, summations, matrices, etc. A feed-forward neural network recognizes symbols, which are assumed to be single-stroke, and a recursive algorithm parses the expression by combining neural network output and the structure of the expression. Preliminary results show that writer-dependent recognition rates are very high (99.8%) while writer-independent symbol recognition rates are lower (75%). The interface associated with the proposed system integrates the built-in recognition capabilities of Microsoft's Tablet PC API for recognizing textual input and supports conversion of hand-drawn figures into PNG format. This enables the user to enter text, mathematics and draw figures in a single interface. After recognition, all output is combined into one LaTeX code and compiled into a PDF file.

  9. FASTdoop: a versatile and efficient library for the input of FASTA and FASTQ files for MapReduce Hadoop bioinformatics applications.

    PubMed

    Ferraro Petrillo, Umberto; Roscigno, Gianluca; Cattaneo, Giuseppe; Giancarlo, Raffaele

    2017-05-15

    MapReduce Hadoop bioinformatics applications require the availability of special-purpose routines to manage the input of sequence files. Unfortunately, the Hadoop framework does not provide any built-in support for the most popular sequence file formats like FASTA or BAM. Moreover, the development of these routines is not easy, both because of the diversity of these formats and the need to manage efficiently sequence datasets that may count up to billions of characters. We present FASTdoop, a generic Hadoop library for the management of FASTA and FASTQ files. We show that, with respect to analogous input management routines that have appeared in the literature, it offers versatility and efficiency. That is, it can handle collections of reads, with or without quality scores, as well as long genomic sequences, while the existing routines concentrate mainly on NGS sequence data. Moreover, in the domain where a comparison is possible, the routines proposed here are faster than the available ones. In conclusion, FASTdoop is a much needed addition to Hadoop-BAM. The software and the datasets are available at http://www.di.unisa.it/FASTdoop/. Contact: umberto.ferraro@uniroma1.it. Supplementary data are available at Bioinformatics online.
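
    The core difficulty FASTdoop addresses is that FASTA/FASTQ records span multiple lines, so a framework that splits files at arbitrary byte offsets must reassemble record boundaries. The toy Python parser below (FASTdoop itself is a Java/Hadoop library) shows the record-grouping that a naive line-by-line split would break.

        # Toy FASTA parser: group a header line and its sequence lines into
        # one record, rather than treating the file as independent lines.
        def read_fasta(path):
            header, seq = None, []
            with open(path) as fh:
                for line in fh:
                    line = line.rstrip()
                    if line.startswith(">"):
                        if header is not None:
                            yield header, "".join(seq)
                        header, seq = line[1:], []
                    else:
                        seq.append(line)
            if header is not None:
                yield header, "".join(seq)

    In Hadoop terms, each input split must scan forward to the next ">" before it can emit its first complete record, which is exactly the logic a custom InputFormat has to supply.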

  10. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    DTIC Science & Technology

    2001-11-01

    that there were no target misses. The Hellfire missile does not have a depleted uranium head. ... 2.2.2.3 Tank movement During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log

  11. VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)

    NASA Astrophysics Data System (ADS)

    Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.

    2000-02-01

    (1) Primary data files, stages.zz These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz=06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI) where T is temperature, Ne electron density and CHI is abundance multiplier. The files include data for ionisation fractions, for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f. (2) Code add.f This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for names of input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which could be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper. The user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET, averaged over ionisation stages. (3) Files acc.zz Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if it is required to change the subroutines WEIGHTS or ZETA. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f. (4) Code accfit.f This code gives radiative accelerations, and some related data, for a stellar model. Methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance-multiplier CHI on the output file will generally be finer than that used in the input files acc.zz. The mesh to be used is specified on a file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff=10000 K, LOG10(g)=4.2). The output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f. (5) The code diff.f This code reads the output file (e.g. acc1000004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of radiative accelerations, the quantity ZETBAR required for calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data. It creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations. (1 data file).

  12. Program Description: Financial Master File Processor-SWRL Financial System.

    ERIC Educational Resources Information Center

    Ideda, Masumi

    Computer routines designed to produce various management and accounting reports required by the Southwest Regional Laboratory's (SWRL) Financial System are described. Input data requirements and output report formats are presented together with a discussion of the Financial Master File updating capabilities of the system. This document should be…

  13. Runwien: a text-based interface for the WIEN package

    NASA Astrophysics Data System (ADS)

    Otero de la Roza, A.; Luaña, Víctor

    2009-05-01

    A new text-based interface for WIEN2k, the full-potential linearized augmented plane-waves (FPLAPW) program, is presented. This code provides an easy-to-use, yet powerful way of generating arbitrarily large sets of calculations. Thus, properties over a potential energy surface can be calculated and WIEN2k parameters explored using a simple input text file. This interface also provides new capabilities to the WIEN2k package, such as the calculation of elastic constants on hexagonal systems or the automatic gathering of relevant information. Additionally, runwien is modular, flexible and intuitive. Program summary: Program title: runwien Catalogue identifier: AECM_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECM_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL version 3 No. of lines in distributed program, including test data, etc.: 62 567 No. of bytes in distributed program, including test data, etc.: 610 973 Distribution format: tar.gz Programming language: gawk (with locale POSIX or similar) Computer: All running Unix, Linux Operating system: Unix, GNU/Linux Classification: 7.3 External routines: WIEN2k ( http://www.wien2k.at/), GAWK ( http://www.gnu.org/software/gawk/), rename by L. Wall, a Perl script which renames files, modified by R. Barker to check for the existence of target files, gnuplot ( http://www.gnuplot.info/) Subprograms used: Cat Id: ADSY_v1_0/AECB_v1_0, Title: GIBBS/CRITIC, Reference: CPC 158 (2004) 57/CPC 999 (2009) 999 Nature of problem: Creation of a text-based, batch-oriented interface for the WIEN2k package. Solution method: WIEN2k solves the Kohn-Sham equations of a solid using the FPLAPW formalism. Runwien interprets an input file containing the description of the geometry and structure of the solid and drives the execution of the WIEN2k programs. The input is simplified thanks to the default values of the WIEN2k parameters known to runwien. Additional comments: Designed for WIEN2k versions 06.4, 07.2, 08.2, and 08.3. Running time: For the test case (TiC), a single geometry takes 5 to 10 minutes on a typical desktop PC (Intel Pentium 4, 3.4 GHz, 1 GB RAM). The full example including the calculation of the elastic constants and the equation of state, takes 9 hours and 32 minutes.

  14. Development and qualification of additively manufactured parts for space

    NASA Astrophysics Data System (ADS)

    O'Brien, Michael J.

    2018-02-01

    Additive manufacturing (commonly called "3D printing") fabricates the desired final part directly from the input CAD (Computer Aided Design) file by depositing and fusing layer upon layer of the source material. New engineering designs are possible in which a single optimized part with novel topology can replace several traditional parts. The complex physics of metal deposition leads to variations in quality and to new flaws and residual stresses not seen in traditional manufacturing. Additive manufacturing currently has gaps in knowledge. Mission assurance will require: qualification and certification standards; sharing of data in handbooks; predictive models relating processing, microstructure and properties; and development of closed loop process control and non-destructive evaluation to reduce variability.

  15. ESP (External-Stores Program) - A Pilot Computer Program for Determining Flutter-Critical External-Store Configurations. Volume 1. User’s Manual,

    DTIC Science & Technology

    1985-02-01

    [Garbled OCR of a control-card listing.] Figure A-1. Typical Control-Card...initiated via the LINK1 statement, in which the second term is the input data file. The permanent file name KMDM, shown in conjunction with local file

  16. cluster trials v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, John; Castillo, Andrew

    2016-09-21

    This software contains a set of Python modules (input, search, cluster, analysis). These modules read input files containing spatial coordinates and associated attributes, which can be used to perform nearest-neighbor search (spatial indexing via k-d tree), cluster analysis/identification, and calculation of spatial statistics for analysis.
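
    The nearest-neighbor step described above is commonly done with a k-d tree; the sketch below uses SciPy's scipy.spatial.cKDTree. The random points and the fixed distance cutoff for pairing are illustrative assumptions, not choices made by the cluster trials package.

        # Sketch: k-d tree nearest-neighbor queries over spatial coordinates.
        import numpy as np
        from scipy.spatial import cKDTree

        points = np.random.rand(100, 3)        # placeholder x,y,z coordinates
        tree = cKDTree(points)
        dist, idx = tree.query(points, k=2)    # k=2: nearest neighbor besides self
        pairs = tree.query_pairs(r=0.1)        # all point pairs within a 0.1 cutoff
        print(dist[:, 1].mean(), len(pairs))   # mean NN distance, candidate links

    Grouping points whose pairwise distance falls below a cutoff, as query_pairs does, is one simple route from neighbor lists to cluster identification.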

  17. Characterization of single-file diffusion in one-dimensional dusty plasma

    NASA Astrophysics Data System (ADS)

    Theisen, W. L.; Sheridan, T. E.

    2010-11-01

    Single-file diffusion occurs in one-dimensional systems when particles cannot pass each other and the mean-squared displacement (msd) of these particles increases with time t. Diffusive processes that follow Fick's law predict that the msd increases as t; however, single-file diffusion is sub-Fickian, meaning that the msd is predicted to increase as t^1/2. One-dimensional dusty plasma rings have been created under strongly coupled, over-damped conditions. Particle position data from these rings will be analyzed to determine the scaling of the msd with time. Results will be compared with predictions of single-file diffusion theory.
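
    The contrast in scaling can be stated compactly. The notation below is ours (a standard textbook form), with system-dependent prefactors D and F:

        % Fickian versus single-file scaling of the mean-squared displacement
        \langle \Delta x^{2}(t) \rangle = 2 D t        \quad \text{(Fickian)}
        \langle \Delta x^{2}(t) \rangle = 2 F \sqrt{t} \quad \text{(single-file)}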

  18. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at Norwegian Meteorological Institute in cooperation with Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well-defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and co-locating input data, rotating and scaling vectors, and writing model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search and rescue objects.
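
    The "reader module" idea reduces to a narrow interface: any data source implements one method that returns named variables (CF vocabulary) at requested positions and time. The class and method names below are our illustration of that pattern, not OpenDrift's actual API.

        # Sketch of a reader interface: the trajectory model asks for CF-named
        # variables and does not care where the numbers come from.
        class ConstantCurrentReader:
            variables = ["x_sea_water_velocity", "y_sea_water_velocity"]

            def __init__(self, u, v):
                self.u, self.v = u, v

            def get_variables(self, names, lon, lat, time):
                data = {"x_sea_water_velocity": self.u,
                        "y_sea_water_velocity": self.v}
                return {n: data[n] for n in names}

        # The model can advect elements identically whether the reader wraps a
        # NetCDF file, an OPeNDAP server, or a constant field like this one:
        reader = ConstantCurrentReader(u=0.2, v=0.0)
        print(reader.get_variables(["x_sea_water_velocity"], lon=4.5, lat=60.0, time=None))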

  19. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.

  20. NMCS Information Processing System 360 Formatted File System (NIPS 360 FFS). Users Manual. Volume 1. Introduction to File Concepts

    DTIC Science & Technology

    1978-09-01

    input source language (other than ?S). Used in double form, negates an FFT specification for a subroutine. / (Slash) Used to separate numeric digits (V-1...1) represents the digits 1-999 and also digits followed by a letter, e.g., LINE10A. The following name prefixes are not allowed: PSSQ, VSEr, VSZ... [Garbled OCR of an assembly listing: ZERO OUT 6; LA 6,12(6) ADD 12 TO 6, PUT IN REG 6; MOVE INPUT DATE TO WORK AREA, REFORMAT DD AND YY; CONVERT TWO-DIGIT MONTH TO SYMBOLIC THREE]

  1. LocalMove: computing on-lattice fits for biopolymers

    PubMed Central

    Ponty, Y.; Istrate, R.; Porcelli, E.; Clote, P.

    2008-01-01

    Given an input Protein Data Bank file (PDB) for a protein or RNA molecule, LocalMove is a web server that determines an on-lattice representation for the input biomolecule. The web server implements a Markov Chain Monte-Carlo algorithm with simulated annealing to compute an approximate fit for either the coarse-grain model or backbone model on either the cubic or face-centered cubic lattice. LocalMove returns a PDB file as output, as well as a dynamic movie of 3D images of intermediate conformations during the computation. The LocalMove server is publicly available at http://bioinformatics.bc.edu/clotelab/localmove/. PMID:18556754
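
    The Monte-Carlo-with-simulated-annealing core of such a fit is a generic Metropolis loop; the sketch below shows that loop in Python. The energy function (in LocalMove's case, something like the deviation of the lattice fit from the off-lattice coordinates) and the move set are deliberately left abstract, since those specifics are the server's, not ours.

        # Generic simulated-annealing acceptance loop (Metropolis rule).
        import math, random

        def anneal(state, energy, propose, t0=1.0, cooling=0.999, steps=10000):
            e = energy(state)
            t = t0
            for _ in range(steps):
                cand = propose(state)            # local move on the lattice
                de = energy(cand) - e
                # accept downhill moves always, uphill moves with Boltzmann odds
                if de <= 0 or random.random() < math.exp(-de / t):
                    state, e = cand, e + de
                t *= cooling                     # gradually freeze the chain
            return state, e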

  2. A Computer System for a Union Catalog: Theme and Variations *

    PubMed Central

    Felter, Jacqueline W.; Tjoeng, Djoeng S.

    1965-01-01

    This article describes a computer system for the generation and maintenance of a union catalog of periodicals and for printouts of both the entire file and selected portions. Although the system was designed to meet the specifications of the Union Catalog of Medical Periodicals of New York, its use is not limited. Only the basic file maintenance program is indispensable; the subsidiary programs may be used as needed. The scope and content of the catalog are determined by the input. The preparation of the input is described in detail, with comment on the keypunching of library records. Applications to other kinds of catalogs are suggested. PMID:14271111

  3. Recursive Feature Extraction in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    ReFeX extracts recursive topological features from graph data. The input is a graph as a csv file and the output is a csv file containing feature values for each node in the graph. The features are based on topological counts in the neighborhood of each node, as well as recursive summaries of neighbors' features.
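
    The recursion is easy to state: start from local counts (e.g., degree), then repeatedly append aggregates (sums, means) of neighbors' existing features. The sketch below shows that core idea; ReFeX's pruning of correlated features and its exact base features are omitted.

        # Sketch of ReFeX-style recursive feature extraction.
        def refex_features(adj, iterations=2):
            """adj: dict mapping node -> set of neighbor nodes."""
            feats = {v: [float(len(adj[v]))] for v in adj}   # base feature: degree
            for _ in range(iterations):
                width = len(next(iter(feats.values())))
                new = {}
                for v in adj:
                    agg = []
                    for i in range(width):        # aggregate each existing feature
                        vals = [feats[u][i] for u in adj[v]] or [0.0]
                        agg += [sum(vals), sum(vals) / len(vals)]   # sum and mean
                    new[v] = feats[v] + agg
                feats = new
            return feats

        g = {0: {1, 2}, 1: {0}, 2: {0}}
        print(refex_features(g, iterations=1))

    Each iteration doubles-and-adds feature columns, which is why ReFeX prunes highly correlated features between rounds.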

  4. An information retrieval system for research file data

    Treesearch

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  5. Formal Verification of Digital Logic

    DTIC Science & Technology

    1991-12-01

    INVERT circuit was based upon VHDL code provided in the Zycad Reference Manual [32:Ch 10,73]. The other circuits were based upon VHDL code written...HALFADD.PL /* This file implements a simple half-adder that */ /* is built from inverters and 2-input nand gates. */ /* It is based upon a Zycad VHDL file...It is based upon a Zycad VHDL file written by */ /* Capt Dave Banton, which is attached below the */ /* Prolog code. */ *load..in(primitive). %h get nor2

  6. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html.

  7. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html. PMID:23802613

  8. Single mimivirus particles intercepted and imaged with an X-ray laser (CXIDB ID 1)

    DOE Data Explorer

    Seibert, M. Marvin; Ekeberg, Tomas; Maia, Filipe R.N.C.

    2011-02-02

    These are the files used to reconstruct the images in the paper "Single Mimivirus particles intercepted and imaged with an X-ray laser". Besides the diffracted intensities, the Hawk configuration files used for the reconstructions are also provided. The files from CXIDB ID 1 are the pattern and configuration files for the pattern showed in Figure 2a in the paper.

  9. Single mimivirus particles intercepted and imaged with an X-ray laser (CXIDB ID 2)

    DOE Data Explorer

    Seibert, M. Marvin; Ekeberg, Tomas

    2011-02-02

    These are the files used to reconstruct the images in the paper "Single Mimivirus particles intercepted and imaged with an X-ray laser". Besides the diffracted intensities, the Hawk configuration files used for the reconstructions are also provided. The files from CXIDB ID 2 are the pattern and configuration files for the pattern showed in Figure 2b in the paper.

  10. MOVES2014 at the Project Level for Experienced Users, October 2014 Webinar Slides

    EPA Pesticide Factsheets

    This webinar covers the changes that enhance the MOtor Vehicle Emission Simulator at the project scale, changes to its graphical user interface at the project scale, how to convert a MOVES2010b project-level input file to MOVES2014 format, and new input.

  11. Evaluation of canal transportation after preparation with Reciproc single-file systems with or without glide path files.

    PubMed

    Aydin, Ugur; Karataslioglu, Emrah

    2017-01-01

    Canal transportation is a common sequela caused by rotary instruments. The purpose of the present study is to evaluate the degree of transportation after the use of Reciproc single-file instruments with or without glide path files. Thirty resin blocks with L-shaped canals were divided into three groups (n = 10). Group 1 - canals were prepared with the Reciproc-25 file. Group 2 - glide path file G1 was used before Reciproc. Group 3 - glide path files G1 and G2 were used before Reciproc. Pre- and post-instrumentation images were superimposed under a microscope, and resin removed from the inner and outer surfaces of the root canal was calculated at 10 points. Statistical analysis was performed with the Kruskal-Wallis test and post hoc Dunn test. For the coronal and middle one-thirds, there was no significant difference among groups (P > 0.05). For the apical section, transportation of Group 1 was significantly higher than that of the other groups (P < 0.05). Using glide path files before the Reciproc single-file system reduced the degree of apical canal transportation.

  12. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  13. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.

  14. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. 
CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. The programs con
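
    For reference, the weighted least-squares objective and the Gauss-Newton step described above can be written as follows (standard textbook form; the symbols are our notational assumptions, not copied from the report):

        % Objective: y_i observations, y'_i(b) simulated equivalents,
        % omega_i weights, b the parameter vector
        S(b) = \sum_{i=1}^{n} \omega_{i} \, \bigl[ y_{i} - y'_{i}(b) \bigr]^{2}
        % Modified Gauss-Newton update, with sensitivity (Jacobian) matrix
        % X_{ij} = \partial y'_{i} / \partial b_{j} and weight matrix \omega:
        b_{k+1} = b_{k} + \bigl( X_{k}^{T} \omega X_{k} \bigr)^{-1}
                  X_{k}^{T} \omega \bigl( y - y'(b_{k}) \bigr)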

  15. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.

  16. GENSURF: A mesh generator for 3D finite element analysis of surface and corner cracks in finite thickness plates subjected to mode-1 loadings

    NASA Technical Reports Server (NTRS)

    Raju, I. S.

    1992-01-01

    A computer program that generates three-dimensional (3D) finite element models for cracked 3D solids was written. This computer program, gensurf, uses minimal input data to generate 3D finite element models for isotropic solids with elliptic or part-elliptic cracks. These models can be used with a 3D finite element program called surf3d. This report documents this mesh generator. In this manual the capabilities, limitations, and organization of gensurf are described. The procedures used to develop 3D finite element models and the input for and the output of gensurf are explained. Several examples are included to illustrate the use of this program. Several input data files are included with this manual so that the users can edit these files to conform to their crack configuration and use them with gensurf.

  17. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
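
    A sketch of the core idea, under stated assumptions: tolerances written directly in an input file are located with a regular expression and replaced by random samples on each Monte Carlo pass. The annotation pattern and the uniform sampling are illustrative choices, not necessarily those used with LAURA, HARA, or FIAT.

      import random
      import re

      TOL = re.compile(r'(-?\d+\.?\d*)\s*\+/-\s*(\d+\.?\d*)')  # e.g. 5.25 +/- 0.01

      def blur(text, rng=random.Random(42)):
          # Replace every "value +/- tol" with a sample from [value - tol, value + tol].
          def sample(match):
              v, t = float(match.group(1)), float(match.group(2))
              return repr(rng.uniform(v - t, v + t))
          return TOL.sub(sample, text)

      print(blur("wall_temperature = 5.25 +/- 0.01"))  # one realization of the input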

  18. Timeline Resource Analysis Program (TRAP): User's manual and program document

    NASA Technical Reports Server (NTRS)

    Sessler, J. G.

    1981-01-01

    The Timeline Resource Analysis Program (TRAP), developed for scheduling and timelining problems, is described. Given an activity network, TRAP generates timeline plots, resource histograms, and tabular summaries of the network, schedules, and resource levels. It is written in ANSI FORTRAN for the Honeywell SIGMA 5 computer and operates in the interactive mode using the TEKTRONIX 4014-1 graphics terminal. The input network file may be a standard SIGMA 5 file or one generated using the Interactive Graphics Design System. The timeline plots can be displayed in two orderings: according to the sequence in which the tasks were read on input, and a waterfall sequence in which the tasks are ordered by start time. The input order is especially meaningful when the network consists of several interacting subnetworks. The waterfall sequence is helpful in assessing the project status at any point in time.

  19. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input file, with preferred parameter values, is given in Appendix A. An example of the plot generated at a normal completion of the inversion is shown in Appendix B.
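
    The iteration and stopping rule described above can be sketched as follows; forward_model and update_params are hypothetical stand-ins for the modified Shear-APA run and NWRA's parameter-update logic, which the guide does not spell out.

      import numpy as np

      def invert(observed, forward_model, update_params, guess,
                 tol=0.01, max_iter=100):
          params, rms_history = guess, []
          for _ in range(max_iter):
              predicted = forward_model(params)
              rms_history.append(np.sqrt(np.mean((observed - predicted) ** 2)))
              # Stop when rms improves by less than 1% on two consecutive iterations.
              if len(rms_history) >= 3:
                  gains = [(rms_history[-3] - rms_history[-2]) / rms_history[-3],
                           (rms_history[-2] - rms_history[-1]) / rms_history[-2]]
                  if all(g < tol for g in gains):
                      break
              params = update_params(params, observed, predicted)
          return params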

  20. Obscuration Code with Space Station Applications (Manual)

    DTIC Science & Technology

    1985-12-01

    used to perform this DCL style command parsing, readers are referred to the VMS documentation concerning the Command Definition Utility or CDU. ... FOR0O7.DAT; Input echo file: USERI: [RJM.NASJAN5S1 .LIS;3 The above examples show the operation of the SET OUTPUT command. Note that the printer file is... be opened using the SET OUTPUT command. The output files can be opened and closed using the SET OUTPUT /ECHOING, /PRINTABLE, /PLOTTABLE commands

  1. Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) Gateway, release 1.0, is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.

    The Super Strypi SWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle that includes a subset of the Super Strypi NGC software (guidance, ACS and sequencer). Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters, guidance parameters and Monte-Carlo parameters are defined in input files. Output parameters are saved to a Matlab mat file.

  3. Program VSMOKE--Users Manual

    Treesearch

    Leonidas G. Lavdas

    1996-01-01

    This is a users manual for VSMOKE, a computer program for predicting the smoke and dry-weather visibility impact of a single prescribed fire at several downwind locations. VSMOKE is a FORTRAN 77 program that depends on the input in file VSMOKE.IPT to generate output in a file compatible with those used by the U.S. Environmental Protection Agency. VSMOKE is uniquely...

  4. Quantitative Microbial Risk Assessment Tutorial - SDMProjectBuilder: Import Local Data Files to Identify and Modify Contamination Sources and Input ParametersUpdated 2017

    EPA Science Inventory

    Twelve example local data support files are automatically downloaded when the SDMProjectBuilder is installed on a computer. They allow the user to modify values to parameters that impact the release, migration, fate, and transport of microbes within a watershed, and control delin...

  5. Quantitative Microbial Risk Assessment Tutorial – SDMProjectBuilder: Import Local Data Files to Identify and Modify Contamination Sources and Input Parameters

    EPA Science Inventory

    Twelve example local data support files are automatically downloaded when the SDMProjectBuilder is installed on a computer. They allow the user to modify values to parameters that impact the release, migration, fate, and transport of microbes within a watershed, and control delin...

  6. Preliminary investigation of single-file diffusion in complex plasma rings

    NASA Astrophysics Data System (ADS)

    Theisen, W. L.; Sheridan, T. E.

    2010-04-01

    Particles in one-dimensional (1D) systems cannot pass each other. However, it is still possible to define a diffusion process where the mean-squared displacement (msd) of an ensemble of particles in a 1D chain increases with time t. This process is called single-file diffusion. In contrast to diffusive processes that follow Fick's law, where msd ∝ t, single-file diffusion is sub-Fickean and the msd is predicted to increase as t^(1/2). We have recently created 1D dusty (complex) plasma rings in the DONUT (Dusty ONU experimenT) apparatus. Particle position data from these rings will be analyzed to determine the scaling of the msd with time and results will be compared with predictions of single-file diffusion theory.
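
    The scaling test described above amounts to fitting the slope of log(msd) against log(t); a minimal sketch, assuming the particle positions are stored as an array x[t, particle] of arc-length coordinates along the ring:

      import numpy as np

      def msd(x):
          # Mean-squared displacement averaged over the particle ensemble.
          return np.mean((x - x[0]) ** 2, axis=1)

      def scaling_exponent(x, t):
          # Slope of log(msd) vs log(t): ~1 for Fickian diffusion,
          # ~0.5 predicted for single-file diffusion.
          return np.polyfit(np.log(t[1:]), np.log(msd(x)[1:]), 1)[0]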

  7. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  8. GoPhast: a graphical user interface for PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2006-01-01

    GoPhast is a graphical user interface (GUI) for the USGS model PHAST. PHAST simulates multicomponent, reactive solute transport in three-dimensional, saturated, ground-water flow systems. PHAST can model both equilibrium and kinetic geochemical reactions. PHAST is derived from HST3D (flow and transport) and PHREEQC (geochemical calculations). The flow and transport calculations are restricted to constant fluid density and constant temperature. The complexity of the input required by PHAST makes manual construction of its input files tedious and error-prone. GoPhast streamlines the creation of the input file and helps reduce errors. GoPhast allows the user to define the spatial input for the PHAST flow and transport data file by drawing points, lines, or polygons on top, front, and side views of the model domain. These objects can have up to two associated formulas that define their extent perpendicular to the view plane, allowing the objects to be three-dimensional. Formulas are also used to specify the values of spatial data (data sets) both globally and for individual objects. Objects can be used to specify the values of data sets independent of the spatial and temporal discretization of the model. Thus, the grid and simulation periods for the model can be changed without respecifying spatial data pertaining to the hydrogeologic framework and boundary conditions. This report describes the operation of GoPhast and demonstrates its use with examples. GoPhast runs on Windows 2000, Windows XP, and Linux operating systems.

  9. Finite Element Analysis of a Copper Single Crystal Shape Memory Alloy-Based Endodontic Instruments

    NASA Astrophysics Data System (ADS)

    Vincent, Marin; Thiebaud, Frédéric; Bel Haj Khalifa, Saifeddine; Engels-Deutsch, Marc; Ben Zineb, Tarak

    2015-10-01

    The aim of the present paper is the development of endodontic Cu-based single crystal Shape Memory Alloy (SMA) instruments in order to eliminate the antimicrobial and mechanical deficiencies observed with the conventional Nickel-Titanium (NiTi) SMA files. A thermomechanical constitutive law, already developed and implemented in a finite element code by our research group, is adopted for the simulation of the single crystal SMA behavior. The corresponding material parameters were identified starting from experimental results for a tensile test at room temperature. A computer-aided design geometry has been achieved and considered for a finite element structural analysis of the endodontic Cu-based single crystal SMA files. They are meshed with tetrahedral continuum elements to improve the computation time and the accuracy of results. The geometric parameters tested in this study are the length of the active blade, the rod length, the pitch, the taper, the tip diameter, and the rod diameter. For each set of adopted parameters, a finite element model is built and tested in a combined bending-torsion loading in accordance with the ISO 3630-1 norm. The numerical analysis based on the finite element procedure allowed proposing an optimal geometry suitable for Cu-based single crystal SMA endodontic files. The same analysis was carried out for the classical NiTi SMA files and a comparison was made between the two kinds of files. It showed that Cu-based single crystal SMA files are less stiff than the NiTi files. The Cu-based endodontic files could be used to improve root canal treatments. However, the finite element analysis brought out the need for further investigation based on experiments.

  10. Trick Simulation Environment 07

    NASA Technical Reports Server (NTRS)

    Lin, Alexander S.; Penn, John M.

    2012-01-01

    The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top-level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.

  11. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include model parameters and high resolution input data together with the availability of powerful optimization and uncertainty analysis algorithms has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  12. Overview Of Recent Enhancements To The Bumper-II Meteoroid and Orbital Debris Risk Assessment Tool

    NASA Technical Reports Server (NTRS)

    Hyde, James L.; Christiansen, Eric L.; Lear, Dana M.; Prior, Thomas G.

    2006-01-01

    Discussion includes recent enhancements to the BUMPER-II program and input files in support of Shuttle Return to Flight. Improvements to the mesh definitions of the finite element input model will be presented. A BUMPER-II analysis process that was used to estimate statistical uncertainty is introduced.

  13. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST including the minimization of the data path in the computational fluid-dynamics (CFD) process, consistent user interface, extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package making it possible for FAST to operate as a modular environment where resources could be shared among different machines as well as a single host is discussed.

  14. FAST: Fitting and Assessment of Synthetic Templates

    NASA Astrophysics Data System (ADS)

    Kriek, Mariska; van Dokkum, Pieter G.; Labbé, Ivo; Franx, Marijn; Illingworth, Garth D.; Marchesini, Danilo; Quadri, Ryan F.; Aird, James; Coil, Alison L.; Georgakakis, Antonis

    2018-03-01

    FAST (Fitting and Assessment of Synthetic Templates) fits stellar population synthesis templates to broadband photometry and/or spectra. FAST is compatible with the photometric redshift code EAzY (ascl:1010.052) when fitting broadband photometry; it uses the photometric redshifts derived by EAzY, and the input files (for example, the photometric catalog and master filter file) are the same. FAST fits spectra in combination with broadband photometric data points or simultaneously fits two components, allowing for an AGN contribution in addition to the host galaxy light. Depending on the input parameters, FAST outputs the best-fit redshift, age, dust content, star formation timescale, metallicity, stellar mass, star formation rate (SFR), and their confidence intervals. Though some of FAST's functions overlap with those of HYPERZ (ascl:1108.010), it differs by fitting fluxes instead of magnitudes, allows the user to completely define the grid of input stellar population parameters and easily input photometric redshifts and their confidence intervals, and calculates calibrated confidence intervals for all parameters. Note that FAST is not a photometric redshift code, though it can be used as one.

  15. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
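
    A minimal sketch of the index-entry idea, with illustrative names rather than the patented interfaces: each entry records where a data portion sits logically in the sparse file and physically in the hole-free log, so the holes can be restored as zeros on read.

      from dataclasses import dataclass

      @dataclass
      class IndexEntry:
          logical_offset: int   # position of the data portion in the sparse file
          physical_offset: int  # position in the hole-free log-structured store
          length: int

      def restore(entries, log, total_size):
          out = bytearray(total_size)  # unwritten ranges (the holes) stay zero
          for e in entries:
              out[e.logical_offset:e.logical_offset + e.length] = \
                  log[e.physical_offset:e.physical_offset + e.length]
          return bytes(out)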

  16. Software Aids In Graphical Depiction Of Flow Data

    NASA Technical Reports Server (NTRS)

    Stegeman, J. D.

    1995-01-01

    Interactive Data Display System (IDDS) computer program is graphical-display program designed to assist in visualization of three-dimensional flow in turbomachinery. Grid and simulation data files in PLOT3D format required for input. Able to unwrap volumetric data cone associated with centrifugal compressor and display results in easy-to-understand two- or three-dimensional plots. IDDS provides majority of visualization and analysis capability for Integrated Computational Fluid Dynamics and Experiment (ICE) system. IDDS invoked from any subsystem, or used as stand-alone package of display software. Generates contour, vector, shaded, x-y, and carpet plots. Written in C language. Input file format used by IDDS is that of PLOT3D (COSMIC item ARC-12782).

  17. User's Manual for DuctE3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic Analysis of Ducted Fans

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1997-01-01

    The program DuctE3D is used for steady or unsteady aerodynamic and aeroelastic analysis of ducted fans. This guide describes the input data required and the output files generated, in using DuctE3D. The analysis solves three dimensional unsteady, compressible Euler equations to obtain the aerodynamic forces. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either the time domain or the frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis and aeroelastic analysis of an isolated fan row.

  18. Atmospheric Boundary Layer Wind Data During the Period January 1, 1998 Through January 31, 1999 at the Dallas-Fort Worth Airport. Volume 1; Quality Assessment

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen; Rodgers, William G., Jr.

    2000-01-01

    The quality of the Aircraft Vortex Spacing System (AVOSS) is critically dependent on representative wind profiles in the atmospheric boundary layer. These winds, observed from a number of sensor systems around the Dallas-Fort Worth airport, were combined into single vertical wind profiles by an algorithm developed and implemented by MIT Lincoln Laboratory. This process, called the AVOSS Winds Analysis System (AWAS), is used by AVOSS for wake corridor predictions. During times when AWAS solutions were available, the quality of the resultant wind profiles and variance was judged from a series of plots combining all sensor observations and AWAS profiles during the period 1200 to 0400 UTC daily. First, input data were evaluated for continuity and consistency against established criteria. Next, the degree of agreement among all wind sensor systems was noted and cases of disagreement identified. Finally, the resultant AWAS solution was compared to the quality-assessed input data. When profiles differed by a specified amount from valid sensor consensus winds, times and altitudes were flagged. Volume one documents the process and quality of input sensor data. Volume two documents the data processing/sorting process and provides the resultant flagged files.

  19. Bamgineer: Introduction of simulated allele-specific copy number variants into exome and targeted sequence data sets.

    PubMed

    Samadian, Soroush; Bruce, Jeff P; Pugh, Trevor J

    2018-03-01

    Somatic copy number variations (CNVs) play a crucial role in development of many human cancers. The broad availability of next-generation sequencing data has enabled the development of algorithms to computationally infer CNV profiles from a variety of data types including exome and targeted sequence data, currently the most prevalent types of cancer genomics data. However, systematic evaluation and comparison of these tools remains challenging due to a lack of ground truth reference sets. To address this need, we have developed Bamgineer, a tool written in Python to introduce user-defined haplotype-phased allele-specific copy number events into an existing Binary Alignment Mapping (BAM) file, with a focus on targeted and exome sequencing experiments. As input, this tool requires a read alignment file (BAM format), lists of non-overlapping genome coordinates for introduction of gains and losses (bed file), and an optional file defining known haplotypes (vcf format). To improve runtime performance, Bamgineer introduces the desired CNVs in parallel using queuing and parallel processing on a local machine or on a high-performance computing cluster. As proof-of-principle, we applied Bamgineer to a single high-coverage (mean: 220X) exome sequence file from a blood sample to simulate copy number profiles of 3 exemplar tumors from each of 10 tumor types at 5 tumor cellularity levels (20-100%, 150 BAM files in total). To demonstrate feasibility beyond exome data, we introduced read alignments to a targeted 5-gene cell-free DNA sequencing library to simulate EGFR amplifications at frequencies consistent with circulating tumor DNA (10, 1, 0.1 and 0.01%) while retaining the multimodal insert size distribution of the original data. We expect Bamgineer to be of use for development and systematic benchmarking of CNV calling algorithms by users with locally generated data for a variety of applications. The source code is freely available at http://github.com/pughlab/bamgineer.

  20. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance on systems ranging from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  1. Automated forward mechanical modeling of wrinkle ridges on Mars

    NASA Astrophysics Data System (ADS)

    Nahm, Amanda; Peterson, Samuel

    2016-04-01

    One of the main goals of the InSight mission to Mars is to understand the internal structure of Mars [1], in part through passive seismology. Understanding the shallow surface structure of the landing site is critical to the robust interpretation of recorded seismic signals. Faults, such as the wrinkle ridges abundant in the proposed landing site in Elysium Planitia, can be used to determine the subsurface structure of the regions they deform. Here, we test a new automated method for modeling the topography of a wrinkle ridge (WR) in Elysium Planitia, allowing for faster and more robust determination of subsurface fault geometry for interpretation of the local subsurface structure. We perform forward mechanical modeling of fault-related topography [e.g., 2, 3], utilizing the modeling program Coulomb [4, 5] to model surface displacements induced by blind thrust faulting. Fault lengths are difficult to determine for WR; we initially assume a fault length of 30 km, but also test the effects of different fault lengths on model results. At present, we model the wrinkle ridge as a single blind thrust fault with a constant fault dip, though WR are likely to have more complicated fault geometry [e.g., 6-8]. Typically, the modeling is performed using the Coulomb GUI. This approach can be time consuming, requiring user inputs to change model parameters and to calculate the associated displacements for each model, which limits the number of models and the parameter space that can be tested. To reduce active user computation time, we have developed a method in which the Coulomb GUI is bypassed. The general modeling procedure remains unchanged, and a set of input files is generated before modeling with ranges of pre-defined parameter values. The displacement calculations are divided into two suites. For Suite 1, a total of 3770 input files were generated in which the fault displacement (D), dip angle (δ), depth to upper fault tip (t), and depth to lower fault tip (B) were varied. A second set of input files (Suite 2) was created after the best-fit model from Suite 1 was determined, in which the fault parameters were varied over a smaller range with finer increments, resulting in a total of 28,080 input files. RMS values were calculated for each Coulomb model. RMS values for Suite 1 models were calculated over the entire profile and for a restricted x range; the latter reduces the RMS misfit by 1.2 m. The minimum RMS value for Suite 2 models decreases again by 0.2 m, resulting in an overall reduction of the RMS value of ~1.4 m (18%). Models with different fault lengths (15, 30, and 60 km) are visually indistinguishable. Values for δ, t, B, and RMS misfit are either the same or very similar for each best-fit model. These results indicate that the subsurface structure can be reliably determined from forward mechanical modeling even with uncertainty in fault length. Future work will test this method with more realistic WR fault geometry. References: [1] Banerdt et al. (2013), 44th LPSC, #1915. [2] Cohen (1999), Adv. Geophys., 41, 133-231. [3] Schultz and Lin (2001), JGR, 106, 16549-16566. [4] Lin and Stein (2004), JGR, 109, B02303, doi:10.1029/2003JB002607. [5] Toda et al. (2005), JGR, 103, 24543-24565. [6] Okubo and Schultz (2004), GSAB, 116, 597-605. [7] Watters (2004), Icarus, 171, 284-294. [8] Schultz (2000), JGR, 105, 12035-12052.
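
    The batch-generation strategy reduces, in essence, to enumerating the parameter grid, running Coulomb on each generated input, and keeping the lowest RMS misfit. A sketch, with run_model standing in for the input-file generation and Coulomb execution (the abstract does not show the actual input-file format):

      import itertools
      import numpy as np

      def sweep(D_vals, dip_vals, t_vals, B_vals, observed, run_model):
          best_rms, best_params = np.inf, None
          for D, dip, t, B in itertools.product(D_vals, dip_vals, t_vals, B_vals):
              predicted = run_model(D=D, dip=dip, t=t, B=B)  # one Coulomb run
              rms = np.sqrt(np.mean((observed - predicted) ** 2))
              if rms < best_rms:
                  best_rms, best_params = rms, (D, dip, t, B)
          return best_rms, best_params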

  2. Development of a Distributed Parallel Computing Framework to Facilitate Regional/Global Gridded Crop Modeling with Various Scenarios

    NASA Astrophysics Data System (ADS)

    Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.

    2017-12-01

    Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. Our local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file for parallel computing, the user-defined number of CPU threads divides the EPIC simulation into jobs. Then, using the EPIC input data formatters, the raw database is formatted into EPIC input data, and the formatted data moves into the EPIC simulation jobs. The 28 EPIC jobs run simultaneously, and only the result files of interest are parsed and moved into the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while using the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
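
    The job-splitting step maps naturally onto a Python process pool; a minimal sketch, with the three helpers reduced to placeholders for the framework's input formatters, EPIC execution, and output analyzers:

      from multiprocessing import Pool

      def run_epic(cell):
          inputs = {"cell": cell}             # placeholder input formatter
          outputs = {"yield": 1.0, **inputs}  # placeholder EPIC execution
          return outputs["yield"]             # placeholder output analyzer

      if __name__ == "__main__":
          with Pool(processes=28) as pool:    # the 28 threads of the test desktop
              results = pool.map(run_epic, range(406_839), chunksize=1000)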

  3. Temperature increases on the external root surface during endodontic treatment using single file systems.

    PubMed

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: The OneShape Endodontic File no.: 25; Group 2: The Reciproc Endodontic File no.: 25; Group 3: The WaveOne Endodontic File no.: 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05). The WaveOne file showed the highest temperature increases. However, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.

    The Futility package contains the following: 1) Definition of the size of integers and real numbers; 2) A generic unit test harness; 3) Definitions for some basic extensions to the Fortran language: arbitrary length strings, a parameter list construct, exception handlers, command line processor, timers; 4) Geometry definitions: point, line, plane, box, cylinder, polyhedron; 5) File wrapper functions: standard Fortran input/output files, Fortran binary files, HDF5 files; 6) Parallel wrapper functions: MPI and OpenMP abstraction layers, partitioning algorithms; 7) Math utilities: BLAS, matrix and vector definitions, linear solver methods and wrappers for other TPLs (PETSC, MKL, etc.), preconditioner classes; 8) Misc: random number generator, water saturation properties, sorting algorithms.

  5. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
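
    The published tool is a PERL script; the same idea rendered as a short Python sketch (the column choices here are illustrative, not the script's actual output format):

      import csv
      import os
      from pathlib import Path

      def dump_tree(root, out_csv, max_depth):
          root = Path(root)
          with open(out_csv, "w", newline="") as f:
              w = csv.writer(f)
              w.writerow(["path", "owner_uid", "created", "last_access"])
              for dirpath, _, files in os.walk(root):
                  if len(Path(dirpath).relative_to(root).parts) > max_depth:
                      continue  # honor the requested number of subtree levels
                  for name in files:
                      p = os.path.join(dirpath, name)
                      st = os.stat(p)
                      w.writerow([p, st.st_uid, st.st_ctime, st.st_atime])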

  6. LACIE performance predictor final operational capability program description, volume 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Given the swath table files, the segment set for one country and cloud cover data, the SAGE program determines how many times and under what conditions each segment is accessed by satellites. The program writes a record for each segment on a data file which contains the pertinent acquisition data. The weather data file can also be generated from a NASA supplied tape. The Segment Acquisition Selector Program (SACS) selects data from the segment reference file based upon data input manually and from a crop window file. It writes the extracted data to a data acquisition file and prints two summary reports. The POUT program reads from associated LACIE files and produces printed reports. The major types of reports that can be produced are: (1) Substrate Reference Data Reports, (2) Population Mean, Standard Deviation and Histogram Reports, (3) Histograms of Monte Carlo Statistics Reports, and (4) Frequency of Sample Segment Acquisitions Reports.

  7. EVALUATING HYDROLOGICAL RESPONSE TO ...

    EPA Pesticide Factsheets

    Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch in collaboration with the USDA-ARS Southwest Watershed Research Center has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files, and model results are evaluated. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool

  8. Active Brownian particles escaping a channel in single file.

    PubMed

    Locatelli, Emanuele; Baldovin, Fulvio; Orlandini, Enzo; Pierno, Matteo

    2015-02-01

    Active particles may happen to be confined in channels so narrow that they cannot overtake each other (single-file conditions). This interesting situation reveals nontrivial physical features as a consequence of the strong interparticle correlations developed in collective rearrangements. We consider a minimal two-dimensional model for active Brownian particles with the aim of studying the modifications introduced by activity with respect to the classical (passive) single-file picture. Depending on whether their motion is dominated by translational or rotational diffusion, we find that active Brownian particles in single file may arrange into clusters that are continuously merging and splitting (active clusters) or merely reproduce passive-motion paradigms, respectively. We show that activity conveys to self-propelled particles a strategic advantage for trespassing narrow channels against external biases (e.g., the gravitational field).

  9. Active Brownian particles escaping a channel in single file

    NASA Astrophysics Data System (ADS)

    Locatelli, Emanuele; Baldovin, Fulvio; Orlandini, Enzo; Pierno, Matteo

    2015-02-01

    Active particles may happen to be confined in channels so narrow that they cannot overtake each other (single-file conditions). This interesting situation reveals nontrivial physical features as a consequence of the strong interparticle correlations developed in collective rearrangements. We consider a minimal two-dimensional model for active Brownian particles with the aim of studying the modifications introduced by activity with respect to the classical (passive) single-file picture. Depending on whether their motion is dominated by translational or rotational diffusion, we find that active Brownian particles in single file may arrange into clusters that are continuously merging and splitting (active clusters) or merely reproduce passive-motion paradigms, respectively. We show that activity conveys to self-propelled particles a strategic advantage for trespassing narrow channels against external biases (e.g., the gravitational field).

  10. 76 FR 12204 - Self-Regulatory Organizations; BATS Y-Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ...-Regulatory Organizations; BATS Y-Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change by BATS Exchange, Inc. To Adopt BYX Rule 11.21, entitled ``Input of Accurate Information...\\ 15 U.S.C. 78s(b)(3)(A). \\4\\ 17 CFR 240.19b-4(f)(6)(iii). I. Self-Regulatory Organization's Statement...

  11. Using the General Algebraic Modeling System on Peregrine | High-Performance

    Science.gov Websites

    directory, type the following: module load gams; cp /nopt/nrel/apps/gams/example/trnsport.gms .; gams trnsport ... For example, if your model input uses the LP procedure and you want to use the Gurobi solver to solve it ... the directory that you run GAMS in. For example, for the Gurobi solver, its option file is "gurobi.opt"

  12. MODFLOW-2000, the U.S. Geological Survey modular ground-water model -- Documentation of MOD-PREDICT for predictions, prediction sensitivity analysis, and evaluation of uncertainty

    USGS Publications Warehouse

    Tonkin, M.J.; Hill, Mary C.; Doherty, John

    2003-01-01

    This document describes the MOD-PREDICT program, which helps evaluate user-defined sets of observations, prior information, and predictions, using the ground-water model MODFLOW-2000. MOD-PREDICT takes advantage of the existing Observation and Sensitivity Processes (Hill and others, 2000) by initiating runs of MODFLOW-2000 and using the output files produced. The names and formats of the MODFLOW-2000 input files are unchanged, such that full backward compatibility is maintained. A new name file and input files are required for MOD-PREDICT. The performance of MOD-PREDICT has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program using the email address available at the web address below. Updates might occasionally be made to this document, to the MOD-PREDICT program, and to MODFLOW-2000. Users can check for updates on the Internet at URL http://water.usgs.gov/software/ground water.html/.

  13. Software for Managing Parametric Studies

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; DeVivo, Adrian

    2003-01-01

    The Information Power Grid Virtual Laboratory (ILab) is a Practical Extraction and Reporting Language (PERL) graphical-user-interface computer program that generates shell scripts to facilitate parametric studies performed on the Grid. (The Grid denotes a worldwide network of supercomputers used for scientific and engineering computations involving data sets too large to fit on desktop computers.) Heretofore, parametric studies on the Grid have been impeded by the need to create control-language scripts and edit input data files, painstaking tasks that are necessary for managing multiple jobs on multiple computers. ILab reflects an object-oriented approach to automation of these tasks: all data and operations are organized into packages in order to accelerate development and debugging. A container or document object in ILab, called an experiment, contains all the information (data and file paths) necessary to define a complex series of repeated, sequenced, and/or branching processes. For convenience and to enable reuse, this object is serialized to and from disk storage. At run time, the current ILab experiment is used to generate required input files and shell scripts, create directories, copy data files, and then both initiate and monitor the execution of all computational processes.
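
    The experiment-object idea can be sketched briefly; ILab itself is a PERL program, so the Python below only illustrates the serialize-and-generate pattern, with invented field names:

      import pickle
      from dataclasses import dataclass, field

      @dataclass
      class Experiment:
          data_files: list = field(default_factory=list)
          parameters: dict = field(default_factory=dict)  # name -> list of values

          def save(self, path):
              with open(path, "wb") as f:
                  pickle.dump(self, f)  # serialized to disk for later reuse

          def scripts(self):
              # one shell command per parameter value, yielded as a generator
              for name, values in self.parameters.items():
                  for v in values:
                      yield f"run_model --{name}={v}"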

  14. Artificial neural networks for modeling ammonia emissions released from sewage sludge composting

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Dach, J.; Pilarski, K.; Piekarska-Boniecka, H.

    2012-09-01

    The project was designed to develop, test and validate an original neural model describing ammonia emissions generated in composting sewage sludge. The composting mix was to include the addition of such selected structural ingredients as cereal straw, sawdust and tree bark. All created neural models contain 7 input variables (chemical and physical parameters of composting) and 1 output (ammonia emission). The data file was subdivided into three subfiles: the learning file (ZU) containing 330 cases, the validation file (ZW) containing 110 cases and the test file (ZT) containing 110 cases. The standard deviation ratios (for all 4 created networks) ranged from 0.193 to 0.218. For all of the selected models, the correlation coefficient reached the high values of 0.972-0.981. The results show that the predictive neural model describing ammonia emissions from composted sewage sludge is well suited for assessing such emissions. The sensitivity analysis of the model with respect to the input variables of the process in question has shown that the key parameters describing ammonia emissions released in composting sewage sludge are pH and the carbon-to-nitrogen ratio (C:N).
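
    A hedged sketch of the experimental setup (7 inputs, 1 output, a 330/110/110 split); the abstract does not state the network architecture, so the single hidden layer, the synthetic data, and the library choice here are assumptions:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.random((550, 7))     # 7 composting parameters (synthetic stand-in)
      y = X @ rng.random(7)        # stand-in for the measured ammonia emission

      X_train, y_train = X[:330], y[:330]        # learning file (ZU)
      X_val, y_val = X[330:440], y[330:440]      # validation file (ZW)
      X_test, y_test = X[440:], y[440:]          # test file (ZT)

      net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      net.fit(X_train, y_train)
      print(np.corrcoef(y_test, net.predict(X_test))[0, 1])  # cf. r = 0.972-0.981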

  15. ACTG: novel peptide mapping onto gene models.

    PubMed

    Choi, Seunghyuk; Kim, Hyunwoo; Paek, Eunok

    2017-04-15

    In many proteogenomic applications, mapping peptide sequences onto genome sequences can be very useful, because it allows us to understand origins of the gene products. Existing software tools either take the genomic position of a peptide start site as an input or assume that the peptide sequence exactly matches the coding sequence of a given gene model. In case of novel peptides resulting from genomic variations, especially structural variations such as alternative splicing, these existing tools cannot be directly applied unless users supply information about the variant, either its genomic position or its transcription model. Mapping potentially novel peptides to genome sequences, while allowing certain genomic variations, requires introducing novel gene models when aligning peptide sequences to gene structures. We have developed a new tool called ACTG (Amino aCids To Genome), which maps peptides to genome, assuming all possible single exon skipping, junction variation allowing three edit distances from the original splice sites, exon extension and frame shift. In addition, it can also consider SNVs (single nucleotide variations) during mapping phase if a user provides the VCF (variant call format) file as an input. Available at http://prix.hanyang.ac.kr/ACTG/search.jsp . eunokpaek@hanyang.ac.kr. Supplementary data are available at Bioinformatics online.

  16. Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.

    1998-01-01

    The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (scenario) and, for each FTP session, the file size. The network parameters include the round-trip time, packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver's Window Size. The modeling approach used consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
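
    The report derives its own analytic model, which is not reproduced in the abstract; for orientation, a widely used closed form for loss-limited TCP throughput is the Mathis et al. (1997) approximation, sketched below with a receiver-window cap. Treat it only as a plausible building block, not as the FTP Analyzer's actual formula.

      from math import sqrt

      def tcp_throughput(mss_bytes, rtt_s, loss_rate, max_window_bytes):
          # Mathis approximation: rate ~ (MSS/RTT) * sqrt(3/2) / sqrt(p),
          # capped by the maximum receiver window sent per round trip.
          loss_limited = (mss_bytes / rtt_s) * sqrt(1.5 / loss_rate)
          window_limited = max_window_bytes / rtt_s
          return min(loss_limited, window_limited)

      # 1460-byte MSS, 40 ms RTT, 0.1% loss, 64 KiB window:
      print(tcp_throughput(1460, 0.040, 0.001, 65536))  # bytes per second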

  17. Next-Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  18. Next Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William; Fitzgerald, Matthew; Stahl, Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  19. Next Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models easier.

  20. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.

    1997-05-01

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.

  1. ProMC: Input-output data format for HEP applications using varint encoding

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.

    2014-10-01

    A new data format for Monte Carlo (MC) events, or any structural data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the PROMC library, which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description, and random access. Data stored in PROMC files can be written, read, and manipulated in a number of programming languages, such as C++, Java, FORTRAN, and Python.
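    To make the compression idea concrete, the sketch below is a minimal Python re-implementation of protobuf-style variable-size integer (varint) encoding, the mechanism the abstract credits for the smaller file sizes: small integers, which dominate MC particle records, cost a single byte. This is an illustration of the technique, not code taken from the PROMC library.

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf-style varint:
    7 payload bits per byte, MSB set on all but the last byte."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)   # more bytes follow
        else:
            out.append(byte)          # final byte
            return bytes(out)

def decode_varint(data: bytes) -> int:
    """Inverse of encode_varint."""
    result, shift = 0, 0
    for byte in data:
        result |= (byte & 0x7F) << shift
        if not byte & 0x80:
            break
        shift += 7
    return result

assert encode_varint(1) == b"\x01"            # one byte for small values
assert encode_varint(300) == b"\xac\x02"      # two bytes instead of four
assert decode_varint(encode_varint(300)) == 300
```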

  2. Issues in ATM Support of High-Performance, Geographically Distributed Computing

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.; Dowd, Patrick W.; Srinidhi, Saragur M.; Blade, Eric D.G

    1995-01-01

    This report experimentally assesses the effect of the underlying network in a cluster-based computing environment. The assessment is quantified by application-level benchmarking, process-level communication, and network file input/output. Two testbeds were considered, one small cluster of Sun workstations and another large cluster composed of 32 high-end IBM RS/6000 platforms. The clusters had Ethernet, fiber distributed data interface (FDDI), Fibre Channel, and asynchronous transfer mode (ATM) network interface cards installed, providing the same processors and operating system for the entire suite of experiments. The primary goal of this report is to assess the suitability of an ATM-based, local-area network to support interprocess communication and remote file input/output systems for distributed computing.

  3. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes new methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  4. Generalized three-dimensional simulation of ferruled coupled-cavity traveling-wave-tube dispersion and impedance characteristics

    NASA Technical Reports Server (NTRS)

    Maruschek, Joseph W.; Kory, Carol L.; Wilson, Jeffrey D.

    1993-01-01

    The frequency-phase dispersion and Pierce on-axis interaction impedance of a ferruled coupled-cavity traveling-wave-tube (TWT) slow-wave circuit were calculated using the three-dimensional simulation code Micro-SOS. The accuracy achieved in calculating these parameters demonstrates that the code can reduce costly and time-consuming experimental cold tests. A generalized input file was developed so that ferruled coupled-cavity TWT slow-wave circuits of arbitrary dimensions could be easily modeled. The practicality of the generalized input file was tested by applying it to the ferruled coupled-cavity slow-wave circuit of the Hughes Aircraft Company model 961HA TWT and by comparing the results with experimental results.

  5. FY92 Progress Report for the Gyrotron Backward-Wave-Oscillator Experiment

    DTIC Science & Technology

    1993-07-01

    C. SAMPLE CABLE CALIBRATION 23 D. ASYST CHANNEL SETUPS 26 E. SAMPLE MAGNET INPUT DATA DECK FOR THE GYRO-BWO 32 F. SAMPLE EGUN INPUT DATA DECK FOR THE...of the first coil of the Helmholtz pair; zero also corresponds to the diode end of the experiment). Another computer code used was the EGUN code (Ref...a short computer program was written to superimpose the two magnetic fields; DC and Helmholtz). An example of an EGUN input data file is included in

  6. Decision & Management Tools for DNAPL Sites: Optimization of Chlorinated Solvent Source and Plume Remediation Considering Uncertainty

    DTIC Science & Technology

    2010-09-01

    differentiated between source codes and input/output files. The text makes references to a REMChlor-GoldSim model. The text also refers to the REMChlor...To the extent possible, the instructions should be accurate and precise. The documentation should differentiate between describing what is actually...Windows XP operating system. Model Input Parameters: The input parameters were identical to those utilized and reported by CDM (see Table 1 from

  7. AFT Program Description Navigation/Strike Tasks. Phase II,

    DTIC Science & Technology

    1972-09-01

    1 Subroutine ............... 2- 96 2-23 Data Input/Output - PMSG : 1 Subroutine ................ 2-97 2-24 Data Input/Output - LPMSG: 1 Subroutine...T99DI3 GOFLAG Exercise Start Flag PAD Roll Rate (degrees/second) PHIS Bank Angle (degrees) PMSG 17 KBP Message INPUT STUDENT FILE DATA 2-41 PMSG T3 KBP...Message CRASH PMSG T4 KBP Message DEPRESS THE RESET-TO-ZERO CONSOLE BUTTON PSI F-4 Heading (degrees) PSIAFT Desired AFT Heading RCIS Average Rate-of

  8. Super Strypi HWIL 6DOF (Hardware-In-Loop six-degree-of-freedom) Rev. 2175

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilkey, Jeff C.; Harl, Nathan R.; Kowalchuk, Scott A.

    2016-02-23

    The Super Strypi HWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle. The simulation is used to test the NGC flight software, including the navigation software. Aerodynamic and propulsive forces, mass properties, and ACS (attitude control system) parameters are defined in input files. Output parameters are saved to a MATLAB .mat file.

  9. Minimum Hamiltonian ascent trajectory evaluation (MASTRE) program (update to automatic flight trajectory design, performance prediction, and vehicle sizing for support of shuttle and shuttle derived vehicles) users manual

    NASA Technical Reports Server (NTRS)

    Lyons, J. T.; Borchers, William R.

    1993-01-01

    Documentation for the User Interface Program for the Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) is provided. The User Interface Program is a separate software package designed to ease the user input requirements when using the MASTRE Trajectory Program. This document supplements documentation on the MASTRE Program that consists of the MASTRE Engineering Manual and the MASTRE Programmers Guide. The User Interface Program provides a series of menus and tables using the VAX Screen Management Guideline (SMG) software. These menus and tables allow the user to modify the MASTRE Program input without the need for learning the various program dependent mnemonics. In addition, the User Interface Program allows the user to modify and/or review additional input Namelist and data files, to build and review command files, to formulate and calculate mass properties related data, and to have a plotting capability.

  10. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  11. INSPECT: A graphical user interface software package for IDARC-2D

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern-day Performance-Based Earthquake Engineering (PBEE) pivots on nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in DOS/Unix systems and requires the user to create elaborate text-based input files. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  12. FD_BH: a program for simulating electromagnetic waves from a borehole antenna

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2002-01-01

    Program FD_BH is used to simulate the electromagnetic waves generated by an antenna in a borehole. The model representing the antenna may include metallic parts, a coaxial cable as a feed to the driving point, and resistive loading. The program is written in the C programming language, and the program has been tested on both the Windows and the UNIX operating systems. This Open-File Report describes:
    • The contents and organization of the Zip file (section 2).
    • The program files, the installation of the program, the input files, and the execution of the program (section 3).
    • The address to which suggestions for improving the program may be sent (section 4).

  13. Optical mass memory system (AMM-13). AMM/DBMS interface control document

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.

    1980-01-01

    The baseline for external interfaces of a 10 to the 13th power bit, optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; AMM-13, Data Base Management System, NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interface.

  14. Method for data compression by associating complex numbers with files of data values

    DOEpatents

    Feo, J.T.; Hanks, D.C.; Kraay, T.A.

    1998-02-10

    A method for compressing data for storage or transmission is disclosed. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file. 4 figs.
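    A minimal sketch of the root-mapping step the patent describes: each complex point is driven toward a root of the polynomial by Newton's method and assigned that root's value. The polynomial (z^3 - 1) and the value map below are invented for illustration; the patent's search algorithm for choosing the complex numbers is not shown.

```python
import numpy as np

# Roots of p(z) = z^3 - 1 and a hypothetical per-root value map.
roots = np.array([1 + 0j, -0.5 + 0.866j, -0.5 - 0.866j])
value_map = {0: 17, 1: 42, 2: 99}

def poly(z):
    return z**3 - 1          # p(z)

def dpoly(z):
    return 3 * z**2          # p'(z)

def map_point_to_value(z, iters=50):
    """Drive the point to a root by Newton iteration, then look up
    the value assigned to the nearest root."""
    for _ in range(iters):
        z = z - poly(z) / dpoly(z)
    root_index = int(np.argmin(np.abs(roots - z)))
    return value_map[root_index]

print(map_point_to_value(0.9 + 0.1j))   # converges to root 0 -> 17
```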

  15. Method for data compression by associating complex numbers with files of data values

    DOEpatents

    Feo, John Thomas; Hanks, David Carlton; Kraay, Thomas Arthur

    1998-02-10

    A method for compressing data for storage or transmission. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file.

  16. 16 CFR 610.2 - Centralized source for requesting annual file disclosures from nationwide consumer reporting...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT FREE ANNUAL FILE DISCLOSURES § 610.2 Centralized... request methods, at the consumers' option: (i) A single, dedicated Internet website, (ii) A single, dedicated toll-free telephone number; and (iii) Mail directed to a single address; (2) Be designed, funded...

  17. 16 CFR 610.2 - Centralized source for requesting annual file disclosures from nationwide consumer reporting...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT FREE ANNUAL FILE DISCLOSURES § 610.2 Centralized... request methods, at the consumers' option: (i) A single, dedicated Internet website, (ii) A single, dedicated toll-free telephone number; and (iii) Mail directed to a single address; (2) Be designed, funded...

  18. Technological advances in endodontics: treatment of a mandibular molar with internal root resorption using a reciprocating single-file system.

    PubMed

    de Souza, Samir Noronha; Marques, André Augusto Franco; Sponchiado-Júnior, Emílio Carlos; Roberti Garcia, Lucas da Fonseca; da Frota, Matheus Franco; de Carvalho, Fredson Márcio Acris

    2017-01-01

    The field of endodontics has become increasingly successful due to technological advances that allow clinicians to solve clinical cases that would have been problematic a few years ago. Despite such advances, endodontic treatment of teeth with internal root resorption remains challenging. This article presents a clinical case in which a reciprocating single-file system was used for endodontic treatment of a mandibular molar with internal root resorption. Radiographic examination revealed the presence of internal root resorption in the distobuccal root canal of the mandibular right first molar. A reciprocating single-file system was used for root canal instrumentation and final preparation, and filling was obtained through a thermal compaction technique. No painful symptoms or periapical lesions were observed in 12 months of follow-up. The results indicate that a reciprocating single-file system is an adequate alternative for root canal instrumentation, particularly in teeth with internal root resorption.

  19. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  20. SNP2TFBS - a database of regulatory SNPs affecting predicted transcription factor binding site affinity.

    PubMed

    Kumar, Sunil; Ambrosini, Giovanna; Bucher, Philipp

    2017-01-04

    SNP2TFBS is a computational resource intended to support researchers investigating the molecular mechanisms underlying regulatory variation in the human genome. The database essentially consists of a collection of text files providing specific annotations for human single nucleotide polymorphisms (SNPs), namely whether they are predicted to abolish, create or change the affinity of one or several transcription factor (TF) binding sites. A SNP's effect on TF binding is estimated based on a position weight matrix (PWM) model for the binding specificity of the corresponding factor. These data files are regenerated at regular intervals by an automatic procedure that takes as input a reference genome, a comprehensive SNP catalogue and a collection of PWMs. SNP2TFBS is also accessible over a web interface, enabling users to view the information provided for an individual SNP, to extract SNPs based on various search criteria, to annotate uploaded sets of SNPs or to display statistics about the frequencies of binding sites affected by selected SNPs. Homepage: http://ccg.vital-it.ch/snp2tfbs/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
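    As a hedged sketch of how a PWM model estimates a SNP's effect on binding, the snippet below scores the reference and alternative alleles against a matrix and compares the log-odds scores. The 3-position matrix and sequences are toy values invented for illustration, not SNP2TFBS data.

```python
import math

# Toy PWM: per-position base probabilities. Real resources use curated
# matrices (e.g. from JASPAR); these numbers are made up.
pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
]
background = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}

def pwm_score(seq: str) -> float:
    """Log-odds score of a sequence against the PWM."""
    return sum(math.log2(pwm[i][b] / background[b]) for i, b in enumerate(seq))

ref, alt = "ACG", "AAG"   # the SNP changes position 2 from C to A
delta = pwm_score(alt) - pwm_score(ref)
print(f"binding-score change: {delta:.2f} bits")  # negative = weakened site
```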

  1. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

    The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions on single atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files to generate custom images and parsable result files to facilitate successive data processing. The full python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
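    To make the rule-based detection concrete, here is a minimal geometric hydrogen-bond test of the kind such profilers apply: a donor-acceptor distance cutoff plus an angle threshold at the hydrogen. The threshold values below are typical literature choices, assumed for illustration, and not necessarily the ones PLIP uses.

```python
import numpy as np

def is_hydrogen_bond(donor, hydrogen, acceptor,
                     max_dist=3.5, min_angle=120.0):
    """Geometric hydrogen-bond rule: donor-acceptor distance below a
    cutoff (angstroms) and donor-H...acceptor angle above a threshold
    (degrees). Cutoffs are illustrative defaults."""
    dist = np.linalg.norm(donor - acceptor)
    v1 = donor - hydrogen
    v2 = acceptor - hydrogen
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return dist <= max_dist and angle >= min_angle

donor = np.array([0.0, 0.0, 0.0])
hydrogen = np.array([0.0, 0.0, 1.0])
acceptor = np.array([0.0, 0.3, 2.8])
print(is_hydrogen_bond(donor, hydrogen, acceptor))   # True
```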

  2. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between loops in the graphs. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same techniques as the fault tree cut set code, except it includes all upstream digraph nodes in the cut sets for a given node and checks for cycles in the digraph during the solution process. CUTSETS solves for specified nodes and will not automatically solve for all upstream digraph nodes. The cut sets will be output as a text file. CUTSETS includes a utility program that will convert the popular COD format digraph model description files into text input files suitable for use with the CUTSETS programs. FEAT (MSC-21873) and FIRM (MSC-21860), available from COSMIC, are examples of programs that produce COD format digraph model description files that may be converted for use with the CUTSETS programs. CUTSETS is written in C language to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. CUTSETS is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is included on the distribution medium. Sun and SunOS are trademarks of Sun Microsystems, Inc. DEC, DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc.
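    A compact sketch of the top-down cut-set algorithm the abstract describes, assuming a toy fault tree: OR gates take the union of their children's cut sets, AND gates cross-combine them, and non-minimal sets are pruned. Gate and event names are invented; this is not the CUTSETS code itself.

```python
from itertools import product

# Toy fault tree: each gate is ("AND"|"OR", [children]); leaves are
# basic-event strings. All names here are hypothetical.
tree = {
    "TOP": ("AND", ["G1", "G2"]),
    "G1": ("OR", ["pump_fails", "valve_stuck"]),
    "G2": ("OR", ["valve_stuck", "power_loss"]),
}

def minimize(sets):
    """Drop any cut set that is a strict superset of another."""
    sets = [frozenset(s) for s in sets]
    return {s for s in sets if not any(t < s for t in sets)}

def cut_sets(node):
    if node not in tree:                       # basic event (leaf)
        return {frozenset([node])}
    gate, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                           # union of children's cut sets
        result = set().union(*child_sets)
    else:                                      # AND: cross-combine children
        result = {frozenset().union(*combo) for combo in product(*child_sets)}
    return minimize(result)

print(sorted(sorted(s) for s in cut_sets("TOP")))
# [['power_loss', 'pump_fails'], ['valve_stuck']]
```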

  3. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culverts

    USGS Publications Warehouse

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood, where the cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI), a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.

  4. The Open Spectral Database: an open platform for sharing and searching spectral data.

    PubMed

    Chalk, Stuart J

    2016-01-01

    A number of websites make available spectral data for download (typically as JCAMP-DX text files), and one (ChemSpider) also allows users to contribute spectral files. As a result, searching and retrieving such spectral data can be time-consuming, and the data can be difficult to reuse if it is compressed in the JCAMP-DX file. What is needed is a single resource that allows submission of JCAMP-DX files, export of the raw data in multiple formats, and searching based on multiple chemical identifiers, and that is open in terms of license and access. To address these issues a new online resource called the Open Spectral Database (OSDB) http://osdb.info/ has been developed and is now available. Built using open source tools, using open code (hosted on GitHub), providing open data, and open to community input about design and functionality, the OSDB is available for anyone to submit spectral data, making it searchable and available to the scientific community. This paper details the concept and coding, internal architecture, export formats, Representational State Transfer (REST) Application Programming Interface and options for submission of data. The OSDB website went live in November 2015. Concurrently, the GitHub repository was made available at https://github.com/stuchalk/OSDB/, and is open for collaborators to join the project, submit issues, and contribute code. The combination of a scripting environment (PHPStorm), a PHP framework (CakePHP), a relational database (MySQL) and a code repository (GitHub) provides all the capabilities needed to easily develop REST-based websites for ingestion, curation and exposure of open chemical data to the community at all levels. It is hoped this software stack (or equivalent ones in other scripting languages) will be leveraged to make more chemical data available for both humans and computers.
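    Because the OSDB exposes a REST interface, a client query might look like the sketch below. The endpoint path and parameter name are assumptions made for illustration, not taken from the OSDB documentation; only the base URL appears in the abstract.

```python
import requests

BASE = "http://osdb.info/api"   # hypothetical API root

def search_spectra(inchikey: str):
    """Fetch spectrum records matching an InChIKey (assumed endpoint)."""
    resp = requests.get(f"{BASE}/spectra",
                        params={"inchikey": inchikey},
                        timeout=30)
    resp.raise_for_status()
    return resp.json()          # assumed JSON list of matching records

# Example (hypothetical): spectra = search_spectra("CSCPPACGZOOCGX-UHFFFAOYSA-N")
```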

  5. Construct User Guide

    DTIC Science & Technology

    2012-11-01

    validation using calibrated grounding. In 2007 BRIMS Conference Proceedings, Norfolk, VA. Simon, H. A. (1957). Administrative Behavior: A study of...Construct will write the output to the directory specified by the path name. Users should ensure that if they have opened any output files (e.g., in Excel... open an input file, it will exit and close. There are times when an error message is not present to the user in this situation! Users should ensure

  6. Technical Report for the Period 1 October 1987 - 30 September 1989

    DTIC Science & Technology

    1990-03-01

    low pass filter results. -dt dt specifies the sampling rate in seconds. -gin specifies .w file (binary waveform data) input. - gout specifies .w file...waves arriving at moderate incidence angles, * high signal-to-noise ratio (SNR). The following assumptions are made, for simplicity* * additive...spatially uncorrelated noise, * simple signal model, free of refraction and scattering effects. This study is limited to the case of a plane incident P

  7. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation, and makes pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI; it opens and displays the Protein Data Bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.
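    For context, the kind of MODELLER script that a wrapper like SWIFT MODELLER writes and runs for the user resembles the minimal homology-modeling sketch below, using MODELLER's classic automodel API. The file name and alignment codes are placeholders, and running it requires the MODELLER package and a license key.

```python
# Minimal MODELLER homology-modeling script (MODELLER 9.x-style API).
# "target_template.ali", "template_pdb" and "target_seq" are placeholder
# names, not values taken from the paper.
from modeller import environ
from modeller.automodel import automodel

env = environ()                               # set up the MODELLER environment
a = automodel(env,
              alnfile="target_template.ali",  # target-template alignment file
              knowns="template_pdb",          # template code in the alignment
              sequence="target_seq")          # target code in the alignment
a.starting_model = 1
a.ending_model = 5                            # build five candidate models
a.make()                                      # run comparative modeling
```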

  8. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
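    The weighted figure of complexity described above reduces to a dot product of statement counts and user-assigned weights. The sketch below shows the idea with invented statement classes and weights; it is not SAP's shipped weight file or keyword classification.

```python
# Hypothetical per-module statement counts and user-assigned weights.
counts = {"IF": 12, "GOTO": 7, "DO": 5, "ASSIGN": 140}
weights = {"IF": 2.0, "GOTO": 5.0, "DO": 1.5, "ASSIGN": 0.1}

# Figure of complexity: weighted sum of statement occurrences,
# defaulting to weight 1.0 for unlisted statement classes.
complexity = sum(n * weights.get(stmt, 1.0) for stmt, n in counts.items())
print(f"figure of complexity: {complexity:.1f}")   # 24 + 35 + 7.5 + 14 = 80.5
```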

  9. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  10. 78 FR 76529 - Members of a Family for Purpose of Filing CBP Family Declaration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ...This final rule affects persons eligible to file a single customs declaration. The final rule expands the definitions of family members residing in one household. As a result of this expansion, more U.S. returning resident and non-resident visitor families will be eligible to file a single customs declaration, and correspondingly, more U.S. returning resident family members may group their personal duty exemptions.

  11. Supporting the operational use of process based hydrological models and NASA Earth Observations for use in land management and post-fire remediation through a Rapid Response Erosion Database (RRED).

    NASA Astrophysics Data System (ADS)

    Miller, M. E.; Elliot, W.; Billmire, M.; Robichaud, P. R.; Banach, D. M.

    2017-12-01

    We have built a Rapid Response Erosion Database (RRED, http://rred.mtri.org/rred/) for the continental United States to allow land managers to access properly formatted spatial model inputs for the Water Erosion Prediction Project (WEPP). Spatially-explicit process-based models like WEPP require spatial inputs that include digital elevation models (DEMs), soil, climate and land cover. The online database delivers either a 10m or 30m USGS DEM, land cover derived from the Landfire project, and soil data derived from SSURGO and STATSGO datasets. The spatial layers are projected into UTM coordinates and pre-registered for modeling. WEPP soil parameter files are also created along with linkage files to match both spatial land cover and soils data with the appropriate WEPP parameter files. Our goal is to make process-based models more accessible by preparing spatial inputs ahead of time allowing modelers to focus on addressing scenarios of concern. The database provides comprehensive support for post-fire hydrological modeling by allowing users to upload spatial soil burn severity maps, and within moments returns spatial model inputs. Rapid response is critical following natural disasters. After moderate and high severity wildfires, flooding, erosion, and debris flows are a major threat to life, property and municipal water supplies. Mitigation measures must be rapidly implemented if they are to be effective, but they are expensive and cannot be applied everywhere. Fire, runoff, and erosion risks also are highly heterogeneous in space, creating an urgent need for rapid, spatially-explicit assessment. The database has been used to help assess and plan remediation on over a dozen wildfires in the Western US. Future plans include expanding spatial coverage, improving model input data and supporting additional models. Our goal is to facilitate the use of the best possible datasets and models to support the conservation of soil and water.

  12. ACON: a multipurpose production controller for plasma physics codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, C.

    1983-01-01

    ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include the ability to retry after Mass failures, backup options for saving files, startup messages for the various codes, and the ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.

  13. Flexibility and Performance of Parallel File Systems

    NASA Technical Reports Server (NTRS)

    Kotz, David; Nieuwejaar, Nils

    1996-01-01

    As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (API's). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
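    As an illustration of the access pattern a strided read interface serves, the sketch below gathers every k-th fixed-size record from a flat binary file in plain Python. The function name and parameters are invented for illustration; they are not the paper's API.

```python
def strided_read(path, record_size, stride, count, offset=0):
    """Read `count` records of `record_size` bytes, taking every
    `stride`-th record starting at byte `offset`. A strided file-system
    interface would express this as one call instead of many seeks."""
    records = []
    with open(path, "rb") as f:
        for i in range(count):
            f.seek(offset + i * stride * record_size)
            records.append(f.read(record_size))
    return records

# e.g. read 4 records of 8 bytes each, taking every 2nd record:
# chunks = strided_read("data.bin", record_size=8, stride=2, count=4)
```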

  14. User's guide for mapIMG 3--Map image re-projection software package

    USGS Publications Warehouse

    Finn, Michael P.; Mattli, David M.

    2012-01-01

    Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command-line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path.
    Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI interface that was built using the Qt library for cross-platform development.
    Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last-used input and output folders, and for TIFF/GeoTIFF input images.
    Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added resampling methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML and was switched to a new naming convention.
    Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less-than-global datasets, and UTM support, and the whole codebase was ported to Qt4.

  15. correlcalc: Two-point correlation function from redshift surveys

    NASA Astrophysics Data System (ADS)

    Rohin, Yeluripati

    2017-11-01

    correlcalc calculates the two-point correlation function (2pCF) of galaxies/quasars using redshift surveys. It can be used for any assumed geometry or cosmology model. Using BallTree algorithms to reduce the computational effort for large datasets, it is a parallelised code suitable for running on clusters as well as personal computers. It takes redshift (z), Right Ascension (RA) and Declination (DEC) data of galaxies and random catalogs as inputs in the form of ASCII or FITS files. If a random catalog is not provided, it generates one of the desired size based on the input redshift distribution and a mangle polygon file (in .ply format) describing the survey geometry. It also calculates different realisations of the (3D) anisotropic 2pCF. Optionally, it makes healpix maps of the survey, providing visualization.
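    A minimal pair-counting sketch in the spirit of correlcalc, assuming scikit-learn's BallTree and its two_point_correlation method (which returns cumulative pair counts within each radius): toy Cartesian positions stand in for survey coordinates, and the Landy-Szalay estimator is formed from the DD, DR, and RR counts. This is an illustration, not correlcalc's own code, which works from (z, RA, DEC) and the survey geometry.

```python
import numpy as np
from sklearn.neighbors import BallTree

rng = np.random.default_rng(0)
data = rng.random((2000, 3))     # toy "galaxy" positions
rand = rng.random((8000, 3))     # toy random catalog
bins = np.linspace(0.01, 0.2, 11)

def pair_counts(a, b, r):
    # cumulative counts of a-b pairs within each radius in r
    return BallTree(b).two_point_correlation(a, r)

# per-bin counts via differencing the cumulative counts
dd = np.diff(pair_counts(data, data, bins).astype(float))
dr = np.diff(pair_counts(data, rand, bins).astype(float))
rr = np.diff(pair_counts(rand, rand, bins).astype(float))

n_d, n_r = len(data), len(rand)
dd /= n_d * n_d
dr /= n_d * n_r
rr /= n_r * n_r
xi = (dd - 2 * dr + rr) / rr     # Landy-Szalay estimator
print(np.round(xi, 3))            # ~0 everywhere for uniform random data
```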

  16. Easy boundary definition for EGUN

    NASA Astrophysics Data System (ADS)

    Becker, R.

    1989-06-01

    The relativistic electron optics program EGUN [1] has reached a broad distribution, and many users have asked for an easier way of boundary input. A preprocessor to EGUN has been developed that accepts polygonal input of boundary points, and offers features such as rounding off of corners, shifting and squeezing of electrodes and simple input of slanted Neumann boundaries. This preprocessor can either be used on a PC that is linked to a mainframe using the FORTRAN version of EGUN, or in connection with the version EGNc, which also runs on a PC. In any case, direct graphic response on the PC greatly facilitates the creation of correct input files for EGUN.

  17. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  18. Technology for Analysis of Student Interactions With Complex Programs. Final Report for Period January 1972-February 1973.

    ERIC Educational Resources Information Center

    Lukas, George; Feurzeig, Wallace

    A description is provided of a computer system designed to aid in the analysis of student programing work. The first section of the report consists of an overview and user's guide. In it, the system input is described in terms of a "dribble file" which records all student inputs generated; also an introduction is given to the aids…

  19. THERMINATOR 2: THERMal heavy IoN generATOR 2

    NASA Astrophysics Data System (ADS)

    Chojnacki, Mikołaj; Kisiel, Adam; Florkowski, Wojciech; Broniowski, Wojciech

    2012-03-01

    We present an extended version of THERMINATOR, a Monte Carlo event generator dedicated to studies of the statistical production of particles in relativistic heavy-ion collisions. The package is written in C++ and uses the CERN ROOT data-analysis environment. The largely increased functionality of the code contains the following main features: 1) The possibility of input of any shape of the freeze-out hypersurface and the expansion velocity field, including the 3+1-dimensional profiles, in particular those generated externally with various hydrodynamic codes. 2) The hypersurfaces may have variable thermal parameters, which allow studies departing significantly from the mid-rapidity region where the baryon chemical potential becomes large. 3) We include a library of standard sets of hypersurfaces and velocity profiles describing the RHIC Au + Au data at √s = 200 GeV for various centralities, as well as those anticipated for the LHC Pb + Pb collisions at √s = 5.5 TeV. 4) A separate code, FEMTO-THERMINATOR, is provided to carry out the analysis of the pion-pion femtoscopic correlations, which are an important source of information concerning the size and expansion of the system. 5) We also include several useful scripts that carry out auxiliary tasks, such as obtaining an estimate of the number of elastic collisions after the freeze-out, counting of particles flowing back into the fireball and violating causality (typically very few), or visualizing various results: the particle pT-spectra, the elliptic flow coefficients, and the HBT correlation radii.
    Program summary
    Program title: THERMINATOR 2
    Catalogue identifier: ADXL_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXL_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 423 444
    No. of bytes in distributed program, including test data, etc.: 2 854 602
    Distribution format: tar.gz
    Programming language: C++ with the CERN ROOT libraries, BASH shell
    Computer: Any with a C++ compiler and the CERN ROOT environment, ver. 5.26 or later; tested with Intel Core2 Duo CPU E8400 @ 3 GHz, 4 GB RAM
    Operating system: Linux Ubuntu 10.10 x64 (gcc 4.4.5), ROOT 5.26; Linux Ubuntu 11.04 x64 (gcc Ubuntu/Linaro 4.5.2-8ubuntu4), ROOT 5.30/00 (compiled from source); Linux CentOS 5.2 (gcc Red Hat 4.1.2-42), ROOT 5.30/00 (compiled from source); Mac OS X 10.6.8 (i686-apple-darwin10-g++-4.2.1), ROOT 5.30/00 (for Mac OS X 10.6 x86-64 with gcc 4.2.1); cygwin-1.7.9-1 (gcc gcc4-g++-4.3.4-4), ROOT 5.30/00 (for cygwin gcc 4.3)
    RAM: 30 MB (therm2_events), 150 MB (therm2_femto)
    Classification: 11.2
    Catalogue identifier of previous version: ADXL_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 669
    External routines: CERN ROOT (http://root.cern.ch/drupal/)
    Does the new version supersede the previous version?: Yes
    Nature of problem: Particle production via statistical hadronization in relativistic heavy-ion collisions.
    Solution method: Monte Carlo simulation, analyzed with ROOT.
    Reasons for new version: The increased functionality of the code contains the following important features. The input of any shape of the freeze-out hypersurface and the expansion velocity field, including the 3+1-dimensional profiles, in particular those generated externally with the various popular hydrodynamic codes. The hypersurfaces may have variable thermal parameters, which allows for studies departing significantly from the mid-rapidity region. We include a library of standard sets of hypersurfaces and velocity profiles describing the RHIC Au + Au and the LHC Pb + Pb data. A separate code, FEMTO-THERMINATOR, is provided to carry out the analysis of femtoscopic correlations.
    Summary of revisions: THERMINATOR 2 incorporates major revisions to encompass the enhanced functionality.
    Classes: The Integrator class has been expanded and a new subgroup of classes defined.
    Model and abstract class: These classes are responsible for the physical models of the freeze-out process. The functionality and readability of the code has been substantially increased by implementing each freeze-out model in a different class. The Hypersurface class was added to handle the input from hydrodynamic codes. The hydro input is passed to the program as a lattice of the freeze-out hypersurface. That information is stored in the .xml files.
    Input: THERMINATOR 2 programs are now controlled by *.ini type files. The program parameters and the freeze-out model parameters are now in separate ini files.
    Output: The event files generated by the therm2_events program are not backward compatible with the previous version. The event*.root file structure was expanded with two new TTree structures. From the particle entry it is possible to back-trace the whole cascade. Event text output is now optional. The ROOT macros produce the *.eps figures with physics results, e.g. the pT-spectra, the elliptic-flow coefficient, rapidity distributions, etc. The THERMINATOR HBT package creates the ROOT files femto*.root (therm2_femto) and hbtfit*.root (therm2_hbtfit).
    Directory structure: The directory structure has been reorganized. Source code resides in the build directory. The freeze-out model input files, event files, and ROOT macros are stored separately. The THERMINATOR 2 system, after installation, is able to run on a cluster.
    Scripts: The package contains a few helpful BASH scripts; when running on a cluster, for example, the whole system can be executed via a single script.
    Additional comments: Typical data file size (default configuration): 45 MB/500 events; 35 MB/correlation file (one k bin); 45 kB/fit file (projections and fits).
    Running time: Default configuration at 3 GHz: primordial multiplicities, 70 min (calculated only once per case); 8 min/500 events; 10 min to draw all figures; 25 min/one k bin in the HBT analysis with 5000 events.

  20. Nickel-Titanium Single-file System in Endodontics.

    PubMed

    Dagna, Alberto

    2015-10-01

    This work describes clinical cases treated with an innovative single-use, single-file nickel-titanium (NiTi) system used in continuous rotation. Nickel-titanium files are commonly used for root canal treatment, but they tend to break because of bending and torsional stresses. Today, new instruments intended for a single treatment have been introduced. They help the clinician make root canal shaping easier and safer because they do not require sterilization and are discarded after use. A new sterile instrument is used for each treatment in order to reduce the possibility of fracture inside the canal. The new One Shape NiTi single-file instrument belongs to this group. One Shape is used for complete shaping of the root canal after adequate preflaring. Its protocol is simple, and some clinical cases are presented. It is helpful for easy cases and reliable for difficult canals. After 2 years of clinical practice, One Shape seems to be helpful for the treatment of most root canals, with low risk of separation. After each treatment, the instrument is discarded rather than sterilized in an autoclave and re-used. This single-use file simplifies endodontic therapy, because only one instrument is required for canal shaping in many cases. Respecting the clinical protocol guarantees predictable, good results.

  1. iPat: intelligent prediction and association tool for genomic research.

    PubMed

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.

  2. Revised Extended Grid Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martz, Roger L.

    The Revised Eolus Grid Library (REGL) is a mesh-tracking library that was developed for use with the MCNP6(TM) computer code so that (radiation) particles can track on an unstructured mesh. The unstructured mesh is a finite element representation of any geometric solid model created with a state-of-the-art CAE/CAD tool. The mesh-tracking library is written using modern Fortran and programming standards; the library is Fortran 2003 compliant. The library was created with a defined application programmer interface (API) so that it could easily integrate with other particle tracking/transport codes. The library does not handle parallel processing via the message passing interface (mpi), but has been used successfully where the host code handles the mpi calls. The library is thread-safe and supports the OpenMP paradigm. As a library, all features are available through the API, and overall a tight coupling between it and the host code is required. Features of the library are summarized with the following list: can accommodate first and second order 4, 5, and 6-sided polyhedra; any combination of element types may appear in a single geometry model; parts may not contain tetrahedra mixed with other element types; pentahedra and hexahedra can be together in the same part; robust handling of overlaps and gaps; tracks element-to-element to produce path length results at the element level; finds element numbers for a given mesh location; finds intersection points on element faces for the particle tracks; produces a data file for post-processing results analysis; reads Abaqus .inp input (ASCII) files to obtain information for the global mesh-model; supports parallel input processing via mpi; and supports parallel particle transport by both mpi and OpenMP.

  3. Operation of the helicopter antenna radiation prediction code

    NASA Technical Reports Server (NTRS)

    Braeden, E. W.; Klevenow, F. T.; Newman, E. H.; Rojas, R. G.; Sampath, K. S.; Scheik, J. T.; Shamansky, H. T.

    1993-01-01

    HARP is a front end as well as a back end for the AMC and NEWAIR computer codes. These codes use the Method of Moments (MM) and the Uniform Geometrical Theory of Diffraction (UTD), respectively, to calculate the electromagnetic radiation patterns for antennas on aircraft. The major difficulty in using these codes is in the creation of proper input files for particular aircraft and in verifying that these files are, in fact, what is intended. HARP creates these input files in a consistent manner and allows the user to verify them for correctness using sophisticated 2D and 3D graphics. After antenna field patterns are calculated using either MM or UTD, HARP can display the results on the user's screen or provide hardcopy output. Because the process of collecting data, building the 3D models, and obtaining the calculated field patterns is completely automated by HARP, the researcher's productivity can be many times what it could be if these operations had to be done by hand. A complete, step-by-step guide is provided so that the researcher can quickly learn to make use of all the capabilities of HARP.

  4. Shaping ability of 4 different single-file systems in simulated S-shaped canals.

    PubMed

    Saleh, Abdulrahman Mohammed; Vakili Gilani, Pouyan; Tavanafar, Saeid; Schäfer, Edgar

    2015-04-01

    The aim of this study was to compare the shaping ability of 4 different single-file systems in simulated S-shaped canals. Sixty-four S-shaped canals in resin blocks were prepared to an apical size of 25 using Reciproc (VDW, Munich, Germany), WaveOne (Dentsply Maillefer, Ballaigues, Switzerland), OneShape (Micro Méga, Besançon, France), and F360 (Komet Brasseler, Lemgo, Germany) (n = 16 canals/group) systems. Composite images were made from the superimposition of pre- and postinstrumentation images. The amount of resin removed by each system was measured by using a digital template and image analysis software. Canal aberrations and the preparation time were also recorded. The data were statistically analyzed by using analysis of variance, Tukey, and chi-square tests. Canals prepared with the F360 and OneShape systems were better centered compared with the Reciproc and WaveOne systems. Reciproc and WaveOne files removed significantly greater amounts of resin from the inner side of both curvatures (P < .05). Instrumentation with OneShape and Reciproc files was significantly faster compared with WaveOne and F360 files (P < .05). No instrument fractured during canal preparation. Under the conditions of this study, all single-file instruments were safe to use and were able to prepare the canals efficiently. However, single-file systems that are less tapered seem to be more favorable when preparing S-shaped canals. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  5. MIMS for TRIM

    EPA Pesticide Factsheets

    MIMS supports complex computational studies that use multiple interrelated models/programs, such as the modules within TRIM. MIMS is used by TRIM to run various models in sequence, while sharing input and output files.

  6. IDEA: Interactive Display for Evolutionary Analyses.

    PubMed

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  7. IDEA: Interactive Display for Evolutionary Analyses

    PubMed Central

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-01-01

    Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data. PMID:19061522

  8. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    PubMed

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single-file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single-file systems OneShape and WaveOne, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single-file continuous rotation; and WaveOne WO (group III), single-file reciprocal motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to group II OS (0.12±0.07 and 0.54±0.24) and group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was only a minor difference between the tested groups. Single-file systems demonstrated canal transportation and centering ability comparable to the full-sequence ProTaper system in curved root canals.

  9. [In vitro study of shaping ability of single-file techniques in curved canals].

    PubMed

    Zeng, Yajie; Gu, Lisha; Cai, Yanling; Chen, Dian; Wei, Xi

    2014-11-01

    To compare the shaping quality in curved canals of two single-file technique systems with two traditional full-sequence systems. Eighty mature molar canals with curvatures between 20 and 45 degrees were randomly divided into four groups. Specimens in each group were prepared to size 25 at working length using A (Reciproc), B (OneShape), C (MTwo) and D (Revo S), respectively. Each canal was scanned by micro-computed tomography before and after preparation. Parameters including changes in dentine volume, percentage of uninstrumented area, and degree and tendency of transportation were analyzed. The operating time was also recorded. Over the full canal length, there was no difference in canal dentine removal, instrumented percentage or transportation degree among the four groups (P > 0.05). In the apical 4 mm region, group A removed more dentine [(2.14±0.76) mm² of canal surface area and (0.38±0.15) mm³ of canal volume] than groups B and C (P < 0.05). At the 1 mm level, the median transportation degree of group A was 0.05 (0.03) mm, which was smaller than that of the other groups (P < 0.05). Groups A and B took (86.3±24.6) s and (85.9±21.3) s, while groups C and D took (147.4±28.3) s and (126.3±27.7) s, respectively, to finish preparation; single-file techniques were significantly faster than the two full-sequence systems (P < 0.01). Compared with the continuous rotary systems, the reciprocating single-file system A showed better apical shaping ability. Both single-file techniques were more efficient than the full-sequence systems for curved canal preparation. Single-file techniques appear to be an effective and efficient method for curved canal preparation.

  10. On the role of adhesion in single-file dynamics

    NASA Astrophysics Data System (ADS)

    Fouad, Ahmed M.; Noel, John A.

    2017-08-01

    For a one-dimensional interacting system of Brownian particles with hard-core interactions (a single-file model), we study the effect of adhesion on both the collective diffusion (diffusion of the entire system with respect to its center of mass) and the tracer diffusion (diffusion of the individual tagged particles). For the case with no adhesion, all properties of these particle systems that are independent of particle labeling (symmetric in all particle coordinates and velocities) are identical to those of non-interacting particles (Lebowitz and Percus, 1967). We substantiate this fact in two ways. First, we derive analytical predictions showing that the probability-density functions of single-file (ρ_sf) and ordinary (ρ_ord) diffusion are identical, ρ_sf = ρ_ord, predicting a nonanomalous (ordinary) behavior for the collective single-file diffusion, where the average second moment with respect to the center of mass, ⟨x(t)²⟩, is calculated from ρ for both diffusion processes. Second, for single-file diffusion, we show, both analytically and through large-scale simulations, that ⟨x(t)²⟩ grows linearly with time, confirming the nonanomalous behavior. This nonanomalous collective behavior stands in contrast to the well-known anomalous sub-diffusion of the individual tagged particles (Harris, 1965). We introduce adhesion to single-file dynamics as a second inter-particle interaction rule and, interestingly, we show that adding adhesion does reduce the magnitudes of both ⟨x(t)²⟩ and the mean square displacement per particle Δx²; but the diffusion behavior remains intact, independent of adhesion, in both cases. Moreover, we study the dependence of both the collective diffusion constant D and the tracer diffusion constant D_T on the adhesion coefficient α.
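
    The label-exchange equivalence invoked above (tagged single-file trajectories are order statistics of non-interacting Brownian trajectories) is easy to demonstrate numerically. The Python sketch below is a toy illustration of that classical result, not the authors' code; the particle number, time step and diffusion constant are arbitrary example values.

```python
import numpy as np

# Toy demonstration (not the paper's code) of the label-exchange trick:
# hard-core single-file trajectories are the sorted positions of
# independent Brownian walkers, so sorting after each free step
# reproduces both tagged and collective single-file statistics.
rng = np.random.default_rng(0)
N, steps, dt, D = 201, 2000, 1e-3, 1.0   # arbitrary example parameters

x = np.sort(rng.uniform(-1.0, 1.0, N))   # ordered initial positions
x0 = x.copy()
tag = N // 2                             # tag the middle particle

for _ in range(steps):
    x += np.sqrt(2 * D * dt) * rng.normal(size=N)   # free Brownian kicks
    x.sort()                             # restore single-file ordering

# Over an ensemble, the tagged MSD grows anomalously (~ t^(1/2)) while
# the center of mass diffuses ordinarily, with MSD ~ t / N.
print("tagged particle displacement^2:", (x[tag] - x0[tag]) ** 2)
print("center-of-mass displacement^2:", (x.mean() - x0.mean()) ** 2)
```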

  11. HDFT Webtool

    EPA Pesticide Factsheets

    Because HSPF requires extensive input data, its Data-Formatting Tool (HDFT) allows users to format that data and import it to a WDM file. HDFT aids urban watershed modeling applications that use sub-hourly temporal resolutions.

  12. Generic Procedure for Coupling the PHREEQC Geochemical Modeling Framework with Flow and Solute Transport Simulators

    NASA Astrophysics Data System (ADS)

    Wissmeier, L. C.; Barry, D. A.

    2009-12-01

    Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most utilized software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between both genres of programs evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as ‘close couplings’, where programs pass information via the memory stack at runtime, and ‘remote couplings’, where the information is exchanged at each time step via input/output files. The former generally involves modifications of software codes, and therefore expert programming skills are required. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for couplings to FSTs for liquid phase flow in natural environments. It requires access to initial conditions and numerical parameters, such as the time and space discretization, in the FST's text input file, and control of the FST via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC's capability to save the state of a simulation, with all solid, liquid and gaseous species, as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword. The output from one reaction calculation step is thereby reused as input for the following reaction step, where changes in element amounts due to advection/dispersion are introduced as irreversible reactions. An example of the coupling of PHREEQC and MATLAB for the solution of unsaturated flow and transport is provided.
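
    A minimal sketch of this remote-coupling loop, written in Python rather than MATLAB, might look as follows. The executable name, database name and file names are placeholder assumptions; a real coupling writes complete PHREEQC keyword blocks (including the TRANSPORT dump-file option described above) and parses the dump file back in at each step.

```python
import subprocess
from pathlib import Path

# Sketch of the remote-coupling (operator splitting) loop described above.
# Assumptions: a command-line PHREEQC binary invoked as
#   phreeqc <input> <output> <database>
# plus placeholder file names and a trivial initial SOLUTION block.
PHREEQC = "phreeqc"                     # assumed executable name
DATABASE = "phreeqc.dat"                # assumed thermodynamic database
n_steps = 10                            # number of coupling time steps

state = Path("state.pqi")               # current chemical state as input
state.write_text("SOLUTION 1\n    pH 7\n    temp 25\nEND\n")

for step in range(n_steps):
    # 1) reaction step: react the current state; PHREEQC dumps the
    #    reacted state so it can seed the next iteration
    subprocess.run([PHREEQC, str(state), f"out_{step}.pqo", DATABASE],
                   check=True)
    # 2) transport step: the chosen FST runs here (batch/shell call),
    #    updating element totals by advection/dispersion
    # 3) re-initialization: rewrite state.pqi from PHREEQC's dump file,
    #    adding transport-induced element changes as irreversible
    #    reactions (parsing omitted; formats depend on the chosen FST)
```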

  13. Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2008-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance, e.g., 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags), to provide a path to the nominal model parameter file. And, as shown in [1], having the ability to quickly mark and identify model parameter uncertainties facilitates error propagation, which in turn yields output uncertainties.
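
    To make the markup concrete, here is a small Python sketch of a parser for the two simplest tolerance forms quoted above (absolute and percent, with an optional quoted tag). It is an illustration only; the actual DSL in [1] additionally covers order-of-magnitude tolerances, distributions, and format suffixes.

```python
import re

# Parse tolerances of the form "1 +/- 0.5" or "5 +/- 10%", optionally
# followed by a quoted tag such as 0.7 +/- 20% 'T_effective'.
TOL = re.compile(
    r"(?P<nominal>[-+]?\d*\.?\d+)\s*\+/-\s*"
    r"(?P<tol>\d*\.?\d+)(?P<pct>%)?"
    r"(?:\s*'(?P<tag>[^']*)')?")

def parse(text):
    """Yield (nominal, absolute_tolerance, tag) for each marked value."""
    for m in TOL.finditer(text):
        nominal = float(m.group("nominal"))
        tol = float(m.group("tol"))
        if m.group("pct"):                 # percent tolerance -> absolute
            tol = abs(nominal) * tol / 100.0
        yield nominal, tol, m.group("tag")

line = "k_forward 0.7 +/- 20% 'T_effective'  ea 1 +/- 0.5"
for nominal, tol, tag in parse(line):
    print(nominal, tol, tag)   # -> 0.7 0.14 T_effective / 1.0 0.5 None
```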

  14. Histological evaluation of the cleaning effectiveness of two reciprocating single-file systems in severely curved root canals: Reciproc versus WaveOne.

    PubMed

    Carvalho, Maira de Souza; Junior, Emílio Carlos Sponchiado; Bitencourt Garrido, Angela Delfina; Roberti Garcia, Lucas da Fonseca; Franco Marques, André Augusto

    2015-01-01

    The aim of this study was to evaluate the cleaning effectiveness achieved with two reciprocating single-file systems in severely curved root canals: Reciproc and WaveOne. Twenty-five mesial roots of mandibular molars were randomly separated into two groups, according to the instrumentation system used. The negative control group consisted of five specimens that were not instrumented. The mesial canals (buccal and lingual) in the Reciproc group were instrumented with the R25 file and those in the WaveOne group with the Primary file. The samples were submitted to histological processing and analyzed under a digital microscope. The WaveOne group presented a larger amount of debris than the Reciproc group, although without a statistically significant difference (P > 0.05). A larger amount of debris was observed in the control group, with a statistically significant difference from the Reciproc and WaveOne groups (P < 0.05). The two reciprocating single-file instrumentation systems presented similar effectiveness for root canal cleaning.

  15. Development of interactive graphic user interfaces for modeling reaction-based biogeochemical processes in batch systems with BIOGEOCHEM

    NASA Astrophysics Data System (ADS)

    Chang, C.; Li, M.; Yeh, G.

    2010-12-01

    The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed with FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions including aqueous complexation, adsorption/desorption, ion-exchange, redox, precipitation/dissolution, acid-base reactions, and microbially mediated reactions is embodied in this unique modeling tool. Any reaction can be treated as fast/equilibrium or slow/kinetic. An equilibrium reaction is modeled with an implicit finite rate governed by a mass action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate with an elementary rate, microbially mediated enzymatic kinetics, or a user-specified rate equation. None of the existing models has encompassed this wide array of scopes. To ease the input/output learning curve for the unique features of BIOGEOCHEM, an interactive graphic user interface was developed with the Microsoft Visual Studio and .Net tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically formatted to conform to the input file format of BIOGEOCHEM. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even if users have little prior experience with FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.

  16. Performance of the Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  17. File formats commonly used in mass spectrometry proteomics.

    PubMed

    Deutsch, Eric W

    2012-12-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics.

  18. Lipid-converter, a framework for lipid manipulations in molecular dynamics simulations

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2014-01-01

    Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce lipidconverter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes. PMID:25081234

  19. ZOOM Lite: next-generation sequencing data mapping and visualization software

    PubMed Central

    Zhang, Zefeng; Lin, Hao; Ma, Bin

    2010-01-01

    High-throughput next-generation sequencing technologies pose increasing demands on the efficiency, accuracy and usability of data analysis software. In this article, we present ZOOM Lite, a software for efficient reads mapping and result visualization. With a kernel capable of mapping tens of millions of Illumina or AB SOLiD sequencing reads efficiently and accurately, and an intuitive graphical user interface, ZOOM Lite integrates reads mapping and result visualization into an easy-to-use pipeline on a desktop PC. The software handles both single-end and paired-end reads, and can output either the unique mapping result or the top N mapping results for each read. Additionally, the software accepts a variety of input file formats and outputs to several commonly used result formats. The software is freely available at http://bioinfor.com/zoom/lite/. PMID:20530531

  20. The program complex for vocal recognition

    NASA Astrophysics Data System (ADS)

    Konev, Anton; Kostyuchenko, Evgeny; Yakimuk, Alexey

    2017-01-01

    This article discusses the possibility of applying an algorithm for determining the pitch frequency to note recognition problems. A preliminary survey of analogous programs offering music-recognition functionality was carried out. The software package based on the algorithm for pitch frequency calculation was implemented and tested. It was shown that the algorithm allows recognizing the notes in the user's vocal performance. The sound source can be a single musical instrument, a set of musical instruments, or a human voice humming a tune. The input file is initially presented in the .wav format or is recorded in this format from a microphone. Processing is performed by sequentially determining the pitch frequency and converting its values to notes. Based on the test results, a modification of the algorithms used in the complex was planned.
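
    As an illustration of the two-stage pipeline described above (pitch frequency estimation followed by note conversion), here is a small self-contained Python sketch. Autocorrelation is used here as a stand-in pitch estimator, and the MIDI convention maps frequency to note names; the paper's own pitch algorithm may differ.

```python
import numpy as np

# Stage 1: estimate pitch frequency of a frame (autocorrelation method).
# Stage 2: convert frequency to the nearest note via the MIDI convention.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def pitch_autocorr(frame, fs, fmin=80.0, fmax=1000.0):
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)   # plausible lag range
    lag = lo + int(np.argmax(ac[lo:hi]))      # lag of strongest periodicity
    return fs / lag

def freq_to_note(f):
    midi = int(round(69 + 12 * np.log2(f / 440.0)))   # A4 = 440 Hz = MIDI 69
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

fs = 44100
t = np.arange(fs) / fs
frame = np.sin(2 * np.pi * 261.63 * t[:2048])    # C4 test tone
print(freq_to_note(pitch_autocorr(frame, fs)))   # -> "C4"
```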

  1. Pulse-Echo Ultrasonic Imaging Method for Eliminating Sample Thickness Variation Effects

    NASA Technical Reports Server (NTRS)

    Roth, Don J. (Inventor)

    1997-01-01

    A pulse-echo, immersion method for ultrasonic evaluation of a material which accounts for and eliminates nonlevelness in the equipment set-up and sample thickness variation effects employs a single transducer and automatic scanning and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness variation effects are accounted for by pre-scan adjustments of the time window to ensure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then mapped proportionally to a color or grey scale and displayed on a video screen.
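
    The cross-correlation velocity estimate works by finding the delay between successive gated echoes and converting it to a through-thickness velocity. The Python sketch below illustrates the idea on synthetic echoes; the sampling rate, thickness and echo shape are invented for the example and are not taken from the patent.

```python
import numpy as np

# Cross-correlation time-of-flight sketch on synthetic echoes: find the
# delay between two back-surface echoes, then velocity = 2*thickness/delay.
fs = 100e6                      # sampling rate, Hz (assumed)
thickness = 5e-3                # sample thickness, m (assumed)
t = np.arange(4096) / fs

def echo(t0):                   # toy Gaussian-windowed echo centered at t0
    return np.exp(-((t - t0) / 0.2e-6) ** 2) * np.cos(2e7 * np.pi * (t - t0))

true_v = 6000.0                               # m/s, e.g. a dense ceramic
round_trip = 2 * thickness / true_v
sig1, sig2 = echo(5e-6), echo(5e-6 + round_trip)

xc = np.correlate(sig2, sig1, mode="full")    # cross-correlate the echoes
lag = int(np.argmax(xc)) - (len(t) - 1)       # lag (samples) of peak
delay = lag / fs
print(2 * thickness / delay)                  # recovered velocity ~6000 m/s
```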

  2. Incorporating Brokers within Collaboration Environments

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; de Torcy, A.

    2013-12-01

    A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assembles the required input files, automates the execution of the workflow, automatically tracks the provenance of the workflow, and shares the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.

  3. VORTAB - A data-tablet method of developing input data for the VORLAX program

    NASA Technical Reports Server (NTRS)

    Denn, F. M.

    1979-01-01

    A method of developing an input data file for use in the aerodynamic analysis of a complete airplane with the VORLAX computer program is described. The hardware consists of an interactive graphics terminal equipped with a graphics tablet. Software includes graphics routines from the Tektronix PLOT 10 package as well as the VORTAB program described. The user determines the size and location of each of the major panels for the aircraft before using the program. Data are entered both from the terminal keyboard and the graphics tablet. The size of the resulting data file depends on the complexity of the model and can vary from ten to several hundred card images. After the data are entered, two programs, READB and PLOTB, are executed which plot the configuration, allowing visual inspection of the model.

  4. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.

    PubMed

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-05

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
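
    For orientation, reading the core photon stream from a Photon-HDF5 file takes only a few lines with a generic HDF5 library. The Python sketch below assumes the group and field names from the published specification (photon-hdf5.org); verify them against the spec, and prefer the reference library phconvert mentioned above for production use.

```python
import h5py

# Reading sketch; paths follow the Photon-HDF5 specification as described
# above (assumed here, not verified against a particular file).
with h5py.File("measurement.hdf5", "r") as f:
    timestamps = f["/photon_data/timestamps"][:]               # clock ticks
    unit = f["/photon_data/timestamps_specs/timestamps_unit"][()]  # seconds
    # detectors is present when more than one channel was recorded
    detectors = f["/photon_data/detectors"][:]   # per-photon channel number
    seconds = timestamps * unit                  # ticks -> seconds
    print(f"{len(seconds)} photons over {seconds[-1]:.2f} s, "
          f"channels {sorted(set(detectors))}")
```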

  5. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments

    PubMed Central

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-01

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. PMID:26745406

  6. Photon-HDF5: an open file format for single-molecule fluorescence experiments using photon-counting detectors

    DOE PAGES

    Ingargiola, A.; Laurence, T. A.; Boutelle, R.; ...

    2015-12-23

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.

  7. Comparative Evaluation of Stress Distribution in Experimentally Designed Nickel-titanium Rotary Files with Varying Cross Sections: A Finite Element Analysis.

    PubMed

    Basheer Ahamed, Shadir Bughari; Vanajassun, Purushothaman Pranav; Rajkumar, Kothandaraman; Mahalaxmi, Sekar

    2018-04-01

    Single cross-sectional nickel-titanium (NiTi) rotary instruments during continuous rotation are subjected to constant and variable stresses depending on the canal anatomy. This study was intended to create 2 new experimental, theoretical single-file designs with combinations of triple U (TU), triangle (TR), and convex triangle (CT) cross sections and to compare their bending stresses in simulated root canals with single cross-sectional instruments using finite element analysis. A 3-dimensional model of a simulated root canal with 45° curvature and NiTi files with 5 cross-sectional designs were created using Pro/ENGINEER Wildfire 4.0 software (PTC Inc, Needham, MA) and ANSYS software (version 17; ANSYS, Inc, Canonsburg, PA) for finite element analysis. The NiTi files of 3 groups had single cross-sectional shapes of the CT, TR, and TU designs, and 2 experimental groups had a CT, TR, and TU (CTU) design and a TU, TR, and CT (UTC) design. Each file was rotated in the simulated root canal to analyze the bending stress, and the von Mises stress value for every file was recorded in MPa. Statistical analysis was performed using the Kruskal-Wallis test and the Bonferroni-adjusted Mann-Whitney test for multiple pair-wise comparisons, with significance set at P < .05. The maximum bending stress of the rotary file was observed in the apical third of the CT design, whereas comparatively less stress was recorded in the CTU design. The TU and TR designs showed a similar stress pattern at the curvature, whereas the UTC design showed greater stress in the apical and middle thirds of the file in curved canals. All the file designs showed statistically significant differences. The CTU-designed instruments showed the least bending stress in a 45° angulated simulated root canal when compared with all the other tested designs. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  8. A Review of Aeromagnetic Anomalies in the Sawatch Range, Central Colorado

    USGS Publications Warehouse

    Bankey, Viki

    2010-01-01

    This report contains digital data and image files of aeromagnetic anomalies in the Sawatch Range of central Colorado. The primary product is a data layer of polygons with linked data records that summarize previous interpretations of aeromagnetic anomalies in this region. None of these data files and images are new; rather, they are presented in updated formats that are intended to be used as input to geographic information systems, standard graphics software, or map-plotting packages.

  9. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  10. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for the 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103

  11. GeNemo: a search engine for web-based functional genomic data.

    PubMed

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types has emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example, a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundreds of bases to hundreds of thousands of bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
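
    The essential difference from text search is that similarity here is computed over genomic intervals. As a toy illustration (deliberately simpler than GeNemo's Markov Chain Monte Carlo maximization), the Python sketch below scores the agreement of two BED-style peak sets by base-pair Jaccard overlap.

```python
# Toy region-based matching: score how well a query peak set overlaps a
# candidate dataset's peaks. Plain Jaccard overlap, not GeNemo's algorithm.
def to_intervals(bed_lines):
    out = []
    for line in bed_lines:
        chrom, start, end = line.split()[:3]   # BED: chrom, start, end
        out.append((chrom, int(start), int(end)))
    return sorted(out)

def overlap_bp(a, b):
    # O(n*m) for clarity; real tools use sorted sweeps or interval trees
    total = 0
    for chrom, s1, e1 in a:
        for c2, s2, e2 in b:
            if c2 == chrom:
                total += max(0, min(e1, e2) - max(s1, s2))
    return total

def jaccard(a, b):
    inter = overlap_bp(a, b)
    len_a = sum(e - s for _, s, e in a)
    len_b = sum(e - s for _, s, e in b)
    return inter / (len_a + len_b - inter)

query = to_intervals(["chr1 100 200", "chr1 400 500"])
hit = to_intervals(["chr1 150 250", "chr1 390 480"])
print(jaccard(query, hit))   # similarity in [0, 1]; here 0.5
```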

  12. RCHILD - an R-package for flexible use of the landscape evolution model CHILD

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2014-05-01

    Landscape evolution models provide powerful approaches to numerically assess earth surface processes: to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most widely used models of landscape change, particularly where tectonic and geomorphologic processes interact. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help to use CHILD in a flexible, dynamic and user-friendly way. The included functions allow creating maps, real-time scenes, animations and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples illustrate the great potential of numeric modelling of geomorphologic processes.

  13. Workflow Management for Complex HEP Analyses

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache's Airavata [1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step defines to run a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. those created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the step's perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross-section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
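
    The step abstraction with make-like execution can be captured in a few lines. The Python sketch below is a toy model of the idea, not AWM code: the step names, the file-existence completeness check and the recursive dependency resolution are illustrative assumptions.

```python
from pathlib import Path

# Toy make-like step executor: each step declares the outputs it produces
# and the steps it requires; steps run only when outputs are missing, and
# dependencies are resolved recursively before the step itself runs.
class Step:
    def __init__(self, name, outputs, requires=(), run=lambda: None):
        self.name, self.outputs = name, [Path(p) for p in outputs]
        self.requires, self.run = list(requires), run

    def complete(self):
        return all(p.exists() for p in self.outputs)

def execute(step, done=None):
    done = set() if done is None else done
    if step.name in done:
        return
    for dep in step.requires:        # build dependencies first
        execute(dep, done)
    if not step.complete():          # skip steps whose outputs exist
        print("running", step.name)
        step.run()
    done.add(step.name)

select = Step("event_selection", ["selected.txt"],
              run=lambda: Path("selected.txt").write_text("events\n"))
infer = Step("inference", ["posterior.txt"], requires=[select],
             run=lambda: Path("posterior.txt").write_text("result\n"))
execute(infer)   # runs event_selection, then inference
```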

  14. Dynamic Non-Hierarchical File Systems for Exascale Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Darrell E.; Miller, Ethan L

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort included evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.

  15. 78 FR 78353 - Hydro Green Energy, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... concrete tetrapods; and (10) a new single-circuit 230-kilovolt transmission line approximately 9 miles in... encourages electronic filing. Please file comments, motions to intervene, notices of intent, and... [email protected], (866) 208-3676 (toll free), or (202) 502-8659 (TTY). In lieu of electronic filing...

  16. Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi

    2018-04-01

    The log file-based method cannot by itself display dosimetric changes caused by linac component miscalibration, because log files are insensitive to such miscalibration. The purpose of this study was to quantify dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases were included in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by introducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, regarding the planning target volume (PTV), the change from the TPS dose to the miscalibration-simulated log file dose in D mean was 0.9 Gy, and that for tumor control probability was 1.4%. As for organs-at-risk (OARs), the change in D mean was <0.7 Gy and in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by D mean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For PTV and OARs, log file-based estimates of patient dose for double-arc VMAT have accuracy comparable to those for single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. Fast, accurate photon beam accelerator modeling using BEAMnrc: A systematic investigation of efficiency enhancing methods and cross-section data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fragoso, Margarida; Kawrakow, Iwan; Faddegon, Bruce A.

    In this work, an investigation of efficiency enhancing methods and cross-section data in the BEAMnrc Monte Carlo (MC) code system is presented. Additionally, BEAMnrc was compared with VMC++, another special-purpose MC code system that has recently been enhanced for the simulation of the entire treatment head. BEAMnrc and VMC++ were used to simulate a 6 MV photon beam from a Siemens Primus linear accelerator (linac) and phase space (PHSP) files were generated at 100 cm source-to-surface distance for the 10x10 and 40x40 cm² field sizes. The BEAMnrc parameters/techniques under investigation were grouped by (i) photon and bremsstrahlung cross sections, (ii) approximate efficiency improving techniques (AEITs), (iii) variance reduction techniques (VRTs), and (iv) a VRT (bremsstrahlung photon splitting) in combination with an AEIT (charged particle range rejection). The BEAMnrc PHSP file obtained without the efficiency enhancing techniques under study or, when not possible, with their default values (e.g., EXACT algorithm for the boundary crossing algorithm) and with the default cross-section data (PEGS4 and Bethe-Heitler) was used as the "baseline" for accuracy verification of the PHSP files generated from the different groups described previously. Subsequently, a selection of the PHSP files was used as input for DOSXYZnrc-based water phantom dose calculations, which were verified against measurements. The performance of the different VRTs and AEITs available in BEAMnrc and of VMC++ was specified by the relative efficiency, i.e., by the efficiency of the MC simulation relative to that of the BEAMnrc baseline calculation. The highest relative efficiencies were ≈935 (≈111 min on a single 2.6 GHz processor) and ≈200 (≈45 min on a single processor) for the 10x10 field size with 50 million histories and the 40x40 cm² field size with 100 million histories, respectively, using the VRT directional bremsstrahlung splitting (DBS) with no electron splitting. When DBS was used with electron splitting and combined with augmented charged particle range rejection, a technique recently introduced in BEAMnrc, relative efficiencies were ≈420 (≈253 min on a single processor) and ≈175 (≈58 min on a single processor) for the 10x10 and 40x40 cm² field sizes, respectively. Calculations of the Siemens Primus treatment head with VMC++ produced relative efficiencies of ≈1400 (≈6 min on a single processor) and ≈60 (≈4 min on a single processor) for the 10x10 and 40x40 cm² field sizes, respectively. BEAMnrc PHSP calculations with DBS alone or DBS in combination with charged particle range rejection were more efficient than the other efficiency enhancing techniques used. Using VMC++, accurate simulations of the entire linac treatment head were performed within minutes on a single processor. Noteworthy differences (±1%-3%) in the mean energy, planar fluence, and angular and spectral distributions were observed with the NIST bremsstrahlung cross sections compared with those of Bethe-Heitler (the BEAMnrc default bremsstrahlung cross section). However, MC calculated dose distributions in water phantoms (using combinations of VRTs/AEITs and cross-section data) agreed within 2% of measurements. Furthermore, MC calculated dose distributions in a simulated water/air/water phantom, using NIST cross sections, were within 2% agreement with the BEAMnrc Bethe-Heitler default case.

  18. Computerized Integrated Inventory Control for an Air Force Base-Level Supply System.

    DTIC Science & Technology

    1980-06-01


  19. A program to compute three-dimensional subsonic unsteady aerodynamic characteristics using the doublet lattice method, L216 (DUBFLX). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Richard, M.; Harrison, B. A.

    1979-01-01

    The program input presented consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.

  20. NOSS altimeter algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Forsythe, R. G.; Mcmillan, J. D.

    1982-01-01

    A description of all algorithms required for altimeter processing is given. Each description includes title, description, inputs/outputs, general algebraic sequences, and data volume. All required input/output data files are described, and the computer resources required for the entire altimeter processing system are estimated. The majority of the data processing requirements for any radar altimeter of the Seasat-1 type are scoped. Additions and deletions could be made for the specific altimeter products required by other projects.

  1. OpenMP GNU and Intel Fortran programs for solving the time-dependent Gross-Pitaevskii equation

    NASA Astrophysics Data System (ADS)

    Young-S., Luis E.; Muruganandam, Paulsamy; Adhikari, Sadhan K.; Lončar, Vladimir; Vudragović, Dušan; Balaž, Antun

    2017-11-01

    We present an Open Multi-Processing (OpenMP) version of Fortran 90 programs for solving the Gross-Pitaevskii (GP) equation for a Bose-Einstein condensate in one, two, and three spatial dimensions, optimized for use with GNU and Intel compilers. We use the split-step Crank-Nicolson algorithm for imaginary- and real-time propagation, which enables efficient calculation of stationary and non-stationary solutions, respectively. The present OpenMP programs are designed for computers with multi-core processors and optimized for compiling with both the commercially-licensed Intel Fortran and the popular free open-source GNU Fortran compilers. The programs are easy to use and are elaborated with helpful comments for the users. All input parameters are listed at the beginning of each program. Different output files provide physical quantities such as energy, chemical potential, root-mean-square sizes, densities, etc. We also present speedup test results for the new versions of the programs. Program files doi:http://dx.doi.org/10.17632/y8zk3jgn84.2 Licensing provisions: Apache License 2.0 Programming language: OpenMP GNU and Intel Fortran 90. Computer: Any multi-core personal computer or workstation with the appropriate OpenMP-capable Fortran compiler installed. Number of processors used: All available CPU cores on the executing computer. Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 1888; ibid. 204 (2016) 209. Does the new version supersede the previous version?: Not completely. It does supersede previous Fortran programs from both references above, but not the OpenMP C programs from Comput. Phys. Commun. 204 (2016) 209. Nature of problem: The present Open Multi-Processing (OpenMP) Fortran programs, optimized for use with commercially-licensed Intel Fortran and free open-source GNU Fortran compilers, solve the time-dependent nonlinear partial differential (GP) equation for a trapped Bose-Einstein condensate in one (1d), two (2d), and three (3d) spatial dimensions for six different trap symmetries: axially and radially symmetric traps in 3d, circularly symmetric traps in 2d, fully isotropic (spherically symmetric) and fully anisotropic traps in 2d and 3d, as well as 1d traps, where no spatial symmetry is considered. Solution method: We employ the split-step Crank-Nicolson algorithm to discretize the time-dependent GP equation in space and time. The discretized equation is then solved by imaginary- or real-time propagation, employing adequately small space and time steps, to yield the solution of stationary and non-stationary problems, respectively. Reasons for the new version: Previously published Fortran programs [1,2] have now become popular tools [3] for solving the GP equation. These programs have been translated to the C programming language [4] and later extended to the more complex scenario of dipolar atoms [5]. Now virtually all computers have multi-core processors and some have motherboards with more than one physical central processing unit (CPU), which may increase the number of available CPU cores on a single computer to several tens. The C programs have been adapted to run very fast on such multi-core modern computers using general-purpose graphics processing units (GPGPU) with Nvidia CUDA and on computer clusters using the Message Passing Interface (MPI) [6]. Nevertheless, previously developed Fortran programs are also commonly used for scientific computation, and most of them use a single CPU core at a time on modern multi-core laptops, desktops, and workstations.
Unless the Fortran programs are made aware of and capable of making efficient use of the available CPU cores, the solution of even a realistic dynamical 1d problem, not to mention the more complicated 2d and 3d problems, could be time-consuming. Previously, we published auto-parallel Fortran programs [2] suitable for the Intel (but not GNU) compiler for solving the GP equation. Hence, the need for a fully OpenMP version of the Fortran programs to reduce the execution time cannot be overemphasized. To address this issue, we provide here such OpenMP Fortran programs, optimized for both Intel and GNU Fortran compilers and capable of using all available CPU cores, which can significantly reduce the execution time. Summary of revisions: Previous Fortran programs [1] for solving the time-dependent GP equation in 1d, 2d, and 3d with different trap symmetries have been parallelized using the OpenMP interface to reduce the execution time on multi-core processors. There are six different trap symmetries considered, resulting in six programs for imaginary-time propagation and six for real-time propagation, totaling 12 programs included in the BEC-GP-OMP-FOR software package. All input data (number of atoms, scattering length, harmonic oscillator trap length, trap anisotropy, etc.) are conveniently placed at the beginning of each program, as before [2]. The present programs introduce a new input parameter, Number_of_Threads, which defines the number of CPU cores of the processor to be used in the calculation. If one sets the value 0 for this parameter, all available CPU cores will be used. For the most efficient calculation it is advisable to leave one CPU core unused for the system's background jobs. For example, on a machine with 20 CPU cores, such as the one we used for testing, it is advisable to use up to 19 CPU cores. However, the total number of used CPU cores can be divided among more than one job. For instance, one can run three simulations simultaneously using 10, 4, and 5 CPU cores, respectively, totaling 19 used CPU cores on a 20-core computer. The Fortran source programs are located in the directory src, and can be compiled by the make command using the makefile in the root directory BEC-GP-OMP-FOR of the software package. Examples of the produced output files can be found in the directory output, although some large density files are omitted to save space. The programs calculate the values of the actually used dimensionless nonlinearities from the physical input parameters, where the input parameters correspond to the same nonlinearity values as in the previously published programs [1], so that the output files of the old and new programs can be directly compared. The output files are conveniently named such that their contents can be easily identified, following the naming convention introduced in Ref. [2]. For example, a file named <code>-out.txt, where <code> is the name of the individual program, is the general output file containing input data, time and space steps, nonlinearity, energy, and chemical potential; it was named fort.7 in the old Fortran version of the programs [1]. A file named <code>-den.txt is the output file with the condensate density, which had the names fort.3 and fort.4 in the old Fortran version [1] for the imaginary- and real-time propagation programs, respectively.
Other possible density outputs, such as the initial density, are commented out in the programs to have a simpler set of output files, but users can uncomment and re-enable them, if needed. In addition, there are output files for reduced (integrated) 1d and 2d densities for different programs. In the real-time programs there is also an output file reporting the dynamics of evolution of root-mean-square sizes after a perturbation is introduced. The supplied real-time programs solve the stationary GP equation, and then calculate the dynamics. As the imaginary-time programs are more accurate than the real-time programs for the solution of a stationary problem, one can first solve the stationary problem using the imaginary-time programs, adapt the real-time programs to read the pre-calculated wave function and then study the dynamics. In that case the parameter NSTP in the real-time programs should be set to zero and the space mesh and nonlinearity parameters should be identical in both programs. The reader is advised to consult our previous publication where a complete description of the output files is given [2]. A readme.txt file, included in the root directory, explains the procedure to compile and run the programs. We tested our programs on a workstation with two 10-core Intel Xeon E5-2650 v3 CPUs. The parameters used for testing are given in sample input files, provided in the corresponding directory together with the programs. In Table 1 we present wall-clock execution times for runs on 1, 6, and 19 CPU cores for programs compiled using Intel and GNU Fortran compilers. The corresponding columns "Intel speedup" and "GNU speedup" give the ratio of wall-clock execution times of runs on 1 and 19 CPU cores, and denote the actual measured speedup for 19 CPU cores. In all cases and for all numbers of CPU cores, although the GNU Fortran compiler gives excellent results, the Intel Fortran compiler turns out to be slightly faster. Note that during these tests we always ran only a single simulation on a workstation at a time, to avoid any possible interference issues. Therefore, the obtained wall-clock times are more reliable than the ones that could be measured with two or more jobs running simultaneously. We also studied the speedup of the programs as a function of the number of CPU cores used. The performance of the Intel and GNU Fortran compilers is illustrated in Fig. 1, where we plot the speedup and actual wall-clock times as functions of the number of CPU cores for 2d and 3d programs. We see that the speedup increases monotonically with the number of CPU cores in all cases and has large values (between 10 and 14 for 3d programs) for the maximal number of cores. This fully justifies the development of OpenMP programs, which enable much faster and more efficient solving of the GP equation. However, a slow saturation in the speedup with the further increase in the number of CPU cores is observed in all cases, as expected. The speedup tends to increase for programs in higher dimensions, as they become more complex and have to process more data. This is why the speedups of the supplied 2d and 3d programs are larger than those of 1d programs. Also, for a single program the speedup increases with the size of the spatial grid, i.e., with the number of spatial discretization points, since this increases the amount of calculations performed by the program. To demonstrate this, we tested the supplied real2d-th program and varied the number of spatial discretization points NX=NY from 20 to 1000. 
The measured speedup obtained when running this program on 19 CPU cores as a function of the number of discretization points is shown in Fig. 2. The speedup first increases rapidly with the number of discretization points and eventually saturates. Additional comments: Example inputs provided with the programs take less than 30 minutes to run on a workstation with two Intel Xeon E5-2650 v3 processors (2 QPI links, 10 CPU cores, 25 MB cache, 2.3 GHz).
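
    To make the record's split-step Crank-Nicolson method concrete, here is a minimal single-threaded 1d imaginary-time sketch of the scheme using NumPy and SciPy's banded solver. The harmonic trap, grid, time step, and nonlinearity g are illustrative assumptions, not values taken from the BEC-GP-OMP-FOR package, and no OpenMP parallelism is attempted.

```python
# Minimal 1d imaginary-time split-step Crank-Nicolson sketch for the GP
# equation (dimensionless units); parameters are assumed for illustration.
import numpy as np
from scipy.linalg import solve_banded

NX, DX, DT = 600, 0.025, 0.001          # grid points, space step, time step
g = 18.81                               # dimensionless nonlinearity (assumed)
x = (np.arange(NX) - NX // 2) * DX
V = 0.5 * x**2                          # harmonic trap V(x) = x^2/2

psi = np.exp(-x**2 / 2)                 # Gaussian initial guess
psi /= np.sqrt(np.sum(psi**2) * DX)     # unit norm

# Crank-Nicolson for the kinetic term -(1/2) d^2/dx^2 in imaginary time:
# (I + DT/2 K) psi_new = (I - DT/2 K) psi, with K tridiagonal from central
# differences; banded storage for scipy.linalg.solve_banded.
lam = DT / (2 * DX**2)
ab = np.zeros((3, NX))
ab[0, 1:] = -lam / 2                    # superdiagonal of I + DT/2 K
ab[1, :] = 1 + lam                      # main diagonal
ab[2, :-1] = -lam / 2                   # subdiagonal

for _ in range(5000):
    # half-step with trap + nonlinearity (exact pointwise exponential)
    psi *= np.exp(-0.5 * DT * (V + g * psi**2))
    # full Crank-Nicolson kinetic step (tridiagonal solve)
    rhs = (1 - lam) * psi
    rhs[1:] += (lam / 2) * psi[:-1]
    rhs[:-1] += (lam / 2) * psi[1:]
    psi = solve_banded((1, 1), ab, rhs)
    # second half-step, then renormalize (imaginary-time propagation)
    psi *= np.exp(-0.5 * DT * (V + g * psi**2))
    psi /= np.sqrt(np.sum(psi**2) * DX)

print("rms size:", np.sqrt(np.sum(x**2 * psi**2) * DX))
```

    The pointwise exponential handles the trap and nonlinear terms at each grid point, the tridiagonal solve advances the kinetic term, and the renormalization after every step is what drives the imaginary-time propagation toward the ground state.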

  2. Taking advantage of HTML5 browsers to realize the concepts of session state and workflow sharing in web-tool applications

    NASA Astrophysics Data System (ADS)

    Suftin, I.; Read, J. S.; Walker, J.

    2013-12-01

    Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was generated. A user may then view results produced during that session or go back and alter input parameters, creating new results and producing new, unique sessions which they can then again share. This technique not only provides independence for the user to manage their session as they like, but also allows much greater freedom for the application provider to scale out without having to worry about carrying over user information or maintaining it in a central location.
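
    A back-end-free sketch of the round-trip this record describes: every workflow step is recorded as plain data, serialized to text, and replayed to restore the state. The real DSASweb client does this in the browser with the Web Storage API and a downloadable session file; the step names and parameters below are invented for illustration.

```python
# Sketch of session export/import as plain-text serialized JSON; the
# workflow steps here are hypothetical, not DSASweb's actual step names.
import json

session = [
    {"step": "load_shorelines", "params": {"source": "upload"}},
    {"step": "cast_transects", "params": {"spacing_m": 50}},
    {"step": "calculate_rates", "params": {"method": "LRR"}},
]

# export: serialize every workflow step for the user to save or share
text = json.dumps(session, indent=2)

# import: restore the application state by replaying the recorded steps
for step in json.loads(text):
    print("replaying", step["step"], "with", step["params"])
```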

  3. 76 FR 6311 - Regulations Affecting Publication of the United States Government Manual

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-04

    ... as a single PDF file that includes bookmarks. Finally, he asked if any smart phone applications... an annual online edition of the Manual in both text-only files and PDF files. It is now possible to...

  4. SEDIMENT DATA - COMMENCEMENT BAY HYLEBOS WATERWAY - TACOMA, WA - PRE-REMEDIAL DESIGN PROGRAM

    EPA Science Inventory

    Event 1A/1B Data Files URL address: http://www.epa.gov/r10earth/datalib/superfund/hybos1ab.htm. Sediment Chemistry Data (Database Format): HYBOS1AB.EXE is a self-extracting file which expands to the single-value per record .DBF format database file HYBOS1AB.DBF. This file contai...

  5. 78 FR 78353 - Hydro Green Energy, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... tetrapods; and (10) a new single-circuit 230-kilovolt transmission line approximately 20 miles in length... electronic filing. Please file comments, motions to intervene, notices of intent, and competing applications....gov , (866) 208-3676 (toll free), or (202) 502- 8659 (TTY). In lieu of electronic filing, please send...

  6. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    PubMed Central

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems OneShape and WaveOne, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), a single file in continuous rotation; and WaveOne WO (group III), a single file in reciprocal motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm, and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p>0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to group II OS (0.12±0.07 and 0.54±0.24) and group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated mean canal transportation and centering ability comparable to the full-sequence ProTaper system in curved root canals. PMID:26155551
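
    The abstract reports transportation and centering values without giving the measurement formulas. The ones conventionally used with pre/post-instrumentation CBCT cross-sections (after Gambill et al.) are, as an assumption about the method here,

```latex
\text{transportation} = \bigl| (a_{1}-a_{2}) - (b_{1}-b_{2}) \bigr|,
\qquad
\text{centering ratio} = \frac{\min\,(a_{1}-a_{2},\ b_{1}-b_{2})}{\max\,(a_{1}-a_{2},\ b_{1}-b_{2})},
```

    where a1 and a2 are the pre- and post-instrumentation distances from the mesial canal wall to the mesial root surface, b1 and b2 the corresponding distal distances, and a ratio of 1 indicates a perfectly centered preparation.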

  7. BaCoCa--a heuristic software tool for the parallel assessment of sequence biases in hundreds of gene and taxon partitions.

    PubMed

    Kück, Patrick; Struck, Torsten H

    2014-01-01

    BaCoCa (BAse COmposition CAlculator) is a user-friendly software tool that combines multiple statistical approaches (like RCFV and C value calculations) to identify biases in aligned sequence data which potentially mislead phylogenetic reconstructions. As a result of its speed and flexibility, the program provides the possibility to analyze hundreds of pre-defined gene partitions and taxon subsets in one single process run. BaCoCa is command-line driven and can be easily integrated into automatic process pipelines of phylogenomic studies. Moreover, given the tab-delimited output style, the results can be easily used for further analyses in programs like Excel or statistical packages like R. A built-in option of BaCoCa is the generation of heat maps with hierarchical clustering of certain results using R. As input files BaCoCa can handle FASTA and relaxed PHYLIP, which are commonly used in phylogenomic pipelines. BaCoCa is implemented in Perl and works on Windows PCs, Macs and Linux operating systems. The executable source code as well as example test files and a detailed documentation of BaCoCa are freely available at http://software.zfmk.de. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Corps Helicopter Attack Planning System (CHAPS). Positional Handbook. Appendix A. Messages. Appendix B. Statespace Construction Sample Session

    DTIC Science & Technology

    1989-10-01

    REVIEW MENU PROGRAM (S) CHAPS PURPOSE AND OVERVIEW The Do Review menu allows the user to select which missions to perform detailed analysis on and...input files must be resident on the computer you are running SUPR on. Any interface or file transfer programs must be successfully executed prior to... COMPUTER PROGRAM WAS DEVELOPED BY SYSTEMS CONTROL TECHNOLOGY FOR THE DEPUTY CHIEF OF STAFF/OPERATIONS, HQ USAFE. THE USE OF THE COMPUTER PROGRAM IS

  9. Organic geochemistry data of Alaska

    USGS Publications Warehouse

    compiled by Threlkeld, Charles N.; Obuch, Raymond C.; Gunther, G.L.

    2000-01-01

    In order to archive the results of various petroleum geochemical analyses of the Alaska resource assessment, the USGS developed an Alaskan Organic Geochemical Data Base (AOGDB) in 1978 to house the data generated from USGS and subcontracted laboratories. Prior to the AOGDB, the accumulated data resided in a flat data file entitled 'PGS' that was maintained by Petroleum Information Corporation with technical input from the USGS. The information herein is a breakout of the master flat file format into a relational data base table format (akdata).

  10. Validation Results for LEWICE 2.0. [Supplement

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Rutkowski, Adam

    1999-01-01

    Two CD-ROMs contain experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes both from experiment and from LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.

  11. User's manual for SYNC: A FORTRAN program for merging and time-synchronizing data

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

    The FORTRAN 77 computer program SYNC for merging and time-synchronizing data is described. The program SYNC reads one or more input files which contain either synchronous data frames or time-tagged data points, which can be compressed. The program decompresses and time-synchronizes the data, correcting for any channel time skews. Interpolation and hold-last-value synchronization algorithms are available. The output from SYNC is a file of time-synchronized data frames at any requested sample rate.
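
    A sketch of the two synchronization algorithms the abstract names, applied to two hypothetical time-tagged channels with a per-channel clock skew. The sample values, skew, and output rate are invented for illustration; this is not the SYNC program itself.

```python
# Resample two time-tagged channels onto a common output clock using
# linear interpolation and hold-last-value; data and skew are fabricated.
import numpy as np

def hold_last_value(t_out, t_in, y_in):
    """Zero-order hold: emit the most recent sample at or before t_out."""
    idx = np.searchsorted(t_in, t_out, side="right") - 1
    return y_in[np.clip(idx, 0, len(y_in) - 1)]

t_a = np.array([0.00, 0.11, 0.19, 0.31])   # channel A time tags (s)
y_a = np.array([1.0, 1.2, 1.1, 1.4])
t_b = np.array([0.05, 0.15, 0.25])         # channel B, skewed clock
y_b = np.array([10.0, 10.5, 11.0])

skew_b = 0.01                    # hypothetical channel B time skew (s)
rate = 20.0                      # requested output sample rate (Hz)
t_out = np.arange(0.05, 0.30, 1.0 / rate)

frame = np.column_stack([
    t_out,
    np.interp(t_out, t_a, y_a),                # interpolation algorithm
    hold_last_value(t_out, t_b - skew_b, y_b), # hold-last-value algorithm
])
print(frame)                     # one time-synchronized frame per row
```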

  12. A Prototype Model for Automating Nursing Diagnosis, Nurse Care Planning and Patient Classification.

    DTIC Science & Technology

    1986-03-01

    Each diagnosis has an assessment level. Assessment levels are defining characteristics observed by the nurse or subjectively stated by the patient... characteristics of this order line. Select IV Order [Figure 4.1.1.1e] is the first screen of a series of three. Select IV Order has up to 10 selections...for patient orders. Input Files Used: IVC.Scr and Procfile.Prg * Output Files Used: None Calling Routine: IUB.Prg * Routine Called: None

  13. Conversion of the Forces Mobilization Model (FORCEMOB) from FORTRAN to C

    DTIC Science & Technology

    2015-08-01

    [Snippet garbled in extraction: the indexed passage is a process memory-usage table (private data, stack, mapped file, shareable, managed heap, and page table sizes in K).] ...the C version of FORCEMOB is ready for operational use. ...without a graphical user interface (GUI): once run, FORCEMOB reads user-created input files, performs mathematical operations upon them, and outputs text

  14. Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual

    NASA Technical Reports Server (NTRS)

    Emmen, R. D.; Tischler, M. B.

    1982-01-01

    The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.

  15. Comparison between rotary and manual instrumentation in primary teeth.

    PubMed

    Crespo, S; Cortes, O; Garcia, C; Perez, L

    2008-01-01

    The aim of this study was to compare efficiency, in both preparation time and root canal shape, when using Nickel Titanium (Ni-Ti) rotary and K-File hand instrumentation for root canal preparation of single-rooted primary teeth. Sixty single-rooted primary teeth were selected and divided into two equal groups: group (I), 30 teeth instrumented with manual K-files, and group (II), 30 teeth instrumented with Ni-Ti rotary files (ProFile 0.04). Instrumentation times were calculated and root canal impressions were taken with light-bodied silicone in order to evaluate the shape. The data were analyzed with the SPSS program using the t-test and the Chi-square test to compare means. The preparation time in group (I) K-files was significantly higher than in group (II) rotary files (ProFile 0.04), with p = .005. The ProFile system showed a significantly more favorable canal taper when compared to the K-files system (p = .002). The use of rotary files in primary teeth has several advantages over manual K-files in both preparation time and root canal shape: 1) a decreased working time, which helps maintain patient cooperation by diminishing the potential for tiredness; and 2) a more conical root canal shape, favoring a higher quality root canal filling and increasing clinical success.

  16. Development of a database for Louisiana highway bridge scour data : technical summary.

    DOT National Transportation Integrated Search

    1999-10-01

    The objectives of the project included: 1) developing a database with manipulation capabilities such as data retrieval, visualization, and update; and 2) inputting the existing scour data from DOTD files into the database.

  17. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.

  18. Developing and utilizing an Euler computational method for predicting the airframe/propulsion effects for an aft-mounted turboprop transport. Volume 1: Theory document

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Yu, N. Y.

    1991-01-01

    An Euler flow solver was developed for predicting the airframe/propulsion integration effects for an aft-mounted turboprop transport. This solver employs a highly efficient multigrid scheme, with a successive mesh-refinement procedure to accelerate the convergence of the solution. A new dissipation model was also implemented to render solutions that are grid insensitive. The propeller power effects are simulated by the actuator disk concept. An embedded flow solution method was developed for predicting the detailed flow characteristics in the local vicinity of an aft-mounted propfan engine in the presence of a flow field induced by a complete aircraft. Results from test case analysis are presented. A user's guide for execution of computer programs, including format of various input files, sample job decks, and sample input files, is provided in an accompanying volume.

  19. Parallel line analysis: multifunctional software for the biomedical sciences

    NASA Technical Reports Server (NTRS)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
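
    For readers unfamiliar with the method, the standard parallel-line assay relations (an assumption about the statistics meant here, since the abstract does not write them out) fit each preparation as a line with a common slope in log dose and read the potency from the horizontal offset between the lines:

```latex
y = \alpha_{i} + \beta \log_{10} z,
\qquad
M = \frac{\alpha_{T} - \alpha_{S}}{\beta},
\qquad
R = 10^{M},
```

    where z is the dose, α_S and α_T are the standard- and test-preparation intercepts, and the shared slope β is validated by the analysis-of-covariance parallelism test before the log-potency ratio M and potency ratio R are computed; confidence limits for R are typically obtained via Fieller's theorem.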

  20. Preprocessor and postprocessor computer programs for a radial-flow finite-element model

    USGS Publications Warehouse

    Pucci, A.A.; Pope, D.A.

    1987-01-01

    Preprocessing and postprocessing computer programs that enhance the utility of the U.S. Geological Survey radial-flow model have been developed. The preprocessor program: (1) generates a triangular finite element mesh from minimal data input, (2) produces graphical displays and tabulations of data for the mesh, and (3) prepares an input data file to use with the radial-flow model. The postprocessor program is a version of the radial-flow model, which was modified to (1) produce graphical output for simulation and field results, (2) generate a statistic for comparing the simulation results with observed data, and (3) allow hydrologic properties to vary in the simulated region. Examples of the use of the processor programs for a hypothetical aquifer test are presented. Instructions for the data files, format instructions, and a listing of the preprocessor and postprocessor source codes are given in the appendixes. (Author's abstract)

  1. VizieR Online Data Catalog: Habitable zone code (Valle+, 2014)

    NASA Astrophysics Data System (ADS)

    Valle, G.; Dell'Omodarme, M.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2014-06-01

    A C computation code that provides as output the distance dm (in AU) for which the duration of habitability is longest, the corresponding duration tm (in Gyr), the width W (in AU) of the zone for which the habitability lasts tm/2, and the inner (Ri) and outer (Ro) boundaries of the 4 Gyr continuously habitable zone. The code reads the input file HZ-input.dat, containing in each row the mass of the host star (range: 0.70-1.10M⊙), its metallicity (either Z (range: 0.005-0.004) or [Fe/H]), the helium-to-metal enrichment ratio (range: 1-3, standard value = 2), the equilibrium temperature for habitable zone outer boundary computation (range: 169-203K), and the planet Bond Albedo (range: 0.0-1.0, Earth = 0.3). The output is printed on-screen. Compilation: just use your favorite C compiler: gcc hz.c -lm -o HZ (2 data files).
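
    For concreteness, one row of HZ-input.dat following the column order listed above might look like the line below; the whitespace separation and the particular values are assumptions for illustration, not taken from the catalog files.

```text
1.00  0.0134  2.0  188  0.3   (mass, Z, helium-to-metal ratio, Teq in K, Bond albedo)
```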

  2. A procedure for automating CFD simulations of an inlet-bleed problem

    NASA Technical Reports Server (NTRS)

    Chyu, Wei J.; Rimlinger, Mark J.; Shih, Tom I.-P.

    1995-01-01

    A procedure was developed to improve the turn-around time for computational fluid dynamics (CFD) simulations of an inlet-bleed problem involving oblique shock-wave/boundary-layer interactions on a flat plate with bleed into a plenum through one or more circular holes. This procedure is embodied in a preprocessor called AUTOMAT. With AUTOMAT, once data for the geometry and flow conditions have been specified (either interactively or via a namelist), it will automatically generate all input files needed to perform a three-dimensional Navier-Stokes simulation of the prescribed inlet-bleed problem by using the PEGASUS and OVERFLOW codes. The input files automatically generated by AUTOMAT include those for the grid system and those for the initial and boundary conditions. The grid systems automatically generated by AUTOMAT are multi-block structured grids of the overlapping type. Results obtained by using AUTOMAT are presented to illustrate its capability.

  3. Ceramic material life prediction: A program to translate ANSYS results to CARES/LIFE reliability analysis

    NASA Technical Reports Server (NTRS)

    Vonhermann, Pieter; Pintz, Adam

    1994-01-01

    This manual describes the use of the ANSCARES program to prepare a neutral file of FEM stress results taken from ANSYS Release 5.0, in the format needed by the CARES/LIFE ceramics reliability program. It is intended for use by experienced users of ANSYS and CARES. Knowledge of compiling and linking FORTRAN programs is also required. Maximum use is made of existing routines (from other CARES interface programs and ANSYS routines) to extract the finite element results and prepare the neutral file for input to the reliability analysis. FORTRAN and machine language routines as described are used to read the ANSYS results file. Sub-element stresses are computed and written to a neutral file using FORTRAN subroutines which are nearly identical to those used in the NASCARES (MSC/NASTRAN to CARES) interface.

  4. Ontorat: automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns.

    PubMed

    Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun

    2015-01-01

    It is time-consuming to build an ontology with many terms and axioms. Thus it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to solve a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP (or ODPs). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples are provided on the Ontorat website and can be reused to facilitate ontology development. With ever increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating large numbers of ontology terms by following design patterns. http://ontorat.hegroup.org/.
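
    A generic sketch of the template-driven generation idea described above: fill an axiom pattern with values from rows of an input table, one generated term per row. The pattern syntax, column names, and OWL-style output are invented for illustration and are not Ontorat's actual settings format.

```python
# Fill a hypothetical axiom pattern from tabular input rows; the pattern
# and relation names are illustrative, not Ontorat's real configuration.
axiom_pattern = (
    "Class: {id}\n"
    "  Annotations: rdfs:label \"{label}\"\n"
    "  SubClassOf: 'cell line cell' and (derives_from some {taxon})"
)

rows = [
    {"id": "CLO_0000101", "label": "example cell line cell A", "taxon": "NCBITaxon_10090"},
    {"id": "CLO_0000102", "label": "example cell line cell B", "taxon": "NCBITaxon_9606"},
]

for row in rows:
    print(axiom_pattern.format(**row), end="\n\n")
```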

  5. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language.

    PubMed

    de Jong, Wibe A; Walker, Andrew M; Hanwell, Marcus D

    2013-05-24

    Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple "Google-style" searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature.

  6. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language

    PubMed Central

    2013-01-01

    Background Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. Results The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. Conclusions The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple “Google-style” searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature. PMID:23705910

  7. File Formats Commonly Used in Mass Spectrometry Proteomics*

    PubMed Central

    Deutsch, Eric W.

    2012-01-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics. PMID:22956731

  8. Next generation lightweight mirror modeling software

    NASA Astrophysics Data System (ADS)

    Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-09-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, from single substrates to multiple arrays of substrates, and it can merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS, and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite models easier.

  9. Time-dependent Data System (TDDS); an interactive program to assemble, manage, and appraise input data and numerical output of flow/transport simulation models

    USGS Publications Warehouse

    Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.

    1996-01-01

    A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data--in particular, hydrologic data. Such data include various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models. Such data can also include time sequences of results generated by numerical simulation models. Specifically, TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, and graphical displays, as well as line-printer plots of single- or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple, yet efficient, data management and access. A single, easily used program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems. This interface, together with each major functional utility of the TDDS, is described and documented in this report.

  10. Flight dynamics analysis and simulation of heavy lift airships, volume 4. User's guide: Appendices

    NASA Technical Reports Server (NTRS)

    Emmen, R. D.; Tischler, M. B.

    1982-01-01

    This table contains all of the input variables to the three programs. The variables are arranged according to the namelist groups in which they appear in the data files. The program name, subroutine name, definition and, where appropriate, a default input value and any restrictions are listed with each variable. The default input values are user supplied, not generated by the computer. These values remove a specific effect from the calculations, as explained in the table. The phrase "not used" indicates that a variable is not used in the calculations and is listed for identification purposes only. The engineering symbol, where it exists, is listed to assist the user in correlating these inputs with the discussion in the Technical Manual.

  11. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. Mcp and msum provide significant performance improvements over standard cp and md5sum using multiple types of parallelism and other optimizations. The total speed-ups from all improvements are significant. Mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so are easily used and are available for download as open source software at http://mutil.sourceforge.net.
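
    A sketch of split-file, multi-threaded checksumming combined with a simple hash tree, in the spirit of the msum description above. The chunk size, MD5 leaf hashing, and pairwise combine are illustrative assumptions, not mutil's actual algorithm or output format.

```python
# Split a file into chunks, hash the chunks in parallel threads, then
# combine the leaf digests pairwise into a single root digest.
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

CHUNK = 4 * 1024 * 1024                         # 4 MiB leaves (assumed)

def hash_chunk(path, offset, size):
    """Hash one chunk; a separate handle per call keeps threads independent."""
    with open(path, "rb") as f:
        f.seek(offset)
        return hashlib.md5(f.read(size)).digest()

def tree_checksum(path, workers=8):
    total = os.path.getsize(path)
    offsets = list(range(0, total, CHUNK))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        leaves = list(pool.map(
            lambda off: hash_chunk(path, off, min(CHUNK, total - off)),
            offsets))
    if not leaves:                              # empty file
        return hashlib.md5(b"").hexdigest()
    while len(leaves) > 1:                      # Merkle-style reduction
        leaves = [hashlib.md5(b"".join(leaves[i:i + 2])).digest()
                  for i in range(0, len(leaves), 2)]
    return leaves[0].hex()

if __name__ == "__main__":
    print(tree_checksum("/tmp/bigfile"))        # hypothetical test file
```

    Because each leaf digest depends only on its own chunk, the inherently serial running checksum becomes a parallel map followed by a small reduction, which is the point the abstract makes about hash trees.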

  12. Crop phenology and LANDSAT-based irrigated lands inventory in the high plains. [United States of America

    NASA Technical Reports Server (NTRS)

    Martinko, E. A. (Principal Investigator); Poracsky, J.; Kipp, E. R.; Krieger, H.

    1980-01-01

    The activity concentrated on identifying crop and irrigation data sources for the eight states within the High Plains Aquifer and making contacts concerning the nature of these data. A mail questionnaire was developed to gather specific data not routinely reported through standard data collection channels. Input/output routines were designed for High Plains crop and irrigation data and initial statistical data on crops were input to computer files.

  13. Purge-Alert | High-Performance Computing | NREL

    Science.gov Websites

    Create and maintain this file if you want notifications for files that will be removed within 7 days. Simply add a single email address to this file: /scratch/$USER/.notify-email. The email you receive will provide instructions as to where the files are located. You will only receive emails if there are files scheduled to be removed.

  14. Benefits to the Simulation Training Community of a New ANSI Standard for the Exchange of Aero Simulation Models

    NASA Technical Reports Server (NTRS)

    Hildreth, Bruce L.; Jackson, E. Bruce

    2009-01-01

    The American Institute of Aeronautics and Astronautics (AIAA) Modeling and Simulation Technical Committee is in final preparation of a new standard for the exchange of flight dynamics models. The standard will become an ANSI standard and is under consideration for submission to ISO for acceptance by the international community. The standard has some aspects that should provide benefits to the simulation training community. Use of the new standard by the training simulation community will reduce development, maintenance, and technical refresh investment on each device. Furthermore, it will significantly lower the cost of performing model updates to improve fidelity or expand the envelope of the training device. Higher flight fidelity should result in better transfer of training, a direct benefit to the pilots under instruction. Costs of adopting the standard are minimal and should be paid back within the cost of the first use for that training device. The standard achieves these advantages by making it easier to update the aerodynamic model. It provides a standard format for the model in a custom eXtensible Markup Language (XML) grammar, the Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML). It employs an existing XML grammar, MathML, to describe the aerodynamic model in an input data file, eliminating the requirement for actual software compilation. The major components of the aero model become simply an input data file, and updates are simply new XML input files. It includes naming and axis system conventions to further simplify the exchange of information.

  15. JAva GUi for Applied Research (JAGUAR) v 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It will include problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain text representations of problem set-up and option specification for DAKOTA studies. After the data has been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.

  16. Washington Play Fairway Analysis Geothermal GIS Data

    DOE Data Explorer

    Corina Forson

    2015-12-15

    This file contains file geodatabases of the Mount St. Helens seismic zone (MSHSZ), Wind River valley (WRV) and Mount Baker (MB) geothermal play-fairway sites in the Washington Cascades. The geodatabases include input data (feature classes) and output rasters (generated from modeling and interpolation) from the geothermal play-fairway in Washington State, USA. These data were gathered and modeled to provide an estimate of the heat and permeability potential within the play-fairways based on: mapped volcanic vents, hot springs and fumaroles, geothermometry, intrusive rocks, temperature-gradient wells, slip tendency, dilation tendency, displacement, displacement gradient, max coulomb shear stress, sigma 3, maximum shear strain rate, and dilational strain rate at 200m and 3 km depth. In addition this file contains layer files for each of the output rasters. For details on the areas of interest please see the 'WA_State_Play_Fairway_Phase_1_Technical_Report' in the download package. This submission also includes a file with the geothermal favorability of the Washington Cascade Range based off of an earlier statewide assessment. Additionally, within this file there are the maximum shear and dilational strain rate rasters for all of Washington State.

  17. HDF-EOS 5 Validator

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program partly automates the task of determining whether an HDF-EOS 5 file is valid in that it conforms to specifications for such characteristics as attribute names, dimensionality of data products, and ranges of legal data values. ["HDF-EOS" and variants thereof are defined in "Converting EOS Data From HDF-EOS to netCDF" (GSC-15007-1), which is the first of several preceding articles in this issue of NASA Tech Briefs.] Previously, validity of a file was determined in a tedious and error-prone process in which a person examined human-readable dumps of data-file-format information. The present software helps a user to encode the specifications for an HDFEOS 5 file, and then inspects the file for conformity with the specifications: First, the user writes the specifications in Extensible Markup Language (XML) by use of a document type definition (DTD) that is part of the program. Next, the portion of the program (denoted the validator) that performs the inspection is executed, using, as inputs, the specifications in XML and the HDF-EOS 5 file to be validated. Finally, the user examines the output of the validator.

  18. Indel Group in Genomes (IGG) Molecular Genetic Markers

    PubMed Central

    Burkart-Waco, Diana; Kuppu, Sundaram; Britt, Anne; Chetelat, Roger

    2016-01-01

    Genetic markers are essential when developing or working with genetically variable populations. Indel Group in Genomes (IGG) markers are primer pairs that amplify single-locus sequences that differ in size for two or more alleles. They are attractive for their ease of use for rapid genotyping and their codominant nature. Here, we describe a heuristic algorithm that uses a k-mer-based approach to search two or more genome sequences to locate polymorphic regions suitable for designing candidate IGG marker primers. As input to the IGG pipeline software, the user provides genome sequences and the desired amplicon sizes and size differences. Primer sequences flanking polymorphic insertions/deletions are produced as output. IGG marker files for three sets of genomes, Solanum lycopersicum/Solanum pennellii, Arabidopsis (Arabidopsis thaliana) Columbia-0/Landsberg erecta-0 accessions, and S. lycopersicum/S. pennellii/Solanum tuberosum (three-way polymorphic) are included. PMID:27436831
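
    A deliberately simplified sketch of the k-mer anchoring idea behind IGG marker discovery: k-mers that occur exactly once in each genome serve as candidate primer anchors, and a spacing difference between the genomes marks an indel that would yield a size-polymorphic amplicon. The parameter values and test sequences are illustrative assumptions; the real pipeline adds primer design constraints (melting temperature, GC content, collinearity checks) that are omitted here.

```python
# Locate candidate size-polymorphic amplicons between two sequences by
# comparing the spacing of shared, unique k-mer anchors.
import random

def unique_kmers(seq, k):
    """Map each k-mer occurring exactly once in seq to its position."""
    seen, dup = {}, set()
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in seen:
            dup.add(kmer)
        seen.setdefault(kmer, i)
    return {km: p for km, p in seen.items() if km not in dup}

def candidate_igg_sites(a, b, k=11, min_amp=20, max_amp=200, min_diff=10):
    ka, kb = unique_kmers(a, k), unique_kmers(b, k)
    shared = sorted(set(ka) & set(kb), key=ka.get)
    hits = []
    for left, right in zip(shared, shared[1:]):
        len_a = ka[right] - ka[left] + k        # amplicon size in genome a
        len_b = kb[right] - kb[left] + k        # amplicon size in genome b
        if (min_amp <= len_a <= max_amp and min_amp <= len_b <= max_amp
                and abs(len_a - len_b) >= min_diff):
            hits.append((ka[left], ka[right], len_a, len_b))
    return hits

random.seed(1)
a = "".join(random.choice("ACGT") for _ in range(300))
b = a[:150] + "T" * 15 + a[150:]                # 15-bp insertion in genome b
print(candidate_igg_sites(a, b))                # anchor positions and sizes
```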

  19. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    PubMed Central

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background: Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results: We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion: We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298

  20. The SRS-Viewer: A Software Tool for Displaying and Evaluation of Pyroshock Data

    NASA Astrophysics Data System (ADS)

    Eberl, Stefan

    2014-06-01

    For the evaluation of the success of a pyroshock, the time domain and the corresponding Shock-Response-Spectra (SRS) have to be considered. The SRS-Viewer is an IABG-developed software tool [1] that reads data in Universal File format (*.unv) and either displays or plots, for each accelerometer, the time domain, the corresponding SRS, and the specified Reference-SRS with tolerances in the background. The software calculates the "Average (AVG)", "Maximum (MAX)" and "Minimum (MIN)" SRS of any selection of accelerometers. A statistical analysis calculates the percentage of measured SRS above the specified Reference-SRS level and the percentage within the tolerance bands, for comparison with the specified success criteria. Overlay plots of single accelerometers from different test runs make it possible to monitor the repeatability of the shock input and the integrity of the specimen. Furthermore, the difference between the shock on a mass dummy and the real test unit can be examined.
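
    The AVG/MAX/MIN spectra and the tolerance-band percentages reduce to simple array operations; a NumPy sketch (the array shapes and the +/-3 dB band are assumptions) is:

        import numpy as np

        # One Shock Response Spectrum per accelerometer: shape (n_accel, n_freq).
        srs = np.array([[90., 120., 150.], [110., 140., 130.], [100., 135., 160.]])
        ref = np.array([100., 130., 140.])       # specified Reference-SRS
        lo, hi = ref / 1.41, ref * 1.41          # assumed +/-3 dB tolerance band

        avg, mx, mn = srs.mean(axis=0), srs.max(axis=0), srs.min(axis=0)
        pct_above = 100.0 * (srs >= ref).mean()                  # above reference
        pct_within = 100.0 * ((srs >= lo) & (srs <= hi)).mean()  # inside the band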

  1. VizieR Online Data Catalog: The Red MSX Source Survey: massive protostars (Lumsden+, 2013)

    NASA Astrophysics Data System (ADS)

    Lumsden, S. L.; Hoare, M. G.; Urquhart, J. S.; Oudmaijer, R. D.; Davies, B.; Mottram, J. C.; Cooper, H. D. B.; Moore, T. J. T.

    2013-10-01

    The Midcourse Space Experiment (MSX) satellite mission included an astronomy experiment (SPIRIT III) designed to acquire mid-infrared photometry of sources in the Galactic plane (b<5°). MSX had a raw resolution of 18.3", a beam size 50 times smaller than that of IRAS at 12 and 25um. MSX observed six bands between 4 and 21um, of which the four between 8 and 21um are sensitive to astronomical sources. We used v2.3 of the MSX PSC (Egan et al. 2003, Cat. V/114) as our basic input, restricting ourselves to the main Galactic plane catalog, which excludes sources seen in only a single observing pass and those seen in multiple passes but with low significance. We restricted our catalog to 10

  2. Pulse-echo ultrasonic imaging method for eliminating sample thickness variation effects

    NASA Technical Reports Server (NTRS)

    Roth, Don J. (Inventor)

    1995-01-01

    A pulse-echo, immersion method for ultrasonic evaluation of a material is discussed. It accounts for and eliminates nonlevelness in the equipment set-up and sample-thickness variation effects. The method employs a single transducer, automatic scanning, and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness-variation effects are accounted for by pre-scan adjustments of the time window to ensure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then mapped proportionally to a color or grey scale and displayed on a video screen.
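
    The per-scan-point velocity computation amounts to finding the lag that maximizes the cross-correlation of two gated back-surface echoes; a minimal NumPy sketch (the sampling rate, gating, and round-trip relation are assumptions) is:

        import numpy as np

        def velocity(echo1, echo2, fs, thickness):
            """Estimate sound velocity from two successive back-surface echoes."""
            xc = np.correlate(echo2, echo1, mode="full")
            lag = xc.argmax() - (len(echo1) - 1)  # delay between echoes, in samples
            dt = lag / fs                         # round-trip transit time (s)
            return 2.0 * thickness / dt           # two passes through the sample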

  3. FGGE/ERBM tape specification and shipping letter description

    NASA Technical Reports Server (NTRS)

    Han, D.; Lo, H.

    1983-01-01

    The Nimbus-7 FGGE/ERBM tape contains 27 ERB parameters which are extracted and reformatted from the Nimbus-7 ERB-MATRIX tape. There are four types of files on a FGGE/ERBM tape: a test file; a tape-header file, which describes the data set characteristics and the contents of the tape; a grid-descriptor file, which contains the information on the ERB scanning channel target numbers and their associated latitude limits and longitude intervals; and one or more data files. A single end-of-file (EOF) tape mark is written after each file, and two EOF marks are written after the last data file on the tape.

  4. Decision Support System for Evaluation of Gunnison River Flow Regimes With Respect To Resources of the Black Canyon of the Gunnison National Park

    USGS Publications Warehouse

    Auble, Gregor T.; Wondzell, Mark; Talbert, Colin

    2009-01-01

    This report describes and documents a decision support system for the Gunnison River in Black Canyon of the Gunnison National Park. It is a macro-embedded EXCEL program that calculates and displays indicators representing valued characteristics or processes in the Black Canyon based on daily flows of the Gunnison River. The program is designed to easily accept input from downloaded stream gage records or output from the RIVERWARE reservoir operations model being used for the upstream Aspinall Unit. The decision support system is structured to compare as many as eight alternative flow regimes, where each alternative is represented by a daily sequence of at least 20 calendar years of streamflow. Indicators include selected flow statistics, riparian plant community distribution, clearing of box elder by inundation and scour, several measures of sediment mobilization, trout fry habitat, and federal reserved water rights. Calculation of variables representing National Park Service federal reserved water rights requires additional secondary input files pertaining to forecast and actual basin inflows and storage levels in Blue Mesa reservoir. Example input files representing a range of situations including historical, reconstructed natural, and simulated alternative reservoir operations are provided with the software.

  5. The Design and Usage of the New Data Management Features in NASTRAN

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Brown, W. K.

    1984-01-01

    Two new data management features are installed in the April 1984 release of NASTRAN. These two features are the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card image format and can be easily maintained and expanded by the use of standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means for making changes to a Rigid Format or for generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.

  6. BOPACE 3-D addendum: The Boeing plastic analysis capabilities for 3-dimensional solids using isoparametric finite elements

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Straayer, J. W.

    1975-01-01

    Modifications and additions incorporated into the BOPACE 3-D program are described. Updates to the program input data formats, error messages, file usage, size limitations, and overlay schematic are included.

  7. Microtomography-based comparison of reciprocating single-file F2 ProTaper technique versus rotary full sequence.

    PubMed

    Paqué, Frank; Zehnder, Matthias; De-Deus, Gustavo

    2011-10-01

    A preparation technique with only 1 single instrument was proposed on the basis of the reciprocating movement of the F2 ProTaper instrument. The present study was designed to quantitatively assess canal preparation outcomes achieved by this technique. Twenty-five extracted human mandibular first molars with 2 separate mesial root canals were selected. Canals were randomly assigned to 1 of the 2 experimental groups: group 1, rotary conventional preparation by using ProTaper, and group 2, reciprocating instrumentation with 1 single ProTaper F2 instrument. Specimens were scanned initially and after root canal preparation with an isotropic resolution of 20 μm by using a micro-computed tomography system. The following parameters were assessed: changes in dentin volume, percentage of shaped canal walls, and degree of canal transportation. In addition, the time required to reach working length with the F2 instrument was recorded. Preoperatively, there were no differences regarding root canal curvature and volume between experimental groups. Overall, instrumentation led to enlarged canal shapes with no evidence of preparation errors. There were no statistical differences between the 2 preparation techniques in the anatomical parameters assessed (P > .01), except for a significantly higher canal transportation caused by the reciprocating file in the coronal canal third. On the other hand, preparation was faster by using the single-file technique (P < .01). Shaping outcomes with the single-file F2 ProTaper technique and conventional ProTaper full-sequence rotary approach were similar. However, the single-file F2 ProTaper technique was markedly faster in reaching working length. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  8. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data

    PubMed Central

    Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.

    2017-01-01

    Summary: Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. Availability and implementation: mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28402395

  9. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    PubMed

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  10. Evaluation of apical extrusion of debris and irrigant using two new reciprocating and one continuous rotation single file systems.

    PubMed

    Nayak, Gurudutt; Singh, Inderpreet; Shetty, Shashit; Dahiya, Surya

    2014-05-01

    Apical extrusion of debris and irrigants during cleaning and shaping of the root canal is one of the main causes of periapical inflammation and postoperative flare-ups. The purpose of this study was to quantitatively measure the amount of debris and irrigants extruded apically in single rooted canals using two reciprocating and one rotary single file nickel-titanium instrumentation systems. Sixty human mandibular premolars, randomly assigned to three groups (n = 20) were instrumented using two reciprocating (Reciproc and Wave One) and one rotary (One Shape) single-file nickel-titanium systems. Bidistilled water was used as irrigant with traditional needle irrigation delivery system. Eppendorf tubes were used as test apparatus for collection of debris and irrigant. The volume of extruded irrigant was collected and quantified via 0.1-mL increment measure supplied on the disposable plastic insulin syringe. The liquid inside the tubes was dried and the mean weight of debris was assessed using an electronic microbalance. The data were statistically analysed using Kruskal-Wallis nonparametric test and Mann Whitney U test with Bonferroni adjustment. P-values less than 0.05 were considered significant. The Reciproc file system produced significantly more debris compared with OneShape file system (P<0.05), but no statistically significant difference was obtained between the two reciprocating instruments (P>0.05). Extrusion of irrigant was statistically insignificant irrespective of the instrument or instrumentation technique used (P >0.05). Although all systems caused apical extrusion of debris and irrigant, continuous rotary instrumentation was associated with less extrusion as compared with the use of reciprocating file systems.

  11. MPT Prediction of Aircraft-Engine Fan Noise

    NASA Technical Reports Server (NTRS)

    Connell, Stuart D.

    2004-01-01

    A collection of computer programs has been developed that implements a procedure for predicting multiple-pure-tone (MPT) noise generated by fan blades of an aircraft engine (e.g., a turbofan engine). MPT noise arises when the fan is operating with a supersonic relative tip Mach number. Under this flow condition, there is a strong upstream-running shock. The strength and position of this shock are very sensitive to blade geometry variations. For a fan where all the blades are identical, the primary tone observed upstream of the fan will be the blade passing frequency. If there are small variations in geometry between blades, then tones below the blade passing frequency arise; these are the MPTs. Stagger-angle differences as small as 0.1° can give rise to significant MPT noise. It is also noted that MPT noise is more pronounced when the fan is operating in an unstarted mode. Computational results using a three-dimensional flow solver to compute the complete annulus flow with non-uniform fans indicate that MPT noise can be estimated in a relatively simple way. Hence, once the effect of a typical geometry variation of one blade in an otherwise uniform blade row is known, the effect of all the blades being different can be quickly computed via superposition. Two computer programs that were developed as part of this work are used in conjunction with a user's computational fluid dynamics (CFD) code to predict MPT spectra for a fan with a specified set of geometric variations: (1) The first program, ROTBLD, reads the user's CFD solution files for a single blade passage via an API (Application Program Interface). There are options to replicate and perturb the geometry with typical variations: stagger, camber, thickness, and pitch. The multi-passage CFD solution files are then written in the user's file format using the API. (2) The second program, SUPERPOSE, requires two input files: the first is the circumferential upstream pressure distribution extracted from the CFD solution on the multi-passage mesh; the second defines the geometry variations of each blade in a complete fan. Superposition is used to predict the spectra resulting from the geometric variations.
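
    The superposition step itself is a shift-and-sum over the blades followed by a Fourier transform; a toy NumPy sketch (the blade count, single-blade perturbation shape, and per-blade weights are stand-ins for CFD results) is:

        import numpy as np

        B = 22                                        # assumed blade count
        theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

        def dp(th):
            """Toy upstream pressure perturbation from one perturbed blade."""
            th = np.mod(th + np.pi, 2.0 * np.pi) - np.pi
            return np.exp(-(th / 0.05) ** 2)

        rng = np.random.default_rng(0)
        stagger_err = rng.normal(0.0, 0.1, B)         # stand-in geometry variations

        p = np.zeros_like(theta)
        for k in range(B):                            # one shifted copy per blade
            p += stagger_err[k] * dp(theta - 2.0 * np.pi * k / B)

        orders = np.abs(np.fft.rfft(p)) / theta.size
        # Content at engine orders 1..B-1 (below the blade passing frequency) = MPTs.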

  12. MSTor version 2013: A new version of the computer code for the multi-structural torsional anharmonicity, now with a coupled torsional potential

    NASA Astrophysics Data System (ADS)

    Zheng, Jingjing; Meana-Pañeda, Rubén; Truhlar, Donald G.

    2013-08-01

    We present an improved version of the MSTor program package, which calculates partition functions and thermodynamic functions of complex molecules involving multiple torsions; the method is based on either a coupled torsional potential or an uncoupled torsional potential. The program can also carry out calculations in the multiple-structure local harmonic approximation. The program package also includes seven utility codes that can be used as stand-alone programs to calculate reduced moment-of-inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes for torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files for the MSTor calculation and Voronoi calculation, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Restrictions: There is no limit on the number of torsions that can be included in either the Voronoi calculation or the full MS-T calculation. In practice, the range of problems that can be addressed with the present method consists of all multitorsional problems for which one can afford to calculate all the conformational structures and their frequencies. Unusual features: The method can be applied to transition states as well as stable molecules. The program package also includes the hull program for the calculation of Voronoi volumes, the symmetry program for determining the point group symmetry of a molecule, and seven utility codes that can be used as stand-alone programs to calculate reduced moment-of-inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes of the torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Additional comments: The program package includes a manual, installation script, and input and output files for a test suite. Running time: There are 26 test runs. The running time of the test runs on a single processor of the Itasca computer is less than 2 s. References: [1] MS-T(C) method: Quantum Thermochemistry: Multi-Structural Method with Torsional Anharmonicity Based on a Coupled Torsional Potential, J. Zheng and D.G. Truhlar, Journal of Chemical Theory and Computation 9 (2013) 1356-1367, DOI: http://dx.doi.org/10.1021/ct3010722. [2] MS-T(U) method: Practical Methods for Including Torsional Anharmonicity in Thermochemical Calculations of Complex Molecules: The Internal-Coordinate Multi-Structural Approximation, J. Zheng, T. Yu, E. Papajak, I. M. Alecu, S.L. Mielke, and D.G. Truhlar, Physical Chemistry Chemical Physics 13 (2011) 10885-10907.

  13. Deep PDF parsing to extract features for detecting embedded malware.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munson, Miles Arthur; Cross, Jesse S.

    2011-09-01

    The number of PDF files with embedded malicious code has risen significantly in the past few years. This is due to the portability of the file format, the ways Adobe Reader recovers from corrupt PDF files, the addition of many multimedia and scripting extensions to the file format, and many format properties the malware author may use to disguise the presence of malware. Current research focuses on executable, MS Office, and HTML formats. In this paper, several features and properties of PDF files are identified. Features are extracted using an instrumented open-source PDF viewer. The feature descriptions of benign and malicious PDFs can be used to construct a machine learning model for detecting possible malware in future PDF files. The detection rate of PDF malware by current antivirus software is very low. A PDF file is easy to edit and manipulate because it is a text format, providing a low barrier to malware authors. Analyzing PDF files for malware is nonetheless difficult because of (a) the complexity of the formatting language, (b) the parsing idiosyncrasies in Adobe Reader, and (c) undocumented correction techniques employed in Adobe Reader. In May 2011, Esparza demonstrated that PDF malware could be hidden from 42 of 43 antivirus packages by combining multiple obfuscation techniques [4]. One reason current antivirus software fails is the ease of varying byte sequences in PDF malware, thereby rendering conventional signature-based virus detection useless. The compression and encryption functions produce sequences of bytes that are each functions of multiple input bytes. As a result, padding the malware payload with some whitespace before compression/encryption can change many of the bytes in the final payload. In this study we analyzed a corpus of 2591 benign and 87 malicious PDF files. While this corpus is admittedly small, it allowed us to test a system for collecting indicators of embedded PDF malware. We will call these indicators features throughout the rest of this report. The features are extracted using an instrumented PDF viewer, and are the inputs to a prediction model that scores the likelihood of a PDF file containing malware. The prediction model is constructed from a sample of labeled data by a machine learning algorithm (specifically, decision tree ensemble learning). Preliminary experiments show that the model is able to detect half of the PDF malware in the corpus with zero false alarms. We conclude the report with suggestions for extending this work to detect a greater variety of PDF malware.
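
    The prediction stage is ordinary supervised learning on the extracted feature vectors; a sketch with scikit-learn (synthetic stand-in data, with a random forest standing in for the report's decision-tree ensemble) is:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.random((2678, 40))                      # stand-in feature vectors per PDF
        y = (rng.random(2678) < 87 / 2678).astype(int)  # ~87 malicious of 2678 files

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        scores = clf.predict_proba(X_te)[:, 1]          # likelihood of embedded malware
        flagged = scores > 0.9                          # high threshold -> few false alarms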

  14. PAnalyzer: a software tool for protein inference in shotgun proteomics.

    PubMed

    Prieto, Gorka; Aloria, Kerman; Osinalde, Nerea; Fullaondo, Asier; Arizmendi, Jesus M; Matthiesen, Rune

    2012-11-05

    Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates integration. PAnalyzer is an easy-to-use, multiplatform, free software tool.

  15. PAnalyzer: A software tool for protein inference in shotgun proteomics

    PubMed Central

    2012-01-01

    Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates integration. PAnalyzer is an easy-to-use, multiplatform, free software tool. PMID:23126499

  16. LONGLIB - A GRAPHICS LIBRARY

    NASA Technical Reports Server (NTRS)

    Long, D.

    1994-01-01

    This library is a set of subroutines designed for vector plotting to CRT's, plotters, dot matrix, and laser printers. LONGLIB subroutines are invoked by program calls similar to standard CALCOMP routines. In addition to the basic plotting routines, LONGLIB contains an extensive set of routines to allow viewport clipping, extended character sets, graphic input, shading, polar plots, and 3-D plotting with or without hidden line removal. LONGLIB capabilities include surface plots, contours, histograms, logarithm axes, world maps, and seismic plots. LONGLIB includes master subroutines, which are self-contained series of commonly used individual subroutines. When invoked, the master routine will initialize the plotting package, and will plot multiple curves, scatter plots, log plots, 3-D plots, etc. and then close the plot package, all with a single call. Supported devices include VT100 equipped with Selanar GR100 or GR100+ boards, VT125s, VT240s, VT220 equipped with Selanar SG220, Tektronix 4010/4014 or 4107/4109 and compatibles, and Graphon GO-235 terminals. Dot matrix printer output is available by using the provided raster scan conversion routines for DEC LA50, Printronix printers, and high or low resolution Trilog printers. Other output devices include QMS laser printers, Postscript compatible laser printers, and HPGL compatible plotters. The LONGLIB package includes the graphics library source code, an on-line help library, scan converter and meta file conversion programs, and command files for installing, creating, and testing the library. The latest version, 5.0, is significantly enhanced and has been made more portable. Also, the new version's meta file format has been changed and is incompatible with previous versions. A conversion utility is included to port the old meta files to the new format. Color terminal plotting has been incorporated. LONGLIB is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985, and last updated in 1988.

  17. Peak Dose Assessment for Proposed DOE-PPPO Authorized Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maldonado, Delis

    2012-06-01

    The Oak Ridge Institute for Science and Education (ORISE), a U.S. Department of Energy (DOE) prime contractor, was contracted by the DOE Portsmouth/Paducah Project Office (DOE-PPPO) to conduct a peak dose assessment in support of the Authorized Limits Request for Solid Waste Disposal at Landfill C-746-U at the Paducah Gaseous Diffusion Plant (DOE-PPPO 2011a). The peak doses were calculated based on the DOE-PPPO Proposed Single Radionuclides Soil Guidelines and the DOE-PPPO Proposed Authorized Limits (AL) Volumetric Concentrations available in DOE-PPPO 2011a. This work is provided as an appendix to the Dose Modeling Evaluations and Technical Support Document for the Authorized Limits Request for the C-746-U Landfill at the Paducah Gaseous Diffusion Plant, Paducah, Kentucky (ORISE 2012). The receptors evaluated in ORISE 2012 were selected by the DOE-PPPO for the additional peak dose evaluations. These receptors included a Landfill Worker, Trespasser, Resident Farmer (onsite), Resident Gardener, Recreational User, Outdoor Worker and an Offsite Resident Farmer. The RESRAD (Version 6.5) and RESRAD-OFFSITE (Version 2.5) computer codes were used for the peak dose assessments. Deterministic peak dose assessments were performed for all the receptors and a probabilistic dose assessment was performed only for the Offsite Resident Farmer at the request of the DOE-PPPO. In a deterministic analysis, a single input value results in a single output value. In other words, a deterministic analysis uses single parameter values for every variable in the code. By contrast, a probabilistic approach assigns parameter ranges to certain variables, and the code randomly selects the values for each variable from the parameter range each time it calculates the dose (NRC 2006). The receptor scenarios, computer codes and parameter input files were previously used in ORISE 2012. A few modifications were made to the parameter input files as appropriate for this effort. Some of these changes included increasing the time horizon beyond 1,050 years (yr), and using the radionuclide concentrations provided by the DOE-PPPO as inputs into the codes. The deterministic peak doses were evaluated within time horizons of 70 yr (for the Landfill Worker and Trespasser), 1,050 yr, 10,000 yr and 100,000 yr (for the Resident Farmer [onsite], Resident Gardener, Recreational User, Outdoor Worker and Offsite Resident Farmer) at the request of the DOE-PPPO. The time horizons of 10,000 yr and 100,000 yr were used at the request of the DOE-PPPO for informational purposes only. The probabilistic peak of the mean dose assessment was performed for the Offsite Resident Farmer using Technetium-99 (Tc-99) and a time horizon of 1,050 yr. The results of the deterministic analyses indicate that among all receptors and time horizons evaluated, the highest projected dose, 2,700 mrem/yr, occurred for the Resident Farmer (onsite) at 12,773 yr. The exposure pathways contributing to the peak dose are ingestion of plants, external gamma, and ingestion of milk, meat and soil. However, this receptor is considered an implausible receptor. The only receptors considered plausible are the Landfill Worker, Recreational User, Outdoor Worker and the Offsite Resident Farmer. The maximum projected dose among the plausible receptors is 220 mrem/yr for the Outdoor Worker and it occurs at 19,045 yr. The exposure pathways contributing to the dose for this receptor are external gamma and soil ingestion. The results of the probabilistic peak of the mean dose analysis for the Offsite Resident Farmer indicate that the average (arithmetic mean) of the peak of the mean doses for this receptor is 0.98 mrem/yr and it occurs at 1,050 yr. This dose corresponds to Tc-99 within the time horizon of 1,050 yr.
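
    The deterministic-versus-probabilistic distinction can be illustrated with a toy dose model (the model form, parameter names, and ranges below are invented for illustration and are not RESRAD inputs):

        import numpy as np

        def dose(kd, infiltration):
            """Toy model: dose falls with sorption (kd), rises with infiltration."""
            return 100.0 * infiltration / (1.0 + kd)

        # Deterministic: a single value per parameter yields a single output dose.
        print(dose(kd=5.0, infiltration=0.2))

        # Probabilistic: sample each parameter from its range; summarize the outputs.
        rng = np.random.default_rng(0)
        doses = dose(rng.uniform(1.0, 10.0, 10000), rng.uniform(0.1, 0.3, 10000))
        print(doses.mean(), np.percentile(doses, 95))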

  18. Effect of coronal flaring on apical extrusion of debris during root canal instrumentation using single-file systems.

    PubMed

    Topçuoğlu, H S; Üstün, Y; Akpek, F; Aktı, A; Topçuoğlu, G

    2016-09-01

    To evaluate the effect of coronal flaring on the amount of debris extruded apically during root canal preparation using the Reciproc, WaveOne (WO) and OneShape (OS) single-file systems. Ninety extracted single-rooted mandibular incisor teeth were randomly assigned to six groups (n = 15 for each group) for canal instrumentation. Endodontic access cavities were prepared in each tooth. In three of the six groups, coronal flaring was not performed; coronal flaring was performed with Gates-Glidden drills on all teeth in the remaining three groups. The canals were then instrumented with one or other of the following single-file instrument systems: Reciproc, WO and OS. Debris extruded apically during instrumentation was collected into pre-weighed Eppendorf tubes. The tubes were then stored in an incubator at 70 °C for 5 days. The weight of the dry extruded debris was established by subtracting the pre-instrumentation and post-instrumentation weight of the Eppendorf tubes for each group. Data were analysed using one-way analysis of variance (ANOVA) and Tukey's post hoc tests (P = 0.05). Reciproc and WO files without coronal flaring produced significantly more debris compared with the other groups (P < 0.05). There was no significant difference in apical extrusion of debris amongst the other groups (P > 0.05). All single-file systems caused apical extrusion of debris. Performing coronal flaring prior to canal preparation reduced the amount of apically extruded debris when using Reciproc or WO systems. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  19. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  20. The ISMARA client

    PubMed Central

    Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz

    2016-01-01

    ISMARA (ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860

  1. USEEIO Satellite Tables

    EPA Pesticide Factsheets

    These files contain the environmental data, as particular emissions or resources associated with BEA sectors, that are used in the USEEIO model. They are organized by emission or resource type, as described in the manuscript. The main files (without SI) show the final satellite tables in the 'Exchanges' sheet, which give emissions or resource use per USD for 2013. The other sheets in these files provide metadata for the creation of the tables, including general information, sources, etc. The 'export' sheet is used for saving the satellite table for csv export. The data dictionary describes the fields in this sheet. The supporting files provide all the detailed data transformation and organization for the development of the satellite tables. This dataset is associated with the following publication: Yang, Y., W. Ingwersen, T. Hawkins, and D. Meyer. USEEIO: a New and Transparent United States Environmentally Extended Input-Output Model. JOURNAL OF CLEANER PRODUCTION. Elsevier Science Ltd, New York, NY, USA,
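
    In an environmentally extended input-output model, the satellite table of per-USD coefficients is combined with the Leontief total-requirements matrix to turn final demand into environmental flows; a two-sector toy example (all numbers invented) is:

        import numpy as np

        A = np.array([[0.10, 0.20],        # direct requirements (USD per USD output)
                      [0.30, 0.10]])
        B = np.array([[0.20, 0.05],        # satellite table: kg emitted per USD output
                      [0.01, 0.30]])
        y = np.array([100.0, 50.0])        # final demand by sector (USD)

        L = np.linalg.inv(np.eye(2) - A)   # Leontief total requirements
        flows = B @ (L @ y)                # total environmental flows by flow type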

  2. Federal Logistics Information System (FLIS) Procedures Manual. Volume 8. Document Identifier Code Input/Output Formats (Fixed Length)

    DTIC Science & Technology

    1994-07-01

    OCR excerpt from the manual: it covers the required mix of segments or individual data elements to be extracted, the Data Record Number (DRN 0950) in segment R of an interrogation transaction (LTI), insertion of the Continuation Indicator Code (DRN 8555) in position 80 of certain input records, the assigned NSN, output DICs such as KFR (file data for procurement) and KFC (file data minus security-classified characteristics data), and the alphabetic index of DICs in Chapter 5 of DoD 4100.39-M, Volume 8.

  3. User's guide to resin infusion simulation program in the FORTRAN language

    NASA Technical Reports Server (NTRS)

    Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.

    1992-01-01

    RTMCL is a user-friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and how to enter and modify the input data set. Sample input and output files are included, along with an explanation of the results. Finally, a complete listing of the program is provided.

  4. Development of the TACOM (Tank Automotive Command) Thermal Imaging Model (TTIM). Volume 1. Technical Guide and User’s Manual.

    DTIC Science & Technology

    1984-12-01

    OCR excerpt of the manual's subroutine tables: BLOCK DATA holds default values for variables input by menus; LIBR, SNSR, and ATMOS interface with the frame I/O, sensor, and atmosphere routines; within the frame I/O interface, LIBR selects options for input or output to a data library and FRREAD reads a frame from file. The excerpt also cites a Journal of Applied Meteorology 20, pp. 242-249 (March 1981) article and L.J. Harding, Numerical Analysis and Applications Software Abstracts, Computing.

  5. JCL (Job Control Language) Procedures to Run the Hull Code on the Cyber 205 Computer Installed on CSIRONET.

    DTIC Science & Technology

    1986-11-01

    OCR excerpt of the JCL procedures: NOS control statements (USER, CHARGE, SETJOB, COMMENT) that get or attach the input data to go to VSOS (e.g., GET,INDATA=DATFILE/NA), run SAIL on NOS to generate the main program, and purge or replace earlier output files (e.g., PURGE,SAILOUT/NA).

  6. NASCAP user's manual, 1978

    NASA Technical Reports Server (NTRS)

    Cassidy, J. J., III

    1978-01-01

    NASCAP simulates the charging process for a complex object in either tenuous plasma (geosynchronous orbit) or ground test (electron gun source) environment. Program control words, the structure of user input files, and various user options available are described in this computer programmer's user manual.

  7. DATA FOR ENVIRONMENTAL MODELING: AN OVERVIEW

    EPA Science Inventory

    The objective of the project described here, entitled Data for Environmental Modeling (D4EM), is the development of a comprehensive set of software tools that allow an environmental model developer to automatically populate model input files with environmental data available from...

  8. Mobility Research for Future Vehicles: A Methodology to Create a Unified Trade-Off Environment for Advanced Aerospace Vehicle

    DTIC Science & Technology

    2018-01-31

    OCR excerpt of the report's table of contents, with entries for a language for SeBBAS, running the SeBBAS algorithm in MATLAB, input file error checking, and a 5-blade rotor system investigation.

  9. Full-text, Downloading, & Other Issues.

    ERIC Educational Resources Information Center

    Tenopir, Carol

    1983-01-01

    Issues having a possible impact on online search services in libraries are discussed including full text databases, front-end processors which translate user's input into the command language of an appropriate system, downloading to create personal files from commercial databases, and pricing. (EJS)

  10. Shaping Ability of Single-file Systems with Different Movements: A Micro-computed Tomographic Study.

    PubMed

    Santa-Rosa, Joedy; de Sousa-Neto, Manoel Damião; Versiani, Marco Aurelio; Nevares, Giselle; Xavier, Felipe; Romeiro, Kaline; Cassimiro, Marcely; Leoni, Graziela Bianchi; de Menezes, Rebeca Ferraz; Albuquerque, Diana

    2016-01-01

    This study aimed to perform a rigorous sample standardization and also evaluate the preparation of mesiobuccal (MB) root canals of maxillary molars with severe curvatures using two single-file engine-driven systems (WaveOne with reciprocating motion and OneShape with rotary movement), as assessed by micro-computed tomography (micro-CT). Ten MB roots with single canals were included, uniformly distributed into two groups (n=5). The samples were prepared with WaveOne or OneShape files. The shaping ability and amount of canal transportation were assessed by a comparison of the pre- and post-instrumentation micro-CT scans. The Kolmogorov-Smirnov and t-tests were used for statistical analysis. The level of significance was set at 0.05. Instrumentation of canals increased their surface area and volume. Canal transportation occurred in coronal, middle and apical thirds and no statistical difference was observed between the two systems (P>0.05). In the apical third, significant differences were found between groups in canal roundness (at the 3 mm level) and perimeter (at the 3 and 4 mm levels) (P<0.05). The WaveOne and OneShape single-file systems were able to shape curved root canals, producing minor changes in the canal curvature.

  11. Interactive digital signal processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.; Wenger, R. M.; Behannon, K. W.; Byrnes, J. B.

    1982-01-01

    The Interactive Digital Signal Processor (IDSP) is examined. It consists of a set of time-series-analysis operators, each of which operates on an input file to produce an output file. The operators can be executed in any order that makes sense and recursively, if desired. The operators are the various algorithms used in digital time series analysis work. User-written operators can be easily interfaced to the system. The system can be operated both interactively and in batch mode. In IDSP a file can consist of up to n (currently n=8) simultaneous time series. IDSP currently includes over thirty standard operators that range from Fourier-transform operations, design and application of digital filters, and eigenvalue analysis to operators that provide graphical output, allow batch operation, and edit and display information.
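
    The operator-on-files architecture composes naturally, since each operator's output file is a valid input file for the next; a minimal sketch (the file format and the particular operators are assumptions) is:

        import numpy as np

        def detrend(infile, outfile):
            """IDSP-style operator: file in, file out, composable in any order."""
            x = np.load(infile)                      # up to n simultaneous series
            np.save(outfile, x - x.mean(axis=-1, keepdims=True))

        def bandlimit(infile, outfile, keep=64):
            x = np.load(infile)
            X = np.fft.rfft(x, axis=-1)
            X[..., keep:] = 0.0                      # crude low-pass filter
            np.save(outfile, np.fft.irfft(X, n=x.shape[-1], axis=-1))

        np.save("raw.npy", np.random.default_rng(0).random((8, 1024)))
        detrend("raw.npy", "step1.npy")              # operators chain through files
        bandlimit("step1.npy", "step2.npy")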

  12. BaHaMAS A Bash Handler to Monitor and Administrate Simulations

    NASA Astrophysics Data System (ADS)

    Sciarra, Alessandro

    2018-03-01

    Numerical QCD is often extremely resource-demanding, and it is not rare to run hundreds of simulations at the same time. Each of these can last for days or even months, and it typically requires a job-script file as well as an input file with the physical parameters for the application to be run. Moreover, some monitoring operations (e.g. copying, moving, deleting or modifying files, resuming crashed jobs, etc.) are often required to guarantee that the final statistics are correctly accumulated. Handling simulations manually is probably the most error-prone approach, as well as deeply uncomfortable and inefficient. BaHaMAS was developed and successfully used in recent years as a tool to automatically monitor and administrate simulations.

  13. Three-Dimensional Simulation of Traveling-Wave Tube Cold-Test Characteristics Using MAFIA

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Wilson, Jeffrey D.

    1995-01-01

    The three-dimensional simulation code MAFIA was used to compute the cold-test parameters - frequency-phase dispersion, beam on-axis interaction impedance, and attenuation - for two types of traveling-wave tube (TWT) slow-wave circuits. The potential for this electromagnetic computer modeling code to reduce the time and cost of TWT development is demonstrated by the high degree of accuracy achieved in calculating these parameters. Generalized input files were developed for ferruled coupled-cavity and TunneLadder slow-wave circuits. These files make it easy to model circuits of arbitrary dimensions. The utility of these files was tested by applying each to a specific TWT slow-wave circuit and comparing the results with experimental data. Excellent agreement was obtained.

  14. Surprise! Infants consider possible bases of generalization for a single input example.

    PubMed

    Gerken, LouAnn; Dawson, Colin; Chatila, Razanne; Tenenbaum, Josh

    2015-01-01

    Infants have been shown to generalize from a small number of input examples. However, existing studies allow two possible means of generalization. One is via a process of noting similarities shared by several examples. Alternatively, generalization may reflect an implicit desire to explain the input. The latter view suggests that generalization might occur when even a single input example is surprising, given the learner's current model of the domain. To test the possibility that infants are able to generalize based on a single example, we familiarized 9-month-olds with a single three-syllable input example that contained either one surprising feature (syllable repetition, Experiment 1) or two features (repetition and a rare syllable, Experiment 2). In both experiments, infants generalized only to new strings that maintained all of the surprising features from familiarization. This research suggests that surprise can promote very rapid generalization. © 2014 John Wiley & Sons Ltd.

  15. Sub-Fickean Diffusion in a One-Dimensional Plasma Ring

    NASA Astrophysics Data System (ADS)

    Theisen, W. L.

    2013-12-01

    A one-dimensional dusty plasma ring is formed in a strongly-coupled complex plasma. The dust particles in the ring can be characterized as a one-dimensional system where the particles cannot pass each other. The particles perform random walks due to thermal motions. This single-file self-diffusion is characterized by the mean-squared displacement (msd) of the individual particles, which increases with time t. Diffusive processes that follow Fick's law predict that the msd increases as t; single-file diffusion, however, is sub-Fickean, meaning that the msd is predicted to increase as t^(1/2). Particle position data from the dusty plasma ring are analyzed to determine the scaling of the msd with time. Results are compared with predictions of single-file diffusion theory.
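
    In standard notation, with D the diffusion coefficient and F the single-file mobility, the two scalings being compared are:

        \langle \Delta x^2(t) \rangle = 2 D t \quad \text{(Fickian)}, \qquad
        \langle \Delta x^2(t) \rangle = 2 F \, t^{1/2} \quad \text{(single-file)}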

  16. Developing CORBA-Based Distributed Scientific Applications From Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

    An efficient methodology is presented for integrating legacy applications written in Fortran into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into Common Object Request Broker Architecture (CORBA) objects are discussed. Fortran codes are modified as little as possible as they are decomposed into modules and wrapped as objects. A new conversion tool takes the Fortran application as input and generates the C/C++ header file and Interface Definition Language (IDL) file. In addition, the performance of client-server computing is evaluated.

  17. User's manual for the Macintosh version of PASCO

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Davis, Randall C.

    1991-01-01

    A user's manual for Macintosh PASCO is presented. Macintosh PASCO is an Apple Macintosh version of PASCO, an existing computer code for structural analysis and optimization of longitudinally stiffened composite panels. PASCO combines a rigorous buckling analysis program with a nonlinear mathematical optimization routine to minimize panel mass. Macintosh PASCO accepts the same input as mainframe versions of PASCO. As output, Macintosh PASCO produces a text file and mode shape plots in the form of Apple Macintosh PICT files. Only the user interface for Macintosh is discussed here.

  18. Tuning HDF5 subfiling performance on parallel file systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byna, Suren; Chaarawi, Mohamad; Koziol, Quincey

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single-shared-file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which results in generating a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune parallel I/O performance of this feature on parallel file systems of the Cray XC40 system at NERSC (Cori), which include burst buffer storage and Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a 1.2X to 6X performance advantage with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations with using the subfiling feature.
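
    For contrast, the single-shared-file baseline that subfiling is measured against looks like this with h5py and mpi4py (this sketch shows the baseline pattern, not the subfiling API, and requires an HDF5/h5py build with parallel support):

        # Run under MPI, e.g.: mpiexec -n 8 python write_shared.py
        from mpi4py import MPI
        import h5py
        import numpy as np

        comm = MPI.COMM_WORLD
        with h5py.File("shared.h5", "w", driver="mpio", comm=comm) as f:
            dset = f.create_dataset("x", (comm.size, 1000), dtype="f8")
            dset[comm.rank, :] = np.random.rand(1000)  # every rank writes its own row
                                                       # of one file, contending on its locks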

  19. Validation of a virtual source model of medical linac for Monte Carlo dose calculation using multi-threaded Geant4

    NASA Astrophysics Data System (ADS)

    Aboulbanine, Zakaria; El Khayati, Naïma

    2018-04-01

    The use of phase space in medical linear accelerator Monte Carlo (MC) simulations significantly improves the execution time and leads to results comparable to those obtained from full calculations. The classical representation of phase space stores directly the information of millions of particles, producing bulky files. This paper presents a virtual source model (VSM) based on a reconstruction algorithm, taking as input a compressed file of roughly 800 kb derived from phase space data freely available in the International Atomic Energy Agency (IAEA) database. This VSM includes two main components: primary and scattered particle sources, with a specific reconstruction method developed for each. Energy spectra and other relevant variables were extracted from the IAEA phase space and stored in the input description data file for both sources. The VSM was validated for three photon beams: Elekta Precise 6 MV/10 MV and a Varian TrueBeam 6 MV. Extensive calculations in water and comparisons between dose distributions of the VSM and the IAEA phase space were performed to estimate the VSM precision. The Geant4 MC toolkit in multi-threaded mode (Geant4-[mt]) was used for fast dose calculations and optimized memory use. Four field configurations were chosen for dose-calculation validation to test field-size and symmetry effects: three square fields and an asymmetric rectangular field. Good agreement in terms of the gamma formalism, for 3%/3 mm and 2%/3 mm criteria, was obtained for each evaluated radiation field and photon beam within a computation time of 60 h on a single workstation for a 3 mm voxel matrix. Analyzing the VSM's precision in high-dose-gradient regions using the distance-to-agreement (DTA) concept also showed satisfactory results. In all investigated cases, the mean DTA was less than 1 mm in build-up and penumbra regions. Regarding calculation efficiency, the event-processing speed is six times faster using Geant4-[mt] than sequential Geant4 when running the same simulation code on both. The developed VSM for the widely used 6 MV/10 MV beams is a general concept that is easy to adapt to reconstruct comparable beam qualities for various linac configurations, facilitating its integration into MC treatment planning.
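
    The reconstruction idea, regenerating source particles by sampling stored spectra instead of replaying millions of phase-space records, can be sketched as follows (the binning and spectrum shape are stand-ins for the data extracted from the IAEA phase space):

        import numpy as np

        rng = np.random.default_rng(1)
        edges = np.linspace(0.0, 6.0, 61)      # assumed energy bin edges (MeV)
        weights = np.exp(-edges[:-1])          # toy spectrum stored per source
        p = weights / weights.sum()

        bins = rng.choice(p.size, size=100000, p=p)
        energies = rng.uniform(edges[bins], edges[bins + 1])  # sampled source energies
        # Positions and directions are sampled the same way from their histograms.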

  20. Java Library for Input and Output of Image Data and Metadata

    NASA Technical Reports Server (NTRS)

    Deen, Robert; Levoe, Steven

    2003-01-01

    A Java-language library supports input and output (I/O) of image data and metadata (label data) in the format of the Video Image Communication and Retrieval (VICAR) image-processing software and in several similar formats, including a subset of the Planetary Data System (PDS) image file format. The library does the following: It provides a low-level, direct-access layer, enabling an application subprogram to read and write specific image files, lines, or pixels, and manipulate metadata directly. Two coding/decoding subprograms ("codecs" for short) based on the Java Advanced Imaging (JAI) software provide access to VICAR and PDS images in a file-format-independent manner. The VICAR and PDS codecs enable any program that conforms to the specification of the JAI codec to use VICAR or PDS images automatically, without specific knowledge of the VICAR or PDS format. The library also includes Image I/O plug-in subprograms for VICAR and PDS formats. Application programs that conform to the Image I/O specification of Java version 1.4 can utilize any image format for which such a plug-in subprogram exists, without specific knowledge of the format itself. Like the aforementioned codecs, the VICAR and PDS Image I/O plug-in subprograms support reading and writing of metadata.

  1. Aviation Environmental Design Tool (AEDT) AEDT Standard Input File (ASIF) reference guide version 2a

    DOT National Transportation Integrated Search

    2014-01-01

    The Federal Aviation Administration, Office of Environment and Energy (FAA-AEE) has developed the Aviation : Environmental Design Tool (AEDT) version 2a software system with the support of the following development team: : FAA, National Aeronautics a...

  2. 78 FR 21668 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

    ... amounts of data. Some firms base their business models largely on incorporating non-display data into... cost of market data product production and distribution in isolation from the cost of all of the inputs...

  3. Circuit Recognition of VLSI Layouts

    DTIC Science & Technology

    1989-09-01

    ... from the input file contain information on each transistor. totaltransistors = 0; while ((strcmp(buffer, "n") == 0) || (strcmp(buffer, "p") == 0)) { ... } ... statistics and information on transistors, inverters, and passgates prior to entering level2 recognition. fprintf(fo, "no more transistors.\n" ...

  4. Spec2Harv: Converting Spectrum output to HARVEST input

    Treesearch

    Eric J. Gustafson; Luke V. Rasmussen; Larry A. Leefers

    2003-01-01

    Spec2Harv was developed to automate the conversion of harvest schedules generated by the Spectrum model into script files that can be used by the HARVEST simulation model to simulate the implementation of the Spectrum schedules in a spatially explicit way.

  5. Evaluation of Apical Extrusion of Debris and Irrigant Using Two New Reciprocating and One Continuous Rotation Single File Systems

    PubMed Central

    Nayak, Gurudutt; Singh, Inderpreet; Shetty, Shashit; Dahiya, Surya

    2014-01-01

    Objective: Apical extrusion of debris and irrigants during cleaning and shaping of the root canal is one of the main causes of periapical inflammation and postoperative flare-ups. The purpose of this study was to quantitatively measure the amount of debris and irrigant extruded apically in single-rooted canals using two reciprocating and one rotary single-file nickel-titanium instrumentation systems. Materials and Methods: Sixty human mandibular premolars, randomly assigned to three groups (n = 20), were instrumented using two reciprocating (Reciproc and Wave One) and one rotary (One Shape) single-file nickel-titanium systems. Bidistilled water was used as the irrigant with a traditional needle irrigation delivery system. Eppendorf tubes were used as the test apparatus for collection of debris and irrigant. The volume of extruded irrigant was collected and quantified via the 0.1-mL increment markings on a disposable plastic insulin syringe. The liquid inside the tubes was dried and the mean weight of debris was assessed using an electronic microbalance. The data were statistically analysed using the Kruskal-Wallis nonparametric test and the Mann-Whitney U test with Bonferroni adjustment. P-values less than 0.05 were considered significant. Results: The Reciproc file system produced significantly more debris compared with the One Shape file system (P<0.05), but no statistically significant difference was obtained between the two reciprocating instruments (P>0.05). Extrusion of irrigant was statistically insignificant irrespective of the instrument or instrumentation technique used (P>0.05). Conclusions: Although all systems caused apical extrusion of debris and irrigant, continuous rotary instrumentation was associated with less extrusion as compared with the use of reciprocating file systems. PMID:25628665

  6. 1995 Joseph E. Whitley, MD, Award. A World Wide Web gateway to the radiologic learning file.

    PubMed

    Channin, D S

    1995-12-01

    Computer networks in general, and the Internet specifically, are changing the way information is manipulated in the world at large and in radiology. The goal of this project was to develop a computer system in which images from the Radiologic Learning File, available previously only via a single-user laser disc, are made available over a generic, high-availability computer network to many potential users simultaneously. Using a networked workstation in our laboratory and freely available distributed hypertext software, we established a World Wide Web (WWW) information server for radiology. Images from the Radiologic Learning File are requested through the WWW client software, digitized from a single laser disc containing the entire teaching file and then transmitted over the network to the client. The text accompanying each image is incorporated into the transmitted document. The Radiologic Learning File is now on-line, and requests to view the cases result in the delivery of the text and images. Image digitization via a frame grabber takes 1/30th of a second. Conversion of the image to a standard computer graphic format takes 45-60 sec. Text and image transmission speed on a local area network varies between 200 and 400 kilobytes (KB) per second depending on the network load. We have made images from a laser disc of the Radiologic Learning File available through an Internet-based hypertext server. The images previously available through a single-user system located in a remote section of our department are now ubiquitously available throughout our department via the department's computer network. We have thus converted a single-user, limited functionality system into a multiuser, widely available resource.

  7. 77 FR 70202 - Self-Regulatory Organizations; ICE Clear Credit LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-23

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-68253; File No. SR-ICC-2012-20] Self-Regulatory... Single Name Contract November 16, 2012. Pursuant to Section 19(b)(1) of the Securities Exchange Act of... Clear Credit LLC (``ICC'') filed with the Securities and Exchange Commission (``Commission'') the...

  8. VizieR Online Data Catalog: The KepVIM catalog (Makarov+, 2016)

    NASA Astrophysics Data System (ADS)

    Makarov, V. V.; Goldin, A.

    2016-07-01

    The algorithm described in section 4 was applied to the entire collection of "long-cadence" files archived in the MAST for the principal Kepler mission. A single variability-induced motion (VIM) detection corresponds to a complete data set for a given target collected during one quarter. Therefore, a single target can generate up to 17 VIM detections in the catalog. (2 data files).

  9. COVART 6.1: FASTGEN Legacy Model User’s Manual

    DTIC Science & Technology

    2010-03-31

    ... Single Proximity Burst File Layout; Figure 23-2 OFRAGB Multiple Proximity Burst File Layout ... 1. A two-dimensional normal distribution of shotlines about an aim point (SHOT1); 2. Multiple shotlines over a two-dimensional grid (SHOT2); 3. A single shotline at ...

  10. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    PubMed

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be most challenging due to their complex anatomy. "Wave One" and "One Shape" are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of Wave One and One Shape files in primary root canals using cone beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave One showed less canal transportation as compared with One Shape, and the mean instrumentation time of Wave One was significantly less than that of One Shape. The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  11. Astronomical Image Processing with Hadoop

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Krughoff, S.; Gardner, J.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-07-01

    In the coming decade astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. With a requirement that these images be analyzed in real time to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. In the commercial world, new techniques that utilize cloud computing have been developed to handle massive data streams. In this paper we describe how cloud computing, and in particular the map-reduce paradigm, can be used in astronomical data processing. We will focus on our experience implementing a scalable image-processing pipeline for the SDSS database using Hadoop (http://hadoop.apache.org). This multi-terabyte imaging dataset approximates future surveys such as those which will be conducted with the LSST. Our pipeline performs image coaddition in which multiple partially overlapping images are registered, integrated and stitched into a single overarching image. We will first present our initial implementation, then describe several critical optimizations that have enabled us to achieve high performance, and finally describe how we are incorporating a large in-house existing image processing library into our Hadoop system. The optimizations involve prefiltering of the input to remove irrelevant images from consideration, grouping individual FITS files into larger, more efficient indexed files, and a hybrid system in which a relational database is used to determine the input images relevant to the task. The incorporation of an existing image processing library, written in C++, presented difficult challenges since Hadoop is programmed primarily in Java. We will describe how we achieved this integration and the sophisticated image processing routines that were made feasible as a result. We will end by briefly describing the longer term goals of our work, namely detection and classification of transient objects and automated object classification.
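    A minimal sketch of the map side of such a pipeline, assuming each input line pairs a sky-tile identifier with the path of one FITS file; this tile-keyed grouping is a hypothetical simplification, since the actual pipeline prefilters inputs and bundles FITS files into larger indexed containers:

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/** Minimal sketch of the map stage for image coaddition: emit (tileId, path)
 *  so that the reducer for each sky tile receives every image overlapping
 *  that region and can register and stack them. Input lines are assumed to
 *  look like "tileId /path/to/file.fits". */
public class CoaddMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] parts = value.toString().trim().split("\\s+");
        if (parts.length != 2) return; // skip malformed records
        context.write(new Text(parts[0]), new Text(parts[1]));
    }
}
```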

  12. Assessing the impact of non-tidal atmospheric loading on a Kalman filter-based terrestrial reference frame

    NASA Astrophysics Data System (ADS)

    Abbondanza, Claudio; Altamimi, Zuheir; Chin, Toshio; Collilieux, Xavier; Dach, Rolf; Gross, Richard; Heflin, Michael; König, Rolf; Lemoine, Frank; Macmillan, Dan; Parker, Jay; van Dam, Tonie; Wu, Xiaoping

    2014-05-01

    The International Terrestrial Reference Frame (ITRF) adopts a piece-wise linear model to parameterize regularized station positions and velocities. The space-geodetic (SG) solutions from VLBI, SLR, GPS and DORIS used as input in the ITRF combination process account for tidal loading deformations, but ignore the non-tidal part. As a result, the non-linear signal observed in the time series of SG-derived station positions in part reflects non-tidal loading displacements not introduced in the SG data reduction. In this analysis, we assess the impact of non-tidal atmospheric loading (NTAL) corrections on the TRF computation. Focusing on the a-posteriori approach, (i) the NTAL model derived from National Centers for Environmental Prediction (NCEP) surface pressure is removed from the SINEX files of the SG solutions used as inputs to the TRF determinations; (ii) adopting a Kalman filter-based approach, two distinct linear TRFs are estimated by combining the 4 SG solutions with (corrected TRF solution) and without the NTAL displacements (standard TRF solution). Linear fits (offset and atmospheric velocity) of the NTAL displacements removed during step (i) are estimated, accounting for the station position discontinuities introduced in the SG solutions and adopting different weighting strategies. The NTAL-derived (atmospheric) velocity fields are compared to those obtained from the TRF reductions during step (ii). The consistency between the atmospheric and the TRF-derived velocity fields is examined. We show how the presence of station position discontinuities in SG solutions degrades the agreement between the velocity fields and compare the effect of the different weighting structures adopted while estimating the linear fits to the NTAL displacements. Finally, we evaluate the effect of restoring the atmospheric velocities determined through the linear fits of the NTAL displacements to the single-technique linear reference frames obtained by stacking the standard SG SINEX files. Differences between the velocity fields obtained by restoring the NTAL displacements and the standard stacked linear reference frames are discussed.
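    A minimal sketch of the linear fits described in step (i): a weighted least-squares estimate of offset and velocity from a displacement series. The epochs, displacements, and weights below are hypothetical, and the handling of station position discontinuities is omitted.

```java
/** Minimal sketch: weighted linear fit (offset + velocity) to a displacement
 *  time series. Returns the pair minimizing sum w_i (y_i - a - b t_i)^2. */
public class LinearFit {
    static double[] fit(double[] t, double[] y, double[] w) {
        double sw = 0, swt = 0, swy = 0, swtt = 0, swty = 0;
        for (int i = 0; i < t.length; i++) {
            sw += w[i]; swt += w[i] * t[i]; swy += w[i] * y[i];
            swtt += w[i] * t[i] * t[i]; swty += w[i] * t[i] * y[i];
        }
        double det = sw * swtt - swt * swt;
        double a = (swtt * swy - swt * swty) / det; // offset
        double b = (sw * swty - swt * swy) / det;   // velocity
        return new double[] {a, b};
    }

    public static void main(String[] args) {
        double[] t = {0.0, 1.0, 2.0, 3.0, 4.0};  // epochs (years since first epoch)
        double[] y = {1.2, 2.9, 5.1, 7.2, 8.8};  // displacement (mm), hypothetical
        double[] w = {1.0, 1.0, 0.5, 1.0, 1.0};  // downweight one noisy epoch
        double[] ab = fit(t, y, w);
        System.out.printf("offset = %.2f mm, velocity = %.2f mm/yr%n", ab[0], ab[1]);
    }
}
```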

  13. BrainIACS: a system for web-based medical image processing

    NASA Astrophysics Data System (ADS)

    Kishore, Bhaskar; Bazin, Pierre-Louis; Pham, Dzung L.

    2009-02-01

    We describe BrainIACS, a web-based medical image processing system that enables algorithm developers to quickly create extensible user interfaces for their algorithms. Designed to address the challenges faced by algorithm developers in providing user-friendly graphical interfaces, BrainIACS is implemented entirely with freely available, open-source software. The system, which is based on a client-server architecture, utilizes an AJAX front-end written using the Google Web Toolkit (GWT) and Java Servlets running on Apache Tomcat as its back-end. To enable developers to quickly and simply create user interfaces for configuring their algorithms, the interfaces are described in XML and parsed by the system to create the corresponding user interface elements. Most commonly used elements, such as check boxes, drop-down lists, input boxes, radio buttons, tab panels, and group boxes, are supported. Some elements, such as the input box, support input validation. Changes to the user interface, such as addition and deletion of elements, are made by editing the XML file or by using the system's user interface creator. In addition to user interface generation, the system provides its own interfaces for data transfer, previewing of input and output files, and algorithm queuing. As the system is programmed in Java (compiled to JavaScript for the front-end code), it is platform independent, the only requirements being that a Servlet implementation is available and that the processing algorithms can execute on the server platform.
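    A minimal sketch of the XML-driven approach: parse an interface description and map each element to a widget type. The element and attribute names here are hypothetical, not the system's actual schema, and a real front-end would instantiate the matching GWT widgets rather than print them.

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

/** Minimal sketch: turn an XML interface description into widget specs. */
public class UiFromXml {
    public static void main(String[] args) throws Exception {
        String xml = "<interface>"
                   + "  <checkbox name='skullStrip' label='Strip skull'/>"
                   + "  <input name='iterations' label='Iterations' validate='integer'/>"
                   + "</interface>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList nodes = doc.getDocumentElement().getChildNodes();
        for (int i = 0; i < nodes.getLength(); i++) {
            if (!(nodes.item(i) instanceof Element)) continue; // skip whitespace
            Element e = (Element) nodes.item(i);
            System.out.printf("widget=%s name=%s label=%s%n",
                    e.getTagName(), e.getAttribute("name"), e.getAttribute("label"));
        }
    }
}
```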

  14. A Digital Control Algorithm for Magnetic Suspension Systems

    NASA Technical Reports Server (NTRS)

    Britton, Thomas C.

    1996-01-01

    An ongoing program exists to investigate and develop magnetic suspension technologies and modelling techniques at NASA Langley Research Center. Presently, there is a laboratory-scale large air-gap suspension system capable of five degree-of-freedom (DOF) control that is operational and a six DOF system that is under development. Those systems levitate a cylindrical element containing a permanent magnet core above a planar array of electromagnets, which are used for levitation and control purposes. In order to evaluate various control approaches with those systems, the Generic Real-Time State-Space Controller (GRTSSC) software package was developed. That control software package allows the user to implement multiple control methods and allows for varied input/output commands. The development of the control algorithm is presented. The desired functionality of the software is discussed, including the ability to inject noise on sensor inputs and/or actuator outputs. Various limitations, common issues, and trade-offs are discussed, including data format precision; the drawbacks of using either Direct Memory Access (DMA), interrupts, or program control techniques for data acquisition; and platform dependent concerns related to the portability of the software, such as memory addressing formats. Efforts to minimize overall controller loop rate and a comparison of achievable controller sample rates are discussed. The implementation of a modular code structure is presented. The format for the controller input data file and the noise information file is presented. Controller input vector information is available for post-processing by mathematical analysis software such as MATLAB.

  15. Glycan Reader is improved to recognize most sugar types and chemical modifications in the Protein Data Bank.

    PubMed

    Park, Sang-Jun; Lee, Jumin; Patel, Dhilon S; Ma, Hongjing; Lee, Hui Sun; Jo, Sunhwan; Im, Wonpil

    2017-10-01

    Glycans play a central role in many essential biological processes. Glycan Reader was originally developed to simplify the reading of Protein Data Bank (PDB) files containing glycans through the automatic detection and annotation of sugars and glycosidic linkages between sugar units and to proteins, all based on atomic coordinates and connectivity information. Carbohydrates can have various chemical modifications at different positions, making their chemical space much more diverse. Unfortunately, current PDB files do not provide exact annotations for most carbohydrate derivatives, and more than 50% of PDB glycan chains have at least one carbohydrate derivative that could not be correctly recognized by the original Glycan Reader. Glycan Reader has been improved and now identifies most sugar types and chemical modifications (including various glycolipids) in the PDB, and both PDB and PDBx/mmCIF formats are supported. CHARMM-GUI Glycan Reader is updated to generate the simulation system and input of various glycoconjugates with most sugar types and chemical modifications. It also offers a new functionality to edit glycan structures through addition/deletion/modification of glycosylation types, sugar types, chemical modifications, glycosidic linkages, and anomeric states. The simulation system and input files can be used for CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. The Glycan Fragment Database in GlycanStructure.Org is also updated to provide an intuitive glycan sequence search tool for complex glycan structures with various chemical modifications in the PDB. Available at http://www.charmm-gui.org/input/glycan and http://www.glycanstructure.org.

  16. LODVIEW: a computer program for the graphical evaluation of lod score results in exclusion mapping of human disease genes.

    PubMed

    Hildebrandt, F; Pohlmann, A; Omran, H

    1993-12-01

    For linkage analysis projects aimed at mapping hereditary disease genes in humans, hundreds of highly polymorphic microsatellite markers which can be typed by PCR (PCR markers) have become available. With this technical improvement, a technique allowing for transparency in the handling of rapidly generated lod score data is becoming important. We present a computer program, LODVIEW, for the graphical representation of lod score data. It is designed for the input of lod score data generated with the LINKAGE package or similar programs. LODVIEW consists of 24 preformatted files, one for each chromosome. Each file contains a table for the input of lod score data and a graph for the graphical representation of the data, which automatically displays any entry made in the respective input table. The program provides the user with published PCR marker information pre-entered into a table and graph at the correct positions corresponding to the genetic distances between markers. The graphical display of LODVIEW allows for the rapid evaluation of lod score results calculated from PCR markers on each chromosome. The following information can be obtained from the graphical display at one glance: (i) regions of exclusion (Z(theta) < -2) and of nonexclusion, (ii) markers with positive lod scores, (iii) the distribution of positive and negative lod scores among the families examined (an indication of genetic heterogeneity), (iv) multipoint lod scores, and (v) the availability of PCR markers in regions of interest. The program is continually updated with novel PCR marker information from the literature. The program will help to efficiently monitor and direct the progress of exclusion mapping projects.
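    A minimal sketch of the exclusion criterion the graphical display encodes, flagging markers with Z(theta) < -2 along a chromosome; the marker names, positions, and lod scores below are hypothetical.

```java
/** Minimal sketch: classify markers by lod score using the conventional
 *  thresholds (Z < -2 for exclusion, Z >= 3 for significant linkage). */
public class ExclusionScan {
    public static void main(String[] args) {
        String[] marker = {"D1S234", "D1S255", "D1S312", "D1S498"};
        double[] posCM  = {12.0, 25.5, 40.1, 63.8}; // genetic position (cM)
        double[] lod    = {-3.1, -2.4, 0.9, 3.2};
        for (int i = 0; i < marker.length; i++) {
            String status = lod[i] < -2.0 ? "EXCLUDED"
                          : lod[i] >= 3.0 ? "SIGNIFICANT LINKAGE"
                          : "uninformative";
            System.out.printf("%-8s %6.1f cM  Z=%5.1f  %s%n",
                    marker[i], posCM[i], lod[i], status);
        }
    }
}
```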

  17. Descriptions and Implementations of DL_F Notation: A Natural Chemical Expression System of Atom Types for Molecular Simulations.

    PubMed

    Yong, Chin W

    2016-08-22

    DL_F Notation is an easy-to-understand, standardized expression of atom types for molecular simulations for a range of organic force field (FF) schemes such as OPLSAA, PCFF, and CVFF. It is implemented within DL_FIELD, a software program that facilitates the setting up of molecular FF models for the DL_POLY molecular dynamics simulation software. By making use of the Notation, a single core conversion module (the DL_F conversion Engine) implemented within DL_FIELD can be used to analyze a molecular structure and determine the types of atoms for a given FF scheme. Users only need to provide the molecular input structure in a simple xyz format, and DL_FIELD can produce the necessary force field file for DL_POLY automatically. In keeping with the development concept of DL_FIELD, which places emphasis on robustness and user friendliness, the Engine provides a single-step solution to set up complex FF models. This allows users to switch seamlessly from one of the above-mentioned FF schemes to another while providing consistent atom typing expressed in a natural chemical sense.
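    A minimal sketch of reading the simple xyz input format mentioned above (an atom count, a comment line, then one "element x y z" record per atom); the file name is hypothetical.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

/** Minimal sketch: parse a plain xyz structure file. */
public class XyzReader {
    public static void main(String[] args) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader("ethanol.xyz"))) {
            int n = Integer.parseInt(in.readLine().trim()); // atom count
            String comment = in.readLine();                  // free-text title line
            System.out.println(comment + " (" + n + " atoms)");
            for (int i = 0; i < n; i++) {
                String[] f = in.readLine().trim().split("\\s+");
                // Element symbol plus Cartesian coordinates in angstroms.
                System.out.printf("%-2s %10.4f %10.4f %10.4f%n", f[0],
                        Double.parseDouble(f[1]), Double.parseDouble(f[2]),
                        Double.parseDouble(f[3]));
            }
        }
    }
}
```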

  18. ModelArchiver—A program for facilitating the creation of groundwater model archives

    USGS Publications Warehouse

    Winston, Richard B.

    2018-03-01

    ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step by step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions; descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in the creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which is also described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.

  19. Technology Benefit Estimator (T/BEST): User's Manual

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib

    1994-01-01

    The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical data base that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analyses output files. In order to make the communications between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required, as sketched below. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security. T/BEST's software framework, status, beginner-to-expert operation, interface architecture, analysis module addition, and key analysis modules are discussed. Representative examples of T/BEST benefit analyses are shown.
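    A minimal sketch of the PUT/GET idea, with a simple in-memory key-value store standing in for the neutral file; the parameter names are hypothetical, not T/BEST's actual variable names.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Minimal sketch: every attached analysis exchanges named parameters through
 *  one shared store, so only two commands, PUT and GET, are ever needed. */
public class NeutralFile {
    private final Map<String, Double> store = new LinkedHashMap<>();

    public void put(String name, double value) { store.put(name, value); }

    public double get(String name) {
        Double v = store.get(name);
        if (v == null) throw new IllegalArgumentException("not in neutral file: " + name);
        return v;
    }

    public static void main(String[] args) {
        NeutralFile nf = new NeutralFile();
        nf.put("numberOfBlades", 24);             // engine-cycle module writes...
        nf.put("hubDiameter_m", 0.45);
        double blades = nf.get("numberOfBlades"); // ...structural module reads.
        System.out.println("blades = " + blades);
    }
}
```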

  20. Technology Benefit Estimator (T/BEST): User's manual

    NASA Astrophysics Data System (ADS)

    Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib

    1994-12-01

    The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical data base that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analyses output files. In order to make the communications between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security.

  1. Wavelength meter having single mode fiber optics multiplexed inputs

    DOEpatents

    Hackel, R.P.; Paris, R.D.; Feldman, M.

    1993-02-23

    A wavelength meter having a single mode fiber optics input is disclosed. The single mode fiber enables a plurality of laser beams to be multiplexed to form a multiplexed input to the wavelength meter. The wavelength meter can provide a determination of the wavelength of any one or all of the plurality of laser beams by suitable processing. Another aspect of the present invention is that one of the laser beams could be a known reference laser having a predetermined wavelength. Hence, the improved wavelength meter can provide an on-line calibration capability with the reference laser input as one of the plurality of laser beams.

  2. Wavelength meter having single mode fiber optics multiplexed inputs

    DOEpatents

    Hackel, Richard P.; Paris, Robert D.; Feldman, Mark

    1993-01-01

    A wavelength meter having a single mode fiber optics input is disclosed. The single mode fiber enables a plurality of laser beams to be multiplexed to form a multiplexed input to the wavelength meter. The wavelength meter can provide a determination of the wavelength of any one or all of the plurality of laser beams by suitable processing. Another aspect of the present invention is that one of the laser beams could be a known reference laser having a predetermined wavelength. Hence, the improved wavelength meter can provide an on-line calibration capability with the reference laser input as one of the plurality of laser beams.

  3. Quantum design rules for single molecule logic gates.

    PubMed

    Renaud, N; Hliwa, M; Joachim, C

    2011-08-28

    Recent publications have demonstrated how to implement a NOR logic gate with a single molecule using its interaction with two surface atoms as logical inputs [W. Soe et al., ACS Nano, 2011, 5, 1436]. We demonstrate here how this NOR logic gate belongs to the general family of quantum logic gates, where the Boolean truth table results from full control of the quantum trajectory of the electron transfer process through the molecule by very local and classical inputs practiced on the molecule. A new molecular OR gate is proposed in which the logical inputs are also single metal atoms, one per logical input.

  4. CalSimHydro Tool - A Web-based interactive tool for the CalSim 3.0 Hydrology Preprocessor

    NASA Astrophysics Data System (ADS)

    Li, P.; Stough, T.; Vu, Q.; Granger, S. L.; Jones, D. J.; Ferreira, I.; Chen, Z.

    2011-12-01

    CalSimHydro, the CalSim 3.0 Hydrology Preprocessor, is an application designed to automate the various steps in the computation of hydrologic inputs for CalSim 3.0, a water resources planning model developed jointly by the California State Department of Water Resources and the United States Bureau of Reclamation, Mid-Pacific Region. CalSimHydro consists of a five-step FORTRAN-based program that runs the individual models in succession, passing information from one model to the next and aggregating data as required by each model. The final product of CalSimHydro is an updated CalSim 3.0 state variable (SV) DSS input file. CalSimHydro consists of (1) a Rainfall-Runoff Model to compute monthly infiltration, (2) a Soil moisture and demand calculator (IDC) that estimates surface runoff, deep percolation, and water demands for natural vegetation cover and various crops other than rice, (3) a Rice Water Use Model to compute the water demands, deep percolation, irrigation return flow, and runoff from precipitation for the rice fields, (4) a Refuge Water Use Model that simulates the ponding operations for managed wetlands, and (5) a Data Aggregation and Transfer Module to aggregate the outputs from the above modules and transfer them to the CalSim SV input file. In this presentation, we describe a web-based user interface for CalSimHydro using the Google Earth Plug-In. The CalSimHydro tool allows users to:
    - interact with geo-referenced layers of the Water Budget Areas (WBA) and Demand Units (DU) displayed over the Sacramento Valley,
    - view the input parameters of the hydrology preprocessor for a selected WBA or DU in a time series plot or a tabular form,
    - edit the values of the input parameters in the table or by downloading a spreadsheet of the selected parameter in a selected time range,
    - run the CalSimHydro modules on the back-end server and be notified when the job is done,
    - visualize the model output and compare it with a base-run result,
    - download the output SV file to be used to run CalSim 3.0.
    The CalSimHydro tool streamlines the complicated steps of configuring and running the hydrology preprocessor by providing a user-friendly visual interface and back-end services to validate user inputs and manage the model execution. It is a powerful addition to the new CalSim 3.0 system.

  5. Sandbox for Mac Malware v 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkup, Elizabeth

    This software is an analyzer for automated sandbox analysis of malware on the OS X operating system. It runs inside an OS X virtual machine to collect data about what happens when a given file is opened or run. As of August 2014, there was no sandbox software for Mac OS X malware, as it requires different methods from those used on the Windows OS (which most sandboxes are written for). This software adds OS X analysis capabilities to an existing open-source sandbox, Cuckoo Sandbox (http://cuckoosandbox.org/), which previously only worked for Windows. The analyzer itself can take many different types of files as input: the traditional Mach-O and FAT executables, .app files, zip files, Python scripts, Java archives, and web pages, as well as PDFs and other documents. While the file is running, the analyzer also simulates rudimentary human interaction with clicks and mouse movements in order to bypass the tests some malware use to see if they are being analyzed. The analyzer outputs several different kinds of data: function call traces, network captures, screenshots, and all created and modified files. This work also includes a static analysis Cuckoo module for Mach-O binary files. It extracts file structures, code library imports and exports, and signatures. This data can be used along with the analyzer results to create signatures for malware.

  6. Unbinding Transition of Probes in Single-File Systems

    NASA Astrophysics Data System (ADS)

    Bénichou, Olivier; Démery, Vincent; Poncet, Alexis

    2018-02-01

    Single-file transport, arising in quasi-one-dimensional geometries where particles cannot pass each other, is characterized by the anomalous dynamics of a probe, notably its response to an external force. In these systems, the motion of several probes subjected to different external forces, although relevant to mixtures of charged and neutral or active and passive objects, remains unexplored. Here, we determine how several probes respond to external forces. We rely on a hydrodynamic description of the symmetric exclusion process to obtain exact analytical results at long times. We show that the probes can either move as a whole, or separate into two groups moving away from each other. In between the two regimes, they separate with a different dynamical exponent, as t^(1/4). This unbinding transition also occurs in several continuous single-file systems and is expected to be observable.

  7. An approach to enhance pnetCDF performance in environmental modeling applications

    EPA Science Inventory

    Data intensive simulations are often limited by their I/O (input/output) performance, and "novel" techniques need to be developed in order to overcome this limitation. The software package pnetCDF (parallel network Common Data Form), which works with parallel file syste...

  8. Evaluation of the Enhanced Integrated Climatic Model for modulus-based construction specification for Oklahoma pavements.

    DOT National Transportation Integrated Search

    2013-07-01

    The study provides estimation of site specific variation in environmental factors that can be : used in predicting seasonal and long-term variations in moduli of unbound materials. Using : these site specific estimates, the EICM climatic input files ...

  9. Integrated Structural Analysis and Test Program

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2005-01-01

    An integrated structural-analysis and structure-testing computer program is being developed in order to: Automate repetitive processes in testing and analysis; Accelerate pre-test analysis; Accelerate reporting of tests; Facilitate planning of tests; Improve execution of tests; Create a vibration, acoustics, and shock test database; and Integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls. There is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.

  10. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients. Revised

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2002-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
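    A minimal sketch of evaluating thermodynamic functions from least-squares coefficients, using the classic NASA 7-coefficient parameterization (Cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4, with a6 and a7 the enthalpy and entropy integration constants); the exact functional form used by PAC/CAP may differ, and the coefficients below are illustrative only, not values from the NASA Glenn file.

```java
/** Minimal sketch: tabulate Cp/R, H/RT, and S/R from NASA-7 coefficients. */
public class NasaPoly {
    static double cpOverR(double[] a, double T) {
        return a[0] + T * (a[1] + T * (a[2] + T * (a[3] + T * a[4])));
    }
    static double hOverRT(double[] a, double T) {
        return a[0] + T * (a[1] / 2 + T * (a[2] / 3 + T * (a[3] / 4 + T * a[4] / 5)))
               + a[5] / T; // a6/T carries the enthalpy integration constant
    }
    static double sOverR(double[] a, double T) {
        return a[0] * Math.log(T)
               + T * (a[1] + T * (a[2] / 2 + T * (a[3] / 3 + T * a[4] / 4)))
               + a[6]; // a7 is the entropy integration constant
    }
    public static void main(String[] args) {
        double[] a = {3.5, 1.0e-4, 2.0e-7, -1.0e-10, 1.5e-14, -1.04e3, 4.0};
        double T = 1000.0; // kelvin
        System.out.printf("Cp/R=%.4f  H/RT=%.4f  S/R=%.4f%n",
                cpOverR(a, T), hOverRT(a, T), sOverR(a, T));
    }
}
```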

  11. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2001-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.

  12. CloudMC: a cloud computing application for Monte Carlo simulation.

    PubMed

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with an increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability, and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
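    The reported numbers can be checked against Amdahl's law, S(N) = 1/((1 - p) + p/N): a 37× speedup on 64 instances implies a parallel fraction p of about 0.988. A minimal sketch of that arithmetic:

```java
/** Minimal sketch: invert Amdahl's law for the parallel fraction implied by
 *  the reported 30 h CPU vs 48.6 min on 64 instances (~37x speedup). */
public class Amdahl {
    static double speedup(double p, int n) { return 1.0 / ((1.0 - p) + p / n); }

    public static void main(String[] args) {
        int n = 64;
        double observed = 30.0 * 60.0 / 48.6; // 1800 min / 48.6 min ~= 37x
        double p = (1.0 - 1.0 / observed) / (1.0 - 1.0 / n); // solve S(n)=observed for p
        System.out.printf("observed speedup = %.1fx, implied parallel fraction p = %.4f%n",
                observed, p);
        // With that p, the speedup ceiling as N grows without bound is 1/(1-p).
        System.out.printf("predicted ceiling as N -> infinity: %.0fx%n", 1.0 / (1.0 - p));
    }
}
```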

  13. 78 FR 53455 - Notice of Agreements Filed

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    .... Parties: Zim Integrated Shipping Services, Ltd.; China Shipping Container Line Co., Ltd. and China Shipping Container Lines (Hong Kong) Co., Ltd. (acting as a single party). Filing Party: Mark E. Newcomb... the [[Page 53456

  14. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are Attachments to the report. (USGS)

  15. SOLDESIGN user's manual copyright

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillsbury, R.D. Jr.

    1991-02-01

    SOLDESIGN is a general purpose program for calculating and plotting magnetic fields, Lorentz body forces, resistances and inductances for a system of coaxial uniform current density solenoidal elements. The program was originally written in 1980 and has been evolving ever since. SOLDESIGN can be used with either interactive (terminal) or file input. Output can be to the terminal or to a file. All input is free-field with comma or space separators. SOLDESIGN contains an interactive help feature that allows the user to examine documentation while executing the program. Input to the program consists of a sequence of word commands and numeric data. Initially, the geometry of the elements or coils is defined by specifying either the coordinates of one corner of the coil or the coil centroid, a symmetry parameter to allow certain reflections of the coil (e.g., a split pair), the radial and axial builds, and either the overall current density or the total ampere-turns (NI). A more general quadrilateral element is also available. If inductances or resistances are desired, the number of turns must be specified. Field, force, and inductance calculations also require the number of radial current sheets (or integration points). Work is underway to extend the field, force, and, possibly, inductance calculations to non-coaxial solenoidal elements.
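    A minimal sketch of the kind of calculation SOLDESIGN automates: the on-axis field of a coaxial solenoid, approximated as a stack of circular current loops using B_loop(z) = mu0*I*R^2 / (2*(R^2 + z^2)^(3/2)). The dimensions and ampere-turns are hypothetical, and SOLDESIGN's current-sheet integration is more general than this loop sum.

```java
/** Minimal sketch: axial field of a uniform solenoid as a sum of loop fields. */
public class SolenoidField {
    static final double MU0 = 4e-7 * Math.PI; // T*m/A

    /** Field at z (m from coil center) of a solenoid of radius r and length len,
     *  carrying NI ampere-turns, discretized into nLoops circular loops. */
    static double axialField(double r, double len, double NI, double z, int nLoops) {
        double iPerLoop = NI / nLoops, b = 0.0;
        for (int k = 0; k < nLoops; k++) {
            double zk = -len / 2 + (k + 0.5) * len / nLoops; // loop position
            double dz = z - zk;
            b += MU0 * iPerLoop * r * r / (2 * Math.pow(r * r + dz * dz, 1.5));
        }
        return b;
    }

    public static void main(String[] args) {
        // 0.1 m radius, 0.5 m long, 1e5 ampere-turns: field at center and at one end.
        System.out.printf("B(center) = %.4f T%n", axialField(0.1, 0.5, 1e5, 0.0, 200));
        System.out.printf("B(end)    = %.4f T%n", axialField(0.1, 0.5, 1e5, 0.25, 200));
    }
}
```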

  16. 78 FR 78355 - Hydro Green Energy, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... (10) a new single-circuit 230-kilovolt transmission line approximately 12 miles in length. The... intent must meet the requirements of 18 CFR 4.36. The Commission strongly encourages electronic filing... (toll free), or (202) 502- 8659 (TTY). In lieu of electronic filing, please send a paper copy to...

  17. Preliminary Data Pipeline for SunRISE: Assessing the Performance of Space Based Radio Arrays

    NASA Astrophysics Data System (ADS)

    Hegedus, A. M.; Kasper, J. C.; Lazio, J.; Amiri, N.; Stuart, J.

    2017-12-01

    The Sun Radio Interferometer Space Experiment (SunRISE) is a NASA Heliophysics Explorer Mission of Opportunity that was recently awarded phase A funding. SunRISE's main science goals are to localize the source of particle acceleration in coronal mass ejections to 1/4th of their width, and to trace the path of electron beams along magnetic field lines out to 20 solar radii. These processes generate cascading Type II and III bursts that have only ever been detected at low frequencies with single-spacecraft antennas. These bursts emit below the ionospheric cutoff of 10 MHz past 2 solar radii, so a synthetic aperture made from multiple space antennas is needed to pinpoint the origin of these bursts. In this work, we create an end-to-end simulation of the data processing pipeline of SunRISE, which uses 6 small satellites to do this localization. One of the main inputs of the simulation is a ground truth of what we want the array to image. We idealized this as an elliptical Gaussian offset from the Sun, which previous modeling suggests is a good approximation of what SunRISE would see in space. Another input is an orbit file describing the positions of all the spacecraft. The simulated orbit determinations are made with GPS sidelobes and have an error associated with the recovered positions. From there we compute the Fourier coefficients each antenna will see, then apply the correct phase lags and multiply each pair of coefficients to simulate the process of correlation. We compute the projected UVW coordinates and put these along with the correlated visibilities into a CASA MS file. The correlated visibilities are compared to CASA's simulated visibilities at the same UVW coordinates, verifying the accuracy of our method. The visibilities are then subjected to realistic thermal noise, as well as phase noise from uncertainties in the spacecraft position. We employ CASA's CLEAN algorithm to image the data, and CASA's imfit algorithm to estimate the parameters of the imaged elliptical Gaussian, which we can compare directly to the input. We find that at the upper frequencies the phase noise can negatively affect the performance of the array, but for the large majority of the tracking range of interest, SunRISE can sufficiently resolve the radio bursts to fulfill its science requirements and constrain Solar Energetic Particle acceleration and transport.

  18. Photonic lantern adaptive spatial mode control in LMA fiber amplifiers.

    PubMed

    Montoya, Juan; Aleshire, Chris; Hwang, Christopher; Fontaine, Nicolas K; Velázquez-Benítez, Amado; Martz, Dale H; Fan, T Y; Ripin, Dan

    2016-02-22

    We demonstrate adaptive spatial mode control (ASMC) in few-moded double-clad large mode area (LMA) fiber amplifiers by using an all-fiber-based photonic lantern. Three single-mode fiber inputs are used to adaptively inject the appropriate superposition of input modes in a multimode gain fiber to achieve the desired mode at the output. By actively adjusting the relative phase of the single-mode inputs, near-unity coherent combination resulting in a single fundamental mode at the output is achieved.

  19. CRANS - CONFIGURABLE REAL-TIME ANALYSIS SYSTEM

    NASA Technical Reports Server (NTRS)

    Mccluney, K.

    1994-01-01

    In a real-time environment, the results of changes or failures in a complex, interconnected system need evaluation quickly. Tabulations showing the effects of changes and/or failures of a given item in the system are generally only useful for a single input, and only with regard to that item. Subsequent changes become harder to evaluate as combinations of failures produce a cascade effect. When confronted by multiple indicated failures in the system, it becomes necessary to determine a single cause. In this case, failure tables are not very helpful. CRANS, the Configurable Real-time ANalysis System, can interpret a logic tree, constructed by the user, describing a complex system and determine the effects of changes and failures in it. Items in the tree are related to each other by Boolean operators. The user is then able to change the state of these items (ON/OFF FAILED/UNFAILED). The program then evaluates the logic tree based on these changes and determines any resultant changes to other items in the tree. CRANS can also search for a common cause for multiple item failures, and allow the user to explore the logic tree from within the program. A "help" mode and a reference check provide the user with a means of exploring an item's underlying logic from within the program. A commonality check determines single point failures for an item or group of items. Output is in the form of a user-defined matrix or matrices of colored boxes, each box representing an item or set of items from the logic tree. Input is via mouse selection of the matrix boxes, using the mouse buttons to toggle the state of the item. CRANS is written in C-language and requires the MIT X Window System, Version 11 Revision 4 or Revision 5. It requires 78K of RAM for execution and a three button mouse. It has been successfully implemented on Sun4 workstations running SunOS, HP9000 workstations running HP-UX, and DECstations running ULTRIX. No executable is provided on the distribution medium; however, a sample makefile is included. Sample input files are also included. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. This program was developed in 1992.
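    A minimal sketch of evaluating a CRANS-style logic tree after item-state changes; the item names and the AND/OR structure below are hypothetical, not from an actual CRANS input file.

```java
/** Minimal sketch: items combined by Boolean operators, each item toggled
 *  FAILED/UNFAILED, with the tree re-evaluated after every change. */
public class LogicTree {
    interface Node { boolean ok(); }

    static class Item implements Node {
        final String name; boolean failed;
        Item(String name) { this.name = name; }
        public boolean ok() { return !failed; }
    }

    static class And implements Node {
        final Node a, b;
        And(Node a, Node b) { this.a = a; this.b = b; }
        public boolean ok() { return a.ok() && b.ok(); }
    }

    static class Or implements Node {
        final Node a, b;
        Or(Node a, Node b) { this.a = a; this.b = b; }
        public boolean ok() { return a.ok() || b.ok(); }
    }

    public static void main(String[] args) {
        Item pumpA = new Item("PUMP_A"), pumpB = new Item("PUMP_B"), valve = new Item("VALVE");
        // Coolant flow needs the valve and at least one of the redundant pumps.
        Node coolant = new And(valve, new Or(pumpA, pumpB));
        pumpA.failed = true;  // single failure: system still OK
        System.out.println("after PUMP_A fails: " + coolant.ok());
        pumpB.failed = true;  // cascade: flow lost
        System.out.println("after PUMP_B fails: " + coolant.ok());
    }
}
```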

  20. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  1. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian

    1997-01-01

    This report describes work conducted on Delivery Order 181 between October 1996 through June 1997. During this period software was written to: compute axial PSD's from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSD's from HDOS "Big 8" axial scans; plot PSD's from FITS format PSD files; plot band-limited RMS vs axial and azimuthal position for multiple PSD files; combine and organize PSD's from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS formatted PSD files; evaluate AXAF-I test results; improve and expand the capabilities of the GT x-ray mirror analysis package. During this period work began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.

  2. OFFSET - RAY TRACING OPTICAL ANALYSIS OF OFFSET SOLAR COLLECTOR FOR SPACE STATION SOLAR DYNAMIC POWER SYSTEM

    NASA Technical Reports Server (NTRS)

    Jefferies, K.

    1994-01-01

    OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels each containing 24 triangular, reflective facets. Current research is geared toward optimizing flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but to simulate collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: 1) equations developed to pseudo-randomly select ray originating sources on the Sun which appear evenly distributed and include solar limb darkening; 2) a cone-optics technique used to add surface specular error to the ray originating sources to determine the apparent ray sources of the reflected Sun; 3) a choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; 4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at the ten nodal points on each facet; and 5) color contour plots of receiver incident flux distribution generated by PATRAN processing of FORTRAN computer code output. OFFSET output includes a file of input data for confirmation, a PATRAN results file containing the values necessary to plot the flux distribution at the receiver surface, a PATRAN results file containing the intensity distribution on a 40 x 40 cm area of the receiver aperture plane, a data file containing calculated information on the system configuration, a file including the X-Y coordinates of the target points of each collector facet on the aperture opening, and twelve P/PLOT input data files to allow X-Y plotting of various results data. OFFSET is written in FORTRAN (70%) for the IBM VM operating system. The code contains PATRAN statements (12%) and P/PLOT statements (18%) for generating plots. Once the program has been run on VM (or an equivalent system), the PATRAN and P/PLOT files may be transferred to a DEC VAX (or equivalent system) with access to PATRAN for PATRAN post-processing. OFFSET was written in 1988 and last updated in 1989. PATRAN is a registered trademark of PDA Engineering. IBM is a registered trademark of International Business Machines Corporation. DEC VAX is a registered trademark of Digital Equipment Corporation.
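
    A minimal sketch of feature (1), sampling ray origins on the solar disk in proportion to a limb-darkened intensity. The linear darkening law and its coefficient are assumptions for illustration, not OFFSET's actual equations.

      import numpy as np

      rng = np.random.default_rng(1)
      u = 0.6  # linear limb-darkening coefficient (assumed)

      def sample_solar_origins(n):
          """Rejection-sample (x, y) on the unit solar disk, intensity-weighted."""
          pts = []
          while len(pts) < n:
              x, y = rng.uniform(-1, 1, 2)
              r2 = x * x + y * y
              if r2 > 1.0:
                  continue                       # outside the disk
              mu = np.sqrt(1.0 - r2)             # cosine of heliocentric angle
              intensity = 1.0 - u * (1.0 - mu)   # linear limb-darkening law
              if rng.uniform() < intensity:      # keep rays by brightness
                  pts.append((x, y))
          return np.array(pts)

      origins = sample_solar_origins(10000)      # center-weighted, dimmer limb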

  3. Single lens laser beam shaper

    DOEpatents

    Liu, Chuyu [Newport News, VA; Zhang, Shukui [Yorktown, VA

    2011-10-04

    A single lens bullet-shaped laser beam shaper capable of redistributing an arbitrary beam profile into any desired output profile, comprising a unitary lens comprising: a) a convex front input surface defining a focal point and a flat output portion at the focal point; and b) a cylindrical core portion having a flat input surface coincident with the flat output portion of the first input portion at the focal point and a convex rear output surface remote from the convex front input surface.

  4. Canary: an atomic pipeline for clinical amplicon assays.

    PubMed

    Doig, Kenneth D; Ellul, Jason; Fellowes, Andrew; Thompson, Ella R; Ryland, Georgina; Blombery, Piers; Papenfuss, Anthony T; Fox, Stephen B

    2017-12-15

    High throughput sequencing requires bioinformatics pipelines to process large volumes of data into meaningful variants that can be translated into a clinical report. These pipelines often suffer from a number of shortcomings: they lack robustness and have many components written in multiple languages, each with a variety of resource requirements. Pipeline components must be linked together with a workflow system to achieve the processing of FASTQ files through to a VCF file of variants. Crafting these pipelines requires considerable bioinformatics and IT skills beyond the reach of many clinical laboratories. Here we present Canary, a single program that can be run on a laptop, which takes FASTQ files from amplicon assays through to an annotated VCF file ready for clinical analysis. Canary can be installed and run with a single command using Docker containerization or run as a single JAR file on a wide range of platforms. Although it is a single utility, Canary performs all the functions present in more complex and unwieldy pipelines. All variants identified by Canary are 3' shifted and represented in their most parsimonious form to provide a consistent nomenclature, irrespective of sequencing variation. Further, proximate in-phase variants are represented as a single HGVS 'delins' variant. This allows for correct nomenclature and consequences to be ascribed to complex multi-nucleotide polymorphisms (MNPs), which are otherwise difficult to represent and interpret. Variants can also be annotated with hundreds of attributes sourced from MyVariant.info to give up to date details on pathogenicity, population statistics and in-silico predictors. Canary has been used at the Peter MacCallum Cancer Centre in Melbourne for the last 2 years for the processing of clinical sequencing data. By encapsulating clinical features in a single, easily installed executable, Canary makes sequencing more accessible to all pathology laboratories. Canary is available for download as source or a Docker image at https://github.com/PapenfussLab/Canary under a GPL-3.0 License.
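
    A minimal sketch of the 3' shift idea -- sliding an indel as far rightward (3') as the reference allows, so equivalent indels receive one nomenclature. This is the standard rotation rule, not Canary's actual implementation.

      def shift_3prime(ref: str, pos: int, allele: str) -> int:
          """Most 3' start position for deleting `allele` from `ref` (0-based)."""
          n = len(allele)
          while pos + n < len(ref) and ref[pos] == ref[pos + n]:
              pos += 1       # deleting one base further right gives the
          return pos         # same resulting sequence, so keep sliding

      # Deleting an 'A' from a run of As is ambiguous; report the
      # rightmost (3'-most) equivalent position for consistency.
      print(shift_3prime("CAAAT", 1, "A"))   # 3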

  5. Description, Usage, and Validation of the MVL-15 Modified Vortex Lattice Analysis Capability

    NASA Technical Reports Server (NTRS)

    Ozoroski, Thomas A.

    2015-01-01

    MVL-15 is the most recent version of the Modified Vortex-Lattice (MVL) code developed within the Aerodynamics Systems Analysis Branch (ASAB) at NASA LaRC. The term "modified" refers to the primary modification of the core vortex-lattice methodology: the inclusion of viscous aerodynamics tables that are linked to the linear solution via iterative processes. The inclusion of the viscous aerodynamics inherently converts MVL-15 from a purely analytic linearized method to a semi-empirical blend that retains the rapid execution speed of the linearized method while empirically characterizing the section aerodynamics at all spanwise lattice points. The modification provides a means to assess non-linear effects on lift that occur at angles of attack near stall, and a means to determine the drag associated with the application of design strategies for lift augmentation such as the use of flaps or blowing. The MVL-15 code is applicable to the analysis of aircraft aerodynamics during cruise, but it is most advantageously applied to the analysis of aircraft operating in various high-lift configurations. The MVL methodology has been conceived and implemented previously; the initial concept version was delivered to the ASAB in 2001 (van Dam, C.), subsequently revised (Gelhausen, P. and Ozoroski, T. 2002 / AVID Inc., Gelhausen, P., and Roberts, M. 2004), and then overhauled (Ozoroski, T., Hahn, A. 2008). The latest version, MVL-15, has been refined to provide analysis transparency and enhanced to meet the analysis requirements of the Environmentally Responsible Aviation (ERA) Project. Each revision has been implemented with reasonable success. Separate applications of the methodology are in use, including a similar in-house capability, developed by Olson, E., that is tailored for structural and acoustics analyses. A central premise of the methodology is that viscous aerodynamic data can be associated with analytic inviscid aerodynamic results at each spanwise wing section, thereby providing a pathway to map viscous data to the inviscid results. However, a number of factors can sidetrack the analysis consistency during various stages of this process. For example, it should be expected that the final airplane lift curve and drag polar results depend strongly on the geometry and aerodynamics of the airfoil section; however, flap deflections and flap chord extensions change the local reference geometry of the input airfoil, the airplane wing, the tabulated non-dimensional viscous aerodynamics, and the spanwise links between the linear and the viscous aerodynamics. These changes also affect the bound circulation and, therefore, the calculation and integration of the induced angle of attack and induced drag. MVL-15 is configured to ensure these types of challenges are properly addressed. This report is a comprehensive manual describing the theory, use, and validation of the MVL-15 analysis tool. Section 3 summarizes theoretical, procedural, and characteristic features of MVL-15, and includes a list of the files required to set up, execute, and summarize an analysis. Sections 4 through 7 together comprise the User's Guide portion of this report. The MVL-15 input and output files are described in Section 4 and Section 5, respectively; the descriptions are supplemented with example files and information about the file formats, parameter definitions, and typical parameter values. Section 6 describes the Wing Geometry Setup Utility and the 2d-Variants Utility files that simplify and assist setting up a consistent set of MVL-15 geometry and aerodynamics input parameters and input files. Section 7 describes the use of the 3d-Results Presentation Utility file that can be used to automatically create summary tables and charts from the MVL-15 output files. Section 8 documents the Validation Results of an extensive and varied validation test matrix, including results of an airplane analysis representative of the ERA Program. A start-to-finish example of the airplane analysis procedure is described in Section 7.
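
    As an illustration of that central premise, here is a minimal sketch (a lifting-line stand-in, not MVL-15's vortex-lattice code) of coupling tabulated 2-D viscous section data to a spanwise solution by iterating on the local effective angle of attack; the section table, relaxation factor, and toy induced-angle model are all assumptions.

      import numpy as np

      # Stand-in viscous section table: cl(alpha) with crude post-stall flattening.
      alpha_tab = np.radians(np.arange(-4, 20))      # alpha grid (rad)
      cl_tab = 2 * np.pi * alpha_tab                  # thin-airfoil slope
      cl_tab[alpha_tab > np.radians(12)] = 1.3        # flatten past "stall"

      def section_cl(alpha_eff):
          """Look up viscous section lift from the table."""
          return np.interp(alpha_eff, alpha_tab, cl_tab)

      def solve_spanwise(alpha_geo, alpha_induced_fn, n=41, relax=0.1, iters=200):
          """Fixed-point iteration: alpha_eff = alpha_geo - alpha_induced(cl)."""
          cl = np.zeros(n)
          for _ in range(iters):
              alpha_eff = alpha_geo - alpha_induced_fn(cl)
              cl += relax * (section_cl(alpha_eff) - cl)  # under-relax
          return cl

      # Toy induced-angle model: uniform downwash from the mean loading (AR = 8).
      cl = solve_spanwise(np.radians(8), lambda cl: cl.mean() / (np.pi * 8))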

  6. Java-based Graphical User Interface for MAVERIC-II

    NASA Technical Reports Server (NTRS)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It was written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept, and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification, and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.
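
    A minimal sketch of that three-step scenario as a driver script; the executable name, input file, and output column layout are hypothetical.

      import subprocess

      # Step 2: run the simulation on an already-edited input file.
      subprocess.run(["./maveric", "ascent_sim.in"], check=True)

      # Step 3: plot a column of the output history with gnuplot.
      gnuplot_script = b"""
      set xlabel 'time (s)'
      set ylabel 'altitude (m)'
      plot 'ascent_sim.out' using 1:2 with lines title 'altitude'
      pause -1
      """
      subprocess.run(["gnuplot"], input=gnuplot_script, check=True)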

  7. Construction of a Distributed-network Digital Watershed Management System with B/S Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W. C.; Liu, Y. M.; Fang, J.

    2017-07-01

    Integrated watershed assessment tools for supporting land management and hydrologic research are becoming established in both basic and applied research. The core of these tools is mainly spatially distributed hydrologic models, as they provide a mechanism for investigating interactions among climate, topography, vegetation, and soil. However, the extensive data requirements and the difficult task of building input parameter files to drive these distributed models have long been an obstacle to their timely and cost-effective use by watershed managers and policy-makers. Recently, a web-based geographic information system (GIS) tool to facilitate this process has been developed for the large Jinghe and Weihe catchments, located in the loess plateau of the Huanghe River basin in north-western China. The web-based GIS provides the framework within which spatially distributed data are collected and used to prepare model input files for these two watersheds, to evaluate model results, and to serve the various clients for watershed information inquiry, visualization, and assessment analysis. This Web-based Automated Geospatial Watershed Assessment GIS (WAGWA-GIS) tool uses widely available standardized spatial datasets, obtained via an Internet Oracle databank designed in association with the MapGuide platform, to develop input parameter files for online simulation at different spatial and temporal scales with the Xin'anjiang and TOPMODEL models integrated into the web-based digital watershed. WAGWA-GIS automates the process of transforming digital data -- remote sensing data, DEMs, land use/cover and soil digital maps, geo-located meteorological and hydrological station maps, and text files of meteorological and hydrological records -- into hydrological model inputs for online simulation and geospatial analysis, and provides a visualization tool to help the user interpret results. The utility of WAGWA-GIS in joint hydrologic and ecological investigations has been demonstrated on landscapes as diverse as the Jinghe and Weihe watersheds, and it will be extended to other watersheds in China step by step in the coming years.

  8. Analysis of Parallel Algorithms on SMP Node and Cluster of Workstations Using Parallel Programming Models with New Tile-based Method for Large Biological Datasets.

    PubMed

    Shrimankar, D D; Sathe, S R

    2016-01-01

    Sequence alignment is an important tool for describing the relationships between DNA sequences. Many sequence alignment algorithms exist, differing in efficiency, in their models of the sequences, and in the relationship between sequences. The focus of this study is to obtain an optimal alignment between two sequences of biological data, particularly DNA sequences. The algorithm is discussed with particular emphasis on time, speedup, and efficiency optimizations. Parallel programming presents a number of critical challenges to application developers. Today's supercomputers often consist of clusters of SMP nodes, and programming paradigms such as OpenMP and MPI are used to write parallel codes for such architectures. OpenMP programs, however, cannot scale beyond a single SMP node, whereas programs written in MPI can span multiple SMP nodes, albeit at the cost of internode communication. In this work, we explore the tradeoffs between using OpenMP and MPI. We demonstrate that communication overhead is significant even in OpenMP loop execution and increases with the number of participating cores. We also present a communication model that approximates the overhead of OpenMP loops. The results are striking and hold across a large variety of input data files. We have developed our own load-balancing and cache-optimization techniques for the message-passing model. Our experimental results show that these techniques give optimum performance of our parallel algorithm for various sizes of input parameters, such as sequence size and tile size, on a wide variety of multicore architectures.
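
    A minimal sketch of fitting such an overhead model from measured loop timings; the timing numbers below are hypothetical.

      import numpy as np

      cores = np.array([1, 2, 4, 8, 16, 32])
      overhead_us = np.array([0.8, 1.9, 3.6, 7.4, 14.9, 30.2])  # made up

      # Least-squares fit of overhead(p) ~ a + b*p: overhead grows
      # roughly linearly with the number of participating cores.
      b, a = np.polyfit(cores, overhead_us, 1)
      print(f"overhead(p) ~ {a:.2f} + {b:.2f}*p microseconds")

      # The predicted overhead informs the OpenMP-vs-MPI tradeoff
      # for a loop of a given size at a given core count.
      print(f"predicted at 64 cores: {a + b * 64:.1f} us")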

  9. The multidimensional Self-Adaptive Grid code, SAGE, version 2

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1995-01-01

    This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
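
    A minimal one-dimensional sketch of the adaption idea -- equidistributing a gradient-based weight function so grid points cluster where the solution changes rapidly. SAGE's actual formulation (after Nakahashi and Deiwert) is considerably more elaborate.

      import numpy as np

      def adapt_grid(x, u, strength=5.0):
          """Redistribute grid x so each cell carries equal integrated weight."""
          w = 1.0 + strength * np.abs(np.gradient(u, x))   # weight ~ |du/dx|
          # Cumulative weight via the trapezoid rule, then invert the map
          # so the new points sit at equal increments of total weight.
          cumw = np.concatenate(
              ([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
          targets = np.linspace(0.0, cumw[-1], len(x))
          return np.interp(targets, cumw, x)

      x = np.linspace(0.0, 1.0, 41)
      u = np.tanh(20.0 * (x - 0.5))       # sharp layer at x = 0.5
      x_new = adapt_grid(x, u)            # points now cluster near the layer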

  10. Analysis of Parallel Algorithms on SMP Node and Cluster of Workstations Using Parallel Programming Models with New Tile-based Method for Large Biological Datasets

    PubMed Central

    Shrimankar, D. D.; Sathe, S. R.

    2016-01-01

    Sequence alignment is an important tool for describing the relationships between DNA sequences. Many sequence alignment algorithms exist, differing in efficiency, in their models of the sequences, and in the relationship between sequences. The focus of this study is to obtain an optimal alignment between two sequences of biological data, particularly DNA sequences. The algorithm is discussed with particular emphasis on time, speedup, and efficiency optimizations. Parallel programming presents a number of critical challenges to application developers. Today's supercomputers often consist of clusters of SMP nodes, and programming paradigms such as OpenMP and MPI are used to write parallel codes for such architectures. OpenMP programs, however, cannot scale beyond a single SMP node, whereas programs written in MPI can span multiple SMP nodes, albeit at the cost of internode communication. In this work, we explore the tradeoffs between using OpenMP and MPI. We demonstrate that communication overhead is significant even in OpenMP loop execution and increases with the number of participating cores. We also present a communication model that approximates the overhead of OpenMP loops. The results are striking and hold across a large variety of input data files. We have developed our own load-balancing and cache-optimization techniques for the message-passing model. Our experimental results show that these techniques give optimum performance of our parallel algorithm for various sizes of input parameters, such as sequence size and tile size, on a wide variety of multicore architectures. PMID:27932868

  11. The incorporation of plotting capability into the Unified Subsonic Supersonic Aerodynamic Analysis program, version B

    NASA Technical Reports Server (NTRS)

    Winter, O. A.

    1980-01-01

    The B01 version of the Unified Subsonic Supersonic Aerodynamic Analysis program is the result of numerous modifications and additions made to the B00 version. These modifications and additions affect the program input, its computational options, the code readability, and the overlay structure. The following are described: (1) the revised input; (2) the modified plotting overlay programs and their associated subroutines; (3) the auxiliary files used by the program; (4) the revised output data; and (5) the program overlay structure.

  12. Multibody dynamics model building using graphical interfaces

    NASA Technical Reports Server (NTRS)

    Macala, Glenn A.

    1989-01-01

    In recent years, the extremely laborious task of manually deriving equations of motion for the simulation of multibody spacecraft dynamics has largely been eliminated. Instead, the dynamicist now works with commonly available general-purpose dynamics simulation programs which generate the equations of motion either explicitly or implicitly via computer codes. The user interface to these programs has predominantly been via input data files, each with its own required format and peculiarities, causing errors and frustrations during program setup. Recent progress in a more natural method of data input for dynamics programs, the graphical interface, is described.

  13. IOS: PDP 11/45 formatted input/output task stacker and processor. [In MACRO-11]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koschik, J.

    1974-07-08

    IOS allows the programmer to perform formatted Input/Output at assembly-language level to/from any peripheral device. It runs under DOS versions V8-O8 or V9-19, reading and writing DOS-compatible files. Additionally, IOS will run, with total transparency, in an environment with memory management enabled. Minimum hardware required is a 16K PDP 11/45, Keyboard Device, DISK (DK, DF, or DC), and Line Frequency Clock. The source language is MACRO-11 (3.3K Decimal Words).

  14. FEMFLOW3D; a finite-element program for the simulation of three-dimensional aquifers; version 1.0

    USGS Publications Warehouse

    Durbin, Timothy J.; Bond, Linda D.

    1998-01-01

    This document also includes model validation, source code, and example input and output files. Model validation was performed using four test problems. For each test problem, the results of a model simulation with FEMFLOW3D were compared with either an analytic solution or the results of an independent numerical approach. The source code, written in the ANSI X3.9-1978 FORTRAN standard, and the complete input and output of an example problem are listed in the appendixes.

  15. Automatic Residential/Commercial Classification of Parcels with Solar Panel Detections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; Omitaomu, Olufemi A; Kotikot, Susan

    This computational method automatically detects solar panels on rooftops to aid policy and financial assessment of solar distributed generation. The code classifies parcels containing solar panels in the U.S. as residential or commercial. The user specifies an input dataset containing parcels and detected solar panels; the code then uses information about the parcels and panels to classify the rooftops as residential or commercial using machine-learning techniques. The zip file containing the code includes sample input and output datasets for the Boston and DC areas.
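
    A minimal sketch of the classification step; the feature names, input file, and the choice of a random forest are assumptions for illustration, not the released code's actual design.

      import pandas as pd
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      # Hypothetical parcel/panel features joined into one table.
      parcels = pd.read_csv("parcels_with_panels.csv")
      features = parcels[["parcel_area", "building_area",
                          "panel_count", "panel_area"]]
      labels = parcels["class"]   # 'residential' / 'commercial'

      X_tr, X_te, y_tr, y_te = train_test_split(
          features, labels, test_size=0.25, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")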

  16. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Astrophysics Data System (ADS)

    Shen, S.; Ostrenga, D.; Vollmer, B.; Meyer, D. J.

    2017-12-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution, and the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy, or extreme weather events, are likely only interested in daily or hourly model data at a single point or for a small area for a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from NASA Modern-Era Retrospective analysis for Research and Applications Version-2 (MERRA-2), it may take over 10 hours to extract the parameters of interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by reconstructing the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, one-year, or whole-mission. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but may also enable better online analytic services.
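
    A minimal sketch of the restructuring idea using xarray; the file pattern, variable name, and zarr chunking are assumptions, not the GES DISC implementation.

      import xarray as xr

      # Slow path: a point time series read straight from thousands of
      # one-day-per-file hourly granules.
      ds = xr.open_mfdataset("MERRA2_hourly/*.nc4", combine="by_coords")
      series = ds["T2M"].sel(lat=38.9, lon=-77.0, method="nearest")

      # One-time restructuring: write a whole-mission cube whose chunks
      # span all of time over small spatial tiles, favoring time access.
      ds.chunk({"time": -1, "lat": 10, "lon": 10}).to_zarr("merra2_cube.zarr")

      # Fast path: the same point series now reads a few contiguous chunks.
      cube = xr.open_zarr("merra2_cube.zarr")
      series_fast = cube["T2M"].sel(lat=38.9, lon=-77.0,
                                    method="nearest").load()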

  17. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Meyer, Dave

    2017-01-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution, and the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy, or extreme weather events, are likely only interested in daily or hourly model data at a single point (or for a small area) for a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from NASA Modern-Era Retrospective analysis for Research and Applications Version-2 (MERRA-2), it may take over 10 hours to extract the parameters of interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by reconstructing the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, or one-year. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but may also enable better online analysis services.

  18. Automation at the University of Georgia Libraries.

    ERIC Educational Resources Information Center

    Christoffersson, John G.

    1979-01-01

    Presents the design procedures, bibliographic system, file structures, acquisitions and circulation systems, functional implementation, and future development of the Managing Resources for University Libraries (MARVEL) data base at the University of Georgia Libraries, which accepts MARC input from OCLC and Library of Congress (LC) MARC tapes. (CWM)

  19. Automated Instructional Management Systems (AIMS) Version III, Users Manual.

    ERIC Educational Resources Information Center

    New York Inst. of Tech., Old Westbury.

    This document sets forth the procedures necessary to utilize and understand the operating characteristics of the Automated Instructional Management System - Version III, a computer-based system for management of educational processes. Directions for initialization, including internal and user files; system and operational input requirements;…

  20. Study of Mechanization in DOD Libraries and Information Centers.

    ERIC Educational Resources Information Center

    Booz, Allen Applied Research, Inc., Bethesda, MD.

    This report summarizes the on-site study of mechanization in DoD libraries and information centers. Included are presentations and evaluations on thesaurus building, file structure, input processing, serial control, selective dissemination of information, circulation control, equipments being used, recommendations on information retrieval systems,…
