SAMICS: Input data preparation. [Solar Array Manufacturing Industry Costing Standards]
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.; Aster, R. W.
1979-01-01
The Solar Array Manufacturing Industry Costing Standards (SAMICS) provide standard formats, data, assumptions, and procedures for estimating the price that a manufacturer would have to charge for the product of a specified manufacturing process sequence. A line-by-line explanation is given of those standard formats which describe the economically important characteristics of the manufacturing processes and the technological structure of the companies and the industry. This revision provides an updated presentation of Format A Process Description, consistent with the October 1978 version of that form. A checklist of items which should be entered on Format A as direct expenses is included.
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
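GetData itself is a FORTRAN utility with its own time-history file formats accessed only through its interface subroutines. The short Python sketch below only illustrates the core operation described above, extracting selected signals over a selected time segment, and assumes a simple CSV-style time-history file with a leading "time" column; that layout is an assumption for illustration, not a GetData format.

```python
# Illustrative sketch only: extract selected signals over a selected time
# segment from a CSV-style time-history file (assumed layout, not GetData's).
import csv

def extract(in_path, out_path, signals, t_start, t_stop):
    """Copy the chosen signals over the chosen time segment to a new file."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=["time"] + list(signals))
        writer.writeheader()
        for row in reader:
            t = float(row["time"])
            if t_start <= t <= t_stop:
                writer.writerow({"time": row["time"],
                                 **{s: row[s] for s in signals}})

# Example: keep only two signals between 10 s and 20 s.
# extract("flight.csv", "subset.csv", ["alpha", "q"], 10.0, 20.0)
```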
2017-07-01
Output Re-Constructor. 1. General. This standard defines the recommended multiplexer format for single-channel data recording on small-format (1/2 in. ...) ... which is 1-based, is determined by the position of the channel's module in the ARMOR system. The first input channel found in the ARMOR system is ...
USSAERO version D computer program development using ANSI standard FORTRAN 77 and DI-3000 graphics
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1986-01-01
The D version of the Unified Subsonic Supersonic Aerodynamic Analysis (USSAERO) program is the result of numerous modifications and enhancements to the B01 version. These changes include conversion to ANSI standard FORTRAN 77; use of the DI-3000 graphics package; removal of the overlay structure; a revised input format; the addition of an input data analysis routine; and increasing the number of aeronautical components allowed.
Standard interface files and procedures for reactor physics codes, version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, B.M.
Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized. (15 references) (auth)
ERIC Educational Resources Information Center
Hamm, Michael S.
In 1993-1996, the Grocers Research and Educational Foundation of the National Grocers Association developed entry-level skill standards for the food marketing industry. A coalition formed early in the project directed the skill standard development process and solicited input from major organizations involved in the industry. The validity of the…
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Carbogno, Christian; Levchenko, Sergey; Mohamed, Fawzi; Huhs, Georg; Lüders, Martin; Oliveira, Micael; Scheffler, Matthias
2017-11-01
With big-data driven materials research, the new paradigm of materials science, sharing and wide accessibility of data are becoming crucial aspects. Obviously, a prerequisite for data exchange and big-data analytics is standardization, which means using consistent and unique conventions for, e.g., units, zero base lines, and file formats. There are two main strategies to achieve this goal. One accepts the heterogeneous nature of the community, which comprises scientists from physics, chemistry, bio-physics, and materials science, by complying with the diverse ecosystem of computer codes and thus develops "converters" for the input and output files of all important codes. These converters then translate the data of each code into a standardized, code-independent format. The other strategy is to provide standardized open libraries that code developers can adopt for shaping their inputs, outputs, and restart files, directly into the same code-independent format. In this perspective paper, we present both strategies and argue that they can and should be regarded as complementary, if not even synergetic. The represented appropriate format and conventions were agreed upon by two teams, the Electronic Structure Library (ESL) of the European Center for Atomic and Molecular Computations (CECAM) and the NOvel MAterials Discovery (NOMAD) Laboratory, a European Centre of Excellence (CoE). A key element of this work is the definition of hierarchical metadata describing state-of-the-art electronic-structure calculations.
A standard format and a graphical user interface for spin system specification.
Biternas, A G; Charnock, G T P; Kuprov, Ilya
2014-03-01
We introduce a simple and general XML format for spin system description that is the result of extensive consultations within Magnetic Resonance community and unifies under one roof all major existing spin interaction specification conventions. The format is human-readable, easy to edit and easy to parse using standard XML libraries. We also describe a graphical user interface that was designed to facilitate construction and visualization of complicated spin systems. The interface is capable of generating input files for several popular spin dynamics simulation packages. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
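Because the description is plain XML, it can be parsed with any standard library. The minimal Python sketch below makes that point; the element and attribute names in this toy document are hypothetical and do not reproduce the published schema.

```python
# Minimal sketch of reading a spin-system description from XML; the element
# and attribute names below are made up for illustration, not the real format.
import xml.etree.ElementTree as ET

doc = """<spin_system>
  <spin id="1" isotope="1H"/>
  <spin id="2" isotope="13C"/>
  <coupling spins="1 2" scalar_hz="145.0"/>
</spin_system>"""

root = ET.fromstring(doc)
spins = {s.get("id"): s.get("isotope") for s in root.findall("spin")}
for c in root.findall("coupling"):
    i, j = c.get("spins").split()
    print(f"J({spins[i]},{spins[j]}) = {c.get('scalar_hz')} Hz")
```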
Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1991-01-01
The Transferable Output ASCII Data (TOAD) Gateway, release 1.0 is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.
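A hub-and-spoke converter of this kind is simple to sketch: each format needs only a reader to, and a writer from, one intermediate in-memory table, so adding a new format never requires a converter for every pair. In the Python sketch below, the formats and the dictionary-based intermediate representation are illustrative assumptions, not the actual TOAD or SIF layouts.

```python
# Hub-and-spoke conversion sketch: everything passes through one intermediate
# in-memory table, so each format needs only one reader/writer pair.
import csv

def read_csv(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return {"header": rows[0], "data": rows[1:]}      # intermediate table

def write_tsv(table, path):
    with open(path, "w", newline="") as f:
        w = csv.writer(f, delimiter="\t")
        w.writerow(table["header"])
        w.writerows(table["data"])

READERS = {"csv": read_csv}
WRITERS = {"tsv": write_tsv}

def convert(src, src_fmt, dst, dst_fmt):
    WRITERS[dst_fmt](READERS[src_fmt](src), dst)

# convert("traj.csv", "csv", "traj.tsv", "tsv")
```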
Tool for Merging Proposals Into DSN Schedules
NASA Technical Reports Server (NTRS)
Khanampornpan, Teerapat; Kwok, John; Call, Jared
2008-01-01
A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
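merge7da is a Perl script that operates on the DSN-proprietary 7da and C1/C2 formats; the Python sketch below only illustrates the underlying merge-and-check idea, representing each allocation as a hypothetical (antenna, start, end) tuple rather than the real record layouts.

```python
# Sketch of merging a proposal into a schedule and flagging overlapping
# antenna allocations; the tuple layout is a simplification, not 7da/C1/C2.
def overlaps(a, b):
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def merge(schedule, proposal):
    conflicts = [(s, p) for p in proposal for s in schedule if overlaps(s, p)]
    merged = sorted(schedule + proposal, key=lambda rec: rec[1])
    return merged, conflicts

schedule = [("DSS-14", 100, 180), ("DSS-43", 200, 260)]
proposal = [("DSS-14", 170, 220)]
merged, conflicts = merge(schedule, proposal)
print("conflicts:", conflicts)   # [(('DSS-14', 100, 180), ('DSS-14', 170, 220))]
```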
Implementation of AAPG exchange format
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiser, K.; Guerrero, I.
1989-03-01
The American Association of Petroleum Geologists (AAPG) has proposed a format for exchanging geologic and other petroleum data. The AAPG Computer Applications Committee approved the proposal at the March 1988 AAPG annual meeting in Houston, Texas. By adopting this format, data input into application software and data exchange between software packages are greatly simplified. Benefits to both users and suppliers of software are substantial. The AAPG exchange format supports a flexible, generic data structure. This flexibility allows application software to use the standard format for storing internal control data. In some cases, extensions to the standard format, such as separation of header and data files and use of data delimiters, permit the use of AAPG format translator programs on data that were defined and generated before the emergence of the exchange format. Translation software, programmed in C, has been written and contributes to successful implementation of the AAPG exchange format in application software.
A CERIF-Compatible Research Management System Based on the MARC 21 Format
ERIC Educational Resources Information Center
Ivanovic, Dragan; Milosavljevic, Gordana; Milosavljevic, Branko; Surla, Dusan
2010-01-01
Purpose: Entering data about published research results should be implemented as a web application that enables authors to input their own data without knowledge of the bibliographic standard. The aim of this research is to develop a research management system based on a bibliographic standard and to provide data exchange with other research…
Electronic collection system for spacelab mission timeline requirements
NASA Technical Reports Server (NTRS)
Lindberg, James P.; Piner, John R.; Huang, Allen K. H.
1995-01-01
This paper describes the Functional Objective Requirements Collection System (FORCS) software tool that has been developed for use by Principal Investigators (PI's) and Payload Element Developers (PED's) on their own personal computers to develop on-orbit timelining requirements for their payloads. The FORCS tool can be used either in a totally stand-alone mode, storing the information in a local file on the user's personal computer hard disk or in a remote mode where the user's computer is linked to a host computer containing the integrated database of the timeline requirements for all of the payloads on a mission. There are a number of features incorporated in the FORCS software to assist the user. The user may move freely back and forth between the various forms for inputting the data. Several methods are used to input the information, depending on the type of the information. These methods range from filling in text boxes, using check boxes and radio buttons, to inputting information into a spreadsheet format. There are automated features provided to assist in developing the proper format for the data, ranging from limit checking on some of the parameters to automatic conversion of different formats of time data inputs to the one standard format used for the timeline scheduling software.
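The kind of automated formatting assistance described above, converting several accepted time notations to one standard form and limit-checking values, can be sketched in a few lines of Python; the accepted notations and the limit used here are assumptions for illustration, not the FORCS specification.

```python
# Sketch of input normalization: accept several time notations, convert to one
# standard form (seconds), and apply a simple limit check (assumed limits).
def to_seconds(text):
    """Accept 'HH:MM:SS', 'MM:SS', or plain seconds; return seconds."""
    parts = [float(p) for p in text.split(":")]
    if not 1 <= len(parts) <= 3:
        raise ValueError(f"unrecognized time format: {text!r}")
    seconds = 0.0
    for p in parts:
        seconds = seconds * 60 + p
    return seconds

def check_duration(seconds, max_seconds=86400):
    if not 0 <= seconds <= max_seconds:
        raise ValueError(f"duration {seconds} s outside 0..{max_seconds} s")
    return seconds

print(check_duration(to_seconds("01:30:00")))   # 5400.0
```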
Marra, Kristen R.
2017-10-24
In 2017, the U.S. Geological Survey (USGS) completed an updated assessment of undiscovered, technically recoverable oil and gas resources in the Spraberry Formation of the Midland Basin (Permian Basin Province) in southwestern Texas (Marra and others, 2017). The Spraberry Formation was assessed using both the standard continuous (unconventional) and conventional methodologies established by the USGS for three assessment units (AUs): (1) Lower Spraberry Continuous Oil Trend AU, (2) Middle Spraberry Continuous Oil Trend AU, and (3) Northern Spraberry Conventional Oil AU. The revised assessment resulted in total estimated mean resources of 4,245 million barrels of oil, 3,112 billion cubic feet of gas, and 311 million barrels of natural gas liquids. The purpose of this report is to provide supplemental documentation of the input parameters used in the USGS 2017 Spraberry Formation assessment.
A uniform input data convention for the CAL 3-D crash victim simulation
NASA Astrophysics Data System (ADS)
Shaibani, S. J.
1982-07-01
Logical schemes for the labelling of planes (cards D) and functions (cards E) in the input decks used for the Calspan 3-D Crash Victim Simulation (CVS) program are proposed. One benefit of introducing such a standardized format for these inputs would be to facilitate greatly the interchange of data for different vehicles. A further advantage would be that the table of allowed contacts (cards F) could remain largely unaltered. It is hoped that the uniformity of the convention described by these schemes would help to promote the exchange of readily usable data between CVS users.
Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis
NASA Technical Reports Server (NTRS)
Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.
2007-01-01
To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
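The essence of the approach can be sketched with a regular expression that finds the "value +/- tolerance" pattern anywhere in an input file and substitutes a random sample for each Monte Carlo realization. In the Python sketch below, the uniform sampling distribution and the field names are assumptions for illustration, not details taken from the paper.

```python
# Sketch of "blurring the inputs": tolerances written as "value +/- tol" are
# found with a regular expression and replaced by random samples per run.
import random, re

TOL = re.compile(r"(-?\d+\.?\d*)\s*\+/-\s*(\d+\.?\d*)")

def realize(text, rng=random):
    def sample(m):
        value, tol = float(m.group(1)), float(m.group(2))
        return repr(rng.uniform(value - tol, value + tol))   # assumed uniform
    return TOL.sub(sample, text)

template = "wall_temperature = 5.25 +/- 0.01\nemissivity = 0.89 +/- 0.02\n"
for _ in range(3):                 # three Monte Carlo realizations
    print(realize(template))
```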
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
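SCIFIO itself is a Java library whose formats are discovered through the SciJava plugin framework; the Python sketch below only mirrors the design idea: each format is a self-contained plugin that declares whether it can handle a given file, and the reader picks the first format that claims it. The PGM example format is an illustrative assumption.

```python
# Sketch of a plugin-style format registry: formats register themselves,
# declare what they can read, and the library dispatches to the right one.
FORMATS = []

def register(cls):
    FORMATS.append(cls())
    return cls

@register
class PGMFormat:
    def claims(self, path):
        # Cheap check on the file signature.
        with open(path, "rb") as f:
            return f.read(2) == b"P5"
    def read(self, path):
        with open(path, "rb") as f:
            return f.read()        # decoding into pixels omitted in this sketch

def open_image(path):
    for fmt in FORMATS:
        if fmt.claims(path):
            return fmt.read(path)
    raise ValueError(f"no registered format claims {path}")
```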
Designing an evaluation framework for WFME basic standards for medical education.
Tackett, Sean; Grant, Janet; Mmari, Kristin
2016-01-01
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
CFEST Coupled Flow, Energy & Solute Transport Version CFEST005 User’s Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Chen, Yousu; Gilca, Alex
2006-07-20
The CFEST (Coupled Flow, Energy, and Solute Transport) simulator described in this User’s Guide is a three-dimensional finite-element model used to evaluate groundwater flow and solute mass transport. Confined and unconfined aquifer systems, as well as constant and variable density fluid flows, can be represented with CFEST. For unconfined aquifers, the model uses a moving boundary for the water table, deforming the numerical mesh so that the uppermost nodes are always at the water table. For solute transport, changes in concentration of a single dissolved chemical constituent are computed for advective and hydrodynamic transport, linear sorption represented by a retardation factor, and radioactive decay. Although several thermal parameters described in this User’s Guide are required inputs, thermal transport has not yet been fully implemented in the simulator. Once fully implemented, transport of thermal energy in the groundwater and solid matrix of the aquifer can also be used to model aquifer thermal regimes. The CFEST simulator is written in the FORTRAN 77 language, following American National Standards Institute (ANSI) standards. Execution of the CFEST simulator is controlled through three required text input files. These input files use a structured format of associated groups of input data. Example input data lines are presented for each file type, as well as a description of the structured FORTRAN data format. Detailed descriptions of all input requirements, output options, and program structure and execution are provided in this User’s Guide. Required inputs for auxiliary CFEST utilities that aid in post-processing data are also described. Global variables are defined for those with access to the source code. Although CFEST is a proprietary code (CFEST, Inc., Irvine, CA), the Pacific Northwest National Laboratory retains permission to maintain its own source, and to distribute executables to Hanford subcontractors.
High-speed asynchronous data multiplexer/demultiplexer for high-density digital recorders
NASA Astrophysics Data System (ADS)
Berdugo, Albert; Small, Martin B.
1996-11-01
Modern High Density Digital Recorders are ideal devices for the storage of large amounts of digital and/or wideband analog data. Ruggedized versions of these recorders are currently available and are supporting many military and commercial flight test applications. However, in certain cases, the storage format becomes very critical, e.g., when a large number of data types are involved, or when channel-to-channel correlation is critical, or when the original data source must be accurately recreated during post mission analysis. A properly designed storage format will not only preserve data quality, but will yield the maximum storage capacity and record time for any given recorder family or data type. This paper describes a multiplex/demultiplex technique that formats multiple high speed data sources into a single, common format for recording. The method is compatible with many popular commercial recorder standards such as DCRsi, VLDS, and DLT. Types of input data typically include PCM, wideband analog data, video, aircraft data buses, avionics, voice, time code, and many others. The described method preserves tight data correlation with minimal data overhead. The described technique supports full reconstruction of the original input signals during data playback. Output data correlation across channels is preserved for all types of data inputs. Simultaneous real-time data recording and reconstruction are also supported.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
Godson, Richard H.
1974-01-01
GEOPAC consists of a series of subroutines designed primarily to process potential-field geophysical data, but other types of data can also be used with the program. The package contains routines to reduce, store, process and display information in two-dimensional or three-dimensional form. Input and output formats are standardized, and temporary disk storage permits data sets to be processed by several subroutines in one job step. The subroutines are link-edited in an overlay mode to form one program, and they can be executed by submitting a card containing the subroutine name in the input stream.
Carey, A.E.; Prudic, David E.
1996-01-01
Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machine (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) operating system version 5.0 or greater.
LANDSAT: Non-US standard catalog no. N-36. [LANDSAT imagery for August, 1975
NASA Technical Reports Server (NTRS)
1975-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska, and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross reference defining the beginning and ending dates for LANDSAT cycles.
LANDSAT 2 world standard catalog, 1 May - 31 July 1978. [LANDSAT imagery for May through July 1978
NASA Technical Reports Server (NTRS)
1978-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles.
LANDSAT: Non-US standard catalog no. N-30. [LANDSAT imagery for February, 1975
NASA Technical Reports Server (NTRS)
1975-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska, and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles.
NASA Technical Reports Server (NTRS)
Folta, David; Bauer, Frank H. (Technical Monitor)
2001-01-01
The autonomous formation flying control algorithm developed by the Goddard Space Flight Center (GSFC) for the New Millennium Program (NMP) Earth Observing-1 (EO-1) mission is investigated for applicability to libration point orbit formations. In the EO-1 formation-flying algorithm, control is accomplished via linearization about a reference transfer orbit with a state transition matrix (STM) computed from state inputs. The effect of libration point orbit dynamics on this algorithm architecture is explored via computation of STMs using the flight-proven code, a monodromy matrix developed from an N-body model of a libration orbit, and a standard STM developed from the gravitational and Coriolis effects as measured at the libration point. A comparison of formation flying Delta-Vs calculated from these methods is made to a standard linear quadratic regulator (LQR) method. The universal 3-D approach is optimal in the sense that it can be accommodated as an open-loop or closed-loop control using only state information.
GNAP (Graphic Normative Analysis Program)
Bowen, Roger W.; Odell, John
1979-01-01
A user-oriented command language is developed to provide direct control over the computation and output of the standard CIPW norm. A user-supplied input format for the oxide values may be given or a standard CIPW Rock Analysis format may be used. Once the oxide values have been read by the computer, these values may be manipulated by the user and the 'norm' recalculated on the basis of the manipulated or 'adjusted' values. Additional output capabilities include tabular listing of computed values, summary listings suitable for publication, x-y plots, and ternary diagrams. As many as 20 rock analysis cards may be processed as a group. Any number of such groups may be processed in any one computer run.
NASA Technical Reports Server (NTRS)
Hildreth, Bruce L.; Jackson, E. Bruce
2009-01-01
The American Institute of Aeronautics and Astronautics (AIAA) Modeling and Simulation Technical Committee is in final preparation of a new standard for the exchange of flight dynamics models. The standard will become an ANSI standard and is under consideration for submission to ISO for acceptance by the international community. The standard has some aspects that should provide benefits to the simulation training community. Use of the new standard by the training simulation community will reduce development, maintenance and technical refresh investment on each device. Furthermore, it will significantly lower the cost of performing model updates to improve fidelity or expand the envelope of the training device. Higher flight fidelity should result in better transfer of training, a direct benefit to the pilots under instruction. Costs of adopting the standard are minimal and should be paid back within the cost of the first use for that training device. The standard achieves these advantages by making it easier to update the aerodynamic model. It provides a standard format for the model in a custom eXtensible Markup Language (XML) grammar, the Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML). It employs an existing XML grammar, MathML, to describe the aerodynamic model in an input data file, eliminating the requirement for actual software compilation. The major components of the aero model become simply an input data file, and updates are simply new XML input files. It includes naming and axis system conventions to further simplify the exchange of information.
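Because the model is pure data, a simulation can evaluate it without recompiling anything. The sketch below makes that point with a deliberately simplified XML grammar (a gain applied to an input variable); this grammar is an illustrative stand-in and not actual DAVE-ML or MathML markup.

```python
# Sketch of treating an aero model as data: the XML below is a simplified,
# made-up grammar (not DAVE-ML); the interpreter evaluates it directly.
import xml.etree.ElementTree as ET

model = ET.fromstring("""
<aeroModel>
  <variable name="alpha"/>
  <output name="CL" gain="5.7" input="alpha"/>
</aeroModel>""")

def evaluate(model, inputs):
    out = {}
    for o in model.findall("output"):
        out[o.get("name")] = float(o.get("gain")) * inputs[o.get("input")]
    return out

print(evaluate(model, {"alpha": 0.05}))   # {'CL': 0.285}
```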
FMC: a one-liner Python program to manage, classify and plot focal mechanisms
NASA Astrophysics Data System (ADS)
Álvarez-Gómez, José A.
2014-05-01
The analysis of earthquake focal mechanisms (or the Seismic Moment Tensor, SMT) is a key tool in seismotectonics research. Each focal mechanism is characterized by several location parameters of the earthquake hypocenter, the earthquake size (magnitude and scalar moment tensor) and some geometrical characteristics of the rupture (nodal plane orientations, SMT components and/or SMT main axes orientations). The aim of FMC is to provide a simple but powerful tool to manage focal mechanism data. The data should be input to the program formatted as one of two of the focal mechanism formatting options of the GMT (Generic Mapping Tools) package (Wessel and Smith, 1998): the Harvard CMT convention and the single nodal plane Aki and Richards (1980) convention. The former is an SMT format that can be downloaded directly from the Global CMT site (http://www.globalcmt.org/), while the latter is the simplest way to describe earthquake rupture data. FMC is programmed in the Python language, which is distributed as Open Source under a GPL-compatible license and therefore can be used to develop Free Software. Python runs on almost any machine and has wide support and presence in any operating system. The program has been conceived with the modularity and versatility of the classical UNIX-like tools. It is called from the command line and can be easily integrated into shell scripts (*NIX systems) or batch files (DOS/Windows systems). Program input and output can be done by means of ASCII files or using standard input (or redirection "<"), standard output (screen or redirection ">") and pipes ("|"). By default FMC will read the input and write the output as a Harvard CMT (psmeca formatted) ASCII file, although other formats can be used. Optionally FMC will produce a classification diagram representing the rupture type of the focal mechanisms processed. In order to obtain a detailed classification of the focal mechanisms, I decided to classify them into a series of fields that include the oblique-slip regimes. This approximation is similar to the Johnston et al. (1994) classification, with 7 classes of earthquakes: 1) Normal; 2) Normal - Strike-slip; 3) Strike-slip - Normal; 4) Strike-slip; 5) Strike-slip - Reverse; 6) Reverse - Strike-slip and 7) Reverse. FMC uses this classification by default in the resulting diagram, based on the Kaverina et al. (1996) projection, which improves the Frohlich and Apperson (1992) ternary diagram.
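A UNIX-style filter of the kind described can be sketched as below: it reads records from standard input and writes a classification to standard output, so it can sit in a shell pipeline. The assumed column order (lon lat depth strike dip rake mag) follows the single-nodal-plane convention but is an assumption here, and the rake-only classification is a crude stand-in for the Kaverina et al. (1996) projection actually used by FMC.

```python
#!/usr/bin/env python3
# Pipeline-friendly focal-mechanism classifier sketch: stdin in, stdout out.
# Column order and the rake-only classification are illustrative assumptions.
import sys

def rupture_class(rake):
    rake = (rake + 180) % 360 - 180          # wrap to [-180, 180)
    if 45 <= rake <= 135:
        return "Reverse"
    if -135 <= rake <= -45:
        return "Normal"
    return "Strike-slip"

for line in sys.stdin:
    if not line.strip() or line.startswith("#"):
        continue
    lon, lat, depth, strike, dip, rake, mag = map(float, line.split()[:7])
    print(f"{lon:.3f} {lat:.3f} {mag:.1f} {rupture_class(rake)}")
```

It would be invoked like any other filter, e.g. `python classify_fm.py < mechanisms.txt > classes.txt` or at the end of a pipe.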
Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki
2016-01-01
The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images are simply digitized as relative density values by utilizing a scanned film digitizer. As a result, the pixel values are completely different from the standardized display system input value of digital imaging and communications in medicine (DICOM), called the presentation value (P-value), which can maintain visual consistency when observing images on displays of different luminance. Therefore, we converted all the images from the JSRT standard digital image database to DICOM format, followed by conversion of the pixel values to P-values using an original program developed by ourselves. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of images is maintained among displays of different luminance.
Sensory and Postural Input in the Occurrence of a Gender Difference in Orienting Liquid Surfaces
ERIC Educational Resources Information Center
Robert, Michele; Longpre, Sophie
2005-01-01
In the water-level task, both spatial skill and physical knowledge contribute to representing the surface of a liquid as horizontal irrespective of the container's tilt. Under the standard visual format of the task, men systematically surpass women at drawing correct water lines in outlines of tilted containers. The present exploratory experiments…
Proposed Computer System for Library Catalog Maintenance. Part II: System Design.
ERIC Educational Resources Information Center
Stein (Theodore) Co., New York, NY.
The logic of the system presented in this report is divided into six parts for computer processing and manipulation. They are: (1) processing of Library of Congress copy, (2) editing of input into standard format, (3) processing of information into and out from the authority files, (4) creation of the catalog records, (5) production of the…
LANDSAT non-US standard catalog, 1 May 1977 - 31 May 1977
NASA Technical Reports Server (NTRS)
1977-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles. Sections 4 and 5 cover LANDSAT-1 and LANDSAT-2 coverage, respectively.
Data Management Rubric for Video Data in Organismal Biology.
Brainerd, Elizabeth L; Blob, Richard W; Hedrick, Tyson L; Creamer, Andrew T; Müller, Ulrike K
2017-07-01
Standards-based data management facilitates data preservation, discoverability, and access for effective data reuse within research groups and across communities of researchers. Data sharing requires community consensus on standards for data management, such as storage and formats for digital data preservation, metadata (i.e., contextual data about the data) that should be recorded and stored, and data access. Video imaging is a valuable tool for measuring time-varying phenotypes in organismal biology, with particular application for research in functional morphology, comparative biomechanics, and animal behavior. The raw data are the videos, but videos alone are not sufficient for scientific analysis. Nearly endless videos of animals can be found on YouTube and elsewhere on the web, but these videos have little value for scientific analysis because essential metadata such as true frame rate, spatial calibration, genus and species, weight, age, etc. of organisms, are generally unknown. We have embarked on a project to build community consensus on video data management and metadata standards for organismal biology research. We collected input from colleagues at early stages, organized an open workshop, "Establishing Standards for Video Data Management," at the Society for Integrative and Comparative Biology meeting in January 2017, and then collected two more rounds of input on revised versions of the standards. The result we present here is a rubric consisting of nine standards for video data management, with three levels within each standard: good, better, and best practices. The nine standards are: (1) data storage; (2) video file formats; (3) metadata linkage; (4) video data and metadata access; (5) contact information and acceptable use; (6) camera settings; (7) organism(s); (8) recording conditions; and (9) subject matter/topic. The first four standards address data preservation and interoperability for sharing, whereas standards 5-9 establish minimum metadata standards for organismal biology video, and suggest additional metadata that may be useful for some studies. This rubric was developed with substantial input from researchers and students, but still should be viewed as a living document that should be further refined and updated as technology and research practices change. The audience for these standards includes researchers, journals, and granting agencies, and also the developers and curators of databases that may contribute to video data sharing efforts. We offer this project as an example of building community consensus for data management, preservation, and sharing standards, which may be useful for future efforts by the organismal biology research community. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology.
NASA Technical Reports Server (NTRS)
West, R. S.
1975-01-01
The system is described as a computer-based system designed to track the status of problems and corrective actions pertinent to space shuttle hardware. The input, processing, output, and performance requirements of the system are presented along with standard display formats and examples. Operational requirements, hardware, requirements, and test requirements are also included.
Automatic Feature Extraction System.
1982-12-01
exploitation. It was used for processing of black and white and multispectral reconnaissance photography and side-looking synthetic aperture radar imagery... the image data and different software modules for image queuing and formatting; the result of the input process will be images in standard AFES file... timely manner. The FFS configuration provides the environment necessary for integrated testing of image processing functions and design and
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-11
... rulemaking. This document asks for responses to a series of questions regarding formats, types of material to... the types of material that could potentially be moved from the MUTCD to the Applications Supplement... seeks input on the type of material to be included in the MUTCD and the Applications Supplement, as well...
SutraPrep, a pre-processor for SUTRA, a model for ground-water flow with solute or energy transport
Provost, Alden M.
2002-01-01
SutraPrep facilitates the creation of three-dimensional (3D) input datasets for the USGS ground-water flow and transport model SUTRA Version 2D3D.1. It is most useful for applications in which the geometry of the 3D model domain and the spatial distribution of physical properties and boundary conditions are relatively simple. SutraPrep can be used to create a SUTRA main input (".inp") file, an initial conditions (".ics") file, and a 3D plot of the finite-element mesh in Virtual Reality Modeling Language (VRML) format. Input and output are text-based. The code can be run on any platform that has a standard FORTRAN-90 compiler. Executable code is available for Microsoft Windows.
On star formation in stellar systems. I - Photoionization effects in protoglobular clusters
NASA Technical Reports Server (NTRS)
Tenorio-Tagle, G.; Bodenheimer, P.; Lin, D. N. C.; Noriega-Crespo, A.
1986-01-01
The progressive ionization and subsequent dynamical evolution of nonhomogeneously distributed low-metal-abundance diffuse gas after star formation in globular clusters are investigated analytically, taking the gravitational acceleration due to the stars into account. The basic equations are derived; the underlying assumptions, input parameters, and solution methods are explained; and numerical results for three standard cases (ionization during star formation, ionization during expansion, and evolution resulting in a stable H II region at its equilibrium Stromgren radius) are presented in graphs and characterized in detail. The time scale of residual-gas loss in typical clusters is found to be about the same as the lifetime of a massive star on the main sequence.
iSpy: a powerful and lightweight event display
NASA Astrophysics Data System (ADS)
Alverson, G.; Eulisse, G.; McCauley, T.; Taylor, L.
2012-12-01
iSpy is a general-purpose event data and detector visualization program that was developed as an event display for the CMS experiment at the LHC and has seen use by the general public and teachers and students in the context of education and outreach. Central to the iSpy design philosophy is ease of installation, use, and extensibility. The application itself uses the open-access packages Qt4 and Open Inventor and is distributed either as a fully-bound executable or a standard installer package: one can simply download and double-click to begin. Mac OSX, Linux, and Windows are supported. iSpy renders the standard 2D, 3D, and tabular views, and the architecture allows for a generic approach to production of new views and projections. iSpy reads and displays data in the ig format: event information is written in compressed JSON format files designed for distribution over a network. This format is easily extensible and makes the iSpy client indifferent to the original input data source. The ig format is the one used for release of approved CMS data to the public.
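Since the ig container is described as compressed JSON designed for distribution over a network, a minimal reader needs only the standard library. In the Python sketch below, the assumption that an ig file is a ZIP archive of per-event JSON members, and the member-name filter, are illustrative guesses rather than the published ig schema.

```python
# Sketch of reading an ig-style container: a compressed archive of JSON event
# documents. Member names and JSON keys are assumptions, not the real schema.
import json, zipfile

def iter_events(ig_path):
    with zipfile.ZipFile(ig_path) as zf:
        for name in zf.namelist():
            if name.endswith(".json") or "/Event" in name:
                yield name, json.loads(zf.read(name))

# for name, event in iter_events("run123.ig"):
#     print(name, list(event))
```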
Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development
1986-10-01
parameter, sample size and fa- tigue test duration. The required input are 1. Residual strength Weibull shape parameter ( ALPR ) 2. Fatigue life Weibull shape...INPUT STRENGTH ALPHA’) READ(*,*) ALPR ALPRI = 1.O/ ALPR WRITE(*, 2) 2 FORMAT( 2X, ’PLEASE INPUT LIFE ALPHA’) READ(*,*) ALPL ALPLI - 1.0/ALPL WRITE(*, 3...3 FORMAT(2X,’PLEASE INPUT SAMPLE SIZE’) READ(*,*) N AN - N WRITE(*,4) 4 FORMAT(2X,’PLEASE INPUT TEST DURATION’) READ(*,*) T RALP - ALPL/ ALPR ARGR - 1
Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.
Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars
2015-07-15
Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
An object oriented fully 3D tomography visual toolkit.
Agostinelli, S; Paoli, G
2001-04-01
In this paper we present a modern object oriented Component Object Model (COM) C++ toolkit dedicated to fully 3D cone-beam tomography. The toolkit allows the display and visual manipulation of analytical phantoms, projection sets and volumetric data through a standard Windows graphical user interface. Data input/output is performed using proprietary file formats, but import/export of industry standard file formats, including raw binary, Windows bitmap and AVI, ACR/NEMA DICOM 3 and NCSA HDF, is available. At the time of writing, built-in implemented data manipulators include a basic phantom ray-tracer and a Matrox Genesis frame grabbing facility. A COM plug-in interface is provided for user-defined custom backprojector algorithms: a simple Feldkamp ActiveX control, including source code, is provided as an example; our fast Feldkamp plug-in is also available.
Data compression/error correction digital test system. Appendix 2: Theory of operation
NASA Technical Reports Server (NTRS)
1972-01-01
An overall block diagram of the DC/EC digital system test is shown. The system is divided into two major units: the transmitter and the receiver. In operation, the transmitter and receiver are connected only by a real or simulated transmission link. The system inputs consist of: (1) standard format TV video, (2) two channels of analog voice, and (3) one serial PCM bit stream.
MTpy: A Python toolbox for magnetotellurics
Krieger, Lars; Peacock, Jared R.
2014-01-01
In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.
Development a computer codes to couple PWR-GALE output and PC-CREAM input
NASA Astrophysics Data System (ADS)
Kuntjoro, S.; Budi Setiawan, M.; Nursinta Adi, W.; Deswandri; Sunaryo, G. R.
2018-02-01
Radionuclide dispersion analysis is an important part of reactor safety analysis. From this analysis, the doses received by radiation workers and by the communities around a nuclear reactor can be obtained. The radionuclide dispersion analysis under normal operating conditions is carried out using the PC-CREAM code, which requires input data such as the source term and the population distribution. These input data are derived from the output of another program, PWR-GALE, and from population distribution data written in a specific format. Compiling inputs for the PC-CREAM program manually requires high accuracy, as it involves large amounts of data in specific formats, and manual preparation often introduces errors. To minimize such errors, we developed a coupling program between PWR-GALE and PC-CREAM and a program that writes the population distribution according to the PC-CREAM input format. This work was therefore conducted to couple the PWR-GALE output to the PC-CREAM input and to write the population data in the required formats. Programming was done using the Python language, which has the advantages of being multiplatform, object-oriented and interactive. The result of this work is software for coupling the source term data and for writing the population distribution data, so that inputs to the PC-CREAM program can be prepared easily and formatting errors are avoided. The source-term coupling between PWR-GALE and PC-CREAM is complete, so that PC-CREAM inputs for the source term and the population distribution can be generated easily and in the desired format.
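A coupling script of this kind typically just parses the quantity of interest from one code's text output and rewrites it in the fixed layout the next code expects. In the Python sketch below, both the PWR-GALE output layout and the PC-CREAM population-grid format are placeholders invented for illustration, not the real file formats.

```python
# Sketch of output-to-input coupling: parse a source term from a text report
# and write a population grid in a fixed column layout. Both layouts are
# made-up placeholders, not the real PWR-GALE or PC-CREAM formats.
def read_source_term(gale_output):
    """Return {nuclide: release rate} from lines like 'KR-85   1.2E+02'."""
    terms = {}
    with open(gale_output) as f:
        for line in f:
            fields = line.split()
            if len(fields) == 2:
                terms[fields[0]] = float(fields[1])
    return terms

def write_population(grid, path):
    """Write a (sector, distance-band) population grid as fixed-width columns."""
    with open(path, "w") as f:
        for sector, counts in grid.items():
            f.write(f"{sector:<4}" + "".join(f"{c:>10d}" for c in counts) + "\n")

# Example call with made-up numbers.
write_population({"N": [120, 450, 3200], "NE": [80, 610, 2100]}, "population.dat")
```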
Factors leading to different viability predictions for a grizzly bear data set
Mills, L.S.; Hayes, S.G.; Wisdom, M.J.; Citta, J.; Mattson, D.J.; Murphy, K.
1996-01-01
Population viability analysis programs are being used increasingly in research and management applications, but there has not been a systematic study of the congruence of different program predictions based on a single data set. We performed such an analysis using four population viability analysis computer programs: GAPPS, INMAT, RAMAS/AGE, and VORTEX. The standardized demographic rates used in all programs were generalized from hypothetical increasing and decreasing grizzly bear (Ursus arctos horribilis) populations. Idiosyncrasies of input format for each program led to minor differences in intrinsic growth rates that translated into striking differences in estimates of extinction rates and expected population size. In contrast, the addition of demographic stochasticity, environmental stochasticity, and inbreeding costs caused only a small divergence in viability predictions. But, the addition of density dependence caused large deviations between the programs despite our best attempts to use the same density-dependent functions. Population viability programs differ in how density dependence is incorporated, and the necessary functions are difficult to parameterize accurately. Thus, we recommend that unless data clearly suggest a particular density-dependent model, predictions based on population viability analysis should include at least one scenario without density dependence. Further, we describe output metrics that may differ between programs; development of future software could benefit from standardized input and output formats across different programs.
Geiss, Karla; Meyer, Martin
2013-09-01
Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population to that in the general population on an age- and time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on calculation of exact person-years at risk stratified by sex, age and calendar time. The program offers flexible import of different file formats for input data and easy handling of general population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
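The underlying computation is compact: the expected count is accumulated per sex-age-calendar-time stratum as person-years at risk times the general-population reference rate, and the SMR (or SIR) is the observed count divided by that expectation. The Python sketch below uses made-up strata and rates purely for illustration.

```python
# SMR = observed / expected, with expected = sum over strata of
# person-years * reference rate. Strata and rates below are illustrative.
def smr(observed, person_years, reference_rates):
    expected = sum(person_years[stratum] * reference_rates[stratum]
                   for stratum in person_years)
    return observed / expected

person_years    = {("M", "60-64", "2000-04"): 1500.0,
                   ("M", "65-69", "2000-04"):  900.0}
reference_rates = {("M", "60-64", "2000-04"): 0.010,   # deaths per person-year
                   ("M", "65-69", "2000-04"): 0.018}
print(round(smr(observed=40, person_years=person_years,
                reference_rates=reference_rates), 2))   # 40 / 31.2 = 1.28
```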
F77NNS - A FORTRAN-77 NEURAL NETWORK SIMULATOR
NASA Technical Reports Server (NTRS)
Mitchell, P. H.
1994-01-01
F77NNS (A FORTRAN-77 Neural Network Simulator) simulates the popular back error propagation neural network. F77NNS is an ANSI-77 FORTRAN program designed to take advantage of vectorization when run on machines having this capability, but it will run on any computer with an ANSI-77 FORTRAN Compiler. Artificial neural networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to biological nerve cells. Problems which involve pattern matching or system modeling readily fit the class of problems which F77NNS is designed to solve. The program's formulation trains a neural network using Rumelhart's back-propagation algorithm. Typically the nodes of a network are grouped together into clumps called layers. A network will generally have an input layer through which the various environmental stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to features of the problem being solved. Other layers, which form intermediate steps between the input and output layers, are called hidden layers. The back-propagation training algorithm can require massive computational resources to implement a large network such as a network capable of learning text-to-phoneme pronunciation rules as in the famous Sejnowski experiment. The Sejnowski neural network learns to pronounce 1000 common English words. The standard input data defines the specific inputs that control the type of run to be made, and input files define the NN in terms of the layers and nodes, as well as the input/output (I/O) pairs. The program has a restart capability so that a neural network can be solved in stages suitable to the user's resources and desires. F77NNS allows the user to customize the patterns of connections between layers of a network. The size of the neural network to be solved is limited only by the amount of random access memory (RAM) available to the user. The program has a memory requirement of about 900K. The standard distribution medium for this package is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. F77NNS was developed in 1989.
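F77NNS itself is ANSI FORTRAN-77 and is driven by its own input files; the NumPy sketch below only illustrates the back-propagation update that such a simulator performs, on a one-hidden-layer sigmoid network learning XOR. The network size, learning rate, and iteration count are arbitrary choices for the example.

```python
# Minimal back-propagation sketch (one hidden layer, squared-error loss).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):
    H = sig(X @ W1 + b1)                  # forward pass
    Y = sig(H @ W2 + b2)
    dY = (Y - T) * Y * (1 - Y)            # backward pass
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= 0.5 * H.T @ dY; b2 -= 0.5 * dY.sum(0)   # gradient-descent updates
    W1 -= 0.5 * X.T @ dH; b1 -= 0.5 * dH.sum(0)

print(Y.round(2).ravel())                 # typically approaches [0, 1, 1, 0]
```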
BioXSD: the common data-exchange format for everyday bioinformatics web services.
Kalas, Matús; Puntervoll, Pål; Joseph, Alexandre; Bartaseviciūte, Edita; Töpfer, Armin; Venkataraman, Prabakar; Pettifer, Steve; Bryne, Jan Christian; Ison, Jon; Blanchet, Christophe; Rapacki, Kristoffer; Jonassen, Inge
2010-09-15
The world-wide community of life scientists has access to a large number of public bioinformatics databases and tools, which are developed and deployed using diverse technologies and designs. More and more of the resources offer programmatic web-service interfaces. However, efficient use of the resources is hampered by the lack of widely used, standard data-exchange formats for the basic, everyday bioinformatics data types. BioXSD has been developed as a candidate for a standard, canonical exchange format for basic bioinformatics data. BioXSD is represented by a dedicated XML Schema and defines syntax for biological sequences, sequence annotations, alignments and references to resources. We have adapted a set of web services to use BioXSD as the input and output format, and implemented a test-case workflow. This demonstrates that the approach is feasible and provides smooth interoperability. Semantics for BioXSD is provided by annotation with the EDAM ontology. We discuss in a separate section how BioXSD relates to other initiatives and approaches, including existing standards and the Semantic Web. The BioXSD 1.0 XML Schema is freely available at http://www.bioxsd.org/BioXSD-1.0.xsd under the Creative Commons BY-ND 3.0 license. The http://bioxsd.org web page offers documentation, examples of data in BioXSD format, example workflows with source codes in common programming languages, an updated list of compatible web services and tools and a repository of feature requests from the community.
Analysis of view synthesis prediction architectures in modern coding standards
NASA Astrophysics Data System (ADS)
Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang
2013-09-01
Depth-based 3D formats are currently being developed as extensions to both AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is hence concluded that block-based VSP prediction for multiview video signals provides attractive coding gains with complexity comparable to traditional motion/disparity compensation.
The Design and Usage of the New Data Management Features in NASTRAN
NASA Technical Reports Server (NTRS)
Pamidi, P. R.; Brown, W. K.
1984-01-01
Two new data management features are installed in the April 1984 release of NASTRAN. These two features are the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card image format and can be easily maintained and expanded by the use of standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means for making changes to a Rigid Format or for generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.
xiSPEC: web-based visualization, analysis and sharing of proteomics data.
Kolbowski, Lars; Combe, Colin; Rappsilber, Juri
2018-05-08
We present xiSPEC, a standard compliant, next-generation web-based spectrum viewer for visualizing, analyzing and sharing mass spectrometry data. Peptide-spectrum matches from standard proteomics and cross-linking experiments are supported. xiSPEC is to date the only browser-based tool supporting the standardized file formats mzML and mzIdentML defined by the proteomics standards initiative. Users can either upload data directly or select files from the PRIDE data repository as input. xiSPEC allows users to save and share their datasets publicly or password protected for providing access to collaborators or readers and reviewers of manuscripts. The identification table features advanced interaction controls and spectra are presented in three interconnected views: (i) annotated mass spectrum, (ii) peptide sequence fragmentation key and (iii) quality control error plots of matched fragments. Highlighting or selecting data points in any view is represented in all other views. Views are interactive scalable vector graphic elements, which can be exported, e.g. for use in publication. xiSPEC allows for re-annotation of spectra for easy hypothesis testing by modifying input data. xiSPEC is freely accessible at http://spectrumviewer.org and the source code is openly available on https://github.com/Rappsilber-Laboratory/xiSPEC.
A Review of Aeromagnetic Anomalies in the Sawatch Range, Central Colorado
Bankey, Viki
2010-01-01
This report contains digital data and image files of aeromagnetic anomalies in the Sawatch Range of central Colorado. The primary product is a data layer of polygons with linked data records that summarize previous interpretations of aeromagnetic anomalies in this region. None of these data files and images are new; rather, they are presented in updated formats that are intended to be used as input to geographic information systems, standard graphics software, or map-plotting packages.
SnopViz, an interactive snow profile visualization tool
NASA Astrophysics Data System (ADS)
Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank
2016-04-01
SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as to offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well, and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser-based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML-based standard for vector graphics, was chosen because of its easy interaction with JS and good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualizations for SNOWPACK timeline output as well as time series input and output. The existing output format for SNOWPACK timelines was retained, while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as a CAAML file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard to exchange snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).
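The plugin architecture described above can be pictured as a registry that dispatches each input file to a format-specific parser. The sketch below is conceptual only: SnopViz itself is written in JavaScript, and the parser names and returned structures here are hypothetical.

```python
# Conceptual sketch of format-plugin dispatch (CAAML, PRO, JSON) as described for SnopViz.
import json
from pathlib import Path

PARSERS = {}

def register(extension):
    def wrap(func):
        PARSERS[extension] = func
        return func
    return wrap

@register(".caaml")
def parse_caaml(path):        # observed snow profile (CAAML)
    return {"type": "profile", "source": path.name}

@register(".pro")
def parse_pro(path):          # SNOWPACK simulation output (PRO)
    return {"type": "timeline", "source": path.name}

@register(".json")
def parse_json(path):         # pre-built JSON objects pass straight through
    return json.loads(path.read_text())

def load(filename):
    path = Path(filename)
    return PARSERS[path.suffix.lower()](path)
```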
NASA Astrophysics Data System (ADS)
Mohlman, H. T.
1983-04-01
The Air Force community noise prediction model (NOISEMAP) is used to describe the aircraft noise exposure around airbases and thereby aid airbase planners to minimize exposure and prevent community encroachment which could limit mission effectiveness of the installation. This report documents two computer programs (OMEGA 10 and OMEGA 11) which were developed to prepare aircraft flight and ground runup noise data for input to NOISEMAP. OMEGA 10 is for flight operations and OMEGA 11 is for aircraft ground runups. All routines in each program are documented at a level useful to a programmer working with the code or a reader interested in a general overview of what happens within a specific subroutine. Both programs input normalized, reference aircraft noise data; i.e., data at a standard reference distance from the aircraft, for several fixed engine power settings, a reference airspeed and standard day meteorological conditions. Both programs operate on these normalized, reference data in accordance with user-defined, non-reference conditions to derive single-event noise data for 22 distances (200 to 25,000 feet) in a variety of physical and psycho-acoustic metrics. These outputs are in formats ready for input to NOISEMAP.
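The core operation described above, taking a noise level measured at a standard reference distance and deriving single-event levels at other distances, can be approximated with simple spreading and absorption terms. The sketch below is a simplified stand-in only; the actual OMEGA 10/11 procedures, metrics and coefficients are more elaborate, and the numbers here are invented.

```python
# Simplified extrapolation of a reference noise level to other distances:
# spherical spreading (inverse-square law) plus a linear atmospheric absorption term.
import math

def level_at_distance(l_ref_db, d_ref_ft, d_ft, absorption_db_per_1000ft=1.0):
    spreading = 20.0 * math.log10(d_ft / d_ref_ft)
    absorption = absorption_db_per_1000ft * (d_ft - d_ref_ft) / 1000.0
    return l_ref_db - spreading - absorption

distances = [200, 500, 1000, 5000, 25000]   # feet, spanning the NOISEMAP input range
print([round(level_at_distance(110.0, 1000.0, d), 1) for d in distances])
```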
PySE: Python Source Extractor for radio astronomical images
NASA Astrophysics Data System (ADS)
Spreeuw, Hanno; Swinbank, John; Molenaar, Gijs; Staley, Tim; Rol, Evert; Sanders, John; Scheers, Bart; Kuiack, Mark
2018-05-01
PySE finds and measures sources in radio telescope images. It is run with several options, such as the detection threshold (a multiple of the local noise), grid size, and the forced clean beam fit, followed by a list of input image files in standard FITS or CASA format. From these, PySE produces a list of detected sources; additional information, such as the calculated background image, source lists in different formats (e.g., text, or region files importable into DS9), and other data, may be saved. PySE can be integrated into a pipeline; it was originally written as part of the LOFAR Transient Detection Pipeline (TraP, ascl:1412.011).
Regestein Née Meissner, Lena; Arndt, Julia; Palmen, Thomas G; Jestel, Tim; Mitsunaga, Hitoshi; Fukusaki, Eiichiro; Büchs, Jochen
2017-01-01
Poly(γ-glutamic acid) (γ-PGA) is a biopolymer with many useful properties, making it applicable, for instance, in the food and skin care industries, in wastewater treatment, in biodegradable plastics or in the pharmaceutical industry. γ-PGA is usually produced microbially by different Bacillus spp. The produced γ-PGA increases the viscosity of the fermentation broth. In shake flask fermentations, this results in an increase in the volumetric power input. The power input in shake flasks can be determined by measuring the torque of an orbitally rotating lab shaker. The online measurement of the volumetric power input enables continuous monitoring of the formation or degradation of viscous products like γ-PGA. Combined with the online measurement of the oxygen transfer rate (OTR), the respiration activity of the organisms can be observed at the same time. Two different Bacillus licheniformis strains and three medium compositions were investigated using online volumetric power input and OTR measurements as well as thorough offline analysis. The online volumetric power input measurement clearly depicted changes in γ-PGA formation due to different medium compositions as well as differences in the production behavior of the two investigated strains. A higher citric acid concentration and the addition of trace elements to the standard medium showed a positive influence on γ-PGA production. The online power input signal was used to derive an online viscosity signal, which was validated against offline-determined viscosity values. The online measurement of the OTR proved to be a valuable tool to follow the respiration activity of the cultivated strains and to determine its reproducibility under different cultivation conditions. The combination of the volumetric power input and the OTR allows for an easy and reliable investigation of new strains, cultivation conditions and medium compositions for their potential in γ-PGA production. The power input signal and the derived online viscosity directly reflect changes in γ-PGA molecular weight and concentration, respectively, due to different cultivation conditions or production strains.
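For orientation, the torque-based power input mentioned above is commonly converted to a volumetric value as mechanical power P = 2·pi·n·M (shaking frequency n, net torque M) divided by the filling volume. The sketch below uses this generic relation with invented numbers; the exact friction correction and conventions used in the study are not reproduced here.

```python
# Volumetric power input from shaker torque: P = 2*pi*n*M, divided by total broth volume.
import math

def volumetric_power_input(torque_newton_m, shaking_freq_per_s, filling_volume_m3):
    power_watt = 2.0 * math.pi * shaking_freq_per_s * torque_newton_m
    return power_watt / filling_volume_m3            # W per m^3 of broth

# e.g. 5 mN*m net torque at 350 rpm, distributed over 8 flasks of 10 mL filling volume each
print(volumetric_power_input(0.005, 350 / 60, 8 * 10e-6) / 1000, "kW/m^3")
```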
Optimal input sizes for neural network de-interlacing
NASA Astrophysics Data System (ADS)
Choi, Hyunsoo; Seo, Guiwon; Lee, Chulhee
2009-02-01
Neural network de-interlacing has shown promising results among various de-interlacing methods. In this paper, we investigate the effects of input size for neural networks for various video formats when the neural networks are used for de-interlacing. In particular, we investigate optimal input sizes for CIF, VGA and HD video formats.
Reichmann, Thomas L.; Richter, Klaus W.; Delsante, Simona; Borzone, Gabriella; Ipser, Herbert
2014-01-01
In the present study, standard enthalpies of formation were measured by reaction and solution calorimetry at stoichiometric compositions of Cd2Pr, Cd3Pr, Cd58Pr13 and Cd6Pr. The corresponding values were determined to be −46.0, −38.8, −35.2 and −24.7 kJ/mol(at), respectively. These data together with thermodynamic data and phase diagram information from the literature served as input data for a CALPHAD-type optimization of the Cd–Pr phase diagram. The complete composition range could be described precisely with the present models, with respect to both phase equilibria and thermodynamic input data. The thermodynamic parameters of all intermetallic compounds were modelled following the Neumann–Kopp rule. Temperature-dependent contributions to the individual Gibbs energies were used for all compounds. Extended solid solubilities are well described for the low- and high-temperature modifications of Pr and also for the intermetallic compound CdPr. Quite good agreement with all viable data available from the literature was found and is presented. PMID:25540475
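For reference, the Neumann–Kopp description mentioned above amounts to the following schematic form for a stoichiometric compound, where the reference-state Gibbs energies are those of pure Cd and Pr and the coefficients a and b stand for the values fitted in the optimization (not quoted here):

$$ G^{\mathrm{Cd}_x\mathrm{Pr}_y}(T) \;=\; x\,{}^{\circ}G_{\mathrm{Cd}}(T) \;+\; y\,{}^{\circ}G_{\mathrm{Pr}}(T) \;+\; a \;+\; b\,T $$

With this form the heat capacity of the compound equals the weighted sum of the elemental heat capacities, and a corresponds to a temperature-independent enthalpy of formation in the model.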
Oil Formation Volume Factor Determination Through a Fused Intelligence
NASA Astrophysics Data System (ADS)
Gholami, Amin
2016-12-01
Volume change of oil between reservoir conditions and standard surface conditions is called the oil formation volume factor (FVF), which is time-, cost- and labor-intensive to determine. This study proposes an accurate, rapid and cost-effective approach for determining FVF from reservoir temperature, dissolved gas oil ratio, and specific gravity of both oil and dissolved gas. First, the structural risk minimization (SRM) principle of support vector regression (SVR) was employed to construct a robust model for estimating FVF from the aforementioned inputs. Subsequently, alternating conditional expectation (ACE) was used to approximate optimal transformations of the input/output data into more highly correlated data and consequently to develop a sophisticated model between the transformed data. Finally, a committee machine with SVR and ACE was constructed through the use of a hybrid genetic algorithm-pattern search (GA-PS). The committee machine integrates the ACE and SVR models in an optimal linear combination that benefits from both methods. A group of 342 data points was used for model development and a group of 219 data points was used for blind testing the constructed model. Results indicated that the committee machine performed better than the individual models.
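The committee-machine idea, combining the two predictors in an optimal linear combination, can be sketched as follows. An ordinary least-squares fit stands in for the hybrid GA-PS optimizer used in the study, and the numbers are illustrative only.

```python
# Sketch of a committee machine: weighted linear combination of SVR and ACE predictions.
import numpy as np

def fit_committee(pred_svr, pred_ace, fvf_measured):
    # Solve [pred_svr pred_ace 1] @ [w1, w2, b] ~= fvf_measured for the weights.
    design = np.column_stack([pred_svr, pred_ace, np.ones_like(pred_svr)])
    coeffs, *_ = np.linalg.lstsq(design, fvf_measured, rcond=None)
    return coeffs                                   # (w1, w2, bias)

def committee_predict(coeffs, pred_svr, pred_ace):
    w1, w2, b = coeffs
    return w1 * pred_svr + w2 * pred_ace + b

# Illustrative values only (reservoir-bbl per stock-tank-bbl).
svr = np.array([1.21, 1.35, 1.48]); ace = np.array([1.19, 1.37, 1.50])
true = np.array([1.20, 1.36, 1.49])
print(committee_predict(fit_committee(svr, ace, true), svr, ace))
```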
NASA Astrophysics Data System (ADS)
Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.
2018-04-01
This paper presents a new standardized data format named Fire Markup Language (FireML), extended from the Geography Markup Language (GML) of the OGC, to describe fire hazard models. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and test its feasibility, a forest fire spread model adapted to be compatible with FireML is described, and a 3D GIS disaster management system was developed to simulate the dynamic procedure of forest fire spread with the defined FireML documents. The proposed approach is intended to inform the standardization of other disaster models.
Naval Open Architecture Contract Guidebook for Program Managers
2010-06-30
a whole, transform inputs into outputs. [IEEE/EIA Std. 12207/1997] “AP233/ISO 10303” – AP233, an “Application Protocol” for Systems Engineering...Language Metadata Interchange (XMI) and AP233/ISO 10303). The contractor shall identify the proposed standards and formats to be used. The contractor...ANSI ISO/IEC 9075-1, ISO/IEC 9075-2, ISO/IEC 9075-3, ISO/IEC 9075-4, ISO/IEC 9075-5) 2. HTML for presentation layer (e.g., XML 1.0
BioXSD: the common data-exchange format for everyday bioinformatics web services
Kalaš, Matúš; Puntervoll, Pål; Joseph, Alexandre; Bartaševičiūtė, Edita; Töpfer, Armin; Venkataraman, Prabakar; Pettifer, Steve; Bryne, Jan Christian; Ison, Jon; Blanchet, Christophe; Rapacki, Kristoffer; Jonassen, Inge
2010-01-01
Motivation: The world-wide community of life scientists has access to a large number of public bioinformatics databases and tools, which are developed and deployed using diverse technologies and designs. More and more of the resources offer programmatic web-service interfaces. However, efficient use of the resources is hampered by the lack of widely used, standard data-exchange formats for the basic, everyday bioinformatics data types. Results: BioXSD has been developed as a candidate for a standard, canonical exchange format for basic bioinformatics data. BioXSD is represented by a dedicated XML Schema and defines syntax for biological sequences, sequence annotations, alignments and references to resources. We have adapted a set of web services to use BioXSD as the input and output format, and implemented a test-case workflow. This demonstrates that the approach is feasible and provides smooth interoperability. Semantics for BioXSD is provided by annotation with the EDAM ontology. We discuss in a separate section how BioXSD relates to other initiatives and approaches, including existing standards and the Semantic Web. Availability: The BioXSD 1.0 XML Schema is freely available at http://www.bioxsd.org/BioXSD-1.0.xsd under the Creative Commons BY-ND 3.0 license. The http://bioxsd.org web page offers documentation, examples of data in BioXSD format, example workflows with source codes in common programming languages, an updated list of compatible web services and tools and a repository of feature requests from the community. Contact: matus.kalas@bccs.uib.no; developers@bioxsd.org; support@bioxsd.org PMID:20823319
Attitude profile design program
NASA Technical Reports Server (NTRS)
1991-01-01
The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high level manner. The program then takes these high level commands and executes the maneuvers, outputting the desired information.
NASA Astrophysics Data System (ADS)
Foster, K.
1994-09-01
This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.
ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs
2011-01-01
Background: Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results: ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions: ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938
Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur
2018-05-01
Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new r package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformation to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis and conducts standard F-statistics, as well as basic phylogenetic analyses, population assignment, isolation by distance and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the r environment. The package serves two main purposes: first, a user-friendly approach to lower the hurdle to analyse such data; therefore, the package comes with a detailed tutorial targeted to the r beginner to allow data analysis without requiring deep knowledge of r. Second, we use a single, well-established format (genlight from the adegenet package) as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to facilitate this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the r CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
40 CFR 60.252 - Standards for thermal dryers.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) heat input. (iii) Thermal dryers that receive all of their thermal input from a source other than coal... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Coal Preparation.../MMBtu) heat input; or (ii) The owner or operator must not cause to be discharged into the atmosphere...
1980-02-08
hours 0 Input Format: Integer b. Creating Resource Allocation Blocks The creation of a specific resource allocation block as a directive component is...is directed. 0 Range: N/A. Input Format: INT/NUC/CHM b. Creating Employment Packages An employment package block has the structure portrayed in Figure
Application of the unified mask data format based on OASIS for VSB EB writers
NASA Astrophysics Data System (ADS)
Suzuki, Toshio; Hirumi, Junji; Suga, Osamu
2005-11-01
Mask data preparation (MDP) for modern mask manufacturing has become a complex process because many kinds of EB data formats are used by mask makers and EB data files continue to grow with the application of RET. Therefore we developed a unified mask pattern data format named "OASIS.VSB" and a job deck format named "MALY" for Variable-Shaped-Beam (VSB) EB writers. OASIS.VSB is a mask pattern data format based on OASIS (Open Artwork System Interchange Standard), released by SEMI as the successor format to GDSII. We defined restrictions on OASIS for VSB EB writers so that OASIS.VSB data can be input directly to VSB EB writers just like the native EB data. The OASIS.VSB specification and the MALY specification have been disclosed to the public and will become SEMI standards in the near future. We have started activities to promote the adoption of OASIS.VSB and MALY. For practical use of OASIS.VSB and MALY, we are discussing the infrastructure system of MDP processing using OASIS.VSB and MALY with mask makers, VSB EB writer makers, and device makers. We are also discussing the tools for the infrastructure system with EDA vendors. The infrastructure system will reduce the turnaround time, man-hours, and cost of MDP. In this paper, we propose the plan of the infrastructure system of MDP processing using OASIS.VSB and MALY as an application of OASIS.VSB and MALY.
Albar, Juan Pablo; Binz, Pierre-Alain; Eisenacher, Martin; Jones, Andrew R; Mayer, Gerhard; Omenn, Gilbert S; Orchard, Sandra; Vizcaíno, Juan Antonio; Hermjakob, Henning
2015-01-01
Objective: To describe the goals of the Proteomics Standards Initiative (PSI) of the Human Proteome Organization, the methods that the PSI has employed to create data standards, the resulting output of the PSI, lessons learned from the PSI's evolution, and future directions and synergies for the group. Materials and Methods: The PSI has 5 categories of deliverables that have guided the group. These are minimum information guidelines, data formats, controlled vocabularies, resources and software tools, and dissemination activities. These deliverables are produced via the leadership and working group organization of the initiative, driven by frequent workshops and ongoing communication within the working groups. Official standards are subjected to a rigorous document process that includes several levels of peer review prior to release. Results: We have produced and published minimum information guidelines describing what information should be provided when making data public, either via public repositories or other means. The PSI has produced a series of standard formats covering mass spectrometer input, mass spectrometer output, results of informatics analysis (both qualitative and quantitative analyses), reports of molecular interaction data, and gel electrophoresis analyses. We have produced controlled vocabularies that ensure that concepts are uniformly annotated in the formats and engaged in extensive software development and dissemination efforts so that the standards can efficiently be used by the community. Conclusion: In its first dozen years of operation, the PSI has produced many standards that have accelerated the field of proteomics by facilitating data exchange and deposition to data repositories. We look to the future to continue developing standards for new proteomics technologies and workflows and mechanisms for integration with other omics data types. Our products facilitate the translation of genomics and proteomics findings to clinical and biological phenotypes. The PSI website can be accessed at http://www.psidev.info. PMID:25726569
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (HP9000/7XX VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements will enable the engineer to give an exact description of an orbit; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach for the design and planning of orbital trajectories and maneuver planning. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second degree central body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements will enable the engineer to give an exact description of an orbit; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach for the design and planning of orbital trajectories and maneuver planning. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second degree central body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine
Liu, Xiaobing
2016-09-21
This calculation engine estimates technoeconomic feasibility for transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files are in the same folder as geotool.exe. To use the tool, an input file containing adequate information about the case should be prepared in the format explained below and placed in the same folder as geotool.exe. Then geotool.exe can be executed, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
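The workflow described above (write input.txt beside the executable, run it, read output.txt) could be scripted as in the sketch below. The install path and the placeholder input contents are assumptions; the real field layout of input.txt is defined in the tool's documentation, not here.

```python
# Sketch of driving the TGE screening tool: prepare input.txt, run geotool.exe, read output.txt.
import subprocess
from pathlib import Path

tool_dir = Path(r"C:\tge_screening")            # assumed install location
(tool_dir / "input.txt").write_text("...case parameters in the documented format...\n")
subprocess.run([str(tool_dir / "geotool.exe")], cwd=tool_dir, check=True)
print((tool_dir / "output.txt").read_text())    # key calculation results
```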
Miller, Derek M; DeMayo, William M; Bourdages, George H; Wittman, Samuel R; Yates, Bill J; McCall, Andrew A
2017-04-01
The integration of inputs from vestibular and proprioceptive sensors within the central nervous system is critical to postural regulation. We recently demonstrated in both decerebrate and conscious cats that labyrinthine and hindlimb inputs converge onto vestibular nucleus neurons. The pontomedullary reticular formation (pmRF) also plays a key role in postural control, and additionally participates in regulating locomotion. Thus, we hypothesized that like vestibular nucleus neurons, pmRF neurons integrate inputs from the limb and labyrinth. To test this hypothesis, we recorded the responses of pmRF neurons to passive ramp-and-hold movements of the hindlimb and to whole-body tilts, in both decerebrate and conscious felines. We found that pmRF neuronal activity was modulated by hindlimb movement in the rostral-caudal plane. Most neurons in both decerebrate (83% of units) and conscious (61% of units) animals encoded both flexion and extension movements of the hindlimb. In addition, hindlimb somatosensory inputs converged with vestibular inputs onto pmRF neurons in both preparations. Pontomedullary reticular formation neurons receiving convergent vestibular and limb inputs likely participate in balance control by governing reticulospinal outflow.
Miller, Derek M.; DeMayo, William M.; Bourdages, George H.; Wittman, Samuel; Yates, Bill J.; McCall, Andrew A.
2017-01-01
The integration of inputs from vestibular and proprioceptive sensors within the central nervous system is critical to postural regulation. We recently demonstrated in both decerebrate and conscious cats that labyrinthine and hindlimb inputs converge onto vestibular nucleus neurons. The pontomedullary reticular formation (pmRF) also plays a key role in postural control, and additionally participates in regulating locomotion. Thus, we hypothesized that like vestibular nucleus neurons, pmRF neurons integrate inputs from the limb and labyrinth. To test this hypothesis, we recorded the responses of pmRF neurons to passive ramp-and-hold movements of the hindlimb and to whole-body tilts, in both decerebrate and conscious felines. We found that pmRF neuronal activity was modulated by hindlimb movement in the rostral-caudal plane. Most neurons in both decerebrate (83% of units) and conscious (61% of units) animals encoded both flexion and extension movements of the hindlimb. Additionally, hindlimb somatosensory inputs converged with vestibular inputs onto pmRF neurons in both preparations. Pontomedullary reticular formation neurons receiving convergent vestibular and limb inputs likely participate in balance control by governing reticulospinal outflow. PMID:28188328
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The user's manual for the rocket combustor interactive design (ROCCID) computer program is presented. The program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can easily be added. The analysis model in ROCCID can account for the influence of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
Extra dimensions: 3D in PDF documentation
Graf, Norman A.
2011-01-11
Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. In this article, we present examples drawn from high-energy physics, mathematics and molecular biology which take advantage of this functionality. Furthermore, we demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.
Extra dimensions: 3D and time in PDF documentation
NASA Astrophysics Data System (ADS)
Graf, N. A.
2011-01-01
Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. In this article, we present examples drawn from high-energy physics, mathematics and molecular biology which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.
Extra Dimensions: 3D and Time in PDF Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, N.A.; /SLAC
2012-04-11
Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. In this article, we present examples drawn from high-energy physics, mathematics and molecular biology which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.
NASA Technical Reports Server (NTRS)
Gibson, S. G.
1983-01-01
A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.
Cloud Intrusion Detection and Repair (CIDAR)
2016-02-01
form for VLC, Swftools-png2swf, Swftools-jpeg2swf, Dillo and GIMP. The superscript indicates the bit width of each expression atom. “sext(v, w... challenges in input rectification is the need to deal with nested fields. In general, input formats are in tree structures containing arbitrarily...length indicator constraints is challenging, because of the presence of nested fields in hierarchical input format. For example, an integer field may
WT - WIND TUNNEL PERFORMANCE ANALYSIS
NASA Technical Reports Server (NTRS)
Viterna, L. A.
1994-01-01
WT was developed to calculate fan rotor power requirements and output thrust for a closed loop wind tunnel. The program uses blade element theory to calculate aerodynamic forces along the blade using airfoil lift and drag characteristics at an appropriate blade aspect ratio. A tip loss model is also used, which reduces the lift coefficient to zero for the outer three percent of the blade radius. Momentum theory is not used to determine the axial velocity at the rotor plane. Unlike a propeller, the wind tunnel rotor is prevented from producing an increase in velocity in the slipstream. Instead, velocities at the rotor plane are used as input. Other input for WT includes rotational speed, rotor geometry, and airfoil characteristics. Inputs for rotor blade geometry include blade radius, hub radius, number of blades, and pitch angle. Airfoil aerodynamic inputs include angle at zero lift coefficient, positive stall angle, drag coefficient at zero lift coefficient, and drag coefficient at stall. WT is written in APL2 using IBM's APL2 interpreter for IBM PC series and compatible computers running MS-DOS. WT requires a CGA or better color monitor for display. It also requires 640K of RAM and MS-DOS v3.1 or later for execution. Both an MS-DOS executable and the source code are provided on the distribution medium. The standard distribution medium for WT is a 5.25 inch 360K MS-DOS format diskette in PKZIP format. The utility to unarchive the files, PKUNZIP, is also included. WT was developed in 1991. APL2 and IBM PC are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation. PKUNZIP is a registered trademark of PKWare, Inc.
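The blade-element approach described above can be sketched as a simple integration along the blade, with lift zeroed over the outer three percent of the radius as the tip-loss model. This is a minimal Python illustration, not the APL2 implementation; the airfoil polars, geometry and operating point below are invented.

```python
# Minimal blade-element integration with the described tip-loss model.
import math

def rotor_thrust_power(radius, hub_radius, n_blades, chord, pitch_deg,
                       omega, v_axial, cl_of_alpha, cd_of_alpha, rho=1.225, n_el=50):
    thrust = torque = 0.0
    dr = (radius - hub_radius) / n_el
    for i in range(n_el):
        r = hub_radius + (i + 0.5) * dr
        phi = math.atan2(v_axial, omega * r)                    # local inflow angle
        alpha = math.radians(pitch_deg) - phi                   # local angle of attack
        w2 = v_axial**2 + (omega * r)**2                        # relative speed squared
        cl = 0.0 if r > 0.97 * radius else cl_of_alpha(alpha)   # tip loss: outer 3% of radius
        cd = cd_of_alpha(alpha)
        lift = 0.5 * rho * w2 * chord * cl * dr
        drag = 0.5 * rho * w2 * chord * cd * dr
        thrust += n_blades * (lift * math.cos(phi) - drag * math.sin(phi))
        torque += n_blades * (lift * math.sin(phi) + drag * math.cos(phi)) * r
    return thrust, torque * omega                               # thrust [N], shaft power [W]

cl = lambda a: 2 * math.pi * a                                  # thin-airfoil stand-in polar
cd = lambda a: 0.008 + 0.01 * a * a
print(rotor_thrust_power(2.0, 0.3, 8, 0.25, 20.0, 100.0, 20.0, cl, cd))
```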
An Overview of Flight Test Results for a Formation Flight Autopilot
NASA Technical Reports Server (NTRS)
Hanson, Curtis E.; Ryan, Jack; Allen, Michael J.; Jacobson, Steven R.
2002-01-01
The first flight test phase of the NASA Dryden Flight Research Center Autonomous Formation Flight project has successfully demonstrated precision autonomous station-keeping of an F/A-18 research airplane with a second F/A-18 airplane. Blended inertial navigation system (INS) and global positioning system (GPS) measurements have been communicated across an air-to-air telemetry link and used to compute relative-position estimates. A precision research formation autopilot onboard the trailing airplane controls lateral and vertical spacing while the leading airplane operates under production autopilot control. Four research autopilot gain sets have been designed and flight-tested, and each exceeds the project design requirement of steady-state tracking accuracy within 1 standard deviation of 10 ft. Performance also has been demonstrated using single- and multiple-axis inputs such as step commands and frequency sweeps. This report briefly describes the experimental formation flight systems employed and discusses the navigation, guidance, and control algorithms that have been flight-tested. An overview of the flight test results of the formation autopilot during steady-state tracking and maneuvering flight is presented.
Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys
Couper, Mick P.; Kennedy, Courtney; Conrad, Frederick G.; Tourangeau, Roger
2012-01-01
Web surveys often collect information such as frequencies, currency amounts, dates, or other items requiring short structured answers in an open-ended format, typically using text boxes for input. We report on several experiments exploring design features of such input fields. We find little effect of the size of the input field on whether frequency or dollar amount answers are well-formed or not. By contrast, the use of templates to guide formatting significantly improves the well-formedness of responses to questions eliciting currency amounts. For date questions (whether month/year or month/day/year), we find that separate input fields improve the quality of responses over single input fields, while drop boxes further reduce the proportion of ill-formed answers. Drop boxes also reduce completion time when the list of responses is short (e.g., months), but marginally increase completion time when the list is long (e.g., birth dates). These results suggest that non-narrative open questions can be designed to help guide respondents to provide answers in the desired format. PMID:23411468
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
75 FR 44930 - Stakeholder Input; Revisions to Water Quality Standards Regulation
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... Input; Revisions to Water Quality Standards Regulation AGENCY: Environmental Protection Agency. ACTION... national rulemaking to make a limited set of targeted changes to EPA's water quality standards regulation... rulemaking, and to hear views from the public regarding possible changes to EPA's water quality standards...
Rudine, S.F.; Wardlaw, B.R.; Rohr, D.M.; Grant, R.E.
2000-01-01
The Guadalupian rocks of the northern Del Norte Mountains were deposited in a foreland basin between land of the Marathon orogen and a carbonate shoal established on the geanticline separating the foreland basin from the Delaware basin. Deposition was alternately influenced by coarse clastic input from the orogen and carbonate shoal, which interrupted shallow basinal siltstone deposition. Relatively deeper-water deposition is characterized by carbonate input from the shoal, and relatively shallow-water deposition is characterized by sandstone input from the orogen. Deposition was in five general transgressive-regressive packages that include (1) the Road Canyon Formation and the first siltstone member and first sandstone member of the Word Formation, (2) the second siltstone member, Appel Ranch Member, and limy sandy siltstone member of the Word Formation, (3) the Vidrio Formation, (4) the lower and part of the middle members of the Altuda Formation, and (5) part of the middle and upper members of the Altuda Formation.
Extra dimensions: 3d and time in pdf documentation
NASA Astrophysics Data System (ADS)
Graf, N. A.
2008-07-01
High energy physics is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide audience. In this talk, we present examples of HEP applications which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input. Using this technique, higher dimensional data, such as LEGO plots or time-dependent information can be included in PDF files. In principle, a complete event display, with full interactivity, can be incorporated into a PDF file. This would allow the end user not only to customize the view and representation of the data, but to access the underlying data itself.
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The appendices A-K to the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. This includes installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
FSCATT: Angular Dependence and Filter Options.
The input routines to the code have been completely rewritten to allow for a free-form input format. The input routines now provide self-consistency checks and diagnostics for the user's edification.
Standardization of databases for AMDB taxi routing functions
NASA Astrophysics Data System (ADS)
Pschierer, C.; Sindlinger, A.; Schiefele, J.
2010-04-01
Input, management, and display of taxi routes on airport moving map displays (AMM) have been covered in various studies in the past. The demonstrated applications are typically based on Aerodrome Mapping Databases (AMDB). Taxi routing functions require specific enhancements, typically in the form of a graph network with nodes and edges modeling all connectivities within an airport, which are not supported by the current AMDB standards. Therefore, the data schemas and data content have been defined specifically for the purpose and test scenarios of these studies. A standardization of the data format for taxi routing information is a prerequisite for turning taxi routing functions into production. The joint RTCA/EUROCAE special committee SC-217, responsible for updating and enhancing the AMDB standards DO-272 [1] and DO-291 [2], is currently in the process of studying different alternatives and defining reasonable formats. Requirements for taxi routing data are primarily driven by depiction concepts for assigned and cleared taxi routes, but also by database size and economic feasibility. Studied concepts are similar to the ones described in the GDF (geographic data files) specification [3], which is used in most car navigation systems today. They include (1) a highly aggregated graph network of complex features, (2) a modestly aggregated graph network of simple features, and (3) a non-explicit topology of plain AMDB taxi guidance line elements. This paper introduces the different concepts and their advantages and disadvantages.
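To make the node/edge idea concrete, the toy sketch below models a few taxiway segments as a graph and searches for the shortest taxi route between two nodes. The node names and segment lengths are invented and have no relation to any real AMDB content.

```python
# Toy taxiway graph with a shortest-path (Dijkstra-style) search over segments.
import heapq

edges = {  # node -> [(neighbour, segment length in metres)]
    "GATE_A1": [("TWY_B1", 120)],
    "TWY_B1": [("GATE_A1", 120), ("TWY_B2", 300), ("TWY_C1", 250)],
    "TWY_B2": [("TWY_B1", 300), ("RWY_09_HOLD", 180)],
    "TWY_C1": [("TWY_B1", 250), ("RWY_09_HOLD", 400)],
    "RWY_09_HOLD": [("TWY_B2", 180), ("TWY_C1", 400)],
}

def shortest_taxi_route(start, goal):
    queue, seen = [(0, start, [start])], set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, length in edges[node]:
            if nxt not in seen:
                heapq.heappush(queue, (dist + length, nxt, path + [nxt]))
    return None

print(shortest_taxi_route("GATE_A1", "RWY_09_HOLD"))
```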
Application Program Interface for the Orion Aerodynamics Database
NASA Technical Reports Server (NTRS)
Robinson, Philip E.; Thompson, James
2013-01-01
The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines; this may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
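The actual CEV API is in C and its function names are not given in the abstract; purely as an illustration of the kind of multidimensional table lookup such an API encapsulates, the sketch below interpolates an aerodynamic coefficient from a small, made-up table using SciPy.

    # Illustrative multidimensional table lookup (made-up data; not the CEV API).
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    mach  = np.array([0.5, 0.9, 1.2, 2.0])       # table breakpoints
    alpha = np.array([0.0, 5.0, 10.0, 15.0])     # angle of attack, deg
    # Hypothetical axial-force coefficient table, shape (len(mach), len(alpha)).
    ca = np.array([
        [0.30, 0.31, 0.33, 0.36],
        [0.34, 0.35, 0.38, 0.42],
        [0.45, 0.47, 0.50, 0.55],
        [0.40, 0.42, 0.45, 0.50],
    ])

    lookup = RegularGridInterpolator((mach, alpha), ca)
    print(lookup([[1.05, 7.5]]))   # linear interpolation at Mach 1.05, alpha 7.5 deg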
High-Voltage-Input Level Translator Using Standard CMOS
NASA Technical Reports Server (NTRS)
Yager, Jeremy A.; Mojarradi, Mohammad M.; Vo, Tuan A.; Blalock, Benjamin J.
2011-01-01
A proposed integrated circuit would translate (1) a pair of input signals having a low differential potential and a possibly high common-mode potential into (2) a pair of output signals having the same low differential potential and a low common-mode potential. As used here, "low" and "high" refer to potentials that are, respectively, below or above the nominal supply potential (3.3 V) at which standard complementary metal oxide/semiconductor (CMOS) integrated circuits are designed to operate. The input common-mode potential could lie between 0 and 10 V; the output common-mode potential would be 2 V. This translation would make it possible to process the pair of signals by use of standard 3.3-V CMOS analog and/or mixed-signal (analog and digital) circuitry on the same integrated-circuit chip. A schematic of the circuit is shown in the figure. Standard 3.3-V CMOS circuitry cannot withstand input potentials greater than about 4 V. However, there are many applications that involve low-differential-potential, high-common-mode-potential input signal pairs and in which standard 3.3-V CMOS circuitry, which is relatively inexpensive, would be the most appropriate circuitry for performing other functions on the integrated-circuit chip that handles the high-potential input signals. Thus, there is a need to combine high-voltage input circuitry with standard low-voltage CMOS circuitry on the same integrated-circuit chip. The proposed circuit would satisfy this need. In the proposed circuit, the input signals would be coupled into both a level-shifting pair and a common-mode-sensing pair of CMOS transistors. The output of the level-shifting pair would be fed as input to a differential pair of transistors. The resulting differential current output would pass through six standoff transistors to be mirrored into an output branch by four heterojunction bipolar transistors. The mirrored differential current would be converted back to potential by a pair of diode-connected transistors, which, by virtue of being identical to the input transistors, would reproduce the input differential potential at the output.
LOGSIM user's manual. [Logic Simulation Program for computer aided design of logic circuits]
NASA Technical Reports Server (NTRS)
Mitchell, C. L.; Taylor, J. F.
1972-01-01
The user's manual for the LOGSIM program is presented. All program options are explained, and a detailed definition of the format of each input card is given. LOGSIM program operations and the preparation of LOGSIM input data are discussed, along with data card formats, postprocessor data cards, and output interpretation.
A framework for visualization of battlefield network behavior
NASA Astrophysics Data System (ADS)
Perzov, Yury; Yurcik, William
2006-05-01
An extensible network simulation application was developed to study wireless battlefield communications. The application monitors node mobility and depicts broadcast and unicast traffic as expanding rings and directed links. The network simulation was specially designed to support fault injection to show the impact of air strikes on disabling nodes. The application takes standard ns-2 trace files as input and provides performance data output in different graphical forms (histograms and x/y plots). Network visualization via animation of simulation output can be saved in AVI format, which may serve as a basis for a real-time battlefield awareness system.
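ns-2 traces come in several formats, and the one assumed below (the classic wired-format columns: event, time, from-node, to-node, packet type, size, and so on) is chosen purely for illustration; it is not a description of the application above.

    # Toy parser for classic ns-2 wired-format trace lines (column layout assumed for illustration).
    from collections import Counter

    def parse_line(line):
        # Example line: "r 1.84375 0 2 tcp 1000 ------- 1 0.0 3.1 225 610"
        f = line.split()
        return {"event": f[0], "time": float(f[1]), "src_node": f[2],
                "dst_node": f[3], "ptype": f[4], "size": int(f[5])}

    def summarize(path):
        events = Counter()
        bytes_recv = 0
        with open(path) as fh:
            for line in fh:
                rec = parse_line(line)
                events[rec["event"]] += 1
                if rec["event"] == "r":          # count bytes successfully received
                    bytes_recv += rec["size"]
        return events, bytes_recv

    # events, bytes_recv = summarize("out.tr")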
Aerosol indirect effect from turbulence-induced broadening of cloud-droplet size distributions.
Chandrakar, Kamal Kant; Cantrell, Will; Chang, Kelken; Ciochetto, David; Niedermeier, Dennis; Ovchinnikov, Mikhail; Shaw, Raymond A; Yang, Fan
2016-12-13
The influence of aerosol concentration on the cloud-droplet size distribution is investigated in a laboratory chamber that enables turbulent cloud formation through moist convection. The experiments allow steady-state microphysics to be achieved, with aerosol input balanced by cloud-droplet growth and fallout. As aerosol concentration is increased, the cloud-droplet mean diameter decreases, as expected, but the width of the size distribution also decreases sharply. The aerosol input allows for cloud generation in the limiting regimes of fast microphysics (τc < τt) for high aerosol concentration, and slow microphysics (τc > τt) for low aerosol concentration; here, τc is the phase-relaxation time and τt is the turbulence-correlation time. The increase in the width of the droplet size distribution for the low aerosol limit is consistent with larger variability of supersaturation due to the slow microphysical response. A stochastic differential equation for supersaturation predicts that the standard deviation of the squared droplet radius should increase linearly with a system time scale defined by 1/τs = 1/τc + 1/τt, and the measurements are in excellent agreement with this finding. The result underscores the importance of droplet size dispersion for aerosol indirect effects: increasing aerosol concentration changes the albedo and suppresses precipitation formation not only through reduction of the mean droplet diameter but also by narrowing of the droplet size distribution due to reduced supersaturation fluctuations. Supersaturation fluctuations in the low aerosol/slow microphysics limit are likely of leading importance for precipitation formation.
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITH NASADIG)
NASA Technical Reports Server (NTRS)
Anderson, G. E.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (CRAY VERSION WITH NASADIG)
NASA Technical Reports Server (NTRS)
Anderson, G. E.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITHOUT NASADIG)
NASA Technical Reports Server (NTRS)
Vogt, R. A.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
Kamauu, Aaron W C; DuVall, Scott L; Robison, Reid J; Liimatta, Andrew P; Wiggins, Richard H; Avrin, David E
2006-01-01
Although digital teaching files are important to radiology education, there are no current satisfactory solutions for export of Digital Imaging and Communications in Medicine (DICOM) images from picture archiving and communication systems (PACS) in desktop publishing format. A vendor-neutral digital teaching file, the Radiology Interesting Case Server (RadICS), offers an efficient tool for harvesting interesting cases from PACS without requiring modifications of the PACS configurations. Radiologists push imaging studies from PACS to RadICS via the standard DICOM Send process, and the RadICS server automatically converts the DICOM images into the Joint Photographic Experts Group format, a common desktop publishing format. They can then select key images and create an interesting case series at the PACS workstation. RadICS was tested successfully against multiple unmodified commercial PACS. Using RadICS, radiologists are able to harvest and author interesting cases at the point of clinical interpretation with minimal disruption in clinical work flow. RSNA, 2006
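RadICS performs the DICOM-to-JPEG step server-side; as a hedged, standalone illustration of that same conversion, the sketch below uses the pydicom and Pillow libraries with a naive full-range intensity rescale (a real teaching-file workflow would apply proper window/level settings).

    # Minimal DICOM-to-JPEG conversion sketch (naive windowing; illustration only).
    import numpy as np
    import pydicom
    from PIL import Image

    def dicom_to_jpeg(dicom_path, jpeg_path):
        ds = pydicom.dcmread(dicom_path)
        pixels = ds.pixel_array.astype(np.float32)
        # Scale the full pixel range to 8 bits; a real viewer would use window/level.
        lo, hi = pixels.min(), pixels.max()
        img8 = np.uint8(255 * (pixels - lo) / max(hi - lo, 1))
        Image.fromarray(img8).save(jpeg_path, format="JPEG")

    # dicom_to_jpeg("case001.dcm", "case001.jpg")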
OIL—Output input language for data connectivity between geoscientific software applications
NASA Astrophysics Data System (ADS)
Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar
2010-05-01
Geoscientific computing has become so complex that no single software application can perform all the processing steps required to get the desired results. Thus, for a given set of analyses, several specialized software applications are required, which must be interconnected for electronic flow of data. In this network of applications, the outputs of one application become the inputs of other applications. Each of these applications usually involves more than one data type and may have its own data formats, making it incompatible with other applications in terms of data connectivity. Consequently, several data format conversion utilities are developed in-house to provide data connectivity between applications. There is practically no end to this problem: each time a new application is added to the system, a new set of data conversion utilities must be developed. This paper presents a flexible data format engine, programmable through a platform-independent, interpreted language named Output Input Language (OIL). Its unique architecture allows input and output formats to be defined independently of each other by two separate programs. Thus, read and write for each format are coded only once, and the data connectivity link between two formats is established by a combination of their read and write programs. This results in fewer programs with no redundancy and maximum reuse, enabling rapid application development and easy maintenance of data connectivity links.
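OIL itself is an interpreted language with its own syntax, which is not reproduced here; the Python sketch below only illustrates the architectural idea of defining readers and writers independently and pairing any reader with any writer to form a conversion link.

    # Sketch of the reader/writer pairing idea behind a data-format engine (not OIL syntax).
    import csv, json

    READERS, WRITERS = {}, {}

    def reader(name):
        def register(fn):
            READERS[name] = fn
            return fn
        return register

    def writer(name):
        def register(fn):
            WRITERS[name] = fn
            return fn
        return register

    @reader("csv")
    def read_csv(path):
        with open(path, newline="") as fh:
            return list(csv.DictReader(fh))      # common in-memory representation

    @writer("json")
    def write_json(records, path):
        with open(path, "w") as fh:
            json.dump(records, fh, indent=2)

    def convert(src_fmt, dst_fmt, src_path, dst_path):
        # Any registered reader can be paired with any registered writer.
        WRITERS[dst_fmt](READERS[src_fmt](src_path), dst_path)

    # convert("csv", "json", "stations.csv", "stations.json")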
CROSSER - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
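The abstract describes the quantity CROSSER computes but not its code; the following independent sketch finds the same crossover point, using bisection rather than Newton's method, for cases where an interior fixed point exists.

    # Independent sketch of the k-out-of-n reliability crossover (bisection, not Newton's method).
    from math import comb

    def system_reliability(p, k, n):
        # Probability that at least k of n components (each with reliability p) work.
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def crossover(k, n, tol=1e-10):
        # Find p in (0, 1) with system_reliability(p) == p, assuming an interior
        # fixed point exists (typically 1 < k < n).
        f = lambda p: system_reliability(p, k, n) - p
        lo, hi = 1e-6, 1.0 - 1e-6
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
            if hi - lo < tol:
                break
        return 0.5 * (lo + hi)

    print(crossover(3, 5))   # 0.5 by symmetry for a 3-out-of-5 system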
Measuring Equity: Creating a New Standard for Inputs and Outputs
ERIC Educational Resources Information Center
Knoeppel, Robert C.; Della Sala, Matthew R.
2013-01-01
The purpose of this article is to introduce a new statistic to capture the ratio of equitable student outcomes given equitable inputs. Given the fact that finance structures should be aligned to outcome standards according to judicial interpretation, a ratio of outputs to inputs, or "equity ratio," is introduced to discern if conclusions can be…
ParamAP: Standardized Parameterization of Sinoatrial Node Myocyte Action Potentials.
Rickert, Christian; Proenza, Catherine
2017-08-22
Sinoatrial node myocytes act as cardiac pacemaker cells by generating spontaneous action potentials (APs). Much information is encoded in sinoatrial AP waveforms, but both the analysis and the comparison of AP parameters between studies is hindered by the lack of standardized parameter definitions and the absence of automated analysis tools. Here we introduce ParamAP, a standalone cross-platform computational tool that uses a template-free detection algorithm to automatically identify and parameterize APs from text input files. ParamAP employs a graphic user interface with automatic and user-customizable input modes, and it outputs data files in text and PDF formats. ParamAP returns a total of 16 AP waveform parameters including time intervals such as the AP duration, membrane potentials such as the maximum diastolic potential, and rates of change of the membrane potential such as the diastolic depolarization rate. ParamAP provides a robust AP detection algorithm in combination with a standardized AP parameter analysis over a wide range of AP waveforms and firing rates, owing in part to the use of an iterative algorithm for the determination of the threshold potential and the diastolic depolarization rate that is independent of the maximum upstroke velocity, a parameter that can vary significantly among sinoatrial APs. Because ParamAP is implemented in Python 3, it is also highly customizable and extensible. In conclusion, ParamAP is a powerful computational tool that facilitates quantitative analysis and enables comparison of sinoatrial APs by standardizing parameter definitions and providing an automated work flow. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
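ParamAP's actual detection and iterative thresholding algorithm is more sophisticated than what follows; the sketch below only illustrates the general idea of template-free AP detection and extraction of a few basic waveform parameters from a sampled voltage trace, using a fixed threshold chosen for the example.

    # Simplified sketch of template-free AP detection and basic parameters (not ParamAP's algorithm).
    import numpy as np

    def basic_ap_parameters(t, v, threshold_mv=-20.0):
        """t in seconds, v in mV; returns firing rate, peak, and maximum diastolic potential."""
        above = v > threshold_mv
        upstrokes = np.flatnonzero(~above[:-1] & above[1:])   # upward threshold crossings
        if len(upstrokes) < 2:
            return None
        cycle_lengths = np.diff(t[upstrokes])
        return {
            "firing_rate_hz": 1.0 / cycle_lengths.mean(),
            "peak_mv": float(v.max()),
            "max_diastolic_potential_mv": float(v[upstrokes[0]:upstrokes[-1]].min()),
        }

    # Synthetic demo trace: a 3 Hz "pacemaker" oscillation between -60 and +20 mV.
    t = np.arange(0.0, 2.0, 1e-4)
    v = -20.0 + 40.0 * np.sin(2 * np.pi * 3.0 * t)
    print(basic_ap_parameters(t, v))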
EnviroNET: An on-line environment data base for LDEF data
NASA Technical Reports Server (NTRS)
Lauriente, Michael
1992-01-01
EnviroNET is an on-line, free-form database intended to provide a centralized depository for a wide range of technical information on environmentally induced interactions of use to Space Shuttle customers and spacecraft designers. It provides a user-friendly, menu-driven format on networks that are connected globally and is available twenty-four hours a day, every day. The information, which is updated regularly, includes expository text, tabular numerical data, charts and graphs, and models. The system pools space data collected over the years by NASA, USAF, other government facilities, industry, universities, and ESA. The models accept parameter input from the user and calculate and display the derived values corresponding to that input. In addition to the archive, interactive graphics programs are also available on space debris, the neutral atmosphere, radiation, the magnetic field, and the ionosphere. A user-friendly, informative interface is standard for all the models, with pop-up help windows giving information on inputs, outputs, and caveats. The system will eventually simplify mission analysis with analytical tools and deliver solutions for computationally intense graphical applications to run 'what if' scenarios. A proposed plan for developing a repository of LDEF information for a user group concludes the presentation.
Convergence and Periodic Solutions for the Input Impedance of a Standard Ladder Network
ERIC Educational Resources Information Center
Ucak, C.; Acar, C.
2007-01-01
The input impedance of an infinite ladder network is computed by using the recursive relation and by assuming that the input impedance does not change when a new block is added to the network. However, this assumption is not true in general and standard textbooks do not always treat these networks correctly. This paper develops a general solution…
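As a sketch of the recursion the paper discusses, the following code adds blocks of a simple series/shunt ladder one at a time and watches the input impedance settle; the element values are arbitrary, and convergence is not guaranteed for every choice of elements, which is the paper's point.

    # Input impedance of a ladder built one series/shunt block at a time (arbitrary element values).
    def ladder_input_impedance(z_series, z_shunt, n_blocks):
        z = None                      # open-circuit termination at the far end
        history = []
        for _ in range(n_blocks):
            # One block: shunt element in parallel with the rest of the ladder, plus a series element.
            z_par = z_shunt if z is None else (z_shunt * z) / (z_shunt + z)
            z = z_series + z_par
            history.append(z)
        return history

    # Resistive example: 1-ohm series, 2-ohm shunt. The fixed-point assumption
    # Z = 1 + (2*Z)/(2 + Z) gives Z = 2 ohms, and the recursion indeed settles there.
    for i, z in enumerate(ladder_input_impedance(1.0, 2.0, 8), start=1):
        print(f"{i} block(s): Zin = {z:.4f} ohm")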
Chang, Chia-Ling; Trimbuch, Thorsten; Chao, Hsiao-Tuan; Jordan, Julia-Christine; Herman, Melissa A; Rosenmund, Christian
2014-01-15
Neural circuits are composed of mainly glutamatergic and GABAergic neurons, which communicate through synaptic connections. Many factors instruct the formation and function of these synapses; however, it is difficult to dissect the contribution of intrinsic cell programs from that of extrinsic environmental effects in an intact network. Here, we perform paired recordings from two-neuron microculture preparations of mouse hippocampal glutamatergic and GABAergic neurons to investigate how synaptic input and output of these two principal cells develop. In our reduced preparation, we found that glutamatergic neurons showed no change in synaptic output or input regardless of partner neuron cell type or neuronal activity level. In contrast, we found that glutamatergic input caused the GABAergic neuron to modify its output by way of an increase in synapse formation and a decrease in synaptic release efficiency. These findings are consistent with aspects of GABAergic synapse maturation observed in many brain regions. In addition, changes in GABAergic output are cell wide and not target-cell specific. We also found that glutamatergic neuronal activity determined the AMPA receptor properties of synapses on the partner GABAergic neuron. All modifications of GABAergic input and output required activity of the glutamatergic neuron. Because our system has reduced extrinsic factors, the changes we saw in the GABAergic neuron due to glutamatergic input may reflect initiation of maturation programs that underlie the formation and function of in vivo neural circuits.
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide scope of research and applications: carbon cycle and ecosystem science, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods (remote sensing observations, field surveys, model simulations, etc.) and stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a major barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery; OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that having standard mechanisms for data discovery and access is not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input data and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the experience gained and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling user community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
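As a small, hedged illustration of one practice mentioned above (self-describing formats with CF-style metadata), the sketch below writes a tiny gridded variable to NetCDF with xarray; the variable name, units, and values are invented and do not correspond to any ORNL DAAC or MAST-DC product.

    # Write a small CF-style NetCDF file with xarray (invented example data).
    import numpy as np
    import xarray as xr

    lat = np.arange(35.0, 36.0, 0.25)
    lon = np.arange(-85.0, -84.0, 0.25)
    gpp = np.random.default_rng(0).random((lat.size, lon.size))   # fake gridded values

    da = xr.DataArray(
        gpp,
        dims=("lat", "lon"),
        coords={"lat": lat, "lon": lon},
        attrs={"units": "gC m-2 day-1", "long_name": "gross primary productivity"},
    )
    da["lat"].attrs.update(units="degrees_north", standard_name="latitude")
    da["lon"].attrs.update(units="degrees_east", standard_name="longitude")

    ds = xr.Dataset({"GPP": da}, attrs={"Conventions": "CF-1.8", "title": "Example grid"})
    ds.to_netcdf("gpp_example.nc")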
Lv, Yueyong; Hu, Qinglei; Ma, Guangfu; Zhou, Jiakang
2011-10-01
This paper treats the problem of synchronized control of spacecraft formation flying (SFF) in the presence of input constraints and parameter uncertainties. More specifically, backstepping-based robust control is first developed for the full 6-DOF dynamic model of SFF with parameter uncertainties, in which the model consists of relative translation and attitude rotation. This controller is then redesigned to deal with the input constraint problem by incorporating a command filter, such that the generated control command remains implementable even under physical or operating constraints on the control input. The convergence of the proposed control algorithms is proved by the Lyapunov stability theorem. Compared with conventional methods, illustrative simulations of spacecraft formation flying are conducted to verify the effectiveness of the proposed approach in making the spacecraft track the desired attitude and position trajectories in a synchronized fashion, even in the presence of uncertainties, external disturbances, and control saturation constraints. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
40 CFR 60.44 - Standard for nitrogen oxides (NOX).
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel...) derived from gaseous fossil fuel. (2) 129 ng/J heat input (0.30 lb/MMBtu) derived from liquid fossil fuel, liquid fossil fuel and wood residue, or gaseous fossil fuel and wood residue. (3) 300 ng/J heat input (0...
40 CFR 60.44 - Standard for nitrogen oxides (NOX).
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel...) derived from gaseous fossil fuel. (2) 129 ng/J heat input (0.30 lb/MMBtu) derived from liquid fossil fuel, liquid fossil fuel and wood residue, or gaseous fossil fuel and wood residue. (3) 300 ng/J heat input (0...
Huff, G.F.
2004-01-01
The tendency of solutes in input water to precipitate efficiency lowering scale deposits on the membranes of reverse osmosis (RO) desalination systems is an important factor in determining the suitability of input water for desalination. Simulated input water evaporation can be used as a technique to quantitatively assess the potential for scale formation in RO desalination systems. The technique was demonstrated by simulating the increase in solute concentrations required to form calcite, gypsum, and amorphous silica scales at 25??C and 40??C from 23 desalination input waters taken from the literature. Simulation results could be used to quantitatively assess the potential of a given input water to form scale or to compare the potential of a number of input waters to form scale during RO desalination. Simulated evaporation of input waters cannot accurately predict the conditions under which scale will form owing to the effects of potentially stable supersaturated solutions, solution velocity, and residence time inside RO systems. However, the simulated scale-forming potential of proposed input waters could be compared with the simulated scale-forming potentials and actual scale-forming properties of input waters having documented operational histories in RO systems. This may provide a technique to estimate the actual performance and suitability of proposed input waters during RO.
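The study above relies on full geochemical simulations; the toy calculation below only tracks how a conservative concentration factor pushes a single mineral's ion activity product toward its solubility product, ignoring activity coefficients, ion pairing, and temperature effects, so it is illustrative rather than predictive. The feed-water concentrations are invented and the gypsum solubility product is an approximate literature value.

    # Toy scale-potential check under simulated evaporation (no activity corrections; illustrative only).
    from math import log10

    # Gypsum: CaSO4·2H2O <=> Ca2+ + SO4^2-, Ksp ~ 2.5e-5 at 25 C (approximate value).
    KSP_GYPSUM = 2.5e-5

    def gypsum_saturation_index(ca_molal, so4_molal):
        # SI = log10(IAP / Ksp); positive means nominally supersaturated.
        return log10((ca_molal * so4_molal) / KSP_GYPSUM)

    ca0, so40 = 0.002, 0.003          # hypothetical feed-water molalities
    for recovery in (0.0, 0.5, 0.75, 0.9):
        cf = 1.0 / (1.0 - recovery)   # concentration factor in the reject as water is removed
        si = gypsum_saturation_index(ca0 * cf, so40 * cf)
        print(f"recovery {recovery:.0%}: concentration factor {cf:.1f}, gypsum SI = {si:+.2f}")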
Laboratory Performance Evaluation Report of SEL 421 Phasor Measurement Unit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu; Faris, Anthony J.; Martin, Kenneth E.
2007-12-01
PNNL and BPA have been in close collaboration on laboratory performance evaluation of phasor measurement units for over ten years. A series of evaluation tests are designed to confirm accuracy and determine measurement performance under a variety of conditions that may be encountered in actual use. Ultimately the testing conducted should provide parameters that can be used to adjust all measurements to a standardized basis. These tests are performed with a standard relay test set using recorded files of precisely generated test signals. The test set provides test signals at a level and in a format suitable for input to a PMU that accurately reproduces the signals in both signal amplitude and timing. Test set outputs are checked to confirm the accuracy of the output signal. The recorded signals include both current and voltage waveforms and a digital timing track used to relate the PMU measured value with the test signal. Test signals include steady-state waveforms to test amplitude, phase, and frequency accuracy, modulated signals to determine measurement and rejection bands, and step tests to determine timing and response accuracy. Additional tests are included as necessary to fully describe the PMU operation. Testing is done with a BPA phasor data concentrator (PDC) which provides communication support and monitors data input for dropouts and data errors.
Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter
2014-01-01
The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements. PMID:25099149
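The IPT handles this packaging itself; as a rough illustration of what a Darwin Core Archive contains, the sketch below zips a tiny tab-delimited occurrence table together with a stripped-down meta.xml descriptor. The descriptor shown is simplified (a real archive also declares each field's Darwin Core term) and the records are invented.

    # Build a minimal Darwin Core Archive-like zip (simplified meta.xml; invented records).
    import csv, io, zipfile

    occurrences = [
        {"occurrenceID": "obs-001", "scientificName": "Puma concolor",
         "eventDate": "2013-06-14", "decimalLatitude": "40.01", "decimalLongitude": "-105.27"},
    ]
    fields = list(occurrences[0].keys())

    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(occurrences)

    meta_xml = """<archive xmlns="http://rs.tdwg.org/dwc/text/">
      <core encoding="UTF-8" fieldsTerminatedBy="\\t" linesTerminatedBy="\\n"
            ignoreHeaderLines="1" rowType="http://rs.tdwg.org/dwc/terms/Occurrence">
        <files><location>occurrence.txt</location></files>
        <id index="0"/>
      </core>
    </archive>
    """

    with zipfile.ZipFile("dwca_example.zip", "w") as zf:
        zf.writestr("occurrence.txt", buf.getvalue())
        zf.writestr("meta.xml", meta_xml)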
CheckDen, a program to compute quantum molecular properties on spatial grids.
Pacios, Luis F; Fernandez, Alberto
2009-09-01
CheckDen, a program to compute quantum molecular properties on a variety of spatial grids, is presented. The program reads as its only input wavefunction files written by standard quantum packages and calculates the electron density rho(r), the promolecule and density difference function, the gradient of rho(r), the Laplacian of rho(r), the information entropy, the electrostatic potential, the kinetic energy densities G(r) and K(r), the electron localization function (ELF), and the localized orbital locator (LOL) function. These properties can be calculated on a wide range of one-, two-, and three-dimensional grids that can be processed by widely used graphics programs to render high-resolution images. CheckDen also offers other options, such as extracting separate atom contributions to the property computed, converting grid output data into CUBE and OpenDX volumetric data formats, and performing arithmetic combinations with grid files in all the recognized formats.
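As a hedged illustration of the Gaussian CUBE volumetric layout that CheckDen can emit (sketched here from the commonly documented convention: two comment lines, an atom count and grid origin, three grid-axis lines, atom records, then values with the last index varying fastest), the code below writes a small made-up density grid; it is not CheckDen's own writer.

    # Write a tiny volumetric grid in the commonly documented Gaussian CUBE layout (made-up values).
    import numpy as np

    def write_cube(path, origin, axes, atoms, values):
        """axes: three (n, step_vector) pairs; atoms: (Z, x, y, z) in bohr; values indexed [ix, iy, iz]."""
        with open(path, "w") as fh:
            fh.write("Example density grid\nGenerated by a toy script\n")
            fh.write(f"{len(atoms):5d} {origin[0]:12.6f} {origin[1]:12.6f} {origin[2]:12.6f}\n")
            for n, vec in axes:
                fh.write(f"{n:5d} {vec[0]:12.6f} {vec[1]:12.6f} {vec[2]:12.6f}\n")
            for z, x, y, zc in atoms:
                # Nuclear charge is reused for the per-atom charge column, for simplicity.
                fh.write(f"{z:5d} {float(z):12.6f} {x:12.6f} {y:12.6f} {zc:12.6f}\n")
            flat = values.reshape(-1)        # ix slowest, iz fastest, matching the axis order above
            for i in range(0, flat.size, 6):
                fh.write(" ".join(f"{v:13.5e}" for v in flat[i:i + 6]) + "\n")

    grid = np.random.default_rng(1).random((4, 4, 4))
    write_cube("toy.cube", (0.0, 0.0, 0.0),
               [(4, (0.5, 0.0, 0.0)), (4, (0.0, 0.5, 0.0)), (4, (0.0, 0.0, 0.5))],
               [(8, 1.0, 1.0, 1.0)], grid)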
Bankey, Viki; Grauch, V.J.S.; Drenth, B.J.; ,
2006-01-01
This report contains digital data, image files, and text files describing data formats and survey procedures for aeromagnetic data collected during high-resolution aeromagnetic surveys in southern Colorado and northern New Mexico in December, 2005. One survey covers the eastern edge of the San Luis basin, including the towns of Questa, New Mexico and San Luis, Colorado. A second survey covers the mountain front east of Santa Fe, New Mexico, including the town of Chimayo and portions of the Pueblos of Tesuque and Nambe. Several derivative products from these data are also presented as grids and images, including reduced-to-pole data and data continued to a reference surface. Images are presented in various formats and are intended to be used as input to geographic information systems, standard graphics software, or map plotting packages.
User's guide for MAGIC-Meteorologic and hydrologic genscn (generate scenarios) input converter
Ortel, Terry W.; Martin, Angel
2010-01-01
Meteorologic and hydrologic data used in watershed modeling studies are collected by various agencies and organizations, and stored in various formats. Data may be in a raw, un-processed format with little or no quality control, or may be checked for validity before being made available. Flood-simulation systems require data in near real-time so that adequate flood warnings can be made. Additionally, forecasted data are needed to operate flood-control structures to potentially mitigate flood damages. Because real-time data are of a provisional nature, missing data may need to be estimated for use in flood-simulation systems. The Meteorologic and Hydrologic GenScn (Generate Scenarios) Input Converter (MAGIC) can be used to convert data from selected formats into the Hydrologic Simulation System-Fortran hourly-observations format for input to a Watershed Data Management database, for use in hydrologic modeling studies. MAGIC also can reformat the data to the Full Equations model time-series format, for use in hydraulic modeling studies. Examples of the application of MAGIC for use in the flood-simulation system for Salt Creek in northeastern Illinois are presented in this report.
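MAGIC's own input and output formats (HSPF hourly observations, WDM, FEQ time series) are not reproduced here; the sketch below only illustrates the underlying reformatting step of turning an irregular timestamped series into regular hourly values with pandas, with a simple fill for short gaps, using invented station data.

    # Generic reformatting step: irregular timestamped data -> regular hourly series (pandas).
    import pandas as pd

    raw = pd.DataFrame(
        {"timestamp": ["2010-06-01 00:12", "2010-06-01 00:47", "2010-06-01 02:05"],
         "stage_ft": [3.1, 3.4, 4.0]},
    )
    raw["timestamp"] = pd.to_datetime(raw["timestamp"])
    series = raw.set_index("timestamp")["stage_ft"]

    hourly = series.resample("1H").mean()      # average the observations within each hour
    hourly = hourly.interpolate(limit=3)       # estimate short gaps, as provisional data may require
    print(hourly)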
A manual for microcomputer image analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rich, P.M.; Ranken, D.M.; George, J.S.
1989-12-01
This manual is intended to serve three basic purposes: as a primer in microcomputer image analysis theory and techniques, as a guide to the use of IMAGE©, a public domain microcomputer program for image analysis, and as a stimulus to encourage programmers to develop microcomputer software suited for scientific use. Topics discussed include the principles of image processing and analysis, use of standard video for input and display, spatial measurement techniques, and the future of microcomputer image analysis. A complete reference guide that lists the commands for IMAGE is provided. IMAGE includes capabilities for digitization, input and output of images, hardware display lookup table control, editing, edge detection, histogram calculation, measurement along lines and curves, measurement of areas, examination of intensity values, output of analytical results, conversion between raster and vector formats, and region movement and rescaling. The control structure of IMAGE emphasizes efficiency, precision of measurement, and scientific utility. 18 refs., 18 figs., 2 tabs.
NASA Technical Reports Server (NTRS)
Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.
2002-01-01
For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients
NASA Technical Reports Server (NTRS)
Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.
2001-01-01
For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
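CAP itself reads the NASA Glenn coefficient file, which uses a 9-coefficient form; the sketch below instead evaluates the older, widely published 7-coefficient polynomial form for Cp/R, H/RT, and S/R as an illustration of how tabulated thermodynamic functions are generated from least-squares coefficients. The coefficients shown are placeholders, not fits for any real species.

    # Evaluate 7-coefficient NASA-style polynomials (illustration; CAP uses the 9-coefficient Glenn form).
    import math

    def thermo_from_coeffs(a, T):
        """a = (a1..a7); returns Cp/R, H/(RT), S/R at temperature T in kelvin."""
        a1, a2, a3, a4, a5, a6, a7 = a
        cp_R = a1 + a2*T + a3*T**2 + a4*T**3 + a5*T**4
        h_RT = a1 + a2*T/2 + a3*T**2/3 + a4*T**3/4 + a5*T**4/5 + a6/T
        s_R  = a1*math.log(T) + a2*T + a3*T**2/2 + a4*T**3/3 + a5*T**4/4 + a7
        return cp_R, h_RT, s_R

    # Placeholder coefficients (not a real species fit), tabulated every 200 K.
    coeffs = (3.5, 1.0e-4, 0.0, 0.0, 0.0, -1.0e3, 4.0)
    print(f"{'T, K':>8} {'Cp/R':>8} {'H/RT':>9} {'S/R':>8}")
    for T in range(300, 1101, 200):
        cp, h, s = thermo_from_coeffs(coeffs, float(T))
        print(f"{T:8d} {cp:8.4f} {h:9.4f} {s:8.4f}")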
Gottlieb, Alice B; Levin, Adriane A; Armstrong, April W; Abernethy, April; Duffin, Kristina Callis; Bhushan, Reva; Garg, Amit; Merola, Joseph F; Maccarone, Mara; Christensen, Robin
2015-02-01
As quality standards are increasingly in demand throughout medicine, dermatology needs to establish outcome measures to quantify the effectiveness of treatments and providers. The International Dermatology Outcome Measures Group was established to address this need. Beginning with psoriasis, the group aims to create a tool considerate of patients and providers using the input of all relevant stakeholders in assessment of disease severity and response to treatment. Herein, we delineate the procedures through which consensus is being reached and the future directions of the project. Copyright © 2014 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
40 CFR 60.44 - Standard for nitrogen oxides (NOX).
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel... NO2 in excess of: (1) 86 ng/J heat input (0.20 lb/MMBtu) derived from gaseous fossil fuel. (2) 129 ng/J heat input (0.30 lb/MMBtu) derived from liquid fossil fuel, liquid fossil fuel and wood residue...
40 CFR 60.44 - Standard for nitrogen oxides (NOX).
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel... NO2 in excess of: (1) 86 ng/J heat input (0.20 lb/MMBtu) derived from gaseous fossil fuel. (2) 129 ng/J heat input (0.30 lb/MMBtu) derived from liquid fossil fuel, liquid fossil fuel and wood residue...
40 CFR 60.43 - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel.../J heat input (0.80 lb/MMBtu) derived from liquid fossil fuel or liquid fossil fuel and wood residue. (2) 520 ng/J heat input (1.2 lb/MMBtu) derived from solid fossil fuel or solid fossil fuel and wood...
40 CFR 60.43 - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel.../J heat input (0.80 lb/MMBtu) derived from liquid fossil fuel or liquid fossil fuel and wood residue. (2) 520 ng/J heat input (1.2 lb/MMBtu) derived from solid fossil fuel or solid fossil fuel and wood...
40 CFR 60.44 - Standard for nitrogen oxides (NOX).
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel... NO2 in excess of: (1) 86 ng/J heat input (0.20 lb/MMBtu) derived from gaseous fossil fuel. (2) 129 ng/J heat input (0.30 lb/MMBtu) derived from liquid fossil fuel, liquid fossil fuel and wood residue...
40 CFR 60.43 - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel.../J heat input (0.80 lb/MMBtu) derived from liquid fossil fuel or liquid fossil fuel and wood residue. (2) 520 ng/J heat input (1.2 lb/MMBtu) derived from solid fossil fuel or solid fossil fuel and wood...
Aerosol indirect effect from turbulence-induced broadening of cloud-droplet size distributions
Chandrakar, Kamal Kant; Cantrell, Will; Chang, Kelken; Ciochetto, David; Niedermeier, Dennis; Ovchinnikov, Mikhail; Shaw, Raymond A.; Yang, Fan
2016-01-01
The influence of aerosol concentration on the cloud-droplet size distribution is investigated in a laboratory chamber that enables turbulent cloud formation through moist convection. The experiments allow steady-state microphysics to be achieved, with aerosol input balanced by cloud-droplet growth and fallout. As aerosol concentration is increased, the cloud-droplet mean diameter decreases, as expected, but the width of the size distribution also decreases sharply. The aerosol input allows for cloud generation in the limiting regimes of fast microphysics (τc < τt) for high aerosol concentration, and slow microphysics (τc > τt) for low aerosol concentration; here, τc is the phase-relaxation time and τt is the turbulence-correlation time. The increase in the width of the droplet size distribution for the low aerosol limit is consistent with larger variability of supersaturation due to the slow microphysical response. A stochastic differential equation for supersaturation predicts that the standard deviation of the squared droplet radius should increase linearly with a system time scale defined by 1/τs = 1/τc + 1/τt, and the measurements are in excellent agreement with this finding. The result underscores the importance of droplet size dispersion for aerosol indirect effects: increasing aerosol concentration changes the albedo and suppresses precipitation formation not only through reduction of the mean droplet diameter but also by narrowing of the droplet size distribution due to reduced supersaturation fluctuations. Supersaturation fluctuations in the low aerosol/slow microphysics limit are likely of leading importance for precipitation formation. PMID:27911802
Aerosol indirect effect from turbulence-induced broadening of cloud-droplet size distributions
Chandrakar, Kamal Kant; Cantrell, Will; Chang, Kelken; ...
2016-11-28
Here, the influence of aerosol concentration on cloud droplet size distribution is investigated in a laboratory chamber that enables turbulent cloud formation through moist convection. The experiments allow steady-state microphysics to be achieved, with aerosol input balanced by cloud droplet growth and fallout. As aerosol concentration is increased the cloud droplet mean diameter decreases as expected, but the width of the size distribution also decreases sharply. The aerosol input allows for cloud generation in the limiting regimes of fast microphysics (τc < τt) for high aerosol concentration, and slow microphysics (τc > τt) for low aerosol concentration; here, τc is the phase relaxation time and τt is the turbulence correlation time. The increase in the width of the droplet size distribution for the low aerosol limit is consistent with larger variability of supersaturation due to the slow microphysical response. A stochastic differential equation for supersaturation predicts that the standard deviation of the squared droplet radius should increase linearly with a system time scale defined by 1/τs = 1/τc + 1/τt, and the measurements are in excellent agreement with this finding. This finding underscores the importance of droplet size dispersion for the aerosol indirect effect: increasing aerosol concentration not only suppresses precipitation formation through reduction of the mean droplet diameter, but perhaps more importantly, through narrowing of the droplet size distribution due to reduced supersaturation fluctuations. Supersaturation fluctuations in the low aerosol / slow microphysics limit are likely of leading importance for precipitation formation.
Sig2BioPAX: Java tool for converting flat files to BioPAX Level 3 format.
Webb, Ryan L; Ma'ayan, Avi
2011-03-21
The World Wide Web plays a critical role in enabling molecular, cell, systems and computational biologists to exchange, search, visualize, integrate, and analyze experimental data. Such efforts can be further enhanced through the development of semantic web concepts. The semantic web idea is to enable machines to understand data through the development of protocol-free data exchange formats such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). These standards provide formal descriptors of objects, object properties and their relationships within a specific knowledge domain. However, the overhead of converting datasets typically stored in data tables such as Excel, text, or PDF files into RDF or OWL formats is not trivial for non-specialists and as such produces a barrier to seamless data exchange between researchers, databases and analysis tools. This problem is particularly important in the field of network systems biology, where biochemical interactions between genes and their protein products are abstracted to networks. For the purpose of converting biochemical interactions into the BioPAX format, which is the leading standard developed by the computational systems biology community, we developed an open-source command line tool that takes as input tabular data describing different types of molecular biochemical interactions. The tool converts such interactions into the BioPAX level 3 OWL format. We used the tool to convert several existing and new mammalian networks of protein interactions, signalling pathways, and transcriptional regulatory networks into BioPAX. Some of these networks were deposited into PathwayCommons, a repository for consolidating and organizing biochemical networks. The software tool Sig2BioPAX is a resource that enables experimental and computational systems biologists to contribute their identified networks and pathways of molecular interactions for integration and reuse with the rest of the research community.
Gaussian-input Gaussian mixture model for representing density maps and atomic models.
Kawabata, Takeshi
2018-07-01
A new Gaussian mixture model (GMM) has been developed for better representation of both atomic models and electron microscopy 3D density maps. The standard GMM algorithm employs an EM algorithm to determine the parameters; it accepts a set of 3D points with weights, corresponding to voxel or atomic centers. Although the standard algorithm worked reasonably well, it had three problems. First, it ignored the size (voxel width or atomic radius) of the input, and thus could lead to a GMM with a smaller spread than the input. Second, the algorithm had a singularity problem: it sometimes stopped the iterative procedure because a Gaussian function reached almost zero variance. Third, a map with a large number of voxels required a long computation time for conversion to a GMM. To solve these problems, we have introduced a Gaussian-input GMM algorithm, which treats the input atoms or voxels as a set of Gaussian functions. The standard EM algorithm of GMM was extended to optimize the new GMM. The new GMM has a radius of gyration identical to that of the input and does not suddenly stop due to the singularity problem. For fast computation, we have introduced down-sampled Gaussian functions (DSG), obtained by merging neighboring voxels into anisotropic Gaussian functions; this provides a GMM with thousands of Gaussian functions in a short computation time. We have also introduced a DSG-input GMM: the Gaussian-input GMM with the DSG as the input. This new algorithm is much faster than the standard algorithm. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
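As a rough illustration of the baseline approach the paper extends, the sketch below fits a standard point-input GMM to weighted 3D points with an EM loop; the Gaussian-input variant described in the abstract would additionally fold each input point's own covariance (voxel width or atomic radius) into the component covariances. The function names and initialization choices are illustrative assumptions, not the published algorithm.

```python
# Minimal sketch (not the published code): weighted EM for a point-input GMM in 3D.
# A "Gaussian-input" variant would add each point's own covariance into cov[k] below.
import numpy as np

def fit_gmm(points, weights, n_comp=4, n_iter=50, eps=1e-6):
    """points: (N, 3) voxel/atom centers; weights: (N,) densities or masses."""
    points = np.asarray(points, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n, d = points.shape
    rng = np.random.default_rng(0)
    mu = points[rng.choice(n, n_comp, replace=False)]            # component means
    cov = np.array([np.cov(points.T) + eps * np.eye(d)] * n_comp)
    pi = np.full(n_comp, 1.0 / n_comp)                           # mixing weights

    for _ in range(n_iter):
        # E-step: responsibility of each component for each weighted point
        resp = np.empty((n, n_comp))
        for k in range(n_comp):
            diff = points - mu[k]
            inv = np.linalg.inv(cov[k])
            norm = 1.0 / np.sqrt(((2 * np.pi) ** d) * np.linalg.det(cov[k]))
            resp[:, k] = pi[k] * norm * np.exp(-0.5 * np.sum(diff @ inv * diff, axis=1))
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: weighted updates (the per-point weights enter here)
        for k in range(n_comp):
            rk = resp[:, k] * w
            nk = rk.sum()
            mu[k] = (rk[:, None] * points).sum(axis=0) / nk
            diff = points - mu[k]
            cov[k] = (rk[:, None, None] * (diff[:, :, None] * diff[:, None, :])).sum(axis=0) / nk
            cov[k] += eps * np.eye(d)                             # guard against singular components
            pi[k] = nk
    return pi, mu, cov
```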
Standards application and development plan for solar thermal technologies
NASA Astrophysics Data System (ADS)
Cobb, H. R. W.
1981-07-01
Functional and standards matrices are presented; they were developed from input from solar thermal (ST) users and from industry, and will be continually reviewed and updated as commercial aspects develop. The matrices highlight codes, standards, test methods, functions, and definitions that need to be developed. These will be submitted through ANSI for development by national consensus bodies. A contingency action is proposed for standards development if specific input is lacking at the committee level or if early development of a standard would hasten commercialization or gain needed jurisdictional acceptance.
High-definition video display based on the FPGA and THS8200
NASA Astrophysics Data System (ADS)
Qian, Jia; Sui, Xiubao
2014-11-01
This paper presents a high-definition video display solution based on an FPGA and the THS8200. The THS8200 is a video encoder chip from Texas Instruments (TI) with three 10-bit DAC channels; it accepts video data in both 4:2:2 and 4:4:4 formats, and its data synchronization can be driven either by the dedicated synchronization signals HSYNC and VSYNC or by the SAV/EAV codes embedded in the video stream. In this work, the FPGA generates the address and control signals used to access the data-storage array and produces the corresponding digital video signals YCbCr. These signals, combined with the HSYNC and VSYNC synchronization signals also generated by the FPGA, act as the input to the THS8200. To meet the bandwidth requirements of high-definition TV, video is input in the 4:2:2 format over a 2×10-bit interface. The THS8200 internal registers are set by the FPGA over the I2C bus; as a result, the chip generates synchronization signals that satisfy the SMPTE standard and converts the digital YCbCr video into analog YPbPr video. The composite analog output signals YPbPr thus consist of the image data and synchronization signals, which are superimposed inside the THS8200. The experimental results indicate that the method presented in this paper is a viable solution for high-definition video display and conforms to the input requirements of new high-definition display devices.
Weinberg, W A; McLean, A; Snider, R L; Rintelmann, J W; Brumback, R A
1989-12-01
Eight groups of learning disabled children (N = 100), categorized by the clinical Lexical Paradigm as good readers or poor readers, were individually administered the Gilmore Oral Reading Test, Form D, by one of four input/retrieval methods: (1) the standardized method of administration in which the child reads each paragraph aloud and then answers five questions relating to the paragraph [read/recall method]; (2) the child reads each paragraph aloud and then for each question selects the correct answer from among three choices read by the examiner [read/choice method]; (3) the examiner reads each paragraph aloud and reads each of the five questions to the child to answer [listen/recall method]; and (4) the examiner reads each paragraph aloud and then for each question reads three multiple-choice answers from which the child selects the correct answer [listen/choice method]. The major difference in scores was between the groups tested by the recall versus the orally read multiple-choice methods. This study indicated that poor readers who listened to the material and were tested by orally read multiple-choice format could perform as well as good readers. The performance of good readers was not affected by listening or by the method of testing. The multiple-choice testing improved the performance of poor readers independent of the input method. This supports the arguments made previously that a "bypass approach" to education of poor readers in which testing is accomplished using an orally read multiple-choice format can enhance the child's school performance on reading-related tasks. Using a listening while reading input method may further enhance performance.
Introducing Explorer of Taxon Concepts with a case study on spider measurement matrix building.
Cui, Hong; Xu, Dongfang; Chong, Steven S; Ramirez, Martin; Rodenhausen, Thomas; Macklin, James A; Ludäscher, Bertram; Morris, Robert A; Soto, Eduardo M; Koch, Nicolás Mongiardino
2016-11-17
Taxonomic descriptions are traditionally composed in natural language and published in a format that cannot be directly used by computers. The Exploring Taxon Concepts (ETC) project has been developing a set of web-based software tools that convert morphological descriptions published in telegraphic style to character data that can be reused and repurposed. This paper introduces the first semi-automated pipeline, to our knowledge, that converts morphological descriptions into taxon-character matrices to support systematics and evolutionary biology research. We then demonstrate and evaluate the use of the ETC Input Creation - Text Capture - Matrix Generation pipeline to generate body part measurement matrices from a set of 188 spider morphological descriptions and report the findings. From the given set of spider taxonomic publications, two versions of input (original and normalized) were generated and used by the ETC Text Capture and ETC Matrix Generation tools. The tools produced two corresponding spider body part measurement matrices, and the matrix from the normalized input was found to be much more similar to a gold standard matrix hand-curated by the scientist co-authors. Special conventions utilized in the original descriptions (e.g., the omission of measurement units) were attributed to the lower performance of using the original input. The results show that simple normalization of the description text greatly increased the quality of the machine-generated matrix and reduced edit effort. The machine-generated matrix also helped identify issues in the gold standard matrix. ETC Text Capture and ETC Matrix Generation are low-barrier and effective tools for extracting measurement values from spider taxonomic descriptions and are more effective when the descriptions are self-contained. Special conventions that make the description text less self-contained challenge automated extraction of data from biodiversity descriptions and hinder the automated reuse of the published knowledge. The tools will be updated to support new requirements revealed in this case study.
LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience
NASA Astrophysics Data System (ADS)
Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.
2016-12-01
CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible, but also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/ lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Phillips, T. A.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
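The training loop NETS describes (present input/output pairs, back-propagate the error, repeat until an acceptable error is reached) is the standard back propagation recipe. The short sketch below shows that recipe for a single hidden layer, written in Python rather than the ANSI C of NETS; the layer sizes, learning rate, and sigmoid activation are chosen purely for illustration.

```python
# Illustrative back propagation loop in the spirit of NETS (not NETS itself):
# one input layer, one hidden layer, one output layer, trained on input/output pairs.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def add_bias(a):
    return np.hstack([a, np.ones((a.shape[0], 1))])   # bias value as an extra input

def train(pairs, n_hidden=8, lr=0.5, epochs=20000, target_error=1e-3):
    X = add_bias(np.array([p for p, _ in pairs], dtype=float))   # environment stimuli + bias
    Y = np.array([t for _, t in pairs], dtype=float)             # desired responses
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden + 1, Y.shape[1]))
    for _ in range(epochs):
        H = add_bias(sigmoid(X @ W1))          # hidden layer activations (+ bias)
        O = sigmoid(H @ W2)                    # network response
        err = Y - O
        if np.mean(err ** 2) < target_error:
            break                              # acceptable error reached
        dO = err * O * (1 - O)                 # output-layer delta
        dH = (dO @ W2[:-1].T) * H[:, :-1] * (1 - H[:, :-1])   # back-propagated hidden delta
        W2 += lr * H.T @ dO                    # weight updates
        W1 += lr * X.T @ dH
    return W1, W2

# Example: learn XOR from input/output pairs.
W1, W2 = train([([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])])
```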
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)
NASA Technical Reports Server (NTRS)
Baffes, P. T.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
40 CFR Table 1 to Subpart Ddddd of... - Emission Limits and Work Practice Standards
Code of Federal Regulations, 2011 CFR
2011-07-01
Table excerpt: emission limits include hydrogen chloride at 0.02 lb per MMBtu of heat input and mercury at 0.000003 lb per MMBtu of heat input; one entry also lists an alternative limit of 0.0003 lb per MMBtu of heat input.
40 CFR Table 1 to Subpart Ddddd of... - Emission Limits and Work Practice Standards
Code of Federal Regulations, 2010 CFR
2010-07-01
Table excerpt: emission limits include hydrogen chloride at 0.02 lb per MMBtu of heat input and mercury at 0.000003 lb per MMBtu of heat input; one entry also lists an alternative limit of 0.0003 lb per MMBtu of heat input.
40 CFR Table 1 to Subpart Ddddd of... - Emission Limits and Work Practice Standards
Code of Federal Regulations, 2012 CFR
2012-07-01
Table excerpt: emission limits include hydrogen chloride at 0.02 lb per MMBtu of heat input and mercury at 0.000003 lb per MMBtu of heat input; one entry also lists an alternative limit of 0.0003 lb per MMBtu of heat input.
The Aegean Sea marine security decision support system
NASA Astrophysics Data System (ADS)
Perivoliotis, L.; Krokos, G.; Nittis, K.; Korres, G.
2011-05-01
As part of the integrated ECOOP (European Coastal Sea Operational observing and Forecasting System) project, HCMR upgraded the already existing standalone Oil Spill Forecasting System for the Aegean Sea, initially developed for the Greek Operational Oceanography System (POSEIDON), into an active element of the European Decision Support System (EuroDeSS). The system is accessible through a user friendly web interface where the case scenarios can be fed into the oil spill drift model component, while the synthetic output contains detailed information about the distribution of oil spill particles and the oil spill budget and it is provided both in text based ECOOP common output format and as a series of sequential graphics. The main development steps that were necessary for this transition were the modification of the forcing input data module in order to allow the import of other system products which are usually provided in standard formats such as NetCDF and the transformation of the model's calculation routines to allow use of current, density and diffusivities data in z instead of sigma coordinates. During the implementation of the Aegean DeSS, the system was used in operational mode in order to support the Greek marine authorities in handling a real accident that took place in the North Aegean area. Furthermore, the introduction of common input and output files by all the partners of EuroDeSS extended the system's interoperability thus facilitating data exchanges and comparison experiments.
The Aegean sea marine security decision support system
NASA Astrophysics Data System (ADS)
Perivoliotis, L.; Krokos, G.; Nittis, K.; Korres, G.
2011-10-01
As part of the integrated ECOOP (European Coastal Sea Operational observing and Forecasting System) project, HCMR upgraded the already existing standalone Oil Spill Forecasting System for the Aegean Sea, initially developed for the Greek Operational Oceanography System (POSEIDON), into an active element of the European Decision Support System (EuroDeSS). The system is accessible through a user friendly web interface where the case scenarios can be fed into the oil spill drift model component, while the synthetic output contains detailed information about the distribution of oil spill particles and the oil spill budget and it is provided both in text based ECOOP common output format and as a series of sequential graphics. The main development steps that were necessary for this transition were the modification of the forcing input data module in order to allow the import of other system products which are usually provided in standard formats such as NetCDF and the transformation of the model's calculation routines to allow use of current, density and diffusivities data in z instead of sigma coordinates. During the implementation of the Aegean DeSS, the system was used in operational mode in order to support the Greek marine authorities in handling a real accident that took place in North Aegean area. Furthermore, the introduction of common input and output files by all the partners of EuroDeSS extended the system's interoperability thus facilitating data exchanges and comparison experiments.
EnviroNET: On-line information for LDEF
NASA Technical Reports Server (NTRS)
Lauriente, Michael
1993-01-01
EnviroNET is an on-line, free-form database intended to provide a centralized repository for a wide range of technical information on environmentally induced interactions of use to Space Shuttle customers and spacecraft designers. It provides a user-friendly, menu-driven format on networks that are connected globally and is available twenty-four hours a day - every day. The information, updated regularly, includes expository text, tabular numerical data, charts and graphs, and models. The system pools space data collected over the years by NASA, USAF, other government research facilities, industry, universities, and the European Space Agency. The models accept parameter input from the user, then calculate and display the derived values corresponding to that input. In addition to the archive, interactive graphics programs are also available on space debris, the neutral atmosphere, radiation, magnetic fields, and the ionosphere. A user-friendly, informative interface is standard for all the models and includes a pop-up help window with information on inputs, outputs, and caveats. The system will eventually simplify mission analysis with analytical tools and deliver solutions for computationally intense graphical applications to do 'What if...' scenarios. A proposed plan for developing a repository of information from the Long Duration Exposure Facility (LDEF) for a user group is presented.
SLAM, a Mathematica interface for SUSY spectrum generators
NASA Astrophysics Data System (ADS)
Marquard, Peter; Zerf, Nikolai
2014-03-01
We present and publish a Mathematica package, which can be used to automatically obtain any numerical MSSM input parameter from SUSY spectrum generators, which follow the SLHA standard, like SPheno, SOFTSUSY, SuSeFLAV or Suspect. The package enables a very comfortable way of numerical evaluations within the MSSM using Mathematica. It implements easy to use predefined high scale and low scale scenarios like mSUGRA or mhmax and if needed enables the user to directly specify the input required by the spectrum generators. In addition it supports an automatic saving and loading of SUSY spectra to and from a SQL data base, avoiding the rerun of a spectrum generator for a known spectrum. Catalogue identifier: AERX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERX_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4387 No. of bytes in distributed program, including test data, etc.: 37748 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer where Mathematica version 6 or higher is running providing bash and sed. Operating system: Linux. Classification: 11.1. External routines: A SUSY spectrum generator such as SPheno, SOFTSUSY, SuSeFLAV or SUSPECT Nature of problem: Interfacing published spectrum generators for automated creation, saving and loading of SUSY particle spectra. Solution method: SLAM automatically writes/reads SLHA spectrum generator input/output and is able to save/load generated data in/from a data base. Restrictions: No general restrictions, specific restrictions are given in the manuscript. Running time: A single spectrum calculation takes much less than one second on a modern PC.
Predictive thermodynamics for ionic solids and liquids.
Glasser, Leslie; Jenkins, H Donald Brooke
2016-08-21
The application of thermodynamics is simple, even if the theory may appear intimidating. We describe tools, developed over recent years, which make it easy to estimate often elusive thermodynamic parameter values, generally (but not exclusively) for ionic materials, both solid and liquid, as well as for their solid hydrates and solvates. The tools are termed volume-based thermodynamics (VBT) and thermodynamic difference rules (TDR), supplemented by the simple salt approximation (SSA) and single-ion values for volume, Vm, heat capacity, Cp, entropy, S°, formation enthalpy, ΔfH°, and Gibbs formation energy, ΔfG°. These tools can be applied to provide values of thermodynamic and thermomechanical properties such as standard enthalpy of formation, ΔfH°, standard entropy, S°, heat capacity, Cp, Gibbs function of formation, ΔfG°, lattice potential energy, UPOT, isothermal expansion coefficient, α, and isothermal compressibility, β, and used to suggest the thermodynamic feasibility of reactions among condensed ionic phases. Because many of these methods yield results largely independent of crystal structure, they have been successfully extended to the important and developing class of ionic liquids as well as to new and hypothesised materials. Finally, these predictive methods are illustrated by application to K2SnCl6, for which known experimental results are available for comparison. A selection of applications of VBT and TDR is presented which have enabled input, usually in the form of thermodynamics, to be brought to bear on a range of topical problems. Perhaps the most significant advantage of VBT and TDR methods is their inherent simplicity in that they do not require a high level of computational expertise nor expensive high-performance computation tools (a spreadsheet will usually suffice), yet the techniques are extremely powerful and accessible to non-experts. The connection between formula unit volume, Vm, and standard thermodynamic parameters represents a major advance exploited by these techniques.
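As a loose illustration of the volume-based thermodynamics idea (not code from the paper), the sketch below evaluates the commonly cited VBT lattice-energy correlation UPOT = 2I(α/Vm^(1/3) + β), in which Vm is the formula-unit volume, I is the lattice ionic-strength factor, and α, β are stoichiometry-dependent fitted constants taken from the VBT literature; the constants are deliberately left as explicit parameters rather than hard-coded values.

```python
# Minimal sketch of a volume-based thermodynamics (VBT) estimate, assuming the
# Glasser-Jenkins form U_POT = 2*I*(alpha / Vm**(1/3) + beta).  The fitted constants
# alpha and beta depend on salt stoichiometry and must be taken from the VBT
# literature; they are inputs here, not values asserted by this sketch.

def lattice_energy_vbt(vm_nm3: float, ionic_strength: float,
                       alpha: float, beta: float) -> float:
    """Estimate lattice potential energy U_POT (kJ/mol).

    vm_nm3         -- formula-unit volume Vm in nm^3 (from density or ion-volume sums)
    ionic_strength -- lattice ionic-strength factor I = 0.5 * sum(n_i * z_i**2)
    alpha, beta    -- fitted VBT constants in units consistent with Vm in nm^3
    """
    return 2.0 * ionic_strength * (alpha / vm_nm3 ** (1.0 / 3.0) + beta)


def ionic_strength(charges_and_counts):
    """I = 0.5 * sum(n_i * z_i**2) over the ions in one formula unit,
    e.g. K2SnCl6 -> [(+1, 2), (+4, 1), (-1, 6)]."""
    return 0.5 * sum(n * z * z for z, n in charges_and_counts)
```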
Method and Apparatus for Reducing the Vulnerability of Latches to Single Event Upsets
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr. (Inventor)
2002-01-01
A delay circuit includes a first network having an input and an output node, a second network having an input and an output, the input of the second network being coupled to the output node of the first network. The first network and the second network are configured such that: a glitch at the input to the first network having a length of approximately one-half of a standard glitch time or less does not cause the voltage at the output of the second network to cross a threshold, a glitch at the input to the first network having a length of between approximately one-half and two standard glitch times causes the voltage at the output of the second network to cross the threshold for less than the length of the glitch, and a glitch at the input to the first network having a length of greater than approximately two standard glitch times causes the voltage at the output of the second network to cross the threshold for approximately the time of the glitch. The method reduces the vulnerability of a latch to single event upsets. The latch includes a gate having an input and an output and a feedback path from the output to the input of the gate. The method includes inserting a delay into the feedback path and providing a delay in the gate.
ERIC Educational Resources Information Center
Silver, Steven S.
FMS/3 is a system for producing hard copy documentation at high speed from free format text and command input. The system was originally written in assembler language for a 12K IBM 360 model 20 using a high speed 1403 printer with the UCS-TN chain option (upper and lower case). Input was from an IBM 2560 Multi-function Card Machine. The model 20…
NLEdit: A generic graphical user interface for Fortran programs
NASA Technical Reports Server (NTRS)
Curlett, Brian P.
1994-01-01
NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
ProMC: Input-output data format for HEP applications using varint encoding
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.
2014-10-01
A new data format for Monte Carlo (MC) events, or any structural data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in the Google's Protocol Buffers package. This approach is implemented in the PROMC library which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description and random access. Data stored in PROMC files can be written, read and manipulated in a number of programming languages, such C++, JAVA, FORTRAN and PYTHON.
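The variable-size integer ("varint") encoding mentioned above is the base-128 scheme used by Google's Protocol Buffers: each byte carries seven payload bits plus a continuation flag, so small integers occupy a single byte. A minimal self-contained illustration (not code from the ProMC library) follows.

```python
# Base-128 varint encode/decode, the compaction trick behind varint-based record formats.
# Small non-negative integers (the common case for fields stored as scaled integers)
# take only one or two bytes.

def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer; low 7 bits per byte, MSB = 'more bytes follow'."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        out.append(byte | (0x80 if value else 0x00))
        if not value:
            return bytes(out)

def decode_varint(data: bytes, pos: int = 0):
    """Decode one varint starting at pos; return (value, next position)."""
    result, shift = 0, 0
    while True:
        byte = data[pos]
        result |= (byte & 0x7F) << shift
        pos += 1
        if not byte & 0x80:
            return result, pos
        shift += 7

assert encode_varint(300) == b"\xac\x02"
assert decode_varint(encode_varint(300))[0] == 300
```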
Sensory and short-term memory formations observed in a Ag2S gap-type atomic switch
NASA Astrophysics Data System (ADS)
Ohno, Takeo; Hasegawa, Tsuyoshi; Nayak, Alpana; Tsuruoka, Tohru; Gimzewski, James K.; Aono, Masakazu
2011-11-01
Memorization caused by the change in conductance in a Ag2S gap-type atomic switch was investigated as a function of the amplitude and width of input voltage pulses (Vin). The conductance changed little for the first few Vin, but the information of the input was stored as a redistribution of Ag-ions in the Ag2S, indicating the formation of sensory memory. After a certain number of Vin, the conductance increased abruptly followed by a gradual decrease, indicating the formation of short-term memory (STM). We found that the probability of STM formation depends strongly on the amplitude and width of Vin, which resembles the learning behavior of the human brain.
75 FR 42819 - Airborne Area Navigation Equipment Using Loran-C Inputs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-22
... Using Loran-C Inputs AGENCY: Federal Aviation Administration (FAA), DOT ACTION: Notice of cancellation of: (1) Loran-C navigation system Technical Standard Orders (TSO); and (2) the revocation of Loran-C... the cancellation of Technical Standard Order (TSO) C-60, Airborne Area Navigation Equipment Using...
75 FR 22674 - Airborne Area Navigation Equipment Using Loran-C Inputs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... Using Loran-C Inputs AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of cancellation of: (1) Loran-C navigation system Technical Standard Orders (TSO); and (2) the revocation of Loran-C... the cancellation of Technical Standard Order (TSO) C-60, Airborne Area Navigation Equipment Using...
High power, high efficiency, continuous-wave supercontinuum generation using standard telecom fibers
NASA Astrophysics Data System (ADS)
Arun, S.; Choudhury, Vishal; Balaswamy, V.; Prakash, Roopa; Supradeepa, V. R.
2018-04-01
We demonstrate a simple module for octave-spanning continuous-wave supercontinuum generation using standard telecom fiber. This module can accept any high-power Ytterbium-doped fiber laser as input. The input light is transferred into the anomalous dispersion region of the telecom fiber through a cascade of Raman shifts. A recently proposed Raman laser architecture with distributed feedback efficiently performs these Raman conversions. A spectrum spanning over 1000 nm (>1 octave), from 880 to 1900 nm, is demonstrated. The average power from the supercontinuum is ~34 W, with a high conversion efficiency of 44%. Input wavelength agility is demonstrated, with similar supercontinua over a wide input wavelength range.
NBS computerized carpool matching system: users' guide. Final technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilsinn, J.F.; Landau, S.
1974-12-01
The report includes flowcharts, input/output formats, and program listings for the programs, plus details of the manual process for coordinate coding. The matching program produces, for each person desiring it, a list of others residing within a pre-specified distance of him, and is thus applicable to a single work destination having primarily one work schedule. The system is currently operational on the National Bureau of Standards' UNIVAC 1108 computer and was run in March of 1974, producing lists for about 950 employees in less than four minutes computer time. Subsequent maintenance of the system will be carried out by the NBS Management and Organization Division. (GRA)
Sentiment analysis of feature ranking methods for classification accuracy
NASA Astrophysics Data System (ADS)
Joseph, Shashank; Mugauri, Calvin; Sumathy, S.
2017-11-01
Text pre-processing and feature selection are important and critical steps in text mining. Text pre-processing of large volumes of datasets is a difficult task, as unstructured raw data is converted into a structured format. Traditional methods of processing and weighting took much time and were less accurate. To overcome this challenge, feature ranking techniques have been devised. A feature set from text pre-processing is fed as input for feature selection. Feature selection helps improve text classification accuracy. Of the three feature selection categories available, the filter category is the focus here. Five feature ranking methods, namely document frequency, standard deviation, information gain, chi-square, and weighted log-likelihood ratio, are analyzed.
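As a hedged sketch of how one of the listed filter methods could be applied (the abstract does not give the authors' implementation), the snippet below ranks bag-of-words features by the chi-square statistic using scikit-learn; the toy corpus, labels, and parameter choices are purely illustrative.

```python
# Illustrative chi-square feature ranking for text classification (a filter method).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import chi2

docs = ["great phone, love the battery",      # toy corpus (illustrative only)
        "terrible battery, waste of money",
        "love this camera, great value",
        "screen broke, terrible quality"]
labels = [1, 0, 1, 0]                          # 1 = positive, 0 = negative sentiment

vec = CountVectorizer()
X = vec.fit_transform(docs)                    # document-term matrix
scores, _ = chi2(X, labels)                    # chi-square score per feature

ranking = np.argsort(scores)[::-1]             # highest-scoring features first
for idx in ranking[:5]:
    print(vec.get_feature_names_out()[idx], round(float(scores[idx]), 3))
```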
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed together with the new input options. Examples are given of employment of the code in internal and external dosimetry and comparisons with results from other groups are reported.
Mental Verb Input for Promoting Children's Theory of Mind: A Training Study
ERIC Educational Resources Information Center
Gola, Alice Ann Howard
2012-01-01
An experimental study investigated the effect of the type of mental verb input (i.e., input with "think", "know", and "remember") on preschoolers' theory of mind development. Preschoolers (n = 72) heard 128 mental verb utterances presented in video format across four sessions over two weeks. The training conditions differed only in the way the…
NASA Technical Reports Server (NTRS)
1982-01-01
Personal data input, decompression data, nitrogen washout, nitrogen data, and update computer programs are described. Input data and formats; program output, reports, and data; program flowcharts; program listings; sample runs with input and output pages; hardware operation; and engineering data are provided.
Designs for Risk Evaluation and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy's National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user's manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
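The simulated annealing search described above follows the usual accept/reject recipe: mutate the current monitoring scheme, keep it if it detects leakage sooner, and occasionally keep a worse scheme with a probability that shrinks as the "temperature" is lowered. The generic sketch below (not DREAM code; the mutate and time-to-detection functions are placeholders) shows that loop.

```python
# Generic simulated annealing loop of the kind DREAM's optimizer describes.
# `mutate` and `time_to_first_detection` are problem-specific placeholders here.
import math
import random

def anneal(initial_scheme, mutate, time_to_first_detection,
           t_start=1.0, t_end=1e-3, cooling=0.995, rng=random.Random(0)):
    current = initial_scheme
    current_cost = time_to_first_detection(current)   # objective: detect leaks sooner
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        candidate = mutate(current, rng)               # perturb monitoring locations/parameters
        cost = time_to_first_detection(candidate)
        # Accept improvements always; accept worse schemes with Boltzmann probability.
        if cost < current_cost or rng.random() < math.exp((current_cost - cost) / t):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate, cost
        t *= cooling                                   # lower the temperature
    return best, best_cost
```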
Extra Dimensions: 3D and Time in PDF Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, Norman A.; /SLAC
2011-11-10
High energy physics is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide audience. In this talk, we present examples of HEP applications which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input. Using this technique, higher dimensional data, such as LEGO plots or time-dependent information can be included in PDF files. In principle, a complete event display, with full interactivity, can be incorporated into a PDF file. This would allow the end user not only to customize the view and representation of the data, but to access the underlying data itself.
CUMBIN - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
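CUMBIN's core quantity, the probability that at least k of n independent components operate when each has reliability p, is a cumulative binomial sum. A small Python sketch of that calculation, independent of the original C implementation, is shown below.

```python
# Reliability of a k-out-of-n system with independent components of reliability p:
# P(at least k of n operate) = sum_{i=k}^{n} C(n, i) * p**i * (1 - p)**(n - i)
from math import comb

def k_out_of_n_reliability(k: int, n: int, p: float) -> float:
    return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

# Example: probability that at least 3 of 5 components work when each works with p = 0.9.
print(k_out_of_n_reliability(3, 5, 0.9))   # about 0.99144
```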
NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.
Zhang, Bo; Dai, Ji; Zhang, Tao
2017-11-13
In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.
User's Guide for the Updated EST/BEST Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This User's Guide describes the structure of the IPACS input file that reflects the modularity of each module. The structured format helps the user locate specific input data and manually enter or edit it. The IPACS input file can have any user-specified filename, but must have a DAT extension. The input file may consist of up to six input data blocks; the data blocks must be separated by delimiters beginning with the $ character. If multiple sections are desired, they must be arranged in the order listed.
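A minimal sketch of how a program might split such a .DAT input file into its $-delimited data blocks is shown below; the delimiter convention is taken from the description above, while the file name and block handling are illustrative assumptions.

```python
# Split an IPACS-style .DAT input file into data blocks.  Per the description above,
# each block begins with a delimiter line starting with '$'; everything up to the
# next delimiter belongs to that block.  File name and downstream use are illustrative.

def read_blocks(path):
    blocks, name, lines = {}, None, []
    with open(path) as fh:
        for raw in fh:
            line = raw.rstrip("\n")
            if line.lstrip().startswith("$"):          # delimiter opens a new block
                if name is not None:
                    blocks[name] = lines
                name, lines = line.strip(), []
            elif name is not None:
                lines.append(line)
        if name is not None:
            blocks[name] = lines
    return blocks

# Example (hypothetical file name):
# for delimiter, body in read_blocks("ipacs_case.DAT").items():
#     print(delimiter, len(body), "lines")
```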
User Manual for the PROTEUS Mesh Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Micheal A.; Shemon, Emily R
2016-09-19
PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows for simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given that one has an input mesh format acceptable for PROTEUS, we have constructed several tools which allow further mesh and geometry construction (i.e., mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and the MT_RadialLattice.x codes. The former allows the conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific to a given mesh tool (such as .axial or .merge) can be used as "mesh" input for any of the mesh tools discussed in this manual.
Banning standard cell engineering notebook
NASA Technical Reports Server (NTRS)
1976-01-01
A family of standardized thick-oxide P-MOS building blocks (standard cells) is described. The information is presented in a form useful for system design, logic design, and the preparation of inputs to both sets of Design Automation programs for array design and analysis. A data sheet is provided for each cell and gives the cell name, the cell number, its logic symbol, Boolean equation, truth table, circuit schematic, circuit composite, input-output capacitances, and revision date. The circuit type file, also given for each cell, together with the logic drawing contained on the data sheet, provides all the information required to prepare input data files for the Design Automation Systems. A detailed description of the electrical design procedure is included.
Addendum I, BIOPLUME III Graphics Conversion to SURFER Format
This procedure can be used to create a SURFER® compatible grid file from Bioplume III input and output graphics. The input data and results from Bioplume III can be contoured and printed directly from SURFER.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--control module documentation; Volume 2--functional module documentation; and Volume 3--documentation of the data libraries and subroutine libraries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files, and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
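The following sketch illustrates, under assumptions, the kind of conversion PIDG automates: tabular generator data read from a CSV source and written to an Excel workbook for import into PLEXOS. The file names, column contents, and sheet name are placeholders rather than PIDG's actual schema; pandas with an Excel writer backend such as openpyxl is assumed to be installed.

    import pandas as pd

    # Read generator records from a CSV source (e.g. columns: name, capacity_mw, heat_rate).
    generators = pd.read_csv("generators.csv")

    # Write them to a sheet that a PLEXOS import could pick up.
    with pd.ExcelWriter("plexos_input.xlsx") as writer:
        generators.to_excel(writer, sheet_name="Generators", index=False)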
CARE3MENU- A CARE III USER FRIENDLY INTERFACE
NASA Technical Reports Server (NTRS)
Pierce, J. L.
1994-01-01
CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of the number of modules, the minimum number of modules for stage operation, and the critical fault threshold. The fault handling and fault occurrence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run-time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: 1) a user's guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and 2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.
PS2-06: Best Practices for Advancing Multi-site Chart Abstraction Research
Blick, Noelle; Cole, Deanna; King, Colleen; Riordan, Rick; Von Worley, Ann; Yarbro, Patty
2012-01-01
Background/Aims Multi-site chart abstraction studies are becoming increasingly common within the HMORN. Differences in systems among HMORN sites can pose significant obstacles to the success of these studies. It is therefore crucial to standardize abstraction activities by following best practices for multi-site chart abstraction, as consistency of processes across sites will increase efficiencies and enhance data quality. Methods Over the past few months the authors have been meeting to identify obstacles to multi-site chart abstraction and to address ways in which multi-site chart abstraction processes can be systemized and standardized. The aim of this workgroup is to create a best practice guide for multi-site chart abstraction studies. Focus areas include: abstractor training, format for chart abstraction (database, paper, etc), data quality, redaction, mechanism for transferring data, site specific access to medical records, IRB/HIPAA concerns, and budgetary issues. Results The results of the workgroup’s efforts (the best practice guide) will be presented by a panel of experts at the 2012 HMORN conference. The presentation format will also focus on discussion among attendees to elicit further input and to identify areas that need to be further addressed. Subsequently, the best practice guide will be posted on the HMORN website. Discussion The best practice guide for multi-site chart abstraction studies will establish sound guidelines and serve as an aid to researchers embarking on multi-site chart abstraction studies. Efficiencies and data quality will be further enhanced with standardized multi-site chart abstraction practices.
A survey on annotation tools for the biomedical literature.
Neves, Mariana; Leser, Ulf
2014-03-01
New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and for better understanding, by means of examples, the information being sought. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation has led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools which implement some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experience whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a true comprehensive solution.
Jin, Ling; Tonse, Shaheen; Cohan, Daniel S; Mao, Xiaoling; Harley, Robert A; Brown, Nancy J
2008-05-15
We developed a first- and second-order sensitivity analysis approach with the decoupled direct method to examine spatial and temporal variations of ozone-limiting reagents and the importance of local vs upwind emission sources in the San Joaquin Valley of central California for a 5 day ozone episode (Jul 29th to Aug 3rd, 2000). Despite considerable spatial variations, nitrogen oxides (NO(x)) emission reductions are overall more effective than volatile organic compound (VOC) control for attaining the 8 h ozone standard in this region for this episode, in contrast to the VOC control that works better for attaining the prior 1 h ozone standard. Interbasin source contributions of NO(x) emissions are limited to the northern part of the SJV, while anthropogenic VOC (AVOC) emissions, especially those emitted at night, influence ozone formation in the SJV further downwind. Among model input parameters studied here, uncertainties in emissions of NO(x) and AVOC, and the rate coefficient of the OH + NO2 termination reaction, have the greatest effect on first-order ozone responses to changes in NO(x) emissions. Uncertainties in biogenic VOC emissions only have a modest effect because they are generally not collocated with anthropogenic sources in this region.
Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi
2015-01-01
Measurement of arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed 18F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of AIF in 11 rats. The standard arterial input function was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIFNS) and (2) SIF calibrated by a single blood sampling as proposed previously (EIF1S). No significant differences in area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIFNS-, and EIF1S-based methods using repeated measures analysis of variance. In the correlation analysis, AUC or CMRGlc derived from EIFNS was highly correlated with those derived from AIF and EIF1S. A preliminary comparison between AIF and EIFNS in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIFNS method might serve as a noninvasive substitute for individual AIF measurement. PMID:25966947
Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi
2015-10-01
Measurement of arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed (18)F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of AIF in 11 rats. The standard arterial input function was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIF(NS)) and (2) SIF calibrated by a single blood sampling as proposed previously (EIF(1S)). No significant differences in area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIF(NS)-, and EIF(1S)-based methods using repeated measures analysis of variance. In the correlation analysis, AUC or CMRGlc derived from EIF(NS) was highly correlated with those derived from AIF and EIF(1S). A preliminary comparison between AIF and EIF(NS) in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIF(NS) method might serve as a noninvasive substitute for individual AIF measurement.
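The normalization underlying the SIF/EIF method can be sketched as follows. The convention assumed here, that each reference AIF is divided by its injected dose per body mass before averaging and that the individual estimate rescales the SIF by the individual's ID/BM, is an illustrative reading of the abstract, not the authors' published code; array shapes and variable names are likewise illustrative.

    import numpy as np

    def standard_input_function(aifs, bms, ids):
        """Average reference AIFs after normalizing each by injected dose per body mass.
        aifs: (n_rats, n_timepoints) activity curves; bms, ids: per-rat scalars."""
        normalized = aifs / (np.asarray(ids) / np.asarray(bms))[:, None]
        return normalized.mean(axis=0)

    def estimated_individual_input_function(sif, bm, injected_dose):
        """Rescale the SIF with an individual's body mass and injected dose (EIF_NS)."""
        return sif * (injected_dose / bm)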
NASA Technical Reports Server (NTRS)
Carlson, C. R.
1981-01-01
The user documentation of the SYSGEN model and its links with other simulations is described. SYSGEN is a production costing and reliability model of electric utility systems. Hydroelectric, storage, and time-dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and report formats are explained. SYSGEN can also be run interactively by using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables designed for use with FEPS is presented.
River Basin Standards Interoperability Pilot
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Masó, Joan; Stasch, Christoph
2016-04-01
There are many water-related information resources and tools in Europe that can be applied to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps:
- Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice).
- Flood modelling using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
- Evaluation of the applicability of Sensor Notification Services in water emergencies.
- Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS).
The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data is translated into the OGC WaterML 2.0 time series format and ingested into a SOS 2.0 service. The SOS data is visualized in a SOS client that is able to handle time series. The meteorological forecast data, together with the WaterML 2.0 time series and terrain data, is fed (under the supervision of an operator manipulating the WPS user interface) into a flood modelling algorithm. The WPS is able to produce flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored from an emergency control response service.
Acronyms: AS: Alert Service; ES: Event Service; ICT: Information and Communication Technology; NS: Notification Service; OGC: Open Geospatial Consortium; RIBASE: River Basin Standards Interoperability Pilot; SOS: Sensor Observation Service; WaterML: Water Markup Language; WCS: Web Coverage Service; WMS: Web Map Service; WPS: Web Processing Service
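A minimal sketch of the first step, retrieving gauge observations from an SOS 2.0 endpoint via its key-value-pair binding, is shown below. The endpoint URL, offering, and observedProperty identifiers are placeholders; the parameter names follow the SOS 2.0 KVP convention but should be checked against the target service's capabilities document.

    import requests

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "water_level_gauges",      # placeholder identifier
        "observedProperty": "waterLevel",      # placeholder identifier
        "temporalFilter": "om:phenomenonTime,2016-01-01T00:00:00Z/2016-01-07T00:00:00Z",
        "responseFormat": "http://www.opengis.net/waterml/2.0",
    }
    response = requests.get("https://example.org/sos/kvp", params=params, timeout=30)
    response.raise_for_status()
    waterml_xml = response.text  # WaterML 2.0 time series, ready for further processing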
Arun, S; Choudhury, Vishal; Balaswamy, V; Prakash, Roopa; Supradeepa, V R
2018-04-02
We demonstrate a simple module for octave-spanning continuous-wave supercontinuum generation using standard telecom fiber. This module can accept any high-power ytterbium-doped fiber laser as input. The input light is transferred into the anomalous dispersion region of the telecom fiber through a cascade of Raman shifts. A recently proposed Raman laser architecture with distributed feedback efficiently performs these Raman conversions. A spectrum spanning over 1000 nm (>1 octave), from 880 to 1900 nm, is demonstrated. The average power from the supercontinuum is ~34 W with a high conversion efficiency of 44%. Input wavelength agility is demonstrated, with similar supercontinua over a wide input wavelength range.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...
Drenth, B.J.; Grauch, V.J.S.; Bankey, Viki; New Sense Geophysics, Ltd.
2009-01-01
This report contains digital data, image files, and text files describing data formats and survey procedures for two high-resolution aeromagnetic surveys in south-central Colorado: one in the eastern San Luis Valley, Alamosa and Saguache Counties, and the other in the southern Upper Arkansas Valley, Chaffee County. In the San Luis Valley, the Great Sand Dunes survey covers a large part of Great Sand Dunes National Park and Preserve and extends south along the mountain front to the foot of Mount Blanca. In the Upper Arkansas Valley, the Poncha Springs survey covers the town of Poncha Springs and vicinity. The digital files include grids, images, and flight-line data. Several derivative products from these data are also presented as grids and images, including two grids of reduced-to-pole aeromagnetic data and data continued to a reference surface. Images are presented in various formats and are intended to be used as input to geographic information systems, standard graphics software, or map plotting packages.
Image recording requirements for earth observation applications in the next decade
NASA Technical Reports Server (NTRS)
Peavey, B.; Sos, J. Y.
1975-01-01
Future requirements for satellite-borne image recording systems are examined from the standpoints of system performance, system operation, product type, and product quality. Emphasis is on total system design while keeping in mind that the image recorder or scanner is the most crucial element which will affect the end product quality more than any other element within the system. Consideration of total system design and implementation for sustained operational usage must encompass the requirements for flexibility of input data and recording speed, pixel density, aspect ratio, and format size. To produce this type of system requires solution of challenging problems in interfacing the data source with the recorder, maintaining synchronization between the data source and the recorder, and maintaining a consistent level of quality. Film products of better quality than is currently achieved in a routine manner are needed. A 0.1 pixel geometric accuracy and 0.0001 d.u. radiometric accuracy on standard (240 mm) size format should be accepted as a goal to be reached in the near future.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
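The design-structure-matrix idea that DeMAID builds on can be illustrated with a toy example (this is not DeMAID's own reordering algorithm): with dsm[i][j] = 1 meaning module i needs output from module j, any nonzero entry above the diagonal of the reordered matrix is a feedback link, i.e. an iterative loop the design manager would try to confine within a single circuit.

    import numpy as np

    # Toy dependency matrix: module 0 needs module 1; module 2 needs modules 0 and 1.
    dsm = np.array([[0, 1, 0],
                    [0, 0, 0],
                    [1, 1, 0]])
    order = [1, 0, 2]                        # candidate execution sequence
    reordered = dsm[np.ix_(order, order)]    # rows/columns permuted into that sequence
    feedbacks = int(np.triu(reordered, k=1).sum())
    print(f"feedback links for order {order}: {feedbacks}")  # -> 0, a feed-forward ordering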
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
Mattsson, Jonathan; Hedström, Annelie; Ashley, Richard M; Viklander, Maria
2015-09-15
Ever since the advent of major sewer construction in the 1850s, the issue of increased solids deposition in sewers due to changes in domestic wastewater inputs has been frequently debated. Three recent changes considered here are the introduction of kitchen sink food waste disposers (FWDs); rising levels of inputs of fat, oil and grease (FOG); and the installation of low-flush toilets (LFTs). In this review these changes have been examined with regard to potential solids depositional impacts on sewer systems and the managerial implications. The review indicates that each of the changes has the potential to cause an increase in solids deposition in sewers and this is likely to be more pronounced for the upstream reaches of networks that serve fewer households than the downstream parts and for specific sewer features such as sags. The review has highlighted the importance of educational campaigns directed to the public to mitigate deposition as many of the observed problems have been linked to domestic behaviour in regard to FOGs, FWDs and toilet flushing. A standardized monitoring procedure of repeat sewer blockage locations can also be a means to identify depositional hot-spots. Interactions between the various changes in inputs in the studies reviewed here indicated an increased potential for blockage formation, but this would need to be further substantiated. As the precise nature of these changes in inputs have been found to be variable, depending on lifestyles and type of installation, the additional problems that may arise pose particular challenges to sewer operators and managers because of the difficulty in generalizing the nature of the changes, particularly where retrofitting projects in households are being considered. The three types of changes to inputs reviewed here highlight the need to consider whether or not more or less solid waste from households should be diverted into sewers. Copyright © 2015 Elsevier Ltd. All rights reserved.
Because HSPF requires extensive input data, its Data-Formatting Tool (HDFT) allows users to format that data and import it to a WDM file. HDFT aids urban watershed modeling applications that use sub-hourly temporal resolutions.
Marshall, Deborah A; Douglas, Patrick R; Drummond, Michael F; Torrance, George W; Macleod, Stuart; Manti, Orlando; Cheruvu, Lokanadha; Corvari, Ron
2008-01-01
Until now, there has been no standardized method of performing and presenting budget impact analyses (BIAs) in Canada. Nevertheless, most drug plan managers have been requiring this economic data to inform drug reimbursement decisions. This paper describes the process used to develop the Canadian BIA Guidelines; describes the Guidelines themselves, including the model template; and compares this guidance with other guidance on BIAs. The intended audience includes those who develop, submit or use BIA models, and drug plan managers who evaluate BIA submissions. The Patented Medicine Prices Review Board (PMPRB) initiated the development of the Canadian BIA Guidelines on behalf of the National Prescription Drug Utilisation Information System (NPDUIS). The findings and recommendations from a needs assessment with respect to BIA submissions were reviewed to inform guideline development. In addition, a literature review was performed to identify existing BIA guidance. The detailed guidance was developed on this basis, and with the input of the NPDUIS Advisory Committee, including drug plan managers from multiple provinces in Canada and a representative from the Canadian Agency for Drugs and Technologies in Health. A Microsoft Excel-based interactive model template was designed to support BIA model development. Input regarding the guidelines and model template was sought from each NPDUIS Advisory Committee member to ensure compatibility with existing drug plan needs. Decisions were made by consensus through multiple rounds of review and discussion. Finally, BIA guidance in Canadian provinces and other countries were compared on the basis of multiple criteria. The BIA guidelines consist of three major sections: Analytic Framework, Inputs and Data Sources, and Reporting Format. The Analytic Framework section contains a discussion of nine general issues surrounding BIAs (model design, analytic perspective, time horizon, target population, costing, scenarios to be compared, the characterisation of uncertainty, discounting, and validation methods). The Inputs and Data Sources section addresses methods for market size estimation, comparator selection, scenario forecasting and drug price estimation. The Reporting Format section describes methods for BIA reporting. The new Canadian BIA Guidelines represent a significant departure from the limited guidance that was previously available from some of the provinces, because they include specific details of the methods of performing BIAs. The Canadian BIA Guidelines differ from the Principles of Good Research Practice for BIAs developed by the International Society for Pharmacoeconomic and Outcomes Research (ISPOR), which provide more general guidance. The Canadian BIA Guidelines and template build upon existing guidance to address the specific requirements of each of the participating drug plans in Canada. Both have been endorsed by the NPDUIS Steering Committee and the PMPRB for the standardization of BIA submissions.
MOVES2014 for Experienced Users, September 2014 Webinar Slides
This webinar assumes a basic knowledge of past versions of the MOtor Vehicle Emission Simulator (MOVES) and includes a demonstration of the conversion of MOVES2010b input files to MOVES2014 format, changes to the MOVES GUI, and new input options.
NASA Astrophysics Data System (ADS)
Liu, G.; Wu, C.; Li, X.; Song, P.
2013-12-01
The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases. Various models and vocabularies have been drafted and applied by industrial companies for urban geological data. Issues such as duplicate and ambiguous definitions of terms and differing coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standardized data storage. The overall purpose of this work is to set up a common data platform to provide an information sharing service. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. The underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with the national standards to build a mapping table. The attributes of the various urban geological data entity models are reduced to several categories according to their application phases and domains. Then a logical data model is set up as a standard format to design data file structures for a relational database. (2) A multi-level data dictionary for data standardization constraints. Three levels of data dictionary are designed: the model data dictionary is used to manage system database files and to enhance maintenance of the whole database system; the attribute dictionary organizes the fields used in database tables; and the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods. A comprehensive data dictionary manages system operation and security. (3) An extension of the system data management functions based on the data dictionary. The data item constraint input function makes use of the standard term and code dictionary to obtain standardized input. The attribute dictionary organizes all the fields of an urban geological information database to ensure consistent use of terms for fields. The model dictionary is used to generate a database operation interface automatically, with standard semantic content supplied via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System, South-East China, with satisfactory results.
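The term-and-code dictionary constraint described in items (2) and (3) can be sketched as a simple lookup that rejects free-text values not present in the standard vocabulary; the codes and terms below are invented placeholders rather than entries from GB 9649-88.

    # Hypothetical term-and-code dictionary (placeholder entries, not the national standard).
    TERM_CODE_DICT = {
        "silty clay": "SC01",
        "fine sand": "FS02",
        "mudstone": "MS03",
    }

    def standardize(term: str) -> str:
        """Return the standard code for a free-text term, or reject nonstandard input."""
        code = TERM_CODE_DICT.get(term.strip().lower())
        if code is None:
            raise ValueError(f"'{term}' is not in the standard dictionary; choose a listed term")
        return code

    print(standardize("Silty Clay"))  # -> SC01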
FRED 2: an immunoinformatics framework for Python
Schubert, Benjamin; Walzer, Mathias; Brachvogel, Hans-Philipp; Szolek, András; Mohr, Christopher; Kohlbacher, Oliver
2016-01-01
Summary: Immunoinformatics approaches are widely used in a variety of applications from basic immunological to applied biomedical research. Complex data integration is inevitable in immunological research and usually requires comprehensive pipelines including multiple tools and data sources. Non-standard input and output formats of immunoinformatics tools make the development of such applications difficult. Here we present FRED 2, an open-source immunoinformatics framework offering easy and unified access to methods for epitope prediction and other immunoinformatics applications. FRED 2 is implemented in Python and designed to be extendable and flexible to allow rapid prototyping of complex applications. Availability and implementation: FRED 2 is available at http://fred-2.github.io Contact: schubert@informatik.uni-tuebingen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153717
FRED 2: an immunoinformatics framework for Python.
Schubert, Benjamin; Walzer, Mathias; Brachvogel, Hans-Philipp; Szolek, András; Mohr, Christopher; Kohlbacher, Oliver
2016-07-01
Immunoinformatics approaches are widely used in a variety of applications from basic immunological to applied biomedical research. Complex data integration is inevitable in immunological research and usually requires comprehensive pipelines including multiple tools and data sources. Non-standard input and output formats of immunoinformatics tools make the development of such applications difficult. Here we present FRED 2, an open-source immunoinformatics framework offering easy and unified access to methods for epitope prediction and other immunoinformatics applications. FRED 2 is implemented in Python and designed to be extendable and flexible to allow rapid prototyping of complex applications. FRED 2 is available at http://fred-2.github.io. Contact: schubert@informatik.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Swade, Daryl; Bushouse, Howard; Greene, Gretchen; Swam, Michael
2014-07-01
Science data products for James Webb Space Telescope (JWST) observations will be generated by the Data Management Subsystem (DMS) within the JWST Science and Operations Center (S&OC) at the Space Telescope Science Institute (STScI). Data processing pipelines within the DMS will produce uncalibrated and calibrated exposure files, as well as higher level data products that result from combined exposures, such as mosaic images. Information to support the science observations, for example data from engineering telemetry, proposer inputs, and observation planning, will be captured and incorporated into the science data products. All files will be generated in Flexible Image Transport System (FITS) format. The data products will be made available through the Mikulski Archive for Space Telescopes (MAST) and adhere to International Virtual Observatory Alliance (IVOA) standard data protocols.
WCSTools 3.0: More Tools for Image Astrometry and Catalog Searching
NASA Astrophysics Data System (ADS)
Mink, Douglas J.
For five years, WCSTools has provided image astrometry for astronomers who need accurate positions for objects they wish to observe. Other functions have been added and improved since the package was first released. Support has been added for new catalogs, such as the GSC-ACT, 2MASS Point Source Catalog, and GSC II, as they have been published. A simple command line interface can search any supported catalog, returning information in several standard formats, whether the catalog is on a local disk or searchable over the World Wide Web. The catalog searching routine can be located on either end (or both ends!) of such a web connection, and the output from one catalog search can be used as the input to another search.
Development of a Multilayer MODIS IST-Albedo Product of Greenland
NASA Technical Reports Server (NTRS)
Hall, D. K.; Comiso, J. C.; Cullather, R. I.; Digirolamo, N. E.; Nowicki, S. M.; Medley, B. C.
2017-01-01
A new multilayer IST-albedo Moderate Resolution Imaging Spectroradiometer (MODIS) product of Greenland was developed to meet the needs of the ice sheet modeling community. The multiple layers of the product enable the relationship between IST and albedo to be evaluated easily. Surface temperature is a fundamental input for dynamical ice sheet models because it is a component of the ice sheet radiation budget and mass balance. Albedo influences absorption of incoming solar radiation. The daily product will combine the existing standard MODIS Collection-6 ice-surface temperature, derived melt maps, snow albedo and water vapor products. The new product is available in a polar stereographic projection in NetCDF format. The product will ultimately extend from March 2000 through the end of 2017.
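A modeler reading the multilayer NetCDF product might proceed roughly as follows; the file name and layer variable names ("ist", "albedo") are assumptions, since the product's actual naming is not given above, and xarray with a NetCDF backend is assumed to be installed.

    import xarray as xr

    ds = xr.open_dataset("modis_greenland_multilayer.nc")  # hypothetical file name
    ist = ds["ist"]          # ice-surface temperature layer (assumed variable name)
    albedo = ds["albedo"]    # snow albedo layer (assumed variable name)

    # Pair the two layers so their relationship can be examined on a per-day basis.
    daily_pair = xr.Dataset({"ist": ist, "albedo": albedo})
    print(daily_pair)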
Rakszegi, Marianna; Löschenberger, Franziska; Hiltbrunner, Jürg; Vida, Gyula; Mikó, Péter
2016-06-01
An assessment was previously made of the effects of organic and low-input field management systems on the physical, grain compositional and processing quality of wheat and on the performance of varieties developed using different breeding methods ("Comparison of quality parameters of wheat varieties with different breeding origin under organic and low-input conventional conditions" [1]). Here, accompanying data are provided on the performance and stability analysis of the genotypes using the coefficient of variation and the 'ranking' and 'which-won-where' plots of GGE biplot analysis for the most important quality traits. Broad-sense heritability was also evaluated and is given for the most important physical and quality properties of the seed in organic and low-input management systems, while mean values and standard deviation of the studied properties are presented separately for organic and low-input fields.
IOS: PDP 11/45 formatted input/output task stacker and processor. [In MACRO-11]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koschik, J.
1974-07-08
IOS allows the programmer to perform formatted input/output at the assembly language level to/from any peripheral device. It runs under DOS versions V8-O8 or V9-19, reading and writing DOS-compatible files. Additionally, IOS will run, with total transparency, in an environment with memory management enabled. The minimum hardware required is a 16K PDP 11/45, keyboard device, disk (DK, DF, or DC), and line frequency clock. The source language is MACRO-11 (3.3K decimal words).
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including prediction of the energy demands and consumables requirements of the shuttle electric power system, along with parametric and special-case studies of that system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.
Terrestrial Investigation Model, TIM, has several appendices to its user guide. This is the appendix that includes an example input file in its preserved format. Both parameters and comments defining them are included.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell; Schifer, Nicholas
2011-01-01
Test hardware was used to validate net heat prediction models. Problem: net heat input cannot be measured directly during operation, yet it is a key parameter needed to predict convertor efficiency. Efficiency = electrical power output (measured) divided by net heat input (calculated). Efficiency is used to compare convertor designs and trade technology advantages for mission planning.
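The efficiency relation stated above amounts to a one-line calculation; the numbers below are made-up illustrative values, not measurements from the test hardware.

    # Illustrative values only (hypothetical), showing measured output over calculated net heat input.
    electrical_power_out_w = 80.0    # measured electrical output, W
    net_heat_input_w = 300.0         # net heat input from the validated model, W
    efficiency = electrical_power_out_w / net_heat_input_w
    print(f"convertor efficiency = {efficiency:.1%}")  # -> 26.7%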
SSM/OOM - SSM WITH OOM MANIPULATION CODE
NASA Technical Reports Server (NTRS)
Goza, S. P.
1994-01-01
Creating, animating, and recording solid-shaded and wireframe three-dimensional geometric models can be of great assistance in the research and design phases of product development, in project planning, and in engineering analyses. SSM and OOM are application programs which together allow for interactive construction and manipulation of three-dimensional models of real-world objects as simple as boxes or as complex as Space Station Freedom. The output of SSM, in the form of binary files defining geometric three dimensional models, is used as input to OOM. Animation in OOM is done using 3D models from SSM as well as cameras and light sources. The animated results of OOM can be output to videotape recorders, film recorders, color printers and disk files. SSM and OOM are also available separately as MSC-21914 and MSC-22263, respectively. The Solid Surface Modeler (SSM) is an interactive graphics software application for solid-shaded and wireframe three-dimensional geometric modeling. The program has a versatile user interface that, in many cases, allows mouse input for intuitive operation or keyboard input when accuracy is critical. SSM can be used as a stand-alone model generation and display program and offers high-fidelity still image rendering. Models created in SSM can also be loaded into the Object Orientation Manipulator for animation or engineering simulation. The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray- traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM and SSM are written in C-language for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for each program. The standard distribution medium for this program package is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. These versions of OOM and SSM were released in 1993.
Asynchronous brain-computer interface for cognitive assessment in people with cerebral palsy
NASA Astrophysics Data System (ADS)
Alcaide-Aguirre, R. E.; Warschausky, S. A.; Brown, D.; Aref, A.; Huggins, J. E.
2017-12-01
Objective. Typically, clinical measures of cognition require motor or speech responses. Thus, a significant percentage of people with disabilities are not able to complete standardized assessments. This situation could be resolved by employing a more accessible test administration method, such as a brain-computer interface (BCI). A BCI can circumvent motor and speech requirements by translating brain activity to identify a subject's response. By eliminating the need for motor or speech input, one could use a BCI to assess an individual who previously did not have access to clinical tests. Approach. We developed an asynchronous, event-related potential BCI-facilitated administration procedure for the Peabody Picture Vocabulary Test (PPVT-IV). We then tested our system in typically developing individuals (N = 11), as well as people with cerebral palsy (N = 19), to compare results to the standardized PPVT-IV format and administration. Main results. Standard scores on the BCI-facilitated PPVT-IV and the standard PPVT-IV were highly correlated (r = 0.95, p < 0.001), with a mean difference of 2.0 ± 6.4 points, which is within the standard error of the PPVT-IV. Significance. Thus, our BCI-facilitated PPVT-IV provided comparable results to the standard PPVT-IV, suggesting that populations for whom standardized cognitive tests are not accessible could benefit from our BCI-facilitated approach.
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.
1982-01-01
A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.
Semi-Automated Annotation of Biobank Data Using Standard Medical Terminologies in a Graph Database.
Hofer, Philipp; Neururer, Sabrina; Goebel, Georg
2016-01-01
Data describing biobank resources frequently contains unstructured free-text information or insufficient coding standards. (Bio-)medical ontologies like the Orphanet Rare Diseases Ontology (ORDO) or the Human Disease Ontology (DOID) provide a high number of concepts, synonyms and entity relationship properties. Such standard terminologies increase the quality and granularity of input data by adding comprehensive semantic background knowledge from validated entity relationships. Moreover, cross-references between terminology concepts facilitate data integration across databases using different coding standards. In order to encourage the use of standard terminologies, our aim is to identify and link relevant concepts with free-text diagnosis inputs within a biobank registry. Relevant concepts are selected automatically by lexical matching and SPARQL queries against an RDF triplestore. To ensure correctness of annotations, proposed concepts have to be confirmed by medical data administration experts before they are entered into the registry database. Relevant (bio-)medical terminologies describing diseases and phenotypes were identified and stored in a graph database which was tied to a local biobank registry. Concept recommendations during data input trigger a structured description of medical data and facilitate data linkage between heterogeneous systems.
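As a rough illustration of the lexical matching and SPARQL querying described above, the following Python/rdflib sketch looks up ontology concepts whose labels match a free-text diagnosis. The ontology file name and the diagnosis string are assumptions, and the actual system's queries and expert-confirmation workflow are certainly more elaborate.

```python
# Sketch only: lexical matching of a free-text diagnosis against ontology labels
# held in an RDF triplestore (here an in-memory rdflib graph).
from rdflib import Graph

g = Graph()
g.parse("ordo.owl")  # hypothetical local copy of an ontology such as ORDO

diagnosis_text = "gaucher"  # free-text diagnosis entered in the biobank registry

query = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?concept ?label WHERE {
        ?concept rdfs:label ?label .
        FILTER regex(str(?label), "%s", "i")
    }
""" % diagnosis_text

for concept, label in g.query(query):
    # candidate annotations; in the described workflow these are proposed to
    # medical data administration experts for confirmation before storage
    print(concept, label)
```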
Concept Formation Skills in Long-Term Cochlear Implant Users
ERIC Educational Resources Information Center
Castellanos, Irina; Kronenberger, William G.; Beer, Jessica; Colson, Bethany G.; Henning, Shirley C.; Ditmars, Allison; Pisoni, David B.
2015-01-01
This study investigated if a period of auditory sensory deprivation followed by degraded auditory input and related language delays affects visual concept formation skills in long-term prelingually deaf cochlear implant (CI) users. We also examined if concept formation skills are mediated or moderated by other neurocognitive domains (i.e.,…
Hua, Yongzhao; Dong, Xiwang; Li, Qingdong; Ren, Zhang
2017-05-18
This paper investigates the time-varying formation robust tracking problems for high-order linear multiagent systems with a leader of unknown control input in the presence of heterogeneous parameter uncertainties and external disturbances. The followers need to accomplish an expected time-varying formation in the state space and track the state trajectory produced by the leader simultaneously. First, a time-varying formation robust tracking protocol with a totally distributed form is proposed utilizing the neighborhood state information. With the adaptive updating mechanism, neither any global knowledge about the communication topology nor the upper bounds of the parameter uncertainties, external disturbances and leader's unknown input are required in the proposed protocol. Then, in order to determine the control parameters, an algorithm with four steps is presented, where feasible conditions for the followers to accomplish the expected time-varying formation tracking are provided. Furthermore, based on the Lyapunov-like analysis theory, it is proved that the formation tracking error can converge to zero asymptotically. Finally, the effectiveness of the theoretical results is verified by simulation examples.
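For reference, the time-varying formation tracking objective is commonly stated as driving each follower's state to the leader's state offset by the prescribed formation function; a standard formulation from this literature (not quoted from the paper itself) is \( \lim_{t\to\infty} \bigl( x_i(t) - h_i(t) - x_0(t) \bigr) = 0 \) for each follower \( i \), where \( x_i \) is the follower state, \( h_i(t) \) the desired time-varying formation component, and \( x_0 \) the leader state. The formation tracking error analyzed with the Lyapunov-like argument is then \( e_i(t) = x_i(t) - h_i(t) - x_0(t) \).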
NASA Astrophysics Data System (ADS)
Sorota, Kristin
Metasedimentary rocks of the Merrimack terrane (MT) originated as a thick cover sequence on Ganderia consisting of sandstones, calcareous sandstones, pelitic rocks and turbidites. In order to investigate the age, provenance and stratigraphic order of these rocks and correlations with adjoining terranes, detrital zircon suites from 7 formations across the MT, along a NNE-trending transect from east-central Massachusetts to SE New Hampshire, were analyzed by U-Pb LA-ICP-MS methods on 90-140 grains per sample. The youngest detrital zircons in the western units, the Worcester, Oakdale and Paxton Formations, are ca. 438 Ma, while those in the Kittery, Eliot and Berwick Formations in the northeast are ca. 426 Ma. The Tower Hill Formation, previously interpreted to form the easternmost unit of the MT in MA, has a distinctly different zircon distribution, with its youngest zircon population in the Cambrian. All samples except for the Tower Hill Formation have detrital zircon age distributions with significant peaks in the mid- to late Ordovician, similar abundances of early Paleozoic and late Neoproterozoic zircons, significant input from ~1.0 to ~1.8 Ga sources and limited Archean grains. The similarities in zircon provenance suggest that all units across the terrane, except for the Tower Hill Formation, belong to a single sequence of rocks, with similar sources and with the units in the NE possibly being somewhat younger than those in east-central Massachusetts. The continuous zircon age distributions observed throughout the Mesoproterozoic and late Paleoproterozoic are consistent with an Amazonian source. All samples, except the Tower Hill Formation, show sedimentary input from both Ganderian and Laurentian sources and suggest that Laurentian input increases as the maximum depositional age decreases.
Geena 2, improved automated analysis of MALDI/TOF mass spectra.
Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo
2016-03-02
Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which leaves to the user the possibility to tune parameters for alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of the serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other on the identification of a predictor of breast cancer mortality following breast cancer surgery, whose results were validated by ELISA, a completely alternative method. Geena 2 is a public tool for the automated pre-processing of MS data originated by MALDI/TOF instruments, with a simple and intuitive web interface. It is now under active development for the inclusion of further filtering options and for the adoption of standard formats for MS spectra.
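The pre-processing steps listed above can be pictured with a short sketch. This is a generic illustration of normalization to an internal standard, background removal, and replicate averaging, not Geena 2's own heuristic algorithms; the internal-standard m/z and tolerance are hypothetical.

```python
import numpy as np

def preprocess(replicates, internal_standard_mz, tol=0.5):
    """replicates: list of (mz, intensity) arrays from the same sample,
    assumed to share a common m/z axis."""
    processed = []
    for mz, inten in replicates:
        inten = np.clip(inten - np.percentile(inten, 5), 0, None)   # crude background removal
        ref = inten[np.abs(mz - internal_standard_mz) < tol].max()  # internal-standard signal
        processed.append(inten / ref)                               # normalize against the standard
    return replicates[0][0], np.mean(processed, axis=0)             # average the replicate spectra
```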
Low-Speed Fingerprint Image Capture System User's Guide, June 1, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitus, B.R.; Goddard, J.S.; Jatko, W.B.
1993-06-01
The Low-Speed Fingerprint Image Capture System (LS-FICS) uses a Sun workstation controlling a Lenzar ElectroOptics Opacity 1000 imaging system to digitize fingerprint card images to support the Federal Bureau of Investigation's (FBI's) Automated Fingerprint Identification System (AFIS) program. The system also supports the operations performed by the Oak Ridge National Laboratory- (ORNL-) developed Image Transmission Network (ITN) prototype card scanning system. The input to the system is a single FBI fingerprint card of the agreed-upon standard format and a user-specified identification number. The output is a file formatted to be compatible with the National Institute of Standards and Technology (NIST) draft standard for fingerprint data exchange dated June 10, 1992. These NIST-compatible files contain the required print and text images. The LS-FICS is designed to provide the FBI with the capability of scanning fingerprint cards into a digital format. The FBI will replicate the system to generate a data base of test images. The Host Workstation contains the image data paths and the compression algorithm. A local area network interface, disk storage, and tape drive are used for the image storage and retrieval, and the Lenzar Opacity 1000 scanner is used to acquire the image. The scanner is capable of resolving 500 pixels/in. in both x and y directions. The print images are maintained in full 8-bit gray scale and compressed with an FBI-approved wavelet-based compression algorithm. The text fields are downsampled to 250 pixels/in. and 2-bit gray scale. The text images are then compressed using a lossless Huffman coding scheme. The text fields retrieved from the output files are easily interpreted when displayed on the screen. Detailed procedures are provided for system calibration and operation. Software tools are provided to verify proper system operation.
NASA Astrophysics Data System (ADS)
Togunwa, Olayinka S.; Abdullah, Wan H.
2017-08-01
The Neogene strata of the onshore West Baram Province of NW Borneo contain organic-rich rock formations, particularly within the Sarawak basin. This basin is a proven prolific oil and gas province and has thus been a subject of great interest for characterising the nature of the organic source input and depositional environment conditions as well as thermal maturation. This study is performed on outcrop samples of the Lambir, Miri and Tukau formations, which are of stratigraphic equivalence to the petroleum-bearing cycles of the offshore West Baram delta province in Sarawak. The investigated mudstone samples are organic rich, with a total organic carbon (TOC) content of more than 1.0 wt.%. The integration of elemental and molecular analyses indicates that there is no significant variation in the source input between these formations. The biomarker parameters derived from the acyclic isoprenoids, terpanes and steranes of the saturated hydrocarbon fraction reveal that these sediments contain a high contribution of land plants with minor marine organic matter input, deposited and preserved under relatively oxic to suboxic conditions. This is further supported by low total sulphur (TS), high TOC/TN ratios, and the concentrations and ratios of source- and redox-sensitive trace elements (V, Ni, Cr, Co and Mo), which suggest terrigenous source input deposited under oxic to suboxic conditions. Based on the analysed biomarker thermal maturity indicators, it may be deduced that the studied sediments are yet to enter the maturity stage for hydrocarbon generation, which is also supported by measured vitrinite reflectance values of 0.39-0.48% Ro.
A new conceptual model on the fate and controls of fresh and pyrolized plant litter decomposition
USDA-ARS?s Scientific Manuscript database
The leaching of dissolved organic matter (DOM) from fresh and pyrolyzed aboveground plant inputs to the soil is a major pathway by which decomposing aboveground plant material contributes to soil organic matter formation. Understanding how aboveground plant input chemical traits control the partiti...
MOVES2014 at the Project Level for Experienced Users, October 2014 Webinar Slides
This webinar covers the changes that enhance the MOtor Vehicle Emission Simulator at the project scale, changes to its graphical user interface at the project scale, how to convert a MOVES2010b project-level input file to MOVES2014 format, and new inputs.
Structural tailoring of advanced turboprops (STAT): User's manual
NASA Technical Reports Server (NTRS)
Brown, K. W.
1991-01-01
This user's manual describes the Structural Tailoring of Advanced Turboprops program. It contains instructions to prepare the input for optimization, blade geometry and analysis, geometry generation, and finite element program control. In addition, a sample input file is provided as well as a section describing special applications (i.e., non-standard input).
Kriener, Birgit; Helias, Moritz; Rotter, Stefan; Diesmann, Markus; Einevoll, Gaute T
2013-01-01
Pattern formation, i.e., the generation of an inhomogeneous spatial activity distribution in a dynamical system with translation invariant structure, is a well-studied phenomenon in neuronal network dynamics, specifically in neural field models. These are population models to describe the spatio-temporal dynamics of large groups of neurons in terms of macroscopic variables such as population firing rates. Though neural field models are often deduced from and equipped with biophysically meaningful properties, a direct mapping to simulations of individual spiking neuron populations is rarely considered. Neurons have a distinct identity defined by their action on their postsynaptic targets. In its simplest form they act either excitatorily or inhibitorily. When the distribution of neuron identities is assumed to be periodic, pattern formation can be observed, given the coupling strength is supracritical, i.e., larger than a critical weight. We find that this critical weight is strongly dependent on the characteristics of the neuronal input, i.e., depends on whether neurons are mean- or fluctuation driven, and different limits in linearizing the full non-linear system apply in order to assess stability. In particular, if neurons are mean-driven, the linearization has a very simple form and becomes independent of both the fixed point firing rate and the variance of the input current, while in the very strongly fluctuation-driven regime the fixed point rate, as well as the input mean and variance are important parameters in the determination of the critical weight. We demonstrate that interestingly even in "intermediate" regimes, when the system is technically fluctuation-driven, the simple linearization neglecting the variance of the input can yield the better prediction of the critical coupling strength. We moreover analyze the effects of structural randomness by rewiring individual synapses or redistributing weights, as well as coarse-graining on the formation of inhomogeneous activity patterns.
A standard protocol for describing individual-based and agent-based models
Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.
2006-01-01
Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
Value-Based Medicine and Pharmacoeconomics.
Brown, Gary C; Brown, Melissa M
2016-01-01
Pharmacoeconomics is assuming increasing importance in the pharmaceutical field since it is entering the public policy arena in many countries. Among the variants of pharmacoeconomic analysis are cost-minimization, cost-benefit, cost-effectiveness and cost-utility analyses. The latter is the most versatile and sophisticated in that it integrates the patient benefit (patient value) conferred by a drug in terms of improvement in length and/or quality of life. It also incorporates the costs expended for that benefit, as well as the dollars returned to patients and society from the use of a drug (financial value). Unfortunately, one cost-utility analysis in the literature is generally not comparable to another because of the lack of standardized formats and standardized input variables (costs, cost perspective, quality-of-life measurement instruments, quality-of-life respondents, discounting and so forth). Thus, millions of variants can be used. Value-based medicine® (VBM) cost-utility analysis standardizes these variants so that one VBM analysis is comparable to another. This system provides a highly rational methodology that allows providers and patients to quantify and compare the patient value and financial value gains associated with the use of pharmaceutical agents for example. © 2016 S. Karger AG, Basel.
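For orientation, the central quantity in a cost-utility analysis is usually an incremental cost-utility ratio; a textbook formulation (not specific to the VBM standardization discussed here) is \( \mathrm{ICUR} = (C_{\mathrm{drug}} - C_{\mathrm{comparator}}) / (\mathrm{QALY}_{\mathrm{drug}} - \mathrm{QALY}_{\mathrm{comparator}}) \), expressed in dollars per quality-adjusted life-year gained. VBM-style standardization fixes the cost perspective, quality-of-life instrument, respondents, and discounting used to compute each term, which is what makes one analysis comparable to another.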
International standards for brucellosis prevention and management.
Ragan, V; Vroegindewey, G; Babcock, S
2013-04-01
International standards are a crucial element in brucellosis prevention and management. They allow policy-makers, scientists, epidemiologists, laboratories and trade entities to have a common vocabulary for communication and understanding of the disease. These standards cover the entire spectrum of activities from surveillance, testing, prophylaxis, transport and trade to policy development, research and reporting. Developing, adhering to and monitoring standards increases both the effectiveness and efficiency of prevention and management programmes. Creating standards with the input of all stakeholders ensures that the standards do not adversely affect the requirements of any of the multiple parties involved. The World Organisation for Animal Health (OIE), in conjunction with its Member Countries, and through its standing and ad hoc committees plus expert input, has taken a key leadership role in developing and reviewing brucellosis standards. These standards are used to harmonise testing, prevention processes, vaccines and reporting, to support trade and to protect human and animal health.
PATHWAYS - ELECTRON TUNNELING PATHWAYS IN PROTEINS
NASA Technical Reports Server (NTRS)
Beratan, D. N.
1994-01-01
The key to understanding the mechanisms of many important biological processes such as photosynthesis and respiration is a better understanding of the electron transfer processes which take place between metal atoms (and other groups) fixed within large protein molecules. Research is currently focused on the rate of electron transfer and the factors that influence it, such as protein composition and the distance between metal atoms. Current models explain the swift transfer of electrons over considerable distances by postulating bridge-mediated tunneling, or physical tunneling pathways, made up of interacting bonds in the medium around and between donor and acceptor sites. The program PATHWAYS is designed to predict the route along which electrons travel in the transfer processes. The basic strategy of PATHWAYS is to begin by recording each possible path element on a connectivity list, including in each entry which two atoms are connected and what contribution the connection would make to the overall rate if it were included in a pathway. The list begins with the bonded molecular structure (including the backbone sequence and side chain connectivity), and then adds probable hydrogen bond links and through-space contacts. Once this list is completed, the program runs a tree search from the donor to the acceptor site to find the dominant pathways. The speed and efficiency of the computer search offers an improvement over manual techniques. PATHWAYS is written in FORTRAN 77 for execution on DEC VAX series computers running VMS. The program inputs data from four data sets and one structure file. The software was written to input BIOGRAF (old format) structure files based on x-ray crystal structures and outputs ASCII files listing the best pathways and BIOGRAF vector files containing the paths. Relatively minor changes could be made in the input format statements for compatibility with other graphics software. The executable and source code are included with the distribution. The main memory requirement for execution is 2.6 Mb. This program is available in DEC VAX BACKUP format on a 9-track 1600 BPI magnetic tape (standard distribution) or on a TK50 tape cartridge. PATHWAYS was developed in 1988. PATHWAYS is a copyrighted work with all copyright vested in NASA. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. BIOGRAF is a trademark of Molecular Simulations, Inc., Sunnyvale, CA.
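The search strategy described above can be pictured as a shortest-path problem: each connection contributes a multiplicative decay to the donor-acceptor coupling, so the dominant pathway maximizes the product of decays, i.e. minimizes the sum of -log(decay). The sketch below runs Dijkstra's algorithm on a toy graph with made-up decay factors; it illustrates the idea, not PATHWAYS' actual data structures or parameter values.

```python
import heapq, math

def best_pathway(edges, donor, acceptor):
    """edges: dict node -> list of (neighbor, per-step decay factor in (0, 1])."""
    dist, prev = {donor: 0.0}, {}
    heap = [(0.0, donor)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == acceptor:
            break
        if d > dist.get(node, math.inf):
            continue
        for nbr, decay in edges.get(node, []):
            nd = d - math.log(decay)                # smaller cost = stronger coupling
            if nd < dist.get(nbr, math.inf):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path = [acceptor]
    while path[-1] != donor:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[acceptor])    # pathway and its overall decay product

# toy connectivity list: covalent bonds, an H-bond, and a weak through-space jump
edges = {"donor": [("a", 0.6)], "a": [("b", 0.6), ("acceptor", 0.05)],
         "b": [("acceptor", 0.3)]}
print(best_pathway(edges, "donor", "acceptor"))
```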
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard polymerase chain reactions (qPCR) and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and with the use of a universal reference sample (commercial complementary DNA (cDNA)) permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
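A hedged reading of the input quantity method in code form: expression is efficiency-corrected from the quantification cycle (Cq), scaled by the number of input cells, and referenced to a universal commercial cDNA sample run in every batch. This is an illustrative reconstruction from the description above, not the authors' published algorithm; parameter values are placeholders.

```python
def relative_expression(cq_sample, cells_sample, cq_reference, efficiency=2.0):
    """efficiency = 2.0 corresponds to perfect doubling per PCR cycle."""
    q_sample = efficiency ** (-cq_sample)        # quantity proportional to E^-Cq
    q_reference = efficiency ** (-cq_reference)  # universal reference sample (fixed input)
    return (q_sample / cells_sample) / q_reference

# e.g. relative_expression(cq_sample=24.8, cells_sample=5e4, cq_reference=22.1)
```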
Providing Data Access for Interdisciplinary Research
NASA Astrophysics Data System (ADS)
Hooper, R. P.; Couch, A.
2012-12-01
Developing an interdisciplinary understanding of human and environmental interactions with water requires access to a variety of data kinds collected by various organizations. The CUAHSI Hydrologic Information System (HIS) is a standards-based, services-oriented architecture designed for time-series data. Such data represent an important type of data in water studies. Through the efforts of HIS, a standard transmission language, WaterML2, has been adopted by the Open Geospatial Consortium and is under consideration by the World Meteorological Organization as an international standard. Web services have also been developed to retrieve data and metadata. HIS is completed with a metadata catalog, hosted by the San Diego Supercomputing Center, which indexes more than 20 million time series provided from over 90 different services. This catalog is supported through a hierarchically organized controlled vocabulary that is open for community input and mediation. Data publishers include federal agencies, universities, state agencies, and non-profit organizations such as watershed associations. Accessing data from such a broad spectrum of sources through a uniform service standard promises to truly transform the way in which hydrologic research is done. CUAHSI HIS is a large-scale prototype at this time, but a proposal is under consideration by the National Science Foundation to operationalize HIS through a data facility, tentatively called the CUAHSI Water Data Center. Establishing HIS is an important step to enable research into human-environment interactions with water, but it is only one step. Other data structures will need to be made accessible and interoperable to support this research. Some data, such as two-dimensional GIS coverages, already have widely used standards for transmission and sharing. The US Federal government has long operated a clearinghouse for federal geographic data that is now being augmented with other services such as ArcGIS OnLine. Other data, such as gridded data, have standard storage formats (e.g., netCDF), but their native format is not convenient for water research. Some progress has been made to "transpose" these data sets from gridded data to a grid of virtual gages with time series. Such a format is more convenient for research of a limited spatial extent through time. Advances in relational data base structure now make it possible to serve very large data sets, such as radar-based precipitation grids, through HIS. Expanding the use of a standards-based services-oriented architecture will enable interdisciplinary research to proceed far more rapidly by putting data onto scientists' computers with a fraction of the effort previously required.
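As an illustration of what consuming WaterML2 time series might look like, the sketch below pulls time-value pairs out of a WaterML2 document with the Python standard library. The element names reflect my reading of the WaterML 2.0 encoding (wml2:MeasurementTVP with wml2:time and wml2:value) and the file name is hypothetical; treat both as assumptions rather than a reference implementation.

```python
import xml.etree.ElementTree as ET

WML2 = "http://www.opengis.net/waterml/2.0"
NS = {"wml2": WML2}

tree = ET.parse("discharge_timeseries.xml")      # hypothetical WaterML2 response saved to disk
for tvp in tree.iter(f"{{{WML2}}}MeasurementTVP"):
    t = tvp.find("wml2:time", NS).text
    v = tvp.find("wml2:value", NS).text
    print(t, v)                                  # one observation per time-value pair
```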
NASA Astrophysics Data System (ADS)
Turner, M. A.
2015-12-01
Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of scientific reproducibility and transparency, and data publication and reuse.
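To make the gridding step concrete, here is a minimal inverse-distance-weighting stand-in for the geostatistical interpolation described above, with a crude per-cell spread as a proxy for the error estimate. Station coordinates and values are placeholders; the actual system presumably uses proper geostatistics (e.g., kriging) for its uncertainty estimates.

```python
import numpy as np

def idw_grid(station_xy, station_val, grid_xy, power=2.0):
    """Interpolate station values onto grid cells by inverse-distance weighting."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)  # cell-station distances
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum(axis=1, keepdims=True)
    est = w @ station_val                                              # weighted estimate per cell
    spread = np.sqrt((w * (station_val[None, :] - est[:, None]) ** 2).sum(axis=1))
    return est, spread                                                 # value and a rough uncertainty

stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])             # hypothetical station locations
temps = np.array([21.5, 19.0, 17.2])                                   # hypothetical observations
grid = np.array([[2.0, 2.0], [7.0, 5.0]])
print(idw_grid(stations, temps, grid))
```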
NASA Technical Reports Server (NTRS)
Mori, R. L.; Bergsman, A. E.; Holmes, M. J.; Yates, B. J.
2001-01-01
Changes in posture can affect the resting length of respiratory muscles, requiring alterations in the activity of these muscles if ventilation is to be unaffected. Recent studies have shown that the vestibular system contributes to altering respiratory muscle activity during movement and changes in posture. Furthermore, anatomical studies have demonstrated that many bulbospinal neurons in the medial medullary reticular formation (MRF) provide inputs to phrenic and abdominal motoneurons; because this region of the reticular formation receives substantial vestibular and other movement-related input, it seems likely that medial medullary reticulospinal neurons could adjust the activity of respiratory motoneurons during postural alterations. The objective of the present study was to determine whether functional lesions of the MRF affect inspiratory and expiratory muscle responses to activation of the vestibular system. Lidocaine or muscimol injections into the MRF produced a large increase in diaphragm and abdominal muscle responses to vestibular stimulation. These vestibulo-respiratory responses were eliminated following subsequent chemical blockade of descending pathways in the lateral medulla. However, inactivation of pathways coursing through the lateral medulla eliminated excitatory, but not inhibitory, components of vestibulo-respiratory responses. The simplest explanation for these data is that MRF neurons that receive input from the vestibular nuclei make inhibitory connections with diaphragm and abdominal motoneurons, whereas a pathway that courses laterally in the caudal medulla provides excitatory vestibular inputs to these motoneurons.
WORM - WINDOWED OBSERVATION OF RELATIVE MOTION
NASA Technical Reports Server (NTRS)
Bauer, F.
1994-01-01
The Windowed Observation of Relative Motion, WORM, program is primarily intended for the generation of simple X-Y plots from data created by other programs. It allows the user to label, zoom, and change the scale of various plots. Three dimensional contour and line plots are provided, although with more limited capabilities. The input data can be in binary or ASCII format, although all data must be in the same format. A great deal of control over the details of the plot is provided, such as gridding, size of tick marks, colors, log/semilog capability, time tagging, and multiple and phase plane plots. Many color and monochrome graphics terminals and hard copy printer/plotters are supported. The WORM executive commands, menu selections and macro files can be used to develop plots and tabular data, query the WORM Help library, retrieve data from input files, and invoke VAX DCL commands. WORM generated plots are displayed on local graphics terminals and can be copied using standard hard copy capabilities. Some of the graphics features of WORM include: zooming and dezooming various portions of the plot; plot documentation including curve labeling and function listing; multiple curves on the same plot; windowing of multiple plots and insets of the same plot; displaying a specific point on a curve; and spinning the curve left, right, up, and down. WORM is written in PASCAL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.7 with a virtual memory requirement of approximately 392K of 8 bit bytes. It uses the QPLOT device independent graphics library included with WORM. It was developed in 1988.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.; Schifer, Nicholas A.
2012-01-01
The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including testing validation hardware, known as the Thermal Standard, to provide a direct comparison to numerical and empirical models used to predict convertor net heat input. This validation hardware provided a comparison for scrutinizing and improving empirical correlations and numerical models of ASC-E2 net heat input. This hardware simulated the characteristics of an ASC-E2 convertor in both an operating and non-operating mode. This paper describes the Thermal Standard testing and the conclusions of the validation effort applied to the empirical correlation methods used by the Radioisotope Power System (RPS) team at NASA Glenn.
Spacelab Data Processing Facility
NASA Technical Reports Server (NTRS)
1983-01-01
The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS division demultiplexes, synchronizes, time tags, quality checks, accounts for the data, and formats the data onto tapes. The SOPS division further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high density, experiment data, input/output data, quality control and accounting, and experimental channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.
NEWTONP - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
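The core computation can be sketched directly from the description: for a k-out-of-n system the reliability is the upper tail of a binomial distribution, and Newton's method solves R(p) = V for p. The following is a minimal Python reconstruction, not NEWTONP's C source, and it omits the input checking and iteration reporting the program provides.

```python
from math import comb

def reliability(p, k, n):
    # probability that at least k of n components succeed
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def d_reliability(p, k, n):
    # closed-form derivative dR/dp of the binomial upper tail
    return n * comb(n - 1, k - 1) * p**(k - 1) * (1 - p)**(n - k)

def solve_p(V, k, n, p=0.5, tol=1e-12, max_iter=100):
    for _ in range(max_iter):
        step = (reliability(p, k, n) - V) / d_reliability(p, k, n)
        p -= step
        if abs(step) < tol:
            break
    return p

# e.g. solve_p(V=0.999, k=2, n=3) gives the component probability p needed
# for a 2-out-of-3 system to reach 0.999 reliability (about 0.982)
```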
Achieving mask order processing automation, interoperability and standardization based on P10
NASA Astrophysics Data System (ADS)
Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.
2007-02-01
Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; here we report on its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company-specific and error-prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on the standard for mandatory and optional parameters, and also on a common way to describe parameters when ordering. A system was created to improve the performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements. A format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.
Reaction Decoder Tool (RDT): extracting features from chemical reactions.
Rahman, Syed Asad; Torrance, Gilliean; Baldacci, Lorenzo; Martínez Cuesta, Sergio; Fenninger, Franz; Gopal, Nimish; Choudhary, Saket; May, John W; Holliday, Gemma L; Steinbeck, Christoph; Thornton, Janet M
2016-07-01
Extracting chemical features like Atom-Atom Mapping (AAM), Bond Changes (BCs) and Reaction Centres from biochemical reactions helps us understand the chemical composition of enzymatic reactions. Reaction Decoder is a robust command line tool, which performs this task with high accuracy. It supports standard chemical input/output exchange formats i.e. RXN/SMILES, computes AAM, highlights BCs and creates images of the mapped reaction. This aids in the analysis of metabolic pathways and the ability to perform comparative studies of chemical reactions based on these features. This software is implemented in Java, supported on Windows, Linux and Mac OSX, and freely available at https://github.com/asad/ReactionDecoder : asad@ebi.ac.uk or s9asad@gmail.com. © The Author 2016. Published by Oxford University Press.
Non-parametric PCM to ADM conversion. [Pulse Code to Adaptive Delta Modulation
NASA Technical Reports Server (NTRS)
Locicero, J. L.; Schilling, D. L.
1977-01-01
An all-digital technique to convert pulse code modulated (PCM) signals into adaptive delta modulation (ADM) format is presented. The converter developed is shown to be independent of the statistical parameters of the encoded signal and can be constructed with only standard digital hardware. The structure of the converter is simple enough to be fabricated on a large scale integrated circuit where the advantages of reliability and cost can be optimized. A concise evaluation of this PCM to ADM translation technique is presented and several converters are simulated on a digital computer. A family of performance curves is given which displays the signal-to-noise ratio for sinusoidal test signals subjected to the conversion process, as a function of input signal power for several ratios of ADM rate to Nyquist rate.
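To make the target format concrete, here is a generic adaptive delta modulation encoder operating on PCM sample values. The adaptation rule (the step grows when successive bits agree and shrinks when they alternate) is a common textbook scheme used only for illustration; it is not necessarily the non-parametric converter analyzed in the paper, whose defining property is independence from the signal statistics.

```python
def pcm_to_adm(samples, step0=1.0, k=1.5, step_min=0.25):
    """Encode a sequence of PCM sample values as 1-bit ADM decisions."""
    bits, estimate, step, prev_bit = [], 0.0, step0, 1
    for x in samples:
        bit = 1 if x >= estimate else -1                      # 1-bit decision on the tracking error
        step = step * k if bit == prev_bit else max(step / k, step_min)
        estimate += bit * step                                # staircase approximation follows the input
        bits.append(1 if bit > 0 else 0)
        prev_bit = bit
    return bits
```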
40 CFR 60.42c - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2010 CFR
2010-07-01
....2 lb/MMBtu) heat input. If coal is combusted with other fuels, the affected facility shall neither... excess of 520 ng/J (1.2 lb/MMBtu) heat input. If coal is fired with coal refuse, the affected facility.../MMBtu) heat input. If coal is combusted with other fuels, the affected facility is subject to the 50...
LGSOWG CCT format CCB document: The standard CCT family of tape formats
NASA Technical Reports Server (NTRS)
1979-01-01
The tape format standardization approach recommended by the committee on CCT standardization is described and defined. All rules and conventions required to employ the superstructure approach to the CCT family of tape formats are presented for users of remote sensing data and producers of user tapes, and the superstructure records are specified. The standard for future tape format design is presented as a guide to designing data records of a particular tape format. An example is provided showing how to incorporate the superstructure into an already established tape format.
Software handlers for process interfaces
NASA Technical Reports Server (NTRS)
Bercaw, R. W.
1976-01-01
The principles involved in the development of software handlers for custom interfacing problems are discussed. Handlers for the CAMAC standard are examined in detail. The types of transactions that must be supported have been established by standards groups, eliminating conflicting requirements arising out of different design philosophies and applications. Implementation of the standard handlers has been facilitated by standardization of hardware. The necessary local processors can be placed in the handler when it is written or at run time by means of input/output directives, or they can be built into a high-performance input/output processor. The full benefits of these process interfaces will only be realized when software requirements are incorporated uniformly into the hardware.
Java Library for Input and Output of Image Data and Metadata
NASA Technical Reports Server (NTRS)
Deen, Robert; Levoe, Steven
2003-01-01
A Java-language library supports input and output (I/O) of image data and metadata (label data) in the format of the Video Image Communication and Retrieval (VICAR) image-processing software and in several similar formats, including a subset of the Planetary Data System (PDS) image file format. The library does the following: It provides low-level, direct access layer, enabling an application subprogram to read and write specific image files, lines, or pixels, and manipulate metadata directly. Two coding/decoding subprograms ("codecs" for short) based on the Java Advanced Imaging (JAI) software provide access to VICAR and PDS images in a file-format-independent manner. The VICAR and PDS codecs enable any program that conforms to the specification of the JAI codec to use VICAR or PDS images automatically, without specific knowledge of the VICAR or PDS format. The library also includes Image I/O plugin subprograms for VICAR and PDS formats. Application programs that conform to the Image I/O specification of Java version 1.4 can utilize any image format for which such a plug-in subprogram exists, without specific knowledge of the format itself. Like the aforementioned codecs, the VICAR and PDS Image I/O plug-in subprograms support reading and writing of metadata.
Press releases: translating research into news.
Woloshin, Steven; Schwartz, Lisa M
2002-06-05
While medical journals strive to ensure accuracy and the acknowledgment of limitations in articles, press releases may not reflect these efforts. Telephone interviews conducted in January 2001 with press officers at 9 prominent medical journals and analysis of press releases (n = 127) about research articles for the 6 issues of each journal preceding the interviews. Seven of the 9 journals routinely issue releases; in each case, the editor with the press office selects articles based on perceived newsworthiness and releases are written by press officers trained in communications. Journals have general guidelines (eg, length) but no standards for acknowledging limitations or for data presentation. Editorial input varies from none to intense. Of the 127 releases analyzed, 29 (23%) noted study limitations and 83 (65%) reported main effects using numbers; 58 reported differences between study groups and of these, 26 (55%) provided the corresponding base rate, the format least prone to exaggeration. Industry funding was noted in only 22% of 23 studies receiving such funding. Press releases do not routinely highlight study limitations or the role of industry funding. Data are often presented using formats that may exaggerate the perceived importance of findings.
Data reduction software for LORAN-C flight test evaluation
NASA Technical Reports Server (NTRS)
Fischer, J. P.
1979-01-01
A set of programs designed to be run on an IBM 370/158 computer to read the recorded time differences from the tape produced by the LORAN data collection system, convert them to latitude/longitude, and produce various plotting input files is described. The programs were written so they may be tailored easily to meet the demands of a particular data reduction job. The tape reader program is written in 370 assembler language and the remaining programs are written in standard IBM FORTRAN-IV language. The tape reader program is dependent upon the recording format used by the data collection system and on the I/O macros used at the computing facility. The other programs are generally device-independent, although the plotting routines are dependent upon the plotting method used. The data reduction programs convert the recorded data to a more readily usable form: they convert the time difference (TD) numbers to latitude/longitude (lat/long), format a printed listing of the TDs, lat/long, reference times, and other information derived from the data, and produce data files which may be used for subsequent plotting.
Gene Graphics: a genomic neighborhood data visualization web application.
Harrison, Katherine J; Crécy-Lagard, Valérie de; Zallot, Rémi
2018-04-15
The examination of gene neighborhood is an integral part of comparative genomics but no tools to produce publication quality graphics of gene clusters are available. Gene Graphics is a straightforward web application for creating such visuals. Supported inputs include National Center for Biotechnology Information gene and protein identifiers with automatic fetching of neighboring information, GenBank files and data extracted from the SEED database. Gene representations can be customized for many parameters including gene and genome names, colors and sizes. Gene attributes can be copied and pasted for rapid and user-friendly customization of homologous genes between species. In addition to Portable Network Graphics and Scalable Vector Graphics, produced representations can be exported as Tagged Image File Format or Encapsulated PostScript, formats that are standard for publication. Hands-on tutorials with real life examples inspired from publications are available for training. Gene Graphics is freely available at https://katlabs.cc/genegraphics/ and source code is hosted at https://github.com/katlabs/genegraphics. katherinejh@ufl.edu or remizallot@ufl.edu. Supplementary data are available at Bioinformatics online.
Solving Common Mathematical Problems
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Mathematical Solutions Toolset is a collection of five software programs that rapidly solve some common mathematical problems. The programs consist of a set of Microsoft Excel worksheets. The programs provide for entry of input data and display of output data in a user-friendly, menu-driven format, and for automatic execution once the input data has been entered.
The 1980-90 shuttle star catalog for onboard and ground programs
NASA Technical Reports Server (NTRS)
Richardson, S.; Killen, R.
1978-01-01
The 1980-90 shuttle star catalog for onboard and ground programs is presented. The data used in this catalog are explained according to derivation, input, format for the catalog, and preparation. The tables include the computer program listing, input star position, and the computed star positions for the years 1980-90.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... productivity adjustment. This meeting is open to the public in accordance with the Federal Advisory Committee...). The review will include the inputs, input weights, price-measurement proxies, and productivity... the index's productivity adjustment. II. Meeting Format This meeting is open to the public. There will...
A Path to Formative Assessment through Naturalistic Inputs
ERIC Educational Resources Information Center
Cohen, Jonathan; Leroux, Audrey
2017-01-01
This paper reports on the development of a system in which naturalistic inputs are collected by a web-based e-reader and, in combination with a measurement of readers' comprehension of that text, are analyzed by a neural network to determine the nature of the relationship between the annotations and comprehension. Results showed that neural…
Category Induction via Distributional Analysis: Evidence from a Serial Reaction Time Task
ERIC Educational Resources Information Center
Hunt, Ruskin H.; Aslin, Richard N.
2010-01-01
Category formation lies at the heart of a number of higher-order behaviors, including language. We assessed the ability of human adults to learn, from distributional information alone, categories embedded in a sequence of input stimuli using a serial reaction time task. Artificial grammars generated corpora of input strings containing a…
FAST: Fitting and Assessment of Synthetic Templates
NASA Astrophysics Data System (ADS)
Kriek, Mariska; van Dokkum, Pieter G.; Labbé, Ivo; Franx, Marijn; Illingworth, Garth D.; Marchesini, Danilo; Quadri, Ryan F.; Aird, James; Coil, Alison L.; Georgakakis, Antonis
2018-03-01
FAST (Fitting and Assessment of Synthetic Templates) fits stellar population synthesis templates to broadband photometry and/or spectra. FAST is compatible with the photometric redshift code EAzY (ascl:1010.052) when fitting broadband photometry; it uses the photometric redshifts derived by EAzY, and the input files (for example, the photometric catalog and master filter file) are the same. FAST fits spectra in combination with broadband photometric data points or simultaneously fits two components, allowing for an AGN contribution in addition to the host galaxy light. Depending on the input parameters, FAST outputs the best-fit redshift, age, dust content, star formation timescale, metallicity, stellar mass, star formation rate (SFR), and their confidence intervals. Though some of FAST's functions overlap with those of HYPERZ (ascl:1108.010), it differs in that it fits fluxes instead of magnitudes, allows the user to completely define the grid of input stellar population parameters and easily input photometric redshifts and their confidence intervals, and calculates calibrated confidence intervals for all parameters. Note that FAST is not a photometric redshift code, though it can be used as one.
Converting from DDOR SASF to APF
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.
2008-01-01
A computer program called ddor_sasf2apf converts a delta-DOR (delta differential one-way range) request from an SASF (spacecraft activity sequence file) format to an APF (apgen plan file) format for use in the Mars Reconnaissance Orbiter (MRO) mission-planning-and-sequencing process. The APF is used as an input to APGEN/AUTOGEN in the MRO activity-planning and command-sequence-generating process to sequence the delta-DOR (DDOR) activity. The DDOR activity is a spacecraft tracking technique for determining spacecraft location. The input to ddor_sasf2apf is an input request SASF provided by an observation team that utilizes DDOR. ddor_sasf2apf parses this DDOR SASF input, rearranging parameters and reformatting the request to produce an APF file for use in AUTOGEN and/or APGEN. The benefit afforded by ddor_sasf2apf is to enable the use of the DDOR SASF file earlier in the planning stage of the command-sequence-generating process and to produce sequences, optimized for DDOR operations, that are more accurate and more robust than would otherwise be possible.
Handling Input and Output for COAMPS
NASA Technical Reports Server (NTRS)
Fitzpatrick, Patrick; Tran, Nam; Li, Yongzuo; Anantharaj, Valentine
2007-01-01
Two suites of software have been developed to handle the input and output of the Coupled Ocean Atmosphere Prediction System (COAMPS), which is a regional atmospheric model developed by the Navy for simulating and predicting weather. Typically, the initial and boundary conditions for COAMPS are provided by a flat-file representation of the Navy's global model. Additional algorithms are needed for running the COAMPS software using other global models. One of the present suites satisfies this need for running COAMPS using the Global Forecast System (GFS) model of the National Oceanic and Atmospheric Administration. The first step in running COAMPS, downloading GFS data from an Internet file-transfer-protocol (FTP) server computer of the National Centers for Environmental Prediction (NCEP), is performed by one of the programs (SSC-00273) in this suite. The GFS data, which are in gridded binary (GRIB) format, are then changed to a COAMPS-compatible format by another program in the suite (SSC-00278). Once a forecast is complete, still another program in the suite (SSC-00274) sends the output data to a different server computer. The second suite of software (SSC-00275) addresses the need to ingest up-to-date land-use-and-land-cover (LULC) data into COAMPS for use in specifying typical climatological values of such surface parameters as albedo, aerodynamic roughness, and ground wetness. This suite includes (1) a program to process LULC data derived from observations by the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Terra and Aqua satellites, (2) programs to derive new climatological parameters for the 17-land-use-category MODIS data, and (3) a modified version of a FORTRAN subroutine to be used by COAMPS. The MODIS data files are processed to reformat them into a compressed American Standard Code for Information Interchange (ASCII) format used by COAMPS for efficient processing.
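A minimal sketch of the kind of FTP retrieval step described above (the SSC-00273 role) is shown below; the NCEP host name, directory, and file names are hypothetical placeholders, not the actual paths used by the COAMPS scripts.

from ftplib import FTP

def fetch_gfs_grib(host, remote_dir, filenames, local_dir="."):
    """Download a list of GRIB files from an anonymous FTP server."""
    ftp = FTP(host)
    ftp.login()                      # anonymous login
    ftp.cwd(remote_dir)
    for name in filenames:
        with open(f"{local_dir}/{name}", "wb") as fh:
            ftp.retrbinary(f"RETR {name}", fh.write)
    ftp.quit()

# Hypothetical usage:
# fetch_gfs_grib("ftp.example.ncep.noaa.gov", "/pub/data/gfs",
#                ["gfs.t00z.pgrb2.0p50.f000"])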
PRELIMINARY DESIGN ANALYSIS OF AXIAL FLOW TURBINES
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1994-01-01
A computer program has been developed for the preliminary design analysis of axial-flow turbines. Rapid approximate generalized procedures requiring minimum input are used to provide turbine overall geometry and performance adequate for screening studies. The computations are based on mean-diameter flow properties and a stage-average velocity diagram. Gas properties are assumed constant throughout the turbine. For any given turbine, all stages, except the first, are specified to have the same shape velocity diagram. The first stage differs only in the value of inlet flow angle. The velocity diagram shape depends upon the stage work factor value and the specified type of velocity diagram. Velocity diagrams can be specified as symmetrical, zero exit swirl, or impulse; or by inputting stage swirl split. Exit turning vanes can be included in the design. The 1991 update includes a generalized velocity diagram, a more flexible meanline path, a reheat model, a radial component of velocity, and a computation of free-vortex hub and tip velocity diagrams. Also, a loss-coefficient calibration was performed to provide recommended values for airbreathing engine turbines. Input design requirements include power or pressure ratio, mass flow rate, inlet temperature and pressure, and rotative speed. The design variables include inlet and exit diameters, stator angle or exit radius ratio, and number of stages. Gas properties are input as gas constant, specific heat ratio, and viscosity. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, blading angles, and last stage absolute and relative Mach numbers. This program is written in FORTRAN 77 and can be ported to any computer with a standard FORTRAN compiler which supports NAMELIST. It was originally developed on an IBM 7000 series computer running VM and has been implemented on IBM PC computers and compatibles running MS-DOS under Lahey FORTRAN, and DEC VAX series computers running VMS. Format statements in the code may need to be rewritten depending on your FORTRAN compiler. The source code and sample data are available on a 5.25 inch 360K MS-DOS format diskette. This program was developed in 1972 and was last updated in 1991. IBM and IBM PC are registered trademarks of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC VAX, and VMS are trademarks of Digital Equipment Corporation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moon, Joonoh, E-mail: mjo99@kims.re.kr; Ha, Heon-Young; Lee, Tae-Ho
2013-08-15
The pitting corrosion and interphase corrosion behaviors in the high heat input welded heat-affected zone (HAZ) of a metastable high-nitrogen Fe–18Cr–10Mn–N austenitic stainless steel were explored through electrochemical tests. The HAZs were simulated using a Gleeble simulator with a high heat input welding condition of 300 kJ/cm, and the peak temperature of the HAZs was changed from 1200 °C to 1350 °C, aiming to examine the effect of δ-ferrite formation on corrosion behavior. The electrochemical test results show that both pitting corrosion resistance and interphase corrosion resistance were seriously deteriorated by δ-ferrite formation in the HAZ, and their aspects differed with increasing δ-ferrite fraction. The pitting corrosion resistance was decreased by the formation of a Cr-depleted zone along the δ-ferrite/austenite (γ) interphase resulting from δ-ferrite formation; however, it did not depend on δ-ferrite fraction. The interphase corrosion resistance depends on the total amount of Cr-depleted zone as well as ferrite area and thus continuously decreased with increasing δ-ferrite fraction. The different effects of δ-ferrite fraction on pitting corrosion and interphase corrosion were carefully discussed in terms of alloying-element partitioning in the HAZ based on thermodynamic consideration. - Highlights: • Corrosion behavior in the weld HAZ of a high-nitrogen austenitic alloy was studied. • Cr2N particles were not precipitated in the high heat input welded HAZ of the tested alloy. • Pitting corrosion and interphase corrosion show different behavior. • Pitting corrosion resistance was affected by whether or not δ-ferrite forms. • Interphase corrosion resistance was affected by the total amount of δ-ferrite.
Shock-induced solitary waves in granular crystals.
Hasan, M Arif; Nemat-Nasser, Sia
2018-02-01
Solitary waves (SWs) are generated in monoatomic (homogeneous) lightly contacting spherical granules by an applied input force of any time-variation and intensity. We consider finite duration shock loads on one-dimensional arrays of granules and focus on the transition regime that leads to the formation of SWs. Based on geometrical and material properties of the granules and the properties of the input shock, we provide explicit analytic expressions to calculate the peak value of the compressive contact force at each contact point in the transition regime that precedes the formation of a primary solitary wave. We also provide explicit expressions to estimate the number of granules involved in the transition regime and show its dependence on the characteristics of the input shock and material/geometrical properties of the interacting granules. Finally, we assess the accuracy of our theoretical results by comparing them with those obtained through numerical integration of the equations of motion.
NASA Technical Reports Server (NTRS)
Craidon, C. B.
1983-01-01
A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept, so that highly accurate wave drag analysis can now be performed because complex geometries can be represented accurately and do not have to be modified to meet the requirements of a restricted input format.
Visualizing astronomy data using VRML
NASA Astrophysics Data System (ADS)
Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.
2004-09-01
Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programmatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
NASA Astrophysics Data System (ADS)
Moon, Joonoh; Lee, Chang-Hoon; Lee, Tae-Ho; Kim, Hyoung Chan
2015-01-01
The phase transformation and mechanical properties in the weld heat-affected zone (HAZ) of a reduced activation ferritic/martensitic steel were explored. The samples for HAZs were prepared using a Gleeble simulator at different heat inputs. The base steel consisted of tempered martensite and carbides through quenching and tempering treatment, whereas the HAZs consisted of martensite, δ-ferrite, and a small volume of autotempered martensite. The prior austenite grain size, lath width of martensite, and δ-ferrite fraction in the HAZs increased with increasing heat input. The mechanical properties were evaluated using Vickers hardness and Charpy V-notch impact tests. The Vickers hardness in the HAZs was higher than that in the base steel but did not change noticeably with increasing heat input. The HAZs showed poor impact properties, due to the formation of martensite and δ-ferrite, as compared to the base steel. In addition, the impact property of the HAZs deteriorated further with increasing heat input. Post weld heat treatment contributed to improving the impact property of the HAZs through the formation of tempered martensite, but the impact property of the HAZs remained lower than that of the base steel.
Wisconsin's Model Academic Standards for Agricultural Education. Bulletin No. 9003.
ERIC Educational Resources Information Center
Fortier, John D.; Albrecht, Bryan D.; Grady, Susan M.; Gagnon, Dean P.; Wendt, Sharon, W.
These model academic standards for agricultural education in Wisconsin represent the work of a task force of educators, parents, and business people with input from the public. The introductory section of this bulletin defines the academic standards and discusses developing the standards, using the standards, relating the standards to all…
DOT National Transportation Integrated Search
1997-07-14
These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...
Introducing ADES: A New IAU Astrometry Data Exchange Standard
NASA Astrophysics Data System (ADS)
Chesley, Steven R.; Hockney, George M.; Holman, Matthew J.
2017-10-01
For several decades, small body astrometry has been exchanged, distributed and archived in the form of 80-column ASCII records. As a replacement for this obsolescent format, we have worked with a number of members of the community to develop the Astrometric Data Exchange Standard (ADES), which was formally adopted by IAU Commission 20 in August 2015 at the XXIX General Assembly in Honolulu, Hawaii. The purpose of ADES is to ensure that useful and available observational information is submitted, archived, and disseminated as needed. Availability of more complete information will allow orbit computers to process the data more correctly, leading to improved accuracy and reliability of orbital fits. In this way, it will be possible to fully exploit the improving accuracy and increasing number of both optical and radar observations. ADES overcomes several limitations of the previous format by allowing characterization of astrometric and photometric errors, adequate precision in time and angle fields, and flexibility and extensibility. To accommodate a diverse base of users, from automated surveys to hands-on follow-up observers, the ADES protocol allows for two file formats, eXtensible Markup Language (XML) and Pipe-Separated Values (PSV). Each format carries the same information and simple tools allow users to losslessly transform back and forth between XML and PSV. We have further developed and refined ADES since it was first announced in July 2015 [1]. The proposal at that time [2] has undergone several modest revisions to aid validation and avoid overloaded fields. We now have validation schema and file transformation utilities. Suitable example files, test suites, and input/output libraries in a number of modern programming languages are now available. Acknowledgements: Useful feedback during the development of ADES has been received from numerous colleagues in the community of observers and orbit specialists working on asteroids, comets, and planetary satellites. References: [1] Chesley, S.R. (2015) M.P.E.C. 2015-O06. [2] http://minorplanetcenter.net/iau/info/IAU2015_ADES.pdf
40 CFR 60.43Da - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and that burns 75 percent or more (by heat input) coal refuse on a 12-month rolling average basis...) of this section, any gases that contain SO2 in excess of: (1) 520 ng/J (1.20 lb/MMBtu) heat input and.../MMBtu) heat input. (b) On and after the date on which the initial performance test is completed or...
40 CFR 60.43b - Standard for particulate matter (PM).
Code of Federal Regulations, 2010 CFR
2010-07-01
...) heat input, (i) If the affected facility combusts only coal, or (ii) If the affected facility combusts.... (2) 43 ng/J (0.10 lb/MMBtu) heat input if the affected facility combusts coal and other fuels and has... greater than 10 percent (0.10) for fuels other than coal. (3) 86 ng/J (0.20 lb/MMBtu) heat input if the...
Optimization of heterogeneous Bin packing using adaptive genetic algorithm
NASA Astrophysics Data System (ADS)
Sridhar, R.; Chandrasekaran, M.; Sriramya, C.; Page, Tom
2017-03-01
This research concentrates on bin packing using a hybrid genetic approach. The optimal and feasible packing of goods for transportation and distribution to various locations while satisfying practical constraints is the key point of this work. Because the number of boxes to be packed cannot be predicted in advance, the boxes are not always of the same category, and many practical constraints are involved, optimal packing is of great importance to industry. This work presents a heuristic Genetic Algorithm (HGA) for solving the three-dimensional (3D), single-container, arbitrary-sized rectangular prismatic bin packing optimization problem, considering most of the practical constraints faced by logistics industries. This goal was achieved by minimizing the empty volume inside the container using a genetic approach. A feasible packing pattern was achieved by satisfying practical constraints such as box orientation, stack priority, container stability, weight, overlapping, and shipment placement. The 3D bin packing problem consists of packing 'n' boxes into a container of standard dimensions so as to maximize volume utilization and, in turn, profit. Furthermore, the boxes to be packed may be of arbitrary sizes. The user input data are the number of boxes and their sizes, shapes, weights, and constraints, if any, along with the standard container dimensions. These inputs are stored in a database and encoded into string (chromosome) format acceptable to the GA; GA operators then act on these encoded strings to find the best solution.
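The sketch below illustrates only the encoding idea described above: boxes are encoded as a packing-order permutation ("chromosome"), decoded greedily against the container capacity, and evolved with standard GA operators. It is a deliberately simplified one-dimensional illustration with hypothetical box volumes, not the paper's 3D HGA with orientation, stacking, and stability constraints.

import random

BOX_VOLUMES = [12, 7, 5, 9, 3, 8, 4]   # hypothetical box volumes
CONTAINER_VOLUME = 30

def fitness(order):
    """Volume packed into the container when boxes are taken in this order."""
    used = 0
    for i in order:
        if used + BOX_VOLUMES[i] <= CONTAINER_VOLUME:
            used += BOX_VOLUMES[i]
    return used

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest from b."""
    cut1, cut2 = sorted(random.sample(range(len(a)), 2))
    child = a[cut1:cut2]
    child += [g for g in b if g not in child]
    return child

def evolve(generations=200, pop_size=30):
    pop = [random.sample(range(len(BOX_VOLUMES)), len(BOX_VOLUMES))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = [crossover(random.choice(survivors), random.choice(survivors))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))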
Reconstruction of nonlinear wave propagation
Fleischer, Jason W; Barsi, Christopher; Wan, Wenjie
2013-04-23
Disclosed are systems and methods for characterizing a nonlinear propagation environment by numerically propagating a measured output waveform resulting from a known input waveform. The numerical propagation reconstructs the input waveform, and in the process, the nonlinear environment is characterized. In certain embodiments, knowledge of the characterized nonlinear environment facilitates determination of an unknown input based on a measured output. Similarly, knowledge of the characterized nonlinear environment also facilitates formation of a desired output based on a configurable input. In both situations, the input thus characterized and the output thus obtained include features that would normally be lost in linear propagations. Such features can include evanescent waves and peripheral waves, such that an image thus obtained is an inherently wide-angle, far-field form of microscopy.
Staging memory for massively parallel processor
NASA Technical Reports Server (NTRS)
Batcher, Kenneth E. (Inventor)
1988-01-01
The invention herein relates to a computer organization capable of rapidly processing extremely large volumes of data. A staging memory is provided having a main stager portion consisting of a large number of memory banks which are accessed in parallel to receive, store, and transfer data words simultaneous with each other. Substager portions interconnect with the main stager portion to match input and output data formats with the data format of the main stager portion. An address generator is coded for accessing the data banks for receiving or transferring the appropriate words. Input and output permutation networks arrange the lineal order of data into and out of the memory banks.
Haines, Seth S.; Varela, Brian; Hawkins, Sarah J.; Gianoutsos, Nicholas J.; Tennyson, Marilyn E.
2017-01-01
The U.S. Geological Survey (USGS) has conducted an assessment of water and proppant requirements, and water production volumes, associated with possible future production of undiscovered petroleum resources in the Bakken and Three Forks Formations, Williston Basin, USA. This water and proppant assessment builds directly from the 2013 USGS petroleum assessment for the Bakken and Three Forks Formations, and it has been conducted using a new water and proppant assessment methodology that builds from the established USGS methodology for assessment of undiscovered petroleum in continuous reservoirs. We determined the assessment input values through extensive analysis of available data on per-well water and proppant use for hydraulic fracturing, including trends over time and space. We determined other assessment inputs through analysis of regional water-production trends.
Applications of the generalized information processing system (GIPSY)
Moody, D.W.; Kays, Olaf
1972-01-01
The Generalized Information Processing System (GIPSY) stores and retrieves variable-field, variable-length records consisting of numeric data, textual data, or codes. A particularly noteworthy feature of GIPSY is its ability to search records for words, word stems, prefixes, and suffixes as well as for numeric values. Moreover, retrieved records may be printed on pre-defined formats or formatted as fixed-field, fixed-length records for direct input to other programs, which facilitates the exchange of data with other systems. At present there are some 22 applications of GIPSY falling in the general areas of bibliography, natural resources information, and management science. This report presents a description of each application including a sample input form, dictionary, and a typical formatted record. It is hoped that these examples will stimulate others to experiment with innovative uses of computer technology.
Computational Tools for Parsimony Phylogenetic Analysis of Omics Data
Salazar, Jose; Amri, Hakima; Noursi, David
2015-01-01
High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential modeling is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. The OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through standard programs of MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables through assigning each data point into a binary value (“0” for normal states and “1” for abnormal states) then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations of each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing (NGS) data. We make OmicsTract and SynpExtractor publicly and freely available for non-commercial use in order to strengthen and build capacity for the phylogenetic paradigm of omics analysis. PMID:26230532
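The sketch below illustrates only the conversion step described above: quantitative values are binarized ("0" within a normal range, "1" aberrant) and written as a character matrix. The threshold rule and output layout are simplified placeholders, not the exact PHYLIP MIX or TNT input syntax produced by OmicsTract.

import csv

def binarize(value, low, high):
    return "0" if low <= value <= high else "1"

def convert(csv_path, out_path, low=-1.0, high=1.0):
    """Read a comma-delimited table (rows = samples, columns = features)."""
    with open(csv_path, newline="") as fh:
        rows = list(csv.reader(fh))
    header, data = rows[0], rows[1:]
    with open(out_path, "w") as out:
        # A PHYLIP-like first line: number of taxa and number of characters.
        out.write(f"{len(data)} {len(header) - 1}\n")
        for row in data:
            name, values = row[0], map(float, row[1:])
            chars = "".join(binarize(v, low, high) for v in values)
            out.write(f"{name:<10}{chars}\n")

# Hypothetical usage: convert("expression.csv", "mix_input.txt")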
TRANDESNF: A computer program for transonic airfoil design and analysis in nonuniform flow
NASA Technical Reports Server (NTRS)
Chang, J. F.; Lan, C. Edward
1987-01-01
The use of a transonic airfoil code for analysis, inverse design, and direct optimization of an airfoil immersed in propfan slipstream is described. A summary of the theoretical method, program capabilities, input format, output variables, and program execution are described. Input data of sample test cases and the corresponding output are given.
Some Thoughts on the Matter of Self-Determination and Will.
ERIC Educational Resources Information Center
Deci, Edward L.
"Will" is defined in this paper as the capacity to decide how to behave based on a processing of relevant information. A sequence of motivated behavior begins with informational inputs or stimuli. These come from three sources: the environment, one's physiology, and one's memory. These inputs lead to the formation of motives or awareness of a…
A Digital Control Algorithm for Magnetic Suspension Systems
NASA Technical Reports Server (NTRS)
Britton, Thomas C.
1996-01-01
An ongoing program exists to investigate and develop magnetic suspension technologies and modelling techniques at NASA Langley Research Center. Presently, there is a laboratory-scale large air-gap suspension system capable of five degree-of-freedom (DOF) control that is operational and a six DOF system that is under development. Those systems levitate a cylindrical element containing a permanent magnet core above a planar array of electromagnets, which are used for levitation and control purposes. In order to evaluate various control approaches with those systems, the Generic Real-Time State-Space Controller (GRTSSC) software package was developed. That control software package allows the user to implement multiple control methods and allows for varied input/output commands. The development of the control algorithm is presented. The desired functionality of the software is discussed, including the ability to inject noise on sensor inputs and/or actuator outputs. Various limitations, common issues, and trade-offs are discussed including data format precision; the drawbacks of using either Direct Memory Access (DMA), interrupts, or program control techniques for data acquisition; and platform dependent concerns related to the portability of the software, such as memory addressing formats. Efforts to minimize overall controller loop-rate and a comparison of achievable controller sample rates are discussed. The implementation of a modular code structure is presented. The format for the controller input data file and the noise information file is presented. Controller input vector information is available for post-processing by mathematical analysis software such as MATLAB.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
... inputs to semiautomatic self-contained dead reckoning navigation systems which were not continuously... Doppler sensor equipment that provides inputs to dead reckoning navigation systems obsolete. On August 18...
Standard Transistor Array (STAR). Volume 1: Placement technique
NASA Technical Reports Server (NTRS)
Cox, G. W.; Caroll, B. D.
1979-01-01
A large scale integration (LSI) technology, the standard transistor array uses a prefabricated understructure of transistors and a comprehensive library of digital logic cells to allow efficient fabrication of semicustom digital LSI circuits. The cell placement technique for this technology involves formation of a one-dimensional cell layout and "folding" of the one-dimensional placement onto the chip. It was found that, by use of various folding methods, high quality chip layouts can be achieved. Methods developed to measure the "goodness" of the generated placements include efficient means for estimating channel usage requirements and for via counting. The placement and rating techniques were incorporated into a placement program (CAPSTAR). By means of repetitive use of the folding methods and simple placement improvement strategies, this program provides near optimum placements in a reasonable amount of time. The program was tested on several typical LSI circuits to provide performance comparisons both with respect to input parameters and with respect to the performance of other placement techniques. The results of this testing indicate that near optimum placements can be achieved by use of the procedures without incurring severe time penalties.
MOVES-Matrix and distributed computing for microscale line source dispersion analysis.
Liu, Haobing; Xu, Xiaodan; Rodgers, Michael O; Xu, Yanzhi Ann; Guensler, Randall L
2017-07-01
MOVES and AERMOD are the U.S. Environmental Protection Agency's recommended models for use in project-level transportation conformity and hot-spot analysis. However, the structure and algorithms involved in running MOVES make analyses cumbersome and time-consuming. Likewise, the modeling setup process in AERMOD, including extensive data requirements and required input formats, leads to a high potential for analysis error in dispersion modeling. This study presents a distributed computing method for line source dispersion modeling that integrates MOVES-Matrix, a high-performance emission modeling tool, with the microscale dispersion models CALINE4 and AERMOD. MOVES-Matrix was prepared by iteratively running MOVES across all possible combinations of vehicle source type, fuel, operating conditions, and environmental parameters to create a huge multi-dimensional emission rate lookup matrix. AERMOD and CALINE4 are connected with MOVES-Matrix in a distributed computing cluster using a series of Python scripts. This streamlined system built on MOVES-Matrix generates exactly the same emission rates and concentration results as using MOVES with AERMOD and CALINE4, which are regulatory models approved by the U.S. EPA for conformity analysis, but the approach is more than 200 times faster than using the MOVES graphical user interface. Because AERMOD requires detailed meteorological input, which is difficult to obtain, this study also recommends using CALINE4 as a screening tool for identifying potential areas that may exceed air quality standards before using AERMOD (and for identifying areas that are exceedingly unlikely to exceed air quality standards). The CALINE4 worst-case method yields consistently higher concentration results than AERMOD for all comparisons in this paper, as expected given the nature of the meteorological data employed. The paper thus demonstrates a distributed computing method for line source dispersion modeling that ties MOVES-Matrix to CALINE4 and AERMOD and highlights the potentially significant benefit of using CALINE4 as a screening tool before resorting to AERMOD, which requires much more meteorological input than CALINE4.
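The sketch below shows the lookup-matrix idea in miniature: emission rates are precomputed once for every combination of input parameters and then simply looked up, instead of rerunning the emission model for each query. The parameter names, bins, and rate function are hypothetical placeholders, not MOVES-Matrix's actual dimensions or values.

from itertools import product

SPEEDS = [10, 20, 30, 40, 50, 60]        # mph bins
TEMPS = [30, 50, 70, 90]                  # deg F bins
SOURCE_TYPES = ["passenger_car", "transit_bus"]

def run_emission_model(source, speed, temp):
    """Stand-in for an expensive model run; returns a fake g/mile rate."""
    return 0.5 + 0.01 * speed + 0.002 * temp + (2.0 if source == "transit_bus" else 0.0)

# Build the matrix once (in practice this is the expensive, offline step).
matrix = {(s, v, t): run_emission_model(s, v, t)
          for s, v, t in product(SOURCE_TYPES, SPEEDS, TEMPS)}

def lookup(source, speed, temp):
    """Snap a query to the nearest precomputed bins and return the stored rate."""
    v = min(SPEEDS, key=lambda x: abs(x - speed))
    t = min(TEMPS, key=lambda x: abs(x - temp))
    return matrix[(source, v, t)]

print(lookup("passenger_car", 37, 64))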
Neo: an object model for handling electrophysiology data in multiple formats
Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L.; Rodgers, Chris C.; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P.
2014-01-01
Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named “Neo,” suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology. PMID:24600386
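A minimal usage sketch of the Neo object model follows, assuming a recent Neo release and the quantities package are installed; it only builds the in-memory representation described above. Reading or writing actual files would go through one of Neo's IO modules, whose exact class depends on the recording format.

import numpy as np
import quantities as pq
import neo

# A Block holds Segments; a Segment groups data recorded at the same time.
block = neo.Block(name="example session")
segment = neo.Segment(name="trial 1")
block.segments.append(segment)

# A 2-second, 1 kHz membrane-potential trace stored as an AnalogSignal.
trace = np.random.randn(2000, 1)
signal = neo.AnalogSignal(trace * pq.mV, sampling_rate=1 * pq.kHz)
segment.analogsignals.append(signal)

print(block.segments[0].analogsignals[0].sampling_rate)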
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
... boilers are small (less than 10 MMBtu/hr heat input) and are generally owned and operated by contractors... (> 5MMBtu/h) or five-year ( New boilers with heat input capacity greater than 10 million Btu per hour that... with heat input capacity greater than 10 million Btu per hour that are biomass-fired or oil-fired must...
Alternative Formats to Achieve More Efficient Energy Codes for Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Rosenberg, Michael I.; Halverson, Mark A.
2013-01-26
This paper identifies and examines several formats or structures that could be used to create the next generation of more efficient energy codes and standards for commercial buildings. Pacific Northwest National Laboratory (PNNL) is funded by the U.S. Department of Energy's Building Energy Codes Program (BECP) to provide technical support to the development of ANSI/ASHRAE/IES Standard 90.1. While the majority of PNNL's ASHRAE Standard 90.1 support focuses on developing and evaluating new requirements, a portion of its work involves consideration of the format of energy standards. In its current working plan, the ASHRAE 90.1 committee has approved an energy goal of 50% improvement in Standard 90.1-2013 relative to Standard 90.1-2004, and will likely be considering higher improvement targets for future versions of the standard. To cost-effectively achieve the 50% goal in a manner that can gain stakeholder consensus, formats other than prescriptive must be considered. Alternative formats that reduce the reliance on prescriptive requirements may make it easier to achieve these aggressive efficiency levels in new codes and standards. The focus on energy code and standard formats is meant to explore approaches to presenting the criteria that will foster compliance, enhance verification, and stimulate innovation while saving energy in buildings. New formats may also make it easier for building designers and owners to design and build to the levels of efficiency called for in the new codes and standards. This paper examines a number of potential formats and structures, including prescriptive, performance-based (with sub-formats of performance equivalency and performance targets), capacity constraint-based, and outcome-based. The paper also discusses the pros and cons of each format from the viewpoint of code users and of code enforcers.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-21
...] Draft Guidance for Industry on Providing Submissions in Electronic Format--Standardized Study Data... Submissions in Electronic Format--Standardized Study Data.'' This draft guidance establishes FDA's recommendation that sponsors and applicants submit nonclinical and clinical study data in a standardized...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
... a track-and-trace system and for obtaining input from supply chain partners on potential attributes...-trace system and (2) input from supply chain partners on potential attributes and standards for the...
AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-10-01
AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100 external node--200 branch capability; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)
Elements of a next generation time-series ASCII data file format for Earth Sciences
NASA Astrophysics Data System (ADS)
Webster, C. J.
2015-12-01
Data in ASCII comma separated value (CSV) format are recognized as the most simple, straightforward and readable type of data present in the geosciences. Many scientific workflows developed over the years rely on data using this simple format. However, there is a need for a lightweight ASCII header format standard that is easy to create and easy to work with. Current OGC grade XML standards are complex and difficult to implement for researchers with few resources. Ideally, such a format should provide the data in CSV for easy consumption by generic applications such as spreadsheets. The format should use an existing time standard. The header should be easily human readable as well as machine parsable. The metadata format should be extendable to allow vocabularies to be adopted as they are created by external standards bodies. The creation of such a format will increase the productivity of software engineers and scientists because fewer translators and checkers would be required.
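The sketch below shows what such a lightweight header-plus-CSV layout might look like, with a few lines of parsing code. The '#'-prefixed keys and their names are invented for illustration; the abstract argues for such a standard but does not define one.

import csv
import io

EXAMPLE = """\
# title: buoy 42 sea surface temperature
# time_standard: ISO 8601, UTC
# column_units: , degC
time,sst
2015-06-01T00:00:00Z,18.2
2015-06-01T01:00:00Z,18.4
"""

def read_timeseries(text):
    """Split a '#'-prefixed key: value header from the CSV body."""
    header, body_lines = {}, []
    for line in text.splitlines():
        if line.startswith("#"):
            key, _, value = line.lstrip("#").partition(":")
            header[key.strip()] = value.strip()
        else:
            body_lines.append(line)
    rows = list(csv.DictReader(io.StringIO("\n".join(body_lines))))
    return header, rows

meta, data = read_timeseries(EXAMPLE)
print(meta["time_standard"], data[0]["sst"])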
Council for the Advancement of Standards Learning and Developmental Outcomes
ERIC Educational Resources Information Center
Council for the Advancement of Standards in Higher Education, 2008
2008-01-01
The Council for the Advancement of Standards in Higher Education (CAS) promotes standards to enhance opportunities for student learning and development from higher education programs and services. Responding to the increased shift in attention being paid by educators and their stakeholders from higher education inputs (i.e., standards and…
HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis
Sloto, Ronald A.; Crouse, Michele Y.
1996-01-01
HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
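As an illustration of the fixed-interval idea (not the USGS program itself), the sketch below cuts the daily record into consecutive intervals of fixed width and assigns every day in an interval the minimum discharge observed in that interval as base flow; the interval width and discharge values are hypothetical.

def fixed_interval_baseflow(discharge, width):
    """Assign each day the minimum daily discharge of its fixed-width interval."""
    baseflow = []
    for start in range(0, len(discharge), width):
        block = discharge[start:start + width]
        low = min(block)
        baseflow.extend([low] * len(block))
    return baseflow

daily_q = [12.0, 9.5, 8.0, 30.0, 22.0, 14.0, 10.0, 9.0, 8.5, 8.2]  # cfs, hypothetical
bf = fixed_interval_baseflow(daily_q, width=3)
runoff = [q - b for q, b in zip(daily_q, bf)]
print(bf)
print(runoff)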
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly's life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
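The sketch below illustrates the kind of per-assembly scripting burden the Automator replaces: read a table of assembly records and emit one input file per assembly from a text template. The column names and template are hypothetical placeholders, not actual ORIGAMI input syntax.

import csv
from pathlib import Path

TEMPLATE = """\
' assembly {assembly_id}
enrichment = {enrichment}
heavy_metal_mass = {hm_mass}
power_history = {power_history}
"""

def generate_inputs(csv_path, out_dir):
    """Write one templated input file per row of the assembly table."""
    Path(out_dir).mkdir(exist_ok=True)
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            text = TEMPLATE.format(**row)
            Path(out_dir, f"{row['assembly_id']}.inp").write_text(text)

# Hypothetical usage, assuming columns assembly_id, enrichment, hm_mass,
# and power_history in assemblies.csv:
# generate_inputs("assemblies.csv", "origami_inputs")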
The expected results method for data verification
NASA Astrophysics Data System (ADS)
Monday, Paul
2016-05-01
The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
Mark A. Bradford; Ashley D. Keiser; Christian A. Davies; Calley A. Mersmann; Michael S. Strickland
2012-01-01
Plant-carbon inputs to soils in the form of dissolved sugars, organic acids and amino acids fuel much of heterotrophic microbial activity belowground. Initial residence times of these compounds in the soil solution are on the order of hours, with microbial uptake a primary removal mechanism. Through microbial biosynthesis, the dissolved compounds become dominant...
ERIC Educational Resources Information Center
East, Martin; King, Chris
2012-01-01
In the listening component of the IELTS examination candidates hear the input once, delivered at "normal" speed. This format for listening can be problematic for test takers who often perceive normal speed input to be too fast for effective comprehension. The study reported here investigated whether using computer software to slow down…
User assessment of smoke-dispersion models for wildland biomass burning.
Steve Breyfogle; Sue A. Ferguson
1996-01-01
Several smoke-dispersion models, which currently are available for modeling smoke from biomass burns, were evaluated for ease of use, availability of input data, and output data format. The input and output components of all models are listed, and differences in model physics are discussed. Each model was installed and run on a personal computer with a simple-case...
The Challenges in Developing VET Competencies in E-Commerce.
ERIC Educational Resources Information Center
Mitchell, John
A formative evaluation was begun of an innovative project funded by the Australian National Training Authority (ANTA) to develop competencies and qualifications in e-commerce. The formative evaluation was designed to focus on inputs, processes, and interim outputs, identifying both good practice and areas for improvement. Findings to date…
Creating an Information Literacy Badges Program in Blackboard: A Formative Program Evaluation
ERIC Educational Resources Information Center
Tunon, Johanna; Ramirez, Laura Lucio; Ryckman, Brian; Campbell, Loy; Mlinar, Courtney
2015-01-01
A formative program evaluation using Stufflebeam's (2010) Context, Input, Process, Product (CIPP) model was conducted to assess the use of digital badges for tracking basic library instructional skills across academic programs at Nova Southeastern University. Based on the evaluation of pilot library modules and Blackboard Learn's badges…
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
Computer program documentation: CYBER to Univac binary conversion user's guide
NASA Technical Reports Server (NTRS)
Martin, E. W.
1980-01-01
A user's guide for a computer program which will convert SINDA temperature history data from CDC (Cyber) binary format to UNIVAC 1100 binary format is presented. The various options available, the required input, the optional output, file assignments, and the restrictions of the program are discussed.
Sonic Hedgehog Expression in Corticofugal Projection Neurons Directs Cortical Microcircuit Formation
Harwell, Corey C.; Parker, Philip R.L.; Gee, Steven M.; Okada, Ami; McConnell, Susan K.; Kreitzer, Anatol C.; Kriegstein, Arnold R.
2012-01-01
The precise connectivity of inputs and outputs is critical for cerebral cortex function; however, the cellular mechanisms that establish these connections are poorly understood. Here, we show that the secreted molecule Sonic Hedgehog (Shh) is involved in synapse formation of a specific cortical circuit. Shh is expressed in layer V corticofugal projection neurons and the Shh receptor, Brother of CDO (Boc), is expressed in local and callosal projection neurons of layer II/III that synapse onto the subcortical projection neurons. Layer V neurons of mice lacking functional Shh exhibit decreased synapses. Conversely, the loss of functional Boc leads to a reduction in the strength of synaptic connections onto layer Vb, but not layer II/III, pyramidal neurons. These results demonstrate that Shh is expressed in postsynaptic target cells while Boc is expressed in a complementary population of presynaptic input neurons, and they function to guide the formation of cortical microcircuitry. PMID:22445340
NASA Astrophysics Data System (ADS)
Prasad, U.; Rahabi, A.
2001-05-01
The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observation System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper - metadmp: The metadata dumper extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output. It does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper - heosls: The heosls dumper displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper - asciidmp: The ASCII dump utility extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human readable. With minor editing, asciidmp's output can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper - bindmp: The binary dumper utility dumps HDF-EOS objects in binary format. This is useful for feeding its output into an existing program which does not understand HDF, for example custom software and COTS products. HDF-EOS User Friendly Metadata - UFM: The UFM utility tool is useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display using a web browser. HDF-EOS METCHECK - METCHECK: METCHECK can be invoked from either a Unix or DOS environment with a set of command-line options that a user might use to direct the tool's inputs and output. METCHECK validates the inventory metadata in the (.met) file using the descriptor file (.desc) as the reference. The tool takes a (.desc) file and a (.met) ODL file as inputs, and generates a simple output file containing the results of the checking process.
Translating PI observing proposals into ALMA observing scripts
NASA Astrophysics Data System (ADS)
Liszt, Harvey S.
2014-08-01
The ALMA telescope is a complex 66-antenna array working in the specialized domain of mm- and sub-mm aperture synthesis imaging. To make ALMA accessible to technically inexperienced but scientifically expert users, the ALMA Observing Tool (OT) has been developed. Using the OT, scientifically oriented user input is formatted as observing proposals that are packaged for peer-review and assessment of technical feasibility. If accepted, the proposal's scientifically oriented inputs are translated by the OT into scheduling blocks, which function as input to observing scripts for the telescope's online control system. Here I describe the processes and practices by which this translation from PI scientific goals to online control input and schedule block execution actually occurs.
Adopting Cut Scores: Post-Standard-Setting Panel Considerations for Decision Makers
ERIC Educational Resources Information Center
Geisinger, Kurt F.; McCormick, Carina M.
2010-01-01
Standard-setting studies utilizing procedures such as the Bookmark or Angoff methods are just one component of the complete standard-setting process. Decision makers ultimately must determine what they believe to be the most appropriate standard or cut score to use, employing the input of the standard-setting panelists as one piece of information…
An accelerated training method for back propagation networks
NASA Technical Reports Server (NTRS)
Shelton, Robert O. (Inventor)
1993-01-01
The principal objective is to provide a training procedure for a feed forward, back propagation neural network which greatly accelerates the training process. A set of orthogonal singular vectors is determined from the input matrix such that the standard deviations of the projections of the input vectors along these singular vectors, as a set, are substantially maximized, thus providing an optimal means of presenting the input data. Novelty exists in the method of extracting, from the set of input data, a set of features which can serve to represent the input data in a simplified manner, thus greatly reducing the time and expense of training the system.
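The sketch below shows the generic preprocessing step this abstract describes: compute orthogonal singular vectors of the (centered) input matrix and project the input vectors onto them, so that the projections have maximal spread. This is the standard SVD/PCA rotation offered as an illustration, not the patented training procedure itself.

import numpy as np

def svd_features(inputs, n_features):
    """inputs: (n_samples, n_dims) array; returns projected features and basis."""
    centered = inputs - inputs.mean(axis=0)
    # Rows of vt are orthonormal singular vectors of the input matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_features]
    return centered @ basis.T, basis

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 10))
features, basis = svd_features(x, n_features=3)
print(features.shape, np.std(features, axis=0))   # largest spreads come first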
Remote media vision-based computer input device
NASA Astrophysics Data System (ADS)
Arabnia, Hamid R.; Chen, Ching-Yi
1991-11-01
In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.
Software Aids In Graphical Depiction Of Flow Data
NASA Technical Reports Server (NTRS)
Stegeman, J. D.
1995-01-01
Interactive Data Display System (IDDS) is a graphical-display computer program designed to assist in visualization of three-dimensional flow in turbomachinery. Grid and simulation data files in PLOT3D format are required as input. The program is able to unwrap the volumetric data cone associated with a centrifugal compressor and display results in easy-to-understand two- or three-dimensional plots. IDDS provides the majority of the visualization and analysis capability for the Integrated Computational Fluid Dynamics and Experiment (ICE) system. IDDS can be invoked from any subsystem or used as a stand-alone package of display software. It generates contour, vector, shaded, x-y, and carpet plots. Written in C language. The input file format used by IDDS is that of PLOT3D (COSMIC item ARC-12782).
Spacecraft Formation Flying Maneuvers Using Linear Quadratic Regulation With No Radial Axis Inputs
NASA Technical Reports Server (NTRS)
Starin, Scott R.; Yedavalli, R. K.; Sparks, Andrew G.; Bauer, Frank H. (Technical Monitor)
2001-01-01
Regarding multiple spacecraft formation flying, the observation has been made that control thrust need only be applied coplanar to the local horizon to achieve complete controllability of a two-satellite (leader-follower) formation. A formulation of orbital dynamics using the state of one satellite relative to another is used. Without the need for thrust along the radial (zenith-nadir) axis of the relative reference frame, propulsion system simplifications and weight reduction may be accomplished. This work focuses on the validation of this control system on its own merits, and in comparison to a related system which does provide thrust along the radial axis of the relative frame. Maneuver simulations are performed using commercial ODE solvers to propagate the Keplerian dynamics of a controlled satellite relative to an uncontrolled leader. These short maneuver simulations demonstrate the capacity of the controller to perform changes from one formation geometry to another. Control algorithm performance is evaluated based on measures such as the fuel required to complete a maneuver and the maximum acceleration required by the controller. Based on this evaluation, the exclusion of the radial axis of control still allows enough control authority to use Linear Quadratic Regulator (LQR) techniques to design a gain matrix of adequate performance over finite maneuvers. Additional simulations are conducted including perturbations and using no radial control inputs. A major conclusion presented is that control inputs along the three axes have significantly different relationships to the governing orbital dynamics that may be exploited using LQR.
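To make the control setup concrete, the sketch below builds a linear-quadratic regulator for the Hill/Clohessy-Wiltshire relative-motion equations with thrust allowed only along the in-track and cross-track axes (no radial input). The dynamics linearization, the mean motion value, and the Q/R weights are illustrative assumptions, not the formulation or gains used in the paper.

# LQR gain for Clohessy-Wiltshire relative dynamics with no radial-axis thrust.
# State: [x, y, z, vx, vy, vz] (x radial, y in-track, z cross-track).
# The orbital rate n and the weights Q, R are illustrative values only.
import numpy as np
from scipy.linalg import solve_continuous_are

n = 0.0011  # mean motion of the reference orbit [rad/s], roughly low Earth orbit
A = np.array([
    [0,       0,  0,      1,    0,  0],
    [0,       0,  0,      0,    1,  0],
    [0,       0,  0,      0,    0,  1],
    [3*n**2,  0,  0,      0,  2*n,  0],
    [0,       0,  0,   -2*n,    0,  0],
    [0,       0, -n**2,   0,    0,  0],
])
B = np.zeros((6, 2))
B[4, 0] = 1.0   # in-track acceleration input
B[5, 1] = 1.0   # cross-track acceleration input

Q = np.diag([1, 1, 1, 10, 10, 10])
R = 1e6 * np.eye(2)                      # penalize thrust (fuel use) heavily
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # feedback law: u = -K @ state
print(K)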
NASA Astrophysics Data System (ADS)
Hartini, Entin; Andiwijayakusuma, Dinan
2014-09-01
This research was carried out on the development of a code for uncertainty analysis based on a statistical approach for assessing the uncertainty of input parameters. In the burn-up calculation of fuel, uncertainty analysis was performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on probability density functions. The code was developed as a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation is done by modeling the geometry of a PWR core with MCNPX, at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain nuclear data in ACE format from ENDF through NJOY processing for temperature changes over a certain range.
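The statistical treatment of the uncertain inputs can be pictured with the short sketch below, which draws perturbed values of fuel density, coolant density, and fuel temperature from assumed normal distributions and substitutes them into an input-deck template. The template placeholders, nominal values, and relative uncertainties are hypothetical, and the coupling to MCNPX itself is not shown.

# Illustrative sampling of uncertain input parameters for a burn-up calculation.
# Nominal values, relative uncertainties, and the deck template are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=1)
nominal = {"fuel_density": 10.4,       # g/cm^3
           "coolant_density": 0.71,    # g/cm^3
           "fuel_temperature": 900.0}  # K
rel_sigma = {"fuel_density": 0.01, "coolant_density": 0.02, "fuel_temperature": 0.03}

deck_template = ("c  perturbed case\n"
                 "c  fuel at {fuel_density:.4f} g/cc, {fuel_temperature:.1f} K\n"
                 "c  coolant at {coolant_density:.4f} g/cc\n")

for i in range(100):                                   # 100 perturbed decks
    sample = {k: rng.normal(v, rel_sigma[k] * v) for k, v in nominal.items()}
    with open(f"case_{i:03d}.inp", "w") as f:
        f.write(deck_template.format(**sample))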
Shore, Sabrina; Henderson, Jordana M; Lebedev, Alexandre; Salcedo, Michelle P; Zon, Gerald; McCaffrey, Anton P; Paul, Natasha; Hogrefe, Richard I
2016-01-01
For most sample types, the automation of RNA and DNA sample preparation workflows enables high throughput next-generation sequencing (NGS) library preparation. Greater adoption of small RNA (sRNA) sequencing has been hindered by high sample input requirements and inherent ligation side products formed during library preparation. These side products, known as adapter dimer, are very similar in size to the tagged library. Most sRNA library preparation strategies thus employ a gel purification step to isolate tagged library from adapter dimer contaminants. At very low sample inputs, adapter dimer side products dominate the reaction and limit the sensitivity of this technique. Here we address the need for improved specificity of sRNA library preparation workflows with a novel library preparation approach that uses modified adapters to suppress adapter dimer formation. This workflow allows for lower sample inputs and elimination of the gel purification step, which in turn allows for an automatable sRNA library preparation protocol.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartini, Entin, E-mail: entin@batan.go.id; Andiwijayakusuma, Dinan, E-mail: entin@batan.go.id
2014-09-30
This research was carried out on the development of a code for uncertainty analysis based on a statistical approach for assessing the uncertainty of input parameters. In the burn-up calculation of fuel, uncertainty analysis was performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on probability density functions. The code was developed as a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation is done by modeling the geometry of a PWR core with MCNPX, at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain nuclear data in ACE format from ENDF through NJOY processing for temperature changes over a certain range.
Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations
Buscheck, Thomas A.
2012-01-01
Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure and thereby reduce the associated risks, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains the input and output files of the reservoir model analyses. A reservoir-model index file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.
NASA Astrophysics Data System (ADS)
Yoshida, Minori; Miyaji, Kousuke
2018-04-01
A start-up charge pump circuit for an extremely low input voltage (VIN) is proposed and demonstrated. The proposed circuit uses an inverter level shifter to generate a 2VIN voltage swing at the gates of both the main NMOS and PMOS power transistors in a charge pump to reduce the channel resistance. The proposed circuit is fully implemented in a standard 0.18 µm CMOS process, and the measurement result shows that a minimum input voltage of 190 mV is achieved and output power increases by 181% compared with the conventional forward-body-bias scheme at a 300 mV input voltage. The proposed scheme achieves a maximum efficiency of 59.2% when the input voltage is 390 mV and the output current is 320 nA. The proposed circuit is suitable as a start-up circuit in ultralow-power energy harvesting power management applications to boost up from below the threshold voltage.
Knowledge-based processing for aircraft flight control
NASA Technical Reports Server (NTRS)
Painter, John H.
1991-01-01
The purpose is to develop algorithms and architectures for embedding artificial intelligence in aircraft guidance and control systems. With the approach adopted, AI-computing is used to create an outer guidance loop for driving the usual aircraft autopilot. That is, a symbolic processor monitors the operation and performance of the aircraft. Then, based on rules and other stored knowledge, commands are automatically formulated for driving the autopilot so as to accomplish desired flight operations. The focus is on developing a software system which can respond to linguistic instructions, input in a standard format, so as to formulate a sequence of simple commands to the autopilot. The instructions might be a fairly complex flight clearance, input either manually or by data-link. Emphasis is on a software system which responds much like a pilot would, employing not only precise computations, but, also, knowledge which is less precise, but more like common-sense. The approach is based on prior work to develop a generic 'shell' architecture for an AI-processor, which may be tailored to many applications by describing the application in appropriate processor data bases (libraries). Such descriptions include numerical models of the aircraft and flight control system, as well as symbolic (linguistic) descriptions of flight operations, rules, and tactics.
Reversible control of biofilm formation by Cellulomonas spp. in response to nitrogen availability.
Young, Jenna M; Leschine, Susan B; Reguera, Gemma
2012-03-01
The microbial degradation of cellulose contributes greatly to the cycling of carbon in terrestrial environments and feedbacks to the atmosphere, a process that is highly responsive to nitrogen inputs. Yet how key groups of cellulolytic microorganisms adaptively respond to the global conditions of nitrogen limitation and/or anthropogenic or climate nitrogen inputs is poorly understood. The actinobacterial genus Cellulomonas is of special interest because it incorporates the only species known to degrade cellulose aerobically and anaerobically. Furthermore, despite their inability to fix nitrogen, they are active decomposers in nitrogen-limited environments. Here we show that nitrogen limitation induced biofilm formation in Cellulomonas spp., a process that was coupled to carbon sequestration and storage in a curdlan-type biofilm matrix. The response was reversible and the curdlan matrix was solubilized and used as a carbon and energy source for biofilm dispersal once nitrogen sources became available. The biofilms attached strongly to cellulosic surfaces and, despite the growth limitation, produced cellulases and degraded cellulose more efficiently. The results show that biofilm formation is a competitive strategy for carbon and nitrogen acquisition and provide valuable insights linking nitrogen inputs to carbon sequestration and remobilization in terrestrial environments. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.
National Assessment of Geologic Carbon Dioxide Storage Resources -- Trends and Interpretations
NASA Astrophysics Data System (ADS)
Buursink, M. L.; Blondes, M. S.; Brennan, S.; Drake, R., II; Merrill, M. D.; Roberts-Ashby, T. L.; Slucher, E. R.; Warwick, P.
2013-12-01
In 2012, the U.S. Geological Survey (USGS) completed an assessment of the technically accessible storage resource (TASR) for carbon dioxide (CO2) in geologic formations underlying the onshore and State waters area of the United States. The formations assessed are at least 3,000 feet (914 meters) below the ground surface. The TASR is an estimate of the CO2 storage resource that may be available for CO2 injection and storage that is based on present-day geologic and hydrologic knowledge of the subsurface and current engineering practices. Individual storage assessment units (SAUs) for 36 basins or study areas were defined on the basis of geologic and hydrologic characteristics outlined in the USGS assessment methodology. The mean national TASR is approximately 3,000 metric gigatons. To augment the release of the assessment, this study reviews input estimates and output results as a part of the resource calculation. Included in this study are a collection of both cross-plots and maps to demonstrate our trends and interpretations. Alongside the assessment, the input estimates were examined for consistency between SAUs and cross-plotted to verify expected trends, such as decreasing storage formation porosity with increasing SAU depth, for instance, and to show a positive correlation between storage formation porosity and permeability estimates. Following the assessment, the output results were examined for correlation with selected input estimates. For example, there exists a positive correlation between CO2 density and the TASR, and between storage formation porosity and the TASR, as expected. These correlations, in part, serve to verify our estimates for the geologic variables. The USGS assessment concluded that the Coastal Plains Region of the eastern and southeastern United States contains the largest storage resource. Within the Coastal Plains Region, the storage resources from the U.S. Gulf Coast study area represent 59 percent of the national CO2 storage capacity. As part of this follow up study, additional maps were generated to show the geographic distribution of the input estimates and the output results across the U.S. For example, the distribution of the SAUs with fresh, saline or mixed formation water quality is shown. Also mapped is the variation in CO2 density as related to basin location and to related properties such as subsurface temperature and pressure. Furthermore, variation in the estimated SAU depth and resulting TASR are shown across the assessment study areas, and these depend on the geologic basin size and filling history. Ultimately, multiple map displays are possible with the complete data set of input estimates and range of reported results. The findings from this study show the effectiveness of the USGS methodology and the robustness of the assessment.
Data reduction complex analog-to-digital data processing requirements for onsite test facilities
NASA Technical Reports Server (NTRS)
Debbrecht, J. D.
1976-01-01
The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.
Wang, Meng; Cai, Elizabeth; Fujiwara, Nana; Fones, Lilah; Brown, Elizabeth; Yanagawa, Yuchio; Cave, John W
2017-05-03
Adaptation of neural circuits to changes in sensory input can modify several cellular processes within neurons, including neurotransmitter biosynthesis levels. For a subset of olfactory bulb interneurons, activity-dependent changes in GABA are reflected by corresponding changes in Glutamate decarboxylase 1 (Gad1) expression levels. Mechanisms regulating Gad1 promoter activity are poorly understood, but here we show that a conserved G:C-rich region in the mouse Gad1 proximal promoter region both recruits heterogeneous nuclear ribonucleoproteins (hnRNPs) that facilitate transcription and forms single-stranded DNA secondary structures associated with transcriptional repression. This promoter architecture and function is shared with Tyrosine hydroxylase (Th), which is also modulated by odorant-dependent activity in the olfactory bulb. This study shows that the balance between DNA secondary structure formation and hnRNP binding on the mouse Th and Gad1 promoters in the olfactory bulb is responsive to changes in odorant-dependent sensory input. These findings reveal that Th and Gad1 share a novel transcription regulatory mechanism that facilitates sensory input-dependent regulation of dopamine and GABA expression. SIGNIFICANCE STATEMENT Adaptation of neural circuits to changes in sensory input can modify several cellular processes within neurons, including neurotransmitter biosynthesis levels. This study shows that transcription of genes encoding rate-limiting enzymes for GABA and dopamine biosynthesis (Gad1 and Th, respectively) in the mammalian olfactory bulb is regulated by G:C-rich regions that both recruit heterogeneous nuclear ribonucleoproteins (hnRNPs) to facilitate transcription and form single-stranded DNA secondary structures associated with repression. hnRNP binding and formation of DNA secondary structure on the Th and Gad1 promoters are mutually exclusive, and odorant sensory input levels regulate the balance between these regulatory features. These findings reveal that Th and Gad1 share a transcription regulatory mechanism that facilitates odorant-dependent regulation of dopamine and GABA expression levels. Copyright © 2017 the authors.
2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Wesley; Mai, Trieu; Logan, Jeffrey
This is the webinar presentation deck used to present the 2016 Standard Scenarios work. It discusses the Annual Technology Baseline (ATB) detailed cost and performance projections for electricity-generating technologies and the standard scenarios of the power sector modeling using ATB inputs.
Abraham, Joanna; Kannampallil, Thomas G; Almoosa, Khalid F; Patel, Bela; Patel, Vimla L
2014-04-01
Handoffs vary in their structure and content, raising concerns regarding standardization. We conducted a comparative evaluation of the nature and patterns of communication on 2 functionally similar but conceptually different handoff tools: Subjective, Objective, Assessment and Plan, based on a patient problem-based format, and Handoff Intervention Tool (HAND-IT), based on a body system-based format. A nonrandomized pre-post prospective intervention study supported by audio recordings and observations of 82 resident handoffs was conducted in a medical intensive care unit. Qualitative analysis was complemented with exploratory sequential pattern analysis techniques to capture the characteristics and types of communication events (CEs) and breakdowns. Use of HAND-IT led to fewer communication breakdowns (F(1,80) = 45.66; P < .0001) and a greater number of CEs (t(40) = 4.56; P < .001), with more ideal CEs than Subjective, Objective, Assessment and Plan (t(40) = 9.27; P < .001). In addition, the use of HAND-IT was characterized by more request-response CE transitions. HAND-IT's body system-based structure afforded physicians the ability to better organize and comprehend patient information and led to interactive and streamlined communication, with limited external input. Our results also emphasize the importance of information organization using a medical knowledge hierarchical format for fostering effective communication. Copyright © 2014 Elsevier Inc. All rights reserved.
Wade, James H; Jones, Joshua D; Lenov, Ivan L; Riordan, Colleen M; Sligar, Stephen G; Bailey, Ryan C
2017-08-22
The characterization of integral membrane proteins presents numerous analytical challenges on account of their poor activity under non-native conditions, limited solubility in aqueous solutions, and low expression in most cell culture systems. Nanodiscs are synthetic model membrane constructs that offer many advantages for studying membrane protein function by offering a native-like phospholipid bilayer environment. The successful incorporation of membrane proteins within Nanodiscs requires experimental optimization of conditions. Standard protocols for Nanodisc formation can require large amounts of time and input material, limiting the facile screening of formation conditions. Capitalizing on the miniaturization and efficient mass transport inherent to microfluidics, we have developed a microfluidic platform for efficient Nanodisc assembly and purification, and demonstrated the ability to incorporate functional membrane proteins into the resulting Nanodiscs. In addition to working with reduced sample volumes, this platform simplifies membrane protein incorporation from a multi-stage protocol requiring several hours or days into a single platform that outputs purified Nanodiscs in less than one hour. To demonstrate the utility of this platform, we incorporated Cytochrome P450 into Nanodiscs of variable size and lipid composition, and present spectroscopic evidence for the functional active site of the membrane protein. This platform is a promising new tool for membrane protein biology and biochemistry that enables tremendous versatility for optimizing the incorporation of membrane proteins using microfluidic gradients to screen across diverse formation conditions.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...
Precursors of Young Women's Family Formation Pathways
ERIC Educational Resources Information Center
Amato, Paul R.; Landale, Nancy S.; Havasevich-Brooks, Tara C.; Booth, Alan; Eggebeen, David J.; Schoen, Robert; McHale, Susan M.
2008-01-01
We used latent class analysis to create family formation pathways for women between the ages of 18 and 23. Input variables included cohabitation, marriage, parenthood, full-time employment, and attending school. Data (n = 2,290) came from Waves I and III of the National Longitudinal Study of Adolescent Health (Add Health). The analysis revealed…
Weng, Feng-Ju; Garcia, Rodrigo I; Lutzu, Stefano; Alviña, Karina; Zhang, Yuxiang; Dushko, Margaret; Ku, Taeyun; Zemoura, Khaled; Rich, David; Garcia-Dominguez, Dario; Hung, Matthew; Yelhekar, Tushar D; Sørensen, Andreas Toft; Xu, Weifeng; Chung, Kwanghun; Castillo, Pablo E; Lin, Yingxi
2018-03-07
Synaptic connections between hippocampal mossy fibers (MFs) and CA3 pyramidal neurons are essential for contextual memory encoding, but the molecular mechanisms regulating MF-CA3 synapses during memory formation and the exact nature of this regulation are poorly understood. Here we report that the activity-dependent transcription factor Npas4 selectively regulates the structure and strength of MF-CA3 synapses by restricting the number of their functional synaptic contacts without affecting the other synaptic inputs onto CA3 pyramidal neurons. Using an activity-dependent reporter, we identified CA3 pyramidal cells that were activated by contextual learning and found that MF inputs on these cells were selectively strengthened. Deletion of Npas4 prevented both contextual memory formation and this learning-induced synaptic modification. We further show that Npas4 regulates MF-CA3 synapses by controlling the expression of the polo-like kinase Plk2. Thus, Npas4 is a critical regulator of experience-dependent, structural, and functional plasticity at MF-CA3 synapses during contextual memory formation. Copyright © 2018 Elsevier Inc. All rights reserved.
ProtaBank: A repository for protein design and engineering data.
Wang, Connie Y; Chang, Paul M; Ary, Marie L; Allen, Benjamin D; Chica, Roberto A; Mayo, Stephen L; Olafson, Barry D
2018-03-25
We present ProtaBank, a repository for storing, querying, analyzing, and sharing protein design and engineering data in an actively maintained and updated database. ProtaBank provides a format to describe and compare all types of protein mutational data, spanning a wide range of properties and techniques. It features a user-friendly web interface and programming layer that streamlines data deposition and allows for batch input and queries. The database schema design incorporates a standard format for reporting protein sequences and experimental data that facilitates comparison of results across different data sets. A suite of analysis and visualization tools are provided to facilitate discovery, to guide future designs, and to benchmark and train new predictive tools and algorithms. ProtaBank will provide a valuable resource to the protein engineering community by storing and safeguarding newly generated data, allowing for fast searching and identification of relevant data from the existing literature, and exploring correlations between disparate data sets. ProtaBank invites researchers to contribute data to the database to make it accessible for search and analysis. ProtaBank is available at https://protabank.org. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J
2017-01-01
Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information retrieval inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy to use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
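A toy version of the label-matching step can be written in a few lines of Python; this mimics only the column-label comparison, not the full Synthesize algorithm (which also compares column values), and the filenames, similarity threshold, and example output are assumptions for illustration.

# Toy illustration of merging two CSV spreadsheets by fuzzy column-label matching.
import csv
import difflib

def match_columns(cols_a, cols_b, cutoff=0.8):
    cols_b = list(cols_b)
    lowered = [b.lower() for b in cols_b]
    mapping = {}
    for a in cols_a:
        hits = difflib.get_close_matches(a.lower(), lowered, n=1, cutoff=cutoff)
        if hits:
            mapping[a] = cols_b[lowered.index(hits[0])]   # best-matching label in B
    return mapping

with open("study_a.csv") as fa, open("study_b.csv") as fb:
    rows_a, rows_b = list(csv.DictReader(fa)), list(csv.DictReader(fb))
mapping = match_columns(rows_a[0].keys(), rows_b[0].keys())
print(mapping)   # e.g. {'Site name': 'site_name', 'Salinity (ppt)': 'salinity_ppt'}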
Simplified Metadata Curation via the Metadata Management Tool
NASA Astrophysics Data System (ADS)
Shum, D.; Pilone, D.
2015-12-01
The Metadata Management Tool (MMT) is the newest capability developed as part of NASA Earth Observing System Data and Information System's (EOSDIS) efforts to simplify metadata creation and improve metadata quality. The MMT was developed via an agile methodology, taking into account inputs from GCMD's science coordinators and other end-users. In its initial release, the MMT uses the Unified Metadata Model for Collections (UMM-C) to allow metadata providers to easily create and update collection records in the ISO-19115 format. Through a simplified UI experience, metadata curators can create and edit collections without full knowledge of the NASA Best Practices implementation of ISO-19115 format, while still generating compliant metadata. More experienced users are also able to access raw metadata to build more complex records as needed. In future releases, the MMT will build upon recent work done in the community to assess metadata quality and compliance with a variety of standards through application of metadata rubrics. The tool will provide users with clear guidance as to how to easily change their metadata in order to improve their quality and compliance. Through these features, the MMT allows data providers to create and maintain compliant and high quality metadata in a short amount of time.
Spherical rotation orientation indication for HEVC and JEM coding of 360 degree video
NASA Astrophysics Data System (ADS)
Boyce, Jill; Xu, Qian
2017-09-01
Omnidirectional (or "360 degree") video, representing a panoramic view of a spherical 360°×180° scene, can be encoded using conventional video compression standards once it has been projection mapped to a 2D rectangular format. The equirectangular projection format is currently used for mapping 360 degree video to a rectangular representation for coding using HEVC/JEM. However, video in the top and bottom regions of the image, corresponding to the "north pole" and "south pole" of the spherical representation, is significantly warped. We propose to perform spherical rotation of the input video prior to HEVC/JEM encoding in order to improve the coding efficiency, and to signal parameters in a supplemental enhancement information (SEI) message that describe the inverse rotation process recommended to be applied following HEVC/JEM decoding, prior to display. Experimental results show that up to 17.8% bitrate gain (using the WS-PSNR end-to-end metric) can be achieved for the Chairlift sequence using HM16.15 and 11.9% gain using JEM6.0, with an average gain of 2.9% for HM16.15 and 2.2% for JEM6.0.
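The pre-encoding rotation can be pictured as a remapping of the equirectangular image: each output pixel's longitude/latitude is converted to a unit vector, rotated, and sampled from the input frame. The sketch below is an illustrative nearest-neighbour version only, not the HM/JEM processing chain; the yaw/pitch/roll convention and the frame dimensions are assumptions, and the codec integration and SEI signalling are not shown.

# Illustrative spherical (yaw/pitch/roll) rotation of an equirectangular frame by
# inverse remapping with nearest-neighbour sampling.
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def rotate_erp(frame, yaw, pitch, roll):
    h, w = frame.shape[:2]
    j, i = np.meshgrid(np.arange(w), np.arange(h))            # output pixel grid
    lon = (j + 0.5) / w * 2 * np.pi - np.pi                    # longitude in [-pi, pi)
    lat = np.pi / 2 - (i + 0.5) / h * np.pi                    # latitude in (-pi/2, pi/2]
    xyz = np.stack([np.cos(lat) * np.cos(lon),
                    np.cos(lat) * np.sin(lon),
                    np.sin(lat)], axis=-1)
    xyz = xyz @ rotation_matrix(yaw, pitch, roll).T            # rotate the viewing rays
    lon_s = np.arctan2(xyz[..., 1], xyz[..., 0])
    lat_s = np.arcsin(np.clip(xyz[..., 2], -1, 1))
    js = ((lon_s + np.pi) / (2 * np.pi) * w).astype(int) % w   # wrap in longitude
    is_ = ((np.pi / 2 - lat_s) / np.pi * h).astype(int).clip(0, h - 1)
    return frame[is_, js]

frame = np.zeros((180, 360, 3), dtype=np.uint8)                # stand-in for a picture
rotated = rotate_erp(frame, yaw=np.radians(30), pitch=np.radians(10), roll=0.0)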
Standard free energy of formation of iron iodide
NASA Technical Reports Server (NTRS)
Khandkar, A.; Tare, V. B.; Wagner, J. B., Jr.
1983-01-01
An experiment is reported where silver iodide is used to determine the standard free energy of formation of iron iodide. By using silver iodide as a solid electrolyte, a galvanic cell, Ag/AgI/Fe-FeI2, is formulated. The standard free energy of formation of AgI is known, and hence it is possible to estimate the standard free energy of formation of FeI2 by measuring the open-circuit emf of the above cell as a function of temperature. The standard free energy of formation of FeI2 determined by this method is -38784 + 24.165T cal/mol. It is estimated that the maximum error associated with this method is plus or minus 2500 cal/mol.
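One way to write the underlying relation, assuming the overall cell reaction is Fe + 2AgI -> FeI2 + 2Ag (a two-electron transfer), with E° the measured open-circuit emf and F the Faraday constant, is:

\Delta G^{\circ}_{f}(\mathrm{FeI_2}) = 2\,\Delta G^{\circ}_{f}(\mathrm{AgI}) - 2FE^{\circ}

so that measuring E° over a range of temperatures yields the temperature-dependent free energy of formation quoted above.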
Implementation of a standard format for GPS common view data
NASA Technical Reports Server (NTRS)
Weiss, Marc A.; Thomas, Claudine
1995-01-01
A new format for standardizing common view time transfer data, recommended by the Consultative Committee for the Definition of the Second, is being implemented in receivers commonly used for contributing data for the generation of International Atomic Time. We discuss three aspects of this new format that potentially improve GPS common-view time transfer: (1) the standard specifies the method for treating short term data, (2) it presents data in consistent formats including needed terms not previously available, and (3) the standard includes a header of parameters important for the GPS common-view process. In coordination with the release of firmware conforming to this new format the Bureau International des Poids et Mesures will release future international track schedules consistent with the new standard.
Orzol, Leonard L.; McGrath, Timothy S.
1992-01-01
This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring data. Programs such as GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC, avoids the two translation steps and transfers data directly to and from the ground-water-flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.
CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergman, Rolf; Paget, Maria L.; Richman, Eric E.
2011-03-31
With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to Uncertainty Measurements (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates, and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
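To illustrate the GUM-style arithmetic that such a spreadsheet performs, the short sketch below combines a few relative standard uncertainties in quadrature, assuming uncorrelated inputs and unit sensitivity coefficients, and expands the result with a coverage factor k = 2. The component names and magnitudes are made up for illustration only and are not values from the report.

# Hypothetical GUM-style combination of relative standard uncertainties for a
# lumen measurement, assuming uncorrelated inputs and sensitivity coefficients of 1.
import math

components = {                 # relative standard uncertainties (illustrative values)
    "detector_calibration": 0.008,
    "spectral_mismatch": 0.004,
    "sphere_spatial_nonuniformity": 0.005,
    "photometer_linearity": 0.002,
    "repeatability": 0.003,
}
u_combined = math.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined            # coverage factor k = 2 (~95 % confidence)
print(f"combined: {u_combined:.2%}, expanded (k=2): {U_expanded:.2%}")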
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
K-12 Louisiana Student Standards for Mathematics. Louisiana Student Standards: Mathematics
ERIC Educational Resources Information Center
Louisiana Department of Education, 2016
2016-01-01
The Louisiana mathematics standards were created by over one hundred Louisiana educators with input by thousands of parents and teachers from across the state. Educators envisioned what mathematically proficient students should know and be able to do to compete in society and focused their efforts on creating standards that would allow them to do…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
... Collection; Comment Request; Input From Hawaii's Boat-based Anglers AGENCY: National Oceanic and Atmospheric... Marine Recreational Information Program's National Data Standards. The State of Hawaii is developing a... (monitoring) survey of fishing catch and effort derived from Hawaii's private boaters--a required component of...
Response Modality Variations Affect Determinations of Children's Learning Styles.
ERIC Educational Resources Information Center
Janowitz, Jeffrey M.
The Swassing-Barbe Modality Index (SBMI) uses visual, auditory, and tactile inputs, but only reconstructed output, to measure children's modality strengths. In this experiment, the SBMI's three input modalities were crossed with two output modalities (spoken and drawn) in addition to the reconstructed standard to result in nine treatment…
Proposal for a Standard Format for Neurophysiology Data Recording and Exchange.
Stead, Matt; Halford, Jonathan J
2016-10-01
The lack of interoperability between information networks is a significant source of cost in health care. Standardized data formats decrease health care cost, improve quality of care, and facilitate biomedical research. There is no common standard digital format for storing clinical neurophysiologic data. This review proposes a new standard file format for neurophysiology data (the bulk of which is video-electroencephalographic data), entitled the Multiscale Electrophysiology Format, version 3 (MEF3), which is designed to address many of the shortcomings of existing formats. MEF3 provides functionality that addresses many of the limitations of current formats. The proposed improvements include (1) hierarchical file structure with improved organization; (2) greater extensibility for big data applications requiring a large number of channels, signal types, and parallel processing; (3) efficient and flexible lossy or lossless data compression; (4) industry standard multilayered data encryption and time obfuscation that permits sharing of human data without the need for deidentification procedures; (5) resistance to file corruption; (6) facilitation of online and offline review and analysis; and (7) provision of full open source documentation. At this time, there is no other neurophysiology format that supports all of these features. MEF3 is currently gaining industry and academic community support. The authors propose the use of the MEF3 as a standard format for neurophysiology recording and data exchange. Collaboration between industry, professional organizations, research communities, and independent standards organizations is needed to move the project forward.
Holcomb, Paul S.; Hoffpauir, Brian K.; Hoyson, Mitchell C.; Jackson, Dakota R.; Deerinck, Thomas J.; Marrs, Glenn S.; Dehoff, Marlin; Wu, Jonathan; Ellisman, Mark H.
2013-01-01
Hallmark features of neural circuit development include early exuberant innervation followed by competition and pruning to mature innervation topography. Several neural systems, including the neuromuscular junction and climbing fiber innervation of Purkinje cells, are models to study neural development in part because they establish a recognizable endpoint of monoinnervation of their targets and because the presynaptic terminals are large and easily monitored. We demonstrate here that calyx of Held (CH) innervation of its target, which forms a key element of auditory brainstem binaural circuitry, exhibits all of these characteristics. To investigate CH development, we made the first application of serial block-face scanning electron microscopy to neural development with fine temporal resolution and thereby accomplished the first time series for 3D ultrastructural analysis of neural circuit formation. This approach revealed a growth spurt of added apposed surface area (ASA) >200 μm2/d centered on a single age at postnatal day 3 in mice and an initial rapid phase of growth and competition that resolved to monoinnervation in two-thirds of cells within 3 d. This rapid growth occurred in parallel with an increase in action potential threshold, which may mediate selection of the strongest input as the winning competitor. ASAs of competing inputs were segregated on the cell body surface. These data suggest mechanisms to select “winning” inputs by regional reinforcement of postsynaptic membrane to mediate size and strength of competing synaptic inputs. PMID:23926251
Electrical Advantages of Dendritic Spines
Gulledge, Allan T.; Carnevale, Nicholas T.; Stuart, Greg J.
2012-01-01
Many neurons receive excitatory glutamatergic input almost exclusively onto dendritic spines. In the absence of spines, the amplitudes and kinetics of excitatory postsynaptic potentials (EPSPs) at the site of synaptic input are highly variable and depend on dendritic location. We hypothesized that dendritic spines standardize the local geometry at the site of synaptic input, thereby reducing location-dependent variability of local EPSP properties. We tested this hypothesis using computational models of simplified and morphologically realistic spiny neurons that allow direct comparison of EPSPs generated on spine heads with EPSPs generated on dendritic shafts at the same dendritic locations. In all morphologies tested, spines greatly reduced location-dependent variability of local EPSP amplitude and kinetics, while having minimal impact on EPSPs measured at the soma. Spine-dependent standardization of local EPSP properties persisted across a range of physiologically relevant spine neck resistances, and in models with variable neck resistances. By reducing the variability of local EPSPs, spines standardized synaptic activation of NMDA receptors and voltage-gated calcium channels. Furthermore, spines enhanced activation of NMDA receptors and facilitated the generation of NMDA spikes and axonal action potentials in response to synaptic input. Finally, we show that dynamic regulation of spine neck geometry can preserve local EPSP properties following plasticity-driven changes in synaptic strength, but is inefficient in modifying the amplitude of EPSPs in other cellular compartments. These observations suggest that one function of dendritic spines is to standardize local EPSP properties throughout the dendritic tree, thereby allowing neurons to use similar voltage-sensitive postsynaptic mechanisms at all dendritic locations. PMID:22532875
Synaptic Plasticity and Memory Formation
1990-12-05
aniracetam was found recently to enhance the conductance of AMPA receptors expressed in oocytes from rat brain mRNA without altering responses by NMDA and...laboratory using the two-input paradigm indicates that aniracetam increases control responses by 25 ± 8% (n = 20) but potentiated inputs by only 14 ± 6... aniracetam has no effect on NMDA receptor mediated responses (Xiao et al., in prep.). These latter experiments used the paradigm established by Muller
The Application of HOS to PLRS
1977-11-01
of "older," more established fields, like philosophy or mathematics , and more recently, linguistics. But when working with large systems, there is...property of natural language, which is eliminated by using formal, mathematical specifications. 3.4.2.2 Network Management Processing: The Network...the format: y = f(x) That is, we must immediately begin thinking of the problem in terms of mathematical functions (mappings) acting on some input(s
NASA Technical Reports Server (NTRS)
Huffman, S.
1977-01-01
Detailed instructions on the use of two computer-aided-design programs for designing the energy storage inductor for single winding and two winding dc to dc converters are provided. Step by step procedures are given to illustrate the formatting of user input data. The procedures are illustrated by eight sample design problems which include the user input and the computer program output.
Brandt, Adam R
2008-10-01
Oil shale is a sedimentary rock that contains kerogen, a fossil organic material. Kerogen can be heated to produce oil and gas (retorted). This has traditionally been a CO2-intensive process. In this paper, the Shell in situ conversion process (ICP), which is a novel method of retorting oil shale in place, is analyzed. The ICP utilizes electricity to heat the underground shale over a period of 2 years. Hydrocarbons are produced using conventional oil production techniques, leaving shale oil coke within the formation. The energy inputs and outputs from the ICP, as applied to oil shales of the Green River formation, are modeled. Using these energy inputs, the greenhouse gas (GHG) emissions from the ICP are calculated and are compared to emissions from conventional petroleum. Energy outputs (as refined liquid fuel) are 1.2-1.6 times greater than the total primary energy inputs to the process. In the absence of capturing CO2 generated from electricity produced to fuel the process, well-to-pump GHG emissions are in the range of 30.6-37.1 grams of carbon equivalent per megajoule of liquid fuel produced. These full-fuel-cycle emissions are 21%-47% larger than those from conventionally produced petroleum-based fuels.
Ishizuka, Ken'Ichi; Satoh, Yoshihide
2012-08-16
In rats that had been anesthetized by urethane-chloralose, we investigated whether neurons in the rostral part of the parvicellular reticular formation (rRFp) mediate lingual nerve input to the rostral ventrolateral medulla (RVLM), which is involved in somato-visceral sensory integration and in controlling the cardiovascular system. We determined the effect of the lingual nerve stimulation on activity of the rRFp neurons that were activated antidromically by stimulation of the RVLM. Stimulation of the lingual trigeminal afferent gave rise to excitatory effects (10/26, 39%), inhibitory effects (6/26, 22%) and no effect (10/26, 39%) on the RVLM-projecting rRFp neurons. About two-thirds of RVLM-projecting rRFp neurons exhibited spontaneous activity; the remaining one-third did not. A half (13/26) of RVLM-projecting rRFp neurons exhibited a pulse-related activity, suggesting that they receive a variety of peripheral and CNS inputs involved in cardiovascular function. We conclude that the lingual trigeminal input exerts excitatory and/or inhibitory effects on a majority (61%) of the RVLM-projecting rRFp neurons, and their neuronal activity may be involved in the cardiovascular responses accompanied by the defense reaction. Copyright © 2012 Elsevier B.V. All rights reserved.
PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR
NASA Technical Reports Server (NTRS)
Otte, N. E.
1994-01-01
PATSTAGS translates PATRAN finite model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It is able to support translations of nodal constraints, nodal, element, force and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file and a STAGS pressure data file. The user provides the names for the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.
Estimating Basic Preliminary Design Performances of Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Luz, Paul L.; Alexander, Reginald
2004-01-01
Aerodynamics and Performance Estimation Toolset is a collection of four software programs for rapidly estimating the preliminary design performance of aerospace vehicles using simplified calculations based on ballistic trajectories, the ideal rocket equation, and supersonic wedges in a standard atmosphere. The toolset consists of a set of Microsoft Excel worksheet subprograms. The input and output data are presented in a user-friendly format, and calculations are performed rapidly enough that the user can iterate among different trajectories and/or shapes to perform "what-if" studies. Estimates that can be computed by these programs include: 1. Ballistic trajectories as a function of departure angles, initial velocities, initial positions, and target altitudes, assuming point masses and no atmosphere; the program plots the trajectory in two dimensions and outputs the position, pitch, and velocity along the trajectory. 2. The "Rocket Equation" program calculates and plots the trade space for a vehicle's propellant mass fraction over a range of specific impulse and mission velocity values. 3. "Standard Atmosphere" estimates the temperature, speed of sound, pressure, and air density as functions of altitude in a standard atmosphere. 4. "Supersonic Wedges" calculates the free-stream, normal-shock, oblique-shock, and isentropic flow properties for a wedge-shaped body flying supersonically through a standard atmosphere; it also calculates the maximum angle for which a shock remains attached and the minimum Mach number for which a shock becomes attached, all as functions of the wedge angle, altitude, and Mach number.
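The core arithmetic of the "Rocket Equation" worksheet follows directly from the ideal rocket equation; a minimal Python stand-in for that trade-space calculation (the actual tool is an Excel worksheet, and the specific impulse and delta-v values below are purely illustrative) is:

# Propellant mass fraction from the ideal rocket equation:
#   pmf = 1 - exp(-delta_v / (Isp * g0))
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass_fraction(delta_v, isp):
    """delta_v in m/s, isp in seconds."""
    return 1.0 - math.exp(-delta_v / (isp * G0))

for isp in (300, 350, 450):                      # illustrative specific impulses, s
    for dv in (3000, 6000, 9000):                # illustrative mission delta-v, m/s
        print(isp, dv, round(propellant_mass_fraction(dv, isp), 3))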
Inputs to the Pulp and Paper Industry October 2011 Residual Risk Assessment
The U.S. Environmental Protection Agency (EPA) has to conduct risk assessments on each source category subject to MACT standards and determine if additional standards are needed to reduce residual risks.
Solar powered hybrid sensor module program
NASA Technical Reports Server (NTRS)
Johnson, J. M.; Holmes, H. K.
1985-01-01
Geo-orbital systems of the near future will require more sophisticated electronic and electromechanical monitoring and control systems than current satellite systems, with an emphasis in the design on the electronic density and autonomy of the subsystem components. Results of a project to develop, design, and implement a proof-of-concept sensor system for space applications, with hybrids forming the active subsystem components, are described. The design of the solar-powered hybrid sensor modules is discussed. Module construction and function are described. These modules combine low-power CMOS electronics, GaAs solar cells, a crystal oscillator, standard UART data formatting, and a bidirectional optical data link into a single 1.25 x 1.25 x 0.25 inch hybrid package which has no need for electrical input or output. Several modules were built and tested. Applications of such a system for future space missions are also discussed.
BGen: A UML Behavior Network Generator Tool
NASA Technical Reports Server (NTRS)
Huntsberger, Terry; Reder, Leonard J.; Balian, Harry
2010-01-01
BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of a Behavior Network has been developed in order to simplify the development of C code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network functions as a decision tree.
Ionospheric characteristics for archiving at the World Data Centers. Technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamache, R.R.; Reinisch, B.W.
1990-12-01
A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop in Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by the World Data Center A for solar terrestrial physics for archiving characteristics. Here the database has been slightly refined for the application and programs written to generate these database files using as input Digisonde 256 ARTIST data, post-processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program as well as supplemental programs developed for this task are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high latitude Digisonde network, the AWS DISS and the international Digisonde networks, and other ionospheric sounding networks.
Power-controlled transition from standard to negative refraction in reorientational soft matter.
Piccardi, Armando; Alberucci, Alessandro; Kravets, Nina; Buchnev, Oleksandr; Assanto, Gaetano
2014-11-25
Refraction at a dielectric interface can take an anomalous character in anisotropic crystals, when light is negatively refracted with incident and refracted beams emerging on the same side of the interface normal. In soft matter subject to reorientation, such as nematic liquid crystals, the nonlinear interaction with light allows tuning of the optical properties. We demonstrate that in such material a beam of light can experience either positive or negative refraction depending on input power, as it can alter the spatial distribution of the optic axis and, in turn, the direction of the energy flow when traveling across an interface. Moreover, the nonlinear optical response yields beam self-focusing and spatial localization into a self-confined solitary wave through the formation of a graded-index waveguide, linking the refractive transition to power-driven readdressing of copolarized guided-wave signals, with a number of output ports not limited by diffraction.
Atmospheric products from the Upper Atmosphere Research Satellite (UARS)
NASA Technical Reports Server (NTRS)
Ahmad, Suraiya P.; Johnson, James E.; Jackman, Charles H.
2003-01-01
This paper provides information on the products available at the NASA Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) from the Upper Atmosphere Research Satellite (UARS) mission. The GES DAAC provides measurements from the primary UARS mission, which extended from launch in September 1991 through September 2001. The ten instruments aboard UARS provide measurements of atmospheric trace gas species, dynamical variables, solar irradiance input, and particle energy flux. All standard Level 3 UARS products from all ten instruments are offered free to the public and science user community. The Level 3 data are geophysical parameters, which have been transformed into a common format and equally spaced along the measurement trajectory. The UARS data have been reprocessed several times over the years following improvements to the processing algorithms. The UARS data offered from the GES DAAC are the latest versions of each instrument. The UARS data may be accessed through the GES DAAC website at
ChromA: signal-based retention time alignment for chromatography-mass spectrometry data.
Hoffmann, Nils; Stoye, Jens
2009-08-15
We describe ChromA, a web-based alignment tool for chromatography-mass spectrometry data from the metabolomics and proteomics domains. Users can supply their data in open and standardized file formats for retention time alignment using dynamic time warping with different configurable local distance and similarity functions. Additionally, user-defined anchors can be used to constrain and speed up the alignment. A neighborhood around each anchor can be added to increase the flexibility of the constrained alignment. ChromA offers different visualizations of the alignment for easier qualitative interpretation and comparison of the data. For the multiple alignment of more than two data files, the center-star approximation is applied to select a reference among input files to align to. ChromA is available at http://bibiserv.techfak.uni-bielefeld.de/chroma. Executables and source code under the L-GPL v3 license are provided for download at the same location.
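The core alignment idea, dynamic time warping with a configurable local distance, can be sketched as follows (a toy illustration, not ChromA's actual implementation; the signals and distance function are placeholders):

    import numpy as np

    def dtw(a, b, dist=lambda x, y: abs(x - y)):
        # Classic dynamic time warping: accumulated alignment cost between
        # two 1-D signals a and b under a configurable local distance.
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = dist(a[i - 1], b[j - 1])
                # extend the cheapest of the three predecessor alignments
                cost[i, j] = d + min(cost[i - 1, j],      # gap in b
                                     cost[i, j - 1],      # gap in a
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    # Two toy chromatogram traces with a small retention-time shift
    ref   = np.array([0.0, 0.1, 1.0, 0.2, 0.0, 0.0])
    query = np.array([0.0, 0.0, 0.1, 1.0, 0.2, 0.0])
    print(dtw(ref, query))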
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
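As a schematic of the kind of forward folding such a framework performs, the expected counts in each energy bin are the source flux multiplied by the effective area, exposure, and bin width (all numbers and names below are illustrative, not XIMPOL's API):

    import numpy as np

    energy = np.linspace(2.0, 8.0, 61)                       # keV
    flux = 0.1 * energy ** -2.0                              # photons / cm^2 / s / keV
    eff_area = 200.0 * np.exp(-(energy - 3.0) ** 2 / 8.0)    # cm^2 (toy response curve)
    exposure = 100e3                                         # s

    # Expected counts per bin = flux * effective area * exposure * bin width
    counts = flux * eff_area * exposure * np.gradient(energy)
    print(f"total expected counts: {counts.sum():.0f}")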
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staley, Martin
2017-09-20
This high-performance ray tracing library provides very fast rendering; compact code; type flexibility through C++ "generic programming" techniques; and ease of use via an application programming interface (API) that operates independently of any GUI, on-screen display, or other enclosing application. Kip supports constructive solid geometry (CSG) models based on a wide variety of built-in shapes and logical operators, and also allows for user-defined shapes and operators to be provided. Additional features include basic texturing; input/output of models using a simple human-readable file format and with full error checking and detailed diagnostics; and support for shared data parallelism. Kip is written in pure, ANSI standard C++; is entirely platform independent; and is very easy to use. As a C++ "header only" library, it requires no build system, configuration or installation scripts, wizards, non-C++ preprocessing, makefiles, shell scripts, or external libraries.
Instructional Simulation Integrates Research, Education, and Practice.
Teasdale, Thomas A; Mapes, Sheryl A; Henley, Omolara; Lindsey, Jeanene; Dillard, Della
2016-01-01
Instructional simulation is widely used in clinical education. Examples include the use of inanimate models meant to imitate humans, standardized patients who are actors portraying patients with certain conditions, and role-play where learners experience the disease through props and circumstances. These modalities are briefly described, and then case examples are provided of simulation curricula in use that integrate research findings and clinical practice expertise to guide development and implementation steps. The cases illustrate how formative and summative feedback from two legs of the "three-legged stool" can be potent integrating forces in development of simulation curricula. In these examples, the educational outputs benefit from purposeful inclusion of research and practice inputs. Costs are outlined for instructor and learner time commitments, space considerations, and expendables. The authors' data and experience suggest that instructional simulation that is supported by a solid scientific base and clinical expertise is appreciated by teachers and learners.
Data integration in biological research: an overview.
Lapatas, Vasileios; Stefanidakis, Michalis; Jimenez, Rafael C; Via, Allegra; Schneider, Maria Victoria
2015-12-01
Data sharing, integration and annotation are essential to ensure the reproducibility of the analysis and interpretation of the experimental findings. Often these activities are perceived as a role that bioinformaticians and computer scientists have to take with no or little input from the experimental biologist. On the contrary, biological researchers, being the producers and often the end users of such data, have a big role in enabling biological data integration. The quality and usefulness of data integration depend on the existence and adoption of standards, shared formats, and mechanisms that are suitable for biological researchers to submit and annotate the data, so it can be easily searchable, conveniently linked and consequently used for further biological analysis and discovery. Here, we provide background on what is data integration from a computational science point of view, how it has been applied to biological research, which key aspects contributed to its success and future directions.
Vandvik, Per Olav; Alonso-Coello, Pablo; Akl, Elie A; Thornton, Judith; Rigau, David; Adams, Katie; O'Connor, Paul; Guyatt, Gordon; Kristiansen, Annette
2017-01-01
Objectives To investigate practicing physicians' preferences, perceived usefulness and understanding of a new multilayered guideline presentation format—compared to a standard format—as well as conceptual understanding of trustworthy guideline concepts. Design Participants attended a standardised lecture in which they were presented with a clinical scenario and randomised to view a guideline recommendation in a multilayered format or standard format after which they answered multiple-choice questions using clickers. Both groups were also presented and asked about guideline concepts. Setting Mandatory educational lectures in 7 non-academic and academic hospitals, and 2 settings involving primary care in Lebanon, Norway, Spain and the UK. Participants 181 practicing physicians in internal medicine (156) and general practice (25). Interventions A new digitally structured, multilayered guideline presentation format and a standard narrative presentation format currently in widespread use. Primary and secondary outcome measures Our primary outcome was preference for presentation format. Understanding, perceived usefulness and perception of absolute effects were secondary outcomes. Results 72% (95% CI 65 to 79) of participants preferred the multilayered format and 16% (95% CI 10 to 22) preferred the standard format. A majority agreed that recommendations (multilayered 86% vs standard 91%, p value=0.31) and evidence summaries (79% vs 77%, p value=0.76) were useful in the context of the clinical scenario. 72% of participants randomised to the multilayered format vs 58% for standard formats reported correct understanding of the recommendations (p value=0.06). Most participants elected an appropriate clinical action after viewing the recommendations (98% vs 92%, p value=0.10). 82% of the participants considered absolute effect estimates in evidence summaries helpful or crucial. Conclusions Clinicians clearly preferred a novel multilayered presentation format to the standard format. Whether the preferred format improves decision-making and has an impact on patient important outcomes merits further investigation. PMID:28188149
Full value documentation in the Czech Food Composition Database.
Machackova, M; Holasova, M; Maskova, E
2010-11-01
The aim of this project was to launch a new Food Composition Database (FCDB) Programme in the Czech Republic; to implement a methodology for food description and value documentation according to the standards designed by the European Food Information Resource (EuroFIR) Network of Excellence; and to start the compilation of a pilot FCDB. Foods for the initial data set were selected from the list of foods included in the Czech Food Consumption Basket. Selection of 24 priority components was based on the range of components used in former Czech tables. The priority list was extended with components for which original Czech analytical data or calculated data were available. Values that were input into the compiled database were documented according to the EuroFIR standards within the entities FOOD, COMPONENT, VALUE and REFERENCE using Excel sheets. Foods were described using the LanguaL Thesaurus. A template for documentation of data according to the EuroFIR standards was designed. The initial data set comprised documented data for 162 foods. Values were based on original Czech analytical data (available for traditional and fast foods, milk and milk products, wheat flour types), data derived from literature (for example, fruits, vegetables, nuts, legumes, eggs) and calculated data. The Czech FCDB programme has been successfully relaunched. Inclusion of the Czech data set into the EuroFIR eSearch facility confirmed compliance of the database format with the EuroFIR standards. Excel spreadsheets are applicable for full value documentation in the FCDB.
Cohen, Jeremy D; Bolstad, Mark; Lee, Albert K
2017-01-01
The hippocampus is critical for producing stable representations of familiar spaces. How these representations arise is poorly understood, largely because changes to hippocampal inputs have not been measured during spatial learning. Here, using intracellular recording, we monitored inputs and plasticity-inducing complex spikes (CSs) in CA1 neurons while mice explored novel and familiar virtual environments. Inputs driving place field spiking increased in amplitude – often suddenly – during novel environment exploration. However, these increases were not sustained in familiar environments. Rather, the spatial tuning of inputs became increasingly similar across repeated traversals of the environment with experience – both within fields and throughout the whole environment. In novel environments, CSs were not necessary for place field formation. Our findings support a model in which initial inhomogeneities in inputs are amplified to produce robust place field activity, then plasticity refines this representation into one with less strongly modulated, but more stable, inputs for long-term storage. DOI: http://dx.doi.org/10.7554/eLife.23040.001 PMID:28742496
Pitot tube calculations with a TI-59
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, K.
Industrial plant and stack analysis dictates that flow measurements in ducts be accurate. This is usually accomplished by running a traverse with a pitot tube across the duct or flue. A traverse is a series of measurements taken at predetermined points across the duct. The values of these measurements are calculated into point flow rates and averaged. A program for the Texas Instruments TI-59 programmable calculator follows. The program will perform calculations for an infinite number of test points, both with the standard (combined impact type) pitot tube and the S-type (combined reverse type). The type of tube is selected by inputting an indicating value that triggers a flag in the program. To use the standard pitot tube, a 1 is input into key E. When the S-type is used, a zero is input into key E. The program output will note if the S-type had been used. Since most process systems are not at standard conditions (32°F, 1 atm), the program will take this into account.
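The point-velocity calculation performed at each traverse point follows from the measured velocity pressure; a minimal Python sketch of that computation is given below (the tube coefficients and readings are illustrative only, and the TI-59 program itself applies its own corrections and unit conventions):

    import math

    def point_velocity(delta_p, rho, s_type=False):
        # Velocity at one traverse point from the pitot differential pressure.
        # delta_p: velocity pressure in Pa; rho: gas density in kg/m^3.
        cp = 0.84 if s_type else 1.0      # typical tube coefficients (illustrative)
        return cp * math.sqrt(2.0 * delta_p / rho)

    # Traverse of differential pressures (Pa) across a duct, S-type tube
    readings = [95.0, 110.0, 120.0, 118.0, 102.0]
    rho = 0.95                            # kg/m^3 at duct temperature and pressure
    velocities = [point_velocity(dp, rho, s_type=True) for dp in readings]
    print("average duct velocity = %.2f m/s" % (sum(velocities) / len(velocities)))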
3D video coding: an overview of present and upcoming standards
NASA Astrophysics Data System (ADS)
Merkle, Philipp; Müller, Karsten; Wiegand, Thomas
2010-07-01
An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.
Silvano, Alessandro; Rintoul, Stephen Rich; Peña-Molino, Beatriz; Hobbs, William Richard; van Wijk, Esmee; Aoki, Shigeru; Tamura, Takeshi; Williams, Guy Darvall
2018-04-01
Strong heat loss and brine release during sea ice formation in coastal polynyas act to cool and salinify waters on the Antarctic continental shelf. Polynya activity thus both limits the ocean heat flux to the Antarctic Ice Sheet and promotes formation of Dense Shelf Water (DSW), the precursor to Antarctic Bottom Water. However, despite the presence of strong polynyas, DSW is not formed on the Sabrina Coast in East Antarctica and in the Amundsen Sea in West Antarctica. Using a simple ocean model driven by observed forcing, we show that freshwater input from basal melt of ice shelves partially offsets the salt flux by sea ice formation in polynyas found in both regions, preventing full-depth convection and formation of DSW. In the absence of deep convection, warm water that reaches the continental shelf in the bottom layer does not lose much heat to the atmosphere and is thus available to drive the rapid basal melt observed at the Totten Ice Shelf on the Sabrina Coast and at the Dotson and Getz ice shelves in the Amundsen Sea. Our results suggest that increased glacial meltwater input in a warming climate will both reduce Antarctic Bottom Water formation and trigger increased mass loss from the Antarctic Ice Sheet, with consequences for the global overturning circulation and sea level rise.
Bohlen, Martin O; Warren, Susan; May, Paul J
2017-06-01
We recently demonstrated a bilateral projection to the supraoculomotor area from the central mesencephalic reticular formation (cMRF), a region implicated in horizontal gaze changes. C-group motoneurons, which supply multiply innervated fibers in the medial rectus muscle, are located within the primate supraoculomotor area, but their inputs and function are poorly understood. Here, we tested whether C-group motoneurons in Macaca fascicularis monkeys receive a direct cMRF input by injecting this portion of the reticular formation with anterograde tracers in combination with injection of retrograde tracer into the medial rectus muscle. The results indicate that the cMRF provides a dense, bilateral projection to the region of the medial rectus C-group motoneurons. Numerous close associations between labeled terminals and each multiply innervated fiber motoneuron were present. Within the oculomotor nucleus, a much sparser ipsilateral projection onto some of the A- and B- group medial rectus motoneurons that supply singly innervated fibers was observed. Ultrastructural analysis demonstrated a direct synaptic linkage between anterogradely labeled reticular terminals and retrogradely labeled medial rectus motoneurons in all three groups. These findings reinforce the notion that the cMRF is a critical hub for oculomotility by proving that it contains premotor neurons supplying horizontal extraocular muscle motoneurons. The differences between the cMRF input patterns for C-group versus A- and B-group motoneurons suggest the C-group motoneurons serve a different oculomotor role than the others. The similar patterns of cMRF input to C-group motoneurons and preganglionic Edinger-Westphal motoneurons suggest that medial rectus C-group motoneurons may play a role in accommodation-related vergence. © 2017 Wiley Periodicals, Inc.
A reconfigurable NAND/NOR genetic logic gate.
Goñi-Moreno, Angel; Amos, Martyn
2012-09-18
Engineering genetic Boolean logic circuits is a major research theme of synthetic biology. By altering or introducing connections between genetic components, novel regulatory networks are built in order to mimic the behaviour of electronic devices such as logic gates. While electronics is a highly standardized science, genetic logic is still in its infancy, with few agreed standards. In this paper we focus on the interpretation of logical values in terms of molecular concentrations. We describe the results of computational investigations of a novel circuit that is able to trigger specific differential responses depending on the input standard used. The circuit can therefore be dynamically reconfigured (without modification) to serve as both a NAND/NOR logic gate. This multi-functional behaviour is achieved by a) varying the meanings of inputs, and b) using branch predictions (as in computer science) to display a constrained output. A thorough computational study is performed, which provides valuable insights for the future laboratory validation. The simulations focus on both single-cell and population behaviours. The latter give particular insights into the spatial behaviour of our engineered cells on a surface with a non-homogeneous distribution of inputs. We present a dynamically-reconfigurable NAND/NOR genetic logic circuit that can be switched between modes of operation via a simple shift in input signal concentration. The circuit addresses important issues in genetic logic that will have significance for more complex synthetic biology applications.
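A toy numerical model shows how shifting the concentration that encodes logical 0 can switch the truth table read off the same circuit from NAND to NOR (all thresholds and concentrations below are invented for illustration and are not the authors' gene-circuit model):

    def circuit_output(c1, c2, repression_threshold=1.5):
        # Toy circuit: the output gene is expressed (logical 1) only while the
        # combined inducer concentration stays below the repression threshold.
        return 1 if (c1 + c2) < repression_threshold else 0

    def truth_table(low, high):
        # Read the circuit's truth table under a given input standard, i.e. the
        # concentrations chosen to encode logical 0 (low) and logical 1 (high).
        levels = {0: low, 1: high}
        return {(a, b): circuit_output(levels[a], levels[b])
                for a in (0, 1) for b in (0, 1)}

    print("standard A (0 -> 0.0, 1 -> 1.0):", truth_table(0.0, 1.0))  # reads as NAND
    print("standard B (0 -> 0.5, 1 -> 1.0):", truth_table(0.5, 1.0))  # reads as NOR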
Hippocampal 5-HT Input Regulates Memory Formation and Schaffer Collateral Excitation.
Teixeira, Catia M; Rosen, Zev B; Suri, Deepika; Sun, Qian; Hersh, Marc; Sargin, Derya; Dincheva, Iva; Morgan, Ashlea A; Spivack, Stephen; Krok, Anne C; Hirschfeld-Stoler, Tessa; Lambe, Evelyn K; Siegelbaum, Steven A; Ansorge, Mark S
2018-06-06
The efficacy and duration of memory storage is regulated by neuromodulatory transmitter actions. While the modulatory transmitter serotonin (5-HT) plays an important role in implicit forms of memory in the invertebrate Aplysia, its function in explicit memory mediated by the mammalian hippocampus is less clear. Specifically, the consequences elicited by the spatio-temporal gradient of endogenous 5-HT release are not known. Here we applied optogenetic techniques in mice to gain insight into this fundamental biological process. We find that activation of serotonergic terminals in the hippocampal CA1 region both potentiates excitatory transmission at CA3-to-CA1 synapses and enhances spatial memory. Conversely, optogenetic silencing of CA1 5-HT terminals inhibits spatial memory. We furthermore find that synaptic potentiation is mediated by 5-HT4 receptors and that systemic modulation of 5-HT4 receptor function can bidirectionally impact memory formation. Collectively, these data reveal powerful modulatory influence of serotonergic synaptic input on hippocampal function and memory formation. Copyright © 2018 Elsevier Inc. All rights reserved.
Harwell, Corey C; Parker, Philip R L; Gee, Steven M; Okada, Ami; McConnell, Susan K; Kreitzer, Anatol C; Kriegstein, Arnold R
2012-03-22
The precise connectivity of inputs and outputs is critical for cerebral cortex function; however, the cellular mechanisms that establish these connections are poorly understood. Here, we show that the secreted molecule Sonic Hedgehog (Shh) is involved in synapse formation of a specific cortical circuit. Shh is expressed in layer V corticofugal projection neurons and the Shh receptor, Brother of CDO (Boc), is expressed in local and callosal projection neurons of layer II/III that synapse onto the subcortical projection neurons. Layer V neurons of mice lacking functional Shh exhibit decreased synapses. Conversely, the loss of functional Boc leads to a reduction in the strength of synaptic connections onto layer Vb, but not layer II/III, pyramidal neurons. These results demonstrate that Shh is expressed in postsynaptic target cells while Boc is expressed in a complementary population of presynaptic input neurons, and they function to guide the formation of cortical microcircuitry. Copyright © 2012 Elsevier Inc. All rights reserved.
Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.
2013-01-01
There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments, a community effort of which the Chemical Analysis Working Group (CAWG) is a member. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616
Near real-time, on-the-move software PED using VPEF
NASA Astrophysics Data System (ADS)
Green, Kevin; Geyer, Chris; Burnette, Chris; Agarwal, Sanjeev; Swett, Bruce; Phan, Chung; Deterline, Diane
2015-05-01
The scope of the Micro-Cloud for Operational, Vehicle-Based EO-IR Reconnaissance System (MOVERS) development effort, managed by the Night Vision and Electronic Sensors Directorate (NVESD), is to develop, integrate, and demonstrate new sensor technologies and algorithms that improve improvised device/mine detection using efficient and effective exploitation and fusion of sensor data and target cues from existing and future Route Clearance Package (RCP) sensor systems. Unfortunately, the majority of forward looking Full Motion Video (FMV) and computer vision processing, exploitation, and dissemination (PED) algorithms are often developed using proprietary, incompatible software. This makes the insertion of new algorithms difficult due to the lack of standardized processing chains. In order to overcome these limitations, EOIR developed the Government off-the-shelf (GOTS) Video Processing and Exploitation Framework (VPEF) to be able to provide standardized interfaces (e.g., input/output video formats, sensor metadata, and detected objects) for exploitation software and to rapidly integrate and test computer vision algorithms. EOIR developed a vehicle-based computing framework within the MOVERS and integrated it with VPEF. VPEF was further enhanced for automated processing, detection, and publishing of detections in near real-time, thus improving the efficiency and effectiveness of RCP sensor systems.
Speech Recognition Technology for Disabilities Education
ERIC Educational Resources Information Center
Tang, K. Wendy; Kamoua, Ridha; Sutan, Victor; Farooq, Omer; Eng, Gilbert; Chu, Wei Chern; Hou, Guofeng
2005-01-01
Speech recognition is an alternative to traditional methods of interacting with a computer, such as textual input through a keyboard. An effective system can replace or reduce reliance on standard keyboard and mouse input. This can especially assist dyslexic students who have problems with character or word use and manipulation in a textual…
Modified centroid for estimating sand, silt, and clay from soil texture class
USDA-ARS?s Scientific Manuscript database
Models that require inputs of soil particle size commonly use soil texture class for input; however, texture classes do not represent the continuum of soil size fractions. Soil texture class and clay percentage are collected as a standard practice for many land management agencies (e.g., NRCS, BLM, ...
NASA Astrophysics Data System (ADS)
Suhartono; Lee, Muhammad Hisyam; Rezeki, Sri
2017-05-01
Intervention analysis is a statistical model in the group of time series analysis which is widely used to describe the effect of an intervention caused by external or internal factors. An example of an external factor that often occurs in Indonesia is a disaster, whether natural or man-made. The main purpose of this paper is to provide the results of theoretical studies on the identification step for determining the order of a multi-input intervention analysis for evaluating the magnitude and duration of the impact of interventions on time series data. The theoretical results show that the standardized residuals can be used as the response function for determining the order of a multi-input intervention model. These results are then applied to evaluate the impact of a disaster in a real case in Indonesia, i.e. the magnitude and duration of the impact of the Lapindo mud on the volume of vehicles on the highway. Moreover, the empirical results show that the multi-input intervention model can accurately describe and explain the magnitude and duration of the impact of disasters on a time series.
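A minimal single-input version of such an intervention model can be fitted with a step regressor in an ARIMAX-style framework; the sketch below uses statsmodels (assuming it is installed) on synthetic data, whereas the paper's multi-input order identification via standardized residuals is more involved:

    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    n, t0 = 200, 120                       # series length, intervention time index
    y = 1000 + 0.1 * rng.normal(0, 20, n).cumsum()
    y[t0:] -= 150                          # level shift caused by the event

    # Step-intervention regressor: 0 before the event, 1 afterwards
    step = np.zeros(n)
    step[t0:] = 1.0

    fit = SARIMAX(y, exog=step, order=(1, 0, 0)).fit(disp=False)
    print(fit.params)                      # the exog coefficient estimates the impact magnitude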
CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection, including loops, between nodes in the graph. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same techniques as the fault tree cut set code, except it includes all upstream digraph nodes in the cut sets for a given node and checks for cycles in the digraph during the solution process. CUTSETS solves for specified nodes and will not automatically solve for all upstream digraph nodes. The cut sets will be output as a text file. CUTSETS includes a utility program that will convert the popular COD format digraph model description files into text input files suitable for use with the CUTSETS programs. FEAT (MSC-21873) and FIRM (MSC-21860) available from COSMIC are examples of programs that produce COD format digraph model description files that may be converted for use with the CUTSETS programs. CUTSETS is written in C language to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. CUTSETS is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is included on the distribution medium. Sun and SunOS are trademarks of Sun Microsystems, Inc.
DEC, DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc.
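The recursive top-down expansion described above can be sketched in a few lines of Python (an illustration of the general minimal cut set algorithm for AND/OR trees, not the CUTSETS code itself, which is written in C):

    def minimal_cut_sets(node, tree):
        # tree maps a gate name to ('AND' | 'OR', [children]); anything not in
        # the map is treated as a basic failure event.
        if node not in tree:
            return [frozenset([node])]
        gate, children = tree[node]
        child_sets = [minimal_cut_sets(c, tree) for c in children]
        if gate == 'OR':                       # any child's cut set fails the gate
            combined = [s for sets in child_sets for s in sets]
        else:                                  # AND: one cut set from every child
            combined = [frozenset()]
            for sets in child_sets:
                combined = [a | b for a in combined for b in sets]
        # keep only minimal sets (drop any set that contains another as a subset)
        return list({s for s in combined if not any(o < s for o in combined)})

    # TOP fails if pump P1 fails, or if both valves V1 and V2 fail
    tree = {'TOP': ('OR', ['P1', 'G1']), 'G1': ('AND', ['V1', 'V2'])}
    print(minimal_cut_sets('TOP', tree))       # e.g. [frozenset({'P1'}), frozenset({'V1', 'V2'})]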
Web-based Toolkit for Dynamic Generation of Data Processors
NASA Astrophysics Data System (ADS)
Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.
2011-12-01
All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data structures and mappings defined by a user (an author), and allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
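In the simple direct-mapping scenario, a generated processor essentially renames and reorders fields while converting formats; a small Python sketch of that idea follows (the mapping, field names, and file formats here are invented for illustration):

    import csv
    import json

    # User-defined mapping from input (CSV) column names to output (JSON) keys
    mapping = {"site_id": "station", "precip_mm": "precipitation", "date": "timestamp"}

    def convert(csv_path, json_path, mapping):
        # Apply a direct field mapping to every row of a CSV file, write JSON.
        with open(csv_path, newline="") as f:
            rows = [{out: row[src] for src, out in mapping.items()}
                    for row in csv.DictReader(f)]
        with open(json_path, "w") as f:
            json.dump(rows, f, indent=2)

    # convert("raw_observations.csv", "processed.json", mapping)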
HONTIOR - HIGHER-ORDER NEURAL NETWORK FOR TRANSFORMATION INVARIANT OBJECT RECOGNITION
NASA Technical Reports Server (NTRS)
Spirkovska, L.
1994-01-01
Neural networks have been applied in numerous fields, including transformation invariant object recognition, wherein an object is recognized despite changes in the object's position in the input field, size, or rotation. One of the more successful neural network methods used in invariant object recognition is the higher-order neural network (HONN) method. With a HONN, known relationships are exploited and the desired invariances are built directly into the architecture of the network, eliminating the need for the network to learn invariance to transformations. This results in a significant reduction in the training time required, since the network needs to be trained on only one view of each object, not on numerous transformed views. Moreover, one hundred percent accuracy is guaranteed for images characterized by the built-in distortions, provided noise is not introduced through pixelation. The program HONTIOR implements a third-order neural network having invariance to translation, scale, and in-plane rotation built directly into the architecture. Thus, for 2-D transformation invariance, the network needs to be trained on just one view of each object. HONTIOR can also be used for 3-D transformation invariant object recognition by training the network only on a set of out-of-plane rotated views. Historically, the major drawback of HONNs has been that the size of the input field was limited by the memory required for the large number of interconnections in a fully connected network. HONTIOR solves this problem by coarse coding the input images (coding an image as a set of overlapping but offset coarser images). Using this scheme, large input fields (4096 x 4096 pixels) can easily be represented using very little virtual memory (30Mb). The HONTIOR distribution consists of three main programs. The first program contains the training and testing routines for a third-order neural network. The second program contains the same training and testing procedures as the first, but it also contains a number of functions to display and edit training and test images. Finally, the third program is an auxiliary program which calculates the included angles for a given input field size. HONTIOR is written in C language, and was originally developed for Sun3 and Sun4 series computers. Both graphic and command line versions of the program are provided. The command line version has been successfully compiled and executed both on computers running the UNIX operating system and on DEC VAX series computers running VMS. The graphic version requires the SunTools windowing environment, and therefore runs only on Sun series computers. The executable for the graphics version of HONTIOR requires 1Mb of RAM. The standard distribution medium for HONTIOR is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The package includes sample input and output data. HONTIOR was developed in 1991. Sun, Sun3 and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation.
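The invariance-by-construction idea can be illustrated with a short sketch: the interior angles of the triangle formed by any triplet of pixels do not change under translation, in-plane rotation, or scaling, so features built from them are inherently invariant (this is only a schematic of the principle, not the HONTIOR code, and the binning scheme is invented):

    import numpy as np
    from itertools import combinations

    def triangle_angles(p1, p2, p3):
        # Interior angles (degrees) of the triangle formed by three pixel
        # coordinates; unchanged by translation, in-plane rotation, and scale.
        pts = [np.asarray(p, float) for p in (p1, p2, p3)]
        angles = []
        for i in range(3):
            a, b, c = pts[i], pts[(i + 1) % 3], pts[(i + 2) % 3]
            v1, v2 = b - a, c - a
            cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
            angles.append(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
        return sorted(angles)

    def invariant_signature(image, bins=18):
        # Histogram of the smallest interior angle over all triplets of "on"
        # pixels: a crude translation/rotation/scale-invariant feature vector.
        on = list(zip(*np.nonzero(image)))
        hist = np.zeros(bins)
        for tri in combinations(on, 3):
            smallest = triangle_angles(*tri)[0]          # lies in [0, 60] degrees
            hist[min(int(smallest / 60.0 * bins), bins - 1)] += 1
        return hist / max(hist.sum(), 1)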
The df: A proposed data format standard
NASA Technical Reports Server (NTRS)
Lait, Leslie R.; Nash, Eric R.; Newman, Paul A.
1993-01-01
A standard is proposed describing a portable format for electronic exchange of data in the physical sciences. Writing scientific data in a standard format has three basic advantages: portability; the ability to use metadata to aid in interpretation of the data (understandability); and reusability. An improperly formulated standard format tends towards four disadvantages: (1) it can be inflexible and fail to allow the user to express his data as needed; (2) reading and writing such datasets can involve high overhead in computing time and storage space; (3) the format may be accessible only on certain machines using certain languages; and (4) under some circumstances it may be uncertain whether a given dataset actually conforms to the standard. A format was designed which enhances these advantages and lessens the disadvantages. The fundamental approach is to allow the user to make her own choices regarding strategic tradeoffs to achieve the performance desired in her local environment. The choices made are encoded in a specific and portable way in a set of records. A fully detailed description and specification of the format is given, and examples are used to illustrate various concepts. Implementation is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, William Steven; Tartakovsky, Alex; Huang, Zhenyu
State and parameter estimation of power transmission networks is important for monitoring power grid operating conditions and analyzing transient stability. Wind power generation depends on fluctuating input power levels, which are correlated in time and contribute to uncertainty in turbine dynamical models. The ensemble Kalman filter (EnKF), a standard state estimation technique, uses a deterministic forecast and does not explicitly model time-correlated noise in parameters such as mechanical input power. However, this uncertainty affects the probability of fault-induced transient instability and increased prediction bias. Here a novel approach is to model input power noise with time-correlated stochastic fluctuations, and integrate them with the network dynamics during the forecast. While the EnKF has been used to calibrate constant parameters in turbine dynamical models, the calibration of a statistical model for a time-correlated parameter has not been investigated. In this study, twin experiments on a standard transmission network test case are used to validate our time-correlated noise model framework for state estimation of unsteady operating conditions and transient stability analysis, and a methodology is proposed for the inference of the mechanical input power time-correlation length parameter using time-series data from PMUs monitoring power dynamics at generator buses.
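One common way to realize such time-correlated fluctuations is an AR(1) (discretized Ornstein-Uhlenbeck) process governed by a correlation-length parameter; the sketch below only generates such a mechanical-power forcing and is not the authors' model (all symbols and values are illustrative):

    import numpy as np

    def correlated_power_noise(n_steps, dt, tau, sigma, p_mean, seed=0):
        # AR(1) / discretized Ornstein-Uhlenbeck fluctuations around a mean
        # mechanical input power; tau is the correlation length in seconds and
        # sigma the stationary standard deviation.
        rng = np.random.default_rng(seed)
        phi = np.exp(-dt / tau)                  # one-step autocorrelation
        p = np.empty(n_steps)
        p[0] = p_mean
        for k in range(1, n_steps):
            p[k] = (p_mean + phi * (p[k - 1] - p_mean)
                    + sigma * np.sqrt(1.0 - phi ** 2) * rng.standard_normal())
        return p

    power = correlated_power_noise(n_steps=600, dt=0.01, tau=2.0, sigma=0.05, p_mean=1.0)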
NASA Technical Reports Server (NTRS)
Tinoco, E. N.; Lu, P.; Johnson, F. T.
1980-01-01
A computer program developed for solving the subsonic, three dimensional flow over wing-body configurations with leading edge vortex separation is presented. Instructions are given for the proper set up and input of a problem into the computer code. Program input formats and output are described, as well as the overlay structure of the program. The program is written in FORTRAN.
Flight Model Discharge System.
1987-04-01
will immediately remove the charge from the front surface of the dielectric and return it to ground. The 2-hour time constant network will then reset the...ATDP programs. NEWT5 permits the digitized input of board and component position data, while ATDP automates certain phases of input and output table...format. 8.5 RESULTS The system-level results are presented as curves of AR (normalized radiator area) versus THOT and as curves of Q (heater
Borehole geological assessment
NASA Technical Reports Server (NTRS)
Spuck, W. H., III (Inventor)
1979-01-01
A method and apparatus are discussed for performing geological assessments of a formation located along a borehole. The apparatus includes a boring tool that bores a pair of holes into the walls of the borehole and into the surrounding strata, along with a pair of probes which are installed in the holes. One of the probes applies an input such as a current or pressurized fluid, and the other probe senses the corresponding response received from the strata.
The Formation of Ethane from Carbon Dioxide under Cold Plasma
NASA Astrophysics Data System (ADS)
Zhang, Xiu-ling; Zhang, Lin; Dai, Bin; Gong, Wei-min; Liu, Chang-hou
2001-04-01
In this paper, pulsed-corona plasma is used as a new method for ethane dehydrogenation at low temperature and normal pressure, with carbon dioxide as the oxidant. The effects of carbon dioxide content in the feed, power input, and flow rate of the reactants on the ethane dehydrogenation have been investigated. The experimental results show that the conversion of ethane increases with the increase in the amount of carbon dioxide in the feed. The yield of ethylene and acetylene decreases with the increase in the yield of carbon monoxide, indicating that the increased carbon dioxide leads to part of the ethylene and acetylene being oxidized to carbon monoxide. Power input is the primary electrical parameter in pulsed-corona plasma and plays an important role in reactant conversion and product formation. When the power input reaches 16 W, ethane conversion is 41.0% and carbon dioxide conversion is 26.3%. The total yield of ethylene and acetylene is 15.6%. A reduced flow rate of the feed improves the conversion of ethane and carbon dioxide and the yield of acetylene, but also induces carbon deposits.
FEQinput—An editor for the full equations (FEQ) hydraulic modeling system
Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.
2017-10-30
Introduction: The Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow. Input files for FEQ are composed of text files that contain large numbers of parameters, data, and instructions written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in order to understand the specific format and language of the files. FEQinput provides a set of tools to help a new user overcome the steep learning curve associated with creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).
NASA Astrophysics Data System (ADS)
Karthikeyan, R.; Saravanan, M.; Singaravel, B.; Sathiya, P.
This paper investigates the impact of heat input and post-weld aging at different temperatures on laser-welded maraging steel grade 250. Three different levels of heat input were chosen and CO2 laser welding was performed. Aging was done at six different temperatures: 360°C, 400°C, 440°C, 480°C, 520°C and 560°C. The macrostructure and microstructure of the fusion zone were examined using an optical microscope. The microhardness test was performed on the weld zone. Tensile tests and impact tests were carried out for the weld samples and the different age-treated weld samples. Fracture surfaces were investigated by scanning electron microscopy (SEM). Microhardness values of the fusion zone increased with increasing aging temperature, while the base metal microhardness value decreased. Tensile properties increased with aging temperature up to 480°C and decreased at 520°C and 560°C. This was mainly due to the formation of reverted austenite beyond 500°C. XRD analysis confirmed the formation of reverted austenite.
Saghaeiannejad-Isfahani, Sakineh; Mirzaeian, Razieh; Jannesari, Hasan; Ehteshami, Asghar; Feizi, Awat; Raeisi, Ahmadreza
2014-01-01
Supporting a therapeutic approach and medication therapy management, the pharmacy information system (PIS) acts as one of the pillars of the hospital information system. This ensures that medication therapy is supported with an optimal level of safety and quality, similar to other treatments and services. The present study is an applied, cross-sectional study conducted on the PIS in use in selected hospitals. The research population included all users of the PIS, and the research sample is the same as the research population. The data collection instrument was a self-designed checklist developed from the guidelines of the American Society of Health System Pharmacists, the Australia Pharmaceutical Society, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist validity was assessed by research supervisors and by PIS users and pharmacists. The findings of this study revealed that, regarding the degree of meeting the standards given in the guidelines issued by the societies of pharmacists, the highest rank in observing input standards belonged to Social Services hospitals with a mean score of 32.75. Although teaching hospitals gained the highest scores in both process standards, with a mean score of 29.15, and output standards, with a mean score of 43.95, the private hospitals had the lowest mean scores of 23.32, 17.78, and 24.25 in input, process and output standards, respectively. Based on the findings, it can be claimed that the studied hospitals had minimal compliance with the input, output and processing standards related to the PIS.
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loose model coupling data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time consuming and resource intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to sub-sets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
40 CFR 60.43 - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel... from liquid fossil fuel or liquid fossil fuel and wood residue. (2) 520 ng/J heat input (1.2 lb/MMBtu) derived from solid fossil fuel or solid fossil fuel and wood residue, except as provided in paragraph (e...
40 CFR 60.43 - Standard for sulfur dioxide (SO2).
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Fossil-Fuel... from liquid fossil fuel or liquid fossil fuel and wood residue. (2) 520 ng/J heat input (1.2 lb/MMBtu) derived from solid fossil fuel or solid fossil fuel and wood residue, except as provided in paragraph (e...
National Health Care Skill Standards.
ERIC Educational Resources Information Center
Far West Lab. for Educational Research and Development, San Francisco, CA.
This booklet contains draft national health care skill standards that were proposed during the National Health Care Skill Standards Project on the basis of input from more than 1,000 representatives of key constituencies of the health care field. The project objectives and structure are summarized in the introduction. Part 1 examines the need for…
MTpy: A Python toolbox for magnetotellurics
NASA Astrophysics Data System (ADS)
Krieger, Lars; Peacock, Jared R.
2014-11-01
We present the software package MTpy that allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source, containing sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework, which will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.
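As background for the phase tensor pseudosection mentioned above, the sketch below computes the phase tensor from a 2x2 complex impedance tensor using its standard definition (Phi = X^-1 Y with Z = X + iY); this is an illustration of the concept, not MTpy's own code or API.

```python
# Illustrative sketch (not MTpy code): phase tensor of a 2x2 impedance tensor.
import numpy as np

def phase_tensor(Z: np.ndarray) -> np.ndarray:
    """Return the 2x2 phase tensor Phi = X^-1 Y of a complex impedance tensor Z."""
    X, Y = Z.real, Z.imag
    return np.linalg.inv(X) @ Y

# Arbitrary synthetic impedance tensor for demonstration
Z = np.array([[0.1 + 0.2j, 1.0 + 0.9j],
              [-1.1 - 1.0j, -0.1 - 0.2j]])
print(phase_tensor(Z))
```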
Variations of comoving volume and their effects on the star formation rate density
NASA Astrophysics Data System (ADS)
Kim, Sungeun; Physics and Astronomy, Sejong University, Seoul, Korea (the Republic of).
2018-01-01
To build a comprehensive picture of star formation in the universe, we have developed an application to calculate the comoving volume at a specific redshift and visualize the changes of space and time. The application is based on the star formation rates of a few thousand galaxies and their redshift values. Three-dimensional modeling of these galaxies using the redshift, comoving volume, and star formation rates as input data allows calculation of the star formation rate density corresponding to the redshift. This work is supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (no. 2017037333).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L
2015-02-01
Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real-time data from plant status databases. Without the ability for logical operations, the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the underlying data structure for such a CBPS. The objective of the research effort is to develop guidance on how to design both the user interface and the underlying schema. This paper will describe the results and insights gained from the research activities conducted to date.
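To make the idea concrete, the sketch below shows a hypothetical XML step element and how a CBPS-style application might read its attributes to decide what to render; the element and attribute names are invented for illustration and are not the INL schema.

```python
# Hedged illustration: procedure steps as basic XML elements whose attributes
# drive the system's behavior. Element/attribute names here are hypothetical.
import xml.etree.ElementTree as ET

step_xml = """
<step id="3.1" type="decision">
  <instruction>Verify pump discharge pressure is within limits.</instruction>
  <input kind="boolean" prompt="Pressure within limits?"/>
  <reference doc="OP-123" section="4.2"/>
</step>
"""

step = ET.fromstring(step_xml)
print(step.get("id"), step.get("type"))     # attributes determine step functionality
print(step.findtext("instruction"))         # text presented to the field worker
for ref in step.findall("reference"):       # referential information for the step
    print("see", ref.get("doc"), "section", ref.get("section"))
```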
NASA GSFC Mechanical Engineering Latest Inputs for Verification Standards (GEVS) Updates
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides information on quality control standards in mechanical engineering. The presentation addresses safety, structural loads, nonmetallic composite structural elements, bonded structural joints, externally induced shock, random vibration, acoustic tests, and mechanical function.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... format consistent with recent revisions of other U.S. grade standards. This format has been designed to...''. These changes would provide a uniform format consistent with recent revisions of other U.S. grade standards. The term, ``Hard, woody okra material'' would be added to the standards. These terms and...
Zou, An-Min; Kumar, Krishna Dev
2012-07-01
This brief considers the attitude coordination control problem for spacecraft formation flying when only a subset of the group members has access to the common reference attitude. A quaternion-based distributed attitude coordination control scheme is proposed with consideration of the input saturation and with the aid of the sliding-mode observer, separation principle theorem, Chebyshev neural networks, smooth projection algorithm, and robust control technique. Using graph theory and a Lyapunov-based approach, it is shown that the distributed controller can guarantee the attitude of all spacecraft to converge to a common time-varying reference attitude when the reference attitude is available only to a portion of the group of spacecraft. Numerical simulations are presented to demonstrate the performance of the proposed distributed controller.
Space vehicle onboard command encoder
NASA Technical Reports Server (NTRS)
1975-01-01
A flexible onboard encoder system was designed for the space shuttle. The following areas were covered: (1) implementation of the encoder design into hardware to demonstrate the various encoding algorithms/code formats, (2) modulation techniques in a single hardware package to maintain comparable reliability and link integrity of the existing link systems and to integrate the various techniques into a single design using current technology. The primary function of the command encoder is to accept input commands, generated either locally onboard the space shuttle or remotely from the ground, format and encode the commands in accordance with the payload input requirements and appropriately modulate a subcarrier for transmission by the baseband RF modulator. The following information was provided: command encoder system design, brassboard hardware design, test set hardware and system packaging, and software.
On the dynamics of Airy beams in nonlinear media with nonlinear losses.
Ruiz-Jiménez, Carlos; Nóbrega, K Z; Porras, Miguel A
2015-04-06
We investigate the nonlinear dynamics of Airy beams in a regime where nonlinear losses due to multi-photon absorption are significant. We identify the nonlinear Airy beam (NAB) that preserves the amplitude of the inward Hänkel component as an attractor of the dynamics. This attractor also governs the dynamics of finite-power (apodized) Airy beams, irrespective of the location of the entrance plane in the medium with respect to the Airy waist plane. A soft (linear) input long before the waist, however, strongly speeds up NAB formation and its persistence as a quasi-stationary beam in comparison to an abrupt input at the Airy waist plane, and promotes the formation of a new type of highly dissipative, fully nonlinear Airy beam not described so far.
CABS-flex: server for fast simulation of protein structure fluctuations
Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian
2013-01-01
The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements CABS-model–based protocol for the fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics—a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions. PMID:23658222
CABS-flex: Server for fast simulation of protein structure fluctuations.
Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian
2013-07-01
The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements CABS-model-based protocol for the fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics--a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions.
A new formation control of multiple underactuated surface vessels
NASA Astrophysics Data System (ADS)
Xie, Wenjing; Ma, Baoli; Fernando, Tyrone; Iu, Herbert Ho-Ching
2018-05-01
This work investigates a new formation control problem of multiple underactuated surface vessels. The controller design is based on input-output linearisation technique, graph theory, consensus idea and some nonlinear tools. The proposed smooth time-varying distributed control law guarantees that the multiple underactuated surface vessels globally exponentially converge to some desired geometric shape, which is especially centred at the initial average position of vessels. Furthermore, the stability analysis of zero dynamics proves that the orientations of vessels tend to some constants that are dependent on the initial values of vessels, and the velocities and control inputs of the vessels decay to zero. All the results are obtained under the communication scenarios of static directed balanced graph with a spanning tree. Effectiveness of the proposed distributed control scheme is demonstrated using a simulation example.
The Effects of Item Preview on Video-Based Multiple-Choice Listening Assessments
ERIC Educational Resources Information Center
Koyama, Dennis; Sun, Angela; Ockey, Gary J.
2016-01-01
Multiple-choice formats remain a popular design for assessing listening comprehension, yet no consensus has been reached on how multiple-choice formats should be employed. Some researchers argue that test takers must be provided with a preview of the items prior to the input (Buck, 1995; Sherman, 1997); others argue that a preview may decrease the…
NIH Seeks Input on In-patient Clinical Research Areas | Division of Cancer Prevention
[[{"fid":"2476","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of the National Institutes of Health Clinical Center (Building 10) in Bethesda, Maryland.","field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of
Effect of Common Cryoprotectants on Critical Warming Rates and Ice Formation in Aqueous Solutions
Hopkins, Jesse B.; Badeau, Ryan; Warkentin, Matthew; Thorne, Robert E.
2012-01-01
Ice formation on warming is of comparable or greater importance to ice formation on cooling in determining survival of cryopreserved samples. Critical warming rates required for ice-free warming of vitrified aqueous solutions of glycerol, dimethyl sulfoxide, ethylene glycol, polyethylene glycol 200 and sucrose have been measured for warming rates of order 10 to 10⁴ K/s. Critical warming rates are typically one to three orders of magnitude larger than critical cooling rates. Warming rates vary strongly with cooling rates, perhaps due to the presence of small ice fractions in nominally vitrified samples. Critical warming and cooling rate data spanning orders of magnitude in rates provide rigorous tests of ice nucleation and growth models and their assumed input parameters. Current models with current best estimates for input parameters provide a reasonable account of critical warming rates for glycerol solutions at high concentrations/low rates, but overestimate both critical warming and cooling rates by orders of magnitude at lower concentrations and larger rates. In vitrification protocols, minimizing concentrations of potentially damaging cryoprotectants while minimizing ice formation will require ultrafast warming rates, as well as fast cooling rates to minimize the required warming rates. PMID:22728046
Mayer, Brooke K; Daugherty, Erin; Abbaszadegan, Morteza
2015-02-01
Advanced oxidation processes (AOPs) are gaining traction as they offer mineralization potential rather than transferring contaminants between media. However, AOPs operated with limited energy and/or chemical inputs can exacerbate disinfection byproduct (DBP) formation, even as precursors such as dissolved organic carbon, UV254, and specific UV absorbance (SUVA) decrease. This study examined the relationship between DBP precursors and formation using TiO2 photocatalysis experiments, external AOP and non-AOP data, and predictive DBP models. The top-performing indicator, SUVA, generally correlated positively with trihalomethanes and haloacetic acids, but limited-energy photocatalysis yielded contrasting negative correlations. The accuracy of predicted DBP values from models based on bulk parameters was generally poor, regardless of use and extent of AOP treatment and type of source water. Though performance improved for scenarios bounded by conditions used in model development, only 0.5% of the model/dataset pairings satisfied all measured parameter boundary conditions, thereby introducing skepticism toward model usefulness. Study findings suggest that caution should be employed when using bulk indicators and/or models as a metric for AOP mitigation of DBP formation potential, particularly for limited-energy/chemical inputs. Copyright © 2014 Elsevier Ltd. All rights reserved.
Optimization of light quality from color mixing light-emitting diode systems for general lighting
NASA Astrophysics Data System (ADS)
Thorseth, Anders
2012-03-01
Given the problem of metamerism inherent in color mixing in light-emitting diode (LED) systems with more than three distinct colors, a method for optimizing the spectral output of a multicolor LED system with regard to standardized light quality parameters has been developed. The composite spectral power distribution from the LEDs is simulated using spectral radiometric measurements of single commercially available LEDs for varying input power, to account for the efficiency droop and other non-linear effects in electrical power vs. light output. The method uses electrical input powers as input parameters in a randomized steepest-descent optimization. The resulting spectral power distributions are evaluated with regard to light quality using the standard characteristics: CIE color rendering index, correlated color temperature, and chromaticity distance. The results indicate Pareto-optimal boundaries for each system, mapping the capabilities of the simulated lighting systems with regard to the light quality characteristics.
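The sketch below illustrates the optimization pattern described above in a heavily simplified form: channel input powers are the free parameters, a crude nonlinear power-to-flux map stands in for measured droop, and a randomized descent step improves a toy spectral-match objective. The LED peaks, target spectrum, and cost function are hypothetical placeholders; a real implementation would evaluate CRI, CCT, and chromaticity distance instead.

```python
# Minimal sketch of a randomized steepest-descent search over LED input powers.
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(400, 700, 61)                      # wavelength grid, nm
peaks = [450, 520, 590, 630]                        # four hypothetical LED channels
basis = np.array([np.exp(-0.5 * ((wl - p) / 15) ** 2) for p in peaks])
target = np.exp(-0.5 * ((wl - 560) / 80) ** 2)      # hypothetical target SPD

def spd(powers):
    flux = powers ** 0.8                            # crude stand-in for efficiency droop
    return flux @ basis

def cost(powers):
    return np.sum((spd(powers) - target) ** 2)      # placeholder for light-quality metrics

p = np.full(4, 0.5)
for _ in range(2000):                               # randomized descent: keep improving trials
    trial = np.clip(p + rng.normal(scale=0.02, size=4), 0, 1)
    if cost(trial) < cost(p):
        p = trial
print("optimized channel input powers:", np.round(p, 3))
```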
Sensitivity Analysis of the Integrated Medical Model for ISS Programs
NASA Technical Reports Server (NTRS)
Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.
2016-01-01
Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates the sensitivity using partial correlation of the ranks of the generated input values to each generated output value. The partial part is so named because adjustments are made for the linear effects of all the other input values in the calculation of correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral part of the overall verification, validation, and credibility review of IMM v4.0.
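The following is a hedged sketch of the PRCC calculation described above (not the IMM code): rank-transform the inputs and the output, regress the other inputs out of both the input of interest and the output, then correlate the residuals.

```python
# Generic PRCC implementation for illustration; toy data, not IMM simulations.
import numpy as np
from scipy.stats import rankdata, pearsonr

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y."""
    Xr = np.column_stack([rankdata(c) for c in X.T])
    yr = rankdata(y)
    coeffs = []
    for j in range(Xr.shape[1]):
        # Design matrix of all the *other* ranked inputs plus an intercept
        others = np.column_stack([np.ones(len(yr)), np.delete(Xr, j, axis=1)])
        beta_x, *_ = np.linalg.lstsq(others, Xr[:, j], rcond=None)
        beta_y, *_ = np.linalg.lstsq(others, yr, rcond=None)
        rx = Xr[:, j] - others @ beta_x        # residual input ranks
        ry = yr - others @ beta_y              # residual output ranks
        coeffs.append(pearsonr(rx, ry)[0])
    return np.array(coeffs)

# Toy usage: y depends strongly (and nonlinearly) on x0, weakly on x1, not on x2
rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))
y = np.exp(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)
print(np.round(prcc(X, y), 2))
```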
A nonlinear autoregressive Volterra model of the Hodgkin-Huxley equations.
Eikenberry, Steffen E; Marmarelis, Vasilis Z
2013-02-01
We propose a new variant of Volterra-type model with a nonlinear auto-regressive (NAR) component that is a suitable framework for describing the process of AP generation by the neuron membrane potential, and we apply it to input-output data generated by the Hodgkin-Huxley (H-H) equations. Volterra models use a functional series expansion to describe the input-output relation for most nonlinear dynamic systems, and are applicable to a wide range of physiologic systems. It is difficult, however, to apply the Volterra methodology to the H-H model because it is characterized by distinct subthreshold and suprathreshold dynamics. When threshold is crossed, an autonomous action potential (AP) is generated, the output becomes temporarily decoupled from the input, and the standard Volterra model fails. Therefore, in our framework, whenever the membrane potential exceeds some threshold, it is taken as a second input to a dual-input Volterra model. This model correctly predicts membrane voltage deflection both within the subthreshold region and during APs. Moreover, the model naturally generates a post-AP afterpotential and refractory period. It is known that the H-H model converges to a limit cycle in response to a constant current injection. This behavior is correctly predicted by the proposed model, while the standard Volterra model is incapable of generating such limit cycle behavior. The inclusion of cross-kernels, which describe the nonlinear interactions between the exogenous and autoregressive inputs, is found to be absolutely necessary. The proposed model is general, non-parametric, and data-derived.
7 CFR 51.605 - Good heart formation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Good heart formation. 51.605 Section 51.605... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.605 Good heart formation. Good heart formation means that the stalk has a reasonable number of stocky inner heart branches for...
7 CFR 51.612 - Fairly good heart formation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Fairly good heart formation. 51.612 Section 51.612... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.612 Fairly good heart formation. Fairly good heart formation means that the stalk has a moderate number of fairly stocky inner...
7 CFR 51.605 - Good heart formation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Good heart formation. 51.605 Section 51.605... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.605 Good heart formation. Good heart formation means that the stalk has a reasonable number of stocky inner heart branches for...
7 CFR 51.612 - Fairly good heart formation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Fairly good heart formation. 51.612 Section 51.612... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.612 Fairly good heart formation. Fairly good heart formation means that the stalk has a moderate number of fairly stocky inner...
7 CFR 51.612 - Fairly good heart formation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Fairly good heart formation. 51.612 Section 51.612... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.612 Fairly good heart formation. Fairly good heart formation means that the stalk has a moderate number of fairly stocky inner...
7 CFR 51.605 - Good heart formation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Good heart formation. 51.605 Section 51.605... STANDARDS) United States Consumer Standards for Celery Stalks Definitions § 51.605 Good heart formation. Good heart formation means that the stalk has a reasonable number of stocky inner heart branches for...
Improving communication of breast cancer recurrence risk.
Brewer, Noel T; Richman, Alice R; DeFrank, Jessica T; Reyna, Valerie F; Carey, Lisa A
2012-06-01
Doctors commonly use genomic testing for breast cancer recurrence risk. We sought to assess whether the standard genomic report provided to doctors is a good approach for communicating results to patients. During 2009-2010, we interviewed 133 patients with stages I or II, node-negative, hormone receptor-positive breast cancer and eligible for the Oncotype DX genomic test. In a randomized experiment, patients viewed six vignettes that presented hypothetical recurrence risk test results. Each vignette described a low, intermediate, or high chance of breast cancer recurrence in 10 years. Vignettes used one of five risk formats of increasing complexity that we derived from the standard report that accompanies the commercial assay or a sixth format that used an icon array. Among women who received the genomic recurrence risk test, 63% said their doctors showed them the standard report. The standard report format yielded among the most errors in identification of whether a result was low, intermediate, or high risk (i.e., the gist of the results), whereas a newly developed risk continuum format yielded the fewest errors (17% vs. 5%; OR 0.23; 95% CI 0.10-0.52). For high recurrence risk results presented in the standard format, women made errors 35% of the time. Women rated the standard report as one of the least understandable and least-liked formats, but they rated the risk continuum format as among the most understandable and most liked. Results differed little by health literacy, numeracy, prior receipt of genomic test results during clinical care, and actual genomic test results. The standard genomic recurrence risk report was more difficult for women to understand and interpret than the other formats. A less complex report, potentially including the risk continuum format, would be more effective in communicating test results to patients.
Inverse methods for estimating primary input signals from time-averaged isotope profiles
NASA Astrophysics Data System (ADS)
Passey, Benjamin H.; Cerling, Thure E.; Schuster, Gerard T.; Robinson, Todd F.; Roeder, Beverly L.; Krueger, Stephen K.
2005-08-01
Mammalian teeth are invaluable archives of ancient seasonality because they record along their growth axes an isotopic record of temporal change in environment, plant diet, and animal behavior. A major problem with the intra-tooth method is that intra-tooth isotope profiles can be extremely time-averaged compared to the actual pattern of isotopic variation experienced by the animal during tooth formation. This time-averaging is a result of the temporal and spatial characteristics of amelogenesis (tooth enamel formation), and also results from laboratory sampling. This paper develops and evaluates an inverse method for reconstructing original input signals from time-averaged intra-tooth isotope profiles. The method requires that the temporal and spatial patterns of amelogenesis are known for the specific tooth and uses a minimum length solution of the linear system Am = d, where d is the measured isotopic profile, A is a matrix describing temporal and spatial averaging during amelogenesis and sampling, and m is the input vector that is sought. Accuracy is dependent on several factors, including the total measurement error and the isotopic structure of the measured profile. The method is shown to accurately reconstruct known input signals for synthetic tooth enamel profiles and the known input signal for a rabbit that underwent controlled dietary changes. Application to carbon isotope profiles of modern hippopotamus canines reveals detailed dietary histories that are not apparent from the measured data alone. Inverse methods show promise as an effective means of dealing with the time-averaging problem in studies of intra-tooth isotopic variation.
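The numerical sketch below illustrates the minimum-length inverse idea stated above, d = A m with A encoding time-averaging and the minimum-norm solution recovered via the pseudoinverse; the boxcar averaging window and synthetic seasonal signal are hypothetical stand-ins for the amelogenesis and sampling model.

```python
# Hedged sketch of a minimum-norm solution of A m = d for a time-averaged profile.
import numpy as np

n, window = 60, 9                       # samples along the tooth; averaging length (hypothetical)
A = np.zeros((n - window + 1, n))
for i in range(A.shape[0]):
    A[i, i:i + window] = 1.0 / window   # simple boxcar stand-in for averaging during formation/sampling

t = np.arange(n)
m_true = 1.5 * np.sin(2 * np.pi * t / 30)            # synthetic seasonal input signal
d = A @ m_true + np.random.default_rng(2).normal(scale=0.05, size=A.shape[0])  # "measured" profile

m_est = np.linalg.pinv(A) @ d           # minimum-length (minimum-norm) least-squares solution
print("max |reconstruction error|:", np.abs(m_est - m_true).max())
```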
User input verification and test driven development in the NJOY21 nuclear data processing code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul
Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides input validation to check user input. By providing rapid and helpful responses to users while writing input files, NJOY21 will prove to be more intuitive and easy to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
49 CFR 571.126 - Standard No. 126; Electronic stability control systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... counterclockwise steering, and the other series uses clockwise steering. The maximum time permitted between each... or side slip derivative with respect to time; (4) That has a means to monitor driver steering inputs... dwell steering input (time T0 + 1 in Figure 1) must not exceed 35 percent of the first peak value of yaw...
49 CFR 571.126 - Standard No. 126; Electronic stability control systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... counterclockwise steering, and the other series uses clockwise steering. The maximum time permitted between each... or side slip derivative with respect to time; (4) That has a means to monitor driver steering inputs... dwell steering input (time T0 + 1 in Figure 1) must not exceed 35 percent of the first peak value of yaw...
For the Record: What Education Policy Could Be
ERIC Educational Resources Information Center
Tienken, Christopher H.
2012-01-01
A review of education reform policies reveals a shift from an input guarantee approach aimed at providing funds to level the playing field for all students to an output guarantee approach based on the expectation of achieving standardized results regardless of inputs. The shift reflects a belief that where a child starts his or her cognitive,…
77 FR 34020 - National Fire Codes: Request for Public Input for Revision of Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... Comment periods, there is further opportunity for debate and discussion through the Association Technical... proposed new or revised code or standard to be presented to the NFPA membership for the debate and...
78 FR 44475 - Protection System Maintenance Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-24
... Protection System Maintenance--Phase 2 (Reclosing Relays)). 12. NERC states that the proposed Reliability... of the relay inputs and outputs that are essential to proper functioning of the protection system...] Protection System Maintenance Reliability Standard AGENCY: Federal Energy Regulatory Commission, Energy...
RCHILD - an R-package for flexible use of the landscape evolution model CHILD
NASA Astrophysics Data System (ADS)
Dietze, Michael
2014-05-01
Landscape evolution models provide powerful approaches to numerically assess earth surface processes, quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most widely used models of landscape change in the context of at least tectonic and geomorphologic process interactions. Running CHILD from the command line and working with the model output can be rather awkward (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help use CHILD in a flexible, dynamic, and user-friendly way. The included functions allow creating maps, real-time scenes, animations, and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples help to illustrate the great potential of numeric modelling of geomorphologic processes.
NASA Astrophysics Data System (ADS)
Cheng, Yuh-Jiuh; Yeh, Tzuoh-Chyau; Cheng, Shyr-Yuan
2011-09-01
In this paper, a non-blocking multicast optical packet switch based on fiber Bragg grating technology with optical output buffers is proposed. Only the header of each optical packet is converted to electronic signals to control the fiber Bragg grating array at the input ports, while the packet payloads are routed transparently to their output ports, so the proposed switch reduces the number of electronic interfaces as well as the electronic bit rate. The modulation and format of the packet payloads may be non-standard, and payloads may also include multiple wavelengths to increase traffic capacity. The advantage is obvious: the proposed switch could transport various types of traffic. An easily implemented architecture which can provide multicast services is also presented. An optical output buffer is designed to queue packets when more than one incoming packet is destined for the same output port, or when packets already waiting in the optical output buffer are to be sent to that port in a given time slot. To preserve packet sequencing and fairness of the routing sequence, a priority scheme and a round-robin algorithm are adopted at the optical output buffer. The fiber Bragg grating arrays for both input ports and output ports are designed for routing incoming packets using optical code division multiple access technology.
Lefkoff, L.J.; Gorelick, S.M.
1987-01-01
A FORTRAN-77 computer program is presented that helps solve a variety of aquifer management problems involving the control of groundwater hydraulics. It is intended for use with any standard mathematical programming package that uses Mathematical Programming System input format. The computer program creates the input files to be used by the optimization program. These files contain all the hydrologic information and management objectives needed to solve the management problem. Used in conjunction with a mathematical programming code, the computer program identifies the pumping or recharge strategy that achieves a user's management objective while maintaining groundwater hydraulic conditions within desired limits. The objective may be linear or quadratic, and may involve the minimization of pumping and recharge rates or of variable pumping costs. The problem may contain constraints on groundwater heads, gradients, and velocities for a complex, transient hydrologic system. Linear superposition of solutions to the transient, two-dimensional groundwater flow equation is used by the computer program in conjunction with the response matrix optimization method. A unit stress is applied at each decision well and transient responses at all control locations are computed using a modified version of the U.S. Geological Survey two-dimensional aquifer simulation model. The program also computes discounted cost coefficients for the objective function and accounts for transient aquifer conditions. (Author's abstract)
TransFit: Finite element analysis data fitting software
NASA Technical Reports Server (NTRS)
Freeman, Mark
1993-01-01
The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for fitting FEA data of Wolter I optical surface distortions with a continuous surface description which can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher spatial frequency distortions, fitted data visualization for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for alignment of the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then accesses routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit: requirements, design philosophy, and implementation; design specifics: modules, parameters, fitting algorithms, and data displays; a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
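For illustration of the bicubic-spline fitting idea (this is not TransFit itself), the sketch below fits gridded surface-deformation-like data with a bicubic spline and evaluates it on a finer grid, the kind of continuous surface description a ray tracer could sample; the grid and synthetic distortion are hypothetical.

```python
# Hedged sketch: bicubic spline fit of gridded "FEA" deformation data with SciPy.
import numpy as np
from scipy.interpolate import RectBivariateSpline

x = np.linspace(0.0, 1.0, 15)                  # axial nodes (hypothetical FEA grid)
y = np.linspace(0.0, 2 * np.pi, 25)            # azimuthal nodes
X, Y = np.meshgrid(x, y, indexing="ij")
deform = 1e-6 * np.cos(3 * Y) * (1 + X)        # synthetic surface distortion [m]

spline = RectBivariateSpline(x, y, deform, kx=3, ky=3)   # bicubic fit (kx=ky=3)
fine = spline(np.linspace(0, 1, 200), np.linspace(0, 2 * np.pi, 400))
print(fine.shape, float(np.abs(fine).max()))
```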
Multiplexing Readout of TES Microcalorimeters Based on Analog Baseband Feedback
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takei, Y.; Yamasaki, N.Y; Mitsuda, K.
2009-12-16
A TES microcalorimeter array is a promising spectrometer with excellent energy resolution and a moderate imaging capability. To realize a large-format array in space, multiplexing the TES signals at the low temperature stage is mandatory. We are developing frequency division multiplexing (FDM) based on the baseband feedback technique. In FDM, each TES is AC-biased with a different carrier frequency. Signals from several pixels are summed and then read out by one SQUID. The maximum number of multiplexed pixels is limited by the frequency band in which the SQUID can be operated in a flux-locked loop, which is ~1 MHz with a standard flux-locked loop circuit. In the baseband feedback, the signal (~10 kHz band) from the TES is first demodulated. Then a reconstructed copy of the modulated signal with an appropriate phase is fed back to the SQUID input coil to maintain an approximately constant magnetic flux. This can be implemented even for large cable delays and automatically suppresses the carrier. We developed prototype electronics for the baseband feedback based on an analog phase-sensitive detector (PSD) and a multiplier. Combined with a Seiko 80-SSA SQUID amplifier, an open-loop gain of 8 has been obtained for a 10 kHz baseband signal at 5 MHz carrier frequency, with a moderate noise contribution of 27 pA/√Hz at the input.
Seismo-Live: Training in Seismology with Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Krischer, Lion; Tape, Carl; Igel, Heiner
2016-04-01
Seismological training tends to occur within the isolation of a particular institution with a limited set of tools (codes, libraries) that are often not transferable outside. Here, we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology for purposes of education and outreach, on-site or archived tutorials for codes, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that allows combining code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics, and equations can be handled using Markdown (incl. LaTeX) format. Jupyter notebooks are driven by standard web browsers, can be easily exchanged in text format, or converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server such that the Jupyter notebooks can be run in any browser (PC, notepad, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.
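A minimal sketch of the kind of ObsPy-based cell such a notebook might contain is shown below; calling obspy.read() with no argument loads a bundled example stream, so the snippet runs without external data.

```python
# Hedged notebook-style example of basic waveform processing with ObsPy.
from obspy import read  # pip install obspy

st = read()                                   # bundled example three-component stream
st.detrend("demean")                          # remove the mean from each trace
st.filter("bandpass", freqmin=1.0, freqmax=10.0)
print(st)                                     # summary of the processed traces
# st.plot()  # uncomment to display the waveforms inside the notebook
```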
C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes
NASA Astrophysics Data System (ADS)
Rutter, M. J.
2018-04-01
The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis, and manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which would make it easy to extend it to working directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats useable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.
Biochemical thermodynamics: applications of Mathematica.
Alberty, Robert A
2006-01-01
The most efficient way to store thermodynamic data on enzyme-catalyzed reactions is to use matrices of species properties. Since equilibrium in enzyme-catalyzed reactions is reached at specified pH values, the thermodynamics of the reactions is discussed in terms of transformed thermodynamic properties. These transformed thermodynamic properties are complicated functions of temperature, pH, and ionic strength that can be calculated from the matrices of species values. The most important of these transformed thermodynamic properties is the standard transformed Gibbs energy of formation of a reactant (sum of species). It is the most important because when this function of temperature, pH, and ionic strength is known, all the other standard transformed properties can be calculated by taking partial derivatives. The species database in this package contains data matrices for 199 reactants. For 94 of these reactants, standard enthalpies of formation of species are known, and so standard transformed Gibbs energies, standard transformed enthalpies, standard transformed entropies, and average numbers of hydrogen atoms can be calculated as functions of temperature, pH, and ionic strength. For reactions between these 94 reactants, the changes in these properties can be calculated over a range of temperatures, pHs, and ionic strengths, and so can apparent equilibrium constants. For the other 105 reactants, only standard transformed Gibbs energies of formation and average numbers of hydrogen atoms at 298.15 K can be calculated. The loading of this package provides functions of pH and ionic strength at 298.15 K for standard transformed Gibbs energies of formation and average numbers of hydrogen atoms for 199 reactants. It also provides functions of temperature, pH, and ionic strength for the standard transformed Gibbs energies of formation, standard transformed enthalpies of formation, standard transformed entropies of formation, and average numbers of hydrogen atoms for 94 reactants. Thus loading this package makes available 774 mathematical functions for these properties. These functions can be added and subtracted to obtain changes in these properties in biochemical reactions and apparent equilibrium constants.
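As a compact reminder of the relations behind the transformed properties named above (a summary of standard identities in Alberty's treatment, not the package's code), the key expressions can be written as:

```latex
% Apparent equilibrium constant and derivatives of the transformed Gibbs energy
% at specified T, pH, and ionic strength I (standard identities, hedged summary).
\Delta_r G'^{\circ} = -RT \ln K', \qquad
\Delta_r S'^{\circ} = -\left(\frac{\partial \Delta_r G'^{\circ}}{\partial T}\right)_{P,\,\mathrm{pH},\,I}, \qquad
\Delta_r H'^{\circ} = \Delta_r G'^{\circ} + T\,\Delta_r S'^{\circ},
\qquad
\bar{N}_{\mathrm{H}} = \frac{1}{RT\ln 10}\left(\frac{\partial \Delta_f G'^{\circ}}{\partial \mathrm{pH}}\right)_{T,\,I}
```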
FLIS Procedures Manual. Document Identifier Code Input/Output Formats (Fixed Length). Volume 8.
1997-04-01
Segment R may be repeated a maximum of three (3) times in order to acquire the required mix of segments or individual data elements. [The remainder of this record is a fragmented table of Document Identifier Codes (e.g., QI, QF, KFC, KRP, LFU, SSR) and their associated transaction descriptions.]
1979-02-01
... laser light source, a 250 x 500-mm X-Y input table, optics, and a 500 x 1,000-mm output drum mounted on a 3-ton granite base. As the input ... computer via the teletype. The printer unit is installed in a clean-room environment, part of which is a darkroom containing the output drum. ... UNAMACE elevation data will be converted to contour line format by ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.
1979-07-01
User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.
FlaME: Flash Molecular Editor - a 2D structure input tool for the web.
Dallakian, Pavel; Haider, Norbert
2011-02-01
So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.
NASA Astrophysics Data System (ADS)
Pereyra, Y.; Ma, L.; Sak, P. B.; Gaillardet, J.; Buss, H. L.; Brantley, S. L.
2015-12-01
Dust inputs play an important role in soil formation, especially for thick soils developed on tropical volcanic islands. In these regions, soils are highly depleted due to intensive chemical weathering, and mineral nutrients from dusts have been known to be important in sustaining soil fertility and productivity. Tropical volcanic soils are an ideal system to study the impacts of dust inputs on the ecosystem. Sr and U-series isotopes are excellent tracers to identify sources of materials in an open system if the end-members have distinctive isotope signatures. These two isotope systems are particularly useful to trace the origin of atmospheric inputs into soils and to determine rates and timescales of soil formation. This study analyzes major elemental concentrations, Sr and U-series isotope ratios in highly depleted soils in the tropical volcanic island of Basse-Terre in French Guadeloupe to determine atmospheric input sources and identify key soil formation processes. We focus on three soil profiles (8 to 12 m thick) from the Bras-David, Moustique Petit-Bourg, and Deshaies watersheds, and on the rivers adjacent to these sites. Results have shown a significant depletion of U, Sr, and major elements in the deep profile (12 to 4 m) attributed to rapid chemical weathering. The top soil profiles (4 m to the surface) all show addition of elements such as Ca, Mg, U, and Sr due to atmospheric dust. More importantly, the topsoil profiles have distinct Sr and U-series isotope compositions from the deep soils. Sr and U-series isotope ratios of the top soils and sequential extraction fractions confirm that the sources of the dust are from the Saharan desert, through long-distance transport from Africa to the Caribbean region across the Atlantic Ocean. During the transport, some dust isotope signatures may also have been modified by local volcanic ashes and marine aerosols. Our study highlights that dusts and marine aerosols play important roles in element cycles and nutrient sources in the highly depleted surface soils of tropical oceanic islands.
40 CFR 60.152 - Standard for particulate matter.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Standard for particulate matter. 60.152... Plants § 60.152 Standard for particulate matter. (a) On and after the date on which the performance test...: (1) Particulate matter at a rate in excess of 0.65 g/kg dry sludge input (1.30 lb/ton dry sludge...
40 CFR 60.152 - Standard for particulate matter.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Standard for particulate matter. 60.152... Plants § 60.152 Standard for particulate matter. (a) On and after the date on which the performance test...: (1) Particulate matter at a rate in excess of 0.65 g/kg dry sludge input (1.30 lb/ton dry sludge...
40 CFR 60.152 - Standard for particulate matter.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 7 2014-07-01 2014-07-01 false Standard for particulate matter. 60.152... Plants § 60.152 Standard for particulate matter. (a) On and after the date on which the performance test...: (1) Particulate matter at a rate in excess of 0.65 g/kg dry sludge input (1.30 lb/ton dry sludge...
40 CFR 60.152 - Standard for particulate matter.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 7 2013-07-01 2013-07-01 false Standard for particulate matter. 60.152... Plants § 60.152 Standard for particulate matter. (a) On and after the date on which the performance test...: (1) Particulate matter at a rate in excess of 0.65 g/kg dry sludge input (1.30 lb/ton dry sludge...
40 CFR 60.152 - Standard for particulate matter.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 7 2012-07-01 2012-07-01 false Standard for particulate matter. 60.152... Plants § 60.152 Standard for particulate matter. (a) On and after the date on which the performance test...: (1) Particulate matter at a rate in excess of 0.65 g/kg dry sludge input (1.30 lb/ton dry sludge...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-11
... and emissions input data preparation, model performance evaluation, interpreting modeling results, and... standard based on ambient ozone monitoring data for the 2006- 2008 period. EPA has not yet acted on this... ppm) and years thereafter were at or below the standard. See EPA Air Quality System (AQS) data...
NASA Astrophysics Data System (ADS)
Cao, X.; Du, A.
2014-12-01
We statistically studied the response time of the SYMH index to the solar wind energy input ɛ by using the RFA approach. The average response time was 64 minutes. There was no clear trend among these events with respect to the minimum SYMH or storm type. It seems that the response time of the magnetosphere to the solar wind energy input is independent of the storm intensity and the solar wind conditions. The response function shows one peak even when the solar wind energy input and the SYMH have multiple peaks. The response time thus appears to be an intrinsic property of the magnetosphere, representing the typical formation time of the ring current. This may be controlled by the magnetospheric temperature, average number density, oxygen abundance, and other factors.
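Assuming the ɛ referenced above denotes the widely used Akasofu coupling function (an assumption on our part, since the abstract does not define it), the standard expression is:

```latex
% Akasofu solar wind energy coupling function (SI form); theta is the IMF clock
% angle, v the solar wind speed, B the IMF magnitude, and l_0 ~ 7 R_E an
% empirical scale length. Included only as an assumed definition of epsilon.
\varepsilon = \frac{4\pi}{\mu_0}\, v\, B^{2} \sin^{4}\!\left(\frac{\theta}{2}\right) l_0^{2}
```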
NASA Astrophysics Data System (ADS)
Huang, Xu; Yan, Ye; Zhou, Yang
2014-12-01
The Lorentz force acting on an electrostatically charged spacecraft as it moves through the planetary magnetic field could be utilized as propellantless electromagnetic propulsion for orbital maneuvering, such as spacecraft formation establishment and formation reconfiguration. By assuming that the Earth's magnetic field could be modeled as a tilted dipole located at the center of Earth that corotates with Earth, a dynamical model that describes the relative orbital motion of Lorentz spacecraft is developed. Based on the proposed dynamical model, the energy-optimal open-loop trajectories of control inputs, namely, the required specific charges of Lorentz spacecraft, for Lorentz-propelled spacecraft formation establishment or reconfiguration problems with both fixed and free final conditions constraints are derived via Gauss pseudospectral method. The effect of the magnetic dipole tilt angle on the optimal control inputs and the relative transfer trajectories for formation establishment or reconfiguration is also investigated by comparisons with the results derived from a nontilted dipole model. Furthermore, a closed-loop integral sliding mode controller is designed to guarantee the trajectory tracking in the presence of external disturbances and modeling errors. The stability of the closed-loop system is proved by a Lyapunov-based approach. Numerical simulations are presented to verify the validity of the proposed open-loop control methods and demonstrate the performance of the closed-loop controller. Also, the results indicate the dipole tilt angle should be considered when designing control strategies for Lorentz-propelled spacecraft formation establishment or reconfiguration.
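For reference, the propellantless acceleration exploited above is the Lorentz force per unit mass, with the specific charge q/m acting as the control input and the spacecraft velocity taken relative to the corotating planetary field; this is the standard expression, stated here as context rather than the paper's full dynamical model.

```latex
% Lorentz acceleration of a charged spacecraft in a corotating magnetic field B;
% q/m is the specific charge (control input), omega_E the Earth rotation vector.
\mathbf{a}_L = \frac{q}{m}\,\mathbf{v}_{\mathrm{rel}} \times \mathbf{B},
\qquad
\mathbf{v}_{\mathrm{rel}} = \mathbf{v} - \boldsymbol{\omega}_E \times \mathbf{r}
```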
NASA Astrophysics Data System (ADS)
Powley, Helen R.; Krom, Michael D.; Van Cappellen, Philippe
2018-03-01
Human activities have significantly modified the inputs of land-derived phosphorus (P) and nitrogen (N) to the Mediterranean Sea (MS). Here, we reconstruct the external inputs of reactive P and N to the Western Mediterranean Sea (WMS) and Eastern Mediterranean Sea (EMS) over the period 1950-2030. We estimate that during this period the land derived P and N loads increased by factors of 3 and 2 to the WMS and EMS, respectively, with reactive P inputs peaking in the 1980s but reactive N inputs increasing continuously from 1950 to 2030. The temporal variations in reactive P and N inputs are imposed in a coupled P and N mass balance model of the MS to simulate the accompanying changes in water column nutrient distributions and primary production with time. The key question we address is whether these changes are large enough to be distinguishable from variations caused by confounding factors, specifically the relatively large inter-annual variability in thermohaline circulation (THC) of the MS. Our analysis indicates that for the intermediate and deep water masses of the MS the magnitudes of changes in reactive P concentrations due to changes in anthropogenic inputs are relatively small and likely difficult to diagnose because of the noise created by the natural circulation variability. Anthropogenic N enrichment should be more readily detectable in time series concentration data for dissolved organic N (DON) after the 1970s, and for nitrate (NO3) after the 1990s. The DON concentrations in the EMS are predicted to exhibit the largest anthropogenic enrichment signature. Temporal variations in annual primary production over the 1950-2030 period are dominated by variations in deep-water formation rates, followed by changes in riverine P inputs for the WMS and atmospheric P deposition for the EMS. Overall, our analysis indicates that the detection of basin-wide anthropogenic nutrient concentration trends in the MS is rendered difficult due to: (1) the Atlantic Ocean contributing the largest reactive P and N inputs to the MS, hence diluting the anthropogenic nutrient signatures, (2) the anti-estuarine circulation removing at least 45% of the anthropogenic nutrients inputs added to both basins of the MS between 1950 and 2030, and (3) variations in intermediate and deep water formation rates that add high natural noise to the P and N concentration trajectories.
User's guide for a large signal computer model of the helical traveling wave tube
NASA Technical Reports Server (NTRS)
Palmer, Raymond W.
1992-01-01
The use is described of a successful large-signal, two-dimensional (axisymmetric), deformable disk computer model of the helical traveling wave tube amplifier, an extensively revised and operationally simplified version. We also discuss program input and output and the auxiliary files necessary for operation. Included is a sample problem and its input data and output results. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.
CRANS - CONFIGURABLE REAL-TIME ANALYSIS SYSTEM
NASA Technical Reports Server (NTRS)
Mccluney, K.
1994-01-01
In a real-time environment, the results of changes or failures in a complex, interconnected system need evaluation quickly. Tabulations showing the effects of changes and/or failures of a given item in the system are generally only useful for a single input, and only with regard to that item. Subsequent changes become harder to evaluate as combinations of failures produce a cascade effect. When confronted by multiple indicated failures in the system, it becomes necessary to determine a single cause. In this case, failure tables are not very helpful. CRANS, the Configurable Real-time ANalysis System, can interpret a logic tree, constructed by the user, describing a complex system and determine the effects of changes and failures in it. Items in the tree are related to each other by Boolean operators. The user is then able to change the state of these items (ON/OFF FAILED/UNFAILED). The program then evaluates the logic tree based on these changes and determines any resultant changes to other items in the tree. CRANS can also search for a common cause for multiple item failures, and allow the user to explore the logic tree from within the program. A "help" mode and a reference check provide the user with a means of exploring an item's underlying logic from within the program. A commonality check determines single point failures for an item or group of items. Output is in the form of a user-defined matrix or matrices of colored boxes, each box representing an item or set of items from the logic tree. Input is via mouse selection of the matrix boxes, using the mouse buttons to toggle the state of the item. CRANS is written in C-language and requires the MIT X Window System, Version 11 Revision 4 or Revision 5. It requires 78K of RAM for execution and a three button mouse. It has been successfully implemented on Sun4 workstations running SunOS, HP9000 workstations running HP-UX, and DECstations running ULTRIX. No executable is provided on the distribution medium; however, a sample makefile is included. Sample input files are also included. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. This program was developed in 1992.
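The logic-tree evaluation CRANS performs can be pictured with a small sketch. The Python fragment below is illustrative only: the Item class, the AND/OR operators, and the example items are assumptions, not CRANS internals (CRANS itself is written in C and driven through its X Window matrix display).

```python
# Minimal sketch of a CRANS-style logic tree, assuming a simple AND/OR item
# model; names and structure are illustrative, not the actual CRANS code.

class Item:
    def __init__(self, name, op=None, children=None):
        self.name = name
        self.op = op                 # "AND", "OR", or None for a leaf
        self.children = children or []
        self.failed = False          # user-toggled state for leaf items

    def evaluate(self):
        """Return True if this item is considered failed."""
        if not self.children:
            return self.failed
        results = [c.evaluate() for c in self.children]
        return all(results) if self.op == "AND" else any(results)

# Example: the bus fails only if both power supplies fail.
psu_a, psu_b = Item("PSU-A"), Item("PSU-B")
bus = Item("Bus", op="AND", children=[psu_a, psu_b])
psu_a.failed = True
print(bus.name, "failed:", bus.evaluate())   # False until PSU-B also fails
```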
Code of Federal Regulations, 2011 CFR
2011-07-01
... according to the following procedures. 2.1.6.1Plot the heat input rate (mmBtu/hr) as the independent (or x... stationary gas turbine, select at least four operating parameters indicative of the turbine's NOX formation... least four operating parameters indicative of the engine's NOX formation characteristics, and define in...
Karen Robinson; Steven Selin; Chad Pierskalla
2009-01-01
This paper reports the results and management implications of a longitudinal research study examining the social factors affecting the formation of a trails network advisory group for the Monongahela National Forest (MNF) in West Virginia. A collaborative process of creating an MNF trails network with input from local users and stakeholders has been largely...
ERIC Educational Resources Information Center
O'Connor, Akira R.; Moulin, Christopher J. A.
2006-01-01
We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…
Update on ɛK with lattice QCD inputs
NASA Astrophysics Data System (ADS)
Jang, Yong-Chull; Lee, Weonjong; Lee, Sunkyu; Leem, Jaehoon
2018-03-01
We report updated results for ɛK, the indirect CP violation parameter in neutral kaons, which is evaluated directly from the standard model with lattice QCD inputs. We use lattice QCD inputs to fix B̂_K, |V_cb|, ξ_0, ξ_2, |V_us|, and m_c(m_c). Since Lattice 2016, the UTfit group has updated the Wolfenstein parameters in the angle-only-fit method, and the HFLAV group has also updated |V_cb|. Our results show that the evaluation of ɛK with exclusive |V_cb| (lattice QCD inputs) has 4.0σ tension with the experimental value, while that with inclusive |V_cb| (heavy quark expansion based on OPE and QCD sum rules) shows no tension.
Showler, A T; Robinson, J R C
2008-10-01
The standard practice of two or three preemptive insecticide applications at the start of pinhead (1-2-mm-diameter) squaring followed by threshold-triggered (when 10% of randomly selected squares have oviposition punctures) insecticide applications for boll weevil, Anthonomus grandis grandis Boheman (Coleoptera: Curculionidae), control does not provide reliable protection of cotton, Gossypium hirsutum L., lint production. This study, conducted during 2004 and 2005, showed that three to six fewer spray applications in a "proactive" approach, in which spraying began at the start of large (5.5-8-mm-diameter) square formation and continued at approximately 7-d intervals while large squares were abundant, resulted in fewer infested squares and 1.4- to 1.7-fold more lint than the standard treatment. Fewer sprays and increased yield made proactive spraying significantly more profitable than the standard approach, which resulted in relatively low or negative economic returns. Harvest at 75% boll-split in the proactive spray regime of 2005 resulted in four-fold greater economic return than cotton harvested at 40% boll-split because of improved protection of large squares and the elimination of late-season sprays inherent to standard spray regime despite the cost of an extra irrigation in the 75% boll-split treatments. The earlier, 40% harvest trigger does not avoid high late-season boll weevil pressure, which exerts less impact on bolls, the predominant form of fruiting body at that time, than on squares. Proactive spraying and harvest timing are based on an important relationship between nutrition, boll weevil reproduction, and economic inputs; therefore, the tactic of combining proaction with harvest at 75% boll-split is applicable where boll weevils are problematic regardless of climate or region, or whether an eradication program is ongoing.
McLeod, James; Othman, Maazuza Z; Parthasarathy, Rajarathinam
2018-05-26
The relationship between mixing energy input and biogas production was investigated by anaerobically digesting sewage sludge in lab scale, hydraulically mixed, batch mode digesters at six different specific energy inputs. The goal was to identify how mixing energy influenced digestion performance at quantitative levels to help explain the varying results in other published works. The results showed that digester homogeneity was largely uninfluenced by energy input, whereas cumulative biogas production and solids destruction were. With similar solids distributions between conditions, the observed differences were attributed to shear forces disrupting substrate-microbe flocs rather than the formation of temperature and/or concentration gradients. Disruption of the substrate-microbe flocs produced less favourable conditions for hydrolytic bacteria, resulting in less production of biomass and more biogas. Overall, this hypothesis explains the current body of research including the inhibitory conditions reported at extreme mixing power inputs. However, further work is required to definitively prove it. Copyright © 2018 Elsevier Ltd. All rights reserved.
Visualizing NetCDF Files by Using the EverVIEW Data Viewer
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
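As a rough illustration of what "self-describing" means in practice, the short Python sketch below inspects a NetCDF file with the widely used netCDF4 package; the file name and the water_depth variable are assumptions, not part of EverVIEW.

```python
# Hedged sketch of inspecting a NetCDF file of the kind EverVIEW consumes,
# using the netCDF4 Python package; file and variable names are assumed.
from netCDF4 import Dataset

with Dataset("stage_output.nc") as ds:          # hypothetical model output file
    print(list(ds.dimensions))                  # e.g. time, y, x
    for name, var in ds.variables.items():
        print(name, var.dimensions, var.shape)  # the file describes itself
    depth = ds.variables["water_depth"][0, :, :]   # first time step, assumed variable
    print("mean depth:", depth.mean())
```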
The ZPIC educational code suite
NASA Astrophysics Data System (ADS)
Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.
2017-10-01
Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, including fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
Evaluation of Electronic Formats of the NASA Task Load Index
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.
2011-01-01
Paper questionnaires are being replaced by electronic questionnaires. The primary objective of this research was to determine whether electronic formats of paper questionnaires change subjects' ratings and, if so, how the ratings changed. Results indicated that there were no statistically significant differences in self-assessment of workload when using the electronic replica or the paper format of the NASA-TLX scale. Variations of the electronic formats were tested to impose structure on the TLX scale. Respondents gave more consistent ratings with these alternative formats of the NASA-TLX. Non-pilots, in general, had lower workload ratings than pilots. The time to input the rating was fastest for the electronic facsimile and random title formats. Subjects also preferred the electronic formats and thought these formats were easier to use. Therefore, moving questionnaires from paper to electronic media could change respondents' answers.
NASA Astrophysics Data System (ADS)
Matilainen, Ville-Pekka; Piili, Heidi; Salminen, Antti; Nyrhilä, Olli
Laser additive manufacturing (LAM) is a fabrication technology that enables production of complex parts from metallic materials with mechanical properties comparable to conventionally manufactured parts. In the LAM process, parts are manufactured by melting metallic powder layer-by-layer with a laser beam. This manufacturing technology is nowadays called powder bed fusion (PBF) according to the ASTM F2792-12a standard. This strategy involves several different independent and dependent thermal cycles, all of which have an influence on the final properties of the manufactured part. The quality of PBF parts depends strongly on the characteristics of each single laser-melted track and each single layer. This study consequently concentrates on investigating the effects of process parameters such as laser power on single track and layer formation and laser-material interaction phenomena occurring during the PBF process. Experimental tests were done with two different machines: a modified research machine based on an EOS EOSINT M-series system and an EOS EOSINT M280 system. The material used was EOS stainless steel 17-4 PH. Process monitoring was done with an active illuminated high speed camera system. After microscopy analysis, it was concluded that a keyhole can form during laser additive manufacturing of stainless steel. It was noted that heat input has an important effect on the likelihood of keyhole formation. The threshold intensity value for keyhole formation of 10⁶ W/cm² was exceeded in all manufactured single tracks. Laser interaction time was found to have an effect on penetration depth and keyhole formation, since the penetration depth increased with increased laser interaction time. It was also concluded that active illuminated high speed camera systems are suitable for monitoring of the manufacturing process and facilitate process control.
Immunity of medical electrical equipment to radiated RF disturbances
NASA Astrophysics Data System (ADS)
Mocha, Jan; Wójcik, Dariusz; Surma, Maciej
2018-04-01
Immunity of medical equipment to radiated radio frequency (RF) electromagnetic (EM) fields is a priority issue owing to the functions that the equipment is intended to perform. This is reflected in increasingly stringent normative requirements that medical electrical equipment has to conform to. A new version of the standard concerning electromagnetic compatibility of medical electrical equipment, IEC 60601-1-2:2014, has recently been published. The paper discusses major changes introduced in this edition of the standard. The changes comprise more rigorous immunity requirements for medical equipment as regards radiated RF EM fields and a new requirement for testing the immunity of medical electrical equipment to disturbances coming from digital radio communication systems. The paper then presents two typical designs of the input block: one involving a multi-level filtering and amplification circuit, and one integrating an input amplifier and an analog-to-digital converter in a single circuit. Regardless of the applied solution, the presence of electromagnetic disturbances in the input block leads to demodulation of the disturbance signal envelope. The article elaborates on the mechanisms of amplitude detection occurring in such cases. Penetration of electromagnetic interference from the amplifier's input to its output is also described. If the aforementioned phenomena are taken into account, engineers will be able to develop a more informed approach towards the issue of immunity to RF EM fields when designing input circuits in medical electrical equipment.
iSEDfit: Bayesian spectral energy distribution modeling of galaxies
NASA Astrophysics Data System (ADS)
Moustakas, John
2017-08-01
iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.
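The core of grid-based Bayesian SED fitting of this kind is weighting every model in a precomputed grid by its likelihood given the observed fluxes and inverse variances. The Python sketch below illustrates only that weighting step, on synthetic data; the model grid, prior draws, and array names are assumptions, and it is not the iSEDfit (IDL) implementation.

```python
# Illustrative grid-weighting step for a Bayesian SED fit: chi-square likelihood
# over an assumed model grid, then a marginalized estimate of one parameter.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_bands = 5000, 7
model_fluxes = rng.lognormal(0.0, 0.5, size=(n_models, n_bands))  # assumed model grid
model_mass   = rng.uniform(8.0, 12.0, size=n_models)              # assumed log-mass prior draws

obs_flux = model_fluxes[1234] * rng.normal(1.0, 0.05, n_bands)    # synthetic observation
ivar     = 1.0 / (0.05 * obs_flux) ** 2                           # inverse variances

chi2 = np.sum(ivar * (model_fluxes - obs_flux) ** 2, axis=1)
logL = -0.5 * chi2
weights = np.exp(logL - logL.max())
weights /= weights.sum()

# Posterior mean and spread of the (log) stellar mass, marginalized over the grid.
mass_mean = np.sum(weights * model_mass)
mass_std  = np.sqrt(np.sum(weights * (model_mass - mass_mean) ** 2))
print(f"log M* = {mass_mean:.2f} +/- {mass_std:.2f}")
```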
Definitions of the Phenotypic Manifestations of Sickle Cell Disease
Ballas, Samir K.; Lieff, Susan; Benjamin, Lennette J.; Dampier, Carlton D.; Heeney, Matthew M.; Hoppe, Carolyn; Johnson, Cage S.; Rogers, Zora R.; Smith-Whitley, Kim; Wang, Winfred C.; Telen, Marilyn J.
2016-01-01
Sickle cell disease (SCD) is a pleiotropic genetic disorder of hemoglobin that has profound multi-organ effects. The low prevalence of SCD (~100,000/US) has limited progress in clinical, basic, and translational research. Lack of a large, readily accessible population for clinical studies has contributed to the absence of standard definitions and diagnostic criteria for the numerous complications of SCD and inadequate understanding of SCD pathophysiology. In 2005, the Comprehensive Sickle Cell Centers initiated a project to establish consensus definitions of the most frequently occurring complications. A group of clinicians and scientists with extensive expertise in research and treatment of SCD gathered to identify and categorize the most common complications. From this group, a formal writing team was formed that further reviewed the literature, sought specialist input, and produced definitions in a standard format. This manuscript provides an overview of the process and describes twelve body system categories and the most prevalent or severe complications within these categories. A detailed Appendix provides standardized definitions for all complications identified within each system. This report proposes use of these definitions for studies of SCD complications, so future studies can be comparably robust and treatment efficacy measured. Use of these definitions will support greater accuracy in genotype-phenotype studies, thereby achieving a better understanding of SCD pathophysiology. This should nevertheless be viewed as a dynamic rather than final document; phenotype descriptions should be reevaluated and revised periodically to provide the most current standard definitions as etiologic factors are better understood and new diagnostic options are developed. PMID:19902523
NASA Astrophysics Data System (ADS)
Čepický, Jáchym; Moreira de Sousa, Luís
2016-06-01
The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates publishing of geospatial processes and client discovery of, and binding to, those processes within workflows. Data required by a WPS can be delivered across a network or can already be available on the server. PyWPS was one of the first implementations of OGC WPS on the server side. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
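A WPS process in PyWPS is essentially a declaration of typed inputs and outputs plus a handler function. The sketch below follows the publicly documented PyWPS Process API; the identifier, input names, and the toy computation are assumptions, and details may differ between PyWPS versions.

```python
# Hedged sketch of a PyWPS-style process definition; based on the commonly
# documented Process API, not on any specific deployed service.
from pywps import Process, LiteralInput, LiteralOutput


class BufferArea(Process):
    """Toy process: returns the area of a circle for a given buffer radius."""

    def __init__(self):
        inputs = [LiteralInput('radius', 'Buffer radius in metres', data_type='float')]
        outputs = [LiteralOutput('area', 'Area of the buffer circle', data_type='float')]
        super().__init__(
            self._handler,
            identifier='buffer_area',      # assumed identifier
            title='Buffer area demo',
            inputs=inputs,
            outputs=outputs,
        )

    def _handler(self, request, response):
        import math
        r = float(request.inputs['radius'][0].data)
        response.outputs['area'].data = math.pi * r * r
        return response
```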
40 CFR 60.43c - Standard for particulate matter (PM).
Code of Federal Regulations, 2010 CFR
2010-07-01
... construction, reconstruction, or modification on or before February 28, 2005, that combusts coal or combusts mixtures of coal with other fuels and has a heat input capacity of 8.7 MW (30 MMBtu/hr) or greater, shall... mixtures of wood with other fuels (except coal) and has a heat input capacity of 8.7 MW (30 MMBtu/hr) or...
Contractor Productivity Measurement.
1984-06-01
Principles ( GAAP ) or Uniform Cost Accounting Standards (CAS) as detailed in Federal Acquisition Regulation (FAR) Part 30 and DOD FAR Supplement, Appendix 0...revaluation management input, investor input, taxes, depreciation , etc., are all called out and addressed. The treatment of potential problems such as...of 20 percent. Since many cash flow items require tracking of book value, depreciation and cost- reducing effects of the investment, these items are
Effects of a Format-based Second Language Teaching Method in Kindergarten.
ERIC Educational Resources Information Center
Uilenburg, Noelle; Plooij, Frans X.; de Glopper, Kees; Damhuis, Resi
2001-01-01
Focuses on second language teaching with a format-based method. The differences between a format-based teaching method and a standard approach, used as treatments in a quasi-experimental, non-equivalent control group design, are described in detail. Examines whether the effects of a format-based teaching method and a standard foreign language method differ…
FASTRAN II - FATIGUE CRACK GROWTH STRUCTURAL ANALYSIS (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Newman, J. C.
1994-01-01
Predictions of fatigue crack growth behavior can be made with the Fatigue Crack Growth Structural Analysis (FASTRAN II) computer program. As cyclic loads are applied to a selected crack configuration with an initial crack size, FASTRAN II predicts crack growth as a function of cyclic load history until either a desired crack size is reached or failure occurs. FASTRAN II is based on plasticity-induced crack-closure behavior of cracks in metallic materials and accounts for load-interaction effects, such as retardation and acceleration, under variable-amplitude loading. The closure model is based on the Dugdale model with modifications to allow plastically deformed material to be left along the crack surfaces as the crack grows. Plane stress and plane strain conditions, as well as conditions between these two, can be simulated in FASTRAN II by using a constraint factor on tensile yielding at the crack front to approximately account for three-dimensional stress states. FASTRAN II contains seventeen predefined crack configurations (standard laboratory fatigue crack growth rate specimens and many common crack configurations found in structures); and the user can define one additional crack configuration. The baseline crack growth rate properties (effective stress-intensity factor against crack growth rate) may be given in either equation or tabular form. For three-dimensional crack configurations, such as surface cracks or corner cracks at holes or notches, the fatigue crack growth rate properties may be different in the crack depth and crack length directions. Final failure of the cracked structure can be modelled with fracture toughness properties using either linear-elastic fracture mechanics (brittle materials), a two-parameter fracture criterion (brittle to ductile materials), or plastic collapse (extremely ductile materials). The crack configurations in FASTRAN II can be subjected to either constant-amplitude, variable-amplitude or spectrum loading. The applied loads may be either tensile or compressive. Several standardized aircraft flight-load histories, such as TWIST, Mini-TWIST, FALSTAFF, Inverted FALSTAFF, Felix and Gaussian, are included as options. FASTRAN II also includes two other methods that will help the user input spectrum load histories. The two methods are: (1) a list of stress points, and (2) a flight-by-flight history of stress points. Examples are provided in the user manual. Developed as a research program, FASTRAN II has successfully predicted crack growth in many metallic materials under various aircraft spectrum loading. A computer program DKEFF which is a part of the FASTRAN II package was also developed to analyze crack growth rate data from laboratory specimens to obtain the effective stress-intensity factor against crack growth rate relations used in FASTRAN II. FASTRAN II is written in standard FORTRAN 77. It has been successfully compiled and implemented on Sun4 series computers running SunOS and on IBM PC compatibles running MS-DOS using the Lahey F77L FORTRAN compiler. Sample input and output data are included with the FASTRAN II package. The UNIX version requires 660K of RAM for execution. The standard distribution medium for the UNIX version (LAR-14865) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the MS-DOS version (LAR-14944) is a 5.25 inch 360K MS-DOS format diskette. 
The contents of the diskette are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The program was developed in 1984 and revised in 1992. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a trademark of International Business Machines Corp. MS-DOS is a trademark of Microsoft, Inc. F77L is a trademark of the Lahey Computer Systems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. PKWARE and PKUNZIP are trademarks of PKWare, Inc.
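For readers unfamiliar with fatigue crack growth integration, the Python sketch below steps a crack cycle by cycle with a simple Paris-type relation. It is only a generic illustration of the bookkeeping involved; it does not reproduce FASTRAN II's plasticity-induced crack-closure model, and the constants, geometry factor, and loading are placeholder assumptions.

```python
# Illustrative cycle-by-cycle crack growth integration using a Paris-type
# relation da/dN = C * (dK)**m. NOT the FASTRAN II closure model.
import math

C, m = 1.0e-11, 3.0          # assumed Paris constants (m/cycle, MPa*sqrt(m) units)
a, a_final = 0.001, 0.010    # initial and final crack length, m
delta_S = 100.0              # constant-amplitude stress range, MPa
beta = 1.12                  # assumed geometry factor for an edge crack

n = 0
while a < a_final:
    dK = beta * delta_S * math.sqrt(math.pi * a)   # stress-intensity factor range
    a += C * dK ** m                               # crack extension this cycle
    n += 1

print(f"cycles to grow from 1 mm to 10 mm: {n}")
```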
FASTRAN II - FATIGUE CRACK GROWTH STRUCTURAL ANALYSIS (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Newman, J. C.
1994-01-01
Predictions of fatigue crack growth behavior can be made with the Fatigue Crack Growth Structural Analysis (FASTRAN II) computer program. As cyclic loads are applied to a selected crack configuration with an initial crack size, FASTRAN II predicts crack growth as a function of cyclic load history until either a desired crack size is reached or failure occurs. FASTRAN II is based on plasticity-induced crack-closure behavior of cracks in metallic materials and accounts for load-interaction effects, such as retardation and acceleration, under variable-amplitude loading. The closure model is based on the Dugdale model with modifications to allow plastically deformed material to be left along the crack surfaces as the crack grows. Plane stress and plane strain conditions, as well as conditions between these two, can be simulated in FASTRAN II by using a constraint factor on tensile yielding at the crack front to approximately account for three-dimensional stress states. FASTRAN II contains seventeen predefined crack configurations (standard laboratory fatigue crack growth rate specimens and many common crack configurations found in structures); and the user can define one additional crack configuration. The baseline crack growth rate properties (effective stress-intensity factor against crack growth rate) may be given in either equation or tabular form. For three-dimensional crack configurations, such as surface cracks or corner cracks at holes or notches, the fatigue crack growth rate properties may be different in the crack depth and crack length directions. Final failure of the cracked structure can be modelled with fracture toughness properties using either linear-elastic fracture mechanics (brittle materials), a two-parameter fracture criterion (brittle to ductile materials), or plastic collapse (extremely ductile materials). The crack configurations in FASTRAN II can be subjected to either constant-amplitude, variable-amplitude or spectrum loading. The applied loads may be either tensile or compressive. Several standardized aircraft flight-load histories, such as TWIST, Mini-TWIST, FALSTAFF, Inverted FALSTAFF, Felix and Gaussian, are included as options. FASTRAN II also includes two other methods that will help the user input spectrum load histories. The two methods are: (1) a list of stress points, and (2) a flight-by-flight history of stress points. Examples are provided in the user manual. Developed as a research program, FASTRAN II has successfully predicted crack growth in many metallic materials under various aircraft spectrum loading. A computer program DKEFF which is a part of the FASTRAN II package was also developed to analyze crack growth rate data from laboratory specimens to obtain the effective stress-intensity factor against crack growth rate relations used in FASTRAN II. FASTRAN II is written in standard FORTRAN 77. It has been successfully compiled and implemented on Sun4 series computers running SunOS and on IBM PC compatibles running MS-DOS using the Lahey F77L FORTRAN compiler. Sample input and output data are included with the FASTRAN II package. The UNIX version requires 660K of RAM for execution. The standard distribution medium for the UNIX version (LAR-14865) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the MS-DOS version (LAR-14944) is a 5.25 inch 360K MS-DOS format diskette. 
The contents of the diskette are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The program was developed in 1984 and revised in 1992. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a trademark of International Business Machines Corp. MS-DOS is a trademark of Microsoft, Inc. F77L is a trademark of the Lahey Computer Systems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. PKWARE and PKUNZIP are trademarks of PKWare, Inc.
1994-07-01
REQUIRED MIX OF SEGMENTS OR INDIVIDUAL DATA ELEMENTS TO BE EXTRACTED. IN SEGMENT R ON AN INTERROGATION TRANSACTION (LTI), DATA RECORD NUMBER (DRN 0950) ONLY...and zation and Marketing input DICs. insert the Continuation Indicator Code (DRN 8555) in position 80 of this record. Maximum of OF The assigned NSN...for Procurement KFR, File Data Minus Security Classified Characteristics Data KFC 8.5-2 DoD 4100.39-M Volume 8 CHAPTER 5 ALPHABETIC INDEX OF DIC
Numerical Function Generators Using LUT Cascades
2007-06-01
either algebraically (for example, sin(x)) or as a table of input/output values. The user defines the numerical function by using the syntax of Scilab ...defined function in Scilab or specify it directly. Note that, by changing the parser of our system, any format can be used for the design entry. First...Methods for Multiple-Valued Input Address Generators," Proc. 36th IEEE Int'l Symp. Multiple-Valued Logic (ISMVL '06), May 2006. [29] Scilab 3.0, INRIA-ENPC
NASA Technical Reports Server (NTRS)
1973-01-01
This user's manual describes the FORTRAN IV computer program developed to compute the total vertical load, normal concentrated pressure loads, and the center of pressure of typical SRB water impact slapdown pressure distributions specified in the baseline configuration. The program prepares the concentrated pressure load information in punched card format suitable for input to the STAGS computer program. In addition, the program prepares for STAGS input the inertia reacting loads to the slapdown pressure distributions.
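The quantities the program reports (total vertical load, concentrated station loads, and center of pressure) follow from straightforward integration of the pressure distribution along the body. A minimal Python sketch of that arithmetic is shown below; the station spacing, load shape, and lumping rule are illustrative assumptions, not the program's actual algorithm.

```python
# Hedged sketch: integrate an assumed line-load distribution to get total load,
# centre of pressure, and concentrated loads lumped at the stations.
import numpy as np

x = np.linspace(0.0, 10.0, 101)                          # axial station, m (assumed)
p = np.maximum(0.0, 50.0 * np.sin(np.pi * x / 10.0))     # line load, kN/m (assumed shape)

# Trapezoidal panels between adjacent stations.
panel = 0.5 * (p[:-1] + p[1:]) * np.diff(x)              # load per panel, kN
x_mid = 0.5 * (x[:-1] + x[1:])                           # panel mid-points, m

total_load = panel.sum()                                 # total vertical load, kN
x_cp = (panel * x_mid).sum() / total_load                # centre of pressure, m

# Lump each panel load equally onto its two bounding stations (concentrated loads).
concentrated = np.zeros_like(x)
concentrated[:-1] += 0.5 * panel
concentrated[1:]  += 0.5 * panel

print(f"total load {total_load:.1f} kN, centre of pressure at x = {x_cp:.2f} m")
print(f"check: sum of concentrated loads = {concentrated.sum():.1f} kN")
```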
Towards a DNA Nanoprocessor: Reusable Tile-Integrated DNA Circuits.
Gerasimova, Yulia V; Kolpashchikov, Dmitry M
2016-08-22
Modern electronic microprocessors use semiconductor logic gates organized on a silicon chip to enable efficient inter-gate communication. Here, arrays of communicating DNA logic gates integrated on a single DNA tile were designed and used to process nucleic acid inputs in a reusable format. Our results lay the foundation for the development of a DNA nanoprocessor, a small and biocompatible device capable of performing complex analyses of DNA and RNA inputs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Thermal APU/hydraulics analysis program. User's guide and programmer's manual
NASA Technical Reports Server (NTRS)
Deluna, T. A.
1976-01-01
This document provides the user's guide information and program description necessary to run, and gain a general understanding of, the Thermal APU/Hydraulics Analysis Program (TAHAP). This information consists of general descriptions of the APU/hydraulic system and the TAHAP model, input and output data descriptions, and specific subroutine requirements. Deck setups and input data formats are included, along with other necessary and helpful information for using TAHAP. The math model descriptions for the driver program and each of its supporting subroutines are outlined.
Global Swath and Gridded Data Tiling
NASA Technical Reports Server (NTRS)
Thompson, Charles K.
2012-01-01
This software generates cylindrically projected "tiles" of swath-based or gridded satellite data for the purpose of dynamically generating high-resolution global images covering various time periods, scaling ranges, and color tables. It reconstructs a global image given a set of tiles covering a particular time range, scaling values, and a color table. The program is configurable in terms of tile size, spatial resolution, format of input data, location of input data (local or distributed), number of processes run in parallel, and data conditioning.
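A hedged sketch of the kind of index arithmetic behind cylindrical tiling is given below: it maps a longitude/latitude sample to a tile row/column and a pixel within that tile for an equirectangular layout. The tile size and pixel dimensions are assumptions, not the software's configuration.

```python
# Illustrative equirectangular tiling: which tile and pixel does a lon/lat
# sample fall into? Tile size and resolution are assumed values.
TILE_DEG = 10.0          # assumed tile size in degrees
TILE_PIX = 512           # assumed tile width/height in pixels

def tile_and_pixel(lon, lat):
    """Return (tile_col, tile_row, pix_x, pix_y) for an equirectangular tiling."""
    col = int((lon + 180.0) // TILE_DEG)
    row = int((90.0 - lat) // TILE_DEG)
    frac_x = ((lon + 180.0) % TILE_DEG) / TILE_DEG
    frac_y = ((90.0 - lat) % TILE_DEG) / TILE_DEG
    return col, row, int(frac_x * TILE_PIX), int(frac_y * TILE_PIX)

print(tile_and_pixel(-118.17, 34.20))   # e.g. a point near Pasadena
```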
User's guide to the UTIL-ODRC tape processing program. [for the Orbital Data Reduction Center
NASA Technical Reports Server (NTRS)
Juba, S. M. (Principal Investigator)
1981-01-01
The UTIL-ODRC computer compatible tape processing program, its input/output requirements, and its interface with the EXEC 8 operating system are described. It is a multipurpose orbital data reduction center (ODRC) tape processing program enabling the user to create either exact duplicate tapes and/or tapes in SINDA/HISTRY format. Input data elements for PRAMPT/FLOPLT and/or BATCH PLOT programs, a temperature summary, and a printed summary can also be produced.
75 FR 18751 - FBI Criminal Justice Information Services Division User Fees
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-13
... Standards (SFFAS-4): Managerial Cost Accounting Concepts and Standards for the Federal Government; and other relevant financial management directives, BearingPoint developed a cost accounting methodology and related... management process that provides information about the relationships between inputs (costs) and outputs...
40 CFR 63.1326 - Batch process vents-recordkeeping provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutant Emissions: Group IV Polymers and Resins... requirements for Group 2 batch process vents that are exempt from the batch mass input limitation provisions...
Optimization Under Uncertainty for Electronics Cooling Design
NASA Astrophysics Data System (ADS)
Bodla, Karthik K.; Murthy, Jayathi Y.; Garimella, Suresh V.
Optimization under uncertainty is a powerful methodology used in design and optimization to produce robust, reliable designs. Such an optimization methodology, employed when the input quantities of interest are uncertain, produces output uncertainties, helping the designer choose input parameters that would result in satisfactory thermal solutions. Apart from providing basic statistical information such as mean and standard deviation in the output quantities, auxiliary data from an uncertainty based optimization, such as local and global sensitivities, help the designer decide the input parameter(s) to which the output quantity of interest is most sensitive. This helps the design of experiments based on the most sensitive input parameter(s). A further crucial output of such a methodology is the solution to the inverse problem - finding the allowable uncertainty range in the input parameter(s), given an acceptable uncertainty range in the output quantity of interest...
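A minimal way to see how input uncertainties turn into output statistics and sensitivities is a Monte Carlo sweep, sketched below in Python under assumed distributions and an assumed one-line convection model; the methodology described above may use different propagation and sensitivity techniques.

```python
# Monte Carlo sketch: propagate uncertain inputs through a toy thermal model and
# report output mean, standard deviation, and a correlation-based sensitivity.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
h = rng.normal(50.0, 5.0, n)         # heat transfer coefficient, W/m^2-K (assumed)
q = rng.normal(20.0, 2.0, n)         # heat load, W (assumed)
A = 0.01                             # heat transfer area, m^2 (assumed)

dT = q / (h * A)                     # simple convection model: temperature rise, K

print(f"mean dT = {dT.mean():.1f} K, std = {dT.std():.1f} K")
for name, x in (("h", h), ("q", q)):
    r = np.corrcoef(x, dT)[0, 1]
    print(f"sensitivity of dT to {name} (correlation): {r:+.2f}")
```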
40 CFR 49.125 - Rule for limiting the emissions of particulate matter.
Code of Federal Regulations, 2010 CFR
2010-07-01
... used exclusively for space heating with a rated heat input capacity of less than 400,000 British... average of 0.23 grams per dry standard cubic meter (0.1 grains per dry standard cubic foot), corrected to... boiler stack must not exceed an average of 0.46 grams per dry standard cubic meter (0.2 grains per dry...
ERIC Educational Resources Information Center
Coester, Lee Anne
2010-01-01
This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…
Rapid and automatic speech-specific learning mechanism in human neocortex.
Kimppa, Lilli; Kujala, Teija; Leminen, Alina; Vainio, Martti; Shtyrov, Yury
2015-09-01
A unique feature of the human communication system is our ability to rapidly acquire new words and build large vocabularies. However, its neurobiological foundations remain largely unknown. In an electrophysiological study optimally designed to probe this rapid formation of new word memory circuits, we employed acoustically controlled novel word-forms incorporating native and non-native speech sounds, while manipulating the subjects' attention on the input. We found a robust index of neurolexical memory-trace formation: a rapid enhancement of the brain's activation elicited by novel words during a short (~30 min) perceptual exposure, underpinned by fronto-temporal cortical networks and, importantly, correlated with behavioural learning outcomes. Crucially, this neural memory-trace build-up took place regardless of focused attention on the input or any pre-existing or learnt semantics. Furthermore, it was found only for stimuli with native-language phonology, but not for acoustically closely matching non-native words. These findings demonstrate a specialised cortical mechanism for rapid, automatic and phonology-dependent formation of neural word memory circuits. Copyright © 2015. Published by Elsevier Inc.
Data Availability in Appliance Standards and Labeling Program Development and Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romankiewicz, John; Khanna, Nina; Vine, Edward
2013-05-01
In this report, we describe the necessary data inputs for both standards development and program evaluation and perform an initial assessment of the availability and uncertainty of those data inputs in China. For standards development, we find that China and its standards and labeling program administrators currently have access to the basic market and technical data needed for conducting market and technology assessments and technological and economic analyses. Some data, such as shipments data, are readily available from the China Energy Label product registration database, while the availability of other data, including average unit energy consumption, prices, and design options, needs improvement. Unlike in some other countries such as the United States, most of the necessary data for conducting standards development analyses are not publicly available or compiled in a consolidated data source. In addition, improved data on design and efficiency options as well as cost data (e.g., manufacturing costs, mark-ups, production and product use-phase costs) – key inputs to several techno-economic analyses – are particularly needed given China’s unconsolidated manufacturing industry. For program evaluation, we find that while China can conduct simple savings evaluations on its incentive programs with the data it currently has available from the Ministry of Finance – the program administrator – the savings estimates produced by such an evaluation will carry high uncertainty. As such, China could benefit from an increase in surveying and metering in the next one to three years to decrease the uncertainty surrounding key data points such as unit energy savings and free ridership.
NASA Astrophysics Data System (ADS)
Schiepers, Christiaan; Hoh, Carl K.; Dahlbom, Magnus; Wu, Hsiao-Ming; Phelps, Michael E.
1999-05-01
PET imaging can quantify metabolic processes in vivo; this requires the measurement of an input function, which is invasive and labor-intensive. A non-invasive, semi-automated, image-based method of input function generation would be efficient, patient-friendly, and allow quantitative PET to be applied routinely. A fully automated procedure would be ideal for studies across institutions. Factor analysis (FA) was applied as a processing tool for the definition of temporally changing structures in the field of view. FA has been proposed earlier, but the perceived mathematical difficulty has prevented widespread use. FA was utilized to delineate structures and extract blood and tissue time-activity curves (TACs). These TACs were used as input and output functions for tracer kinetic modeling, the results of which were compared with those from an input function obtained with serial blood sampling. Dynamic image data of myocardial perfusion studies with N-13 ammonia, O-15 water, or Rb-82, cancer studies with F-18 FDG, and skeletal studies with F-18 fluoride were evaluated. Correlation coefficients of kinetic parameters obtained with factor and plasma input functions were high. Linear regression usually furnished a slope near unity. Processing time was 7 min/patient on an UltraSPARC. Conclusion: FA can non-invasively generate input functions from image data, eliminating the need for blood sampling. Output (tissue) functions can be generated simultaneously. The method is simple, requires no sophisticated operator interaction, and has little inter-operator variability. FA is well suited for studies across institutions and standardized evaluations.
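The general idea of factoring a dynamic image sequence into a few time-varying components and their spatial weights can be sketched with off-the-shelf tools. In the fragment below, scikit-learn's non-negative matrix factorization stands in for the factor analysis used in the study; the synthetic curves, voxel count, and component number are assumptions.

```python
# Hedged sketch: recover two time-activity curves (blood-like and tissue-like)
# from a synthetic voxels-by-frames matrix using NMF as a stand-in for FA.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
t = np.linspace(0.0, 60.0, 30)                       # frame mid-times, min (assumed)
blood  = np.exp(-t / 5.0)                            # fast-clearing blood curve
tissue = 1.0 - np.exp(-t / 20.0)                     # slow tissue uptake
curves = np.vstack([blood, tissue])                  # (2, n_frames)

weights = rng.random((400, 2))                       # 400 voxels, mixing weights
frames = weights @ curves + 0.01 * rng.random((400, 30))   # voxels x frames

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(frames)                      # spatial factor weights
H = model.components_                                # recovered time-activity curves
print("recovered factor curves shape:", H.shape)
```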
Pirozzi, Enrica
2018-04-01
High variability in the neuronal response to stimulation and the adaptation phenomenon cannot be explained by the standard stochastic leaky integrate-and-fire model. The main reason is that the uncorrelated inputs assumed in the model are not realistic. There exists some form of dependency between the inputs, and it can be interpreted as memory effects. In order to include these physiological features in the standard model, we reconsider it with time-dependent coefficients and correlated inputs. Owing to its limited mathematical tractability, we perform simulations of the model for a wide investigation of its output. A Gauss-Markov process is constructed to approximate its non-Markovian dynamics. The first-passage-time probability density of such a process can be numerically evaluated, and it can be used to fit the histograms of simulated firing times. Some estimates of the moments of firing times are also provided. The effect of the correlation time of the inputs on firing densities and firing rates is shown. An exponential probability density of the first firing time is estimated for low values of input current and high values of correlation time. For comparison, a simulation-based investigation is also carried out for a fractional stochastic model that preserves the memory of the time evolution of the neuronal membrane potential. In this case, the memory parameter that affects the firing activity is the fractional derivative order. In both models an adaptation level of spike frequency is attained, albeit through different mechanisms. Comparisons and discussion of the obtained results are provided.
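To make the correlated-input idea concrete, the Python sketch below simulates a leaky integrate-and-fire membrane driven by an Ornstein-Uhlenbeck current whose correlation time can be varied. The parameter values are illustrative assumptions, and the sketch does not reproduce the paper's Gauss-Markov approximation or its fractional model.

```python
# Minimal leaky integrate-and-fire neuron with a correlated (Ornstein-Uhlenbeck)
# input current; all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
dt, T = 0.1e-3, 2.0                              # time step and duration, s
tau_m, v_rest, v_th, v_reset = 20e-3, -70e-3, -50e-3, -70e-3
tau_c, mu_I, sigma_I = 10e-3, 0.55e-9, 0.15e-9   # input correlation time and statistics
R = 40e6                                         # membrane resistance, ohm

v, I = v_rest, mu_I
spikes = []
for i in range(int(T / dt)):
    # Ornstein-Uhlenbeck input current (correlation time tau_c)
    I += (mu_I - I) * dt / tau_c + sigma_I * np.sqrt(2 * dt / tau_c) * rng.normal()
    # Leaky integrate-and-fire membrane update
    v += (-(v - v_rest) + R * I) * dt / tau_m
    if v >= v_th:
        spikes.append(i * dt)
        v = v_reset

print(f"{len(spikes)} spikes, mean rate {len(spikes) / T:.1f} Hz")
```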
Standardized reporting using CODES (Crash Outcome Data Evaluation System)
DOT National Transportation Integrated Search
1999-12-01
While CODES projects have expanded to 25 states, there is no standardized reporting of the outcome measures that are available with linked data. This paper describes our efforts to build a standard format for reporting these outcomes. This format is ...
The GOLM database standard - a framework for time-series data management based on free software
NASA Astrophysics Data System (ADS)
Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.
2009-04-01
Monitoring and modelling projects usually involve time series data originating from different sources. File formats, temporal resolution, and metadata documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling, and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats consumes additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for import and storage of time series data of different types while assisting in metadata documentation, plausibility checking, and harmonization. The imported data can be visually inspected, and their coverage across locations and variables may be visualized. Supplementary scripts provide options for exporting data for selected stations and variables and for resampling the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP, and OpenOffice, as well as the scripts for building and using the database, including documentation, are freely available for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological, and/or water quality data.
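The resampling/harmonization step described above can be illustrated with pandas, independent of the GOLM scripts themselves; in the sketch below the column name, timestamps, and the hourly aggregation are assumptions.

```python
# Hedged sketch: aggregate irregular time series readings to a common hourly
# resolution, the kind of harmonization step a GOLM-like workflow performs.
import pandas as pd

raw = pd.DataFrame(
    {"value": [0.2, 0.4, 0.1, 0.0, 0.3]},          # e.g. precipitation readings, mm
    index=pd.to_datetime(
        ["2008-07-01 00:05", "2008-07-01 00:12", "2008-07-01 00:20",
         "2008-07-01 00:48", "2008-07-01 01:10"]),
)

# Sum the irregular readings into hourly bins suitable as model input.
hourly = raw["value"].resample("1h").sum()
print(hourly)
```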
A laboratory procedure for measuring and georeferencing soil colour
NASA Astrophysics Data System (ADS)
Marques-Mateu, A.; Balaguer-Puig, M.; Moreno-Ramon, H.; Ibanez-Asensio, S.
2015-04-01
Remote sensing and geospatial applications very often require ground truth data to assess outcomes from spatial analyses or environmental models. Those data sets, however, may be difficult to collect in a proper format or may even be unavailable. In the particular case of soil colour, the collection of reliable ground data can be cumbersome due to measuring methods, colour communication issues, and other practical factors, which leads to a lack of a standard procedure for soil colour measurement and georeferencing. In this paper we present a laboratory procedure that provides colour coordinates of georeferenced soil samples, which become useful in later processing stages of soil mapping and classification from digital images. The procedure requires a laboratory setup consisting of a light booth and a trichromatic colorimeter, together with a computer program that performs colour measurement, storage, and colour space transformation tasks. Measurement tasks are automated by means of specific data logging routines which allow storing recorded colour data in a spatial format. A key feature of the system is the ability to transform between physically based colour spaces and the Munsell system, which is still the standard in soil science. The working scheme pursues the automation of routine tasks whenever possible and the avoidance of input mistakes by means of a convenient layout of the user interface. The program can readily manage colour and coordinate data sets, which eventually allows creating spatial data sets. All the tasks regarding the joining of colorimeter measurements and sample locations are executed by the software in the background, allowing users to concentrate on sample processing. As a result, we obtained a robust and fully functional computer-based procedure which has proven to be a very useful tool for sample classification and cataloging purposes as well as for integrating soil colour data with other remotely sensed and spatial data sets.
A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management-architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.
File formats commonly used in mass spectrometry proteomics.
Deutsch, Eric W
2012-12-01
The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics.
Weissflog, Ludwig; Krüger, Gert; Elansky, Nikolai; Putz, Erich; Pfennigsdorff, Andrea; Seyfarth, Klaus Ullrich; Nüchter, Matthias; Lange, Christian; Kotte, Karsten
2003-07-01
Trichloroacetic acid (TCA, CCl(3)COOH) is a phytotoxic chemical. Although TCA salts and derivates were once used as herbicides to combat perennial grasses and weeds, they have since been banned because of their indiscriminate herbicidal effects on woody plant species. However, TCA can also be formed in the atmosphere. For instance, the high-volatile C(2)-chlorohydrocarbons tetrachloroethene (TECE, C(2)Cl(4)) and 1,1,1-trichloroethane (TCE, CCl(3)CH(3)) can react under oxidative conditions in the atmosphere to form TCA and other substances. The ongoing industrialisation of Southeast Asia, South Africa and South America means that use of TECE as solvents in the metal and textile industries of these regions in the southern hemisphere can be expected to rise. The increasing emissions of this substance--together with the rise in the atmospheric oxidation potential caused by urban activities, slash and burn agriculture and forest fires in the southern hemisphere--could lead to a greater input/formation of TCA in the vegetation located in the lee of these emission sources. By means of biomonitoring studies, the input/formation of TCA in vegetation was detected at various locations in South America, North America, Africa, and Europe.
STARS: A general-purpose finite element computer program for analysis of engineering structures
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1984-01-01
STARS (Structural Analysis Routines) is primarily an interactive, graphics-oriented, finite-element computer program for analyzing the static, stability, free vibration, and dynamic responses of damped and undamped structures, including rotating systems. The element library consists of one-dimensional (1-D) line elements, two-dimensional (2-D) triangular and quadrilateral shell elements, and three-dimensional (3-D) tetrahedral and hexahedral solid elements. These elements enable the solution of structural problems that include truss, beam, space frame, plane, plate, shell, and solid structures, or any combination thereof. Zero, finite, and interdependent deflection boundary conditions can be implemented by the program. The associated dynamic response analysis capability provides for initial deformation and velocity inputs, whereas the transient excitation may be either forces or accelerations. An effective in-core or out-of-core solution strategy is automatically employed by the program, depending on the size of the problem. Data input may be at random within a data set, and the program offers certain automatic data-generation features. Input data are formatted as an optimal combination of free and fixed formats. Interactive graphics capabilities enable convenient display of nodal deformations, mode shapes, and element stresses.
The Assay Development Working Group (ADWG) of the CPTAC Program is currently drafting a document to propose best practices for generation, quantification, storage, and handling of peptide standards used for mass spectrometry-based assays, as well as interpretation of quantitative proteomic data based on peptide standards. The ADWG is seeking input from commercial entities that provide peptide standards for mass spectrometry-based assays or that perform amino acid analysis.
Enhancing Access to Drought Information Using the CUAHSI Hydrologic Information System
NASA Astrophysics Data System (ADS)
Schreuders, K. A.; Tarboton, D. G.; Horsburgh, J. S.; Sen Gupta, A.; Reeder, S.
2011-12-01
The National Integrated Drought Information System (NIDIS) Upper Colorado River Basin pilot study is investigating and establishing capabilities for better dissemination of drought information for early warning and management. As part of this study we are using and extending functionality from the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) to provide better access to drought-related data in the Upper Colorado River Basin. The CUAHSI HIS is a federated system for sharing hydrologic data. It comprises multiple data servers, referred to as HydroServers, that publish data in a standard XML format called Water Markup Language (WaterML) using web services referred to as WaterOneFlow web services. HydroServers can also publish geospatial data using Open Geospatial Consortium (OGC) web map, feature, and coverage services, and are capable of hosting web and map applications that combine geospatial datasets with observational data served via web services. HIS also includes a centralized metadata catalog that indexes data from registered HydroServers and a data access client referred to as HydroDesktop. For NIDIS, we have established a HydroServer to publish drought index values as well as the input data used in drought index calculations. Primary input data required for drought index calculation include streamflow, precipitation, reservoir storage, snow water equivalent, and soil moisture. We have developed procedures to redistribute the input data to the time and space scales chosen for drought index calculation, namely half-monthly time intervals for HUC 10 subwatersheds. The spatial redistribution approach used for each input parameter depends on the spatial linkages for that parameter; for example, the redistribution procedure for streamflow depends on the upstream/downstream connectivity of the stream network, and the precipitation redistribution procedure depends on elevation to account for orographic effects. A set of drought indices is then calculated from the redistributed data. We have created automated data and metadata harvesters that periodically scan and harvest new data from each of the input databases and calculate extensions to the resulting derived datasets, ensuring that the data available on the drought server are kept up to date. This paper describes this system, showing how it facilitates the integration of data from multiple sources to inform the planning and management of water resources during drought. The system may be accessed at http://drought.usu.edu.
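To illustrate the kind of programmatic access that WaterOneFlow web services allow, the sketch below queries a SOAP endpoint for one time series of daily streamflow. It is a minimal sketch only: the WSDL URL, site code, and variable code are hypothetical placeholders, and the third-party suds SOAP client is assumed to be installed; this is not part of the system described above.

    # Minimal sketch of querying a WaterOneFlow web service for time-series data.
    # Assumptions: the suds SOAP client is installed, and the WSDL URL, site code,
    # and variable code below are hypothetical placeholders.
    from suds.client import Client

    WSDL_URL = "http://example.org/cuahsi_1_1.asmx?WSDL"   # hypothetical endpoint

    client = Client(WSDL_URL)

    # GetValues returns a WaterML document for one site, variable, and time period.
    waterml = client.service.GetValues(
        "NWISDV:09380000",      # hypothetical site code (network:site)
        "NWISDV:00060",         # hypothetical variable code (daily discharge)
        "2010-01-01",           # start date
        "2010-12-31",           # end date
        "")                     # auth token (unused for public services)

    print(str(waterml)[:500])   # inspect the beginning of the WaterML response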
NASA Astrophysics Data System (ADS)
Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.
2017-12-01
Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling under various scenarios (i.e., different crops, management schedules, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. Our local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file for parallel computing, the user-defined number of CPU threads divides the EPIC simulation into jobs. Using the EPIC input data formatters, the raw database is formatted into EPIC input data, and the formatted data are passed to the EPIC simulation jobs. The 28 EPIC jobs then run simultaneously, and only the results files of interest are parsed and passed to the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
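The following sketch illustrates the general dispatch pattern such a framework could use: a pool of worker processes, each of which prepares the inputs for one grid cell, runs the EPIC executable, and parses the outputs. It is illustrative only; the "epic" executable name, directory layout, and helper structure are hypothetical placeholders, not the authors' actual code.

    # Illustrative sketch of dispatching per-grid-cell EPIC runs across CPU threads.
    # The executable name ("epic"), directory layout, and helper structure are
    # hypothetical placeholders, not the framework's actual interface.
    import subprocess
    from multiprocessing import Pool
    from pathlib import Path

    def run_cell(cell_id):
        """Format inputs, run EPIC, and parse outputs for one grid cell."""
        workdir = Path("runs") / f"cell_{cell_id}"
        workdir.mkdir(parents=True, exist_ok=True)
        # ... write formatted EPIC input files into workdir here ...
        subprocess.run(["epic"], cwd=workdir, check=True)   # run the model
        # ... parse only the output files of interest here ...
        return cell_id

    if __name__ == "__main__":
        cell_ids = range(406_839)                 # Iringa test case: 406,839 cells
        with Pool(processes=28) as pool:          # 28 threads on the test desktop
            for done in pool.imap_unordered(run_cell, cell_ids, chunksize=256):
                pass                              # progress logging could go here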
Kaltsa, O; Michon, C; Yanniotis, S; Mandala, I
2013-05-01
Ultrasonication may be a cost-effective emulsion formation technique, but its impact on the final structure and droplet size of emulsions needs to be further investigated. Olive oil emulsions (20 wt%) were formulated (pH ~7) using whey protein (3 wt%), three kinds of hydrocolloids (0.1-0.5 wt%) and two different emulsification energy inputs (single- and two-stage, methods A and B, respectively). The effects of formulation and energy input on emulsion performance are discussed. Emulsion stability was evaluated over a 10-day storage period at 5°C by recording the turbidity profiles of the emulsions. Optical micrographs, droplet size and viscosity values were also obtained. A differential scanning calorimetry (DSC) multiple cool-heat cyclic method (40 to -40°C) was performed to examine stability via crystallization phenomena of the dispersed phase. Doubling the ultrasonication energy input from 11 kJ to 25 kJ (method B) resulted in the production of stable emulsions (reduction of backscattering values, dBS ~1% after 10 days of storage) at 0.5 wt% concentration of any of the stabilizers used. At lower gum concentrations, samples became unstable due to depletion flocculation phenomena, regardless of the emulsification energy input used. High energy input during ultrasonic emulsification also resulted in sub-micron oil-droplet emulsions (D(50) = 0.615 μm, compared with D(50) = 1.3 μm using method A) with narrower particle size distributions, and in reduced viscosity. DSC experiments revealed no bulk oil formation, suggesting stability for the XG 0.5 wt% emulsions prepared by both methods. Reduced enthalpy values were found when method B was applied, suggesting structural modifications produced by extensive ultrasonication. Changing the ultrasonication conditions results in significant changes in the oil droplet size and stability of the produced emulsions. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Falcón-Barroso, Jesús; Knapen, Johan H.
2013-10-01
Preface; 1. Secular evolution in disk galaxies John Kormendy; 2. Galaxy morphology Ronald J. Buta; 3. Dynamics of secular evolution James Binney; 4. Bars and secular evolution in disk galaxies: theoretical input E. Athanassoula; 5. Stellar populations Reynier F. Peletier; 6. Star formation rate indicators Daniela Calzetti; 7. The evolving interstellar medium Jacqueline van Gorkom; 8. Evolution of star formation and gas Nick Z. Scoville; 9. Cosmological evolution of galaxies Isaac Shlosman.
New Decentralized Algorithms for Spacecraft Formation Control Based on a Cyclic Approach
2010-06-01
...space framework. As the metric of performance, a common quadratic norm that weights the performance error and the control effort is traded against the cost... With R = D^T D, the metric of interest is the square of the 2-norm from input w to output z. Given a system G with state-space description A..., ...spaced logarithmic spiral formation. These results are derived for...
A Pipeline Software Architecture for NMR Spectrum Data Translation
Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.
2012-01-01
The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
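As a generic illustration of the pipeline style described above (and not CONNJUR's actual implementation), the sketch below chains a reader, a sequence of transformation stages, and a writer, so that a new input or output format can be supported by swapping only the end stages. All class and function names are hypothetical.

    # Generic reader -> transform stages -> writer pipeline for format translation.
    # All names here are hypothetical illustrations of the pattern, not CONNJUR code.
    from typing import Callable, Iterable, List

    Record = dict  # one parsed data record (e.g., one block of spectrum metadata)

    def run_pipeline(reader: Callable[[str], Iterable[Record]],
                     stages: List[Callable[[Record], Record]],
                     writer: Callable[[Iterable[Record], str], None],
                     src: str, dst: str) -> None:
        """Read records from src, pass each through every stage, write to dst."""
        def transformed():
            for record in reader(src):
                for stage in stages:
                    record = stage(record)
                yield record
        writer(transformed(), dst)

Because every stage consumes and produces the same record shape, supporting an additional output format requires only a new writer at the end of the chain, which is the key benefit of the pipeline decomposition.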
Redox-Enabled, pH-Disabled Pyrazoline-Ferrocene INHIBIT Logic Gates.
Scerri, Glenn J; Cini, Miriam; Schembri, Jonathan S; da Costa, Paola F; Johnson, Alex D; Magri, David C
2017-07-05
Pyrazoline-ferrocene conjugates with an "electron-donor-spacer-fluorophore-receptor" format are demonstrated as redox-fluorescent two-input INHIBIT logic gates. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Spectra, chromatograms, Metadata: mzML-the standard data format for mass spectrometer output.
Turewicz, Michael; Deutsch, Eric W
2011-01-01
This chapter describes the Mass Spectrometry Markup Language (mzML), an XML-based and vendor-neutral standard data format for the storage and exchange of mass spectrometer output such as raw spectra and peak lists. It is intended to replace its two precursor data formats (mzData and mzXML), which had been developed independently a few years earlier. Hence, with the release of mzML, the problem of having two different formats for the same purposes is solved, and with it the duplicated effort of maintaining and supporting two data formats. The new format has been developed by a broad-based consortium of major instrument vendors, software vendors, and academic researchers under the aegis of the Human Proteome Organisation (HUPO) Proteomics Standards Initiative (PSI), with full participation of the main developers of the precursor formats. This comprehensive approach helped mzML become a generally accepted standard. Furthermore, the collaborative development ensured that mzML has adopted the best features of its precursor formats. In this chapter, we discuss mzML's development history, its design principles and use cases, as well as its main building components. We also present the available documentation, an example file, and validation software for mzML.
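For readers who want to work with mzML programmatically, the sketch below iterates over the spectra in an mzML file. It assumes the third-party pyteomics library (which is not part of the mzML specification itself), and the file name is a placeholder.

    # Minimal sketch of reading spectra from an mzML file with pyteomics.
    # pyteomics is a third-party library assumed to be installed; "example.mzML"
    # is a placeholder file name.
    from pyteomics import mzml

    with mzml.read("example.mzML") as reader:
        for spectrum in reader:
            mz = spectrum["m/z array"]                # decoded m/z values
            intensities = spectrum["intensity array"]  # matching intensities
            print(spectrum["id"], len(mz), intensities.max())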
A Dose of Reality: Radiation Analysis for Realistic Human Spacecraft
NASA Technical Reports Server (NTRS)
Barzilla, J. E.; Lee, K. T.
2017-01-01
INTRODUCTION As with most computational analyses, a tradeoff exists between problem complexity, resource availability and response accuracy when modeling radiation transport from the source to a detector. The largest amount of analyst time for setting up an analysis is often spent ensuring that any simplifications made have minimal impact on the results. The vehicle shield geometry of interest is typically simplified from the original CAD design in order to reduce computation time, but this simplification requires the analyst to "re-draw" the geometry with a limited set of volumes in order to accommodate a specific radiation transport software package. The resulting low-fidelity geometry model cannot be shared with or compared to other radiation transport software packages, and the process can be error-prone with increased model complexity. The work presented here demonstrates the use of the DAGMC (Direct Accelerated Geometry Monte Carlo) Toolkit from the University of Wisconsin to model the impacts of several space radiation sources on a CAD drawing of the US Lab module. METHODS The DAGMC toolkit workflow begins with the export of an existing CAD geometry from the native CAD to the ACIS format. The ACIS format file is then cleaned using SpaceClaim to remove small holes and component overlaps. Metadata is then assigned to the cleaned geometry file using CUBIT/Trelis from csimsoft (Registered Trademark). The DAGMC plugin script removes duplicate shared surfaces, facets the geometry to a specified tolerance, and ensures that the faceted geometry is watertight. This step also writes the material and scoring information to a standard input file format that the analyst can alter as desired prior to running the radiation transport program. The scoring results can be transformed, via a Python script, into a 3D format that is viewable in a standard graphics program. RESULTS The CAD model of the US Lab module of the International Space Station, inclusive of all the racks and components, was simplified to remove holes and volume overlaps. Problematic features within the drawing were also removed or repaired to prevent runtime issues. The cleaned drawing was then run through the DAGMC workflow to prepare for analysis. Pilot tests modeling transport of 1 GeV proton and 800 MeV/A oxygen sources show that reasonable results are converged upon in an acceptable amount of overall computation time from drawing preparation to data analysis. The FLUKA radiation transport code will next be used to model both a GCR and a trapped radiation source. These results will then be compared with measurements that have been made by the radiation instrumentation deployed inside the US Lab module. DISCUSSION Early analyses have indicated that the DAGMC workflow is a promising toolkit for running vehicle geometries of interest to NASA through multiple radiation transport codes. In addition, recent work has shown that a realistic human phantom, provided via a subcontract with the University of Florida, can be placed inside any vehicle geometry for a combinatorial analysis. This added functionality gives the user the ability to score various parameters at the organ level, and the results can then be used as input for cancer risk models.
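To illustrate the kind of post-processing step mentioned above (transforming scoring results into a 3D-viewable format), the sketch below writes a gridded dose score to a legacy ASCII VTK structured-points file that common visualization tools such as ParaView can open. This is an assumed, illustrative choice of output format rather than the actual NASA script; the dose array, grid spacing, and file name are placeholders.

    # Illustrative sketch: write a 3D grid of scoring results to a legacy ASCII VTK
    # structured-points file viewable in common graphics tools (e.g., ParaView).
    # The dose array, grid spacing, and file name are hypothetical placeholders.
    import numpy as np

    def write_vtk_structured_points(dose, spacing, path):
        nx, ny, nz = dose.shape
        with open(path, "w") as f:
            f.write("# vtk DataFile Version 3.0\n")
            f.write("dose score grid\n")
            f.write("ASCII\n")
            f.write("DATASET STRUCTURED_POINTS\n")
            f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
            f.write("ORIGIN 0 0 0\n")
            f.write(f"SPACING {spacing} {spacing} {spacing}\n")
            f.write(f"POINT_DATA {nx * ny * nz}\n")
            f.write("SCALARS dose float 1\n")
            f.write("LOOKUP_TABLE default\n")
            # VTK expects x varying fastest; Fortran order gives that for dose[x, y, z].
            for value in dose.flatten(order="F"):
                f.write(f"{value:.6e}\n")

    dose = np.random.rand(20, 20, 20)            # placeholder scoring results
    write_vtk_structured_points(dose, spacing=5.0, path="dose.vtk")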
40 CFR 60.40b - Applicability and delegation of authority.
Code of Federal Regulations, 2011 CFR
2011-07-01
... applicability requirements under subpart D (Standards of performance for fossil-fuel-fired steam generators... meeting the applicability requirements under subpart D (Standards of performance for fossil-fuel-fired...) heat input of fossil fuel. If the heat recovery steam generator is subject to this subpart, only...
40 CFR 60.40b - Applicability and delegation of authority.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicability requirements under subpart D (Standards of performance for fossil-fuel-fired steam generators... meeting the applicability requirements under subpart D (Standards of performance for fossil-fuel-fired...) heat input of fossil fuel. If the heat recovery steam generator is subject to this subpart, only...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Engineers (ASHRAE) Standard 16-69, “Method of Testing for Rating Room Air Conditioners.” 2. Test conditions...-1972 and in accordance with ASHRAE Standard 16-69. 4.2Determine the electrical power input (expressed...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Engineers (ASHRAE) Standard 16-69, “Method of Testing for Rating Room Air Conditioners.” 2. Test conditions...-1972 and in accordance with ASHRAE Standard 16-69. 4.2Determine the electrical power input (expressed...
Decentralized or onsite wastewater treatment (OWT) systems have long been implicated as a major source of N inputs to surface and ground waters, and numerous regulatory bodies have promulgated strict total N (TN) effluent standards in N-sensitive areas. These standards, howe...