Coupled rotor/airframe vibration analysis program manual. Volume 2: Sample input and output listings
NASA Technical Reports Server (NTRS)
Cassarino, S.; Sopher, R.
1982-01-01
Sample input and output listings obtained with the base program (SIMVIB) of the coupled rotor/airframe vibration analysis and with the external programs G400/F389 and E927 are presented. Results for five of the base program test cases are shown; they represent different applications of the SIMVIB program to study the vibration characteristics of various dynamic configurations. Input and output listings obtained for one cycle of the G400/F389 coupled program are presented, and results from the rotor aeroelastic analysis E927 also appear. A brief description of the check cases is provided, together with a summary of the check cases for all external programs that interact with the SIMVIB program.
Blanks: a computer program for analyzing furniture rough-part needs in standard-size blanks
Philip A. Araman
1983-01-01
A computer program is described that allows a company to determine the number of edge-glued, standard-size blanks required to satisfy its rough-part needs for a given production period. Yield and cost information is also determined by the program. The program inputs, outputs, and uses of the outputs are described, and an example analysis with sample output is...
NASA Technical Reports Server (NTRS)
1982-01-01
Personal data input, decompression data, nitrogen washout, nitrogen data, and update computer programs are described. Input data and formats; program output, reports, and data; program flowcharts; program listings; sample runs with input and output pages; hardware operation; and engineering data are provided.
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A user's manual for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application is presented. The manual describes the computer program's mode of operation, requirements, input structure, input data requirements, and program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printout of the program output for one of the sample setups.
Program to Optimize Simulated Trajectories (POST). Volume 2: Utilization manual
NASA Technical Reports Server (NTRS)
Bauer, G. L.; Cornick, D. E.; Habeger, A. R.; Petersen, F. M.; Stevenson, R.
1975-01-01
Information pertinent to users of the Program to Optimize Simulated Trajectories (POST) is presented. The input required and the output available are described for each of the trajectory and targeting/optimization options. A sample input listing and the resulting output are given.
TRANDESNF: A computer program for transonic airfoil design and analysis in nonuniform flow
NASA Technical Reports Server (NTRS)
Chang, J. F.; Lan, C. Edward
1987-01-01
The use of a transonic airfoil code for analysis, inverse design, and direct optimization of an airfoil immersed in a propfan slipstream is described. The theoretical method, program capabilities, input format, output variables, and program execution are summarized. Input data for sample test cases and the corresponding output are given.
FORTRAN program for predicting off-design performance of radial-inflow turbines
NASA Technical Reports Server (NTRS)
Wasserbauer, C. A.; Glassman, A. J.
1975-01-01
The FORTRAN IV program uses a one-dimensional solution of flow conditions through the turbine along the mean streamline. The program inputs needed are the design-point requirements and turbine geometry. The output includes performance and velocity-diagram parameters over a range of speed and pressure ratio. Computed performance is compared with the experimental data from two radial-inflow turbines and with the performance calculated by a previous computer program. The flow equations, program listing, and input and output for a sample problem are given.
NASA Technical Reports Server (NTRS)
Chang, H.
1976-01-01
A computer program using Lemke, Salkin and Spielberg's Set Covering Algorithm (SCA) to optimize a traffic model problem in the Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE) was documented. SCA forms a submodule of SAMPLE and provides for input and output, subroutines, and an interactive feature for performing the optimization and arranging the results in a readily understandable form for output.
Computer program for preliminary design analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1972-01-01
The program method is based on a mean-diameter flow analysis. Input design requirements include power or pressure ratio, flow, temperature, pressure, and speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse). Exit turning vanes can be included in the design. Program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, blading angles, and last-stage critical velocity ratios. The report presents the analysis method, a description of input and output with sample cases, and the program listing.
Graphics and composite material computer program enhancements for SPAR
NASA Technical Reports Server (NTRS)
Farley, G. L.; Baker, D. J.
1980-01-01
User documentation is provided for additional computer programs developed for use in conjunction with SPAR. These programs plot digital data, simplify input for composite material section properties, and compute lamina stresses and strains. Sample problems are presented including execution procedures, program input, and graphical output.
A National Estimate of Performance: Statewide Highway Safety Program Assessment.
ERIC Educational Resources Information Center
National Highway Traffic Safety Administration (DOT), Washington, DC.
A nationwide systematic approach to assessing the developments and achievements of highway safety activities was conducted to measure program outputs from 1969 through 1974, using key indicators of performance such as ratios and percentages. A sample of 10 states was selected, with an overall sample of 105 local jurisdictions, which would provide estimated…
DOT National Transportation Integrated Search
1981-09-01
Volume III is the third and last volume of a three volume document describing the computer program HEVSIM. This volume includes appendices which list the HEVSIM program, sample part data, some typical outputs and updated nomenclature.
An Interactive Graphics Program for Investigating Digital Signal Processing.
ERIC Educational Resources Information Center
Miller, Billy K.; And Others
1983-01-01
Describes the development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe each system's performance by means of digital plots of its outputs. A sample program run is included. (JN)
Computer program documentation user information for the RSO-tape print program (RSOPRNT)
NASA Technical Reports Server (NTRS)
Gibbs, P. M. (Principal Investigator)
1980-01-01
A user's guide for RSOPRNT, a TRASYS Master Restart Output Tape (RSO) reader, is presented. Background information and sample runstreams, as well as references, input requirements, and options, are included.
The Even-Rho and Even-Epsilon Algorithms for Accelerating Convergence of a Numerical Sequence
1981-12-01
…equal, leading to zero or very small divisors. Computer programs implementing these algorithms are given along with sample output. An appreciable amount… calculation of the array of Shanks transforms or, equivalently, of the related Padé table. The other, the even-rho algorithm, is closely related…
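As context for the methods named above, the array of Shanks transforms can be tabulated with Wynn's epsilon recurrence; the sketch below is an illustrative Python version, not the report's FORTRAN programs, and the `epsilon_accelerate` helper and the alternating-series example are choices made here for demonstration. The zero-divisor check corresponds to the failure mode the report's even-epsilon and even-rho variants are designed to handle.

```python
import math

def epsilon_accelerate(seq):
    """Highest-order even-column epsilon estimate for a list of partial sums."""
    eps_prev = [0.0] * (len(seq) + 1)   # column k = -1 (all zeros)
    eps_curr = list(seq)                # column k = 0 is the sequence itself
    k = 0
    best = seq[-1]
    while len(eps_curr) > 1:
        eps_next = []
        for n in range(len(eps_curr) - 1):
            diff = eps_curr[n + 1] - eps_curr[n]
            if diff == 0.0:             # zero divisor: return the best estimate so far
                return best
            eps_next.append(eps_prev[n + 1] + 1.0 / diff)
        eps_prev, eps_curr = eps_curr, eps_next
        k += 1
        if k % 2 == 0:                  # even-order columns are the Shanks estimates
            best = eps_curr[-1]
    return best

# Partial sums of the slowly convergent series ln(2) = 1 - 1/2 + 1/3 - ...
partial, s = [], 0.0
for i in range(1, 11):
    s += (-1) ** (i + 1) / i
    partial.append(s)
print(epsilon_accelerate(partial), math.log(2))
```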
A space transportation system operations model
NASA Technical Reports Server (NTRS)
Morris, W. Douglas; White, Nancy H.
1987-01-01
Presented is a description of a computer program which permits assessment of the operational support requirements of space transportation systems functioning in both a ground- and space-based environment. The scenario depicted provides for the delivery of payloads from Earth to a space station and beyond using upper stages based at the station. Model results are scenario dependent and rely on the input definitions of delivery requirements, task times, and available resources. Output is in terms of flight rate capabilities, resource requirements, and facility utilization. A general program description, program listing, input requirements, and sample output are included.
Helicopter rotor loads using matched asymptotic expansions: User's manual
NASA Technical Reports Server (NTRS)
Pierce, G. A.; Vaidyanathan, A. R.
1983-01-01
Computer programs were developed to implement the computational scheme arising from Van Holten's asymptotic method for calculating airloads on a helicopter rotor blade in forward flight, and a similar technique which is based on a discretized version of the method. The basic outlines of the two programs are presented, followed by separate descriptions of the input requirements and output format. Two examples illustrating job entry with appropriate input data and corresponding output are included. Appendices contain a sample table of lift coefficient data for the NACA 0012 airfoil and listings of the two programs.
An Instructional Approach to Modeling in Microevolution.
ERIC Educational Resources Information Center
Thompson, Steven R.
1988-01-01
Describes an approach to teaching population genetics and evolution and some of the ways models can be used to enhance understanding of the processes being studied. Discusses the instructional plan, and the use of models including utility programs and analysis with models. Provided are a basic program and sample program outputs. (CW)
NASA Technical Reports Server (NTRS)
1983-01-01
Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.
A computer program for sample size computations for banding studies
Wilson, K.R.; Nichols, J.D.; Hines, J.E.
1989-01-01
Sample sizes necessary for estimating survival rates of banded birds, adults and young, are derived based on specified levels of precision. The banding study can be new or ongoing. The desired coefficient of variation (CV) for annual survival estimates, the CV for mean annual survival estimates, and the length of the study must be specified to compute sample sizes. A computer program is available for computation of the sample sizes, and a description of the input and output is provided.
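The abstract does not reproduce the band-recovery variance formulas the program evaluates, so the sketch below only illustrates the generic scaling behind most such calculations: the coefficient of variation of an estimate shrinks roughly as one over the square root of the number banded. The function name and the pilot-study numbers are hypothetical.

```python
# Hedged sketch, not the published program: it only illustrates the generic
# scaling used in many sample-size calculations, namely that the coefficient
# of variation (CV) of an estimate shrinks roughly as 1/sqrt(n).  The actual
# program uses band-recovery model variance formulas not given in the abstract.
def bands_needed(pilot_bands, pilot_cv, target_cv):
    """Bands to release so the survival-rate CV drops from pilot_cv to target_cv."""
    return int(round(pilot_bands * (pilot_cv / target_cv) ** 2))

# Example: a pilot release of 500 adults gave CV(S) = 0.25; aim for CV(S) = 0.10.
print(bands_needed(500, 0.25, 0.10))   # -> 3125 bands per year (approximate)
```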
Skylab S-191 spectrometer single spectral scan analysis program. [user manual
NASA Technical Reports Server (NTRS)
Downes, E. L.
1974-01-01
Documentation and user information for the S-191 single spectral scan analysis program are reported. A breakdown of the computational algorithms is supplied, followed by the program listing and examples of sample output. A copy of the flow chart which describes the driver routine in the body of the main program segment is included.
Computer program for design analysis of radial-inflow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1976-01-01
A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation is included. The input required for the execution of all program options is described. Several examples of program output are contained in this section. Sample output includes the radiator performance during ascent, reentry and orbit.
NASA Technical Reports Server (NTRS)
Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.
1973-01-01
This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN 4 language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.
Computer program user's manual for advanced general aviation propeller study
NASA Technical Reports Server (NTRS)
Worobel, R.
1972-01-01
A user's manual is presented for a computer program for predicting the performance (static, flight, and reverse), noise, weight and cost of propellers for advanced general aviation aircraft of the 1980 time period. Complete listings of this computer program with detailed instructions and samples of input and output are included.
CHINESE GRAMMARS AND THE COMPUTER AT THE OHIO STATE UNIVERSITY. PRELIMINARY REPORT.
ERIC Educational Resources Information Center
MEYERS, L.F.; YANG, J.
SAMPLE OUTPUT SENTENCES OF VARIOUS COMIT AND SNOBOL PROGRAMS FOR TESTING A CHINESE GENERATIVE GRAMMAR ARE PRESENTED. THE GRAMMAR CHOSEN FOR EXPERIMENTATION IS A PRELIMINARY VERSION OF A TRANSFORMATIONAL GRAMMAR. ALL OF THE COMIT PROGRAMS AND ONE OF THE SNOBOL PROGRAMS USE A LINEARIZED REPRESENTATION OF TREE STRUCTURES, WITH ADDITIONAL NUMERICAL…
User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, S.B.; Rainey, R.H.
1979-05-01
The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS - Thorex containing a program description, user information, program listing, and sample input and output.
Acoustic Detection Of Loose Particles In Pressure Sensors
NASA Technical Reports Server (NTRS)
Kwok, Lloyd C.
1995-01-01
Particle-impact-noise-detector (PIND) apparatus used in conjunction with computer program analyzing output of apparatus to detect extraneous particles trapped in pressure sensors. PIND tester essentially shaker equipped with microphone measuring noise in pressure sensor or other object being shaken. Shaker applies controlled vibration. Output of microphone recorded and expressed in terms of voltage, yielding history of noise subsequently processed by computer program. Data taken at sampling rate sufficiently high to enable identification of all impacts of particles on sensor diaphragm and on inner surfaces of sensor cavities.
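The detection criteria used by the NASA program are not given in the abstract; the following Python sketch shows one plausible way to flag particle impacts in a sampled PIND microphone record, using an assumed noise-based threshold and dead time. All names and constants here are illustrative.

```python
# Illustrative sketch only; the abstract does not give the actual detection
# criteria.  It flags samples whose absolute voltage exceeds a multiple of the
# background noise level, merging samples closer than a dead time into one
# impact event.
import numpy as np

def detect_impacts(volts, sample_rate_hz, k_sigma=6.0, dead_time_s=1e-3):
    noise = np.std(volts)                       # crude background estimate
    hits = np.flatnonzero(np.abs(volts) > k_sigma * noise)
    events, last_t = [], -np.inf
    for i in hits:
        t = i / sample_rate_hz
        if t - last_t > dead_time_s:            # new impact, not ringing from the last one
            events.append(t)
        last_t = t
    return events                               # impact times in seconds

# Synthetic record: noise plus two impact spikes at 10 ms and 35 ms.
rng = np.random.default_rng(0)
fs = 100_000.0
v = 0.01 * rng.standard_normal(5000)
v[1000] += 1.0
v[3500] += 0.8
print(detect_impacts(v, fs))                    # ~[0.01, 0.035]
```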
NASA Astrophysics Data System (ADS)
Goldbery, R.; Tehori, O.
SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with APPLE 3.3 DOS. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features of SEDPAK include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions, and cumulative log/probability curves. The program also has a module for processing of grain-size frequency data from sieved samples. An additional feature of SEDPAK is the option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.
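The abstract does not state which grain-size statistics SEDPAK reports. As an illustration of the kind of condensation it describes, the sketch below computes the widely used Folk and Ward (1957) graphic mean and inclusive graphic standard deviation from percentiles of a cumulative curve in phi units; the example data are invented.

```python
# Hedged sketch: shows the Folk and Ward (1957) graphic measures computed from
# percentiles of a cumulative distribution in phi units (phi = -log2 of grain
# diameter in mm).  Whether SEDPAK uses these exact statistics is an assumption.
import numpy as np

def folk_ward(phi, cum_percent):
    """Graphic mean and inclusive graphic standard deviation (sorting)."""
    p = lambda q: np.interp(q, cum_percent, phi)   # phi value at the q-th percentile
    phi5, phi16, phi50, phi84, phi95 = (p(q) for q in (5, 16, 50, 84, 95))
    mean = (phi16 + phi50 + phi84) / 3.0
    sorting = (phi84 - phi16) / 4.0 + (phi95 - phi5) / 6.6
    return mean, sorting

# Example cumulative curve for a fine sand sample (phi increasing, percent finer).
phi = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
cum = np.array([2.0, 10.0, 30.0, 55.0, 80.0, 95.0, 100.0])
print(folk_ward(phi, cum))
```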
Extrapolation of sonic boom pressure signatures by the waveform parameter method
NASA Technical Reports Server (NTRS)
Thomas, C. L.
1972-01-01
The waveform parameter method of sonic boom extrapolation is derived and shown to be equivalent to the F-function method. A computer program based on the waveform parameter method is presented and discussed, with a sample case demonstrating program input and output.
DOE-2 sample run book: Version 2.1E
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.
1993-11-01
The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single-family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.
User's guide for a large signal computer model of the helical traveling wave tube
NASA Technical Reports Server (NTRS)
Palmer, Raymond W.
1992-01-01
The use of a successful large-signal, two-dimensional (axisymmetric), deformable-disk computer model of the helical traveling wave tube amplifier, an extensively revised and operationally simplified version, is described. We also discuss program input and output and the auxiliary files necessary for operation. Included is a sample problem with its input data and output results. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.
Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Dyba, G. J.
1980-01-01
The user's guide for the SPHERBEAN computer program for prediction of the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings is presented. The material presented is structured to guide the user in the practical and correct implementation of SPHERBEAN. Input and output, guidelines for program use, and sample executions are detailed.
Free-Radical Polymerization Using the Rotating-Sector Method.
ERIC Educational Resources Information Center
Moss, Stephen J.
1982-01-01
Discusses principles of a particular approach in teaching elementary kinetics of polymerization. Although the treatment discussed is more difficult for students to grasp, problems may be reduced using a computer program. The program, written in Applesoft Basic, is available from the author together with sample output. (JN)
User's manual for three-dimensional analysis of propeller flow fields
NASA Technical Reports Server (NTRS)
Chaussee, D. S.; Kutler, P.
1983-01-01
A detailed operating manual is presented for the prop-fan computer code (in addition to supporting programs) recently developed by Kutler, Chaussee, Sorenson, and Pulliam while at NASA's Ames Research Center. This code solves the inviscid Euler equations using an implicit numerical procedure developed by Beam and Warming of Ames. A description of the underlying theory, numerical techniques, and boundary conditions with equations, formulas, and methods for the mesh generation program (MGP), three dimensional prop-fan flow field program (3DPFP), and data reduction program (DRP) is provided, together with complete operating instructions. In addition, a programmer's manual is also provided to assist the user interested in modifying the codes. Included in the programmer's manual for each program is a description of the input and output variables, flow charts, program listings, sample input and output data, and operating hints.
Shuttle Data Center File-Processing Tool in Java
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Miller, Walter H.
2006-01-01
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular-expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
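The tool itself is written in Java; the Python sketch below only illustrates the two operations named in the abstract, selecting archive files by regular expression and interleaving their time-stamped samples into one time-ordered stream. The directory layout, record format, and CSV output are assumptions for the example.

```python
# Sketch of the core idea rather than the actual tool: pick archive files whose
# names match a regular expression, then interleave their time-stamped samples
# into a single time-ordered stream.  File names and the (time, parameter,
# value) CSV record layout are illustrative assumptions.
import csv
import heapq
import re
from pathlib import Path

def read_samples(path):
    """Yield (time, parameter, value) tuples from one archive file (assumed CSV)."""
    with open(path, newline="") as f:
        for t, name, value in csv.reader(f):
            yield (float(t), name, float(value))

def interleave(archive_dir, pattern, out_path):
    files = [p for p in Path(archive_dir).iterdir() if re.search(pattern, p.name)]
    streams = [read_samples(p) for p in files]
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        # heapq.merge assumes each input stream is already time-ordered.
        for t, name, value in heapq.merge(*streams):
            writer.writerow([t, name, value])

# Example call (paths are hypothetical):
# interleave("sdc_archive", r"^MSID_.*\.csv$", "merged.csv")
```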
REST: a computer system for estimating logging residue by using the line-intersect method
A. Jeff Martin
1975-01-01
A computer program was designed to accept logging-residue measurements obtained by line-intersect sampling and transform them into summaries useful for the land manager. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
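The report's density constants and the exact form of its diurnal correction are not given in the abstract; the sketch below shows a rotating spherical exponential model of the general kind described, with an illustrative cosine diurnal factor and placeholder constants.

```python
# Hedged sketch of a density model of the kind described: a spherically
# symmetric exponential atmosphere with a superposed diurnal variation that
# peaks toward the Sun-following bulge.  The constants and the form of the
# diurnal factor are illustrative assumptions, not the report's values.
import math

def density(alt_km, angle_from_bulge_rad,
            rho0=3.6e-10, h0_km=200.0, scale_height_km=37.0, diurnal_amp=0.3):
    """Atmospheric density in kg/m^3 at altitude alt_km."""
    base = rho0 * math.exp(-(alt_km - h0_km) / scale_height_km)
    diurnal = 1.0 + diurnal_amp * math.cos(angle_from_bulge_rad)
    return base * diurnal

print(density(300.0, 0.0))          # near the diurnal bulge
print(density(300.0, math.pi))      # opposite the bulge
```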
A computer program for simulating geohydrologic systems in three dimensions
Posson, D.R.; Hearne, G.A.; Tracy, J.V.; Frenzel, P.F.
1980-01-01
This document is directed toward individuals who wish to use a computer program to simulate ground-water flow in three dimensions. The strongly implicit procedure (SIP) numerical method is used to solve the set of simultaneous equations. New data processing techniques and program input and output options are emphasized. The aquifer system to be modeled may be heterogeneous and anisotropic, and may include both artesian and water-table conditions. Systems which consist of well defined alternating layers of highly permeable and poorly permeable material may be represented by a sequence of equations for two-dimensional flow in each of the highly permeable units. Boundaries where head or flux is user-specified may be irregularly shaped. The program also allows the user to represent streams as limited-source boundaries when the streamflow is small in relation to the hydraulic stress on the system. The data-processing techniques relating to 'cube' input and output, to swapping of layers, to restarting of simulation, to free-format NAMELIST input, to the details of each subroutine's logic, and to the overlay program structure are discussed. The program is capable of processing large models that might overflow computer memories with conventional programs. Detailed instructions for selecting program options, for initializing the data arrays, for defining 'cube' output lists and maps, and for plotting hydrographs of calculated and observed heads and/or drawdowns are provided. Output may be restricted to those nodes of particular interest, thereby reducing the volumes of printout for modelers, which may be critical when working at remote terminals. 'Cube' input commands allow the modeler to set aquifer parameters and initialize the model with very few input records. Appendixes provide instructions to compile the program, definitions and cross-references for program variables, a summary of the FLECS structured FORTRAN programming language, listings of the FLECS and FORTRAN source code, and samples of input and output for example simulations. (USGS)
Interim user's manual for boundary layer integral matrix procedure, version J
NASA Technical Reports Server (NTRS)
Evans, R. M.; Morse, H. L.
1974-01-01
A computer program for analyzing two dimensional and axisymmetric nozzle performance with a variety of wall boundary conditions is described. The program has been developed for application to rocket nozzle problems. Several aids to usage of the program and two auxiliary subroutines are provided. Some features of the output are described and three sample cases are included.
NASA Technical Reports Server (NTRS)
Cothran, E. K.
1982-01-01
The computer program written in support of one dimensional analytical approach to thermal modeling of Bridgman type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first time user.
NASA Technical Reports Server (NTRS)
Miner, E. W.; Anderson, E. C.; Lewis, C. H.
1971-01-01
A computer program is described in detail for laminar, transitional, and/or turbulent boundary-layer flows of non-reacting (perfect gas) and reacting gas mixtures in chemical equilibrium. An implicit finite difference scheme was developed for both two dimensional and axisymmetric flows over bodies, and in rocket nozzles and hypervelocity wind tunnel nozzles. The program, program subroutines, variables, and input and output data are described. Also included is the output from a sample calculation of fully developed turbulent, perfect gas flow over a flat plate. Input data coding forms and a FORTRAN source listing of the program are included. A method is discussed for obtaining thermodynamic and transport property data which are required to perform boundary-layer calculations for reacting gases in chemical equilibrium.
Simulated trajectories error analysis program, version 2. Volume 2: Programmer's manual
NASA Technical Reports Server (NTRS)
Vogt, E. D.; Adams, G. L.; Working, M. M.; Ferguson, J. B.; Bynum, M. R.
1971-01-01
A series of three computer programs for the mathematical analysis of navigation and guidance of lunar and interplanetary trajectories was developed. All three programs require the integration of n-body trajectories for both interplanetary and lunar missions. The virtual mass technique is used in all three programs. The user's manual contains the information necessary to operate the programs. The input and output quantities of the programs are described. Sample cases are given and discussed.
NASA Technical Reports Server (NTRS)
Wie, Yong-Sun
1990-01-01
This user's manual contains a complete description of the computer programs developed to calculate three-dimensional, compressible, laminar boundary layers for perfect gas flow on general fuselage shapes. These programs include the 3-D boundary layer program (3DBLC), the body-oriented coordinate program (BCC), and the streamline coordinate program (SCC). Subroutine description, input, output and sample case are discussed. The complete FORTRAN listings of the computer programs are given.
Orzol, Leonard L.; McGrath, Timothy S.
1992-01-01
This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring data. Programs such as GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC, avoids the two translation steps and transfers data directly to and from the ground-water-flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.
A portable hypergolic oxidizer vapor sensor for NASA's Space Shuttle program
NASA Technical Reports Server (NTRS)
Helms, W. R.
1978-01-01
The design and performance characteristics of an electrochemical NO2 sensor selected by NASA for the space shuttle program are described. The instrument consists of a sample pump, an electrochemical cell, and control and display electronics. The pump pushes the sample through the electrochemical cell where the vapors are analyzed and an output proportional to the NO2 concentration is produced. The output is displayed on a panel meter and is also available at a recorder jack. The electrochemical cell is made up of a polypropylene chamber covered with teflon membrane faceplates. Platinum electrodes are bonded to the faceplates, and the sensing and counter electrodes are potentiostatically controlled at -200 mV with respect to the reference electrode. The cell is filled with electrolyte, consisting of 13.5 cc of a 23% solution of KOH.
NASA Technical Reports Server (NTRS)
Sulyma, P. R.; Mcanally, J. V.
1975-01-01
The streamline divergence program was developed to demonstrate the capability to trace inviscid surface streamlines and to calculate outflow-corrected laminar and turbulent convective heating rates on surfaces subjected to exhaust plume impingement. The analytical techniques used in formulating this program are discussed. A brief description of the streamline divergence program is given along with a user's guide. The program input and output for a sample case are also presented.
An automated program for reinforcement requirements for openings in cylindrical pressure vessels
NASA Technical Reports Server (NTRS)
Wilson, J. F.; Taylor, J. T.
1975-01-01
An automated interactive program for calculating the reinforcement requirements for openings in cylindrical pressure vessels subjected to internal pressure is described. The program is written for an electronic desk top calculator. The program calculates the required area of reinforcement for a given opening and compares this value with the area of reinforcement provided by a proposed design. All program steps, operating instructions, and example problems with input and sample output are documented.
State criminal justice telecommunications (STACOM). Volume 4: Network design software user's guide
NASA Technical Reports Server (NTRS)
Lee, J. J.
1977-01-01
A user's guide to the network design program is presented. The program is written in FORTRAN V and implemented on a UNIVAC 1108 computer under the EXEC-8 operating system which enables the user to construct least-cost network topologies for criminal justice digital telecommunications networks. A complete description of program features, inputs, processing logic, and outputs is presented, and a sample run and a program listing are included.
NASA Technical Reports Server (NTRS)
Mitchell, C. E.; Eckert, K.
1979-01-01
A program for predicting the linear stability of liquid propellant rocket engines is presented. The underlying model assumptions and analytical steps necessary for understanding the program and its input and output are also given. The rocket engine is modeled as a right circular cylinder with an injector with a concentrated combustion zone, a nozzle, finite mean flow, and an acoustic admittance, or the sensitive time lag theory. The resulting partial differential equations are combined into two governing integral equations by the use of the Green's function method. These equations are solved using a successive approximation technique for the small amplitude (linear) case. The computational method used as well as the various user options available are discussed. Finally, a flow diagram, sample input and output for a typical application and a complete program listing for program MODULE are presented.
STABCAR: A program for finding characteristic root systems having transcendental stability matrices
NASA Technical Reports Server (NTRS)
Adams, W. M., Jr.; Tiffany, S. H.; Newsom, J. R.; Peele, E. L.
1984-01-01
STABCAR can be used to determine the characteristic roots of flexible, actively controlled aircraft, including the effects of unsteady aerodynamics. A modal formulation and a transfer-matrix representation of the control system are employed. Operable in either a batch or an interactive mode, STABCAR can provide graphical or tabular output of the variation of the roots with velocity, density, altitude, dynamic pressure or feedback gains. Herein the mathematical model, program structure, input requirements, output capabilities, and a series of sample cases are detailed. STABCAR was written for use on CDC CYBER 175 equipment; modification would be required for operation on other machines.
TWINTAN: A program for transonic wall interference assessment in two-dimensional wind tunnels
NASA Technical Reports Server (NTRS)
Kemp, W. B., Jr.
1980-01-01
A method for assessing the wall interference in transonic two-dimensional wind tunnel tests was developed and implemented in a computer program. The method involves three successive solutions of the transonic small disturbance potential equation to define the wind tunnel flow, the perturbation attributable to the model, and the equivalent free air flow around the model. Input includes pressure distributions on the model and along the top and bottom tunnel walls, which are used as boundary conditions for the wind tunnel flow. The wall-induced perturbation field is determined as the difference between the perturbation in the tunnel flow solution and the perturbation attributable to the model. The methodology used in the program is described and detailed descriptions of the computer program input and output are presented. Input and output for a sample case are given.
NASA Technical Reports Server (NTRS)
Huffman, S.
1977-01-01
Detailed instructions on the use of two computer-aided-design programs for designing the energy storage inductor for single winding and two winding dc to dc converters are provided. Step by step procedures are given to illustrate the formatting of user input data. The procedures are illustrated by eight sample design problems which include the user input and the computer program output.
ERIC Educational Resources Information Center
Murtha, Judith Rush
The purpose of this study was to write a computer program that would not only output a color pattern weave to a cathode ray tube (CRT), but would also analyze a painted design and output a printed diagram that would show how to set up a loom in order to produce the woven design. The first of seven chapters describes the problem and the intent of…
Digital Fingerprinting of Field Programmable Gate Arrays
2008-03-01
Front-matter excerpt only: the appendices cover transitional sampling outputs, VHDL entities, and tables of FPGA outputs by sample and clock cycle.
TWINTN4: A program for transonic four-wall interference assessment in two-dimensional wind tunnels
NASA Technical Reports Server (NTRS)
Kemp, W. B., Jr.
1984-01-01
A method for assessing the wall interference in transonic two-dimensional wind tunnel tests including the effects of the tunnel sidewall boundary layer was developed and implemented in a computer program named TWINTN4. The method involves three successive solutions of the transonic small disturbance potential equation to define the wind tunnel flow, the equivalent free air flow around the model, and the perturbation attributable to the model. Required input includes pressure distributions on the model and along the top and bottom tunnel walls which are used as boundary conditions for the wind tunnel flow. The wall-induced perturbation field is determined as the difference between the perturbation in the tunnel flow solution and the perturbation attributable to the model. The methodology used in the program is described and detailed descriptions of the computer program input and output are presented. Input and output for a sample case are given.
User's manual for SYNC: A FORTRAN program for merging and time-synchronizing data
NASA Technical Reports Server (NTRS)
Maine, R. E.
1981-01-01
The FORTRAN 77 computer program SYNC for merging and time synchronizing data is described. The program SYNC reads one or more input files which contain either synchronous data frames or time-tagged data points, which can be compressed. The program decompresses and time synchronizes the data, correcting for any channel time skews. Interpolation and hold last value synchronization algorithms are available. The output from SYNC is a file of time synchronized data frames at any requested sample rate.
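SYNC itself is FORTRAN 77; the Python sketch below illustrates the two synchronization rules the abstract names, linear interpolation and hold-last-value, applied to one time-tagged channel after removing an assumed time skew. Variable names and the skew sign convention are choices made for the example.

```python
# Illustrative sketch (the actual program is FORTRAN 77): resample one
# time-tagged channel onto a requested output time base, correcting a known
# channel time skew, with either linear interpolation or a hold-last-value rule.
import numpy as np

def synchronize(t, y, t_out, skew=0.0, method="interpolate"):
    """Return channel y, originally sampled at times t, at the output times t_out."""
    t_corr = np.asarray(t) - skew          # remove the channel's time skew
    y = np.asarray(y)
    if method == "interpolate":
        return np.interp(t_out, t_corr, y)
    # hold-last-value: use the most recent sample at or before each output time
    idx = np.searchsorted(t_corr, t_out, side="right") - 1
    return y[np.clip(idx, 0, len(y) - 1)]

t = [0.00, 0.11, 0.19, 0.31]                # irregular, skewed time tags (s)
y = [1.0, 2.0, 3.0, 4.0]
t_out = np.arange(0.0, 0.30, 0.05)          # requested 20 samples/s output frame times
print(synchronize(t, y, t_out, skew=0.01, method="interpolate"))
print(synchronize(t, y, t_out, skew=0.01, method="hold"))
```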
NASA Technical Reports Server (NTRS)
Srivastava, R.; Reddy, T. S. R.
1997-01-01
The program DuctE3D is used for steady or unsteady aerodynamic and aeroelastic analysis of ducted fans. This guide describes the input data required and the output files generated, in using DuctE3D. The analysis solves three dimensional unsteady, compressible Euler equations to obtain the aerodynamic forces. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either the time domain or the frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis and aeroelastic analysis of an isolated fan row.
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III
1994-01-01
NASA Langley Research Center has, for several years, conducted research in the area of time-correlated gust loads for linear and nonlinear aircraft. The results of this work led NASA to recommend that the Matched-Filter-Based One-Dimensional Search Method be used for gust load analyses of nonlinear aircraft. This manual describes this method, describes a FORTRAN code which performs this method, and presents example calculations for a sample nonlinear aircraft model. The name of the code is MFD1DS (Matched-Filter-Based One-Dimensional Search). The program source code, the example aircraft equations of motion, a sample input file, and a sample program output are all listed in the appendices.
Electron microprobe analysis program for biological specimens: BIOMAP
NASA Technical Reports Server (NTRS)
Edwards, B. F.
1972-01-01
BIOMAP is a Univac 1108 compatible program which facilitates the electron probe microanalysis of biological specimens. Input data are X-ray intensity data from biological samples, the X-ray intensity and composition data from a standard sample and the electron probe operating parameters. Outputs are estimates of the weight percentages of the analyzed elements, the distribution of these estimates for sets of red blood cells and the probabilities for correlation between elemental concentrations. An optional feature statistically estimates the X-ray intensity and residual background of a principal standard relative to a series of standards.
Users guide: Steady-state aerodynamic-loads program for shuttle TPS tiles
NASA Technical Reports Server (NTRS)
Kerr, P. A.; Petley, D. H.
1984-01-01
A user's guide for the computer program that calculates the steady-state aerodynamic loads on the Shuttle thermal-protection tiles is presented. The main element in the program is MITAS-II, the Martin Marietta Interactive Thermal Analysis System. MITAS-II is used to calculate the mass flow in a nine-tile model designed to simulate conditions during a Shuttle flight. The procedures used to execute the program using the MITAS-II software are described. A list of the necessary software and data files, along with a brief description of their functions, is given. The format of the data file containing the surface pressure data is specified. The interpolation techniques used to calculate the pressure profile over the tile matrix are briefly described. In addition, the output from a sample run is explained. The actual output and the procedure file used to execute the program at NASA Langley Research Center on a CDC CYBER-175 are provided in the appendices.
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
NASA Technical Reports Server (NTRS)
Brauer, G. L.; Cornick, D. E.; Stevenson, R.
1977-01-01
The capabilities and applications of the three-degree-of-freedom (3DOF) version and the six-degree-of-freedom (6DOF) version of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies basic capabilities, input procedures, applications, and computer requirements of these programs. The information will enable prospective users to evaluate the programs and to determine if they are applicable to their problems. Enough information is given to enable managerial personnel to evaluate the capabilities of the programs. The document describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements, and also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.
NASA Technical Reports Server (NTRS)
Masters, P. A.
1974-01-01
An analysis to predict the pressurant gas requirements for the discharge of cryogenic liquid propellants from storage tanks is presented, along with an algorithm and two computer programs. One program deals with the pressurization (ramp) phase of bringing the propellant tank up to its operating pressure. The method of analysis involves a numerical solution of the temperature and velocity functions for the tank ullage at a discrete set of points in time and space. The input requirements of the program are the initial ullage conditions, the initial temperature and pressure of the pressurant gas, and the time for the expulsion or the ramp. Computations are performed which determine the heat transfer between the ullage gas and the tank wall. Heat transfer to the liquid interface and to the hardware components may be included in the analysis. The program output includes predictions of mass of pressurant required, total energy transfer, and wall and ullage temperatures. The analysis, the algorithm, a complete description of input and output, and the FORTRAN 4 program listings are presented. Sample cases are included to illustrate use of the programs.
The drivers of facility-based immunization performance and costs. An application to Moldova.
Maceira, Daniel; Goguadze, Ketevan; Gotsadze, George
2015-05-07
This paper identifies factors that affect the cost and performance of the routine immunization program in Moldova through an analysis of facility-based data collected as part of a multi-country costing and financing study of routine immunization (EPIC). A nationally representative sample of health care facilities (50) was selected through multi-stage, stratified random sampling. Data on inputs, unit prices, and facility outputs were collected from October 3, 2012, to January 14, 2013, using a pre-tested structured questionnaire. Ordinary least squares (OLS) regression analysis was performed to determine factors affecting facility outputs (number of doses administered and fully immunized children) and explaining variation in total facility costs. The study found that the number of working hours, vaccine wastage rates, and whether or not a doctor worked at a facility (among other factors) were positively and significantly associated with output levels. In addition, the level of output, the price of inputs, and the share of the population with university education were significantly associated with higher facility costs. A 1% increase in the number of fully immunized children would increase total cost by 0.7%. Few costing studies of primary health care services in developing countries evaluate the drivers of performance and cost. This exercise attempted to fill that knowledge gap and helped to identify organizational and managerial factors at the primary care, district, and national levels that could be addressed by improved program management aimed at improved performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
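The paper's exact regression specification is not reproduced in the abstract, so the sketch below only illustrates the kind of log-log OLS model from which an output elasticity of total cost (such as the reported roughly 0.7% cost increase per 1% more fully immunized children) would be read off. The data are synthetic and the covariates are placeholders.

```python
# Hedged sketch: a log-log OLS cost model whose output coefficient is read as
# an elasticity.  Data and covariates are synthetic stand-ins, not EPIC data.
import numpy as np

rng = np.random.default_rng(1)
n = 50                                            # facilities
fic = rng.uniform(50, 400, n)                     # fully immunized children per facility
wage = rng.uniform(0.8, 1.2, n)                   # input-price index
log_cost = 3.0 + 0.7 * np.log(fic) + 0.5 * np.log(wage) + 0.1 * rng.standard_normal(n)

# OLS on the log-log model: log(cost) = b0 + b1*log(FIC) + b2*log(wage)
X = np.column_stack([np.ones(n), np.log(fic), np.log(wage)])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
print("estimated output elasticity of cost:", round(beta[1], 3))   # ~0.7
```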
Programming a Detector Emulator on NI's FlexRIO Platform
NASA Astrophysics Data System (ADS)
Gervais, Michelle; Crawford, Christopher; Sprow, Aaron; Nab Collaboration
2017-09-01
Recently digital detector emulators have been on the rise as a means to test data acquisition systems and analysis toolkits from a well understood data set. National Instruments' PXIe-7962R FPGA module and Active Technologies AT-1212 DAC module provide a customizable platform for analog output. Using a graphical programming language, we have developed a system capable of producing two time-correlated channels of analog output which sample unique amplitude spectra to mimic nuclear physics experiments. This system will be used to model the Nab experiment, in which a prompt beta decay electron is followed by a slow proton according to a defined time distribution. We will present the results of our work and discuss further development potential. DOE under Contract DE-SC0008107.
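The emulator itself runs on NI FPGA/DAC hardware; the Python sketch below only illustrates the event structure described, a prompt electron followed by a delayed proton, with amplitudes drawn from user-supplied spectra and the delay drawn from a chosen time distribution. The spectra and the exponential delay constant are placeholder assumptions.

```python
# Illustrative sketch (the emulator itself is LabVIEW/FPGA code): generate two
# time-correlated event lists, a prompt electron followed by a delayed proton,
# with amplitudes drawn from binned spectra.  Spectra and the assumed
# exponential delay constant are placeholders.
import numpy as np

rng = np.random.default_rng(42)

def sample_spectrum(edges, weights, n):
    """Draw n amplitudes from a binned spectrum (bin edges + relative weights)."""
    bins = rng.choice(len(weights), size=n, p=np.asarray(weights) / np.sum(weights))
    return rng.uniform(edges[bins], edges[bins + 1])

n_events = 5
t_electron = np.sort(rng.uniform(0.0, 1.0, n_events))          # event times (s)
t_proton = t_electron + rng.exponential(13e-6, n_events)       # assumed mean delay 13 us
a_electron = sample_spectrum(np.array([0.0, 0.2, 0.5, 0.8]), [1, 3, 2], n_events)
a_proton = sample_spectrum(np.array([0.0, 0.1, 0.3]), [2, 1], n_events)

for row in zip(t_electron, a_electron, t_proton, a_proton):
    print(row)
```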
DITTY - a computer program for calculating population dose integrated over ten thousand years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.; Peloquin, R.A.; Strenge, D.L.
The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
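As a minimal illustration of the quantity DITTY reports, the sketch below integrates an assumed collective dose-rate history over the ten-thousand-year window with the trapezoidal rule; the real code constructs that history from release, transport, and pathway models not reproduced here, and the decay-like curve used below is purely illustrative.

```python
# Minimal sketch of the time-integrated collective dose, assuming a dose-rate
# history is already in hand; the curve and units are illustrative only.
import numpy as np

years = np.linspace(0.0, 10_000.0, 10_001)                 # annual time grid
dose_rate = 50.0 * np.exp(-np.log(2) * years / 5730.0)     # person-rem/yr, decay-like shape

# trapezoidal rule written out explicitly
collective_dose = np.sum(0.5 * (dose_rate[1:] + dose_rate[:-1]) * np.diff(years))
print(f"integrated collective dose over 10,000 years: {collective_dose:.3e} person-rem")
```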
Paranoia.Ada: Sample output reports
NASA Technical Reports Server (NTRS)
1986-01-01
Paranoia.Ada is a program to diagnose floating point arithmetic in the context of the Ada programming language. The program evaluates the quality of a floating point arithmetic implementation with respect to the proposed IEEE Standards P754 and P854. Paranoia.Ada is derived from the original BASIC programming language version of Paranoia. Paranoia.Ada replicates in Ada the test algorithms originally implemented in BASIC and adheres to the evaluation criteria established by W. M. Kahan. Paranoia.Ada incorporates a major structural redesign and employs applicable Ada architectural and stylistic features.
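Paranoia's full test suite is far more thorough and is written in Ada; the short Python sketch below is only in its spirit, probing the radix and rounding unit of the host floating-point arithmetic by computation alone.

```python
# A tiny sketch in the spirit of Paranoia's first tests (not the Ada program):
# determine the radix and the rounding unit of the host arithmetic.
def find_radix():
    a = 1.0
    while (a + 1.0) - a == 1.0:     # grow a until adding 1.0 no longer registers
        a *= 2.0
    b = 1.0
    while (a + b) == a:             # smallest b that makes a difference is the radix
        b += 1.0
    return int((a + b) - a)

def find_epsilon():
    eps = 1.0
    while 1.0 + eps / 2.0 != 1.0:   # halve until adding it no longer changes 1.0
        eps /= 2.0
    return eps

print("radix:", find_radix())       # 2 on IEEE 754 hardware
print("epsilon:", find_epsilon())   # ~2.22e-16 for double precision
```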
User's guide to resin infusion simulation program in the FORTRAN language
NASA Technical Reports Server (NTRS)
Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.
1992-01-01
RTMCL is a user-friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and how to enter and modify the input data set. Sample input and output files are included along with an explanation of the results. Finally, a complete listing of the program is provided.
NASA Technical Reports Server (NTRS)
Barbero, P.; Chin, J.
1973-01-01
The theoretical derivation of the set of equations is discussed which is applicable to modeling the dynamic characteristics of aeroelastically-scaled models flown on the two-cable mount system in a 16 ft transonic dynamics tunnel. The computer program provided for the analysis is also described. The program calculates model trim conditions as well as 3 DOF longitudinal and lateral/directional dynamic conditions for various flying cable and snubber cable configurations. Sample input and output are included.
NASA Technical Reports Server (NTRS)
Gaugler, R. E.
1978-01-01
A computer program to calculate transient and steady state temperatures, pressures, and coolant flows in a cooled, axial flow turbine blade or vane with an impingement insert is described. Coolant side heat transfer coefficients are calculated internally in the program, with the user specifying either impingement or convection heat transfer at each internal flow station. Spent impingement air flows in a chordwise direction and is discharged through the trailing edge and through film cooling holes. The ability of the program to handle film cooling is limited by the internal flow model. Sample problems, with tables of input and output, are included in the report. Input to the program includes a description of the blade geometry, coolant supply conditions, outside thermal boundary conditions, and wheel speed. The blade wall can have two layers of different materials, such as a ceramic thermal barrier coating over a metallic substrate. Program output includes the temperature at each node, the coolant pressures and flow rates, and the inside heat-transfer coefficients.
Prototype Input and Output Data Elements for the Occupational Health and Safety Information System
NASA Technical Reports Server (NTRS)
Whyte, A. A.
1980-01-01
The National Aeronautics and Space Administration plans to implement a NASA-wide computerized information system for occupational health and safety. The system is necessary to administer the occupational health and safety programs and to meet the legal and regulatory reporting, recordkeeping, and surveillance requirements. Some of the potential data elements that NASA will require as input and output for the new occupational health and safety information system are illustrated. The data elements are shown on sample forms that have been compiled from various sources, including NASA Centers and industry.
Tip vortex computer code SRATIP. User's guide
NASA Technical Reports Server (NTRS)
Levy, R.; Lin, S. J.
1985-01-01
This User's Guide applies to the three dimensional viscous flow forward marching analysis, PEPSIG, as used for the calculation of the helicopter tip vortex flow field. The guide presents a discussion of the program flow and subroutines, as well as a list of sample input and output.
NASA Technical Reports Server (NTRS)
Arbuckle, P. D.; Sliwa, S. M.; Roy, M. L.; Tiffany, S. H.
1985-01-01
A computer program for interactively developing least-squares polynomial equations to fit user-supplied data is described. The program is characterized by the ability to compute the polynomial equations of a surface fit through data that are a function of two independent variables. The program utilizes the Langley Research Center graphics packages to display polynomial equation curves and data points, facilitating a qualitative evaluation of the effectiveness of the fit. An explanation of the fundamental principles and features of the program is included, as well as sample input and corresponding output.
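As an illustration of the kind of fit the program performs (the program itself is an interactive Langley graphics code), the sketch below builds a design matrix of monomial terms in the two independent variables and solves the least-squares problem with numpy; the degree and the synthetic test surface are arbitrary choices.

```python
# Sketch of a least-squares polynomial surface z = f(x, y): a design matrix of
# monomial terms solved with numpy's least-squares routine.  The degree and
# the test function are illustrative choices, not the program's defaults.
import numpy as np

def fit_poly_surface(x, y, z, degree=2):
    """Return coefficients of sum_{i+j<=degree} c_ij * x^i * y^j, plus the term list."""
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs, terms

# Scattered data from a known quadratic surface plus noise.
rng = np.random.default_rng(7)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
z = 1.0 + 2.0 * x - 3.0 * y + 0.5 * x * y + 0.05 * rng.standard_normal(200)

coeffs, terms = fit_poly_surface(x, y, z)
for (i, j), c in zip(terms, coeffs):
    print(f"x^{i} y^{j}: {c:+.3f}")
```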
Wexler, Eliezer J.
1992-01-01
Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems having uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of selected solutions, source codes for the computer programs, and samples of program input and output also are included.
Policy Information System Computer Program.
ERIC Educational Resources Information Center
Hamlin, Roger E.; And Others
The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…
The focal plane reception pattern calculation for a paraboloidal antenna with a nearby fence
NASA Technical Reports Server (NTRS)
Schmidt, Richard F.; Cheng, Hwai-Soon; Kao, Michael W.
1987-01-01
A computer simulation program is described which is used to estimate the effects of a proximate diffraction fence on the performance of paraboloidal antennas. The computer program is written in FORTRAN. The physical problem, mathematical formulation, and coordinate references are described. The main control structure of the program and the function of the individual subroutines are discussed. The Job Control Language setup and program instructions are provided in the user's instructions to help users execute the program. A sample problem with an appropriate output listing is included as an illustration of the usage of the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barieau, R.E.
1977-03-01
The PROP Program of Wilson and Lissaman has been modified by adding the Newton-Raphson Method and a Step Wise Search Method, as options for the method of solution. In addition, an optimization method is included. Twist angles, tip speed ratio and the pitch angle may be varied to produce maximum power coefficient. The computer program listing is presented along with sample input and output data. Further improvements to the program are discussed.
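For readers unfamiliar with the solution option mentioned above, a generic Newton-Raphson iteration on a scalar residual can be sketched as follows (Python; the residual function, tolerances, and finite-difference derivative are illustrative assumptions, not the PROP formulation).

    # Hedged sketch: Newton-Raphson solution of a scalar residual f(x) = 0.
    def newton_raphson(f, x0, tol=1e-10, max_iter=50, h=1e-7):
        """Solve f(x) = 0 with a finite-difference Newton-Raphson iteration."""
        x = x0
        for _ in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:
                return x
            dfdx = (f(x + h) - fx) / h      # numerical derivative
            x -= fx / dfdx
        return x

    # Illustrative residual: x**3 - 2x - 5 = 0 (root near 2.0946)
    print(newton_raphson(lambda x: x**3 - 2 * x - 5, x0=2.0))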
NASA Technical Reports Server (NTRS)
Cassarino, S.; Sopher, R.
1982-01-01
User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are provided. Interfaces with external programs are discussed. The procedure of synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented. Detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the available plotting capabilities. Instructions to install and execute SIMVIB on the CDC computer system are provided.
A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0
NASA Technical Reports Server (NTRS)
DeChant, Lawrence J.; Nadell, Shari-Beth
1999-01-01
A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.
SOFIP: A Short Orbital Flux Integration Program
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.; Hebert, J. J.; Butler, E. L.; Barth, J. L.
1979-01-01
A computer code was developed to evaluate the space radiation environment encountered by geocentric satellites. The Short Orbital Flux Integration Program (SOFIP) is a compact routine of modular compositions, designed mostly with structured programming techniques in order to provide core and time economy and ease of use. The program in its simplest form produces for a given input trajectory a composite integral orbital spectrum of either protons or electrons. Additional features are available separately or in combination with the inclusion of the corresponding (optional) modules. The code is described in detail, and the function and usage of the various modules are explained. A program listing and sample outputs are attached.
3D TRUMP - A GBI launch window tool
NASA Astrophysics Data System (ADS)
Karels, Steven N.; Hancock, John; Matchett, Gary
3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. It is a computationally efficient analysis tool which provides key GPS-based performance measures for an entire GBI mission's reentry vehicle and interceptor trajectories. Algorithms and sample outputs are presented.
NASA Technical Reports Server (NTRS)
Dash, S. M.; Pergament, H. S.
1978-01-01
The basic code structure is discussed, including the overall program flow and a brief description of all subroutines. Instructions on the preparation of input data, definitions of key FORTRAN variables, sample input and output, and a complete listing of the code are presented.
User's Guide to the Stand Prognosis Model
William R. Wykoff; Nicholas L. Crookston; Albert R. Stage
1982-01-01
The Stand Prognosis Model is a computer program that projects the development of forest stands in the Northern Rocky Mountains. Thinning options allow for simulation of a variety of management strategies. Input consists of a stand inventory, including sample tree records, and a set of option selection instructions. Output includes data normally found in stand, stock,...
Wexler, Eliezer J.
1989-01-01
Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented in this report for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems with uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of select solutions, source codes for the computer programs, and samples of program input and output also are included.
A computer program for automated flutter solution and matched point determination
NASA Technical Reports Server (NTRS)
Bhatia, K. G.
1973-01-01
The use of a digital computer program (MATCH) for automated determination of the flutter velocity and the matched-point flutter density is described. The program is based on the use of the modified Laguerre iteration formula to converge to a flutter crossing or a matched-point density. A general description of the computer program is included and the purpose of all subroutines used is stated. The input required by the program and various input options are detailed, and the output description is presented. The program can solve flutter equations formulated with up to 12 vibration modes and obtain flutter solutions for up to 10 air densities. The program usage is illustrated by a sample run, and the FORTRAN program listing is included.
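The sketch below shows a standard (unmodified) Laguerre iteration for locating one root of a polynomial, to illustrate the kind of iteration the program builds on; the polynomial, names, and stopping criteria are assumptions and do not reproduce the report's modified formula.

    # Hedged sketch: classical Laguerre iteration for one polynomial root.
    import cmath
    import numpy as np

    def laguerre_root(coeffs, x0, tol=1e-12, max_iter=100):
        """coeffs are highest-order-first polynomial coefficients (numpy convention)."""
        p = np.poly1d(coeffs)
        dp, d2p = p.deriv(1), p.deriv(2)
        n = p.order
        x = complex(x0)
        for _ in range(max_iter):
            px = p(x)
            if abs(px) < tol:
                return x
            G = dp(x) / px
            H = G * G - d2p(x) / px
            root = cmath.sqrt((n - 1) * (n * H - G * G))
            denom = G + root if abs(G + root) > abs(G - root) else G - root
            x -= n / denom
        return x

    # Illustrative polynomial: (x - 1)(x - 2)(x - 3)
    print(laguerre_root([1, -6, 11, -6], x0=0.5))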
SAI (Systems Applications, Incorporated) Urban Airshed Model. Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schere, K.L.
1985-06-01
This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the SAI Urban Airshed Model (UAM). The UAM is a 3-dimensional gridded air-quality simulation model that is well suited for predicting the spatial and temporal distribution of photochemical pollutant concentrations in an urban area. The model is based on the equations of conservation of mass for a set of reactive pollutants in a turbulent-flow field. To solve these equations, the UAM uses numerical techniques set in a 3-D finite-difference grid array of cells, each about 1 to 10 kilometers wide and 10 to several hundred meters deep. As output, the model provides the calculated pollutant concentrations in each cell as a function of time. The chemical species of prime interest included in the UAM simulations are O3, NO, NO2, and several organic compounds and classes of compounds. The UAM system contains at its core the Airshed Simulation Program that accesses input data consisting of 10 to 14 files, depending on the program options chosen. Each file is created by a separate data-preparation program. There are 17 programs in the entire UAM system. The services of a qualified dispersion meteorologist, a chemist, and a computer programmer will be necessary to implement and apply the UAM and to interpret the results. Software Description: The program is written in the FORTRAN programming language for implementation on a UNIVAC 1110 computer under the UNIVAC 1100 operating system level 38R5A. Memory requirement is 80K.
The 3DGRAPE book: Theory, users' manual, examples
NASA Technical Reports Server (NTRS)
Sorenson, Reese L.
1989-01-01
A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.
An efficient routine for infrared radiative transfer in a cloudy atmosphere
NASA Technical Reports Server (NTRS)
Chou, M. D.; Kouvaris, L.
1981-01-01
A FORTRAN program that calculates the atmospheric cooling rate and infrared fluxes for partly cloudy atmospheres is documented. The IR fluxes in the water bands and the 9.6 and 15 micron bands are calculated at 15 levels ranging from 1.39 mb to the surface. The program is generalized to accept any arbitrary atmospheric temperature and humidity profiles and clouds as input and return the cooling rate and fluxes as output. Sample calculations for various atmospheric profiles and cloud situations are demonstrated.
Second Generation Integrated Composite Analyzer (ICAN) Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.
1993-01-01
This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
Quantitative methods to direct exploration based on hydrogeologic information
Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.
2006-01-01
Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
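For reference, the First-Order Second Moment propagation step mentioned above combines input covariance with model sensitivity; a standard statement of the approximation (notation assumed here, not taken from the paper) is

\Sigma_y \approx J \, \Sigma_x \, J^{\mathsf{T}}, \qquad J_{ij} = \frac{\partial y_i}{\partial x_j},

where \Sigma_x is the covariance of the input parameters, J is the sensitivity (Jacobian) matrix evaluated at the current parameter estimates, and \Sigma_y is the resulting approximate covariance of the model outputs.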
NASA Technical Reports Server (NTRS)
Chen, H. C.; Neback, H. E.; Kao, T. J.; Yu, N. Y.; Kusunose, K.
1991-01-01
This manual explains how to use an Euler based computational method for predicting the airframe/propulsion integration effects for an aft-mounted turboprop transport. The propeller power effects are simulated by the actuator disk concept. This method consists of global flow field analysis and the embedded flow solution for predicting the detailed flow characteristics in the local vicinity of an aft-mounted propfan engine. The computational procedure includes the use of several computer programs performing four main functions: grid generation, Euler solution, grid embedding, and streamline tracing. This user's guide provides information for these programs, including input data preparations with sample input decks, output descriptions, and sample Unix scripts for program execution in the UNICOS environment.
Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
Manual for Program PSTRESS: Peel stress computation
NASA Technical Reports Server (NTRS)
Barkey, Derek A.; Madan, Ram C.
1987-01-01
Described is the use of the interactive FORTRAN computer program PSTRESS, which computes a closed form solution for two bonded plates subjected to applied moments, vertical shears, and in-plane forces. The program calculates in-plane stresses in the plates, deflections of the plates, and peel and shear stresses in the adhesive. The document briefly outlines the analytical method used by PSTRESS, describes the input and output of the program, and presents a sample analysis. The results of the latter are shown to be within a few percent of results obtained using a NASTRAN finite element analysis. An appendix containing a listing of PSTRESS is included.
Bifilar analysis users manual, volume 2
NASA Technical Reports Server (NTRS)
Cassarino, S. J.
1980-01-01
The digital computer program developed to study the vibration response of a coupled rotor/bifilar/airframe system is described. The theoretical development of the rotor/airframe system equations of motion is provided. The fuselage and bifilar absorber equations of motion are discussed. The modular block approach used in the make-up of this computer program is described. The input data needed to run the rotor and bifilar absorber analyses are described. Sample output formats are presented and discussed. The results for four test cases, which exercise the major logic paths of the computer program, are presented. The overall program structure is discussed in detail. The FORTRAN subroutines are described in detail.
DOT National Transportation Integrated Search
1978-01-01
A digital data acquisition system has been designed to meet the need for a long duration noise analysis capability. By sampling the DC outputs from sound level meters, it has been possible to make twenty-four hour or longer recordings, in contrast to...
Reinforced Concrete Beams under Combined Axial and Lateral Loading.
1982-01-01
Golden E. Lane, Jr.
The voltage output from the data acquisition system's digital multimeter was recorded on a floppy disk. The sampling rate was approximately two samples per second for every channel. The same system was used to reduce and plot the data. Figure 9 shows a schematic drawing of the load...
Carbon monoxide measurement in the global atmospheric sampling program
NASA Technical Reports Server (NTRS)
Dudzinski, T. J.
1979-01-01
The carbon monoxide measurement system used in the NASA Global Atmospheric Sampling Program (GASP) is described. The system used a modified version of a commercially available infrared absorption analyzer. The modifications increased the sensitivity of the analyzer to 1 ppmv full scale, with a limit of detectability of 0.02 ppmv. Packaging was modified for automatic, unattended operation in an aircraft environment. The GASP system is described along with analyzer operation, calibration procedures, and measurement errors. Uncertainty of the CO measurement over a 2-year period ranged from ±3 to ±13 percent of reading, plus an error due to random fluctuation of the output signal of ±3 to ±15 ppbv.
NASA Technical Reports Server (NTRS)
Srivastava, R.; Reddy, T. S. R.
1996-01-01
This guide describes the input data required for steady or unsteady aerodynamic and aeroelastic analysis of propellers using PROP3D, and the output files generated. The aerodynamic forces are obtained by solving the three-dimensional unsteady, compressible Euler equations. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either a time domain or frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis of single- and counter-rotation propellers, and aeroelastic analysis of a single-rotation propeller.
NASA Technical Reports Server (NTRS)
Hanson, Donald B.
1994-01-01
A two dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.
Method for Operating a Sensor to Differentiate Between Analytes in a Sample
Kunt, Tekin; Cavicchi, Richard E; Semancik, Stephen; McAvoy, Thomas J
1998-07-28
Disclosed is a method for operating a sensor to differentiate between first and second analytes in a sample. The method comprises the steps of determining an input profile for the sensor which will enhance the difference in the output profiles of the sensor as between the first analyte and the second analyte; determining a first analyte output profile as observed when the input profile is applied to the sensor; determining a second analyte output profile as observed when the temperature profile is applied to the sensor; introducing the sensor to the sample while applying the temperature profile to the sensor, thereby obtaining a sample output profile; and evaluating the sample output profile as against the first and second analyte output profiles to thereby determine which of the analytes is present in the sample.
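The final evaluation step of the method can be illustrated with a minimal sketch (Python): compare the measured sample output profile against stored reference profiles and select the closer one. The profile values, names, and least-squares distance metric are illustrative assumptions, not the patented procedure.

    # Hedged sketch: pick the reference analyte profile closest to the sample profile.
    import numpy as np

    def classify(sample_profile, reference_profiles):
        """Return the name of the reference profile closest (least squares) to the sample."""
        distances = {name: float(np.sum((sample_profile - ref) ** 2))
                     for name, ref in reference_profiles.items()}
        return min(distances, key=distances.get)

    refs = {"analyte_1": np.array([0.1, 0.5, 0.9, 0.4]),
            "analyte_2": np.array([0.2, 0.3, 0.2, 0.8])}
    sample = np.array([0.12, 0.48, 0.85, 0.42])
    print(classify(sample, refs))   # -> analyte_1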
Instruction in Documentation for Computer Programming
ERIC Educational Resources Information Center
Westley, John W.
1976-01-01
In addition to the input/output record format, the program flowchart, the program listing, and the program test output, eight documentation items are suggested in order that they may serve as a base from which to start teaching program documentation. (Author/AG)
ICAN: Integrated composites analyzer
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1984-01-01
The ICAN computer program performs all the essential aspects of mechanics/analysis/design of multilayered fiber composites. Modular, open-ended and user friendly, the program can handle a variety of composite systems having one type of fiber and one matrix as constituents as well as intraply and interply hybrid composite systems. It can also simulate isotropic layers by considering a primary composite system with negligible fiber volume content. This feature is specifically useful in modeling thin interply matrix layers. Hygrothermal conditions and various combinations of in-plane and bending loads can also be considered. Usage of this code is illustrated with a sample input and the generated output. Some key features of output are stress concentration factors around a circular hole, locations of probable delamination, a summary of the laminate failure stress analysis, free edge stresses, microstresses and ply stress/strain influence coefficients. These features make ICAN a powerful, cost-effective tool to analyze/design fiber composite structures and components.
Modeling of processes of formation of the images in optical-electronic systems
NASA Astrophysics Data System (ADS)
Grudin, B. N.; Plotnikov, V. S.; Fischenko, V. K.
2001-08-01
A digital model of a multicomponent coherent optical system with an arbitrary layout of optical elements (lasers, lenses, phototransparencies recording the transmission function of a specimen or a filter, and photoregistrars), constructed using fast algorithms, is considered. The model is implemented as a program for personal computers under Windows 95, 98, and Windows NT. In a simulation of, for example, a coherent system consisting of twenty elementary optical cascades, the relative error in the output image as a rule does not exceed 0.25% when N >= 256 (N x N is the number of discrete samples in the image), and the time to compute the output image on a computer (Pentium-2, 300 MHz) for N = 512 does not exceed one minute. The simulation program for coherent optical systems will be used in scientific research and in teaching students at Far East State University.
Computer code for preliminary sizing analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
This mean diameter flow analysis uses a stage average velocity diagram as the basis for the computational efficiency. Input design requirements include power or pressure ratio, flow rate, temperature, pressure, and rotative speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse) or for any specified stage swirl split. Exit turning vanes can be included in the design. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, and last stage absolute and relative Mach numbers. An analysis is presented along with a description of the computer program input and output with sample cases. The analysis and code presented herein are modifications of those described in NASA-TN-D-6702. These modifications improve modeling rigor and extend code applicability.
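A mean-diameter velocity-diagram analysis of this kind rests on the Euler turbomachinery work relation; a standard form (stated here for context, not reproduced from the report) is

\Delta h_0 = U_1 V_{\theta 1} - U_2 V_{\theta 2} \approx U \, (V_{\theta 1} - V_{\theta 2}),

where U is the mean blade speed and V_\theta the tangential (swirl) velocity at rotor inlet (1) and exit (2); a zero-exit-swirl diagram (V_{\theta 2} = 0), for example, gives stage specific work U V_{\theta 1}.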
A computer program for anisotropic shallow-shell finite elements using symbolic integration
NASA Technical Reports Server (NTRS)
Andersen, C. M.; Bowen, J. T.
1976-01-01
A FORTRAN computer program for anisotropic shallow-shell finite elements with variable curvature is described. A listing of the program is presented together with printed output for a sample case. Computation times and central memory requirements are given for several different elements. The program is based on a stiffness (displacement) finite-element model in which the fundamental unknowns consist of both the displacement and the rotation components of the reference surface of the shell. Two triangular and four quadrilateral elements are implemented in the program. The triangular elements have 6 or 10 nodes, and the quadrilateral elements have 4 or 8 nodes. Two of the quadrilateral elements have internal degrees of freedom associated with displacement modes which vanish along the edges of the elements (bubble modes). The triangular elements and the remaining two quadrilateral elements do not have bubble modes. The output from the program consists of arrays corresponding to the stiffness, the geometric stiffness, the consistent mass, and the consistent load matrices for individual elements. The integrals required for the generation of these arrays are evaluated by using symbolic (or analytic) integration in conjunction with certain group-theoretic techniques. The analytic expressions for the integrals are exact and were developed using the symbolic and algebraic manipulation language.
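To illustrate the idea of exact, symbolically evaluated element integrals (the report used its own symbolic and algebraic manipulation language), here is a brief Python/sympy sketch; the bilinear shape functions and reference quadrilateral are illustrative assumptions.

    # Hedged sketch: exact symbolic integration of a shape-function product over
    # the reference quadrilateral [-1, 1] x [-1, 1], in place of numerical quadrature.
    import sympy as sp

    xi, eta = sp.symbols("xi eta")
    N1 = (1 - xi) * (1 - eta) / 4
    N2 = (1 + xi) * (1 - eta) / 4
    exact = sp.integrate(N1 * N2, (xi, -1, 1), (eta, -1, 1))
    print(sp.simplify(exact))   # -> 2/9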
Attitude profile design program
NASA Technical Reports Server (NTRS)
1991-01-01
The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high level manner. The program then takes these high level commands and executes the maneuvers, outputting the desired information.
The National Inventory of Down Woody Materials: Methods, Outputs, and Future Directions
Christopher W. Woodall
2003-01-01
The Forest Inventory and Analysis Program (FIA) of the USDA Forest Service conducts a national inventory of forests of the United States. A subset of FIA permanent inventory plots are sampled every year for numerous forest health indicators ranging from soils to understory vegetation. Down woody material (DWM) is an FIA indicator that refines estimation of forest...
User's manual for a computer program for simulating intensively managed allowable cut.
Robert W. Sassaman; Ed Holt; Karl Bergsvik
1972-01-01
Detailed operating instructions are described for SIMAC, a computerized forest simulation model which calculates the allowable cut assuming volume regulation for forests with intensively managed stands. A sample problem illustrates the required inputs and expected output. SIMAC is written in FORTRAN IV and runs on a CDC 6400 computer with a SCOPE 3.3 operating system....
NASA Technical Reports Server (NTRS)
Seidel, D. A.; Batina, J. T.
1986-01-01
The development, use and operation of the XTRAN2L program that solves the two dimensional unsteady transonic small disturbance potential equation are described. The XTRAN2L program is used to calculate steady and unsteady transonic flow fields about airfoils and is capable of performing self contained transonic flutter calculations. Operation of the XTRAN2L code is described, and tables defining all input variables, including default values, are presented. Sample cases that use various program options are shown to illustrate operation of XTRAN2L. Computer listings containing input and selected output are included as an aid to the user.
TAP 1: A Finite Element Program for Steady-State Thermal Analysis of Convectively Cooled Structures
NASA Technical Reports Server (NTRS)
Thornton, E. A.
1976-01-01
The program has a finite element library of six elements: two conduction/convection elements to model heat transfer in a solid, two convection elements to model heat transfer in a fluid, and two integrated conduction/convection elements to represent combined heat transfer in tubular and plate/fin fluid passages. Nonlinear thermal analysis due to temperature dependent thermal parameters is performed using the Newton-Raphson iteration method. Program output includes nodal temperatures and element heat fluxes. Pressure drops in fluid passages may be computed as an option. A companion plotting program for displaying the finite element model and predicted temperature distributions is presented. User instructions and sample problems are presented in appendixes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickens, J.K.
1988-03-01
This document provides a complete listing of the FORTRAN program SCINFUL, a program designed to provide a calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The incident design neutron energy range is 0.1 to 80 MeV. Preparation of input to the program is discussed as are important features of the output. Also included is a FORTRAN listing of a subsidiary program applicable to the output of SCINFUL. This user-interactive program is named SCINSPEC, from which the output of SCINFUL may be reformatted into a standard spectrum form involving either equal light-unit or equal proton-energy intervals. Examples of input to this program and corresponding output are given.
Lucey, K.J.
1990-01-01
The U.S. Geological Survey conducts an external blind sample quality assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample data base that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data is entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)
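A minimal sketch of the kind of control-chart summary such a retrieval produces is shown below (Python); the determinations, units, and 3-standard-deviation limits are illustrative assumptions, not QADATA output.

    # Hedged sketch: center line and control limits for repeated blind-sample
    # determinations of one constituent. Data values are made up.
    import statistics

    determinations = [2.01, 1.98, 2.05, 1.95, 2.02, 2.00, 1.97, 2.04]  # mg/L, hypothetical
    mean = statistics.mean(determinations)
    sd = statistics.stdev(determinations)
    print(f"center line = {mean:.3f} mg/L")
    print(f"upper control limit = {mean + 3 * sd:.3f} mg/L")
    print(f"lower control limit = {mean - 3 * sd:.3f} mg/L")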
NASA Technical Reports Server (NTRS)
Vadyak, J.; Hoffman, J. D.; Bishop, A. R.
1978-01-01
The calculation procedure is based on the method of characteristics for steady three-dimensional flow. The bow shock wave and the internal shock wave system were computed using a discrete shock wave fitting procedure. The general structure of the computer program is discussed, and a brief description of each subroutine is given. All program input parameters are defined, and a brief discussion on interpretation of the output is provided. A number of sample cases, complete with data deck listings, are presented.
NASA Technical Reports Server (NTRS)
Anderson, O. L.
1974-01-01
A finite-difference procedure for computing the turbulent, swirling, compressible flow in axisymmetric ducts is described. Arbitrary distributions of heat and mass transfer at the boundaries can be treated, and the effects of struts, inlet guide vanes, and flow straightening vanes can be calculated. The calculation procedure is programmed in FORTRAN 4 and has operated successfully on the UNIVAC 1108, IBM 360, and CDC 6600 computers. The analysis which forms the basis of the procedure, a detailed description of the computer program, and the input/output formats are presented. The results of sample calculations performed with the computer program are compared with experimental data.
Computer programs to predict induced effects of jets exhausting into a crossflow
NASA Technical Reports Server (NTRS)
Perkins, S. C., Jr.; Mendenhall, M. R.
1984-01-01
A user's manual for two computer programs was developed to predict the induced effects of jets exhausting into a crossflow. Program JETPLT predicts pressures induced on an infinite flat plate by a jet exhausting at angles to the plate and Program JETBOD, in conjunction with a panel code, predicts pressures induced on a body of revolution by a jet exhausting normal to the surface. Both codes use a potential model of the jet and adjacent surface with empirical corrections for the viscous or nonpotential effects. This program manual contains a description of the use of both programs, instructions for preparation of input, descriptions of the output, limitations of the codes, and sample cases. In addition, procedures to extend both codes to include additional empirical correlations are described.
GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.
Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André
2010-09-21
Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not provided, or is expected to be produced with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.
Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file
NASA Technical Reports Server (NTRS)
Carnes, J. G.
1978-01-01
Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.
Demonstration of Multi- and Single-Reader Sample Size Program for Diagnostic Studies software.
Hillis, Stephen L; Schartz, Kevin M
2015-02-01
The recently released software Multi- and Single-Reader Sample Size Program for Diagnostic Studies, written by Kevin Schartz and Stephen Hillis, performs sample size computations for diagnostic reader-performance studies. The program computes the sample size needed to detect a specified difference in a reader performance measure between two modalities, when using the analysis methods initially proposed by Dorfman, Berbaum, and Metz (DBM) and Obuchowski and Rockette (OR), and later unified and improved by Hillis and colleagues. A commonly used reader performance measure is the area under the receiver-operating-characteristic curve. The program can be used with typical reader-performance measures, which can be estimated parametrically or nonparametrically. The program has an easy-to-use step-by-step intuitive interface that walks the user through the entry of the needed information. Features of the software include the following: (1) choice of several study designs; (2) choice of inputs obtained from either OR or DBM analyses; (3) choice of three different inference situations: both readers and cases random, readers fixed and cases random, and readers random and cases fixed; (4) choice of two types of hypotheses: equivalence or noninferiority; (5) choice of two output formats: power for specified case and reader sample sizes, or a listing of case-reader combinations that provide a specified power; (6) choice of single or multi-reader analyses; and (7) functionality in Windows, Mac OS, and Linux.
NASA Technical Reports Server (NTRS)
Kuhlman, J. M.; Shu, J. Y.
1981-01-01
A subsonic, linearized aerodynamic theory, wing design program for one or two planforms was developed which uses a vortex lattice near field model and a higher order panel method in the far field. The theoretical development of the wake model and its implementation in the vortex lattice design code are summarized and sample results are given. Detailed program usage instructions, sample input and output data, and a program listing are presented in the Appendixes. The far field wake model assumes a wake vortex sheet whose strength varies piecewise linearly in the spanwise direction. From this model analytical expressions for lift coefficient, induced drag coefficient, pitching moment coefficient, and bending moment coefficient were developed. From these relationships a direct optimization scheme is used to determine the optimum wake vorticity distribution for minimum induced drag, subject to constraints on lift, and pitching or bending moment. Integration spanwise yields the bound circulation, which is interpolated in the near field vortex lattice to obtain the design camber surface(s).
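The constrained optimization at the heart of the wake model can be illustrated with a small sketch (Python): minimize a quadratic induced-drag form in the spanwise circulation values subject to a linear lift constraint, solved through the Lagrange-multiplier (KKT) system. The matrices below are tiny illustrative stand-ins, not the influence coefficients the program computes.

    # Hedged sketch: min g^T D g subject to a^T g = CL_target via the KKT linear system.
    import numpy as np

    D = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.5, 0.4],
                  [0.1, 0.4, 1.0]])          # induced-drag quadratic form (assumed)
    a = np.array([1.0, 1.2, 0.8])            # lift influence row (assumed)
    CL_target = 0.5

    n = len(a)
    K = np.block([[2 * D, a.reshape(-1, 1)],
                  [a.reshape(1, -1), np.zeros((1, 1))]])
    rhs = np.concatenate([np.zeros(n), [CL_target]])
    sol = np.linalg.solve(K, rhs)
    gamma, lam = sol[:n], sol[n]
    print("optimum circulation distribution:", gamma)
    print("lift check:", a @ gamma)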
Music 4C, a multi-voiced synthesis program with instruments defined in C
NASA Astrophysics Data System (ADS)
Beauchamp, James W.
2003-04-01
Music 4C is a program which runs under Unix (including Linux) and provides a means for the synthesis of arbitrary signals as defined by the C code. The program is actually a loose translation of an earlier program, Music 4BF [H. S. Howe, Jr., Electronic Music Synthesis (Norton, 1975)]. A set of instrument definitions is driven by a numerical score which consists of a series of "events." Each event gives an instrument name, start time and duration, and a number of parameters (e.g., pitch) which describe the event. Each instrument definition consists of event parameters, performance variables, initializations, and a synthesis algorithmic code. Thus, the synthetic signal, no matter how complex, is precisely defined. Moreover, the resulting sounds can be overlaid in any arbitrary pattern. The program serves as a mixer of algorithmically produced sounds or recorded sounds taken from sample files or synthesized from spectrum files. A score file can be entered by hand, generated from a program, translated from a MIDI file, or generated from an alpha-numeric score using an auxiliary program, Notepro. Output sample files are in wav, snd, or aiff format. The program is provided as C source code for download.
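A much-simplified sketch of the event-driven synthesis idea (Python rather than C, and not the Music 4C code) is shown below: each score event carries a start time, duration, and frequency, and rendered samples are summed into one output buffer; the sine instrument, sample rate, and score are illustrative assumptions.

    # Hedged sketch: render a tiny "score" of overlapping events into one buffer.
    import math

    SR = 8000  # sample rate in Hz (assumption)

    def sine_instrument(freq, dur, amp=0.3):
        n = int(dur * SR)
        return [amp * math.sin(2 * math.pi * freq * i / SR) for i in range(n)]

    score = [  # (start time s, duration s, frequency Hz)
        (0.0, 0.5, 440.0),
        (0.25, 0.5, 660.0),
    ]

    out = [0.0] * int(SR * 1.0)
    for start, dur, freq in score:
        samples = sine_instrument(freq, dur)
        offset = int(start * SR)
        for i, s in enumerate(samples):
            out[offset + i] += s          # events may overlap; contributions are summed
    print(len(out), "samples rendered")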
NASA Technical Reports Server (NTRS)
Jones, L. D.
1979-01-01
The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.
NASA Technical Reports Server (NTRS)
Maskew, B.
1982-01-01
VSAERO is a computer program used to predict the nonlinear aerodynamic characteristics of arbitrary three-dimensional configurations in subsonic flow. Nonlinear effects of vortex separation and vortex surface interaction are treated in an iterative wake-shape calculation procedure, while the effects of viscosity are treated in an iterative loop coupling potential-flow and integral boundary-layer calculations. The program employs a surface singularity panel method using quadrilateral panels on which doublet and source singularities are distributed in a piecewise constant form. This user's manual provides a brief overview of the mathematical model, instructions for configuration modeling and a description of the input and output data. A listing of a sample case is included.
OPDOT: A computer program for the optimum preliminary design of a transport airplane
NASA Technical Reports Server (NTRS)
Sliwa, S. M.; Arbuckle, P. D.
1980-01-01
A description of a computer program, OPDOT, for the optimal preliminary design of transport aircraft is given. OPDOT utilizes constrained parameter optimization to minimize a performance index (e.g., direct operating cost per block hour) while satisfying operating constraints. The approach in OPDOT uses geometric descriptors as independent design variables. The independent design variables are systematically iterated to find the optimum design. The technical development of the program is provided and a program listing with sample input and output are utilized to illustrate its use in preliminary design. It is not meant to be a user's guide, but rather a description of a useful design tool developed for studying the application of new technologies to transport airplanes.
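A hedged sketch of the constrained-parameter-optimization approach, using a general-purpose optimizer (Python/scipy), is given below; the cost and constraint functions are toy stand-ins, not OPDOT's performance index or operating constraints.

    # Hedged sketch: minimize a toy cost index over geometric descriptors subject
    # to one inequality constraint and simple bounds.
    import numpy as np
    from scipy.optimize import minimize

    def direct_operating_cost(x):
        wing_area, aspect_ratio = x           # geometric descriptors (illustrative)
        return 100.0 / wing_area + 0.5 * wing_area + 2.0 / aspect_ratio + 0.1 * aspect_ratio

    def field_length_margin(x):
        wing_area, _ = x
        return wing_area - 120.0              # require wing_area >= 120 (toy constraint)

    result = minimize(direct_operating_cost,
                      x0=np.array([150.0, 8.0]),
                      method="SLSQP",
                      bounds=[(80.0, 300.0), (6.0, 14.0)],
                      constraints=[{"type": "ineq", "fun": field_length_margin}])
    print("optimum descriptors:", result.x)
    print("minimum cost index:", result.fun)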
Heliocentric interplanetary low thrust trajectory optimization program, supplement 1, part 2
NASA Technical Reports Server (NTRS)
Mann, F. I.; Horsewood, J. L.
1978-01-01
The improvements made to the HILTOP electric propulsion trajectory computer program are described. A more realistic propulsion system model was implemented in which various thrust subsystem efficiencies and specific impulse are modeled as variable functions of power available to the propulsion system. The number of operating thrusters are staged, and the beam voltage is selected from a set of five (or less) constant voltages, based upon the application of variational calculus. The constant beam voltages may be optimized individually or collectively. The propulsion system logic is activated by a single program input key in such a manner as to preserve the HILTOP logic. An analysis describing these features, a complete description of program input quantities, and sample cases of computer output illustrating the program capabilities are presented.
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
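The linear-combination idea can be sketched as follows (Python): find channel weights that give a reading of 1 on the desired element's reference spectrum and 0 on interfering spectra. The report additionally minimizes the statistical variance of the reading; that refinement is omitted here, and the spectra and names below are illustrative assumptions.

    # Hedged sketch: minimum-norm channel weights satisfying the unit/zero reading targets.
    import numpy as np

    # Rows: reference spectra (counts per channel); values are illustrative only.
    desired  = np.array([10.0, 40.0, 80.0, 30.0, 5.0])
    interf_1 = np.array([60.0, 20.0, 10.0, 5.0, 2.0])
    interf_2 = np.array([5.0, 10.0, 20.0, 50.0, 40.0])

    S = np.vstack([desired, interf_1, interf_2])
    targets = np.array([1.0, 0.0, 0.0])

    Q = np.linalg.pinv(S) @ targets        # weights satisfying S Q ~= targets
    print("weights:", Q)
    print("reading on desired spectrum:", desired @ Q)     # ~1
    print("reading on interferent 1:  ", interf_1 @ Q)     # ~0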
NASA Technical Reports Server (NTRS)
Tibbetts, J. G.
1980-01-01
Detailed instructions for using the near field cruise noise prediction program, a program listing, and a sample case with output are presented. The total noise for free field lossless conditions at selected observer locations is obtained by summing the contributions from up to nine acoustic sources. These noise sources, selected at the user's option, include the fan/compressor, turbine, core (combustion), jet, shock, and airframe (trailing edge and turbulent boundary layers). The effects of acoustic suppression materials such as engine inlet treatment may also be included in the noise prediction. The program is available for use on the NASA/Langley Research Center CDC computer. Comparisons of the program predictions with measured data are also given, and some possible reasons for their lack of agreement are presented.
Sampled-data chain-observer design for a class of delayed nonlinear systems
NASA Astrophysics Data System (ADS)
Kahelras, M.; Ahmed-Ali, T.; Giri, F.; Lamnabhi-Lagarrigue, F.
2018-05-01
The problem of observer design is addressed for a class of triangular nonlinear systems with not-necessarily small delay and sampled output measurements. One more difficulty is that the system state matrix is dependent on the un-delayed output signal which is not accessible to measurement, making existing observers inapplicable. A new chain observer, composed of m elementary observers in series, is designed to compensate for output sampling and arbitrary large delays. The larger the time-delay the larger the number m. Each elementary observer includes an output predictor that is conceived to compensate for the effects of output sampling and a fractional delay. The predictors are defined by first-order ordinary differential equations (ODEs) much simpler than those of existing predictors which involve both output and state predictors. Using a small gain type analysis, sufficient conditions for the observer to be exponentially convergent are established in terms of the minimal number m of elementary observers and the maximum sampling interval.
Li, Xiangpeng; Brooks, Jessica C; Hu, Juan; Ford, Katarena I; Easley, Christopher J
2017-01-17
A fully automated, 16-channel microfluidic input/output multiplexer (μMUX) has been developed for interfacing to primary cells and to improve understanding of the dynamics of endocrine tissue function. The device utilizes pressure driven push-up valves for precise manipulation of nutrient input and hormone output dynamics, allowing time resolved interrogation of the cells. The ability to alternate any of the 16 channels from input to output, and vice versa, provides for high experimental flexibility without the need to alter microchannel designs. 3D-printed interface templates were custom designed to sculpt the above-channel polydimethylsiloxane (PDMS) in microdevices, creating millimeter scale reservoirs and confinement chambers to interface primary murine islets and adipose tissue explants to the μMUX sampling channels. This μMUX device and control system was first programmed for dynamic studies of pancreatic islet function to collect ∼90 minute insulin secretion profiles from groups of ∼10 islets. The automated system was also operated in temporal stimulation and cell imaging mode. Adipose tissue explants were exposed to a temporal mimic of post-prandial insulin and glucose levels, while simultaneous switching between labeled and unlabeled free fatty acid permitted fluorescent imaging of fatty acid uptake dynamics in real time over a ∼2.5 hour period. Application with varying stimulation and sampling modes on multiple murine tissue types highlights the inherent flexibility of this novel, 3D-templated μMUX device. The tissue culture reservoirs and μMUX control components presented herein should be adaptable as individual modules in other microfluidic systems, such as organ-on-a-chip devices, and should be translatable to different tissues such as liver, heart, skeletal muscle, and others.
NONDESTRUCTIVE EDDY CURRENT TESTING
Renken, C.J. Jr.
1961-05-23
An eddy current testing device is described for measuring metal continuity independent of probe-to-sample spacing. An inductance-wound test probe is made a leg of a variable impedance bridge and the bridge is balanced with the probe away from the sample. An a-c signal is applied across the input terminals of the bridge circuit. As the probe is brought into proximity with the metal sample, the resulting impedance change in the probe gives an output signal from the bridge whose phase angle is proportional to the sample continuity and whose amplitude is proportional to the probe-to-sample spacing. The output signal from the bridge is applied to a compensating network where, responsive to amplitude changes from the bridge output signal, a constant-phase voltage output is maintained when the sample is continuous regardless of probe-to-sample spacing. A phase meter calibrated to read changes in resistivity of the metal sample measures the phase shift between the output of the compensating network and the original a-c signal applied to the bridge.
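A minimal sketch of the signal interpretation described above (Python, illustrative only): treat the bridge output as a complex phasor whose phase angle tracks sample continuity and whose amplitude tracks probe-to-sample spacing; the phasor values and function names are assumptions.

    # Hedged sketch: amplitude and phase of the bridge output relative to the drive signal.
    import cmath, math

    def bridge_reading(output_phasor, drive_phasor):
        """Amplitude of the bridge output and its phase shift (degrees) relative to the drive."""
        amplitude = abs(output_phasor)
        phase_deg = math.degrees(cmath.phase(output_phasor / drive_phasor))
        return amplitude, phase_deg

    drive = cmath.rect(1.0, 0.0)                   # a-c drive signal phasor
    out = cmath.rect(0.4, math.radians(25.0))      # hypothetical bridge output
    amp, phase = bridge_reading(out, drive)
    print(f"amplitude (tracks spacing): {amp:.2f}")
    print(f"phase shift (tracks continuity/resistivity): {phase:.1f} deg")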
Ball, James W.; Nordstrom, D. Kirk; Jenne, Everett A.
1980-01-01
A computerized chemical model, WATEQ2, has resulted from extensive additions to and revision of the WATEQ model of Truesdell and Jones (Truesdell, A. H., and Jones, B. F., 1974, WATEQ, a computer program for calculating chemical equilibria of natural waters: J. Res. U.S. Geol. Survey, v. 2, p. 233-274). The model building effort has necessitated searching the literature and selecting thermochemical data pertinent to the reactions added to the model. This supplementary report makes available the details of the reactions added to the model together with the selected thermochemical data and their sources. Also listed are details of program operation and a brief description of the output of the model. Appendices contain a glossary of identifiers used in the PL/1 computer code, the complete PL/1 listing, and sample output from three water analyses used as test cases.
Mars Global Reference Atmospheric Model 2001 Version (Mars-GRAM 2001): Users Guide
NASA Technical Reports Server (NTRS)
Justus, C. G.; Johnson, D. L.
2001-01-01
This document presents the Mars Global Reference Atmospheric Model 2001 Version (Mars-GRAM 2001) and its new features. As with the previous version (Mars-GRAM 2000), all parameterizations for temperature, pressure, density, and winds versus height, latitude, longitude, time of day, and season (Ls) use input data tables from the NASA Ames Mars General Circulation Model (MGCM) for the surface through 80-km altitude and the University of Arizona Mars Thermospheric General Circulation Model (MTGCM) for 80 to 170 km. Mars-GRAM 2001 is based on topography from the Mars Orbiter Laser Altimeter (MOLA) and includes new MGCM data at the topographic surface. A new auxiliary program allows Mars-GRAM output to be used to compute shortwave (solar) and longwave (thermal) radiation at the surface and top of atmosphere. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and for running the program. It also provides sample input and output and an example for incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code.
Computer user's manual for a generalized curve fit and plotting program
NASA Technical Reports Server (NTRS)
Schlagheck, R. A.; Beadle, B. D., II; Dolerhie, B. D., Jr.; Owen, J. W.
1973-01-01
A FORTRAN coded program has been developed for generating plotted output graphs on 8-1/2 by 11-inch paper. The program is designed to be used by engineers, scientists, and non-programming personnel on any IBM 1130 system that includes a 1627 plotter. The program has been written to provide a fast and efficient method of displaying plotted data without having to write any additional code. Various output options are available to the program user for displaying data in four different types of formatted plots. These options include discrete linear, continuous, and histogram graphical outputs. The manual contains information about the use and operation of this program. A mathematical description of the least squares goodness of fit test is presented. A program listing is also included.
Essays on environmental, energy, and natural resource economics
NASA Astrophysics Data System (ADS)
Zhang, Fan
My dissertation focuses on examining the interrelationship among the environment, energy and economic development. In the first essay, I explore the effects of increased uncertainty over future output prices, input costs and productivity levels on intertemporal emission permits trading. In a dynamic programming setting, a permit price is a convex function of each of these three sources of uncertainty. Increased uncertainty about future market conditions increases the expected permit price and causes risk-neutral firms to reduce ex ante emissions to smooth marginal abatement costs over time. Empirical analysis shows that increased price volatility induced by electricity market restructuring could explain 8-11% of the allowances banked during Phase I of the U.S. sulfur dioxide trading program. Numerical simulation suggests that high uncertainty may generate substantial initial compliance costs, thereby deterring new entrants and reducing efficiency; sharp emission spikes are also more likely to occur under industry-wide uncertainty shocks. In the second essay, I examine whether electricity restructuring improves the efficiency of U.S. nuclear power generation. Based on the full sample of 73 investor-owned nuclear plants in the United States from 1992 to 1998, I estimate cross-sectional and longitudinal efficiency changes associated with restructuring, at the plant level. Various modeling strategies are presented to deal with the policy endogeneity bias that high cost plants are more likely to be restructured. Overall, I find a strikingly positive relationship between the multiple steps of restructuring and plant operating efficiency. In the third essay, I estimate the economic impact of China's national land conversion program on local farm-dependent economies. The impact of the program on 14 industrial sectors in Gansu Province is investigated using an input-output model. Due to regulatory restrictions, the agricultural sector cannot automatically expand or shrink its land requirements in direct proportion to output changes. Therefore, I modify a standard input-output model to incorporate supply constraints on cropping activities. A spatially explicit analysis is also implemented in a geographical information system to capture the heterogeneous land productivity. The net cost of the conservation program is estimated to be a land rent of $487.21 per acre per year (1999).
Computer programs for generation and evaluation of near-optimum vertical flight profiles
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Waters, M. H.; Patmore, L. C.
1983-01-01
Two extensive computer programs were developed. The first, called OPTIM, generates a reference near-optimum vertical profile, and it contains control options so that the effects of various flight constraints on cost performance can be examined. The second, called TRAGEN, is used to simulate an aircraft flying along an optimum or any other vertical reference profile. TRAGEN is used to verify OPTIM's output, examine the effects of uncertainty in the values of parameters (such as prevailing wind) which govern the optimum profile, or compare the cost performance of profiles generated by different techniques. A general description of these programs, the efforts to add special features to them, and sample results of their usage are presented.
NASA Technical Reports Server (NTRS)
Dumbauld, R. K.; Bjorklund, J. R.; Bowers, J. F.
1973-01-01
The NASA/MSFC multilayer diffusion models are described which are used in applying meteorological information to the estimation of toxic fuel hazards resulting from the launch of rocket vehicles and from accidental cold spills and leaks of toxic fuels. Background information, definitions of terms, and a description of the multilayer concept are presented along with formulas for determining the buoyant rise of hot exhaust clouds or plumes from conflagrations, and descriptions of the multilayer diffusion models. A brief description of the computer program is given, and sample problems and their solutions are included. Derivations of the cloud rise formulas, user instructions, and computer program output lists are also included.
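For orientation, the single-layer Gaussian plume formula from which diffusion models of this general family are built can be written as a short function; the numbers and dispersion coefficients below are arbitrary, and this is not the NASA/MSFC multilayer model itself.

    import math

    def gaussian_plume(q_g_s, u_m_s, y_m, z_m, h_m, sigma_y, sigma_z):
        """Ground-reflected Gaussian plume concentration (mass per cubic meter).
        q: source strength, u: mean wind speed, y/z: crosswind and vertical receptor
        coordinates, h: effective release height, sigma_y/sigma_z: dispersion
        coefficients evaluated at the downwind distance of interest."""
        lateral = math.exp(-0.5 * (y_m / sigma_y) ** 2)
        vertical = (math.exp(-0.5 * ((z_m - h_m) / sigma_z) ** 2) +
                    math.exp(-0.5 * ((z_m + h_m) / sigma_z) ** 2))
        return q_g_s / (2.0 * math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

    # Example: 1000 g/s release at 150 m effective height, 5 m/s wind, ground-level
    # centerline receptor, with dispersion coefficients picked arbitrarily.
    print(gaussian_plume(1000.0, 5.0, y_m=0.0, z_m=0.0, h_m=150.0, sigma_y=160.0, sigma_z=80.0))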
Model modifications for simulation of flow through stratified rocks in eastern Ohio
Helgesen, J.O.; Razem, A.C.; Larson, S.P.
1982-01-01
A quasi three-dimensional groundwater flow model is being used as part of a study to determine impacts of coal-strip mining on local hydrologic systems. Modifications to the model were necessary to simulate local hydrologic conditions properly. Perched water tables required that the method of calculating vertical flow rate be changed. A head-dependent spring-discharge function and a head-dependent stream-aquifer interchange function were added to the program. Modifications were also made to allow recharge from precipitation to any layer. The modified program, data deck instructions, and sample input and output are presented. (USGS)
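Head-dependent boundary functions of the kind added to the model can be sketched as follows; the functional forms, coefficients, and sign conventions are illustrative and may differ from those in the USGS code.

    def spring_discharge(head, spring_elev, conductance):
        """Head-dependent spring discharge: flow occurs only when the aquifer head
        rises above the spring outlet elevation (illustrative form)."""
        return conductance * max(head - spring_elev, 0.0)

    def stream_aquifer_exchange(head, stream_stage, streambed_conductance):
        """Head-dependent stream-aquifer interchange: positive values leak from the
        stream into the aquifer, negative values discharge from the aquifer to the stream."""
        return streambed_conductance * (stream_stage - head)

    print(spring_discharge(head=152.0, spring_elev=150.0, conductance=35.0))
    print(stream_aquifer_exchange(head=152.0, stream_stage=151.2, streambed_conductance=20.0))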
1974-06-01
Report text largely illegible in this record; recoverable keywords: aircraft structural weight optimization, flutter optimization program, structural synthesis.
Program MAMO: Models for avian management optimization-user guide
Guillaumet, Alban; Paxton, Eben H.
2017-01-01
The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).
GEMPAK: An arbitrary aircraft geometry generator
NASA Technical Reports Server (NTRS)
Stack, S. H.; Edwards, C. L. W.; Small, W. J.
1977-01-01
A computer program, GEMPAK, has been developed to aid in the generation of detailed configuration geometry. The program was written to allow the user as much flexibility as possible in his choices of configurations and the detail of description desired and at the same time keep input requirements and program turnaround and cost to a minimum. The program consists of routines that generate fuselage and planar-surface (winglike) geometry and a routine that will determine the true intersection of all components with the fuselage. This paper describes the methods by which the various geometries are generated and provides input description with sample input and output. Also included are descriptions of the primary program variables and functions performed by the various routines. The FORTRAN program GEMPAK has been used extensively in conjunction with interfaces to several aerodynamic and plotting computer programs and has proven to be an effective aid in the preliminary design phase of aircraft configurations.
Ball, J.W.; Nordstrom, D. Kirk; Zachmann, D.W.
1987-01-01
A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available, one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE also have been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)
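The mineral saturation computation reported by such codes reduces, for each mineral, to a saturation index SI = log10(IAP/Ksp). The sketch below shows only the index calculation; the activity-coefficient corrections and mass-action iterations that WATEQ4F actually performs are omitted, and the example numbers are illustrative.

    import math

    def saturation_index(ion_activity_product, solubility_product):
        """SI = log10(IAP/Ksp): zero at equilibrium, positive for supersaturation,
        negative for undersaturation."""
        return math.log10(ion_activity_product / solubility_product)

    # Example: calcite with an illustrative IAP of 10^-8.2 against Ksp = 10^-8.48.
    print(saturation_index(10 ** -8.2, 10 ** -8.48))   # about +0.28, slightly supersaturated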
The Lake Tahoe Basin Land Use Simulation Model
Forney, William M.; Oldham, I. Benson
2011-01-01
This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.
User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting
NASA Technical Reports Server (NTRS)
Murray, J. E.
1982-01-01
A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.
Design controls for large order systems
NASA Technical Reports Server (NTRS)
Doane, George B., III
1991-01-01
The output of this task will be a program plan which will delineate how MSFC will support and implement its portion of the Inter-Center Computational Controls Program Plan. Another output will be the results of looking at various multibody/multidegree of freedom computer programs in various environments.
Ruesch, Rodney; Jenkins, Philip N.; Ma, Nan
2004-03-09
There is disclosed an apparatus and method for impedance control to provide for controlling the impedance of a communication circuit using an all-digital impedance control circuit wherein one or more control bits are used to tune the output impedance. In one example embodiment, the impedance control circuit is fabricated using circuit components found in a standard macro library of a computer aided design system. According to another example embodiment, there is provided a control for an output driver on an integrated circuit ("IC") device to provide for forming a resistor divider network with the output driver and a resistor off the IC device so that the divider network produces an output voltage, comparing the output voltage of the divider network with a reference voltage, and adjusting the output impedance of the output driver to attempt to match the output voltage of the divider network and the reference voltage. Also disclosed is over-sampling the divider network voltage, storing the results of the over-sampling, repeating the over-sampling and storing, averaging the results of multiple over-sampling operations, controlling the impedance with a plurality of bits forming a word, and updating the value of the word by only one least significant bit at a time.
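The update rule described, averaging over-sampled divider readings and moving the impedance code by at most one least significant bit per update, can be sketched as follows. The function name, code width, and the sign of the adjustment (which depends on the divider topology) are assumptions for illustration, not taken from the patent.

    def update_impedance_code(code, samples, v_ref, max_code=31):
        """One update of the digital impedance word: average the over-sampled divider
        voltage, compare it with the reference, and move the code by at most one LSB.
        The adjustment direction is chosen arbitrarily for this sketch."""
        v_avg = sum(samples) / len(samples)
        if v_avg > v_ref and code > 0:
            code -= 1          # divider voltage high: step the code down
        elif v_avg < v_ref and code < max_code:
            code += 1          # divider voltage low: step the code up
        return code

    code = 16
    for divider_samples in ([0.52, 0.53, 0.51], [0.49, 0.50, 0.48], [0.50, 0.50, 0.50]):
        code = update_impedance_code(code, divider_samples, v_ref=0.50)
        print(code)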
User's guide: Programs for processing altimeter data over inland seas
NASA Technical Reports Server (NTRS)
Au, A. Y.; Brown, R. D.; Welker, J. E.
1989-01-01
The programs described were developed to process GEODYN-formatted satellite altimeter data, and to apply the processed results to predict geoid undulations and gravity anomalies of inland sea areas. These programs are written in standard FORTRAN 77 and are designed to run on the NSESCC IBM 3081(MVS) computer. Because of the experimental nature of these programs they are tailored to the geographical area analyzed. The attached program listings are customized for processing the altimeter data over the Black Sea. Users interested in the Caspian Sea data are expected to modify each program, although the required modifications are generally minor. Program control parameters are defined in the programs via PARAMETER statements and/or DATA statements. Other auxiliary parameters, such as labels, are hard-wired into the programs. Large data files are read in or written out through different input or output units. The program listings of these programs are accompanied by sample IBM job control language (JCL) images. Familiarity with IBM JCL and the TEMPLATE graphic package is assumed.
ERIC Educational Resources Information Center
Welty, Gordon A.
The logic of the evaluation of educational and other action programs is discussed from a methodological viewpoint. However, no attempt is made to develop methods of evaluating programs. In Part I, the structure of an educational program is viewed as a system with three components--inputs, transformation of inputs into outputs, and outputs. Part II…
Library Programs. Evaluating Federally Funded Public Library Programs.
ERIC Educational Resources Information Center
Office of Educational Research and Improvement (ED), Washington, DC.
Following an introduction by Betty J. Turock, nine reports examine key issues in library evaluation: (1) "Output Measures and the Evaluation Process" (Nancy A. Van House) describes measurement as a concept to be understood in the larger context of planning and evaluation; (2) "Adapting Output Measures to Program Evaluation"…
Method and apparatus for data sampling
Odell, Daniel M. C.
1994-01-01
A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
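The digital discrimination step, keeping only those runs of digitized samples that represent detected events, can be illustrated with a simple threshold scan; this is a generic sketch, not the patented circuit or firmware.

    def detect_events(samples, threshold):
        """Scan a digitized detector waveform and keep only runs of samples that rise
        above a discrimination threshold; returns (start_index, peak_value) pairs."""
        events, in_event, start, peak = [], False, 0, 0.0
        for i, s in enumerate(samples):
            if s >= threshold:
                if not in_event:
                    in_event, start, peak = True, i, s
                else:
                    peak = max(peak, s)
            elif in_event:
                events.append((start, peak))
                in_event = False
        if in_event:
            events.append((start, peak))
        return events

    waveform = [0.1, 0.2, 0.1, 2.4, 5.6, 3.1, 0.3, 0.2, 1.9, 4.2, 0.4]
    print(detect_events(waveform, threshold=1.0))   # two detected events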
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, S.W.
This report describes the FORCE2 flow program input, output, and the graphical post-processor. The manual describes the steps for creating the model, executing the programs and processing the results into graphical form. The FORCE2 post-processor was developed as an interactive program written in FORTRAN-77. It uses the Graphical Kernel System (GKS) graphics standard recently adopted by the International Organization for Standardization, ISO, and the American National Standards Institute, ANSI, and, therefore, can be used with many terminals. The post-processor was written with Calcomp subroutine calls and is compatible with Tektronix terminals and Calcomp and Nicolet pen plotters. B&W has been developing the FORCE2 code as a general-purpose tool for flow analysis of B&W equipment. The version of FORCE2 described in this manual was developed under the sponsorship of ASEA-Babcock as part of their participation in the joint R&D venture, "Erosion of FBC Heat Transfer Tubes," and is applicable to the analyses of bubbling fluid beds. This manual is the principal documentation for program usage and is segmented into several sections to facilitate usage. In Section 2.0 the program is described, including assumptions, capabilities, limitations and uses, program status and location, related programs and program hardware and software requirements. Section 3.0 is a quick user's reference guide for preparing input, executing FORCE2, and using the post-processor. Section 4.0 is a detailed description of the FORCE2 input. In Section 5.0, FORCE2 output is summarized. Section 6.0 contains a sample application, and Section 7.0 is a detailed reference guide.
Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide
NASA Technical Reports Server (NTRS)
Dupnick, E.; Wiggins, D.
1980-01-01
An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.
2012-07-01
The readout unit cell, designed by Polaris Sensor Technology, contains a dual-gain capacitive transimpedance amplifier (CTIA), an output sample and hold, and a switched output buffer. The CTIA integrates the detector's photocurrent and is built around a differential amplifier (X3 in Figure 3 of the report).
NASA Technical Reports Server (NTRS)
Enison, R. L.
1971-01-01
A computer program called Character String Scanner (CSS), is presented. It is designed to search a data set for any specified group of characters and then to flag this group. The output of the CSS program is a listing of the data set being searched with the specified group of characters being flagged by asterisks. Therefore, one may readily identify specific keywords, groups of keywords or specified lines of code internal to a computer program, in a program output, or in any other specific data set. Possible applications of this program include the automatic scan of an output data set for pertinent keyword data, the editing of a program to change the appearance of a certain word or group of words, and the conversion of a set of code to a different set of code.
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, John C.
1987-01-01
Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
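The voting step at the heart of N-version programming can be sketched in a few lines; the exact-match comparison below is a simplification (real voters often need tolerance-based comparison for floating-point outputs).

    from collections import Counter

    def majority_vote(outputs):
        """Return the majority output among N independent versions, or None when no
        value achieves a strict majority (illustrative voter, exact-match comparison)."""
        value, count = Counter(outputs).most_common(1)[0]
        return value if count > len(outputs) // 2 else None

    print(majority_vote([42, 42, 41]))   # 42: the dissenting version is outvoted
    print(majority_vote([40, 41, 42]))   # None: no agreement, flagged for the system to handle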
A new SAS program for behavioral analysis of Electrical Penetration Graph (EPG) data
USDA-ARS?s Scientific Manuscript database
A new program is introduced that uses SAS software to duplicate output of descriptive statistics from the Sarria Excel workbook for EPG waveform analysis. Not only are publishable means and standard errors or deviations output, the user also is guided through four relatively simple sub-programs for ...
Program Predicts Performance of Optical Parametric Oscillators
NASA Technical Reports Server (NTRS)
Cross, Patricia L.; Bowers, Mark
2006-01-01
A computer program predicts the performances of solid-state lasers that operate at wavelengths from ultraviolet through mid-infrared and that comprise various combinations of stable and unstable resonators, optical parametric oscillators (OPOs), and sum-frequency generators (SFGs), including second-harmonic generators (SHGs). The input to the program describes the signal, idler, and pump beams; the SFG and OPO crystals; and the laser geometry. The program calculates the electric fields of the idler, pump, and output beams at three locations (inside the laser resonator, just outside the input mirror, and just outside the output mirror) as functions of time for the duration of the pump beam. For each beam, the electric field is used to calculate the fluence at the output mirror, plus summary parameters that include the centroid location, the radius of curvature of the wavefront leaving through the output mirror, the location and size of the beam waist, and a quantity known, variously, as a propagation constant or beam-quality factor. The program provides a typical Windows interface for entering data and selecting files. The program can include as many as six plot windows, each containing four graphs.
Method and apparatus for data sampling
Odell, D.M.C.
1994-04-19
A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.
Updated users' guide for TAWFIVE with multigrid
NASA Technical Reports Server (NTRS)
Melson, N. Duane; Streett, Craig L.
1989-01-01
A program for the Transonic Analysis of a Wing and Fuselage with Interacted Viscous Effects (TAWFIVE) was improved by the incorporation of multigrid and a method to specify lift coefficient rather than angle-of-attack. A finite volume full potential multigrid method is used to model the outer inviscid flow field. First order viscous effects are modeled by a 3-D integral boundary layer method. Both turbulent and laminar boundary layers are treated. Wake thickness effects are modeled using a 2-D strip method. A brief discussion of the engineering aspects of the program is given. The input, output, and use of the program are covered in detail. Sample results are given showing the effects of boundary layer corrections and the capability of the lift specification method.
NASA Technical Reports Server (NTRS)
Gloss, R. J.
1971-01-01
A finite difference turbulent boundary layer computer program which allows for mass transfer wall cooling and equilibrium chemistry effects is presented. The program is capable of calculating laminar or turbulent boundary layer solutions for an arbitrary ideal gas or an equilibrium hydrogen oxygen system. Either two dimensional or axisymmetric geometric configurations may be considered. The equations are solved, in nondimensionalized physical coordinates, using the implicit Crank-Nicolson technique. The finite difference forms of the conservation of mass, momentum, total enthalpy and elements equations are linearized and uncoupled, thereby generating easily solvable tridiagonal sets of algebraic equations. A detailed description of the computer program, as well as a program user's manual is provided. Detailed descriptions of all boundary layer subroutines are included, as well as a section defining all program symbols of principal importance. Instructions are then given for preparing card input to the program and for interpreting the printed output. Finally, two sample cases are included to illustrate the use of the program.
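Tridiagonal systems of the kind produced by the linearized Crank-Nicolson difference equations are typically solved with the Thomas algorithm; a generic sketch follows (it illustrates the solver only, not the program's boundary-layer equations).

    def solve_tridiagonal(a, b, c, d):
        """Thomas algorithm: a is the sub-diagonal (a[0] unused), b the diagonal,
        c the super-diagonal (c[-1] unused), d the right-hand side."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # 4x4 diagonally dominant example.
    print(solve_tridiagonal(a=[0.0, -1.0, -1.0, -1.0],
                            b=[4.0, 4.0, 4.0, 4.0],
                            c=[-1.0, -1.0, -1.0, 0.0],
                            d=[5.0, 5.0, 5.0, 5.0]))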
Biopower generation from kitchen wastewater using a bioreactor.
Khan, Abdul M; Naz, Shamsa
2014-01-01
This research provides a comparative study of the power output from mediator-less and mediator microbial fuel cells (MFCs) under aerobic and partially anaerobic conditions using kitchen wastewater (KWW) as a renewable energy source. The wastewater sample was subjected to different physical, chemical, biochemical, and microbial analysis. The chemical oxygen demand (COD), biochemical oxygen demand (BOD), and power output values were greater for the fermented samples than the non-fermented samples. The power output of samples was compared through the development of MFCs by using sand-salt bridge and agar-salt bridge. The H2 that was produced was converted to atomic hydrogen by using the nickel-coated zinc electrode. In addition, the power output was further enhanced by introducing air into the cathodic chamber, where oxygen reacts with the protons to form pure H2O. The study showed that the power output was increased with the increase in COD and BOD values.
Pulsed phase locked loop strain monitor
NASA Technical Reports Server (NTRS)
Froggatt, Mark E. (Inventor)
1995-01-01
A pulse phase locked loop system according to the present invention is described. A frequency generator such as a voltage controlled oscillator (VCO) generates an output signal and a reference signal having a frequency equal to that of the output signal. A transmitting gate gates the output frequency signal and this gated signal drives a transmitting transducer which transmits an acoustic wave through a material. A sample/hold samples a signal indicative of the transmitted wave which is received by a receiving transducer. Divide-by-n counters control these gating and sampling functions in response to the reference signal of the frequency generator. Specifically, the output signal is gated at a rate of F/h, wherein F is the frequency of the output signal and h is an integer; and the received signal is sampled at a delay of F/n wherein n is an integer.
Dynamical genetic programming in XCSF.
Preen, Richard J; Bull, Larry
2013-01-01
A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to artificial neural networks. This paper presents results from an investigation into using a temporally dynamic symbolic representation within the XCSF learning classifier system. In particular, dynamical arithmetic networks are used to represent the traditional condition-action production system rules to solve continuous-valued reinforcement learning problems and to perform symbolic regression, finding competitive performance with traditional genetic programming on a number of composite polynomial tasks. In addition, the network outputs are later repeatedly sampled at varying temporal intervals to perform multistep-ahead predictions of a financial time series.
A Comprehensive Well Testing Implementation during Exploration Phase in Rantau Dedap, Indonesia
NASA Astrophysics Data System (ADS)
Humaedi, M. T.; Alfiady; Putra, A. P.; Martikno, R.; Situmorang, J.
2016-09-01
This paper describes the implementation of comprehensive well testing programs during the 2014-2015 exploration drilling in the Rantau Dedap Geothermal Field. The well testing programs were designed to provide reliable data as a foundation for resource assessment as well as useful information for decision making during drilling. A series of well testing surveys consisting of SFTT, completion test, heating-up downhole logging, discharge test, and chemistry sampling was conducted to understand individual well characteristics such as the thermodynamic state of the reservoir fluid, permeability distribution, well output and fluid chemistry. Furthermore, an interference test was carried out to investigate the response of the reservoir to exploitation.
Computer program for single input-output, single-loop feedback systems
NASA Technical Reports Server (NTRS)
1976-01-01
Additional work is reported on a completely automatic computer program for the design of single input/output, single loop feedback systems with parameter uncertainty, to satisfy time domain bounds on the system response to step commands and disturbances. The inputs to the program are basically the specified time-domain response bounds, the form of the constrained plant transfer function and the ranges of the uncertain parameters of the plant. The program output consists of the transfer functions of the two free compensation networks, in the form of the coefficients of the numerator and denominator polynomials, and the data on the prescribed bounds and the extremes actually obtained for the system response to commands and disturbances.
Computer code for charge-exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Kaufman, H. R.
1981-01-01
The propagation of the charge-exchange plasma from an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, is described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
NASA Technical Reports Server (NTRS)
Bennett, R. M.; Bland, S. R.; Redd, L. T.
1973-01-01
Computer programs for calculating the stability characteristics of a balloon tethered in a steady wind are presented. Equilibrium conditions, characteristic roots, and modal ratios are calculated for a range of discrete values of velocity for a fixed tether-line length. Separate programs are used: (1) to calculate longitudinal stability characteristics, (2) to calculate lateral stability characteristics, (3) to plot the characteristic roots versus velocity, (4) to plot the characteristic roots in root-locus form, (5) to plot the longitudinal modes of motion, and (6) to plot the lateral modes of motion. The basic equations, program listings, and the input and output data for sample cases are presented, with a brief discussion of the overall operation and limitations. The programs are based on a linearized, stability-derivative type of analysis, including balloon aerodynamics, apparent mass, buoyancy effects, and static forces which result from the tether line.
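In a linearized, stability-derivative analysis the characteristic roots are the eigenvalues of the system matrix, and a mode is damped (stable) when its root has a negative real part. The sketch below uses an arbitrary 4x4 matrix purely to illustrate the computation; it is not the tethered-balloon model of the report.

    import numpy as np

    # Arbitrary illustrative system matrix for a linearized model.
    A = np.array([[-0.05,  1.00,  0.00,  0.00],
                  [-0.80, -0.40,  0.00, -9.81],
                  [ 0.00,  0.00, -0.10,  1.00],
                  [ 0.30,  0.00, -2.50, -0.60]])

    roots = np.linalg.eigvals(A)
    for r in roots:
        # A negative real part means the corresponding mode is damped.
        print(complex(r), "stable" if r.real < 0 else "unstable")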
Object-oriented sequence analysis: SCL--a C++ class library.
Vahrson, W; Hermann, K; Kleffe, J; Wittig, B
1996-04-01
SCL (Sequence Class Library) is a class library written in the C++ programming language. Designed using object-oriented programming principles, SCL consists of classes of objects performing tasks typically needed for analyzing DNA or protein sequences. Among them are very flexible sequence classes, classes accessing databases in various formats, classes managing collections of sequences, as well as classes performing higher-level tasks like calculating a pairwise sequence alignment. SCL also includes classes that provide general programming support, like a dynamically growing array, sets, matrices, strings, classes performing file input/output, and utilities for error handling. By providing these components, SCL fosters an explorative programming style: experimenting with algorithms and alternative implementations is encouraged rather than punished. A description of SCL's overall structure as well as an overview of its classes is given. Important aspects of the work with SCL are discussed in the context of a sample program.
Automatic control and detector for three-terminal resistance measurement
Fasching, George E.
1976-10-26
A device is provided for automatic control and detection in a three-terminal resistance measuring instrument. The invention is useful for the rapid measurement of the resistivity of various bulk materials with a three-terminal electrode system. The device maintains the current through the sample at a fixed level while measuring the voltage across the sample to detect the sample resistance. The three-electrode system contacts the bulk material, and the current through the sample is held constant by means of a control circuit connected to a first of the three electrodes, which works in conjunction with a feedback-controlled amplifier to null the voltage between the first electrode and a second electrode connected to the controlled amplifier output. An A.C. oscillator provides a source of sinusoidal reference voltage of the frequency at which the measurement is to be executed. Synchronous reference pulses for synchronous detectors in the control circuit and an output detector circuit are provided by a synchronous pulse generator. The output of the controlled amplifier circuit is sampled by an output detector circuit to develop at an output terminal thereof a D.C. voltage which is proportional to the sample resistance R. The sample resistance is that segment of the sample between the area of the first electrode and the third electrode, which is connected to ground potential.
TAP 2: A finite element program for thermal analysis of convectively cooled structures
NASA Technical Reports Server (NTRS)
Thornton, E. A.
1980-01-01
A finite element computer program (TAP 2) for steady-state and transient thermal analyses of convectively cooled structures is presented. The program has a finite element library of six elements: two conduction/convection elements to model heat transfer in a solid, two convection elements to model heat transfer in a fluid, and two integrated conduction/convection elements to represent combined heat transfer in tubular and plate/fin fluid passages. Nonlinear thermal analysis due to temperature-dependent thermal parameters is performed using the Newton-Raphson iteration method. Transient analyses are performed using an implicit Crank-Nicolson time integration scheme with consistent or lumped capacitance matrices as an option. Program output includes nodal temperatures and element heat fluxes. Pressure drops in fluid passages may be computed as an option. User instructions and sample problems are presented in appendixes.
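Newton-Raphson iteration for temperature-dependent thermal parameters can be illustrated on a single-node toy heat balance; the conductance law and numbers below are made up and are not the TAP 2 formulation.

    def newton_raphson(f, dfdT, T0, tol=1e-8, max_iter=50):
        """Generic Newton-Raphson iteration of the kind used when thermal parameters
        depend on temperature."""
        T = T0
        for _ in range(max_iter):
            step = f(T) / dfdT(T)
            T -= step
            if abs(step) < tol:
                break
        return T

    # Toy nonlinear heat balance at one node: conductance rises linearly with temperature.
    k0, beta, T_sink, q_in = 2.0, 0.01, 300.0, 500.0
    f = lambda T: k0 * (1.0 + beta * T) * (T - T_sink) - q_in
    dfdT = lambda T: k0 * (1.0 + beta * T) + k0 * beta * (T - T_sink)

    print(newton_raphson(f, dfdT, T0=350.0))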
NASA Technical Reports Server (NTRS)
Tripp, L. L.; Tamekuni, M.; Viswanathan, A. V.
1973-01-01
The use of the computer program BUCLASP2 is described. The program is intended for linear instability analyses of structures such as unidirectionally stiffened panels. Any structure that has a constant cross section in one direction and that may be idealized as an assemblage of beam elements and laminated flat and curved plate strip elements can be analyzed. The loadings considered are combinations of axial compressive loads and in-plane transverse loads. The two parallel ends of the panel must be simply supported, and arbitrary elastic boundary conditions may be imposed along either one or both external longitudinal sides. This manual consists of instructions for use of the program with sample problems, including input and output information. The theoretical basis of BUCLASP2 and correlations of calculated results with known solutions are presented.
Application of higher harmonic blade feathering for helicopter vibration reduction
NASA Technical Reports Server (NTRS)
Powers, R. W.
1978-01-01
Higher harmonic blade feathering for helicopter vibration reduction is considered. Recent wind tunnel tests confirmed the effectiveness of higher harmonic control in reducing articulated rotor vibratory hub loads. Several predictive analyses developed in support of the NASA program were shown to be capable of calculating single harmonic control inputs required to minimize a single 4P hub response. In addition, a multiple-input, multiple-output harmonic control predictive analysis was developed. All techniques developed thus far obtain a solution by extracting empirical transfer functions from sampled data. Algorithm data sampling and processing requirements are minimal to encourage adaptive control system application of such techniques in a flight environment.
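The idea of extracting an empirical transfer function from sampled data and then choosing the control input that nulls the predicted response can be sketched generically: assume the vibratory response z depends roughly linearly on the harmonic control input u, z = T u + z0, estimate T and z0 by least squares, and solve for the u that zeroes the model. This illustrates the general approach, not the specific algorithms developed under the NASA program.

    import numpy as np

    rng = np.random.default_rng(0)
    T_true = np.array([[1.5, -0.4], [0.3, 2.0]])   # made-up "true" transfer matrix
    z0_true = np.array([0.8, -0.5])                # uncontrolled vibratory response

    U = rng.normal(size=(20, 2))                                   # sampled control settings
    Z = U @ T_true.T + z0_true + 0.02 * rng.normal(size=(20, 2))   # measured responses + noise

    X = np.hstack([U, np.ones((20, 1))])           # augment inputs with a constant column
    coef, *_ = np.linalg.lstsq(X, Z, rcond=None)
    T_est, z0_est = coef[:2].T, coef[2]

    u_min = np.linalg.solve(T_est, -z0_est)        # input that drives the modeled response to zero
    print(T_est, z0_est, u_min)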
1992-02-01
develops and maintains computer programs for the Department of the Navy. It provides life cycle support for over 50 computer programs installed at over... Table 4 presents a list of possible product or output measures of functionality for ACDS Block 0 programs. Examples of output... were identified as important "causes" of process performance. Functionality of the computer programs was the result or "effect" of the combination of
NASA Technical Reports Server (NTRS)
Buck, C. H.
1975-01-01
The program documentation for the PRF ARTWORK/AIDS conversion program, which serves as the interface between the outputs of the PRF ARTWORK and AIDS programs, is presented. The document has a two-fold purpose, the first of which is a description of the software design including flowcharts of the design at the functional level. The second purpose is to provide the user with a detailed description of the input parameters and formats necessary to execute the program and a description of the output produced when the program is executed.
NASA Astrophysics Data System (ADS)
Abdul Ghani, B.
2005-09-01
"TEA CO 2 Laser Simulator" has been designed to simulate the dynamic emission processes of the TEA CO 2 laser based on the six-temperature model. The program predicts the behavior of the laser output pulse (power, energy, pulse duration, delay time, FWHM, etc.) depending on the physical and geometrical input parameters (pressure ratio of gas mixture, reflecting area of the output mirror, media length, losses, filling and decay factors, etc.). Program summaryTitle of program: TEA_CO2 Catalogue identifier: ADVW Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVW Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: P.IV DELL PC Setup: Atomic Energy Commission of Syria, Scientific Services Department, Mathematics and Informatics Division Operating system: MS-Windows 9x, 2000, XP Programming language: Delphi 6.0 No. of lines in distributed program, including test data, etc.: 47 315 No. of bytes in distributed program, including test data, etc.:7 681 109 Distribution format:tar.gz Classification: 15 Laser Physics Nature of the physical problem: "TEA CO 2 Laser Simulator" is a program that predicts the behavior of the laser output pulse by studying the effect of the physical and geometrical input parameters on the characteristics of the output laser pulse. The laser active medium consists of a CO 2-N 2-He gas mixture. Method of solution: Six-temperature model, for the dynamics emission of TEA CO 2 laser, has been adapted in order to predict the parameters of laser output pulses. A simulation of the laser electrical pumping was carried out using two approaches; empirical function equation (8) and differential equation (9). Typical running time: The program's running time mainly depends on both integration interval and step; for a 4 μs period of time and 0.001 μs integration step (defaults values used in the program), the running time will be about 4 seconds. Restrictions on the complexity: Using a very small integration step might leads to stop the program run due to the huge number of calculating points and to a small paging file size of the MS-Windows virtual memory. In such case, it is recommended to enlarge the paging file size to the appropriate size, or to use a bigger value of integration step.
KB3D Reference Manual. Version 1.a
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Siminiceanu, Radu; Carreno, Victor A.; Dowek, Gilles
2005-01-01
This paper is a reference manual describing the implementation of the KB3D conflict detection and resolution algorithm. The algorithm has been implemented in the Java and C++ programming languages. The reference manual gives a short overview of the detection and resolution functions, the structural implementation of the program, inputs and outputs to the program, and describes how the program is used. Inputs to the program can be rectangular coordinates or geodesic coordinates. The reference manual also gives examples of conflict scenarios and the resolution outputs the program produces.
Barriera, Tiago V; Tudor-Locke, Catrine; Champagne, Catherine M; Broyles, Stephanie T; Johnson, William D; Katzmarzyk, Peter T
2013-02-01
The purpose of this study was to compare steps/day detected by the YAMAX SW-200 pedometer versus the Actigraph GT3X accelerometer in free-living adults. Daily YAMAX and GT3X steps were collected from a sample of 23 overweight and obese participants (78% female; age = 52.6 ± 8.4 yr.; BMI = 31.0 ± 3.7 kg·m-2). Because a pedometer is more likely to be used in a community-based intervention program, it was used as the standard for comparison. Percent difference (PD) and absolute percent difference (APD) were calculated to examine between-instrument agreement. In addition, days were categorized based on PD: a) under-counting (< -10 PD), b) acceptable counting (-10 to 10 PD), and c) over-counting (> 10 PD). The YAMAX and GT3X detected 8,025 ± 3,967 and 7,131 ± 3,066 steps/day, respectively, and the outputs were highly correlated (r = .87). Average PD was -3.1% ± 30.7% and average APD was 23.9% ± 19.4%. Relative to the YAMAX, 53% of the days detected by the GT3X were classified as under-counting, 25% acceptable counting, and 23% over-counting. Although the output of these 2 instruments is highly correlated, caution is advised when directly comparing or using their output interchangeably.
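The agreement statistics used here are straightforward to reproduce; the sketch below computes PD and APD and applies the ±10% classification band to made-up daily step counts.

    def percent_difference(device_steps, criterion_steps):
        """Percent difference of a device against the criterion instrument."""
        return (device_steps - criterion_steps) / criterion_steps * 100.0

    def classify(pd):
        """Classify a day by percent difference using the ±10% band described."""
        if pd < -10.0:
            return "under-counting"
        if pd > 10.0:
            return "over-counting"
        return "acceptable"

    days = [(9500, 10200), (7100, 7000), (6200, 5100)]   # (GT3X, YAMAX) steps, made-up values
    for gt3x, yamax in days:
        pd = percent_difference(gt3x, yamax)
        print(f"PD = {pd:6.1f}%  APD = {abs(pd):5.1f}%  -> {classify(pd)}")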
Parallel processor for real-time structural control
NASA Astrophysics Data System (ADS)
Tise, Bert L.
1993-07-01
A parallel processor that is optimized for real-time linear control has been developed. This modular system consists of A/D modules, D/A modules, and floating-point processor modules. The scalable processor uses up to 1,000 Motorola DSP96002 floating-point processors for a peak computational rate of 60 GFLOPS. Sampling rates up to 625 kHz are supported by this analog-in to analog-out controller. The high processing rate and parallel architecture make this processor suitable for computing state-space equations and other multiply/accumulate-intensive digital filters. Processor features include 14-bit conversion devices, low input-to-output latency, 240 Mbyte/s synchronous backplane bus, low-skew clock distribution circuit, VME connection to host computer, parallelizing code generator, and look-up tables for actuator linearization. This processor was designed primarily for experiments in structural control. The A/D modules sample sensors mounted on the structure and the floating-point processor modules compute the outputs using the programmed control equations. The outputs are sent through the D/A module to the power amps used to drive the structure's actuators. The host computer is a Sun workstation. An OpenWindows-based control panel is provided to facilitate data transfer to and from the processor, as well as to control the operating mode of the processor. A diagnostic mode is provided to allow stimulation of the structure and acquisition of the structural response via sensor inputs.
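The per-sample computation such a controller performs is the discrete state-space update x[k+1] = A x[k] + B u[k], y[k] = C x[k] + D u[k]; the sketch below uses arbitrary small matrices purely to show the loop structure, not a real structural controller.

    import numpy as np

    A = np.array([[0.95, 0.10], [-0.10, 0.90]])
    B = np.array([[0.0], [0.1]])
    C = np.array([[1.0, 0.0]])
    D = np.array([[0.0]])

    x = np.zeros((2, 1))
    for k in range(5):
        u = np.array([[1.0]])        # sensor sample from the A/D module (constant test input)
        y = C @ x + D @ u            # actuator command sent to the D/A module
        x = A @ x + B @ u            # advance the controller state
        print(k, y[0, 0])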
He, Dan; Kuhn, David; Parida, Laxmi
2016-06-15
Given a set of biallelic molecular markers, such as SNPs, with genotype values encoded numerically on a collection of plant, animal or human samples, the goal of genetic trait prediction is to predict the quantitative trait values by simultaneously modeling all marker effects. Genetic trait prediction is usually represented as linear regression models. In many cases, for the same set of samples and markers, multiple traits are observed. Some of these traits might be correlated with each other. Therefore, modeling all the multiple traits together may improve the prediction accuracy. In this work, we view the multitrait prediction problem from a machine learning angle: as either a multitask learning problem or a multiple output regression problem, depending on whether different traits share the same genotype matrix or not. We then adapted multitask learning algorithms and multiple output regression algorithms to solve the multitrait prediction problem. We proposed a few strategies to improve the least square error of the prediction from these algorithms. Our experiments show that modeling multiple traits together could improve the prediction accuracy for correlated traits. The programs we used are either public or directly from the referred authors, such as MALSAR (http://www.public.asu.edu/~jye02/Software/MALSAR/) package. The Avocado data set has not been published yet and is available upon request. dhe@us.ibm.com. © The Author 2016. Published by Oxford University Press.
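A minimal sketch of multiple-output regression for multitrait prediction is shown below on simulated genotypes and traits; it uses scikit-learn's multi-output ridge regression as a stand-in and is not the MALSAR package or the authors' pipeline.

    import numpy as np
    from sklearn.linear_model import Ridge

    # Simulated data: 200 samples, 50 markers coded 0/1/2, three traits with two correlated.
    rng = np.random.default_rng(1)
    genotypes = rng.integers(0, 3, size=(200, 50)).astype(float)
    effects = rng.normal(size=(50, 3))
    effects[:, 2] = 0.8 * effects[:, 1] + 0.2 * rng.normal(size=50)
    traits = genotypes @ effects + rng.normal(scale=2.0, size=(200, 3))

    model = Ridge(alpha=1.0).fit(genotypes[:150], traits[:150])   # Ridge handles multiple outputs
    pred = model.predict(genotypes[150:])
    print("test MSE per trait:", ((pred - traits[150:]) ** 2).mean(axis=0))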
MRM Evaluation Research Program
NASA Technical Reports Server (NTRS)
Taylor, James C.
1998-01-01
This is an interim report on the current output of the MRM evaluation research program. During 1998 this research program has used new and existing data to create an important tool for the development and improvement of "maintenance resource management" (MRM). Thousands of surveys completed by participants in airline MRM training and/or behavior change programs have, for the first time, been consolidated into a panel of "MRM Attitudes and Opinion Profiles." These profiles can be used to compare the attitudes about decision making and communication in any given company at any stage in its MRM program with attitudes of a large sample of like employees during a similar period in their MRM involvement. This panel of comparison profiles for attitudes and opinions is a tool to help audit the effectiveness of a maintenance human factors program. The profile panel is the first of several tools envisioned for applying the information accumulating in MRM databases produced as one of the program's long range objectives.
NASA Astrophysics Data System (ADS)
Sun, Lianming; Sano, Akira
An output over-sampling based closed-loop identification algorithm is investigated in this paper. Some intrinsic properties of the continuous stochastic noise and of the plant input and output in the over-sampling approach are analyzed, and they are used to demonstrate the identifiability of the over-sampling approach and to evaluate its identification performance. Furthermore, the selection of plant model order, the asymptotic variance of the estimated parameters and the asymptotic variance of the frequency response of the estimated model are also explored. It is shown that the over-sampling approach can guarantee identifiability and greatly improve the performance of closed-loop identification.
Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.
Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas
2004-08-01
The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers that aim at using the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.
VizieR Online Data Catalog: Properties of late M-dwarfs (Janson+, 2014)
NASA Astrophysics Data System (ADS)
Janson, M.; Bergfors, C.; Brandner, W.; Kudryavtseva, N.; Hormuth, F.; Hippler, S.; Henning, T.
2017-03-01
The targets in this study were selected from the Lepine & Gaidos (2011, J/AJ/142/138) sample, where stars with a spectral type (SpT) estimate of M5 or later were selected if they were sufficiently bright (J <= 10.0 mag) and sufficiently far north (>-15°) to be meaningfully observed with AstraLux Norte. In total, this gave an input sample of 408 potential targets, of which 286 were actually observed. All observations in this program were acquired with the AstraLux Norte camera on the 2.2 m telescope at Calar Alto in Spain. The 2.2 m telescope is on an equatorial mount. AstraLux uses an Andor DV887-UVB camera head equipped with a thinned, back-illuminated, electron-multiplying 512 x 512 pixel monolithic CCD. The CCD is equipped with two readout registers, one for conventional readout, and one 536 stage electron multiplication register. Each of the two registers comes with its own output amplifier. All Lucky Imaging data were obtained using the electron multiplication mode, and the associated output amplifier. (3 data files).
The NASA MSFC Earth Global Reference Atmospheric Model-2007 Version
NASA Technical Reports Server (NTRS)
Leslie, F.W.; Justus, C.G.
2008-01-01
Reference or standard atmospheric models have long been used for design and mission planning of various aerospace systems. The NASA/Marshall Space Flight Center (MSFC) Global Reference Atmospheric Model (GRAM) was developed in response to the need for a design reference atmosphere that provides complete global geographical variability, and complete altitude coverage (surface to orbital altitudes) as well as complete seasonal and monthly variability of the thermodynamic variables and wind components. A unique feature of GRAM is that, in addition to providing the geographical, height, and monthly variation of the mean atmospheric state, it includes the ability to simulate spatial and temporal perturbations in these atmospheric parameters (e.g. fluctuations due to turbulence and other atmospheric perturbation phenomena). A summary comparing GRAM features to characteristics and features of other reference or standard atmospheric models can be found in the Guide to Reference and Standard Atmosphere Models. The original GRAM has undergone a series of improvements over the years with recent additions and changes. The software program is called Earth-GRAM2007 to distinguish it from similar programs for other bodies (e.g. Mars, Venus, Neptune, and Titan). However, in order to make this Technical Memorandum (TM) more readable, the software will be referred to simply as GRAM07 or GRAM unless additional clarity is needed. Section 1 provides an overview of the basic features of GRAM07 including the newly added features. Section 2 provides a more detailed description of GRAM07 and how the model output is generated. Section 3 presents sample results. Appendices A and B describe the Global Upper Air Climatic Atlas (GUACA) data and the Global Gridded Upper Air Statistics (GGUAS) database. Appendix C provides instructions for compiling and running GRAM07. Appendix D gives a description of the required NAMELIST format input. Appendix E gives sample output. Appendix F provides a list of available parameters to enable the user to generate special output. Appendix G gives an example and guidance on incorporating GRAM07 as a subroutine in other programs such as trajectory codes or orbital propagation routines.
Sensing device and method for measuring emission time delay during irradiation of targeted samples
NASA Technical Reports Server (NTRS)
Danielson, J. D. Sheldon (Inventor)
2000-01-01
An apparatus for measuring emission time delay during irradiation of targeted samples by utilizing digital signal processing to determine the emission phase shift caused by the sample is disclosed. The apparatus includes a source of electromagnetic radiation adapted to irradiate a target sample. A mechanism generates first and second digital input signals of known frequencies with a known phase relationship, and a device then converts the first and second digital input signals to analog sinusoidal signals. An element is provided to direct the first input signal to the electromagnetic radiation source to modulate the source by the frequency thereof to irradiate the target sample and generate a target sample emission. A device detects the target sample emission and produces a corresponding first output signal having a phase shift relative to the phase of the first input signal, the phase shift being caused by the irradiation time delay in the sample. A member produces a known phase shift in the second input signal to create a second output signal. A mechanism is then provided for converting each of the first and second analog output signals to digital signals. A mixer receives the first and second digital output signals and compares the signal phase relationship therebetween to produce a signal indicative of the change in phase relationship between the first and second output signals caused by the target sample emission. Finally, a feedback arrangement alters the phase of the second input signal based on the mixer signal to ultimately place the first and second output signals in quadrature. Mechanisms for enhancing this phase comparison and adjustment technique are also disclosed.
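The principle of recovering a phase shift by digitally mixing the sampled signal with in-phase and quadrature references can be sketched as follows; the modulation frequency, sampling rate, and phase value are arbitrary, and this illustrates the measurement idea rather than the patented feedback circuit.

    import math

    f_mod, f_samp, n = 1.0e5, 4.0e6, 4000
    phase_true = 0.35                          # radians of delay introduced by the "sample"
    t = [i / f_samp for i in range(n)]
    signal = [math.cos(2 * math.pi * f_mod * ti - phase_true) for ti in t]

    # Digital mixing with cosine and sine references, then phase from the two sums.
    i_sum = sum(s * math.cos(2 * math.pi * f_mod * ti) for s, ti in zip(signal, t))
    q_sum = sum(s * math.sin(2 * math.pi * f_mod * ti) for s, ti in zip(signal, t))
    phase_est = math.atan2(q_sum, i_sum)
    print(phase_est)                           # approximately 0.35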
HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis
Sloto, Ronald A.; Crouse, Michele Y.
1996-01-01
HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
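A simplified version of the local-minimum separation idea is sketched below: local minima of daily flow are connected by straight lines and clamped so base flow never exceeds streamflow. HYSEP's actual rules, including the interval derived from drainage area, are more involved and are documented in the report; the flow values here are made up.

    def local_minimum_baseflow(daily_flow, interval_days):
        """Sketch of local-minimum hydrograph separation: mark days that are the minimum
        within a centered window, interpolate between them, and clamp to streamflow."""
        half = interval_days // 2
        n = len(daily_flow)
        pivots = [i for i in range(n)
                  if daily_flow[i] == min(daily_flow[max(0, i - half):min(n, i + half + 1)])]
        baseflow = daily_flow[:]
        for a, b in zip(pivots, pivots[1:]):
            for i in range(a, b + 1):
                frac = (i - a) / (b - a) if b > a else 0.0
                baseflow[i] = daily_flow[a] + frac * (daily_flow[b] - daily_flow[a])
        return [min(bf, q) for bf, q in zip(baseflow, daily_flow)]

    flows = [12, 11, 10, 30, 80, 55, 28, 18, 14, 12, 11, 10, 9, 9, 25, 40, 22, 15, 12, 10]
    print(local_minimum_baseflow(flows, interval_days=5))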
NASA Technical Reports Server (NTRS)
Geyser, L. C.
1978-01-01
A digital computer program, DYGABCD, was developed that generates linearized, dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, that is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state space matrix descriptions of the system are often desirable. DYGABCD computes the state space matrices commonly referred to as the A, B, C, and D matrices required for a linear system description. The report discusses the analytical approach and provides a user's manual, FORTRAN listings, and a sample case.
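Generating A, B, C, and D matrices from a nonlinear simulation is commonly done by perturbing the states and inputs about an operating point; the generic finite-difference sketch below illustrates that idea on a toy two-state model and does not reproduce DYGABCD's internal method.

    import numpy as np

    def linearize(f, g, x0, u0, eps=1e-6):
        """f(x, u) -> state derivatives, g(x, u) -> outputs; returns (A, B, C, D)
        by one-sided finite differences about the operating point (x0, u0)."""
        n, m = len(x0), len(u0)
        p = len(g(x0, u0))
        A, B = np.zeros((n, n)), np.zeros((n, m))
        C, D = np.zeros((p, n)), np.zeros((p, m))
        for j in range(n):
            dx = np.zeros(n); dx[j] = eps
            A[:, j] = (f(x0 + dx, u0) - f(x0, u0)) / eps
            C[:, j] = (g(x0 + dx, u0) - g(x0, u0)) / eps
        for j in range(m):
            du = np.zeros(m); du[j] = eps
            B[:, j] = (f(x0, u0 + du) - f(x0, u0)) / eps
            D[:, j] = (g(x0, u0 + du) - g(x0, u0)) / eps
        return A, B, C, D

    # Toy two-state "engine": spool speed lags fuel flow; output is thrust-like.
    f = lambda x, u: np.array([-2.0 * x[0] + 1.5 * u[0], 0.5 * x[0] - 1.0 * x[1]])
    g = lambda x, u: np.array([3.0 * x[1] + 0.2 * u[0]])
    print(linearize(f, g, x0=np.array([1.0, 0.5]), u0=np.array([1.0])))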
Programmed Evolution for Optimization of Orthogonal Metabolic Output in Bacteria
Eckdahl, Todd T.; Campbell, A. Malcolm; Heyer, Laurie J.; Poet, Jeffrey L.; Blauch, David N.; Snyder, Nicole L.; Atchley, Dustin T.; Baker, Erich J.; Brown, Micah; Brunner, Elizabeth C.; Callen, Sean A.; Campbell, Jesse S.; Carr, Caleb J.; Carr, David R.; Chadinha, Spencer A.; Chester, Grace I.; Chester, Josh; Clarkson, Ben R.; Cochran, Kelly E.; Doherty, Shannon E.; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M.; Evans, Rebecca A.; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L.; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L.; Keffeler, Erica C.; Lantz, Andrew J.; Lim, Jonathan N.; McGuire, Erin P.; Moore, Alexander K.; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A.; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E.; Polpityaarachchige, Sachith; Quaney, Michael J.; Slattery, Abagael; Smith, Kathryn E.; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J.; Whitesides, E. Tucker
2015-01-01
Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields – evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy, pharmaceuticals, chemical commodities, biomining, and bioremediation. PMID:25714374
PLASIM: A computer code for simulating charge exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Deininger, W. D.; Winder, D. R.; Kaufman, H. R.
1982-01-01
The propagation of the charge exchange plasma for an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, is described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
What Can Quantum Optics Say about Computational Complexity Theory?
NASA Astrophysics Data System (ADS)
Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.
2015-02-01
Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from both quantum theory and the computational complexity theory point of view. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.
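As a small, concrete reference for the quantity discussed above, the sketch below evaluates a matrix permanent with Ryser's inclusion-exclusion formula and applies it to a randomly generated positive-semidefinite Hermitian matrix. It only illustrates the mathematical object involved; it is not the paper's photon-counting probability formula or its approximation algorithm.

```python
import itertools
import numpy as np

def permanent(a):
    """Permanent of an n-by-n matrix via Ryser's formula (O(n^2 * 2^n) in this direct form)."""
    n = a.shape[0]
    total = 0.0 + 0.0j
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            rowsums = a[:, list(cols)].sum(axis=1)
            total += (-1) ** r * np.prod(rowsums)
    return (-1) ** n * total

rng = np.random.default_rng(0)
m = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
a = m @ m.conj().T                      # positive-semidefinite Hermitian by construction
print(permanent(a))                     # up to round-off, real and non-negative
```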
Johnsen Lind, Andreas; Helge Johnsen, Bjorn; Hill, Labarron K; Sollers Iii, John J; Thayer, Julian F
2011-01-01
The aim of the present manuscript is to present a user-friendly and flexible platform for transforming Kubios HRV output files to an .xls-file format, used by MS Excel. The program utilizes either native or bundled Java and is platform-independent and mobile. This means that it can run without being installed on a computer. It also has an option of continuous transferring of data indicating that it can run in the background while Kubios produces output files. The program checks for changes in the file structure and automatically updates the .xls- output file.
ERIC Educational Resources Information Center
Wiley, Caroline H.; Good, Thomas L.; McCaslin, Mary
2008-01-01
Background/Context: The achievement effects of Comprehensive School Reform (CSR) programs have been studied through the use of input-output models, in which type of CSR program is the input and student achievement is the output. Although specific programs have been found to be more effective and evaluated more than others, teaching practices in…
A high-fidelity satellite ephemeris program for Earth satellites in eccentric orbits
NASA Technical Reports Server (NTRS)
Simmons, David R.
1990-01-01
A program for mission planning, the Analytic Satellite Ephemeris Program (ASEP), produces projected data for orbits that remain fairly close to the Earth. ASEP does not take into account lunar and solar perturbations. These perturbations are accounted for in another program called GRAVE, which incorporates more flexible means of input for initial data, provides additional kinds of output information, and makes use of structured programming techniques to make the program more understandable and reliable. GRAVE was revised, and a new program called ORBIT was developed. It is divided into three major phases: initialization, integration, and output. Results of the program development are presented.
NASA Technical Reports Server (NTRS)
Walatka, Pamela P.; Buning, Pieter G.; Pierce, Larry; Elson, Patricia A.
1990-01-01
PLOT3D is a computer graphics program designed to visualize the grids and solutions of computational fluid dynamics. Seventy-four functions are available. Versions are available for many systems. PLOT3D can handle multiple grids with a million or more grid points, and can produce varieties of model renderings, such as wireframe or flat shaded. Output from PLOT3D can be used in animation programs. The first part of this manual is a tutorial that takes the reader, keystroke by keystroke, through a PLOT3D session. The second part of the manual contains reference chapters, including the helpfile, data file formats, advice on changing PLOT3D, and sample command files.
NASA Technical Reports Server (NTRS)
Barrett, C. E.; Presler, A. F.
1976-01-01
A FORTRAN computer program (COREST) was developed to analyze the high-temperature paralinear oxidation behavior of metals. It is based on a mass-balance approach and uses typical gravimetric input data. COREST was applied to predominantly Cr2O3-forming alloys tested isothermally for long times. These alloys behaved paralinearly above 1100 C as a result of simultaneous scale formation and scale vaporization. Output includes the pertinent formation and vaporization constants and kinetic values of interest. COREST also estimates specific sample weight and specific scale weight as a function of time. Most importantly, from a corrosion standpoint, it estimates specific metal loss.
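For readers unfamiliar with paralinear kinetics, the sketch below integrates the commonly used rate form in which the retained scale grows parabolically while it simultaneously vaporizes linearly. The rate constants, the small non-zero starting value, and the Cr2O3 stoichiometric factor are illustrative assumptions, not COREST's internal equations or units.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed paralinear rate form: dWs/dt = k_p / (2 Ws) - k_v, where Ws is the
# retained-scale specific weight, k_p a parabolic scale-growth constant, and
# k_v a linear scale-vaporization constant (hypothetical values).
k_p, k_v = 0.5, 0.05

def dWs_dt(t, Ws):
    return [k_p / (2.0 * Ws[0]) - k_v]

t_end = 200.0
# start slightly above zero to avoid the 1/Ws singularity at t = 0
sol = solve_ivp(dWs_dt, (0.0, t_end), [1e-3], dense_output=True, max_step=1.0)
t = np.linspace(0.0, t_end, 201)
Ws = sol.sol(t)[0]
W_vaporized = k_v * t                       # cumulative scale lost to vaporization
metal_loss = 0.684 * (Ws + W_vaporized)     # 0.684 = mass fraction of Cr in Cr2O3
print(f"retained scale at {t_end:.0f} h: {Ws[-1]:.3f}; estimated metal loss: {metal_loss[-1]:.3f}")
```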
Aerodynamic preliminary analysis system 2. Part 1: Theory
NASA Technical Reports Server (NTRS)
Bonner, E.; Clever, W.; Dunn, K.
1981-01-01
A subsonic/supersonic/hypersonic aerodynamic analysis was developed by integrating the Aerodynamic Preliminary Analysis System (APAS) and the inviscid force calculation modules of the Hypersonic Arbitrary Body Program. The APAS analysis was extended for nonlinear vortex forces using a generalization of the Polhamus analogy. The interactive system provides appropriate aerodynamic models for a single input geometry data base and has a run/output format similar to a wind tunnel test program. The user's manual is organized to cover the principal system activities of a typical application: geometric input/editing, aerodynamic evaluation, and post-analysis review/display. Sample sessions are included to illustrate the specific tasks involved and are followed by a comprehensive command/subcommand dictionary used to operate the system.
Simulator for multilevel optimization research
NASA Technical Reports Server (NTRS)
Padula, S. L.; Young, K. C.
1986-01-01
A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.
Sampling estimators of total mill receipts for use in timber product output studies
John P. Brown; Richard G. Oderwald
2012-01-01
Data from the 2001 timber product output study for Georgia was explored to determine new methods for stratifying mills and finding suitable sampling estimators. Estimators for roundwood receipts totals comprised several types: simple random sample, ratio, stratified sample, and combined ratio. Two stratification methods were examined: the Dalenius-Hodges (DH) square...
Baumes, Laurent A
2006-01-01
One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few new papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous covering, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets for which an efficient learning is hardly carried out, and interesting but rare classes are usually unrecognized. Here is suggested a new iterative algorithm for the characterization of the search space structure, working independently of learning processes. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" space zones to "unsteady" ones which necessitate more experiments to be well-modeled. The evaluation of new algorithm attempts through benchmarks is compulsory due to the lack of past proofs about their efficiency. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only empirically evaluated, the effect or efficiency of sampling on future Machine Learning performances is also quantified. The minimum sample size required by the algorithm for being statistically discriminated from simple random sampling is investigated.
An interface for the direct coupling of small liquid samples to AMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ognibene, T. J.; Thomas, A. T.; Daley, P. F.
We describe the moving wire interface attached to the 1-MV AMS system at LLNL’s Center for Accelerator Mass Spectrometry for the analysis of nonvolatile liquid samples as either discrete drops or from the direct output of biochemical separatory instrumentation, such as high-performance liquid chromatography (HPLC). Discrete samples containing at least a few tens of nanograms of carbon and as little as 50 zmol 14C can be measured with a 3–5% precision in a few minutes. The dynamic range of our system spans approximately 3 orders of magnitude. Sample-to-sample memory is minimized by the use of fresh targets for each discrete sample or by minimizing the amount of carbon present in a peak generated by an HPLC containing a significant amount of 14C. As a result, liquid sample AMS provides a new technology to expand our biomedical AMS program by enabling the capability to measure low-level biochemicals in extremely small samples that would otherwise be inaccessible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tauke-Pedretti, Anna; Skogen, Erik J; Vawter, Gregory A
An optical sampler includes first and second 1×n optical beam splitters splitting an input optical sampling signal and an optical analog input signal into n parallel channels, respectively; a plurality of optical delay elements providing n parallel delayed input optical sampling signals; n photodiodes converting the n parallel optical analog input signals into n respective electrical output signals; and n optical modulators modulating the input optical sampling signal or the optical analog input signal by the respective electrical output signals and providing n successive optical samples of the optical analog input signal. A plurality of output photodiodes and eADCs convert the n successive optical samples to n successive digital samples. The optical modulator may be a photodiode-interconnected Mach-Zehnder modulator. A method of sampling the optical analog input signal is disclosed.
NASA Technical Reports Server (NTRS)
Vadyak, J.; Hoffman, J. D.
1982-01-01
A computer program was developed which is capable of calculating the flow field in the supersonic portion of a mixed compression aircraft inlet operating at angle of attack. The supersonic core flow is computed using a second-order three dimensional method-of-characteristics algorithm. The bow shock and the internal shock train are treated discretely using a three dimensional shock fitting procedure. The boundary layer flows are computed using a second-order implicit finite difference method. The shock wave-boundary layer interaction is computed using an integral formulation. The general structure of the computer program is discussed, and a brief description of each subroutine is given. All program input parameters are defined, and a brief discussion on interpretation of the output is provided. A number of sample cases, complete with data listings, are provided.
Life and dynamic capacity modeling for aircraft transmissions
NASA Technical Reports Server (NTRS)
Savage, Michael
1991-01-01
A computer program to simulate the dynamic capacity and life of parallel shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capacity, ninety-percent-reliability life, and mean life of each component and of the transmission as a system. Here, the program, its input and output files, and the theory behind the operation of the program are described.
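As background on how component results can roll up into the system values reported by such a program, the sketch below combines component 90-percent-reliability lives under the common assumption of two-parameter Weibull life distributions sharing one slope. The slope value and example lives are hypothetical, and this is not necessarily the exact combination rule coded in the program.

```python
def system_life_90(component_lives_90, weibull_slope=2.5):
    """System 90%-reliability life from component 90%-reliability lives,
    assuming independent failures and a common Weibull slope."""
    e = weibull_slope
    return sum(L ** (-e) for L in component_lives_90) ** (-1.0 / e)

# hypothetical gear and bearing lives (hours) for one reduction
print(system_life_90([12000.0, 15000.0, 9000.0, 20000.0]))
```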
Processing Device for High-Speed Execution of an Xrisc Computer Program
NASA Technical Reports Server (NTRS)
Ng, Tak-Kwong (Inventor); Mills, Carl S. (Inventor)
2016-01-01
A processing device for high-speed execution of a computer program is provided. A memory module may store one or more computer programs. A sequencer may select one of the computer programs and controls execution of the selected program. A register module may store intermediate values associated with a current calculation set, a set of output values associated with a previous calculation set, and a set of input values associated with a subsequent calculation set. An external interface may receive the set of input values from a computing device and provides the set of output values to the computing device. A computation interface may provide a set of operands for computation during processing of the current calculation set. The set of input values are loaded into the register and the set of output values are unloaded from the register in parallel with processing of the current calculation set.
Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual
NASA Technical Reports Server (NTRS)
Emmen, R. D.; Tischler, M. B.
1982-01-01
The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxberry, Geoffrey
Google Test MPI Listener is a plugin for the Google Test C++ unit testing library that organizes the test output of software that uses both the MPI parallel programming model and Google Test. Typically, such output is arbitrarily ordered and disorganized, making test output difficult to interpret. This plugin organizes output in MPI rank order, enabling easy interpretation of test results.
User's guide to the Radiometric Age Data Bank (RADB)
Zartman, Robert Eugene; Cole, James C.; Marvin, Richard F.
1976-01-01
The Radiometric Age Data Bank (RADB) has been established by the U.S. Geological Survey, as a means for collecting and organizing the estimated 100,000 radiometric ages presently published for the United States. RADB has been constructed such that a complete sample description (location, rock type, etc.), literature citation, and extensive analytical data are linked to form an independent record for each sample reported in a published work. Analytical data pertinent to the potassium-argon, rubidium-strontium, uranium-thorium-lead, lead-alpha, and fission-track methods can be accommodated, singly or in combinations, for each record. Data processing is achieved using the GIPSY program (University of Oklahoma) which maintains the data file and builds, updates, searches, and prints the records using simple yet versatile command statements. Searching and selecting records is accomplished by specifying the presence, absence, or (numeric or alphabetic) value of any element of information in the data bank, and these specifications can be logically linked to develop sophisticated searching strategies. Output is available in the form of complete data records, abbreviated tests, or columnar tabulations. Samples of data-reporting forms, GIPSY command statements, output formats, and data records are presented to illustrate the comprehensive nature and versatility of the Radiometric Age Data Bank.
Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua
2011-07-01
In this paper, a digital redesign methodology of the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not be presented by the analytic reference model initially. To overcome the interference of each sub-system and simplify the controller design, the proposed model reference decentralized adaptive control scheme constructs a decoupled well-designed reference model first. Then, according to the well-designed model, this paper develops a digital decentralized adaptive tracker based on the optimal analog control and prediction-based digital redesign technique for the sampled-data large-scale coupling system. In order to enhance the tracking performance of the digital tracker at specified sampling instants, we apply the iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. Besides, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
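The idea of training the control input by continual learning can be illustrated with a plain P-type iterative learning control update on a toy first-order sampled-data plant; the sketch below is a simplification with hypothetical numbers, not the paper's decentralized adaptive tracker with prediction-based digital redesign.

```python
import numpy as np

N = 50                                     # samples per trial
r = np.sin(np.linspace(0.0, np.pi, N))     # reference trajectory to track
u = np.zeros(N)                            # learned control input
gamma = 0.8                                # learning gain

def run_trial(u):
    """Toy first-order SISO plant: x[k+1] = 0.9 x[k] + 0.5 u[k], y[k] = x[k]."""
    x, y = 0.0, np.zeros(N)
    for k in range(N):
        y[k] = x
        x = 0.9 * x + 0.5 * u[k]
    return y

for trial in range(30):
    e = r - run_trial(u)
    u[:-1] += gamma * e[1:]                # P-type ILC: u_{j+1}[k] = u_j[k] + gamma e_j[k+1]

print(np.max(np.abs(r - run_trial(u))))    # tracking error shrinks trial by trial
```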
NASA Technical Reports Server (NTRS)
Fox, S. R.; Smetana, F. O.
1980-01-01
The listings, user's instructions, sample inputs, and sample outputs of two computer programs which are especially useful in obtaining an approximate solution of the viscous flow over an arbitrary nonlifting three-dimensional body are provided. The first program performs a potential flow solution by a well-known panel method and readjusts this initial solution to account for the effects of the boundary layer displacement thickness, a nonuniform but unidirectional onset flow field, and the presence of air intakes and exhausts. The second program is essentially a geometry package which allows the user to change or refine the shape of a body to satisfy particular needs without a significant amount of human intervention. Also presented is an effort to reduce the cruise drag of light aircraft through an analytical study of the drag contributions arising from the engine cowl shape and the forward fuselage area, as well as that resulting from the cooling air mass flowing through intake and exhaust sites on the nacelle. The programs may be effectively used to determine the appropriate body modifications or flow port locations to reduce the cruise drag as well as to provide sufficient air flow for cooling the engine.
Predictive sensor method and apparatus
NASA Technical Reports Server (NTRS)
Nail, William L. (Inventor); Koger, Thomas L. (Inventor); Cambridge, Vivien (Inventor)
1990-01-01
A predictive algorithm is used to determine, in near real time, the steady-state response of a slow-responding sensor, such as a hydrogen gas sensor of the type that produces an output current proportional to the partial pressure of the hydrogen present. A microprocessor connected to the sensor samples the sensor output at small, regular time intervals and predicts the steady-state response of the sensor to a perturbation in the parameter being sensed, based on the beginning and end samples of the sensor output for the current sample time interval.
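One standard way to implement such a prediction, if the sensor is assumed to behave like a first-order lag, is to extrapolate the steady-state value in closed form from three equally spaced samples. The sketch below shows that extrapolation; the first-order model and the numbers are assumptions for illustration, not the patented algorithm itself.

```python
import math

def predict_steady_state(y1, y2, y3):
    """Steady-state estimate from three equally spaced samples of an assumed
    first-order response y(t) = y_ss + (y0 - y_ss) * exp(-t / tau)."""
    denom = y1 + y3 - 2.0 * y2
    if abs(denom) < 1e-12:          # response already flat (or not exponential)
        return y3
    return (y1 * y3 - y2 * y2) / denom

# sensor with a 30 s time constant settling toward 5.0, sampled 1 s apart
y_ss, y0, tau = 5.0, 0.0, 30.0
samples = [y_ss + (y0 - y_ss) * math.exp(-t / tau) for t in (10.0, 11.0, 12.0)]
print(predict_steady_state(*samples))   # ~5.0, long before the sensor settles
```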
Multidimensional System Analysis of Electro-Optic Sensors with Sampled Deterministic Output.
1987-12-18
System descriptions of scanning and staring electro-optic sensors with sampled output are developed as follows. Functions representing image...to complete the system descriptions. The results should be useful for designing electro-optic sensor systems and correcting data for instrumental...effects and other experimental conditions. Keywords include: electro-optic system analysis, scanning sensors, staring sensors, spatial sampling, and temporal sampling.
Optical analog-to-digital converter
Vawter, G. Allen (Corrales, NM); Raring, James (Goleta, CA); Skogen, Erik J. (Albuquerque, NM)
2009-07-21
An optical analog-to-digital converter (ADC) is disclosed which converts an input optical analog signal to an output optical digital signal at a sampling rate defined by a sampling optical signal. Each bit of the digital representation is separately determined using an optical waveguide interferometer and an optical thresholding element. The interferometer uses the optical analog signal and the sampling optical signal to generate a sinusoidally-varying output signal using cross-phase-modulation (XPM) or a photocurrent generated from the optical analog signal. The sinusoidally-varying output signal is then digitized by the thresholding element, which includes a saturable absorber or at least one semiconductor optical amplifier, to form the optical digital signal which can be output either in parallel or serially.
Software development guidelines
NASA Technical Reports Server (NTRS)
Kovalevsky, N.; Underwood, J. M.
1979-01-01
Analysis, modularization, flowcharting, existing programs and subroutines, compatibility, input and output data, adaptability to checkout, and general-purpose subroutines are summarized. Statement ordering and numbering, specification statements, variable names, arrays, arithmetical expressions and statements, control statements, input/output, and subroutines are outlined. Intermediate results, desk checking, checkout data, dumps, storage maps, diagnostics, and program timing are reviewed.
NASA Technical Reports Server (NTRS)
Slaby, J. G.
1986-01-01
Free piston Stirling technology is applicable for both solar and nuclear powered systems. As such, the Lewis Research Center serves as the project office to manage the newly initiated SP-100 Advanced Technology Program. This five year program provides the technology push for providing significant component and subsystem options for increased efficiency, reliability and survivability, and power output growth at reduced specific mass. One of the major elements of the program is the development of advanced power conversion concepts of which the Stirling cycle is a viable candidate. Under this program the research findings of the 25 kWe opposed piston Space Power Demonstrator Engine (SPDE) are presented. Included in the SPDE discussions are initial differences between predicted and experimental power outputs and power output influenced by variations in regenerators. Projections are made for future space power requirements over the next few decades. And a cursory comparison is presented showing the mass benefits that a Stirling system has over a Brayton system for the same peak temperature and output power.
A high-fidelity N-body ephemeris generator for satellites in Earth orbit
NASA Astrophysics Data System (ADS)
Simmons, David R.
1991-10-01
A program is currently used for mission planning called the Analytic Satellite Ephemeris Program (ASEP), which produces projected data for orbits that remain fairly close to Earth. Lunar and solar perturbations are taken into account in another program called GRAVE. This project is a revision of GRAVE which incorporates more flexible means of input for initial data, provides additional kinds of output information, and makes use of structured programming techniques to make the program more understandable and reliable. The computer program ORBIT was tested against tracking data for the first 313 days of operation of the CRRES satellite. A sample graph is given comparing the semi-major axis calculated by the program with the values supplied by NORAD. When calculated for points at which CRRES passes through the ascending node, the argument of perigee, the right ascension of the ascending node, and the mean anomaly all stay within about a degree of the corresponding values from NORAD; the inclination of the orbital plane is much closer. The program value of the eccentricity is in error by no more than 0.0002.
Elemental analysis using temporal gating of a pulsed neutron generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitra, Sudeep
Technologies related to determining elemental composition of a sample that comprises fissile material are described herein. In a general embodiment, a pulsed neutron generator periodically emits bursts of neutrons, and is synchronized with an analyzer circuit. The bursts of neutrons are used to interrogate the sample, and the sample outputs gamma rays based upon the neutrons impacting the sample. A detector outputs pulses based upon the gamma rays impinging upon the material of the detector, and the analyzer circuit assigns the pulses to temporally-based bins based upon the analyzer circuit being synchronized with the pulsed neutron generator. A computing device outputs data that is indicative of elemental composition of the sample based upon the binned pulses.
Evaluation of a women group led health communication program in Haryana, India.
Kaur, Manmeet; Jaswal, Nidhi; Saddi, Anil Kumar
2017-12-01
The Sakshar Mahila Smooh (SMS) program was launched in rural areas of Haryana in India during 2008. A total of 6788 SMSs, each having 5-10 literate women, were equipped to enhance health communication. We carried out a process evaluation of this program as an external agency. After a review of program documents, a random sample survey of Auxiliary Nurse Midwives (ANMs), SMS members, and village women was conducted. Out of the four divisions of the state, one, comprising five districts, was randomly chosen. From 330 randomly chosen villages, 283 ANMs, 1164 SMS members, and 1123 village women were interviewed using a semi-structured interview schedule. Program inputs, processes, and outputs were compared across the five districts. The chi-square test was used for significance testing. In the sampled division, 1732 of 2009 villages (86%) had a functional SMS. In three years, the SMSs conducted 15036 group meetings, 2795 rallies, 2048 wall writings, and 803 competitions, and 44.5% of the allocated budget was utilized. Most ANMs opined that SMSs are better health communicators. SMS members were aware of their roles and responsibilities. The majority of village women reported that SMSs carry out useful health education activities. The characteristics of SMS members were similar, but program performance was better in districts where health managers were proactive in program planning and monitoring. The SMS program has communicated health messages to the majority of the rural population; however, better planning and monitoring can improve program performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
Orbital Maneuvering Engine Feed System Coupled Stability Investigation, Computer User's Manual
NASA Technical Reports Server (NTRS)
Schuman, M. D.; Fertig, K. W.; Hunting, J. K.; Kahn, D. R.
1975-01-01
An operating manual for the feed system coupled stability model was given, in partial fulfillment of a program designed to develop, verify, and document a digital computer model that can be used to analyze and predict engine/feed system coupled instabilities in pressure-fed storable propellant propulsion systems over a frequency range of 10 to 1,000 Hz. The first section describes the analytical approach to modelling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure, and presents the governing equations in each of the technical areas. This is followed by the program user's guide, which is a complete description of the structure and operation of the computerized model. Last, appendices provide an alphabetized FORTRAN symbol table, detailed program logic diagrams, computer code listings, and sample case input and output data listings.
Parallel processor for real-time structural control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tise, B.L.
1992-01-01
A parallel processor that is optimized for real-time linear control has been developed. This modular system consists of A/D modules, D/A modules, and floating-point processor modules. The scalable processor uses up to 1,000 Motorola DSP96002 floating-point processors for a peak computational rate of 60 GFLOPS. Sampling rates up to 625 kHz are supported by this analog-in to analog-out controller. The high processing rate and parallel architecture make this processor suitable for computing state-space equations and other multiply/accumulate-intensive digital filters. Processor features include 14-bit conversion devices, low input-output latency, 240 Mbyte/s synchronous backplane bus, low-skew clock distribution circuit, VME connection to host computer, parallelizing code generator, and look-up-tables for actuator linearization. This processor was designed primarily for experiments in structural control. The A/D modules sample sensors mounted on the structure and the floating-point processor modules compute the outputs using the programmed control equations. The outputs are sent through the D/A module to the power amps used to drive the structure's actuators. The host computer is a Sun workstation. An Open Windows-based control panel is provided to facilitate data transfer to and from the processor, as well as to control the operating mode of the processor. A diagnostic mode is provided to allow stimulation of the structure and acquisition of the structural response via sensor inputs.
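The state-space equations such a controller evaluates each sample period reduce to two matrix multiply/accumulate operations. The sketch below shows that per-sample computation with small hypothetical matrices; the real machine distributes these products across many DSPs and runs at far higher sampling rates.

```python
import numpy as np

# hypothetical discrete-time controller: 2 states, 1 input, 1 output
A = np.array([[0.95, 0.10], [-0.10, 0.90]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

x = np.zeros((2, 1))
outputs = []
for u_sample in [1.0, 1.0, 1.0, 0.0, 0.0]:    # stand-in for A/D readings
    u = np.array([[u_sample]])
    y = C @ x + D @ u                         # value sent to the D/A converter
    x = A @ x + B @ u                         # state update for the next sample
    outputs.append(float(y[0, 0]))
print(outputs)
```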
An interactive graphics program for manipulation and display of panel method geometry
NASA Technical Reports Server (NTRS)
Hall, J. F.; Neuhart, D. H.; Walkley, K. B.
1983-01-01
Modern aerodynamic panel methods that handle large, complex geometries have made evident the need to interactively manipulate, modify, and view such configurations. With this purpose in mind, the GEOM program was developed. It is a menu driven, interactive program that uses the Tektronix PLOT 10 graphics software to display geometry configurations which are characterized by an abutting set of networks. These networks are composed of quadrilateral panels which are described by the coordinates of their corners. GEOM is divided into fourteen executive controlled functions. These functions are used to build configurations, scale and rotate networks, transpose networks defining M and N lines, graphically display selected networks, join and split networks, create wake networks, produce symmetric images of networks, repanel and rename networks, display configuration cross sections, and output network geometry in two formats. A data base management system is used to facilitate data transfers in this program. A sample session illustrating various capabilities of the code is included as a guide to program operation.
NASA Technical Reports Server (NTRS)
Saltsman, James F.
1992-01-01
This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This user's manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, A.B.; Wackerle, J.
1983-07-01
This report describes a package of five computer codes for analyzing stress-gauge data from shock-wave experiments on reactive materials. The aim of the analysis is to obtain rate laws from experiment. A Lagrangian analysis of the stress records, performed by program LANAL, provides flow histories of particle velocity, density, and energy. Three postprocessing programs, LOOKIT, LOOK1, and LOOK2, are included in the package of codes for producing graphical output of the results of LANAL. Program RATE uses the flow histories in conjunction with an equation of state to calculate reaction-rate histories. RATE can be programmed to examine correlations between the rate histories and thermodynamic variables. Observed correlations can be incorporated into an appropriately parameterized rate law. Program RATE determines the values of these parameters that best reproduce the observed rate histories. The procedure is illustrated with a sample problem.
Eddy Current Method for Fatigue Testing
NASA Technical Reports Server (NTRS)
Simpson, John W. (Inventor); Fulton, James P. (Inventor); Wincheski, Russell A. (Inventor); Todhunter, Ronald G. (Inventor); Namkung, Min (Inventor); Nath, Shridhar C. (Inventor)
1997-01-01
A flux-focusing electromagnetic sensor using a ferromagnetic flux-focusing lens simplifies inspections and increases the detectability of fatigue cracks and material loss in high-conductivity material. A ferrous shield isolates a high-turn pick-up coil from an excitation coil. Use of the magnetic shield produces a null voltage output across the receiving coil in the presence of an unflawed sample. Redistribution of the current flow in the sample caused by the presence of flaws eliminates the shielding condition, and a large output voltage is produced, yielding a clear, unambiguous flaw signal. Maximum sensor output is obtained when the sensor is positioned symmetrically above the crack. By finding the position of maximum sensor output, it is possible to track the fault and locate the area surrounding its tip. Accuracy of tip location is enhanced by two unique features of the sensor: a very high signal-to-noise ratio of the probe's output, resulting in an extremely smooth signal peak across the fault, and a rapidly decaying sensor output outside a small area surrounding the crack tip, enabling the search region to be clearly defined. Under low-frequency operation, material thinning due to corrosion causes incomplete shielding of the pick-up coil. The low-frequency output voltage of the probe is therefore a direct indicator of the thickness of the test sample. Fatigue testing a conductive material is accomplished by applying load to the material, applying current to the sensor, scanning the material with the sensor, monitoring the sensor output signal, adjusting the material load based on the sensor output signal, and adjusting the position of the sensor based on its output signal.
ASAP- ARTIFICIAL SATELLITE ANALYSIS PROGRAM
NASA Technical Reports Server (NTRS)
Kwok, J.
1994-01-01
The Artificial Satellite Analysis Program (ASAP) is a general orbit prediction program which incorporates sufficient orbit modeling accuracy for mission design, maneuver analysis, and mission planning. ASAP is suitable for studying planetary orbit missions with spacecraft trajectories of reconnaissance (flyby) and exploratory (mapping) nature. Sample data is included for a geosynchronous station drift cycle study, a Venus radar mapping strategy, a frozen orbit about Mars, and a repeat ground trace orbit. ASAP uses Cowell's method in the numerical integration of the equations of motion. The orbital mechanics calculation contains perturbations due to non-sphericity (up to a 40 X 40 field) of the planet, lunar and solar effects, and drag and solar radiation pressure. An 8th order Runge-Kutta integration scheme with variable step size control is used for efficient propagation. The input includes the classical osculating elements, orbital elements of the sun relative to the planet, reference time and dates, drag coefficient, gravitational constants, and planet radius, rotation rate, etc. The printed output contains Cartesian coordinates, velocity, equinoctial elements, and classical elements for each time step or event step. At each step, selected output is added to a plot file. The ASAP package includes a program for sorting this plot file. LOTUS 1-2-3 is used in the supplied examples to graph the results, but any graphics software package could be used to process the plot file. ASAP is not written to be mission-specific. Instead, it is intended to be used for most planetary orbiting missions. As a consequence, the user has to have some basic understanding of orbital mechanics to provide the correct input and interpret the subsequent output. ASAP is written in FORTRAN 77 for batch execution and has been implemented on an IBM PC compatible computer operating under MS-DOS. The ASAP package requires a math coprocessor and a minimum of 256K RAM. This program was last updated in 1988 with version 2.03. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Lotus and 1-2-3 are registered trademarks of Lotus Development Corporation.
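To make the Cowell-method idea concrete, the sketch below propagates one Earth orbit with two-body gravity plus only the J2 zonal term, using SciPy's 8th-order variable-step Runge-Kutta integrator. The force model, initial state, and tolerances are deliberate simplifications of what ASAP handles (a 40 x 40 field, third bodies, drag, and solar radiation pressure).

```python
import numpy as np
from scipy.integrate import solve_ivp

MU, RE, J2 = 398600.4418, 6378.137, 1.08263e-3   # km^3/s^2, km, dimensionless

def accel(t, s):
    """Cowell-form equations of motion: point-mass gravity plus J2."""
    r, v = s[:3], s[3:]
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3
    z2 = (r[2] / rn) ** 2
    f = 1.5 * J2 * MU * RE**2 / rn**5
    a += f * r * (5.0 * z2 - np.array([1.0, 1.0, 3.0]))
    return np.concatenate([v, a])

# roughly circular 700-km orbit at 98-degree inclination (illustrative state)
r0 = [RE + 700.0, 0.0, 0.0]
v0 = [0.0, 7.5 * np.cos(np.radians(98.0)), 7.5 * np.sin(np.radians(98.0))]
sol = solve_ivp(accel, (0.0, 86400.0), r0 + v0, method="DOP853", rtol=1e-10, atol=1e-12)
print(sol.y[:3, -1])    # position (km) after one day
```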
A computer program for helicopter rotor noise using Lowson's formula in the time domain
NASA Technical Reports Server (NTRS)
Parks, C. L.
1975-01-01
A computer program (D3910) was developed to calculate both the far field and near field acoustic pressure signature of a tilted rotor in hover or uniform forward speed. The analysis, carried out in the time domain, is based on Lowson's formulation of the acoustic field of a moving force. The digital computer program is described, including methods used in the calculations, a flow chart, program D3910 source listing, instructions for the user, and two test cases with input and output listings and output plots.
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II, (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for techno logy development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control 2) the environment models such as atmosphere and gravity, and 3) a simulation framework which is responsible for executing the vehicle and environment models and propagating the vehicle s states forward in time and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three-steps; editing existing input data files, running MAVERIC, and plotting output results.
User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs
Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.
2008-01-01
This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
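To illustrate the raster hand-off these utilities enable, the sketch below writes a 2-D model array to the ESRI ASCII grid format that ArcGIS can convert to a raster data set, assuming square cells aligned with the projected coordinate system. The array here is random stand-in data; the documented utilities read MODFLOW binary and Multiplier files directly rather than arbitrary arrays.

```python
import numpy as np

def write_esri_ascii(path, array, xll, yll, cellsize, nodata=-9999.0):
    """Write a 2-D array as an ESRI ASCII grid (importable with ArcGIS 'ASCII to Raster')."""
    rows, cols = array.shape
    with open(path, "w") as f:
        f.write(f"ncols {cols}\nnrows {rows}\n")
        f.write(f"xllcorner {xll}\nyllcorner {yll}\n")
        f.write(f"cellsize {cellsize}\nNODATA_value {nodata}\n")
        np.savetxt(f, np.where(np.isnan(array), nodata, array), fmt="%.6g")

# stand-in for a simulated head array (40 rows x 60 columns, 100-m cells)
heads = np.random.default_rng(0).uniform(95.0, 105.0, size=(40, 60))
write_esri_ascii("heads_layer1.asc", heads, xll=500000.0, yll=4400000.0, cellsize=100.0)
```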
Energy use in the marine transportation industry. Task II. Efficiency improvements. Draft report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-06-02
Research and development areas that hold promise for maritime energy conservation are identified and evaluated. The methodology used in the evaluation of potential research areas and results, conclusions, and recommendations are presented. Fifteen programs are identified in four generic technologies and these are discussed in detail in appendices A-D. The areas are: main propulsion plants, propulsors, hydrodynamics, and vessel operations. Fuels are discussed briefly in appendix E. Additional information is presented on the generic US flag baseline operational and cost parameters; a sample output model is presented. (MCW)
Web-based emergency response exercise management systems and methods thereof
Goforth, John W.; Mercer, Michael B.; Heath, Zach; Yang, Lynn I.
2014-09-09
According to one embodiment, a method for simulating portions of an emergency response exercise includes generating situational awareness outputs associated with a simulated emergency and sending the situational awareness outputs to a plurality of output devices. Also, the method includes outputting to a user device a plurality of decisions associated with the situational awareness outputs at a decision point, receiving a selection of one of the decisions from the user device, generating new situational awareness outputs based on the selected decision, and repeating the sending, outputting and receiving steps based on the new situational awareness outputs. Other methods, systems, and computer program products are included according to other embodiments of the invention.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Jinwoo; Lee, Jewon; Song, Hanjung
2011-03-15
This paper presents a fully integrated circuit implementation of an operational amplifier (op-amp) based chaotic neuron model with a bipolar output function, experimental measurements, and analyses of its chaotic behavior. The proposed chaotic neuron model integrated circuit consists of several op-amps, sample and hold circuits, a nonlinear function block for chaotic signal generation, a clock generator, a nonlinear output function, etc. Based on the HSPICE (circuit program) simulation results, approximated empirical equations for analyses were formulated. Then, the chaotic dynamical responses such as bifurcation diagrams, time series, and Lyapunov exponent were calculated using these empirical equations. In addition, we performed simulations of two chaotic neuron systems with four synapses to confirm neural network connections and obtained normal chaotic-neuron behavior, such as the internal-state bifurcation diagram as a function of synaptic weight variation. The proposed circuit was fabricated using a 0.8-µm single-poly complementary metal-oxide semiconductor technology. Measurements of the fabricated single chaotic neuron with ±2.5 V power supplies and a 10 kHz sampling clock frequency were carried out and compared with the simulated results.
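The kind of bifurcation analysis mentioned above can be illustrated with any chaotic neuron map; the sketch below sweeps the external-input parameter of an Aihara-type map with a bipolar (tanh) output function as a stand-in, since the paper's own empirical equations were fitted from HSPICE results and are not reproduced here.

```python
import numpy as np
import matplotlib.pyplot as plt

def f(y, eps=0.02):
    """Steep bipolar sigmoid used as the neuron output function."""
    return np.tanh(y / eps)

k, alpha = 0.7, 1.0                         # damping and refractory scaling (typical values)
a_sweep, x_out = [], []
for a in np.linspace(0.0, 1.0, 400):        # swept external input (bifurcation parameter)
    y = 0.1
    for n in range(300):
        y = k * y - alpha * f(y) + a        # internal-state map
        if n >= 200:                        # keep post-transient points only
            a_sweep.append(a)
            x_out.append(f(y))

plt.plot(a_sweep, x_out, ",k")
plt.xlabel("external input a")
plt.ylabel("neuron output x")
plt.show()
```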
NASA Astrophysics Data System (ADS)
Bouaynaya, N.; Schonfeld, Dan
2005-03-01
Many real-world applications in computer vision and multimedia, such as augmented reality and environmental imaging, require an accurate elastic contour around a tracked object. In the first part of the paper we introduce a novel tracking algorithm that combines a motion estimation technique with the Bayesian importance sampling framework. We use Adaptive Block Matching (ABM) as the motion estimation technique and construct the proposal density from the estimated motion vector. The resulting algorithm requires a small number of particles for efficient tracking. The tracking is adaptive to different categories of motion even with poor a priori knowledge of the system dynamics; in particular, off-line learning is not needed. A parametric representation of the object is used for tracking purposes. In the second part of the paper, we refine the tracking output from a parametric sample to an elastic contour around the object, using a 1D active contour model based on a dynamic programming scheme. To improve the convergence of the active contour, we perform the optimization over a set of randomly perturbed initial conditions. Our experiments are applied to head tracking, and we report promising tracking results in complex environments.
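A minimal sketch of the sampling step described above follows: the proposal for each particle is centered on the state shifted by a block-matching motion estimate, which is why few particles suffice. The motion estimate, observation, likelihood, and the simplified weight update are stand-ins, not the authors' full algorithm (which also refines the result with a dynamic-programming active contour).

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles = 100
particles = rng.normal([120.0, 80.0], 2.0, size=(n_particles, 2))   # (x, y) object centers
weights = np.full(n_particles, 1.0 / n_particles)

def likelihood(states, observation):
    """Hypothetical appearance likelihood: closer to the measured center is better."""
    d2 = np.sum((states - observation) ** 2, axis=1)
    return np.exp(-0.5 * d2 / 9.0)

motion_estimate = np.array([4.0, -1.0])   # from adaptive block matching (stand-in value)
observation = np.array([125.0, 79.0])     # measured object center (stand-in value)

# propose around the motion-compensated states; a rigorous importance weight would
# also include the ratio of the dynamics prior to this proposal density
particles = particles + motion_estimate + rng.normal(0.0, 1.5, particles.shape)
weights *= likelihood(particles, observation)
weights /= weights.sum()

# resample to avoid weight degeneracy
idx = rng.choice(n_particles, size=n_particles, p=weights)
particles = particles[idx]
weights = np.full(n_particles, 1.0 / n_particles)
print(particles.mean(axis=0))             # tracked state estimate
```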
Cheung, Imelda W Y; Li-Chan, Eunice C Y
2014-02-15
The objective of this study was to investigate the potential of an instrumental taste-sensing system to distinguish between shrimp processing by-products hydrolysates produced using different proteases and hydrolysis conditions, and the possible association of taste sensor outputs with human gustatory assessment, salt content, and bioactivity. Principal component analysis of taste sensor output data categorised samples according to the proteases used for hydrolysis. High umami sensor outputs were characteristic of bromelain- and Flavourzyme-produced hydrolysates, compared to low saltiness and high bitterness outputs of Alcalase-produced hydrolysates, and high saltiness and low umami outputs of Protamex-produced hydrolysates. Extensively hydrolysed samples showed higher sourness outputs. Saltiness sensor outputs were correlated with conductivity and sodium content, while umami sensor responses were related to gustatory sweetness, bitterness and umami, as well as angiotensin-I converting enzyme inhibitory activity. Further research should explore the dose dependence and sensitivity of each taste sensor to specific amino acids and peptides. Copyright © 2013 Elsevier Ltd. All rights reserved.
From samples to populations in retinex models
NASA Astrophysics Data System (ADS)
Gianini, Gabriele
2017-05-01
Some spatial color algorithms, such as Brownian Milano retinex (MI-retinex) and random spray retinex (RSR), are based on sampling. In Brownian MI-retinex, memoryless random walks (MRWs) explore the neighborhood of a pixel and are then used to compute its output. Considering the relative redundancy and inefficiency of MRW exploration, the algorithm RSR replaced the walks by samples of points (the sprays). Recent works point to the fact that a mapping from the sampling formulation to the probabilistic formulation of the corresponding sampling process can offer useful insights into the models, at the same time featuring intrinsically noise-free outputs. The paper continues the development of this concept and shows that the population-based versions of RSR and Brownian MI-retinex can be used to obtain analytical expressions for the outputs of some test images. The comparison of the two analytic expressions from RSR and from Brownian MI-retinex demonstrates not only that the two outputs are, in general, different but also that they depend in a qualitatively different way upon the features of the image.
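As a concrete reference point for the sampling-based formulation, the following is a minimal single-channel random spray retinex sketch: each pixel is divided by the maximum intensity found in a random spray of points around it. The spray geometry (uniform angles, a simple radial density) and the single spray per pixel are simplifications relative to published RSR variants.

```python
import numpy as np

def rsr(image, n_points=40, radius=60, seed=0):
    """Single-channel random spray retinex with one spray per pixel (simplified)."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    out = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            ang = rng.uniform(0.0, 2.0 * np.pi, n_points)
            rad = radius * rng.uniform(0.0, 1.0, n_points)   # radial density is a simplification
            yy = np.clip((i + rad * np.sin(ang)).astype(int), 0, h - 1)
            xx = np.clip((j + rad * np.cos(ang)).astype(int), 0, w - 1)
            local_max = max(image[yy, xx].max(), image[i, j])
            out[i, j] = image[i, j] / local_max if local_max > 0 else 0.0
    return out

img = np.random.default_rng(2).uniform(0.1, 1.0, size=(64, 64))   # stand-in image channel
print(rsr(img).mean())
```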
A visual LISP program for voxelizing AutoCAD solid models
NASA Astrophysics Data System (ADS)
Marschallinger, Robert; Jandrisevits, Carmen; Zobl, Fritz
2015-01-01
AutoCAD solid models are increasingly recognized in geological and geotechnical 3D modeling. In order to bridge the currently existing gap between AutoCAD solid models and the grid modeling realm, a Visual LISP program is presented that converts AutoCAD solid models into voxel arrays. Acad2Vox voxelizer works on a 3D-model that is made up of arbitrary non-overlapping 3D-solids. After definition of the target voxel array geometry, 3D-solids are scanned at grid positions and properties are streamed to an ASCII output file. Acad2Vox has a novel voxelization strategy that combines a hierarchical reduction of sampling dimensionality with an innovative use of AutoCAD-specific methods for a fast and memory-saving operation. Acad2Vox provides georeferenced, voxelized analogs of 3D design data that can act as regions-of-interest in later geostatistical modeling and simulation. The Supplement includes sample geological solid models with instructions for practical work with Acad2Vox.
NASA Technical Reports Server (NTRS)
1977-01-01
The 20x9 TDI array was developed to meet the LANDSAT Thematic Mapper requirements. This array is based upon a self-aligned, transparent gate, buried channel process. The process features: (1) buried channel, four phase, overlapping gate CCD's for high transfer efficiency without fat zero; (2) self-aligned transistors to minimize clock feedthrough and parasitic capacitance; and (3) a transparent tin oxide electrode for high quantum efficiency with front surface irradiation. The requirements placed on the array and the performance achieved are summarized. These data are the result of flat-field measurements only; no imaging or dynamic target measurements were made during this program. Measurements were performed with two different test stands. The bench test equipment fabricated for this program operated at the 8-microsecond line time and employed simple sampling of the gated MOSFET output video signal. The second stand employed Correlated Double Sampling (CDS) and operated at a 79.2-microsecond line time.
NASA Technical Reports Server (NTRS)
Lu, Yun-Chi; Chang, Hyo Duck; Krupp, Brian; Kumar, Ravindra; Swaroop, Anand
1992-01-01
Information on Earth Observing System (EOS) output data products and input data requirements that has been compiled by the Science Processing Support Office (SPSO) at GSFC is presented. Since Version 1.0 of the SPSO Report was released in August 1991, there have been significant changes in the EOS program. In anticipation of a likely budget cut for the EOS Project, NASA HQ restructured the EOS program. An initial program consisting of two large platforms was replaced by plans for multiple, smaller platforms, and some EOS instruments were either deselected or descoped. Updated payload information reflecting the restructured EOS program superseding the August 1991 version of the SPSO report is included. This report has been expanded to cover information on non-EOS data products, and consists of three volumes (Volumes 1, 2, and 3). Volume 1 provides information on instrument outputs and input requirements. Volume 2 is devoted to Interdisciplinary Science (IDS) outputs and input requirements, including the 'best' and 'alternative' match analysis. Volume 3 provides information about retrieval algorithms, non-EOS input requirements of instrument teams and IDS investigators, and availability of non-EOS data products at seven primary Distributed Active Archive Centers (DAAC's).
Rapid, quantitative determination of bacteria in water. [adenosine triphosphate
NASA Technical Reports Server (NTRS)
Chappelle, E. W.; Picciolo, G. L.; Thomas, R. R.; Jeffers, E. L.; Deming, J. W. (Inventor)
1978-01-01
A bioluminescent assay for ATP in water borne bacteria is made by adding nitric acid to a water sample with concentrated bacteria to rupture the bacterial cells. The sample is diluted with sterile, deionized water, then mixed with a luciferase-luciferin mixture and the resulting light output of the bioluminescent reaction is measured and correlated with bacteria present. A standard and a blank also are presented so that the light output can be correlated to bacteria in the sample and system noise can be subtracted from the readings. A chemiluminescent assay for iron porphyrins in water borne bacteria is made by adding luminol reagent to a water sample with concentrated bacteria and measuring the resulting light output of the chemiluminescent reaction.
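As a hedged illustration of how such readings are typically reduced, the sketch below subtracts the blank (system noise) and scales by the known ATP standard; the single-point linear calibration is an assumption for illustration, not a detail taken from the patent.

```python
def atp_from_light(sample_rlu, blank_rlu, standard_rlu, standard_atp_ng):
    """Relate measured light output (relative light units) to ATP content."""
    net_sample = sample_rlu - blank_rlu        # remove system noise from the sample reading
    net_standard = standard_rlu - blank_rlu    # net response of the known ATP standard
    return standard_atp_ng * net_sample / net_standard

# Example: atp_from_light(5200, 200, 10200, 1.0) returns 0.5 ng ATP equivalent.
```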
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbose, Galen; Wiser, Ryan; Bolinger, Mark
Some stakeholders continue to voice concerns about the performance of customer-sited photovoltaic (PV) systems, particularly because these systems typically receive financial support through ratepayer- or publicly-funded programs. Although much remains to be understood about the extent and specific causes of poor PV system performance, several studies of the larger programs and markets have shed some light on the issue. An evaluation of the California Energy Commission (CEC)'s Emerging Renewables Program, for example, found that 7% of systems, in a sample of 95, had lower-than-expected power output due to shading or soiling (KEMA 2005). About 3% of a larger sample of 140 systems were not operating at all or were operating well below expected output, due to failed equipment, faulty installation workmanship, and/or a lack of basic maintenance. In a recent evaluation of the other statewide PV incentive program in California, the Self-Generation Incentive Program, 9 of 52 projects sampled were found to have annual capacity factors less than 14.5%, although reasons for these low capacity factors generally were not identified (Itron 2005). Studies of PV systems in Germany and Japan, the two largest PV markets worldwide, have also revealed some performance problems associated with issues such as shading, equipment and installation defects, inverter failure, and deviations from module manufacturers' specifications (Otani et al. 2004, Jahn & Nasse 2004). Although owners of PV systems have an inherent incentive to ensure that their systems perform well, many homeowners and building operators may lack the necessary information and expertise to carry out this task effectively. Given this barrier, and the responsibility of PV incentive programs to ensure that public funds are prudently spent, these programs should (and often do) play a critical role in promoting PV system performance. Performance-based incentives (PBIs), which are based on actual energy production rather than the rated capacity of the modules or system, are often suggested as one possible strategy. Somewhat less recognized are the many other program design options also available, each with its particular advantages and disadvantages. To provide a point of reference for assessing the current state of the art, and to inform program design efforts going forward, we examine the approaches to encouraging PV system performance - including, but not limited to, PBIs - used by 32 prominent PV incentive programs in the U.S. (see Table 1). We focus specifically on programs that offer an explicit subsidy payment for customer-sited PV installations. PV support programs that offer other forms of financial support or that function primarily as a mechanism for purchasing renewable energy credits (RECs) through energy production-based payments are outside the scope of our review. The information presented herein is derived primarily from publicly available sources, including program websites and guidebooks, program evaluations, and conference papers, as well as from a limited number of personal communications with program staff. The remainder of this report is organized as follows. The next section presents a simple conceptual framework for understanding the issues that affect PV system performance and provides an overview of the eight general strategies to encourage performance used among the programs reviewed in this report.
The subsequent eight sections discuss in greater detail each of these program design strategies and describe how they have been implemented among the programs surveyed. Based on this review, we then offer a series of recommendations for how PV incentive programs can effectively promote PV system performance.
User's Manual for Program PeakFQ, Annual Flood-Frequency Analysis Using Bulletin 17B Guidelines
Flynn, Kathleen M.; Kirby, William H.; Hummel, Paul R.
2006-01-01
Estimates of flood flows having given recurrence intervals or probabilities of exceedance are needed for design of hydraulic structures and floodplain management. Program PeakFQ provides estimates of instantaneous annual-maximum peak flows having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (annual-exceedance probabilities of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002, respectively). As implemented in program PeakFQ, the Pearson Type III frequency distribution is fit to the logarithms of instantaneous annual peak flows following Bulletin 17B guidelines of the Interagency Advisory Committee on Water Data. The parameters of the Pearson Type III frequency curve are estimated by the logarithmic sample moments (mean, standard deviation, and coefficient of skewness), with adjustments for low outliers, high outliers, historic peaks, and generalized skew. This documentation provides an overview of the computational procedures in program PeakFQ, provides a description of the program menus, and provides an example of the output from the program.
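The core fitting step can be sketched as follows. This is not PeakFQ: it fits a log-Pearson Type III distribution by simple log-space sample moments and omits the Bulletin 17B adjustments for low outliers, high outliers, historic peaks, and generalized skew; the peak-flow values are made up for illustration.

```python
import numpy as np
from scipy.stats import pearson3, skew

peaks = np.array([3200., 4100., 2800., 5100., 6900., 3700., 4500., 2500., 8000., 4300.])  # hypothetical annual peaks, cfs
logq = np.log10(peaks)
mean, std, g = logq.mean(), logq.std(ddof=1), skew(logq, bias=False)  # log-space sample moments

for T in (2, 10, 100):                          # recurrence interval in years
    p = 1.0 - 1.0 / T                           # annual non-exceedance probability
    q_T = 10.0 ** pearson3.ppf(p, g, loc=mean, scale=std)
    print(f"{T:>3}-year peak flow is roughly {q_T:,.0f} cfs")
```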
A Computer Program for the Calculation of Three-Dimensional Transonic Nacelle/Inlet Flowfields
NASA Technical Reports Server (NTRS)
Vadyak, J.; Atta, E. H.
1983-01-01
A highly efficient computer analysis was developed for predicting transonic nacelle/inlet flowfields. This algorithm can compute the three dimensional transonic flowfield about axisymmetric (or asymmetric) nacelle/inlet configurations at zero or nonzero incidence. The flowfield is determined by solving the full-potential equation in conservative form on a body-fitted curvilinear computational mesh. The difference equations are solved using the AF2 approximate factorization scheme. This report presents a discussion of the computational methods used to both generate the body-fitted curvilinear mesh and to obtain the inviscid flow solution. Computed results and correlations with existing methods and experiment are presented. Also presented are discussions on the organization of the grid generation (NGRIDA) computer program and the flow solution (NACELLE) computer program, descriptions of the respective subroutines, definitions of the required input parameters for both algorithms, a brief discussion on interpretation of the output, and sample cases to illustrate application of the analysis.
Performance regression manager for large scale systems
Faraj, Daniel A.
2017-10-17
System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
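A hedged sketch of the comparison step described in the claim is given below; the metric name, the "name value" file layout, and the regression threshold are illustrative assumptions, not the predefined format of the patented system.

```python
def read_metric(path, name):
    """Return the value of one performance metric from a 'name value' text file."""
    with open(path) as handle:
        for line in handle:
            key, _, value = line.partition(" ")
            if key == name:
                return float(value)
    raise KeyError(f"{name} not found in {path}")

def compare_runs(first_file, second_file, name="wall_time_seconds", tolerance=0.05):
    """Compare a metric across two execution instances and print the result."""
    first, second = read_metric(first_file, name), read_metric(second_file, name)
    change = (first - second) / second
    verdict = "REGRESSION" if change > tolerance else "OK"
    print(f"{name}: {second:.3f} -> {first:.3f} ({change:+.1%}) {verdict}")
```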
Programmable noise bandwidth reduction by means of digital averaging
NASA Technical Reports Server (NTRS)
Poklemba, John J. (Inventor)
1993-01-01
Predetection noise bandwidth reduction is effected by a pre-averager capable of digitally averaging the samples of an input data signal over two or more symbols, the averaging interval being defined by the input sampling rate divided by the output sampling rate. As the averaged sample is clocked to a suitable detector at a much slower rate than the input signal sampling rate the noise bandwidth at the input to the detector is reduced, the input to the detector having an improved signal to noise ratio as a result of the averaging process, and the rate at which such subsequent processing must operate is correspondingly reduced. The pre-averager forms a data filter having an output sampling rate of one sample per symbol of received data. More specifically, selected ones of a plurality of samples accumulated over two or more symbol intervals are output in response to clock signals at a rate of one sample per symbol interval. The pre-averager includes circuitry for weighting digitized signal samples using stored finite impulse response (FIR) filter coefficients. A method according to the present invention is also disclosed.
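A much-simplified software analogue of the idea is shown below: it reduces the input stream to one output sample per symbol by uniform block averaging, whereas the patented pre-averager accumulates over two or more symbol intervals and can apply stored FIR weighting to the digitized samples.

```python
import numpy as np

def pre_average(samples, samples_per_symbol):
    """Average groups of input samples so the detector sees one sample per symbol,
    trading excess bandwidth for an improved signal-to-noise ratio."""
    usable = (len(samples) // samples_per_symbol) * samples_per_symbol
    blocks = np.asarray(samples[:usable], dtype=float).reshape(-1, samples_per_symbol)
    return blocks.mean(axis=1)
```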
Schauss, Thorsten; Glaeser, Stefanie P.; Gütschow, Alexandra; Dott, Wolfgang; Kämpfer, Peter
2015-01-01
The presence of extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli was investigated in input (manure from livestock husbandry) and output samples of six German biogas plants in 2012 (one sampling per biogas plant) and two German biogas plants investigated in an annual cycle four times in 2013/2014. ESBL-producing Escherichia coli were cultured by direct plating on CHROMagar ESBL from input samples in the range of 10^0 to 10^4 colony forming units (CFU) per g dry weight but not from output samples. This initially indicated a complete elimination of ESBL-producing E. coli by the biogas plant process. Detected non-target bacteria were assigned to the genera Acinetobacter, Pseudomonas, Bordetella, Achromobacter, Castellaniella, and Ochrobactrum. A selective pre-enrichment procedure increased the detection efficiency of ESBL-producing E. coli in input samples and enabled the detection in five of eight analyzed output samples. In total 119 ESBL-producing E. coli were isolated from input and 46 from output samples. Most of the E. coli isolates carried CTX-M-type and/or TEM-type beta-lactamases (94%); a few carried SHV-type beta-lactamases (6%). Sixty-four bla CTX-M genes were characterized in more detail and assigned mainly to CTX-M-groups 1 (85%) and 9 (13%), and one to group 2. Phylogenetic grouping of 80 E. coli isolates showed that most were assigned to group A (71%) and B1 (27%), only one to group D (2%). Genomic fingerprinting and multilocus sequence typing (MLST) showed a high clonal diversity with 41 BOX-types and 19 ST-types. The two most common ST-types were ST410 and ST1210. Antimicrobial susceptibility testing of 46 selected ESBL-producing E. coli revealed that several isolates were additionally resistant to other veterinary relevant antibiotics and some grew on CHROMagar STEC, but Shiga-like toxin (SLT) genes were not detected. Resistance to carbapenems was not detected. In summary, the study showed for the first time the presence of ESBL-producing E. coli in output samples of German biogas plants. PMID:25799434
ERIC Educational Resources Information Center
Chiang, Ching-hsin
This thesis reports on the designer's plans and experiences in carrying out the design, development, implementation, and evaluation of a project, the purpose of which was to develop a training program that would enable foreign students at the New York Institute of Technology (NYIT) to use the Computer Output Microform Catalog (COMCAT) and to…
2010-01-01
Background Fellowships are a component of many professional education programs. They provide opportunities to develop skills and competencies in an environment where time is protected and resources and technical support are more readily available. The SEA-ORCHID fellowships program aimed to increase capacity for evidence-based practice and research synthesis, and to encourage fellows to become leaders in these areas. Methods Fellows included doctors, nurses, midwives and librarians working in the maternal and neonatal areas of nine hospitals in South East Asia. Fellowships were undertaken in Australia and involved specific outputs related to evidence-based practice or research synthesis. Training and support was tailored according to the type of output and the fellow's experience and expertise. We evaluated the fellowships program quantitatively and qualitatively through written evaluations, interviews and follow-up of fellowship activities. Results During 2006-07, 23 fellows from Thailand, Indonesia, Malaysia and the Philippines undertook short-term fellowships (median four weeks) in Australia. The main outputs were drafts of Cochrane systematic reviews, clinical practice guidelines and protocols for randomised trials, and training materials to support evidence-based practice. Protocols for Cochrane systematic reviews were more likely to be completed than other outcomes. The fellows identified several components that were critical to the program's overall success; these included protected time, tailored training, and access to technical expertise and resources. On returning home, fellows identified a lack of time and limited access to the internet and evidence-based resources as barriers to completing their outputs. The support of colleagues and senior staff was noted as an important enabler of progress, and research collaborators from other institutions and countries were also important sources of support. Conclusions The SEA-ORCHID fellowships program provided protected time to work on an output which would facilitate evidence-based practice. While the fellows faced substantial barriers to completing their fellowship outputs once they returned home, these fellowships resulted in a greater understanding, enthusiasm and skills for evidence-based practice. The experience of the SEA-ORCHID fellowships program may be useful for other initiatives aiming to build capacity in evidence-based practice. PMID:20492706
Flux focusing eddy current probe
NASA Technical Reports Server (NTRS)
Simpson, John W. (Inventor); Clendenin, C. Gerald (Inventor); Fulton, James P. (Inventor); Wincheski, Russell A. (Inventor); Todhunter, Ronald G. (Inventor); Namkung, Min (Inventor); Nath, Shridhar C. (Inventor)
1997-01-01
A flux-focusing electromagnetic sensor which uses a ferromagnetic flux-focusing lens simplifies inspections and increases detectability of fatigue cracks and material loss in high conductivity material. The unique feature of the device is the ferrous shield isolating a high-turn pick-up coil from an excitation coil. The use of the magnetic shield is shown to produce a null voltage output across the receiving coil in the presence of an unflawed sample. A redistribution of the current flow in the sample caused by the presence of flaws, however, eliminates the shielding condition and a large output voltage is produced, yielding a clear unambiguous flaw signal. The maximum sensor output is obtained when positioned symmetrically above the crack. Hence, by obtaining the position of the maximum sensor output, it is possible to track the fault and locate the area surrounding its tip. The accuracy of tip location is enhanced by two unique features of the sensor; a very high signal-to-noise ratio of the probe's output which results in an extremely smooth signal peak across the fault, and a rapidly decaying sensor output outside a small area surrounding the crack tip which enables the region for searching to be clearly defined. Under low frequency operation, material thinning due to corrosion damage causes an incomplete shielding of the pick-up coil. The low frequency output voltage of the probe is therefore a direct indicator of the thickness of the test sample.
NASA Technical Reports Server (NTRS)
Mendenhall, M. R.
1978-01-01
A user's manual is presented for a computer program in which a vortex-lattice lifting-surface method is used to model the wing and multiple flaps. The engine wake model consists of a series of closely spaced vortex rings with rectangular cross sections. The jet wake is positioned such that the lower boundary of the jet is tangent to the wing and flap upper surfaces. The two potential flow models are used to calculate the wing-flap loading distribution including the influence of the wakes from up to two engines on the semispan. The method is limited to the condition where the flow and geometry of the configurations are symmetric about the vertical plane containing the wing root chord. The results include total configuration forces and moments, individual lifting-surface load distributions, pressure distributions, flap hinge moments, and flow field calculation at arbitrary field points. The use of the program, preparation of input, the output, program listing, and sample cases are described.
HYDES: A generalized hybrid computer program for studying turbojet or turbofan engine dynamics
NASA Technical Reports Server (NTRS)
Szuch, J. R.
1974-01-01
This report describes HYDES, a hybrid computer program capable of simulating one-spool turbojet, two-spool turbojet, or two-spool turbofan engine dynamics. HYDES is also capable of simulating two- or three-stream turbofans with or without mixing of the exhaust streams. The program is intended to reduce the time required for implementing dynamic engine simulations. HYDES was developed for running on the Lewis Research Center's Electronic Associates (EAI) 690 Hybrid Computing System and satisfies the 16384-word core-size and hybrid-interface limits of that machine. The program could be modified for running on other computing systems. The use of HYDES to simulate a single-spool turbojet and a two-spool, two-stream turbofan engine is demonstrated. The form of the required input data is shown and samples of output listings (teletype) and transient plots (x-y plotter) are provided. HYDES is shown to be capable of performing both steady-state design and off-design analyses and transient analyses.
Development and testing of tip devices for horizontal axis wind turbines
NASA Technical Reports Server (NTRS)
Gyatt, G. W.; Lissaman, P. B. S.
1985-01-01
A theoretical and field experimental program has been carried out to investigate the use of tip devices on horizontal axis wind turbine rotors. The objective was to improve performance by the reduction of tip losses. While power output can always be increased by a simple radial tip extension, such a modification also results in an increased gale load both because of the extra projected area and longer moment arm. Tip devices have the potential to increase power output without such a structural penalty. A vortex lattice computer model was used to optimize three basic tip configuration types for a 25 kW stall limited commercial wind turbine. The types were a change in tip planform, and a single-element and double-element nonplanar tip extension (winglets). A complete data acquisition system was developed which recorded three wind speed components, ambient pressure, temperature, and turbine output. The system operated unattended and could perform real-time processing of the data, displaying the measured power curve as data accumulated in either a bin sort mode or polynomial curve fit. Approximately 270 hr of performance data were collected over a three-month period. The sampling interval was 2.4 sec; thus over 400,000 raw data points were logged. Results for each of the three new tip devices, compared with the original tip, showed a small decrease (of the order of 1 kW) in power output over the measured range of wind speeds from cut-in at about 4 m/s to over 20 m/s, well into the stall limiting region. Changes in orientation and angle-of-attack of the winglets were not made. For aircraft wing tip devices, favorable tip shapes have been reported and it is likely that the tip devices tested in this program did not improve rotor performance because they were not optimally adjusted.
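As an aside on the data reduction mentioned above, the following hedged sketch shows a method-of-bins power-curve estimate; the 1 m/s bin width and the array layout are assumptions, not details of the original data acquisition system.

```python
import numpy as np

def bin_sorted_power_curve(wind_speed_ms, power_kw, bin_width=1.0):
    """Average measured turbine power within wind-speed bins to form a power curve."""
    wind_speed_ms = np.asarray(wind_speed_ms, dtype=float)
    power_kw = np.asarray(power_kw, dtype=float)
    edges = np.arange(0.0, wind_speed_ms.max() + bin_width, bin_width)
    which_bin = np.digitize(wind_speed_ms, edges)
    curve = []
    for b in range(1, len(edges)):
        in_bin = which_bin == b
        if in_bin.any():
            centre = edges[b - 1] + 0.5 * bin_width
            curve.append((centre, power_kw[in_bin].mean(), int(in_bin.sum())))
    return curve    # (bin-centre wind speed, mean power, number of samples) per bin
```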
NASA Technical Reports Server (NTRS)
McBride, Bonnie J.; Gordon, Sanford
1996-01-01
This users manual is the second part of a two-part report describing the NASA Lewis CEA (Chemical Equilibrium with Applications) program. The program obtains chemical equilibrium compositions of complex mixtures with applications to several types of problems. The topics presented in this manual are: (1) details for preparing input data sets; (2) a description of output tables for various types of problems; (3) the overall modular organization of the program with information on how to make modifications; (4) a description of the function of each subroutine; (5) error messages and their significance; and (6) a number of examples that illustrate various types of problems handled by CEA and that cover many of the options available in both input and output. Seven appendixes give information on the thermodynamic and thermal transport data used in CEA; some information on common variables used in or generated by the equilibrium module; and output tables for 14 example problems. The CEA program was written in ANSI standard FORTRAN 77. CEA should work on any system with sufficient storage. There are about 6300 lines in the source code, which uses about 225 kilobytes of memory. The compiled program takes about 975 kilobytes.
NASA Technical Reports Server (NTRS)
Raju, I. S.
1986-01-01
The Q3DG is a computer program developed to perform a quasi-three-dimensional stress analysis for composite laminates which may contain delaminations. The laminates may be subjected to mechanical, thermal, and hygroscopic loads. The program uses the finite element method and models the laminates with eight-noded parabolic isoparametric elements. The program computes the strain-energy-release components and the total strain-energy release in all three modes for delamination growth. A rectangular mesh and data file generator, DATGEN, is included. The DATGEN program can be executed interactively and is user friendly. The documentation includes sections dealing with the Q3D analysis theory, derivation of element stiffness matrices and consistent load vectors for the parabolic element. Several sample problems with the input for Q3DG and output from the program are included. The capabilities of the DATGEN program are illustrated with examples of interactive sessions. A microfiche of all the examples is included. The Q3DG and DATGEN programs have been implemented on CYBER 170 class computers. Q3DG and DATGEN were developed at the Langley Research Center during the early eighties and documented in 1984 to 1985.
Program to convert SUDS2ASC files to a single binary SEGY file
Goldman, Mark
2000-01-01
This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60 into a single SEGY file. SUDS2ASC has been used previously to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading in ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program - SUDS2SEGY - combines these recording station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can be subsequently input to a plotting package.
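The time-normalization step can be pictured with the hedged sketch below, which shifts a station's trace to a common gather start time and pads or trims it to a fixed sample count; the argument names and zero padding are illustrative choices, not a description of SUDS2SEGY's internals.

```python
import numpy as np

def normalize_trace(samples, trace_start, sample_interval, gather_start, n_samples):
    """Return a fixed-length trace aligned to gather_start (zero-padded as needed)."""
    out = np.zeros(n_samples, dtype=float)
    src = np.asarray(samples, dtype=float)
    shift = int(round((trace_start - gather_start) / sample_interval))
    if shift >= 0:                                   # trace begins after the gather start
        count = max(0, min(len(src), n_samples - shift))
        out[shift:shift + count] = src[:count]
    else:                                            # trace begins early; drop leading samples
        count = max(0, min(len(src) + shift, n_samples))
        out[:count] = src[-shift:-shift + count]
    return out
```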
ARL Eye Safer Fiber Laser Testbed Lab View Automation and Control
2013-09-01
... the output voltage value in volts; gpc n programs the output current value in amperes; grst resets and brings the power supplies to a safe state; gout n turns the output on or off (gout 1 = turn on, gout 0 = turn off). Figure 4 shows the front panel of the power supplies and the back-panel RS 485 link.
Spectroscopic analysis and control
Tate; , James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles
2017-04-18
Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
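The patent names only "a multivariate regression algorithm"; as one common, hedged example of what the digital computer might run on the spectrometer's digital output, the sketch below fits a partial least squares model to placeholder spectra and predicts a concentration for a new scan. The data here are purely synthetic stand-ins.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
spectra = rng.random((40, 256))                  # placeholder digitized spectra (40 scans x 256 points)
concentrations = rng.random((40, 1))             # placeholder reference analyte concentrations

model = PLSRegression(n_components=5).fit(spectra, concentrations)
new_scan = rng.random((1, 256))
predicted = model.predict(new_scan)              # concentration estimate for the new spectrum
```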
NASA Technical Reports Server (NTRS)
Sproles, Darrell W.; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.
Bidirectional power converter control electronics
NASA Technical Reports Server (NTRS)
Mildice, J. W.
1987-01-01
The object of this program was to design, build, test, and deliver a set of control electronics suitable for control of bidirectional resonant power processing equipment of the direct output type. The program is described, including the technical background, and results discussed. Even though the initial program tested only the logic outputs, the hardware was subsequently tested with high-power breadboard equipment, and in the testbed of NASA contract NAS3-24399. The completed equipment is now operating as part of the Space Station Power System Test Facility at NASA Lewis Research Center.
Sarabia, Jose Manuel; Fernandez-Fernandez, Jaime; Juan-Recio, Casto; Hernández-Davó, Hector; Urbán, Tomás; Moya, Manuel
2015-01-01
This study examined the effects of a 6-week non-failure strength training program in youth tennis players. Twenty tennis players (age: 15.0 ± 1 years, body height: 170.9 ± 5.1 cm, body mass: 63.3 ± 9.1 kg) were divided into experimental and control groups. Pre and post-tests included half squats, bench press, squat jumps, countermovement-jumps and side-ball throws. Salivary cortisol samples were collected, and the Profile of Mood States questionnaire was used weekly during an anatomical adaptation period, a main training period and after a tapering week. The results showed that, after the main training period, the experimental group significantly improved (p<0.05) in mean and peak power output and in the total number of repetitions during the half-squat endurance test; mean force, power and velocity in the half-squat power output test; Profile of Mood States (in total mood disturbance between the last week of the mean training period and the tapering week); and in squat-jump and countermovement-jump height. Moreover, significant differences were found between the groups at the post-tests in the total number of repetitions, mean and peak power during the half-squat endurance test, mean velocity in the half-squat power output test, salivary cortisol concentration (baselines, first and third week of the mean training period) and in the Profile of Mood States (in fatigue subscale: first and third week of the mean training period). In conclusion, a non-failure strength training protocol improved lower-limb performance levels and produced a moderate psychophysiological impact in youth elite tennis players, suggesting that it is a suitable program to improve strength. Such training protocols do not increase the total training load of tennis players and may be recommended to improve strength. PMID:25964812
Program Aids In Printing FORTRAN-Coded Output
NASA Technical Reports Server (NTRS)
Akian, Richard A.
1993-01-01
FORPRINT computer program prints FORTRAN-coded output files on most non-Postscript printers with such extra features as control of fonts for Epson and Hewlett Packard printers. Rewrites data to printer and inserts correct printer-control codes. Alternative uses include ability to separate data or ASCII file during printing by use of editing software to insert "1" in first column of data line that starts new page. Written in FORTRAN 77.
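FORPRINT itself is FORTRAN 77 and its source is not reproduced here; the hedged Python sketch below only illustrates the first-column carriage-control convention the description alludes to, in which a "1" starts a new page. The mapping is simplified (overprint is approximated) and is not FORPRINT's actual logic.

```python
CARRIAGE_CONTROL = {"1": "\f", "0": "\n", " ": "", "+": "\r"}  # new page, extra blank line, single space, overprint

def fortran_listing_to_printer_text(lines):
    """Interpret column-one carriage-control characters of a FORTRAN-coded listing."""
    rendered = []
    for line in lines:
        control, text = (line[0], line[1:]) if line else (" ", "")
        rendered.append(CARRIAGE_CONTROL.get(control, "") + text + "\n")
    return "".join(rendered)
```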
Cullen, Patricia; Clapham, Kathleen; Byrne, Jake; Hunter, Kate; Senserrick, Teresa; Keay, Lisa; Ivers, Rebecca
2016-08-01
Evidence indicates that Aboriginal people are underrepresented among driver licence holders in New South Wales, which has been attributed to licensing barriers for Aboriginal people. The Driving Change program was developed to provide culturally responsive licensing services that engage Aboriginal communities and build local capacity. This paper outlines the formative evaluation of the program, including logic model construction and exploration of contextual factors. Purposive sampling was used to identify key informants (n=12) from a consultative committee of key stakeholders and program staff. Semi-structured interviews were transcribed and thematically analysed. Data from interviews informed development of the logic model. Participants demonstrated high level of support for the program and reported that it filled an important gap. The program context revealed systemic barriers to licensing that were correspondingly targeted by specific program outputs in the logic model. Addressing underlying assumptions of the program involved managing local capacity and support to strengthen implementation. This formative evaluation highlights the importance of exploring program context as a crucial first step in logic model construction. The consultation process assisted in clarifying program goals and ensuring that the program was responding to underlying systemic factors that contribute to inequitable licensing access for Aboriginal people. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The appendices A-K to the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. This includes installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
Lam, H K
2012-02-01
This paper investigates the stability of sampled-data output-feedback (SDOF) polynomial-fuzzy-model-based control systems. Representing the nonlinear plant using a polynomial fuzzy model, an SDOF fuzzy controller is proposed to perform the control process using the system output information. As only the system output is available for feedback compensation, it is more challenging for the controller design and system analysis compared to the full-state-feedback case. Furthermore, because of the sampling activity, the control signal is kept constant by the zero-order hold during the sampling period, which complicates the system dynamics and makes the stability analysis more difficult. In this paper, two cases of SDOF fuzzy controllers, which either share the same number of fuzzy rules or not, are considered. The system stability is investigated based on the Lyapunov stability theory using the sum-of-squares (SOS) approach. SOS-based stability conditions are obtained to guarantee the system stability and synthesize the SDOF fuzzy controller. Simulation examples are given to demonstrate the merits of the proposed SDOF fuzzy control approach.
NASA Technical Reports Server (NTRS)
Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III
1970-01-01
Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.
Talking Drums: Generating drum grooves with neural networks
NASA Astrophysics Data System (ADS)
Hutchings, P.
2017-05-01
Presented is a method of generating a full drum kit part for a provided kick-drum sequence. A sequence-to-sequence neural network model used in natural language translation was adopted to encode multiple musical styles, and an online survey was developed to test different techniques for sampling the output of the softmax function. The strongest results were found using a sampling technique that drew from the three most probable outputs at each subdivision of the drum pattern, but the consistency of output was found to be heavily dependent on style.
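A hedged sketch of the preferred sampling technique follows: at each subdivision only the three most probable softmax outputs are kept and renormalised before drawing. The function is an illustration of that idea, not the survey's implementation.

```python
import numpy as np

def sample_top_k(softmax_probs, k=3, rng=np.random.default_rng()):
    """Draw one class index from the k most probable outputs of a softmax vector."""
    probs = np.asarray(softmax_probs, dtype=float)
    shortlist = np.argsort(probs)[-k:]              # indices of the k largest probabilities
    weights = probs[shortlist] / probs[shortlist].sum()
    return int(rng.choice(shortlist, p=weights))
```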
Flux-focusing eddy current probe and method for flaw detection
NASA Technical Reports Server (NTRS)
Simpson, John W. (Inventor); Clendenin, C. Gerald (Inventor)
1993-01-01
A flux-focusing electromagnetic sensor is presented which uses a ferromagnetic flux-focusing lens to simplify inspections and increase detectability of fatigue cracks and material loss in high conductivity material. The unique feature of the device is the ferrous shield isolating a high-turn pick-up coil from an excitation coil. The use of the magnetic shield is shown to produce a null voltage output across the receiving coil in the presence of an unflawed sample. A redistribution of the current flow in the sample caused by the presence of flaws, however, eliminates the shielding condition and a large output voltage is produced, yielding a clear unambiguous flaw signal. The maximum sensor output is obtained when positioned symmetrically above the crack. Hence, by obtaining the position of the maximum sensor output, it is possible to track the fault and locate the area surrounding its tip. The accuracy of tip location is enhanced by two unique features of the sensor; a very high signal-to-noise ratio of the probe's output which results in an extremely smooth signal peak across the fault, and a rapidly decaying sensor output outside a small area surrounding the crack tip which enables the region for searching to be clearly defined. Under low frequency operation, material thinning due to corrosion damage causes an incomplete shielding of the pick-up coil. The low frequency output voltage of the probe is therefore a direct indicator of the thickness of the test sample.
Evaluation of a Postdischarge Call System Using the Logic Model.
Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary
2018-02-01
This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.
NASA Astrophysics Data System (ADS)
Meng, Su; Chen, Jie; Sun, Jian
2017-10-01
This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.
Keshavarz, M; Mojra, A
2015-05-01
Geometrical features of a cancerous tumor embedded in biological soft tissue, including tumor size and depth, are a necessity in the follow-up procedure and in making suitable therapeutic decisions. In this paper, a new socio-politically motivated global search strategy which is called imperialist competitive algorithm (ICA) is implemented to train a feed forward neural network (FFNN) to estimate the tumor's geometrical characteristics (FFNNICA). First, a viscoelastic model of liver tissue is constructed by using a series of in vitro uniaxial and relaxation test data. Then, 163 samples of the tissue including a tumor with different depths and diameters are generated by making use of PYTHON programming to link ABAQUS and MATLAB together. Next, the samples are divided into 123 samples as training dataset and 40 samples as testing dataset. Training inputs of the network are mechanical parameters extracted from palpation of the tissue through a developing noninvasive technology called artificial tactile sensing (ATS). Last, to evaluate the FFNNICA performance, outputs of the network including tumor's depth and diameter are compared with desired values for both training and testing datasets. Deviations of the outputs from desired values are calculated by a regression analysis. Statistical analysis is also performed by measuring Root Mean Square Error (RMSE) and Efficiency (E). RMSE values for diameter and depth estimation are 0.50 mm and 1.49, respectively, for the testing dataset. Results affirm that the proposed optimization algorithm for training the neural network can be useful to characterize soft tissue tumors accurately by employing an artificial palpation approach. Copyright © 2015 John Wiley & Sons, Ltd.
An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics
Eskinazi, Ilan
2016-01-01
Goal: Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods: SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results: An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion: SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance: Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761
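Not SCMT itself (which drives FEBio and trains networks in Matlab), but a hedged sketch of the surrogate idea: sample input-output pairs from an expensive contact model, fit a fast regressor, and evaluate the regressor inside the simulation loop. The synthetic "contact model" below is a stand-in.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
poses = rng.uniform(-1.0, 1.0, size=(500, 6))            # sampled joint pose/load inputs (placeholder)
contact_force = np.sin(poses).sum(axis=1)                  # stand-in for the elastic contact model output

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
surrogate.fit(poses, contact_force)
fast_estimate = surrogate.predict(poses[:5])               # cheap to call inside an iterative simulation
```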
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using a Design of Experiment method (DOE): Taguchi methodology and a response surface methodology (RSM). A design is first created using the MINITAB program, and then the laser marking process is performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for the given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
Dodds, M W; Dodds, A P
1997-04-01
The objective of this study was to determine whether improvements in the level of diabetic control in a group of subjects with poorly controlled non-insulin-dependent diabetes mellitus influence salivary output and composition. Repeated whole unstimulated and stimulated parotid saliva samples were collected from diabetic patients attending an outpatient diabetes education program and a matched nondiabetic control group. Saliva was analyzed for flow rates, parotid protein concentration and composition, and amylase activity. Subjective responses to questions about salivary hypofunction were tested. There were no significant differences in whole unstimulated and stimulated parotid flow rates or stimulated parotid protein concentration and composition between diabetics and the control group. Amylase activity was higher in diabetics and decreased with improved glycemic control. Subjects reporting taste alterations had higher mean blood glucose levels than subjects with normal taste sensation. Poorly controlled non-insulin-dependent diabetes mellitus has no influence on saliva output, although amylase activity may be elevated, and there may be taste alterations.
ALOHA: Automatic libraries of helicity amplitudes for Feynman diagram computations
NASA Astrophysics Data System (ADS)
de Aquino, Priscila; Link, William; Maltoni, Fabio; Mattelaer, Olivier; Stelzer, Tim
2012-10-01
We present an application that automatically writes the HELAS (HELicity Amplitude Subroutines) library corresponding to the Feynman rules of any quantum field theory Lagrangian. The code is written in Python and takes the Universal FeynRules Output (UFO) as an input. From this input it produces the complete set of routines, wave-functions and amplitudes, that are needed for the computation of Feynman diagrams at leading as well as at higher orders. The representation is language independent and currently it can output routines in Fortran, C++, and Python. A few sample applications implemented in the MADGRAPH 5 framework are presented. Program summary Program title: ALOHA Catalogue identifier: AEMS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://www.opensource.org/licenses/UoI-NCSA.php No. of lines in distributed program, including test data, etc.: 6094320 No. of bytes in distributed program, including test data, etc.: 7479819 Distribution format: tar.gz Programming language: Python2.6 Computer: 32/64 bit Operating system: Linux/Mac/Windows RAM: 512 Mbytes Classification: 4.4, 11.6 Nature of problem: An efficient numerical evaluation of a squared matrix element can be done with the help of the helicity routines implemented in the HELAS library [1]. This static library contains a limited number of helicity functions and is therefore not always able to provide the needed routine in the presence of an arbitrary interaction. This program provides a way to automatically create the corresponding routines for any given model. Solution method: ALOHA takes the Feynman rules associated to the vertex obtained from the model information (in the UFO format [2]), and multiplies it by the different wavefunctions or propagators. As a result the analytical expression of the helicity routines is obtained. Subsequently, this expression is automatically written in the requested language (Python, Fortran or C++). Restrictions: The allowed fields are currently spin 0, 1/2, 1 and 2, and the propagators of these particles are canonical. Running time: A few seconds for the SM and the MSSM, and up to a few minutes for models with spin 2 particles. References: [1] Murayama, H. and Watanabe, I. and Hagiwara, K., HELAS: HELicity Amplitude Subroutines for Feynman diagram evaluations, KEK-91-11, (1992) http://www-lib.kek.jp/cgi-bin/img_index?199124011 [2] C. Degrande, C. Duhr, B. Fuks, D. Grellscheid, O. Mattelaer, et al., UFO— The Universal FeynRules Output, Comput. Phys. Commun. 183 (2012) 1201-1214. arXiv:1108.2040, doi:10.1016/j.cpc.2012.01.022.
NASA's lithium cell technology program
NASA Technical Reports Server (NTRS)
Juvinall, G. L.
1978-01-01
Briefly outlined are the activities of the various research centers involved in the NASA program. Graphs are presented for: (1) the initial results on SOCl2 decomposition rate; (2) effect of rate on output of Li-SOCl2 cells; (3) comparison of high and low rate Li-SOCl2 cells; and (4) effect of temperature on output of Li-SOCl2 cells. Abusive test results and a description of secondary lithium cells are also presented.
NASA Technical Reports Server (NTRS)
Richard, M.; Harrison, B. A.
1979-01-01
The program input presented consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.
1992-01-09
Crystal Polymers (Tracy Reed, Geophysics Laboratory (GEO)); Analysis of Model Output Statistics Thunderstorm Prediction Model (Frank A. Lasley, Hanscom AFB). ... four hours to twenty-four hours. It was predicted that the dogbones would turn brown once they reached the approximate annealing temperature. This was ... Abstract: Model Output Statistics (MOS) thunderstorm prediction information and Service A weather observations
User's Manual for Aerofcn: a FORTRAN Program to Compute Aerodynamic Parameters
NASA Technical Reports Server (NTRS)
Conley, Joseph L.
1992-01-01
The computer program AeroFcn is discussed. AeroFcn is a utility program that computes the following aerodynamic parameters: geopotential altitude, Mach number, true velocity, dynamic pressure, calibrated airspeed, equivalent airspeed, impact pressure, total pressure, total temperature, Reynolds number, speed of sound, static density, static pressure, static temperature, coefficient of dynamic viscosity, kinematic viscosity, geometric altitude, and specific energy for a standard- or a modified standard-day atmosphere using compressible flow and normal shock relations. Any two parameters that define a unique flight condition are selected, and their values are entered interactively. The remaining parameters are computed, and the solutions are stored in an output file. Multiple cases can be run, and the multiple case solutions can be stored in another output file for plotting. Parameter units, the output format, and primary constants in the atmospheric and aerodynamic equations can also be changed.
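AeroFcn's source is not shown here; as a hedged illustration of the kind of standard-day relations such a utility evaluates, the sketch below derives Mach number and dynamic pressure from geopotential altitude and true velocity using the troposphere (below 11 km) lapse-rate formulas.

```python
import math

def mach_and_dynamic_pressure(altitude_m, true_velocity_ms):
    """Standard-day troposphere: temperature, pressure, density, then Mach number and q."""
    T0, p0, lapse, R, g, gamma = 288.15, 101325.0, 0.0065, 287.053, 9.80665, 1.4
    T = T0 - lapse * altitude_m                      # static temperature, K
    p = p0 * (T / T0) ** (g / (R * lapse))           # static pressure, Pa
    rho = p / (R * T)                                # static density, kg/m^3
    a = math.sqrt(gamma * R * T)                     # speed of sound, m/s
    mach = true_velocity_ms / a
    q = 0.5 * rho * true_velocity_ms ** 2            # dynamic pressure, Pa
    return mach, q

# Example: mach_and_dynamic_pressure(10000.0, 250.0) gives roughly (0.83, 1.29e4 Pa).
```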
Robot Task Commander with Extensible Programming Environment
NASA Technical Reports Server (NTRS)
Hart, Stephen W (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Yamokoski, John D. (Inventor); Gooding, Dustin R (Inventor)
2014-01-01
A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.
The N-BOD2 user's and programmer's manual
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1978-01-01
A general purpose digital computer program was developed and designed to aid in the analysis of spacecraft attitude dynamics. The program provides the analyst with the capability of automatically deriving and numerically solving the equations of motion of any system that can be modeled as a topological tree of coupled rigid bodies, flexible bodies, point masses, and symmetrical momentum wheels. Two modes of output are available. The composite system equations of motion may be outputted on a line printer in a symbolic form that may be easily translated into common vector-dyadic notation, or the composite system equations of motion may be solved numerically and any desirable set of system state variables outputted as a function of time.
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
Chaos: Understanding and Controlling Laser Instability
NASA Technical Reports Server (NTRS)
Blass, William E.
1997-01-01
In order to characterize the behavior of tunable diode lasers (TDL), the first step in the project involved the redesign of the TDL system here at the University of Tennessee Molecular Systems Laboratory (UTMSL). Having made these changes it was next necessary to optimize the new optical system. This involved the fine adjustments to the optical components, particularly in the monochromator, to minimize the aberrations of coma and astigmatism and to assure that the energy from the beam is focused properly on the detector element. The next step involved the taking of preliminary data. We were then ready for the analysis of the preliminary data. This required the development of computer programs that use mathematical techniques to look for signatures of chaos. Commercial programs were also employed. We discovered some indication of high dimensional chaos, but were hampered by the low sample rate of 200 KSPS (kilosamples/sec) and even more by our sample size of 1024 (1K) data points. These limitations were expected and we added a high speed data acquisition board. We incorporated into the system a computer with a 40 MSPS (million samples/sec) data acquisition board. This board can also capture 64K of data points so that we were then able to perform the more accurate tests for chaos. The results were dramatic and compelling: we had demonstrated that the lead salt diode laser had a chaotic frequency output. Having identified the chaotic character in our TDL data, we proceeded to stage two as outlined in our original proposal. This required the use of an Occasional Proportional Feedback (OPF) controller to facilitate the control and stabilization of the TDL system output. The controller was designed and fabricated at GSFC and debugged in our laboratories. After some trial and error efforts, we achieved chaos control of the frequency emissions of the laser. The two publications appended to this introduction detail the entire project and its results.
The TMDL Program Results Analysis Project: Matching Results Measures with Program Expectations
The paper provides a detailed description of the aims, methods and outputs of the program evaluation project undertaken by EPA in order to generate the insights needed to make TMDL program improvements.
NASA Technical Reports Server (NTRS)
Keith, J. S.; Ferguson, D. R.; Heck, P. H.
1972-01-01
The computer program, Streamtube Curvature Analysis, is described for the engineering user and for the programmer. The user-oriented documentation includes a description of the mathematical governing equations, their use in the solution, and the method of solution. The general logical flow of the program is outlined and detailed instructions for program usage and operation are explained. General procedures for program use and the program capabilities and limitations are described. From the standpoint of the programmer, the overlay structure of the program is described. The various storage tables are defined and their uses explained. The input and output are discussed in detail. The program listing includes numerous comments so that the logical flow within the program is easily followed. A test case showing input data and output format is included, as well as an error printout description.
Local classifier weighting by quadratic programming.
Cevikalp, Hakan; Polikar, Robi
2008-10-01
It has been widely accepted that the classification accuracy can be improved by combining outputs of multiple classifiers. However, how to combine multiple classifiers with various (potentially conflicting) decisions is still an open problem. A rich collection of classifier combination procedures -- many of which are heuristic in nature -- has been developed for this goal. In this brief, we describe a dynamic approach to combine classifiers that have expertise in different regions of the input space. To this end, we use local classifier accuracy estimates to weight classifier outputs. Specifically, we estimate local recognition accuracies of classifiers near a query sample by utilizing its nearest neighbors, and then use these estimates to find the best weights of classifiers to label the query. The problem is formulated as a convex quadratic optimization problem, which returns optimal nonnegative classifier weights with respect to the chosen objective function, and the weights ensure that locally most accurate classifiers are weighted more heavily for labeling the query sample. Experimental results on several data sets indicate that the proposed weighting scheme outperforms other popular classifier combination schemes, particularly on problems with complex decision boundaries. Hence, the results indicate that local classification-accuracy-based combination techniques are well suited for decision making when the classifiers are trained by focusing on different regions of the input space.
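A minimal Python sketch of the general idea (not the authors' exact formulation): local accuracies are estimated from a query point's nearest validation neighbors, and a simplex-constrained quadratic program returns nonnegative weights. The toy data, the three classifiers, and the particular quadratic objective are assumptions for illustration only.

```python
"""Sketch of local-accuracy-weighted classifier combination via a small QP."""
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy 2-class data: label is 1 when x0 + x1 > 0.
X_val = rng.normal(size=(200, 2))          # validation set for local accuracy
y_val = (X_val[:, 0] + X_val[:, 1] > 0).astype(int)

# Three fixed "classifiers", each accurate in a different region.
classifiers = [
    lambda X: (X[:, 0] > 0).astype(int),               # uses feature 0 only
    lambda X: (X[:, 1] > 0).astype(int),               # uses feature 1 only
    lambda X: (X[:, 0] + X[:, 1] > 0.5).astype(int),   # biased threshold
]

def local_weights(x_query, k=15):
    """Estimate each classifier's accuracy on the k nearest validation
    samples, then solve a simplex-constrained QP for nonnegative weights."""
    idx = np.argsort(np.linalg.norm(X_val - x_query, axis=1))[:k]
    acc = np.array([np.mean(c(X_val[idx]) == y_val[idx]) for c in classifiers])

    # Illustrative QP: minimize 0.5*||w||^2 - acc.w  s.t.  w >= 0, sum(w) = 1
    m = len(classifiers)
    res = minimize(
        lambda w: 0.5 * w @ w - acc @ w,
        x0=np.full(m, 1.0 / m),
        jac=lambda w: w - acc,
        bounds=[(0.0, None)] * m,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x

def predict(x_query):
    """Weighted vote of the individual classifier decisions."""
    w = local_weights(x_query)
    votes = np.array([c(x_query[None, :])[0] for c in classifiers])
    return int(np.round(w @ votes))

x = np.array([0.8, -0.2])
print("weights:", np.round(local_weights(x), 3), "prediction:", predict(x))
```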
Cady, R.E.; Peckenpaugh, J.M.
1985-01-01
RAQSIM, a generalized flow model of a groundwater system using finite-element methods, is documented to explain how it works and to demonstrate that it gives valid results. Three support programs that are used to compute recharge and discharge data required as input to RAQSIM are described. RAQSIM was developed to solve transient, two-dimensional, regional groundwater flow problems with isotropic or anisotropic conductance. The model can also simulate radially symmetric flow to a well and steady-state flow. The mathematical basis, program structure, data input and output procedures, organization of data sets, and program features and options of RAQSIM are discussed. An example, containing listings of data and results and illustrating RAQSIM's capabilities, is discussed in detail. Two test problems also are discussed, comparing RAQSIM's results with analytical procedures. The first support program described, the PET Program, uses solar radiation and other climatic data in the Jensen-Haise method to compute potential evapotranspiration. The second support program, the Soil-Water Program, uses output from the PET Program, soil characteristics, and the ratio of potential to actual evapotranspiration for each crop to compute infiltration, storage, and removal of water from the soil zone. The third program, the Recharge-Discharge Program, uses output from the Soil-Water Program together with other data to compute recharge and discharge from the groundwater flow system. For each support program, a program listing and examples of the data and results for the Twin Platte-Middle Republican study are provided. In addition, a brief discussion on how each program operates and on procedures for running and modifying these programs is presented. (Author's abstract)
Remote temperature-set-point controller
Burke, W.F.; Winiecki, A.L.
1984-10-17
An instrument is described for carrying out mechanical strain tests on metallic samples with the addition of means for varying the temperature with strain. The instrument includes opposing arms and associated equipment for holding a sample and varying the mechanical strain on the sample through a plurality of cycles of increasing and decreasing strain within predetermined limits, circuitry for producing an output signal representative of the strain during the tests, apparatus including a set point and a coil about the sample for providing a controlled temperature in the sample, and circuitry interconnected between the strain output signal and set point for varying the temperature of the sample linearly with strain during the tests.
Remote temperature-set-point controller
Burke, William F.; Winiecki, Alan L.
1986-01-01
An instrument for carrying out mechanical strain tests on metallic samples with the addition of an electrical system for varying the temperature with strain, the instrument including opposing arms and associated equipment for holding a sample and varying the mechanical strain on the sample through a plurality of cycles of increasing and decreasing strain within predetermined limits, circuitry for producing an output signal representative of the strain during the tests, apparatus including a set point and a coil about the sample for providing a controlled temperature in the sample, and circuitry interconnected between the strain output signal and set point for varying the temperature of the sample linearly with strain during the tests.
The NASA Lewis Research Center's Expendable Launch Vehicle Program: An Economic Impact Study
NASA Technical Reports Server (NTRS)
Austrian, Ziona
1996-01-01
This study investigates the economic impact of the Lewis Research Center's (LeRC) Expendable Launch Vehicle Program (ELVP) on Northeast Ohio's economy. It was conducted by The Urban Center's Economic Development Program in Cleveland State University's Levin College of Urban Affairs. The study measures ELVP's direct impact on the local economy in terms of jobs, output, payroll, and taxes, as well as the indirect impact of these economic activities when they "ripple" throughout the economy. The study uses regional economic multipliers based on input-output models to estimate the effect of ELVP spending on the Northeast Ohio economy.
New multirate sampled-data control law structure and synthesis algorithm
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.; Yang, Gen-Sheng
1992-01-01
A new multirate sampled-data control law structure is defined and a new parameter-optimization-based synthesis algorithm for that structure is introduced. The synthesis algorithm can be applied to multirate, multiple-input/multiple-output, sampled-data control laws having a prescribed dynamic order and structure, and a priori specified sampling/update rates for all sensors, processor states, and control inputs. The synthesis algorithm is applied to design two-input, two-output tip position controllers of various dynamic orders for a sixth-order, two-link robot arm model.
Development a computer codes to couple PWR-GALE output and PC-CREAM input
NASA Astrophysics Data System (ADS)
Kuntjoro, S.; Budi Setiawan, M.; Nursinta Adi, W.; Deswandri; Sunaryo, G. R.
2018-02-01
Radionuclide dispersion analysis is an important part of reactor safety analysis. From the analysis, the doses received by radiation workers and by communities around the nuclear reactor can be obtained. The radionuclide dispersion analysis under normal operating conditions is carried out using the PC-CREAM code, which requires input data such as the source term and the population distribution. The source term is derived from the output of another program, PWR-GALE, and the population distribution data must be written in a specific format. Compiling PC-CREAM inputs manually requires high accuracy, since it involves large amounts of data in fixed formats, and manual compilation often introduces errors. To minimize such errors, a coupling program linking PWR-GALE output to PC-CREAM input was developed, together with a program that writes the population distribution in the PC-CREAM input format. The programs are written in Python, which is cross-platform, object-oriented, and interactive. The result of this work is software that couples the source-term data and writes the population distribution data, so that PC-CREAM inputs can be prepared easily, in the required format, and without formatting errors.
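Neither file layout is given above, so the following Python sketch uses hypothetical formats (a two-column nuclide/release-rate listing, a fixed-width source-term record, and a CSV population grid) purely to illustrate the parse-one-program's-output, write-the-other-program's-input coupling step; all file and field names are made up.

```python
"""Minimal coupling sketch: read nuclide release rates from a (hypothetical)
PWR-GALE-style output file and rewrite them as a (hypothetical) fixed-width
PC-CREAM-style input block, plus a CSV population grid."""
import csv

def read_source_term(gale_output_path):
    """Return {nuclide: release_rate} from lines like 'I-131  1.2E-03'."""
    source = {}
    with open(gale_output_path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) == 2:
                try:
                    source[parts[0]] = float(parts[1])
                except ValueError:
                    continue  # skip headers and non-numeric lines
    return source

def write_cream_input(source, cream_input_path):
    """Write one fixed-width record per nuclide (hypothetical 10+12 column layout)."""
    with open(cream_input_path, "w") as fh:
        for nuclide, rate in sorted(source.items()):
            fh.write(f"{nuclide:<10s}{rate:12.4E}\n")

def write_population(grid, pop_csv_path):
    """Write a (sector, distance_km, population) grid as CSV."""
    with open(pop_csv_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["sector", "distance_km", "population"])
        writer.writerows(grid)

if __name__ == "__main__":
    st = read_source_term("pwr_gale_output.txt")        # hypothetical file name
    write_cream_input(st, "pc_cream_source_term.dat")   # hypothetical file name
    write_population([("N", 1.0, 1200), ("NE", 1.0, 950)], "population.csv")
```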
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.
1973-01-01
The results are reported of research into the effects on system operation of signal quantization in a digital control system. The investigation considered digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. An error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. As output the program gives the programming form required for minimum system quantization errors (either maximum or rms errors), and the maximum and rms errors that appear in the system output for a given bit configuration. The program can be integrated into existing digital simulations of a system.
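A small Python sketch of the kind of study described, under assumed details (a simple first-order filter and mantissa-rounding quantization rather than the report's actual error model): the same filter is run in full precision and with each arithmetic result rounded to a b-bit mantissa, and the maximum and rms output errors are compared.

```python
"""Quantization-error study sketch for a first-order IIR digital filter."""
import numpy as np

def quantize(x, bits):
    """Round x to a floating-point value with a `bits`-bit mantissa."""
    m, e = np.frexp(x)
    return np.ldexp(np.round(m * 2.0**bits) / 2.0**bits, e)

def filter_response(u, a=0.95, b=0.05, bits=None):
    """y[k] = a*y[k-1] + b*u[k]; optionally quantize after each operation."""
    q = (lambda v: quantize(v, bits)) if bits is not None else (lambda v: v)
    y, yk = np.empty_like(u), 0.0
    for k, uk in enumerate(u):
        yk = q(q(a * yk) + q(b * uk))
        y[k] = yk
    return y

rng = np.random.default_rng(1)
u = rng.standard_normal(2000)
y_ref = filter_response(u)                 # double-precision reference
for bits in (8, 12, 16, 24):
    err = filter_response(u, bits=bits) - y_ref
    print(f"{bits:2d}-bit mantissa: max error {np.abs(err).max():.3e}, "
          f"rms error {np.sqrt(np.mean(err**2)):.3e}")
```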
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaustad, K.L.; De Steese, J.G.
A computer program was developed to analyze the viability of integrating superconducting magnetic energy storage (SMES) with proposed wind farm scenarios at a site near Browning, Montana. The program simulated an hour-by-hour account of the charge/discharge history of a SMES unit for a representative wind-speed year. Effects of power output, storage capacity, and power conditioning capability on SMES performance characteristics were analyzed on a seasonal, diurnal, and hourly basis. The SMES unit was assumed to be charged during periods when power output of the wind resource exceeded its average value. Energy was discharged from the SMES unit into the grid during periods of low wind speed to compensate for below-average output of the wind resource. The option of using SMES to provide power continuity for a wind farm supplemented by combustion turbines was also investigated. Levelizing the annual output of large wind energy systems operating in the Blackfeet area of Montana was found to require a storage capacity too large to be economically viable. However, it appears that an intermediate-sized SMES could economically levelize the wind energy output on a seasonal basis.
Modularized compact positron emission tomography detector for rapid system development
Xi, Daoming; Liu, Xiang; Zeng, Chen; Liu, Wei; Li, Yanzhao; Hua, Yuexuan; Mei, Xiongze; Kim, Heejong; Xiao, Peng; Kao, Chien-Min; Xie, Qingguo
2016-01-01
We report the development of a modularized compact positron emission tomography (PET) detector that outputs serial streams of digital samples of PET event pulses via an Ethernet interface using the UDP/IP protocol to enable rapid configuration of a PET system by connecting multiple such detectors via a network switch to a computer. Presently, the detector is 76 mm×50 mm×55 mm in extent (excluding I/O connectors) and contains an 18×12 array of 4.2×4.2×20 mm3 one-to-one coupled lutetium-yttrium oxyorthosilicate/silicon photomultiplier pixels. It employs cross-wire and stripline readouts to merge the outputs of the 216 detector pixels to 24 channels. Signals at these channels are sampled using a built-in 24-ch, 4-level field programmable gate arrays-only multivoltage threshold digitizer. In the computer, software programs are implemented to analyze the digital samples to extract event information and to perform energy qualification and coincidence filtering. We have developed two such detectors. We show that all their pixels can be accurately discriminated and measure a crystal-level energy resolution of 14.4% to 19.4% and a detector-level coincidence time resolution of 1.67 ns FWHM. Preliminary imaging results suggest that a PET system based on the detectors can achieve an image resolution of ∼1.6 mm. PMID:28018941
Ozgen, Hacer; Ozcan, Yasar A
2002-06-01
To examine market competition and facility characteristics that can be related to technical efficiency in the production of multiple dialysis outputs from the perspective of the industrial organization model. Freestanding dialysis facilities that operated in 1997 submitted cost report forms to the Health Care Financing Administration (HCFA), and offered all three outputs--outpatient dialysis, dialysis training, and home program dialysis. The Independent Renal Facility Cost Report Data file (IRFCRD) from HCFA was utilized to obtain information on output and input variables and market and facility features for 791 multiple-output facilities. Information regarding population characteristics was obtained from the Area Resources File. Cross-sectional data for the year 1997 were utilized to obtain facility-specific technical efficiency scores estimated through Data Envelopment Analysis (DEA). A binary variable of efficiency status was then regressed against its market and facility characteristics and control factors in a multivariate logistic regression analysis. The majority of the facilities in the sample are functioning technically inefficiently. Neither the intensity of market competition nor a policy of dialyzer reuse has a significant effect on the facilities' efficiency. Technical efficiency is significantly associated, however, with type of ownership, with the interaction between the market concentration of for-profits and ownership type, and with affiliations with chains of different sizes. Nonprofit and government-owned facilities are more likely than their for-profit counterparts to become inefficient producers of renal dialysis outputs. On the other hand, the relationship between ownership form and efficiency is reversed as the market concentration of for-profits in a given market increases. Facilities that are members of large chains are more likely to be technically inefficient. Facilities do not appear to benefit from joint production of a variety of dialysis outputs, which may explain the ongoing tendency toward single-output production. Ownership form does make a positive difference in production efficiency, but only in local markets where competition exists between nonprofit and for-profit facilities. The increasing inefficiency associated with membership in large chains suggests that the growing consolidation in the dialysis industry may not, in fact, be the strategy for attaining more technical efficiency in the production of multiple dialysis outputs.
NASA Technical Reports Server (NTRS)
Allen, Robert J.
1988-01-01
An assembly language program using the Intel 80386 CPU and 80387 math co-processor chips was written to increase the speed of data gathering and processing, and to provide control of a scanning CW ring dye laser system. This laser system is used in high resolution (better than 0.001 cm-1) water vapor spectroscopy experiments. Laser beam power is sensed at the input and output of White cells and at the output of a Fabry-Perot. The assembly language subroutine is called from Basic, acquires the data, and performs various calculations at rates more than 150 times faster than could be achieved by the higher level language. The widths of output control pulses generated in assembly language are 3 to 4 microsecs, as compared to 2 to 3.7 millisecs for those generated in Basic (about 500 to 1000 times faster). Included are a block diagram and brief description of the spectroscopy experiment, a flow diagram of the Basic and assembly language programs, listings of the programs, scope photographs of the computer generated 5-volt pulses used for control and timing analysis, and representative water spectrum curves obtained using these programs.
Adaptation of time line analysis program to single pilot instrument flight research
NASA Technical Reports Server (NTRS)
Hinton, D. A.; Shaughnessy, J. D.
1978-01-01
A data base was developed for SPIFR operation and the program was run. The outputs indicated that further work was necessary on the workload models. In particular, the workload model for the cognitive channel should be modified as the output workload appears to be too small. Included in the needed refinements are models to show the workload when in turbulence, when overshooting a radial or glideslope, and when copying air traffic control clearances.
Tempo: A Toolkit for the Timed Input/Output Automata Formalism
2008-01-30
Generation of distributed code from specifications. F.4.3 [Formal Languages]: Tempo; D.3 [Programming Languages]. Many distributed systems involve a combination of ... and require the simulator to check the assertions after every single step ... The chek(i) transition is enabled when process i's program counter is set to ... output foo(n: Int) states x: Int := 10; transitions ... The Tempo simulator addresses this issue by putting the modeler in charge of resolving the non
Turbofan aft duct suppressor study program listing and user's guide
NASA Technical Reports Server (NTRS)
Joshi, M. C.; Kraft, R. E.
1983-01-01
A description of the structure of the Annular Flow Duct Program (AFDP) for the calculation of acoustic suppression due to treatment in a finite length annular duct carrying sheared flow is presented. Although most appropriate for engine exhaust ducts, this program can be used to study sound propagation in any duct that maintains annular geometry over a considerable length of the duct. The program is based on the modal analysis of sound propagation in ducts with axial segments of different wall impedances. For specified duct geometry, wall impedance, flow and acoustic conditions in the duct (including mode amplitude distribution of the source) and duct termination reflection characteristics, the program calculates the suppression due to the treatment in the duct. The presence of forward and backward traveling modes in the duct due to the reflection and redistribution of modes at segment interfaces and duct end terminations are taken into account in the calculations. The effects of thin wall boundary layers (with a linear or mean flow velocity profile) on the acoustic propagation are also included in the program. A functional description of the major subroutines is included and a sample run is provided with an explanation of the output.
FORTRAN manpower account program
NASA Technical Reports Server (NTRS)
Strand, J. N.
1972-01-01
Computer program for determining manpower costs for full time, part time, and contractor personnel is discussed. Twelve different tables resulting from computer output are described. Program is written in FORTRAN 4 for IBM 360/65 computer.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Process Validation Table (PVT) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network registration services for Information Sharing Protocol (ISP) graphical-user-interface (GUI) computer programs. Heretofore, ISP PVT programming tasks have required many method calls to identify, query, and interpret the connections and messages exchanged between a client and a PVT server. Normally, programmers have utilized direct access to UNIX socket libraries to implement the PVT protocol queries, necessitating the use of many lines of source code to perform frequent tasks. Now, the X-Windows PVT Widget Class encapsulates ISP client-server network registration management tasks within the framework of an X Windows widget. Use of the widget framework enables an X Windows GUI program to interact with PVT services in an abstract way and in the same manner as that of other graphical widgets, making it easier to program PVT clients. Wrapping the PVT services inside the widget framework enables a programmer to treat a PVT server interface as though it were a GUI. Moreover, an alternate subclass could implement another service in a widget of the same type. This program was written by Matthew R. Barry of United Space Alliance for Johnson Space Center. For further information, contact the Johnson Technology Transfer Office at (281) 483-3809. MSC-23582
Shuttle Data Center File-Processing Tool in Java
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
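The interleaving step can be sketched as follows in Python; the per-file record layout (timestamp, parameter, value in CSV) is an assumption, since the actual SDC formats are not described above.

```python
"""Sketch of time-ordered interleaving of samples from several archive files."""
import csv
import heapq

def read_samples(path):
    """Yield (timestamp, parameter, value) tuples from one CSV file."""
    with open(path, newline="") as fh:
        for t, name, value in csv.reader(fh):
            yield (float(t), name, float(value))

def interleave(paths, out_path):
    """Merge several per-file streams into one stream ordered by timestamp."""
    streams = [read_samples(p) for p in paths]
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["timestamp", "parameter", "value"])
        for rec in heapq.merge(*streams, key=lambda r: r[0]):
            writer.writerow(rec)

# interleave(["ch1.csv", "ch2.csv"], "merged.csv")   # hypothetical file names
```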
A computer program to trace seismic ray distribution in complex two-dimensional geological models
Yacoub, Nazieh K.; Scott, James H.
1970-01-01
A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program includes problem identification, control parameters, model coordinates and elastic parameter for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries, computes the total travel time, total travel distance and other parameters for rays arising at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in FORTRAN IV language. The listing of the program is shown in the Appendix, with an example output from a CDC-6600 computer.
NASA Technical Reports Server (NTRS)
Aucoin, P. J.; Stewart, J.; Mckay, M. F. (Principal Investigator)
1980-01-01
This document presents instructions for analysts who use EOD-LARSYS as programmed on the Purdue University IBM 370/148 (recently replaced by the IBM 3031) computer. It presents sample applications, control cards, and error messages for all processors in the system and gives detailed descriptions of the mathematical procedures and information needed to execute the system and obtain the desired output. EOD-LARSYS is the JSC version of an integrated batch system for analysis of multispectral scanner imagery data. The data included are designed for use with the as-built documentation (volume 3) and the program listings (volume 4). The system is operational from remote terminals at Johnson Space Center under the virtual machine/conversational monitor system environment.
NASA Technical Reports Server (NTRS)
Reichert, R. S.; Biringen, S.; Howard, J. E.
1999-01-01
LINER is a system of Fortran 77 codes which performs a 2D analysis of acoustic wave propagation and noise suppression in a rectangular channel with a continuous liner at the top wall. This new implementation is designed to streamline the usage of the several codes making up LINER, resulting in a useful design tool. Major input parameters are placed in two main data files, input.inc and num.prm. Output data appear in the form of ASCII files as well as a choice of GNUPLOT graphs. Section 2 briefly describes the physical model. Section 3 discusses the numerical methods; Section 4 gives a detailed account of program usage, including input formats and graphical options. A sample run is also provided. Finally, Section 5 briefly describes the individual program files.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, R. L.
1976-06-14
Program GRAY is written to perform the matrix manipulations necessary to convert black-body radiation heat-transfer view factors to gray-body view factors as required by thermal analyzer codes. The black-body view factors contain only geometric relationships. Program GRAY allows the effects of multiple gray-body reflections to be included. The resulting effective gray-body factors can then be used with the corresponding fourth-power temperature differences to obtain the net radiative heat flux. The program is written to accept a matrix input or the card image output generated by the black-body view factor program CNVUFAC. The resulting card image output generated by GRAY is in a form usable by the TRUMP thermal analyzer.
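One standard matrix formulation that folds multiple gray-body reflections into effective exchange factors is the Gebhart-factor method, sketched below in Python; GRAY's internal algorithm and conventions may differ, so this is only an illustration of the idea.

```python
"""Gebhart-factor sketch: effective gray-body exchange factors from
black-body view factors F and surface emissivities eps."""
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def gebhart_factors(F, eps):
    """Solve B = F*E + F*(I-E)*B  ->  B = (I - F(I-E))^(-1) F E, E = diag(eps)."""
    F = np.asarray(F, float)
    E = np.diag(eps)
    I = np.eye(len(eps))
    return np.linalg.solve(I - F @ (I - E), F @ E)

def net_radiative_flux(F, eps, A, T):
    """Net heat flow leaving each surface, W (fourth-power temperature differences)."""
    B = gebhart_factors(F, eps)
    T4 = np.asarray(T, float) ** 4
    return np.array([
        SIGMA * eps[i] * A[i] * sum(B[i, j] * (T4[i] - T4[j]) for j in range(len(T)))
        for i in range(len(T))
    ])

# Two parallel plates (F = [[0,1],[1,0]]) as a quick sanity check.
F = [[0.0, 1.0], [1.0, 0.0]]
print(net_radiative_flux(F, eps=[0.8, 0.8], A=[1.0, 1.0], T=[500.0, 300.0]))
```

For two parallel plates with emissivity 0.8 on both sides, the classical exchange factor is 1/(1/0.8 + 1/0.8 - 1), about 0.667, which the sanity check at the bottom reproduces.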
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
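The "interpolate to common output times" capability can be sketched in a few lines of Python; the signal names, sample rates, and the derived calculated signal below are illustrative, not GetData's.

```python
"""Sketch: resample several time-history signals onto one common time base
and derive a calculated output signal from the inputs."""
import numpy as np

def resample(t_out, t_in, x_in):
    """Linear interpolation of one signal onto the common output times."""
    return np.interp(t_out, t_in, x_in)

# Two input "files": the same segment sampled at different rates.
t_alpha = np.linspace(0.0, 10.0, 51)                 # 5 Hz angle-of-attack
alpha = 2.0 * np.sin(0.5 * t_alpha)
t_q = np.linspace(0.0, 10.0, 201)                    # 20 Hz pitch rate
q = np.cos(0.5 * t_q)

t_out = np.arange(0.0, 10.0, 0.1)                    # common 10 Hz time base
alpha_out = resample(t_out, t_alpha, alpha)
q_out = resample(t_out, t_q, q)

# A calculated output signal as a function of the input signals (arbitrary).
calc = alpha_out * q_out
print(np.column_stack([t_out, alpha_out, q_out, calc])[:5])
```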
GROSS- GAMMA RAY OBSERVATORY ATTITUDE DYNAMICS SIMULATOR
NASA Technical Reports Server (NTRS)
Garrick, J.
1994-01-01
The Gamma Ray Observatory (GRO) spacecraft will constitute a major advance in gamma ray astronomy by offering the first opportunity for comprehensive observations in the range of 0.1 to 30,000 megaelectronvolts (MeV). The Gamma Ray Observatory Attitude Dynamics Simulator, GROSS, is designed to simulate this mission. The GRO Dynamics Simulator consists of three separate programs: the Standalone Profile Program; the Simulator Program, which contains the Simulation Control Input/Output (SCIO) Subsystem, the Truth Model (TM) Subsystem, and the Onboard Computer (OBC) Subsystem; and the Postprocessor Program. The Standalone Profile Program models the environment of the spacecraft and generates a profile data set for use by the simulator. This data set contains items such as individual external torques; GRO spacecraft, Tracking and Data Relay Satellite (TDRS), and solar and lunar ephemerides; and star data. The Standalone Profile Program is run before a simulation. The SCIO subsystem is the executive driver for the simulator. It accepts user input, initializes parameters, controls simulation, and generates output data files and simulation status display. The TM subsystem models the spacecraft dynamics, sensors, and actuators. It accepts ephemerides, star data, and environmental torques from the Standalone Profile Program. With these and actuator commands from the OBC subsystem, the TM subsystem propagates the current state of the spacecraft and generates sensor data for use by the OBC and SCIO subsystems. The OBC subsystem uses sensor data from the TM subsystem, a Kalman filter (for attitude determination), and control laws to compute actuator commands to the TM subsystem. The OBC subsystem also provides output data to the SCIO subsystem for output to the analysts. The Postprocessor Program is run after simulation is completed. It generates printer and CRT plots and tabular reports of the simulated data at the direction of the user. GROSS is written in FORTRAN 77 and ASSEMBLER and has been implemented on a VAX 11/780 under VMS 4.5. It has a virtual memory requirement of 255k. GROSS was developed in 1986.
Genetic programs constructed from layered logic gates in single cells
Moon, Tae Seok; Lou, Chunbo; Tamsir, Alvin; Stanton, Brynne C.; Voigt, Christopher A.
2014-01-01
Genetic programs function to integrate environmental sensors, implement signal processing algorithms and control expression dynamics. These programs consist of integrated genetic circuits that individually implement operations ranging from digital logic to dynamic circuits, and they have been used in various cellular engineering applications, including the implementation of process control in metabolic networks and the coordination of spatial differentiation in artificial tissues. A key limitation is that the circuits are based on biochemical interactions occurring in the confined volume of the cell, so the size of programs has been limited to a few circuits. Here we apply part mining and directed evolution to build a set of transcriptional AND gates in Escherichia coli. Each AND gate integrates two promoter inputs and controls one promoter output. This allows the gates to be layered by having the output promoter of an upstream circuit serve as the input promoter for a downstream circuit. Each gate consists of a transcription factor that requires a second chaperone protein to activate the output promoter. Multiple activator–chaperone pairs are identified from type III secretion pathways in different strains of bacteria. Directed evolution is applied to increase the dynamic range and orthogonality of the circuits. These gates are connected in different permutations to form programs, the largest of which is a 4-input AND gate that consists of 3 circuits that integrate 4 inducible systems, thus requiring 11 regulatory proteins. Measuring the performance of individual gates is sufficient to capture the behaviour of the complete program. Errors in the output due to delays (faults), a common problem for layered circuits, are not observed. This work demonstrates the successful layering of orthogonal logic gates, a design strategy that could enable the construction of large, integrated circuits in single cells. PMID:23041931
Dieter, Peter Erich
2009-07-01
The Carl Gustav Carus Faculty of Medicine, University of Technology Dresden, Germany, was founded in 1993 after the reunification of Germany. In 1999, a reform process of medical education was started together with Harvard Medical International. The traditional teacher- and discipline-centred curriculum was displaced by a student-centred, interdisciplinary and integrative curriculum, which has been named Dresden Integrative Patient/Problem-Oriented Learning (DIPOL). The reform process was accompanied and supported by a parallel, ongoing Faculty Development Program. In 2004, a Quality Management Program in medical education was implemented, and in 2005 medical education received DIN EN ISO 9001:2000 certification. The Quality Management Program and DIN EN ISO 9001:2000 certification were, and remain, unique among the 34 medical schools in Germany. The students play a very important strategic role in all processes. They are members of all committees, such as the Faculty Board, the Board of Study Affairs (with equal representation) and the ongoing audits in the Quality Management Program. The Faculty Development Program, including a reform in medical education, the establishment of the Quality Management Program and the certification, resulted in an improvement of the quality and output of medical education and was accompanied by an improvement of the quality and output of basic sciences, clinical research and interdisciplinary patient care.
Natural Resource Information System. Volume 2: System operating procedures and instructions
NASA Technical Reports Server (NTRS)
1972-01-01
A total computer software system description is provided for the prototype Natural Resource Information System designed to store, process, and display data of maximum usefulness to land management decision making. Program modules are described, as are the computer file design, file updating methods, digitizing process, and paper tape conversion to magnetic tape. Operating instructions for the system, data output, printed output, and graphic output are also discussed.
Young, Kevin L. [Idaho Falls, ID]; Hungate, Kevin E. [Idaho Falls, ID]
2010-02-23
A system for providing operational feedback to a user of a detection probe may include an optical sensor to generate data corresponding to a position of the detection probe with respect to a surface; a microprocessor to receive the data; a software medium having code to process the data with the microprocessor and pre-programmed parameters, and making a comparison of the data to the parameters; and an indicator device to indicate results of the comparison. A method of providing operational feedback to a user of a detection probe may include generating output data with an optical sensor corresponding to the relative position with respect to a surface; processing the output data, including comparing the output data to pre-programmed parameters; and indicating results of the comparison.
A study of low-cost reliable actuators for light aircraft. Part B: Appendices
NASA Technical Reports Server (NTRS)
Eijsink, H.; Rice, M.
1978-01-01
Computer programs written in FORTRAN are given for time response calculations on pneumatic and linear hydraulic actuators. The programs are self-explanatory with comment statements. Program output is also included.
NASA Technical Reports Server (NTRS)
Gendreau, Keith (Inventor); Martins, Jose Vanderlei (Inventor); Arzoumanian, Zaven (Inventor)
2010-01-01
An X-ray diffraction and X-ray fluorescence instrument for analyzing samples without sample preparation includes an X-ray source configured to output a collimated X-ray beam comprising a continuum spectrum of X-rays to a predetermined coordinate, and a photon-counting X-ray imaging spectrometer disposed to receive X-rays output from an unprepared sample disposed at the predetermined coordinate upon exposure of the unprepared sample to the collimated X-ray beam. The X-ray source and the photon-counting X-ray imaging spectrometer are arranged in a reflection geometry relative to the predetermined coordinate.
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
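Equivalent calculations, sketched with NumPy/SciPy rather than the Excel toolset itself; the sample numbers are arbitrary.

```python
"""Descriptive statistics, normal-distribution estimates, a normal fit through
two (value, probability) points, and a linear regression with significance."""
import numpy as np
from scipy import stats

x = np.array([4.1, 4.8, 5.2, 5.9, 6.3, 7.0, 7.4])

# Descriptive statistics.
print("mean", x.mean(), "std", x.std(ddof=1), "min", x.min(), "max", x.max())

# Normal Distribution Estimates: value at a cumulative probability,
# given a sample mean and standard deviation.
print("95th percentile", stats.norm.ppf(0.95, loc=x.mean(), scale=x.std(ddof=1)))

# Normal Distribution from two Data Points: solve for (mu, sigma) such that
# P(X <= v1) = p1 and P(X <= v2) = p2.
v1, p1, v2, p2 = 10.0, 0.10, 20.0, 0.90
z1, z2 = stats.norm.ppf([p1, p2])
sigma = (v2 - v1) / (z2 - z1)
mu = v1 - z1 * sigma
print("fitted mu, sigma:", mu, sigma)

# Linear Regression-ANOVA: fit y = f(x) and check significance via the p-value.
y = 1.5 * x + np.random.default_rng(2).normal(0.0, 0.3, x.size)
fit = stats.linregress(x, y)
print("slope", fit.slope, "intercept", fit.intercept, "p-value", fit.pvalue)
```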
High speed cylindrical roller bearing analysis. SKF computer program CYBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Dyba, G. J.; Kleckner, R. J.
1981-01-01
CYBEAN (CYlindrical BEaring ANalysis) was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady state and time transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions were treated in the computation of raceway and flange contacts. The practical and correct implementation of CYBEAN is discussed. The capability to execute the program at four different levels of complexity was included. In addition, the program was updated to properly direct roller-to-raceway contact load vectors automatically in those cases where roller or ring profiles have small radii of curvature. Input and output architectures containing guidelines for use and two sample executions are detailed.
NASA Astrophysics Data System (ADS)
Reinert, K. A.
The use of linear decision rules (LDR) and chance constrained programming (CCP) to optimize the performance of wind energy conversion clusters coupled to storage systems is described. Storage is modelled by LDR and output by CCP. The linear allocation rule and linear release rule prescribe the size and optimize a storage facility with a bypass. Chance constraints are introduced to explicitly treat reliability in terms of an appropriate value from an inverse cumulative distribution function. Details of deterministic programming structure and a sample problem involving a 500 kW and a 1.5 MW WECS are provided, considering an installed cost of $1/kW. Four demand patterns and three levels of reliability are analyzed for optimizing the generator choice and the storage configuration for base load and peak operating conditions. Deficiencies in ability to predict reliability and to account for serial correlations are noted in the model, which is concluded useful for narrowing WECS design options.
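The chance-constraint step, converting a reliability requirement into a deterministic bound through an inverse cumulative distribution function, can be sketched as follows; the normality assumption and all numbers are illustrative, not the study's wind model.

```python
"""Sketch of a chance constraint made deterministic with an inverse CDF."""
from scipy import stats

mu, sigma = 380.0, 140.0   # assumed mean / std of hourly wind farm output, kW
demand = 300.0             # firm load to be met, kW
alpha = 0.95               # required reliability, P(wind + storage >= demand)

# P(W + S >= demand) >= alpha with W ~ N(mu, sigma)
#   <=>  S >= demand - mu + z_alpha * sigma,  where z_alpha = Phi^-1(alpha)
z = stats.norm.ppf(alpha)
S_min = max(0.0, demand - mu + z * sigma)
print(f"z({alpha:.2f}) = {z:.3f} -> minimum firm storage discharge ~ {S_min:.0f} kW")
```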
Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman
1993-01-01
This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in December 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.
MISSILE DATA COMPENDIUM (DATCOM) User Manual 2014 Revision
2014-10-01
2014 revision of the Missile Datcom computer program. It supersedes AFRL-RB-WP-TR-2011-3071. Subject terms: aerodynamics, stability and control. The manual covers, among other topics, the TRIM namelist (trim aerodynamics), the aerodynamic output summary, and partial output.
Computer programs for thermodynamic and transport properties of hydrogen
NASA Technical Reports Server (NTRS)
Hall, W. J.; Mc Carty, R. D.; Roder, H. M.
1968-01-01
Computer program subroutines provide the thermodynamic and transport properties of hydrogen in tabular form. The programs provide 18 combinations of input and output variables. This program is written in FORTRAN 4 for use on the IBM 7044 or CDC 3600 computers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, Terrence D.
2017-04-01
This report specifies the electronic file format agreed upon for the normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from different sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), monitoring and sampling]. The CSV file format also is suitable because the normalized radiological data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by NA-84's Consequence Management Program.
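A minimal Python sketch of writing and reading such a CSV file follows; the column names and units are assumptions for illustration, since the agreed header layout is defined in the report itself.

```python
"""Write and read normalized radiological data as CSV (hypothetical columns)."""
import csv

FIELDS = ["timestamp_utc", "latitude", "longitude", "measurement_type",
          "value", "units", "source_system"]

records = [
    {"timestamp_utc": "2017-12-07T15:00:00Z", "latitude": 35.05,
     "longitude": -106.54, "measurement_type": "exposure_rate",
     "value": 0.012, "units": "mR/h", "source_system": "AMS"},
    {"timestamp_utc": "2017-12-07T15:05:00Z", "latitude": 35.06,
     "longitude": -106.55, "measurement_type": "dose_rate",
     "value": 0.10, "units": "uSv/h", "source_system": "RAMS"},
]

with open("normalized_rad_data.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)

with open("normalized_rad_data.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        print(row["measurement_type"], row["value"], row["units"])
```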
NASA Technical Reports Server (NTRS)
Ohri, A. K.; Owen, H. A.; Wilson, T. G.; Rodriguez, G. E.
1974-01-01
The simulation of converter-controller combinations by means of a flexible digital computer program which produces output to a graphic display is discussed. The procedure is an alternative to mathematical analysis of converter systems. The types of computer programming involved in the simulation are described. Schematic diagrams, state equations, and output equations are displayed for four basic forms of inductor-energy-storage dc to dc converters. Mathematical models are developed to show the relationship of the parameters.
1992-01-09
... Materials ... Deply of Laminated Panels with Perforation due to Impact (John Lair) ... Actuator Location and Optimal Control Design for Flexible Structures ... The procedure is the focusing and alignment of the UV source. Though the output of a vapor lamp is nonuniform, intensity peaks can be smoothed by expanding the ... surface, localized surface heating may occur. Secondly, the output of a mercury vapor lamp is nonuniform, requiring diffusion to obtain a more uniform
User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model
NASA Technical Reports Server (NTRS)
Paul, D. D., Jr.
1972-01-01
The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.
2016-01-01
outputs, customers, and outcomes (see Figure 2.1). In the Taylor-Powell and Henert simple three-part example, the food would constitute an input, finding... [Figure: logic-model elements, including inputs, activities, outputs, intermediate and final customers, annual and strategic goals, management objectives, operations, mission, and external factors.] Partners are the individuals or organizations that work with programs to conduct activities or enable outputs. • Customers (intermediate and final
NASA Technical Reports Server (NTRS)
Knauber, R. N.
1982-01-01
A FORTRAN IV coded computer program is presented for post-flight analysis of a missile's control surface response. It includes preprocessing of digitized telemetry data for time lags, biases, non-linear calibration changes, and filtering. Measurements include autopilot attitude rate and displacement gyro output and four control surface deflections. Simple first order lags are assumed for the pitch, yaw, and roll axes of control. Each actuator is also assumed to be represented by a first order lag. Mixing of pitch, yaw, and roll commands to four control surfaces is assumed. A pseudo-inverse technique is used to obtain the pitch, yaw, and roll components from the four measured deflections. This program has been used for over 10 years on the NASA/SCOUT launch vehicle for post-flight analysis and was helpful in detecting incipient actuator stall due to excessive hinge moments. The program is currently set up for a CDC CYBER 175 computer system. It requires 34K words of memory and contains 675 cards. A sample problem presented herein, including the optional plotting, requires eleven (11) seconds of central processor time.
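The pseudo-inverse step can be sketched in Python as follows; the 4-by-3 mixing matrix is a made-up geometry for illustration, not the SCOUT control mixing.

```python
"""Sketch: recover pitch/yaw/roll commands from four measured deflections
via the pseudo-inverse of an assumed surface-mixing matrix."""
import numpy as np

# Hypothetical mixing: each surface deflection = M @ [pitch, yaw, roll]
M = np.array([
    [ 1.0,  0.0,  1.0],   # surface 1
    [-1.0,  0.0,  1.0],   # surface 2
    [ 0.0,  1.0,  1.0],   # surface 3
    [ 0.0, -1.0,  1.0],   # surface 4
])

def recover_commands(deflections):
    """Least-squares pitch/yaw/roll components from four measured deflections."""
    return np.linalg.pinv(M) @ np.asarray(deflections)

true_cmd = np.array([2.0, -1.0, 0.5])                 # degrees, say
measured = M @ true_cmd + np.random.default_rng(3).normal(0.0, 0.05, 4)
print("recovered pitch/yaw/roll:", np.round(recover_commands(measured), 3))
```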
NASA Technical Reports Server (NTRS)
Raju, I. S.; Newman, J. C., Jr.
1993-01-01
A computer program, surf3d, that uses the 3D finite-element method to calculate the stress-intensity factors for surface, corner, and embedded cracks in finite-thickness plates with and without circular holes, was developed. The cracks are assumed to be either elliptic or part-elliptic in shape. The computer program uses eight-noded hexahedral elements to model the solid. The program uses a skyline storage and solver. The stress-intensity factors are evaluated using the force method, the crack-opening displacement method, and the 3D virtual crack closure method. In the manual the input to and the output of the surf3d program are described. This manual also demonstrates the use of the program and describes the calculation of the stress-intensity factors. Several examples with sample data files are included with the manual. To facilitate modeling of the user's crack configuration and loading, a companion preprocessor program, gensurf, that generates the data for surf3d was also developed. The gensurf program is a three-dimensional mesh generator that requires minimal input and builds a complete data file for surf3d. The program surf3d is operational on Unix machines such as CRAY Y-MP, CRAY-2, and Convex C-220.
Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs
NASA Astrophysics Data System (ADS)
Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.
2016-07-01
Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
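A Monte Carlo sketch of the images-versus-points trade-off, with illustrative parameters only (a Beta model for per-image patchiness and binomial point counts), not the paper's simulation design.

```python
"""Simulate percent-cover estimation under different image/point allocations."""
import numpy as np

rng = np.random.default_rng(4)

def simulate_cover(n_images, n_points, true_cover=0.10, patchiness=8.0, reps=2000):
    """Return mean and std of the cover estimate over `reps` simulated transects.
    Per-image cover is Beta-distributed around true_cover to mimic patchiness;
    each image is then scored with n_points Bernoulli point counts."""
    a = true_cover * patchiness
    b = (1.0 - true_cover) * patchiness
    image_cover = rng.beta(a, b, size=(reps, n_images))
    hits = rng.binomial(n_points, image_cover)         # points landing on biota
    estimates = hits.sum(axis=1) / (n_images * n_points)
    return estimates.mean(), estimates.std()

for n_images, n_points in [(10, 50), (20, 25), (50, 10), (50, 50)]:
    m, s = simulate_cover(n_images, n_points)
    print(f"{n_images:2d} images x {n_points:2d} points: "
          f"mean {100*m:.2f}%  sd {100*s:.2f}%")
```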
Administrative Restructuring of a Residency Training Program for Improved Efficiency and Output
ERIC Educational Resources Information Center
van Zyl, Louis T.; Finch, Susan J.; Davidson, Paul R.; Arboleda-Florez, Julio
2005-01-01
Objectives: Canadian residency training programs (RTP) have a program director (PD) and a residency program committee (RPC) overseeing program administration. Limited guidance is available about the ideal administrative structure of an RTP. This article describes administrative load in Canadian RTPs, presents a novel approach to delegating core…
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.; Jones, Scott M.
1991-01-01
This analysis and this computer code apply to full, split, and dual expander cycles. Heat regeneration from the turbine exhaust to the pump exhaust is allowed. The combustion process is modeled as one of chemical equilibrium in an infinite-area or a finite-area combustor. Gas composition in the nozzle may be either equilibrium or frozen during expansion. This report, which serves as a users guide for the computer code, describes the system, the analysis methodology, and the program input and output. Sample calculations are included to show effects of key variables such as nozzle area ratio and oxidizer-to-fuel mass ratio.
The NYU inverse swept wing code
NASA Technical Reports Server (NTRS)
Bauer, F.; Garabedian, P.; Mcfadden, G.
1983-01-01
An inverse swept wing code is described that is based on the widely used transonic flow program FLO22. The new code incorporates a free boundary algorithm permitting the pressure distribution to be prescribed over a portion of the wing surface. A special routine is included to calculate the wave drag, which can be minimized in its dependence on the pressure distribution. An alternate formulation of the boundary condition at infinity was introduced to enhance the speed and accuracy of the code. A FORTRAN listing of the code and a listing of a sample run are presented. There is also a user's manual as well as glossaries of input and output parameters.
Automated nystagmus analysis. [on-line computer technique for eye data processing
NASA Technical Reports Server (NTRS)
Oman, C. M.; Allum, J. H. J.; Tole, J. R.; Young, L. R.
1973-01-01
Several methods have recently been used for on-line analysis of nystagmus. A digital computer program has been developed to accept sampled records of eye position, detect fast phase components, and output cumulative slow phase position, continuous slow phase velocity, instantaneous fast phase frequency, and other parameters. The slow phase velocity is obtained by differentiation of the calculated cumulative position rather than the original eye movement record. Also, a prototype analog device has been devised which calculates the velocity of the slow phase component during caloric testing. Examples of clinical and research eye movement records analyzed with these devices are shown.
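A rough Python sketch of the digital processing chain described above, with an assumed velocity threshold for fast-phase detection and a synthetic sawtooth record standing in for real eye-position data.

```python
"""Sketch: fast-phase detection, cumulative slow-phase position, and
slow-phase velocity from a sampled eye-position record."""
import numpy as np

def analyze_nystagmus(eye_pos, fs=200.0, fast_thresh=80.0):
    """eye_pos in degrees, fs in Hz, fast_thresh in deg/s."""
    vel = np.gradient(eye_pos) * fs                    # instantaneous velocity
    fast = np.abs(vel) > fast_thresh                   # fast-phase samples

    # Cumulative slow-phase position: integrate velocity, holding it at zero
    # during fast phases so the saccadic resets are removed.
    slow_vel = np.where(fast, 0.0, vel)
    cum_slow_pos = np.cumsum(slow_vel) / fs
    slow_phase_velocity = np.gradient(cum_slow_pos) * fs
    fast_phase_rate = (np.count_nonzero(np.diff(fast.astype(int)) == 1)
                       / (len(eye_pos) / fs))          # beats per second
    return cum_slow_pos, slow_phase_velocity, fast_phase_rate

# Synthetic sawtooth nystagmus: 10 deg/s slow drift with periodic resets.
t = np.arange(0.0, 10.0, 1.0 / 200.0)
eye = 10.0 * t % 5.0 - 2.5
cum, spv, rate = analyze_nystagmus(eye)
print("mean slow-phase velocity ~", round(spv.mean(), 1), "deg/s,",
      "fast-phase rate ~", round(rate, 1), "per s")
```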
CHIRAL--A Computer Aided Application of the Cahn-Ingold-Prelog Rules.
ERIC Educational Resources Information Center
Meyer, Edgar F., Jr.
1978-01-01
A computer program is described for identification of chiral centers in molecules. Essential input to the program includes both atomic and bonding information. The program does not require computer graphic input-output. (BB)
Translator program converts computer printout into braille language
NASA Technical Reports Server (NTRS)
Powell, R. A.
1967-01-01
Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.
SIPT: a seismic refraction inverse modeling program for timeshare terminal computer systems
Scott, James Henry
1977-01-01
SIPT is an interactive Fortran computer program that was developed for use with a timeshare computer system, with program control information submitted from a remote terminal and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).
SIPB: a seismic refraction inverse modeling program for batch computer systems
Scott, James Henry
1977-01-01
SIPB is an interactive Fortran computer program that was developed for use with a timeshare computer system with program control information submitted from a remote terminal, and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).
Low Power, High Voltage Power Supply with Fast Rise/Fall Time
NASA Technical Reports Server (NTRS)
Bearden, Douglas B. (Inventor)
2007-01-01
A low power, high voltage power supply system includes a high voltage power supply stage and a preregulator for programming the power supply stage so as to produce an output voltage which is a predetermined fraction of a desired voltage level. The power supply stage includes a high voltage, voltage doubler stage connected to receive the output voltage from the preregulator and for, when activated, providing amplification of the output voltage to the desired voltage level. A first feedback loop is connected between the output of the preregulator and an input of the preregulator while a second feedback loop is connected between the output of the power supply stage and the input of the preregulator.
Communications technology satellite output-tube design and development
NASA Technical Reports Server (NTRS)
Connolly, D. J.; Forman, R.; Jones, C. L.; Kosmahl, H.; Sharp, G. R.
1977-01-01
The design and development of a 200-watt-output, traveling-wave tube (TWT) for the Communications Technology Satellite (CTS) is discussed, with emphasis on the design evolution during the manufacturing phase of the development program. Possible further improvements to the tube design are identified.
Improvement of Computer Software Quality through Software Automated Tools.
1986-08-30
...information that are returned from the tools to the human user, and the forms in which these outputs are presented. ... This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types...
Planetary quarantine computer applications
NASA Technical Reports Server (NTRS)
Rafenstein, M.
1973-01-01
The computer programs pertaining to planetary quarantine activities within the Project Engineering Division, both at the Air Force Eastern Test Range and on site at the Jet Propulsion Laboratory, are identified. A brief description of each program and its inputs is given, and typical program outputs are shown.
Load estimator (LOADEST): a FORTRAN program for estimating constituent loads in streams and rivers
Runkel, Robert L.; Crawford, Charles G.; Cohn, Timothy A.
2004-01-01
LOAD ESTimator (LOADEST) is a FORTRAN program for estimating constituent loads in streams and rivers. Given a time series of streamflow, additional data variables, and constituent concentration, LOADEST assists the user in developing a regression model for the estimation of constituent load (calibration). Explanatory variables within the regression model include various functions of streamflow, decimal time, and additional user-specified data variables. The formulated regression model then is used to estimate loads over a user-specified time interval (estimation). Mean load estimates, standard errors, and 95 percent confidence intervals are developed on a monthly and(or) seasonal basis. The calibration and estimation procedures within LOADEST are based on three statistical estimation methods. The first two methods, Adjusted Maximum Likelihood Estimation (AMLE) and Maximum Likelihood Estimation (MLE), are appropriate when the calibration model errors (residuals) are normally distributed. Of the two, AMLE is the method of choice when the calibration data set (time series of streamflow, additional data variables, and concentration) contains censored data. The third method, Least Absolute Deviation (LAD), is an alternative to maximum likelihood estimation when the residuals are not normally distributed. LOADEST output includes diagnostic tests and warnings to assist the user in determining the appropriate estimation method and in interpreting the estimated loads. This report describes the development and application of LOADEST. Sections of the report describe estimation theory, input/output specifications, sample applications, and installation instructions.
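As an illustration of the kind of regression LOADEST calibrates, the sketch below fits a log-linear load model with streamflow and seasonal (decimal time) terms by ordinary least squares and then predicts a load for a new day. It is a minimal stand-in, not LOADEST: the AMLE/MLE/LAD estimators, censoring support, and retransformation bias correction are omitted, and all data values are hypothetical.

```python
# Illustrative only (not LOADEST): ordinary least squares fit of a log-linear
# load model, ln(L) = b0 + b1*ln(Q) + b2*ln(Q)^2 + b3*sin(2*pi*t) + b4*cos(2*pi*t),
# followed by a load prediction for a new observation. All values are hypothetical.
import numpy as np

def design_matrix(q, t):
    lnq = np.log(q)
    return np.column_stack([np.ones_like(q), lnq, lnq**2,
                            np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

# calibration data: streamflow (ft^3/s), decimal time (years), constituent load (kg/d)
q_cal = np.array([120., 340., 80., 510., 260., 95., 410., 150.])
t_cal = np.array([2003.10, 2003.22, 2003.35, 2003.48, 2003.60, 2003.73, 2003.85, 2003.98])
load_cal = np.array([14., 60., 9., 120., 40., 11., 85., 18.])

beta, *_ = np.linalg.lstsq(design_matrix(q_cal, t_cal), np.log(load_cal), rcond=None)

# estimation step: predict the load on a new day (no retransformation bias correction here)
x_new = design_matrix(np.array([200.]), np.array([2004.2]))
print("predicted load (kg/d):", np.exp(x_new @ beta).item())
```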
NASA Technical Reports Server (NTRS)
Dejarnette, F. R.; Jones, M. H.
1971-01-01
A description of the computer program used for heating rate calculation for blunt bodies in hypersonic flow is given. The main program and each subprogram are described by defining the pertinent symbols involved and presenting a detailed flow diagram and complete computer program listing. Input and output parameters are discussed in detail. Listings are given for the computation of heating rates on (1) a blunted 15 deg half-angle cone at 20 deg incidence and Mach 10.6, (2) a blunted 70 deg slab delta wing at 10 deg incidence and Mach 8, and (3) the HL-10 lifting body at 20 deg incidence and Mach 10. In addition, the computer program output for two streamlines on the blunted 15 deg half-angle cone is listed. For Part 1, see N71-36186.
An interactive computer program for sizing spacecraft momentum storage devices
NASA Technical Reports Server (NTRS)
Wilcox, F. J., Jr.
1980-01-01
An interactive computer program was developed which computes the sizing requirements for nongimbaled reaction wheels, control moment gyros (CMG), and dual momentum control devices (DMCD) used in Earth-orbiting spacecraft. The program accepts as inputs the spacecraft's environmental disturbance torques, rotational inertias, maneuver rates, and orbital data. From these inputs, wheel weights are calculated for a range of radii and rotational speeds. The shape of the momentum wheel may be chosen to be a hoop, solid cylinder, or annular cylinder. The program provides graphic output illustrating the trade-off potential between the weight, radius, and wheel speed. A number of intermediate calculations, such as the X-, Y-, and Z-axis total momentum, the momentum absorption requirements for reaction wheels, CMG's, and DMCD's, and basic orbit analysis information, are also provided as program output.
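The basic trade behind the wheel-weight calculation can be sketched from the momentum relation H = I*omega with I = k*m*r^2 for the three rim shapes named above. The sketch below is not the NASA program; the required momentum, radii, and spin speeds are assumed values.

```python
# Minimal sketch of the wheel-sizing trade: solve H = k*m*r^2*omega for mass
# over a grid of radii and spin speeds for three rim shapes. Values assumed.
import numpy as np

H_req = 200.0                        # required momentum storage, N*m*s (assumed)
radii = np.linspace(0.1, 0.5, 5)     # candidate wheel radii, m
speeds_rpm = np.array([2000.0, 4000.0, 6000.0])

shape_factor = {                     # I = k * m * r^2 for the rim shape
    "hoop": 1.0,
    "solid cylinder": 0.5,
    "annular cylinder (inner radius 0.8 r)": 0.5 * (1.0 + 0.8**2),
}

for name, k in shape_factor.items():
    print(name)
    for rpm in speeds_rpm:
        omega = rpm * 2.0 * np.pi / 60.0           # spin speed, rad/s
        mass = H_req / (k * radii**2 * omega)      # kg, from H = k*m*r^2*omega
        print(f"  {rpm:6.0f} rpm -> mass (kg):", np.round(mass, 2))
```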
GDF v2.0, an enhanced version of GDF
NASA Astrophysics Data System (ADS)
Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos
2007-12-01
An improved version of the function estimation program GDF is presented. The main enhancements of the new version include: multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them in 5 (out of 10) datasets. Program summary: Title of program: GDF v2.0 Catalogue identifier: ADXC_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 98 147 No. of bytes in distributed program, including test data, etc.: 2 040 684 Distribution format: tar.gz Programming language: GNU C++ Computer: The program is designed to be portable to all systems running the GNU C++ compiler Operating system: Linux, Solaris, FreeBSD RAM: 200000 bytes Classification: 4.9 Does the new version supersede the previous version?: Yes Nature of problem: The technique of function estimation tries to discover from a series of input data a functional form that best describes them. This can be performed with the use of parametric models, whose parameters can adapt according to the input data. Solution method: Functional forms are created by genetic programming as approximate solutions to the symbolic regression problem. Reasons for new version: The GDF package was extended in order to be more flexible and user-customizable than the old package. The user can extend the package by defining his own error functions, and he can extend the grammar of the package by adding new functions to the function repertoire. Also, the new version can perform function estimation of multi-output functions, and it can be used for classification problems. Summary of revisions: The following features have been added to the package GDF: Multi-output function approximation. The package can now approximate any function f:R^n→R^m. This feature also gives the package the capability of performing classification and not only regression. User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package. This feature is limited to 3 functions, but this number can easily be increased. Capability of selecting the error function. In addition to the mean square error, the package now offers the user other error functions, such as the mean absolute square error and the maximum square error. Also, user-defined error functions can be added to the set of error functions. More verbose output. The main program displays more information to the user, as well as the default values for the parameters. Also, the package gives the user the capability to define an output file where the output of the gdf program for the testing set will be stored after the termination of the process. Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code. Running time: Depends on the training data.
Vibration Pattern Imager (VPI): A control and data acquisition system for scanning laser vibrometers
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Brown, Donald E.; Shaffer, Thomas A.
1993-01-01
The Vibration Pattern Imager (VPI) system was designed to control and acquire data from scanning laser vibrometer sensors. The PC computer based system uses a digital signal processing (DSP) board and an analog I/O board to control the sensor and to process the data. The VPI system was originally developed for use with the Ometron VPI Sensor, but can be readily adapted to any commercially available sensor which provides an analog output signal and requires analog inputs for control of mirror positioning. The sensor itself is not part of the VPI system. A graphical interface program, which runs on a PC under the MS-DOS operating system, functions in an interactive mode and communicates with the DSP and I/O boards in a user-friendly fashion through the aid of pop-up menus. Two types of data may be acquired with the VPI system: single point or 'full field.' In the single point mode, time series data is sampled by the A/D converter on the I/O board (at a user-defined sampling rate for a selectable number of samples) and is stored by the PC. The position of the measuring point (adjusted by mirrors in the sensor) is controlled via a mouse input. The mouse input is translated to output voltages by the D/A converter on the I/O board to control the mirror servos. In the 'full field' mode, the measurement point is moved over a user-selectable rectangular area. The time series data is sampled by the A/D converter on the I/O board (at a user-defined sampling rate for a selectable number of samples) and converted to a root-mean-square (rms) value by the DSP board. The rms 'full field' velocity distribution is then uploaded for display and storage on the PC.
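For reference, the 'full field' reduction described above collapses each grid point's sampled time series to a single rms value. A minimal sketch, using a synthetic velocity signal in place of the digitized sensor output:

```python
# Illustrative sketch of the 'full field' reduction: one grid point's sampled
# velocity time series is collapsed to a single rms value. Signal is synthetic.
import numpy as np

rng = np.random.default_rng(0)
fs = 10_000.0                        # user-defined sampling rate, Hz (assumed)
n = 1024                             # selectable number of samples per grid point
t = np.arange(n) / fs
velocity = 2.0e-3 * np.sin(2 * np.pi * 250.0 * t) + 1.0e-4 * rng.standard_normal(n)

rms = np.sqrt(np.mean(velocity**2))  # the value mapped at this grid point
print(f"rms velocity at this grid point: {rms:.3e}")
```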
NASA Astrophysics Data System (ADS)
Gaustad, K. L.; Desteese, J. G.
1993-07-01
A computer program was developed to analyze the viability of integrating superconducting magnetic energy storage (SMES) with proposed wind farm scenarios at a site near Browning, Montana. The program simulated an hour-by-hour account of the charge/discharge history of a SMES unit for a representative wind-speed year. Effects of power output, storage capacity, and power conditioning capability on SMES performance characteristics were analyzed on a seasonal, diurnal, and hourly basis. The SMES unit was assumed to be charged during periods when power output of the wind resource exceeded its average value. Energy was discharged from the SMES unit into the grid during periods of low wind speed to compensate for below-average output of the wind resource. The option of using SMES to provide power continuity for a wind farm supplemented by combustion turbines was also investigated. Levelizing the annual output of large wind energy systems operating in the Blackfeet area of Montana was found to require a storage capacity too large to be economically viable. However, it appears that intermediate-sized SMES economically levelize the wind energy output on a seasonal basis.
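A minimal sketch of the hour-by-hour dispatch rule described above follows: charge when wind output exceeds its average, discharge toward the average otherwise, subject to assumed energy-capacity and power-conditioning limits. It uses synthetic wind data and ignores conversion losses, so it only illustrates the bookkeeping, not the study's model.

```python
# Hedged sketch of the hour-by-hour SMES dispatch rule (not the original
# program). Capacity, power-conditioning rating, and wind data are assumed.
import numpy as np

rng = np.random.default_rng(0)
wind_mw = np.clip(rng.normal(30.0, 15.0, size=24 * 7), 0.0, None)  # one week, hourly
target = wind_mw.mean()                 # levelizing target, MW

capacity_mwh = 200.0                    # SMES energy capacity (assumed)
pcs_mw = 25.0                           # power-conditioning limit (assumed)
stored = capacity_mwh / 2.0             # start half full
delivered = []

for p in wind_mw:
    if p > target:                      # charge with the surplus
        charge = min(p - target, pcs_mw, capacity_mwh - stored)
        stored += charge
        delivered.append(p - charge)
    else:                               # discharge to cover the deficit
        discharge = min(target - p, pcs_mw, stored)
        stored -= discharge
        delivered.append(p + discharge)

delivered = np.array(delivered)
print(f"wind std: {wind_mw.std():.1f} MW -> delivered std: {delivered.std():.1f} MW")
```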
Latin Hypercube Sampling (LHS) UNIX Library/Standalone
DOE Office of Scientific and Technical Information (OSTI.GOV)
2004-05-13
The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions, so LHS UNIX Library/Standalone provides a way to generate multi-variate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS UNIX Library/Standalone uses the Latin Hypercube Sampling method (LHS) to generate samples. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, then values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
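The stratify-sample-permute mechanics described above can be sketched in a few lines; this is a generic illustration of LHS, not the LHS UNIX Library itself, and it omits the restricted-pairing (correlation) step. SciPy's scipy.stats.qmc.LatinHypercube offers a library implementation.

```python
# Generic Latin Hypercube Sampling sketch: one draw per equal-probability
# stratum for each variable, then an independent random shuffle per variable.
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_vars, seed=None):
    rng = np.random.default_rng(seed)
    # one uniform draw inside each of n equal-probability strata, per variable
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):                         # independent shuffle per variable
        u[:, j] = u[rng.permutation(n_samples), j]
    return u                                        # stratified values in (0, 1)

u = latin_hypercube(10, 2, seed=42)
x1 = norm.ppf(u[:, 0], loc=5.0, scale=1.0)          # map strata through inverse CDFs
x2 = 10.0 + 20.0 * u[:, 1]                          # a uniform(10, 30) input
print(np.column_stack([x1, x2]))
```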
Ozgen, Hacer; A. Ozcan, Yasar
2002-01-01
Objective To examine market competition and facility characteristics that can be related to technical efficiency in the production of multiple dialysis outputs from the perspective of the industrial organization model. Study Setting Freestanding dialysis facilities that operated in 1997 submitted cost report forms to the Health Care Financing Administration (HCFA), and offered all three outputs—outpatient dialysis, dialysis training, and home program dialysis. Data Sources The Independent Renal Facility Cost Report Data file (IRFCRD) from HCFA was utilized to obtain information on output and input variables and market and facility features for 791 multiple-output facilities. Information regarding population characteristics was obtained from the Area Resources File. Study Design Cross-sectional data for the year 1997 were utilized to obtain facility-specific technical efficiency scores estimated through Data Envelopment Analysis (DEA). A binary variable of efficiency status was then regressed against its market and facility characteristics and control factors in a multivariate logistic regression analysis. Principal Findings The majority of the facilities in the sample are functioning technically inefficiently. Neither the intensity of market competition nor a policy of dialyzer reuse has a significant effect on the facilities' efficiency. Technical efficiency is significantly associated, however, with type of ownership, with the interaction between the market concentration of for-profits and ownership type, and with affiliations with chains of different sizes. Nonprofit and government-owned facilities are more likely than their for-profit counterparts to become inefficient producers of renal dialysis outputs. On the other hand, that relationship between ownership form and efficiency is reversed as the market concentration of for-profits in a given market increases. Facilities that are members of large chains are more likely to be technically inefficient. Conclusions Facilities do not appear to benefit from joint production of a variety of dialysis outputs, which may explain the ongoing tendency toward single-output production. Ownership form does make a positive difference in production efficiency, but only in local markets where competition exists between nonprofit and for-profit facilities. The increasing inefficiency associated with membership in large chains suggests that the growing consolidation in the dialysis industry may not, in fact, be the strategy for attaining more technical efficiency in the production of multiple dialysis outputs. PMID:12132602
Understanding and Improving High-Performance I/O Subsystems
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.; Frieder, Gideon; Clark, A. James
1996-01-01
This research program has been conducted in the framework of the NASA Earth and Space Science (ESS) evaluations led by Dr. Thomas Sterling. In addition to many important research findings for NASA and the prestigious publications, the program has helped orient the doctoral research of two students toward parallel input/output in high-performance computing. Further, the experimental results in the case of the MasPar were very useful and helpful to MasPar, with whose technical management the P.I. has had many interactions. The contributions of this program are drawn from three experimental studies conducted on different high-performance computing testbeds/platforms, and are therefore presented in 3 different segments as follows: 1. Evaluating the parallel input/output subsystem of NASA high-performance computing testbeds, namely the MasPar MP-1 and MP-2; 2. Characterizing the physical input/output request patterns for NASA ESS applications, which used the Beowulf platform; and 3. Dynamic scheduling techniques for hiding I/O latency in parallel applications such as sparse matrix computations. This study was conducted on the Intel Paragon and also provided an experimental evaluation of the Parallel File System (PFS) and parallel input/output on the Paragon. This report is organized as follows. The summary of findings discusses the results of each of the aforementioned 3 studies. Three appendices, each containing a key scholarly research paper that details the work in one of the studies, are included.
Microcomputer computation of water quality discharges
Helsel, Dennis R.
1983-01-01
A fully prompted program (SEDQ) has been developed to calculate daily and instantaneous water quality (QW) discharges. It is written in a version of BASIC, and requires inputs of gage heights, discharge rating curve, shifts, and water quality concentration information. Concentration plots may be modified interactively using the display screen. Semi-logarithmic plots of concentration and water quality discharge are output to the display screen, and optionally to plotters. A summary table of data is also output. SEDQ could be a model program for micro and minicomputer systems likely to be in use within the Water Resources Division, USGS, in the near future. The daily discharge-weighted mean concentration is one output from SEDQ. It is defined in this report, differentiated from the currently used mean concentration, and designated the 'equivalent concentration.' (USGS)
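The 'equivalent concentration' defined above is a discharge-weighted mean rather than an arithmetic mean of concentrations. A minimal sketch with hypothetical subdaily values shows the difference (this is not SEDQ itself):

```python
# Illustrative sketch (not SEDQ): a discharge-weighted daily mean concentration
# weights subdaily concentrations by the concurrent discharge. Values assumed.
import numpy as np

q = np.array([120., 180., 260., 220., 150., 130.])   # discharge, ft^3/s
c = np.array([35., 48., 70., 62., 42., 38.])          # concentration, mg/L

c_arithmetic = c.mean()
c_weighted = np.sum(c * q) / np.sum(q)   # discharge-weighted ("equivalent") mean

print(f"arithmetic mean:         {c_arithmetic:.1f} mg/L")
print(f"discharge-weighted mean: {c_weighted:.1f} mg/L")
```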
An Advanced simulation Code for Modeling Inductive Output Tubes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thuc Bui; R. Lawrence Ives
2012-04-27
During the Phase I program, CCR completed several major building blocks for a 3D, large-signal, inductive output tube (IOT) code using modern computer language and programming techniques. These included a 3D, time-harmonic Helmholtz field solver with a fully functional graphical user interface (GUI), automeshing, and adaptivity. Other building blocks included the improved electrostatic Poisson solver with temporal boundary conditions to provide temporal fields for the time-stepping particle pusher as well as the self electric field caused by time-varying space charge. The magnetostatic field solver was also updated to solve for the self magnetic field caused by time-changing current density in the output cavity gap. The goal function to optimize an IOT cavity was also formulated, and the optimization methodologies were investigated.
Description of the IV + V System Software Package.
ERIC Educational Resources Information Center
Microcomputers for Information Management: An International Journal for Library and Information Services, 1984
1984-01-01
Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…
LOP- LONG-TERM ORBIT PREDICTOR
NASA Technical Reports Server (NTRS)
Kwok, J. H.
1994-01-01
The Long-Term Orbit Predictor (LOP) trajectory propagation program is a useful tool in lifetime analysis of orbiting spacecraft. LOP is suitable for studying planetary orbit missions with reconnaissance (flyby) and exploratory (mapping) trajectories. Sample data is included for a geosynchronous station drift cycle study, a Venus radar mapping strategy, a frozen orbit about Mars, and a repeat ground trace orbit. LOP uses the variation-of-parameters method in formulating the equations of motion. Terms involving the mean anomaly are removed from numerical integrations so that large step sizes, on the order of days, are possible. Consequently, LOP executes much faster than programs based on Cowell's method, such as the companion program ASAP (the Artificial Satellite Analysis Program, NPO-17522, also available through COSMIC). The program uses a force model with a gravity field of up to 21 by 21, lunisolar perturbation, drag, and solar radiation pressure. The input includes classical orbital elements (either mean or osculating), orbital elements of the sun relative to the planet, reference time and dates, drag coefficients, gravitational constants, planet radius, and rotation rate. The printed output contains the classical elements for each time step or event step, and additional orbital data such as true anomaly, eccentric anomaly, latitude, longitude, periapsis altitude, and the rate of change per day of certain elements. Selected output is additionally written to a plot file for postprocessing by the user. LOP is written in FORTRAN 77 for batch execution on IBM PC compatibles running MS-DOS with a minimum of 256K RAM. Recompiling the source requires the Lahey F77 v2.2 compiler. The LOP package includes examples that use Lotus 1-2-3 for graphical displays, but any graphics software package should be able to handle the ASCII plot file. The program is available on two 5.25 inch 360K MS-DOS format diskettes. The program was written in 1986 and last updated in 1989. LOP is a copyrighted work with all copyright vested in NASA. IBM PC is a registered trademark of International Business Machines Corporation. Lotus 1-2-3 is a registered trademark of Lotus Development Corporation. MS-DOS is a trademark of Microsoft Corporation.
Research study: STS-1 Orbiter Descent
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1981-01-01
The conversion of STS-1 orbiter descent data from AVE-SESAME contact programs to the REEDA system and the reduction of raw radiosonde data is summarized. A first difference program, contact data program, plot data program, and 30 second data program were developed. Six radiosonde soundings were taken. An example of the outputs of each of the programs is presented.
Ultra-wideband radar motion sensor
McEwan, Thomas E.
1994-01-01
A motion sensor is based on ultra-wideband (UWB) radar. UWB radar range is determined by a pulse-echo interval. For motion detection, the sensors operate by staring at a fixed range and then sensing any change in the averaged radar reflectivity at that range. A sampling gate is opened at a fixed delay after the emission of a transmit pulse. The resultant sampling gate output is averaged over repeated pulses. Changes in the averaged sampling gate output represent changes in the radar reflectivity at a particular range, and thus motion.
Ultra-wideband radar motion sensor
McEwan, T.E.
1994-11-01
A motion sensor is based on ultra-wideband (UWB) radar. UWB radar range is determined by a pulse-echo interval. For motion detection, the sensors operate by staring at a fixed range and then sensing any change in the averaged radar reflectivity at that range. A sampling gate is opened at a fixed delay after the emission of a transmit pulse. The resultant sampling gate output is averaged over repeated pulses. Changes in the averaged sampling gate output represent changes in the radar reflectivity at a particular range, and thus motion. 15 figs.
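A hedged sketch of the detection rule described in these two records: average the sampling-gate output over many pulses and flag motion when the frame average changes. The pulse statistics and threshold below are assumptions, not values from the patents.

```python
# Sketch of fixed-range motion detection: average the gate samples over many
# pulses per frame, then compare successive frame averages. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)

def frame_average(reflectivity, pulses_per_frame=1000):
    """Average the sampling-gate output over one frame of repeated pulses."""
    samples = reflectivity + 0.05 * rng.standard_normal(pulses_per_frame)
    return samples.mean()

baseline = frame_average(reflectivity=0.40)       # static scene at the fixed range
threshold = 0.02                                  # minimum change that counts as motion

for scene in (0.40, 0.40, 0.47, 0.41):            # third frame: something moved
    avg = frame_average(scene)
    moving = abs(avg - baseline) > threshold
    print(f"frame average {avg:.3f} -> motion: {moving}")
    baseline = avg                                # track slow drift
```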
VizieR Online Data Catalog: GALEX/S4G surface brightness profiles. I. (Bouquin+, 2018)
NASA Astrophysics Data System (ADS)
Bouquin, A. Y. K.; Gil de Paz, A.; Munoz-Mateos, J. C.; Boissier, S.; Sheth, K.; Zaritsky, D.; Peletier, R. F.; Knapen, J. H.; Gallego, J.
2018-03-01
The Spitzer Survey of Stellar Structure in Galaxies (S4G; Sheth+ 2010, J/PASP/122/1397) galaxy sample is a deep infrared survey of a (mainly) volume-limited sample of nearby galaxies within d<40Mpc observed at 3.6 and 4.5um with the IRAC. In this paper, we have used the 3.6um (IRAC1) surface photometry measurements from the output of pipeline 3 (P3) of the S4G sample (Munoz-Mateos+ 2015ApJS..219....3M). We have collected these data from the IRSA database. We gathered all available GALEX FUV and NUV images and related data products for 1931 S4G galaxies that had been observed in at least one of these two UV bands. We collected imaging data from all kinds of surveys, such as the All-sky Imaging Survey, Medium Imaging Survey, Deep Imaging Survey, and Nearby Galaxy Survey, as well as from Guest Investigator (GIs/GIIs) Programs. (5 data files).
Discriminator Stabilized Superconductor/Ferroelectric Thin Film Local Oscillator
NASA Technical Reports Server (NTRS)
Romanofsky, Robert R. (Inventor); Miranda, Felix A. (Inventor)
2000-01-01
A tunable local oscillator with a tunable circuit that includes a resonator and a transistor as an active element for oscillation. Tuning of the circuit is achieved with an externally applied dc bias across coupled lines on the resonator. Preferably the resonator is a high temperature superconductor microstrip ring resonator with integral coupled lines formed over a thin film ferroelectric material. A directional coupler samples the output of the oscillator which is fed into a diplexer for determining whether the oscillator is performing at a desired frequency. The high-pass and lowpass outputs of the diplexer are connected to diodes respectively for inputting the sampled signals into a differential operational amplifier. The amplifier compares the sampled signals and emits an output signal if there is a difference between the resonant and crossover frequencies. Based on the sampled signal, a bias supplied to the ring resonator is either increased or decreased for raising or lowering the resonant frequency by decreasing or increasing, respectively, the dielectric constant of the ferroelectric.
Wang, Tong; Gao, Huijun; Qiu, Jianbin
2016-02-01
This paper investigates the multirate networked industrial process control problem in double-layer architecture. First, the output tracking problem for sampled-data nonlinear plant at device layer with sampling period T(d) is investigated using adaptive neural network (NN) control, and it is shown that the outputs of subsystems at device layer can track the decomposed setpoints. Then, the outputs and inputs of the device layer subsystems are sampled with sampling period T(u) at operation layer to form the index prediction, which is used to predict the overall performance index at lower frequency. Radial basis function NN is utilized as the prediction function due to its approximation ability. Then, considering the dynamics of the overall closed-loop system, nonlinear model predictive control method is proposed to guarantee the system stability and compensate the network-induced delays and packet dropouts. Finally, a continuous stirred tank reactor system is given in the simulation part to demonstrate the effectiveness of the proposed method.
Embedding Fonts in MetaPost Output
2016-04-19
by John Hobby ) based on Donald Knuth’s META- FONT [4] with high quality PostScript output. An outstanding feature of MetaPost is that typeset fonts in...output, the graphics are perfectly scalable to any arbitrary res- olution. John Hobby , its author, writes: “[MetaPost] is really a programming lan- guage...for generating graphics, especially fig- ures for TEX [5] and troff documents.” This quote by Hobby indicates that MetaPost figures are not only
Interval Predictor Models with a Formal Characterization of Uncertainty and Reliability
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2014-01-01
This paper develops techniques for constructing empirical predictor models based on observations. By contrast to standard models, which yield a single predicted output at each value of the model's inputs, Interval Predictor Models (IPMs) yield an interval into which the unobserved output is predicted to fall. The proposed IPMs prescribe the output as an interval-valued function of the model's inputs and render a formal description of both the uncertainty in the model's parameters and the spread in the predicted output. Uncertainty is prescribed as a hyper-rectangular set in the space of the model's parameters. The propagation of this set through the empirical model yields a range of outputs of minimal spread containing all (or, depending on the formulation, most) of the observations. Optimization-based strategies for calculating IPMs and eliminating the effects of outliers are proposed. Outliers are identified by evaluating the extent to which they degrade the tightness of the prediction. This evaluation can be carried out while the IPM is calculated. When the data satisfies mild stochastic assumptions, and the optimization program used for calculating the IPM is convex (or, when its solution coincides with the solution to an auxiliary convex program), the model's reliability (that is, the probability that a future observation would be within the predicted range of outputs) can be bounded rigorously by a non-asymptotic formula.
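The construction can be illustrated for a linear-in-parameters model: choose a parameter box whose propagated output interval contains every observation while minimizing the average spread, which is a linear program. The sketch below is only a conceptual illustration under that formulation; it is not the authors' code and omits the outlier-elimination and reliability-bounding steps.

```python
# Conceptual interval-predictor sketch for y ~ p0 + p1*x with a parameter box
# [l, u]: minimize the average interval spread subject to covering all data.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
x = np.linspace(0.0, 5.0, 40)
y = 1.5 + 0.8 * x + 0.3 * rng.standard_normal(x.size)      # synthetic observations

F = np.column_stack([np.ones_like(x), x])                   # features phi(x) = [1, x]
pos, neg = np.maximum(F, 0.0), np.minimum(F, 0.0)

# decision vector z = [l0, l1, u0, u1]; mean spread = mean|phi| . (u - l)
c = np.concatenate([-np.abs(F).mean(axis=0), np.abs(F).mean(axis=0)])

A_ub = np.vstack([
    np.hstack([-neg, -pos]),              # y_i <= upper_i
    np.hstack([pos, neg]),                # lower_i <= y_i
    np.hstack([np.eye(2), -np.eye(2)]),   # l_j <= u_j
])
b_ub = np.concatenate([-y, y, np.zeros(2)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 4, method="highs")
l, u = res.x[:2], res.x[2:]
print("parameter box:", np.round(l, 3), "to", np.round(u, 3))

# propagate the box at a new input to get the predicted output interval
f_new = np.array([1.0, 2.5])
upper = np.where(f_new >= 0, u, l) @ f_new
lower = np.where(f_new >= 0, l, u) @ f_new
print(f"predicted interval at x=2.5: [{lower:.2f}, {upper:.2f}]")
```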
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
2005-01-01
Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298
NASA Astrophysics Data System (ADS)
Kobayashi, Kiyoshi; Suzuki, Tohru S.
2018-03-01
A new algorithm for the automatic estimation of an equivalent circuit and the subsequent parameter optimization is developed by combining the data-mining concept and the complex least-squares method. In this algorithm, the program generates an initial equivalent-circuit model based on the sampling data and then attempts to optimize the parameters. The basic hypothesis is that the measured impedance spectrum can be reproduced by the sum of the partial-impedance spectra presented by a resistor, an inductor, a resistor connected in parallel to a capacitor, and a resistor connected in parallel to an inductor. The adequacy of the model is determined by using a simple artificial-intelligence function, which is applied to the output function of the Levenberg-Marquardt module. By iterating these model modifications, the program finds an adequate equivalent-circuit model without any user input.
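The model hypothesis stated above, that the spectrum is a sum of partial impedances from R, L, R parallel C, and R parallel L elements, can be written directly; a Levenberg-Marquardt routine (for example scipy.optimize.least_squares) would then adjust the parameters against measured data. A minimal sketch with hypothetical element values:

```python
# Sum-of-partial-impedances model: series R and L, plus R||C and R||L elements.
# Element values are hypothetical; a fit would adjust them to measured spectra.
import numpy as np

def z_model(freq_hz, r_s, l_s, r1, c1, r2, l2):
    w = 2.0 * np.pi * freq_hz
    z = r_s + 1j * w * l_s                          # series resistor and inductor
    z += r1 / (1.0 + 1j * w * r1 * c1)              # resistor parallel to capacitor
    z += (1j * w * l2 * r2) / (r2 + 1j * w * l2)    # resistor parallel to inductor
    return z

freqs = np.logspace(0, 6, 7)                        # 1 Hz .. 1 MHz
z = z_model(freqs, r_s=5.0, l_s=1e-6, r1=100.0, c1=1e-6, r2=50.0, l2=1e-3)
for f, zi in zip(freqs, z):
    print(f"{f:9.0f} Hz  Re = {zi.real:8.2f} ohm   Im = {zi.imag:10.2f} ohm")
```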
Propulsion/flight control integration technology (PROFIT) software system definition
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long-term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control, and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined: the bill of materials F-100 engine control and the bill of materials F-15 inlet control.
Grid-coordinate generation program
Cosner, Oliver J.; Horwich, Esther
1974-01-01
This program description of the grid-coordinate generation program is written for computer users who are familiar with digital aquifer models. The program computes the coordinates for a variable grid used in the 'Pinder Model' (a finite-difference aquifer simulator), for input to the CalComp GPCP (general purpose contouring program). The program adjusts the y-value by a user-supplied constant in order to transpose the origin of the model grid from the upper left-hand corner to the lower left-hand corner of the grid. The user has the options of (1) choosing the boundaries of the plot; (2) adjusting the z-values (altitudes) by a constant; (3) deleting superfluous z-values; and (4) subtracting the simulated surfaces from each other to obtain the decline. Output of this program includes the fixed format CNTL data cards and the other data cards required for input to GPCP. The output from GPCP then is used to produce a potentiometric map or a decline map by means of the CalComp plotter.
NASA Technical Reports Server (NTRS)
1971-01-01
The optimal allocation of resources to the national space program over an extended time period requires the solution of a large combinatorial problem in which the program elements are interdependent. The computer model uses an accelerated search technique to solve this problem. The model contains a large number of options selectable by the user to provide flexible input and a broad range of output for use in sensitivity analyses of all entering elements. Examples of these options are budget smoothing under varied appropriation levels, entry of inflation and discount effects, and probabilistic output which provides quantified degrees of certainty that program costs will remain within the planned budget. Criteria and related analytic procedures were established for identifying potential new space program directions. Used in combination with the optimal resource allocation model, new space applications can be analyzed in a realistic perspective, including the advantage gained from existing space program plant and on-going programs such as the space transportation system.
TLIFE: a Program for Spur, Helical and Spiral Bevel Transmission Life and Reliability Modeling
NASA Technical Reports Server (NTRS)
Savage, M.; Prasanna, M. G.; Rubadeux, K. L.
1994-01-01
This report describes a computer program, 'TLIFE', which models the service life of a transmission. The program is written in ANSI standard Fortran 77 and has an executable size of about 157 K bytes for use on a personal computer running DOS. It can also be compiled and executed in UNIX. The computer program can analyze any one of eleven unit transmissions either singly or in a series combination of up to twenty-five unit transmissions. Metric or English unit calculations are performed with the same routines using consistent input data and a units flag. Primary outputs are the dynamic capacity of the transmission and the mean lives of the transmission and of the sum of its components. The program uses a modular approach to separate the load analyses from the system life calculations. The program and its input and output data files are described herein. Three examples illustrate its use. A development of the theory behind the analysis in the program is included after the examples.
Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
In this study, we have developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) and helps the user visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and then produce output images in a window with a slider control that enables the user to view the best image that visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window having a slider control. The slider control enables the user to view images and select one which seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of utilizing SSR to detect the presence of an abdominopelvic tumor on a prediuretic PET/CT image.
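A generic sketch of the suprathreshold stochastic resonance step follows: the image is passed through many independent noisy threshold units and their binary outputs are summed, with the noise level playing the role of the slider. This is an illustration of SSR on a synthetic image, not the authors' MATLAB application.

```python
# Generic SSR illustration: sum the binary outputs of many noisy threshold
# units applied to a dim synthetic image and sweep the noise level.
import numpy as np

rng = np.random.default_rng(7)
image = rng.integers(0, 40, size=(64, 64)).astype(float)   # stand-in for a dim slice
image[20:30, 25:35] += 25.0                                 # faint "lesion"

def ssr(image, threshold, noise_sigma, n_units=64):
    out = np.zeros_like(image)
    for _ in range(n_units):
        noisy = image + noise_sigma * rng.standard_normal(image.shape)
        out += (noisy > threshold)          # each unit contributes a binary vote
    return out / n_units                    # normalized SSR output

threshold = image.max()                     # suprathreshold: signal alone never crosses
for sigma in (5.0, 15.0, 30.0, 60.0):
    out = ssr(image, threshold, sigma)
    contrast = out[20:30, 25:35].mean() - out.mean()
    print(f"noise sigma {sigma:5.1f}: lesion contrast in SSR output = {contrast:.3f}")
```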
Capacitive acoustic wave detector and method of using same
NASA Technical Reports Server (NTRS)
Yost, William T. (Inventor)
1994-01-01
A capacitor having two substantially parallel conductive faces is acoustically coupled to a conductive sample end such that the sample face is one end of the capacitor. A non-contacting dielectric may serve as a spacer between the two conductive plates. The formed capacitor is connected to an LC oscillator circuit such as a Hartley oscillator circuit producing an output frequency which is a function of the capacitor spacing. This capacitance oscillates as the sample end coating is oscillated by an acoustic wave generated in the sample by a transmitting transducer. The electrical output can serve as an absolute indicator of acoustic wave displacement.
Computer Program To Transliterate Into Arabic
NASA Technical Reports Server (NTRS)
Stephan, E.
1986-01-01
Conceptual program for TRS-80, Model 12 (or equivalent) computer transliterates from English letters of computer keyboard to Arabic characters in output of associated printer. Program automatically changes character sequence from left-to-right of English to right-to-left of Arabic.
Computer program optimizes design of nuclear radiation shields
NASA Technical Reports Server (NTRS)
Lahti, G. P.
1971-01-01
Computer program, OPEX 2, determines minimum weight, volume, or cost for shields. Program incorporates improved coding, simplified data input, spherical geometry, and an expanded output. Method is capable of altering dose-thickness relationship when a shield layer has been removed.
Program for creating an operating system generation cross reference index (SGINDEX)
NASA Technical Reports Server (NTRS)
Barth, C. W.
1972-01-01
Computer program to collect key data from Stage Two input of OS/360 system and to prepare formatted listing of index entries collected is discussed. Program eliminates manual paging through system output by providing comprehensive cross reference.
Efficiency measures and output specification: the case of European railways
DOT National Transportation Integrated Search
2000-12-01
This study analyzes the sensitivity of the efficiency indicators of a sample of European railway companies to different alternatives in output specification. The results vary according to the specification selected. Investigating the causes of these ...
INDES User's guide multistep input design with nonlinear rotorcraft modeling
NASA Technical Reports Server (NTRS)
1979-01-01
The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft systems identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.
Solid rocket booster thermal radiation model. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Lee, A. L.
1976-01-01
A user's manual was prepared for the computer program of a solid rocket booster (SRB) thermal radiation model. The following information was included: (1) structure of the program, (2) input information required, (3) examples of input cards and output printout, (4) program characteristics, and (5) program listing.
Program user's manual for an unsteady helicopter rotor-fuselage aerodynamic analysis
NASA Technical Reports Server (NTRS)
Lorber, Peter F.
1988-01-01
The Rotor-Fuselage Analysis is a method of calculating the aerodynamic reaction between a helicopter rotor and fuselage. This manual describes the structure and operation of the computer programs that make up the Rotor-Fuselage Analysis, programs which prepare the input and programs which display the output.
Compensator improvement for multivariable control systems
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.; Gresham, L. L.
1977-01-01
A theory and the associated numerical technique are developed for an iterative design improvement of the compensation for linear, time-invariant control systems with multiple inputs and multiple outputs. A strict constraint algorithm is used in obtaining a solution of the specified constraints of the control design. The result of the research effort is the multiple input, multiple output Compensator Improvement Program (CIP). The objective of the Compensator Improvement Program is to modify in an iterative manner the free parameters of the dynamic compensation matrix so that the system satisfies frequency domain specifications. In this exposition, the underlying principles of the multivariable CIP algorithm are presented and the practical utility of the program is illustrated with space vehicle related examples.
Siciliani, Luigi
2006-01-01
Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.
Rath, N; Kato, S; Levesque, J P; Mauel, M E; Navratil, G A; Peng, Q
2014-04-01
Fast, digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.
Examining variation in treatment costs: a cost function for outpatient methadone treatment programs.
Dunlap, Laura J; Zarkin, Gary A; Cowell, Alexander J
2008-06-01
To estimate a hybrid cost function of the relationship between total annual cost for outpatient methadone treatment and output (annual patient days and selected services), input prices (wages and building space costs), and selected program and patient case-mix characteristics. Data are from a multistate study of 159 methadone treatment programs that participated in the Center for Substance Abuse Treatment's Evaluation of the Methadone/LAAM Treatment Program Accreditation Project between 1998 and 2000. Using least squares regression for weighted data, we estimate the relationship between total annual costs and selected output measures, wages, building space costs, and selected program and patient case-mix characteristics. Findings indicate that total annual cost is positively associated with program's annual patient days, with a 10 percent increase in patient days associated with an 8.2 percent increase in total cost. Total annual cost also increases with counselor wages (p<.01), but no significant association is found for nurse wages or monthly building costs. Surprisingly, program characteristics and patient case mix variables do not appear to explain variations in methadone treatment costs. Similar results are found for a model with services as outputs. This study provides important new insights into the determinants of methadone treatment costs. Our findings concur with economic theory in that total annual cost is positively related to counselor wages. However, among our factor inputs, counselor wages are the only significant driver of these costs. Furthermore, our findings suggest that methadone programs may realize economies of scale; however, other important factors, such as patient access, should be considered.
NASA Technical Reports Server (NTRS)
Wheeler, Kevin; Timucin, Dogan; Rabbette, Maura; Curry, Charles; Allan, Mark; Lvov, Nikolay; Clanton, Sam; Pilewskie, Peter
2002-01-01
The goal of visual inference programming is to develop a software framework for data analysis and to provide machine learning algorithms for interactive data exploration and visualization. The topics include: 1) Intelligent Data Understanding (IDU) framework; 2) Challenge problems; 3) What's new here; 4) Framework features; 5) Wiring diagram; 6) Generated script; 7) Results of script; 8) Initial algorithms; 9) Independent Component Analysis for instrument diagnosis; 10) Output sensory mapping virtual joystick; 11) Output sensory mapping typing; 12) Closed-loop feedback mu-rhythm control; 13) Closed-loop training; 14) Data sources; and 15) Algorithms. This paper is in viewgraph form.
Carey, A.E.; Prudic, David E.
1996-01-01
Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machines (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) operating system version 5.0 or greater.
Space shuttle propulsion estimation development verification, volume 1
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). The program LFILTER also uses the U-D factorized form of the Kalman filter, as in the propulsion estimation program PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the dynamics and measurement models used for this application, the program description, and program operation are presented.
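For orientation, the sketch below shows a Kalman filter predict/update cycle in plain covariance form on a toy system. PFILTER uses the numerically better-conditioned U-D factorized implementation together with Shuttle-specific dynamics and measurement models, none of which is reproduced here.

```python
# Plain covariance-form Kalman filter predict/update cycle on a toy
# position/velocity system (not the U-D factorized form used by PFILTER).
import numpy as np

def kf_step(x, P, z, F, H, Q, R):
    # predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update with measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity dynamics
H = np.array([[1.0, 0.0]])                  # position-only measurement
Q = 1e-4 * np.eye(2)
R = np.array([[0.04]])

x, P = np.zeros(2), np.eye(2)
for z in (0.11, 0.19, 0.32, 0.41):
    x, P = kf_step(x, P, np.array([z]), F, H, Q, R)
print("state estimate:", x, "1-sigma bounds:", np.sqrt(np.diag(P)))
```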
Dockres: a computer program that analyzes the output of virtual screening of small molecules
2010-01-01
Background This paper describes a computer program named Dockres that is designed to analyze and summarize results of virtual screening of small molecules. The program is supplemented with utilities that support the screening process. Foremost among these utilities are scripts that run the virtual screening of a chemical library on a large number of processors in parallel. Methods Dockres and some of its supporting utilities are written in Fortran-77; other utilities are written as C-shell scripts. They support the parallel execution of the screening. The current implementation of the program handles virtual screening with Autodock-3 and Autodock-4, but can be extended to work with the output of other programs. Results Analysis of virtual screening by Dockres led to both active and selective lead compounds. Conclusions Analysis of virtual screening was facilitated and enhanced by Dockres in both the authors' laboratories as well as laboratories elsewhere. PMID:20205801
Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products
NASA Astrophysics Data System (ADS)
Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun
2011-10-01
To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose an original two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first-rank sampling plan inspects the lot consisting of map sheets, and the second inspects the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
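The kind of (sample size, acceptance number) search such a plan requires can be sketched under the Poisson model used for large lots; the AQL, limiting quality, and risk levels below are assumed for illustration and are not taken from the paper.

```python
# Generic single-sampling plan search under the Poisson model: find the
# smallest (n, c) meeting assumed producer's and consumer's risk targets.
from math import exp, factorial

def p_accept(n, c, defect_rate):
    """Probability of <= c nonconformities in n items (Poisson approximation)."""
    lam = n * defect_rate
    return sum(exp(-lam) * lam**k / factorial(k) for k in range(c + 1))

aql, ltpd = 0.01, 0.05            # acceptable and limiting quality levels (assumed)
alpha, beta = 0.05, 0.10          # producer's and consumer's risks (assumed)

best = None
for n in range(1, 1000):
    for c in range(0, 20):
        if (p_accept(n, c, aql) >= 1.0 - alpha and
                p_accept(n, c, ltpd) <= beta):
            best = (n, c)
            break
    if best:
        break

n, c = best
print(f"sample size n = {n}, acceptance number c = {c}")
print(f"P(accept | AQL)  = {p_accept(n, c, aql):.3f}")
print(f"P(accept | LTPD) = {p_accept(n, c, ltpd):.3f}")
```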
NASA Astrophysics Data System (ADS)
Nakhostin, M.
2015-10-01
In this paper, we have compared the performances of the digital zero-crossing and charge-comparison methods for n/γ discrimination with liquid scintillation detectors at low light outputs. The measurements were performed with a 2″×2″ cylindrical liquid scintillation detector of type BC501A whose outputs were sampled by means of a fast waveform digitizer with 10-bit resolution, 4 GS/s sampling rate and one volt input range. Different light output ranges were measured by operating the photomultiplier tube at different voltages and a new recursive algorithm was developed to implement the digital zero-crossing method. The results of our study demonstrate the superior performance of the digital zero-crossing method at low light outputs when a large dynamic range is measured. However, when the input range of the digitizer is used to measure a narrow range of light outputs, the charge-comparison method slightly outperforms the zero-crossing method. The results are discussed in regard to the effects of the quantization noise and the noise filtration performance of the zero-crossing filter.
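For reference, the charge-comparison figure of merit is the ratio of a delayed tail-gate integral to the total-gate integral of the digitized pulse. The sketch below applies it to synthetic two-component scintillation pulses; the decay constants and gate positions are assumptions, not the BC501A settings.

```python
# Charge-comparison pulse-shape discrimination on synthetic pulses: the
# tail-to-total ratio is larger for pulses with a bigger slow component.
import numpy as np

fs = 4.0e9                                   # 4 GS/s sampling rate
t = np.arange(0, 400e-9, 1.0 / fs)

def pulse(slow_fraction, tau_fast=3.2e-9, tau_slow=32e-9):
    """Scintillation pulse with a prompt and a delayed decay component."""
    return ((1.0 - slow_fraction) * np.exp(-t / tau_fast) / tau_fast
            + slow_fraction * np.exp(-t / tau_slow) / tau_slow)

def tail_to_total(v, tail_start=20e-9):
    total = v.sum()                          # uniform sampling: sums stand in for integrals
    tail = v[t >= tail_start].sum()
    return tail / total

for label, frac in (("gamma-like", 0.05), ("neutron-like", 0.25)):
    print(f"{label:13s} tail/total = {tail_to_total(pulse(frac)):.3f}")
```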
Microcomputer-Aided Control Systems Design.
ERIC Educational Resources Information Center
Roat, S. D.; Melsheimer, S. S.
1987-01-01
Describes a single input/single output feedback control system design program for IBM PC and compatible microcomputers. Uses a heat exchanger temperature control loop to illustrate the various applications of the program. (ML)
ERIC Educational Resources Information Center
Akiba, Y.; And Others
This user's manual for the simulation program Graphical Evaluation and Review Technique (GERT) GQ contains sections on nodes, branches, program input description and format, and program output, as well as examples. Also included is a programmer's manual which contains information on scheduling, subroutine descriptions, COMMON Variables, and…
NASA Technical Reports Server (NTRS)
1971-01-01
A modular program for design optimization of thermal protection systems is discussed. Its capabilities and limitations are reviewed. Instructions for the operation of the program, output, and the program itself are given.
Apparatus and method for detecting full-capture radiation events
Odell, D.M.C.
1994-10-11
An apparatus and method are disclosed for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events. 4 figs.
Apparatus and method for detecting full-capture radiation events
Odell, Daniel M. C.
1994-01-01
An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.
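Reading the "three-dimensional count matrix" as a histogram of event samples over (time since pulse onset, sample amplitude) whose entries are sample counts, a minimal accumulation step might look like the sketch below; the bin counts and ADC range are illustrative assumptions.

```python
import numpy as np

N_TIME_BINS, N_AMP_BINS = 256, 1024
count_matrix = np.zeros((N_TIME_BINS, N_AMP_BINS), dtype=np.int64)

def accumulate_event(count_matrix, event_samples, onset_index, adc_max=1024):
    """Bin each digitized sample of one identified event by its time since
    pulse onset and its current amplitude, incrementing the sample count."""
    for i, amplitude in enumerate(event_samples[onset_index:]):
        t_bin = min(i, N_TIME_BINS - 1)
        a_bin = min(int(amplitude * N_AMP_BINS / adc_max), N_AMP_BINS - 1)
        count_matrix[t_bin, a_bin] += 1
    return count_matrix
```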
Standard Transistor Array (Star): SIMLOG/TESTGN programmer's guide, volume 2, addendum 2
NASA Technical Reports Server (NTRS)
Carroll, B. D.
1979-01-01
A brief introduction to the SIMLOG/TESTGN system of programs is given. SIMLOG is a logic simulation program, whereas TESTGN is a program for generating test sequences from output produced by SIMLOG. The structures of the two programs are described. Data base, main program, and subprogram details are also given. Guidelines for program modifications are discussed. Commented program listings are included.
Simulation of water-quality data at selected stream sites in the Missouri River Basin, Montana
Knapton, J.R.; Jacobson, M.A.
1980-01-01
Modification of sampling programs at some water-quality stations in the Missouri River basin in Montana has eliminated the means by which solute loads have been directly obtained in past years. To compensate for this loss, water-quality and streamflow data were statistically analyzed and solute loads were simulated using computer techniques. Functional relationships existing between specific conductance and solute concentration for monthly samples were used to develop linear regression models. The models were then used to simulate daily solute concentrations using daily specific conductance as the independent variable. Once simulated, the solute concentrations, in milligrams per liter, were transformed into daily solute loads, in tons, using mean daily streamflow records. Computer output was formatted into tables listing simulated mean monthly solute concentrations, in milligrams per liter, and the monthly and annual solute loads, in tons, for water years 1975-78.
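A compact sketch of the simulation procedure described above follows, with invented numbers standing in for the Montana records: a linear regression of concentration on specific conductance from monthly samples is applied to daily conductance, and loads are obtained with the conventional factor of 0.0027 for converting mg/L times ft³/s to tons per day.

```python
import numpy as np

# monthly calibration samples (invented values)
monthly_sc = np.array([400.0, 550.0, 700.0, 820.0])     # specific conductance, uS/cm
monthly_conc = np.array([230.0, 330.0, 430.0, 500.0])   # dissolved solids, mg/L
b, a = np.polyfit(monthly_sc, monthly_conc, 1)           # slope and intercept

# daily records (invented values)
daily_sc = np.array([420.0, 610.0, 780.0])               # daily specific conductance
daily_flow = np.array([1500.0, 900.0, 600.0])            # mean daily streamflow, ft^3/s

daily_conc = a + b * daily_sc                            # simulated concentration, mg/L
daily_load_tons = 0.0027 * daily_conc * daily_flow       # load, tons/day
monthly_load = daily_load_tons.sum()                     # tabulated by month in the report
```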
Method and apparatus configured for identification of a material
Slater, John M.; Crawford, Thomas M.
2000-01-01
The present invention includes an apparatus configured for identification of a material, and methods of identifying a material. One embodiment of the invention provides an apparatus including a first region configured to receive a first sample, the first region being configured to output a first spectrum corresponding to the first sample and responsive to exposure of the first sample to radiation; a modulator configured to modulate the first spectrum according to a first frequency; a second region configured to receive a second sample, the second region being configured to output a second spectrum corresponding to the second sample and responsive to exposure of the second sample to the modulated first spectrum; and a detector configured to detect the second spectrum having a second frequency greater than the first frequency.
Proposal for Microwave Boson Sampling.
Peropadre, Borja; Guerreschi, Gian Giacomo; Huh, Joonsuk; Aspuru-Guzik, Alán
2016-09-30
Boson sampling, the task of sampling the probability distribution of photons at the output of a photonic network, is believed to be hard for any classical device. Unlike other models of quantum computation that require thousands of qubits to outperform classical computers, boson sampling requires only a handful of single photons. However, a scalable implementation of boson sampling is missing. Here, we show how superconducting circuits provide such a platform. Our proposal differs radically from traditional quantum-optical implementations: rather than injecting photons in waveguides, making them pass through optical elements like phase shifters and beam splitters, and finally detecting their output mode, we prepare the required multiphoton input state in a superconducting resonator array, control its dynamics via tunable and dispersive interactions, and measure it with nondemolition techniques.
Computer program for plotting and fairing wind-tunnel data
NASA Technical Reports Server (NTRS)
Morgan, H. L., Jr.
1983-01-01
A detailed description of the Langley computer program PLOTWD which plots and fairs experimental wind-tunnel data is presented. The program was written for use primarily on the Langley CDC computer and CALCOMP plotters. The fundamental operating features of the program are that the input data are read and written to a random-access file for use during program execution, that the data for a selected run can be sorted and edited to delete duplicate points, and that the data can be plotted and faired using tension splines, least-squares polynomial, or least-squares cubic-spline curves. The most noteworthy feature of the program is the simplicity of the user-supplied input requirements. Several subroutines are also included that can be used to draw grid lines, zero lines, axis scale values and labels, and legends. A detailed description of the program's operational features and of each subprogram is presented. The general application of the program is also discussed together with the input and output for two typical plot types. A listing of the program code, user guide, and output description are presented in appendices. The program has been in use at Langley for several years and has proven to be both easy to use and versatile.
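As an illustration of one of the fairing options mentioned above, the sketch below fits a least-squares cubic spline through synthetic noisy wind-tunnel-style data; the knot placement and noise level are illustrative assumptions, and this is not PLOTWD itself.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# synthetic "run": noisy lift coefficient versus angle of attack
alpha = np.linspace(-4.0, 16.0, 41)                              # deg
cl = 0.1 * alpha + 0.4 + 0.02 * np.random.randn(alpha.size)      # noisy data

# least-squares cubic spline with a handful of interior knots
knots = np.linspace(alpha[2], alpha[-3], 6)
faired = LSQUnivariateSpline(alpha, cl, knots, k=3)

# dense evaluation of the faired curve, as would be sent to the plotter
alpha_dense = np.linspace(alpha[0], alpha[-1], 400)
cl_faired = faired(alpha_dense)
```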
NASA Astrophysics Data System (ADS)
Villa, Carlos; Kumavor, Patrick; Donkor, Eric
2008-04-01
Photonic analog-to-digital converters (ADCs) utilize a train of optical pulses to sample an electrical input waveform applied to an electro-optic modulator or a reverse-biased photodiode. In the former, the resulting train of amplitude-modulated optical pulses is detected (converted to an electrical signal) and quantized using a conventional electronic ADC, as at present there are no practical, cost-effective optical quantizers whose performance rivals that of electronic quantizers. In the latter, the electrical samples are directly quantized by the electronic ADC. In both cases, however, the sampling rate is limited by the speed with which the electronic ADC can quantize the electrical samples. One way to increase the sampling rate by a factor N is the time-interleaved technique, which consists of a parallel array of N electronic ADCs that have the same sampling rate but different sampling phases, each operating at a quantization rate of fs/N, where fs is the aggregate sampling rate. In a system without real-time operation, the N digital channel outputs are stored in memory and then aggregated (multiplexed) to obtain the digital representation of the analog input waveform. For real-time systems, by contrast, reducing the storage time in the multiplexing process is desirable to improve the time response of the ADC. Eliminating the memories entirely comes at the expense of concurrent timing and synchronization in the aggregation of the digital signals, which becomes critical for a good digital representation of the analog waveform. In this paper we propose and demonstrate a novel optically synchronized encoder and multiplexer scheme for interleaved photonic ADCs that uses the N optical signals employed to sample different phases of an analog input signal to synchronize the multiplexing of the resulting N digital output channels into a single digital output port. As a proof of concept, four 320 Megasample/s, 12-bit digital signals were multiplexed to form an aggregate 1.28 Gigasample/s single digital output signal.
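The interleave-and-multiplex idea is easy to state in a few lines; the sketch below uses ideal, identical channels and invented rates, so it illustrates only the bookkeeping, not the synchronization problem the paper actually solves optically.

```python
import numpy as np

fs = 1.28e9                                   # aggregate sampling rate, samples/s
N = 4                                         # number of interleaved channels
t = np.arange(4096) / fs
analog = np.sin(2 * np.pi * 50e6 * t)         # stand-in for the analog input waveform

# each channel samples at fs/N with a different phase: every N-th sample, offset by k
channels = [analog[k::N] for k in range(N)]

# multiplexer: re-interleave the channel outputs into a single stream at rate fs
mux = np.empty_like(analog)
for k, c in enumerate(channels):
    mux[k::N] = c

assert np.allclose(mux, analog)               # perfect reconstruction with ideal channels
```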
Analysis of selected volatile organic compounds at background level in South Africa.
NASA Astrophysics Data System (ADS)
Ntsasa, Napo; Tshilongo, James; Lekoto, Goitsemang
2017-04-01
Volatile organic compounds (VOCs) are measured globally at urban air pollution monitoring sites and at background level at specific locations such as the Cape Point station. Urban pollution monitoring is legislated at government level; the background levels, however, are scientific outputs of the World Meteorological Organisation Global Atmosphere Watch programme (WMO/GAW). Cape Point is a key station in the Southern Hemisphere that monitors greenhouse gases and halocarbons, with results reported for over the past decade. The Cape Point station does not currently have a measurement capability for VOCs. A joint research project between the Cape Point station and the National Metrology Institute of South Africa (NMISA) aims to perform qualitative and quantitative analysis of the volatile organic compounds listed in the GAW programme. NMISA is responsible for developing, maintaining and disseminating primary reference gas mixtures which are directly traceable to the International System of Units (SI). The results for some volatile organic compounds sampled in high-pressure gas cylinders will be presented. The analysis of the samples was performed by gas chromatography with flame ionisation and mass selective detection (GC-FID/MSD) with a dedicated cryogenic pre-concentrator system. Keywords: volatile organic compounds, gas chromatography, pre-concentrator
DOT National Transportation Integrated Search
1973-12-01
The contents are: Appendix B - Detailed flow diagrams - new systems cost program; Appendix C and D - Typical input and output data - new system cost program; Appendix E - Compiled listings - highway transit cost program; Appendix F and G - Typical in...
LOGSIM user's manual. [Logic Simulation Program for computer aided design of logic circuits
NASA Technical Reports Server (NTRS)
Mitchell, C. L.; Taylor, J. F.
1972-01-01
The user's manual for the LOGSIM Program is presented. All program options are explained and a detailed definition of the format of each input card is given. LOGSIM Program operations and the preparation of LOGSIM input data are discussed, along with data card formats, postprocessor data cards, and output interpretation.
NASA Technical Reports Server (NTRS)
Redwine, W. J.
1979-01-01
A timeline containing altitude, control surface deflection rates and angles, hinge moment loads, thrust vector control gimbal rates, and main throttle settings is used to derive the model. The timeline is constructed from the output of one or more trajectory simulation programs.
Fuel cell serves as oxygen level detector
NASA Technical Reports Server (NTRS)
1965-01-01
Monitoring the oxygen level in the air is accomplished by a fuel cell detector whose voltage output is proportional to the partial pressure of oxygen in the sampled gas. The relationship between output voltage and partial pressure of oxygen can be calibrated.
A programmable power processor for high power space applications
NASA Technical Reports Server (NTRS)
Lanier, J. R., Jr.; Graves, J. R.; Kapustka, R. E.; Bush, J. R., Jr.
1982-01-01
A Programmable Power Processor (P3) has been developed for application in future large space power systems. The P3 is capable of operation over a wide range of input voltage (26 to 375 Vdc) and output voltage (24 to 180 Vdc). The peak output power capability is 18 kW (180 V at 100 A). The output characteristics of the P3 can be programmed to any voltage and/or current level within the limits of the processor and may be controlled as a function of internal or external parameters. Seven breadboard P3s and one 'flight-type' engineering model P3 have been built and tested both individually and in electrical power systems. The programmable feature allows the P3 to be used in a variety of applications by changing the output characteristics. Test results, including efficiency at various input/output combinations, transient response, and output impedance, are presented.
Thermal imagers: from ancient analog video output to state-of-the-art video streaming
NASA Astrophysics Data System (ADS)
Haan, Hubertus; Feuchter, Timo; Münzberg, Mario; Fritze, Jörg; Schlemmer, Harry
2013-06-01
The video output of thermal imagers stayed constant over almost two decades. When the famous Common Modules were employed a thermal image at first was presented to the observer in the eye piece only. In the early 1990s TV cameras were attached and the standard output was CCIR. In the civil camera market output standards changed to digital formats a decade ago with digital video streaming being nowadays state-of-the-art. The reasons why the output technique in the thermal world stayed unchanged over such a long time are: the very conservative view of the military community, long planning and turn-around times of programs and a slower growth of pixel number of TIs in comparison to consumer cameras. With megapixel detectors the CCIR output format is not sufficient any longer. The paper discusses the state-of-the-art compression and streaming solutions for TIs.
Multichannel extremely broadband near-IR radiation sources for optical coherence tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wojtkowski, M; Fujimoto, J G; Lapin, P I
The construction and output parameters of two experimental samples of near-IR radiation sources based on the superposition of radiation from several superluminescent diodes are described. The first, a three-channel sample emitting 18 mW of cw output power in a spectral band of width 105 nm through a single-mode fibre, is optimised for optical coherence tomography in ophthalmology. The second, a four-channel sample, emits in the 870-nm band with a width of more than 200 nm, which corresponds to a record coherence length of less than 4 μm. (laser applications and other topics in quantum electronics)
Buican, T.N.
1993-05-04
An apparatus and method are described for measuring intensities at a plurality of wavelengths and lifetimes. A source of multiple-wavelength electromagnetic radiation is passed through a first interferometer modulated at a first frequency, the output thereof being directed into a sample to be investigated. The light emitted from the sample as a result of the interaction thereof with the excitation radiation is directed into a second interferometer modulated at a second frequency, and the output detected and analyzed. In this manner excitation, emission, and lifetime information may be obtained for a multiplicity of fluorochromes in the sample.
Logic Models: A Tool for Designing and Monitoring Program Evaluations. REL 2014-007
ERIC Educational Resources Information Center
Lawton, Brian; Brandon, Paul R.; Cicchinelli, Louis; Kekahio, Wendy
2014-01-01
This introduction to logic models as a tool for designing program evaluations defines the major components of education programs--resources, activities, outputs, and short-, mid-, and long-term outcomes--and uses an example to demonstrate the relationships among them. This quick…
Computer program for the analysis of the cross flow in a radial inflow turbine scroll
NASA Technical Reports Server (NTRS)
Hamed, A.; Abdallah, S.; Tabakoff, W.
1977-01-01
A computer program was used to solve the governing equations of the potential flow in the cross-sectional planes of a radial inflow turbine scroll. A listing of the main program and the subroutines, and a typical output example, are included.
User guide for MODPATH version 6 - A particle-tracking model for MODFLOW
Pollock, David W.
2012-01-01
MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
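A one-dimensional rendering of the semianalytical (Pollock-type) step within a single cell is sketched below as an aid to reading; it assumes a nonzero particle velocity directed toward an exit face and is not MODPATH source code.

```python
import numpy as np

def pollock_1d(x1, x2, v1, v2, xp):
    """Semianalytical particle step for one coordinate within one cell,
    assuming the velocity varies linearly between the cell faces:
    v(x) = v1 + A (x - x1), with A = (v2 - v1) / (x2 - x1).
    Returns the travel time to the exit face and the exit position."""
    A = (v2 - v1) / (x2 - x1)
    vp = v1 + A * (xp - x1)                   # velocity at the particle position
    if vp > 0:
        v_exit, x_exit = v2, x2               # particle moves toward the far face
    else:
        v_exit, x_exit = v1, x1               # particle moves toward the near face
    if abs(A) > 1e-30:
        t_exit = np.log(v_exit / vp) / A      # analytical travel time to the face
    else:
        t_exit = (x_exit - xp) / vp           # uniform velocity in the cell
    return t_exit, x_exit

# hypothetical cell: faces at x = 0 and 100 m, face velocities 1.0 and 2.0 m/d
print(pollock_1d(0.0, 100.0, 1.0, 2.0, xp=25.0))
```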
NASA Technical Reports Server (NTRS)
Smolka, S. A.; Preuss, R. D.; Tseng, K.; Morino, L.
1980-01-01
A user/programmer manual for the computer program SOUSSA P 1.1 is presented. The program was designed to provide accurate and efficient evaluation of steady and unsteady loads on aircraft having arbitrary shapes and motions, including structural deformations. These design goals were in part achieved through the incorporation of the data handling capabilities of the SPAR finite element Structural Analysis computer program. As a further result, SOUSSA P possesses an extensive checkpoint/restart facility. The programmer's portion of this manual includes overlay/subroutine hierarchy, logical flow of control, definition of SOUSSA P 1.1 FORTRAN variables, and definition of SOUSSA P 1.1 subroutines. The purpose of the SOUSSA P 1.1 modules, input data to the program, output of the program, hardware/software requirements, error detection and reporting capabilities, job control statements, a summary of the procedure for running the program, and two test cases, including input and output listings, are described in the user-oriented portion of the manual.
User's manual for CNVUFAC, the general dynamics heat-transfer radiation view factor program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, R. L.
CNVUFAC, the General Dynamics heat-transfer radiation view factor program, has been adapted for use on the LLL CDC 7600 computer system. The input and output have been modified, and a node incrementing logic was included to make the code compatible with the TRUMP thermal analyzer and related codes. The program performs the multiple integration necessary to evaluate the geometric black-body radiation node-to-node view factors. Card image output that contains node number and view factor information is generated for input into the related program GRAY. Program GRAY is then used to include the effects of gray-body emissivities and multiple reflections, generating the effective gray-body view factors usable in TRUMP. CNVUFAC uses an elemental area summation scheme to evaluate the multiple integrals. The program permits shadowing and self-shadowing. The basic configuration shapes that can be considered are cylinders, cones, spheres, ellipsoids, flat plates, disks, toroids, and polynomials of revolution. Portions of these shapes can also be considered.
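As a small illustration of the elemental-area summation idea, the sketch below estimates the view factor between two directly opposed parallel unit squares and can be checked against the tabulated analytic value of roughly 0.2; the geometry and grid resolution are illustrative assumptions, not CNVUFAC input.

```python
import numpy as np

# two parallel, directly opposed square plates of side a separated by gap h
a, h, n = 1.0, 1.0, 40
d = a / n                                    # element edge length
dA = d * d
xc = (np.arange(n) + 0.5) * d                # element centres on each plate

X1, Y1 = np.meshgrid(xc, xc, indexing="ij")
X2, Y2 = X1, Y1                              # identical grids, offset by h in z

# F12 = (1/A1) sum_i sum_j cos(t1) cos(t2) / (pi S^2) dA1 dA2,
# with cos(t1) = cos(t2) = h / S for parallel facing elements
F12 = 0.0
for i in range(n):
    for j in range(n):
        S2 = (X1[i, j] - X2) ** 2 + (Y1[i, j] - Y2) ** 2 + h ** 2
        F12 += np.sum(h * h / (np.pi * S2 ** 2)) * dA * dA
F12 /= a * a

print(F12)   # about 0.20 for a = h = 1
```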
Distinguishing the Forest from the Trees: Synthesizing IHRMP Research
Gregory B. Greenwood
1991-01-01
A conceptual model of hardwood rangelands as a multi-output resource system is developed and used to achieve a synthesis of Integrated Hardwood Range Management Program (IHRMP) research. The model requires the definition of state variables which characterize the system at any time, processes that move the system to different states, outputs...
The Design and the Formative Evaluation of a Web-Based Course for Simulation Analysis Experiences
ERIC Educational Resources Information Center
Tao, Yu-Hui; Guo, Shin-Ming; Lu, Ya-Hui
2006-01-01
Simulation output analysis has received little attention compared with modeling and programming in real-world simulation applications. This is further evidenced by our observation that students and beginners acquire neither adequately detailed knowledge nor relevant experience of simulation output analysis in traditional classroom learning. With…
Educational Resource Multipliers for Use in Local Public Finance: An Input-Output Approach.
ERIC Educational Resources Information Center
Boardman, A. E.; Schinnar, A. P.
1982-01-01
Develops an input-output model, with related multipliers, showing how changes in earmarked and discretionary educational funds (whether local, state, or federal) affect all of a state's districts and educational programs. Illustrates the model with Pennsylvania data and relates it to the usual educational finance approach, which uses demand…
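For readers unfamiliar with the mechanics, a standard open input-output calculation of the kind such multipliers rest on is sketched below; the three-sector coefficient matrix is invented for illustration and is not the Pennsylvania data.

```python
import numpy as np

# invented technical-coefficient matrix A for a three-sector economy
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])

# Leontief inverse: total output needed per unit of final demand, x = (I - A)^-1 d
leontief_inverse = np.linalg.inv(np.eye(3) - A)

# column sums give the total-output multiplier of one unit of final demand in each sector
output_multipliers = leontief_inverse.sum(axis=0)

# effect on every sector's output of one extra unit of earmarked funds to sector 1
d_change = np.array([1.0, 0.0, 0.0])
x_change = leontief_inverse @ d_change
```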
40 CFR Appendix D to Part 72 - Calculation of Potential Electric Output Capacity
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Calculation of Potential Electric Output Capacity D Appendix D to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. D Appendix D to Part 72—Calculation of...
40 CFR Appendix D to Part 72 - Calculation of Potential Electric Output Capacity
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Calculation of Potential Electric Output Capacity D Appendix D to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. D Appendix D to Part 72—Calculation of...
47 CFR Appendix - Technical Appendix 1
Code of Federal Regulations, 2010 CFR
2010-10-01
... display program material that has been encoded in any and all of the video formats contained in Table A3... frame rate of the transmitted video format. 2. Output Formats Equipment shall support 4:3 center cut-out... for composite video (yellow). Output shall produce video with ITU-R BT.500-11 quality scale of Grade 4...
40 CFR Appendix D to Part 72 - Calculation of Potential Electric Output Capacity
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Calculation of Potential Electric Output Capacity D Appendix D to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. D Appendix D to Part 72—Calculation of...
Stone, Vathsala I; Lane, Joseph P
2012-05-16
Background Government-sponsored science, technology, and innovation (STI) programs support the socioeconomic aspects of public policies, in addition to expanding the knowledge base. For example, beneficial healthcare services and devices are expected to result from investments in research and development (R&D) programs, which assume a causal link to commercial innovation. Such programs are increasingly held accountable for evidence of impact—that is, innovative goods and services resulting from R&D activity. However, the absence of comprehensive models and metrics skews evidence gathering toward bibliometrics about research outputs (published discoveries), with less focus on transfer metrics about development outputs (patented prototypes) and almost none on econometrics related to production outputs (commercial innovations). This disparity is particularly problematic for the expressed intent of such programs, as most measurable socioeconomic benefits result from the last category of outputs. Methods This paper proposes a conceptual framework integrating all three knowledge-generating methods into a logic model, useful for planning, obtaining, and measuring the intended beneficial impacts through the implementation of knowledge in practice. Additionally, the integration of the Context-Input-Process-Product (CIPP) model of evaluation proactively builds relevance into STI policies and programs while sustaining rigor. Results The resulting logic model framework explicitly traces the progress of knowledge from inputs, following it through the three knowledge-generating processes and their respective knowledge outputs (discovery, invention, innovation), as it generates the intended socio-beneficial impacts. It is a hybrid model for generating technology-based innovations, where best practices in new product development merge with a widely accepted knowledge-translation approach. Given the emphasis on evidence-based practice in the medical and health fields and “bench to bedside” expectations for knowledge transfer, sponsors and grantees alike should find the model useful for planning, implementing, and evaluating innovation processes. Conclusions High-cost/high-risk industries like healthcare require the market deployment of technology-based innovations to improve domestic society in a global economy. An appropriate balance of relevance and rigor in research, development, and production is crucial to optimize the return on public investment in such programs. The technology-innovation process needs a comprehensive operational model to effectively allocate public funds and thereby deliberately and systematically accomplish socioeconomic benefits. PMID:22591638
Real-time biochemical sensor based on Raman scattering with CMOS contact imaging.
Muyun Cao; Yuhua Li; Yadid-Pecht, Orly
2015-08-01
This work presents a biochemical sensor based on Raman scattering with complementary metal-oxide-semiconductor (CMOS) contact imaging. This biochemical optical sensor is designed for detecting the concentration of solutions. The system is built with a laser diode, an optical filter, a sample holder and a commercial CMOS sensor. The output of the system is analyzed by an image-processing program. The system provides instant measurements with a resolution of 0.2 to 0.4 Mol. This low-cost, easy-to-operate, small-scale system is useful in chemical, biomedical and environmental labs for quantitative biochemical concentration detection, with results reported to be comparable to those of a high-cost commercial spectrometer.
High speed cylindrical roller bearing analysis, SKF computer program CYBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Pirvics, J.
1978-01-01
The CYBEAN (Cylindrical Bearing Analysis) program was created to detail the performance of radially loaded, aligned and misaligned cylindrical roller bearings under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady-state and time-transient temperature calculations, were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions, was treated in the computation of raceway and flange contacts. Input and output architectures containing guidelines for use and a sample execution are detailed.
NASA Astrophysics Data System (ADS)
Vogelmann, A. M.; Gustafson, W. I., Jr.; Toto, T.; Endo, S.; Cheng, X.; Li, Z.; Xiao, H.
2015-12-01
The Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facilities' Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) Workflow is currently being designed to provide output from routine LES to complement its extensive observations. The modeling portion of the LASSO workflow is presented by Gustafson et al., which will initially focus on shallow convection over the ARM megasite in Oklahoma, USA. This presentation describes how the LES output will be combined with observations to construct multi-dimensional and dynamically consistent "data cubes", aimed at providing the best description of the atmospheric state for use in analyses by the community. The megasite observations are used to constrain large-eddy simulations that provide a complete spatial and temporal coverage of observables and, further, the simulations also provide information on processes that cannot be observed. Statistical comparisons of model output with their observables are used to assess the quality of a given simulated realization and its associated uncertainties. A data cube is a model-observation package that provides: (1) metrics of model-observation statistical summaries to assess the simulations and the ensemble spread; (2) statistical summaries of additional model property output that cannot be or are very difficult to observe; and (3) snapshots of the 4-D simulated fields from the integration period. Searchable metrics are provided that characterize the general atmospheric state to assist users in finding cases of interest, such as categorization of daily weather conditions and their specific attributes. The data cubes will be accompanied by tools designed for easy access to cube contents from within the ARM archive and externally, the ability to compare multiple data streams within an event as well as across events, and the ability to use common grids and time sampling, where appropriate.
OpenMP GNU and Intel Fortran programs for solving the time-dependent Gross-Pitaevskii equation
NASA Astrophysics Data System (ADS)
Young-S., Luis E.; Muruganandam, Paulsamy; Adhikari, Sadhan K.; Lončar, Vladimir; Vudragović, Dušan; Balaž, Antun
2017-11-01
We present Open Multi-Processing (OpenMP) version of Fortran 90 programs for solving the Gross-Pitaevskii (GP) equation for a Bose-Einstein condensate in one, two, and three spatial dimensions, optimized for use with GNU and Intel compilers. We use the split-step Crank-Nicolson algorithm for imaginary- and real-time propagation, which enables efficient calculation of stationary and non-stationary solutions, respectively. The present OpenMP programs are designed for computers with multi-core processors and optimized for compiling with both commercially-licensed Intel Fortran and popular free open-source GNU Fortran compiler. The programs are easy to use and are elaborated with helpful comments for the users. All input parameters are listed at the beginning of each program. Different output files provide physical quantities such as energy, chemical potential, root-mean-square sizes, densities, etc. We also present speedup test results for new versions of the programs. Program files doi:http://dx.doi.org/10.17632/y8zk3jgn84.2 Licensing provisions: Apache License 2.0 Programming language: OpenMP GNU and Intel Fortran 90. Computer: Any multi-core personal computer or workstation with the appropriate OpenMP-capable Fortran compiler installed. Number of processors used: All available CPU cores on the executing computer. Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 1888; ibid.204 (2016) 209. Does the new version supersede the previous version?: Not completely. It does supersede previous Fortran programs from both references above, but not OpenMP C programs from Comput. Phys. Commun. 204 (2016) 209. Nature of problem: The present Open Multi-Processing (OpenMP) Fortran programs, optimized for use with commercially-licensed Intel Fortran and free open-source GNU Fortran compilers, solve the time-dependent nonlinear partial differential (GP) equation for a trapped Bose-Einstein condensate in one (1d), two (2d), and three (3d) spatial dimensions for six different trap symmetries: axially and radially symmetric traps in 3d, circularly symmetric traps in 2d, fully isotropic (spherically symmetric) and fully anisotropic traps in 2d and 3d, as well as 1d traps, where no spatial symmetry is considered. Solution method: We employ the split-step Crank-Nicolson algorithm to discretize the time-dependent GP equation in space and time. The discretized equation is then solved by imaginary- or real-time propagation, employing adequately small space and time steps, to yield the solution of stationary and non-stationary problems, respectively. Reasons for the new version: Previously published Fortran programs [1,2] have now become popular tools [3] for solving the GP equation. These programs have been translated to the C programming language [4] and later extended to the more complex scenario of dipolar atoms [5]. Now virtually all computers have multi-core processors and some have motherboards with more than one physical computer processing unit (CPU), which may increase the number of available CPU cores on a single computer to several tens. The C programs have been adopted to be very fast on such multi-core modern computers using general-purpose graphic processing units (GPGPU) with Nvidia CUDA and computer clusters using Message Passing Interface (MPI) [6]. Nevertheless, previously developed Fortran programs are also commonly used for scientific computation and most of them use a single CPU core at a time in modern multi-core laptops, desktops, and workstations. 
Unless the Fortran programs are made aware and capable of making efficient use of the available CPU cores, the solution of even a realistic dynamical 1d problem, not to mention the more complicated 2d and 3d problems, could be time consuming using the Fortran programs. Previously, we published auto-parallel Fortran programs [2] suitable for the Intel (but not GNU) compiler for solving the GP equation. Hence, a need for the full OpenMP version of the Fortran programs to reduce the execution time cannot be overemphasized. To address this issue, we provide here such OpenMP Fortran programs, optimized for both Intel and GNU Fortran compilers and capable of using all available CPU cores, which can significantly reduce the execution time. Summary of revisions: Previous Fortran programs [1] for solving the time-dependent GP equation in 1d, 2d, and 3d with different trap symmetries have been parallelized using the OpenMP interface to reduce the execution time on multi-core processors. There are six different trap symmetries considered, resulting in six programs for imaginary-time propagation and six for real-time propagation, totaling 12 programs included in the BEC-GP-OMP-FOR software package. All input data (number of atoms, scattering length, harmonic oscillator trap length, trap anisotropy, etc.) are conveniently placed at the beginning of each program, as before [2]. Present programs introduce a new input parameter, which is designated by Number_of_Threads and defines the number of CPU cores of the processor to be used in the calculation. If one sets the value 0 for this parameter, all available CPU cores will be used. For the most efficient calculation it is advisable to leave one CPU core unused for the background system's jobs. For example, on a machine with 20 CPU cores such as the one we used for testing, it is advisable to use up to 19 CPU cores. However, the total number of used CPU cores can be divided into more than one job. For instance, one can run three simulations simultaneously using 10, 4, and 5 CPU cores, respectively, thus totaling 19 used CPU cores on a 20-core computer. The Fortran source programs are located in the directory src, and can be compiled by the make command using the makefile in the root directory BEC-GP-OMP-FOR of the software package. The examples of produced output files can be found in the directory output, although some large density files are omitted, to save space. The programs calculate the values of actually used dimensionless nonlinearities from the physical input parameters, where the input parameters correspond to the identical nonlinearity values as in the previously published programs [1], so that the output files of the old and new programs can be directly compared. The output files are conveniently named such that their contents can be easily identified, following the naming convention introduced in Ref. [2]. For example, a file whose name ends in -out.txt, where the prefix is the name of the individual program, represents the general output file containing input data, time and space steps, nonlinearity, energy and chemical potential, and was named fort.7 in the old Fortran version of programs [1]. A file whose name ends in -den.txt is the output file with the condensate density, which had the names fort.3 and fort.4 in the old Fortran version [1] for imaginary- and real-time propagation programs, respectively.
Other possible density outputs, such as the initial density, are commented out in the programs to have a simpler set of output files, but users can uncomment and re-enable them, if needed. In addition, there are output files for reduced (integrated) 1d and 2d densities for different programs. In the real-time programs there is also an output file reporting the dynamics of evolution of root-mean-square sizes after a perturbation is introduced. The supplied real-time programs solve the stationary GP equation, and then calculate the dynamics. As the imaginary-time programs are more accurate than the real-time programs for the solution of a stationary problem, one can first solve the stationary problem using the imaginary-time programs, adapt the real-time programs to read the pre-calculated wave function and then study the dynamics. In that case the parameter NSTP in the real-time programs should be set to zero and the space mesh and nonlinearity parameters should be identical in both programs. The reader is advised to consult our previous publication where a complete description of the output files is given [2]. A readme.txt file, included in the root directory, explains the procedure to compile and run the programs. We tested our programs on a workstation with two 10-core Intel Xeon E5-2650 v3 CPUs. The parameters used for testing are given in sample input files, provided in the corresponding directory together with the programs. In Table 1 we present wall-clock execution times for runs on 1, 6, and 19 CPU cores for programs compiled using Intel and GNU Fortran compilers. The corresponding columns "Intel speedup" and "GNU speedup" give the ratio of wall-clock execution times of runs on 1 and 19 CPU cores, and denote the actual measured speedup for 19 CPU cores. In all cases and for all numbers of CPU cores, although the GNU Fortran compiler gives excellent results, the Intel Fortran compiler turns out to be slightly faster. Note that during these tests we always ran only a single simulation on a workstation at a time, to avoid any possible interference issues. Therefore, the obtained wall-clock times are more reliable than the ones that could be measured with two or more jobs running simultaneously. We also studied the speedup of the programs as a function of the number of CPU cores used. The performance of the Intel and GNU Fortran compilers is illustrated in Fig. 1, where we plot the speedup and actual wall-clock times as functions of the number of CPU cores for 2d and 3d programs. We see that the speedup increases monotonically with the number of CPU cores in all cases and has large values (between 10 and 14 for 3d programs) for the maximal number of cores. This fully justifies the development of OpenMP programs, which enable much faster and more efficient solving of the GP equation. However, a slow saturation in the speedup with the further increase in the number of CPU cores is observed in all cases, as expected. The speedup tends to increase for programs in higher dimensions, as they become more complex and have to process more data. This is why the speedups of the supplied 2d and 3d programs are larger than those of 1d programs. Also, for a single program the speedup increases with the size of the spatial grid, i.e., with the number of spatial discretization points, since this increases the amount of calculations performed by the program. To demonstrate this, we tested the supplied real2d-th program and varied the number of spatial discretization points NX=NY from 20 to 1000. 
The measured speedup obtained when running this program on 19 CPU cores as a function of the number of discretization points is shown in Fig. 2. The speedup first increases rapidly with the number of discretization points and eventually saturates. Additional comments: Example inputs provided with the programs take less than 30 minutes to run on a workstation with two Intel Xeon E5-2650 v3 processors (2 QPI links, 10 CPU cores, 25 MB cache, 2.3 GHz).
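For orientation, the split-step Crank-Nicolson idea the programs implement can be written compactly in a few lines; the sketch below does a one-dimensional imaginary-time ground-state calculation with an invented grid, harmonic trap, and nonlinearity, and is only a toy analogue of the published Fortran codes.

```python
import numpy as np
from scipy.linalg import solve_banded

# 1d GP equation in dimensionless units, harmonic trap V = x^2/2, nonlinearity g
N, L, dt, g = 512, 20.0, 1e-3, 10.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
V = 0.5 * x**2

psi = np.exp(-x**2 / 2)                              # Gaussian initial guess
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# banded Crank-Nicolson matrix for the kinetic half of the imaginary-time step
alpha = dt / (4 * dx**2)
ab = np.zeros((3, N))
ab[0, 1:] = -alpha                                   # superdiagonal
ab[1, :] = 1 + 2 * alpha                             # diagonal
ab[2, :-1] = -alpha                                  # subdiagonal

for step in range(20000):
    # half step of the potential + nonlinear term (pointwise)
    psi *= np.exp(-0.5 * dt * (V + g * np.abs(psi)**2))
    # Crank-Nicolson step for the kinetic term
    rhs = psi.copy()
    rhs[1:-1] += alpha * (psi[2:] - 2 * psi[1:-1] + psi[:-2])
    psi = solve_banded((1, 1), ab, rhs)
    # second half step of the potential + nonlinear term
    psi *= np.exp(-0.5 * dt * (V + g * np.abs(psi)**2))
    # renormalize: imaginary-time propagation does not conserve the norm
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

rms = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
print("root-mean-square size of the ground state:", rms)
```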
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id
Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, for example in Microsoft Excel, to obtain informative results. This research develops user-interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support the analysis.
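A minimal sketch of this kind of post-processing is shown below; the "k-eff = ..." line pattern and file name are hypothetical, since real VSOP and MCNPX output formats differ and would need their own parsers.

```python
import re
import matplotlib.pyplot as plt

# hypothetical pattern: lines of the form "k-eff = 1.02345" in a large text output
pattern = re.compile(r"k-eff\s*=\s*([0-9.]+)")

keff = []
with open("vsop_output.txt") as f:            # hypothetical file name
    for line in f:
        match = pattern.search(line)
        if match:
            keff.append(float(match.group(1)))

# visualize the extracted values against burn-up step
plt.plot(range(1, len(keff) + 1), keff, marker="o")
plt.xlabel("burn-up step")
plt.ylabel("k-eff")
plt.savefig("keff_vs_burnup.png")
```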
Software Compensates Electronic-Nose Readings for Humidity
NASA Technical Reports Server (NTRS)
Zhou, Hanying
2007-01-01
A computer program corrects for the effects of humidity on the readouts of an array of chemical sensors (an "electronic nose"). To enable the use of this program, the array must incorporate an independent humidity sensor in addition to sensors designed to detect analytes other than water vapor. The basic principle of the program was described in "Compensating for Effects of Humidity on Electronic Noses" (NPO-30615), NASA Tech Briefs, Vol. 28, No. 6 (June 2004), page 63. To recapitulate: The output of the humidity sensor is used to generate values that are subtracted from the outputs of the other sensors to correct for contributions of humidity to those readings. Hence, in principle, what remains after corrections are the contributions of the analytes only. The outputs of the non-humidity sensors are then deconvolved to obtain the concentrations of the analytes. In addition, the humidity reading is retained as an analyte reading in its own right. This subtraction of the humidity background increases the ability of the software to identify such events as spills in which contaminants may be present in small concentrations and accompanied by large changes in humidity.
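The correction step itself is a simple per-sensor subtraction; the sketch below shows the idea with an invented calibration vector and readings, ahead of the deconvolution the abstract describes.

```python
import numpy as np

# pre-calibrated response of each non-humidity sensor to a unit humidity reading
humidity_response = np.array([0.8, 0.3, 1.1, 0.0])   # invented sensitivities
readings = np.array([2.4, 1.1, 3.0, 0.9])            # raw sensor outputs (invented)
humidity = 1.5                                       # independent humidity sensor reading

# subtract the humidity contribution; what remains is attributed to the analytes
corrected = readings - humidity * humidity_response

# 'corrected' would then be deconvolved to estimate analyte concentrations,
# while 'humidity' is retained as an analyte reading in its own right.
```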
System life and reliability modeling for helicopter transmissions
NASA Technical Reports Server (NTRS)
Savage, M.; Brikmanis, C. K.
1986-01-01
A computer program which simulates life and reliability of helicopter transmissions is presented. The helicopter transmissions may be composed of spiral bevel gear units and planetary gear units - alone, in series or in parallel. The spiral bevel gear units may have either single or dual input pinions, which are identical. The planetary gear units may be stepped or unstepped and the number of planet gears carried by the planet arm may be varied. The reliability analysis used in the program is based on the Weibull distribution lives of the transmission components. The computer calculates the system lives and dynamic capacities of the transmission components and the transmission. The system life is defined as the life of the component or transmission at an output torque at which the probability of survival is 90 percent. The dynamic capacity of a component or transmission is defined as the output torque which can be applied for one million output shaft cycles for a probability of survival of 90 percent. A complete summary of the life and dynamic capacity results is produced by the program.
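A compact sketch of combining component Weibull lives into a system life at 90 percent reliability, as defined above, is given below; the component lives and the common Weibull slope are invented for illustration.

```python
import numpy as np
from scipy.optimize import brentq

beta = 2.5                                   # common Weibull slope (invented)
L10 = np.array([80.0, 120.0, 60.0])          # component lives at R = 0.90, in
                                             # millions of output-shaft cycles (invented)

def system_reliability(t):
    # R_i(t) = 0.9 ** ((t / L10_i) ** beta); series system => product of R_i
    return np.prod(0.9 ** ((t / L10) ** beta))

# system life: the life at which system survival probability drops to 90 percent
L10_system = brentq(lambda t: system_reliability(t) - 0.9, 1e-6, L10.min())

# closed form when all slopes are equal: (sum L10_i^-beta)^(-1/beta)
closed_form = np.sum(L10 ** -beta) ** (-1.0 / beta)
print(L10_system, closed_form)
```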
NASA Technical Reports Server (NTRS)
Myers, W. L.
1981-01-01
The LANDSAT-geographic information system (GIS) interface must summarize the results of the LANDSAT classification over the same cells that serve as geographic referencing units for the GIS, and output these summaries on a cell-by-cell basis in a form that is readable by the input routines of the GIS. The ZONAL interface for cell-oriented systems consists of two primary programs. The PIXCEL program scans the grid of cells and outputs a channel of pixels. Each pixel contains not the reflectance values but the identifier of the cell in which the center of the pixel is located. This file of pixelized cells along with the results of a pixel-by-pixel classification of the scene produced by the LANDSAT analysis system are input to the CELSUM program which then outputs a cell-by-cell summary formatted according to the requirements of the host GIS. Cross-correlation of the LANDSAT layer with the other layers in the data base is accomplished with the analysis and display facilities of the GIS.
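The CELSUM-style tabulation amounts to counting classified pixels within each cell identifier; a minimal sketch with tiny invented arrays follows.

```python
import numpy as np

# per-pixel cell identifiers (PIXCEL-style output) and classification results (invented)
cell_id = np.array([[1, 1, 2, 2],
                    [1, 1, 2, 2],
                    [3, 3, 4, 4],
                    [3, 3, 4, 4]])
classes = np.array([[5, 5, 7, 7],
                    [5, 6, 7, 7],
                    [6, 6, 5, 7],
                    [6, 6, 7, 7]])

# cell-by-cell summary: counts of each class within each cell
summary = {}
for cid in np.unique(cell_id):
    values, counts = np.unique(classes[cell_id == cid], return_counts=True)
    summary[int(cid)] = dict(zip(values.tolist(), counts.tolist()))

# summary[1] -> {5: 3, 6: 1}: cell 1 contains three pixels of class 5 and one of class 6
```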
Towards Standardization in Terminal Ballistics Testing: Velocity Representation
1976-01-01
Samples of plotter output show v_r versus v_s and v_r/v_s versus v_s. A form is proposed as being sufficiently simple and versatile to usefully and realistically model the relationship implicit in sets of (v_s, v_r) data.
OIL—Output input language for data connectivity between geoscientific software applications
NASA Astrophysics Data System (ADS)
Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar
2010-05-01
Geoscientific computing has become so complex that no single software application can perform all the processing steps required to get the desired results. Thus for a given set of analyses, several specialized software applications are required, which must be interconnected for electronic flow of data. In this network of applications the outputs of one application become inputs of other applications. Each of these applications usually involve more than one data type and may have their own data formats, making them incompatible with other applications in terms of data connectivity. Consequently several data format conversion utilities are developed in-house to provide data connectivity between applications. Practically there is no end to this problem as each time a new application is added to the system, a set of new data conversion utilities need to be developed. This paper presents a flexible data format engine, programmable through a platform independent, interpreted language named; Output Input Language (OIL). Its unique architecture allows input and output formats to be defined independent of each other by two separate programs. Thus read and write for each format is coded only once and data connectivity link between two formats is established by a combination of their read and write programs. This results in fewer programs with no redundancy and maximum reuse, enabling rapid application development and easy maintenance of data connectivity links.
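One way to picture the decoupling described above is a registry in which each format contributes an independent reader to, and writer from, a neutral in-memory record, so that any read/write pair forms a conversion link; the sketch below uses two invented toy formats, not OIL syntax.

```python
# registries of independently defined readers and writers
readers, writers = {}, {}

def register(kind, name):
    def wrap(fn):
        (readers if kind == "read" else writers)[name] = fn
        return fn
    return wrap

@register("read", "csv_xy")
def read_csv_xy(path):
    """Read a toy comma-separated (x, y) format into neutral records."""
    with open(path) as f:
        return [tuple(map(float, line.split(","))) for line in f if line.strip()]

@register("write", "fixed_xy")
def write_fixed_xy(records, path):
    """Write neutral records to a toy fixed-width (x, y) format."""
    with open(path, "w") as f:
        for x, y in records:
            f.write(f"{x:12.3f}{y:12.3f}\n")

def convert(src_fmt, dst_fmt, src_path, dst_path):
    """Any reader/writer pair establishes a data connectivity link."""
    writers[dst_fmt](readers[src_fmt](src_path), dst_path)

# convert("csv_xy", "fixed_xy", "traces.csv", "traces.dat")   # hypothetical files
```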
A Cost-Effectiveness Analysis of Community Health Workers in Mozambique.
Bowser, Diana; Okunogbe, Adeyemi; Oliveras, Elizabeth; Subramanian, Laura; Morrill, Tyler
2015-10-01
Community health worker (CHW) programs are a key strategy for reducing mortality and morbidity. Despite this, there is a gap in the literature on the cost and cost-effectiveness of CHW programs, especially in developing countries. This study assessed the costs of a CHW program in Mozambique over the period 2010-2012. Incremental cost-effectiveness ratios, comparing the change in costs to the change in 3 output measures, as well as gains in efficiency were calculated over the periods 2010-2011 and 2010-2012. The results were reported both excluding and including salaries for CHWs. The results of the study showed total costs of the CHW program increased from US$1.34 million in 2010 to US$1.67 million in 2012. The highest incremental cost-effectiveness ratio was for the cost per beneficiary covered including CHW salaries, estimated at US$47.12 for 2010-2011. The smallest incremental cost-effectiveness ratio was for the cost per household visit not including CHW salaries, estimated at US$0.09 for 2010-2012. Adding CHW salaries would not only have increased total program costs by 362% in 2012 but also led to the largest efficiency gains in program implementation; a 56% gain in cost per output in the long run as compared with the short run after including CHW salaries. Our findings can be used to inform future CHW program policy both in Mozambique and in other countries, as well as provide a set of incremental cost per output measures to be used in benchmarking to other CHW costing analyses. © The Author(s) 2015.
Modification of infant hypothyroidism and phenylketonuria screening program using electronic tools.
Taheri, Behjat; Haddadpoor, Asefeh; Mirkhalafzadeh, Mahmood; Mazroei, Fariba; Aghdak, Pezhman; Nasri, Mehran; Bahrami, Gholamreza
2017-01-01
Congenital hypothyroidism and phenylketonuria (PKU) are the most common causes of preventable mental retardation in infants worldwide. Timely diagnosis and treatment of these disorders can have lasting effects on the mental development of newborns. However, there are several problems at different stages of screening programs that, along with imposing heavy costs, can reduce the precision of the screening, increasing the chance of undiagnosed cases, which in turn can have damaging consequences for society. Therefore, given these problems and the importance of information systems in facilitating management and improving the quality of health care, the aim of this study was to improve the screening process for hypothyroidism and PKU in infants with the help of electronic resources. The current study is qualitative action research designed to improve the quality of screening, services, performance, implementation effectiveness, and management of the hypothyroidism and PKU screening program in Isfahan province. To this end, web-based software was designed. Programming was carried out using Delphi.net, with SQL Server 2008 for database management. Given the weaknesses, problems, and limitations of the hypothyroidism and PKU screening program, and the importance of these diseases on a national scale, this study resulted in the design of hypothyroidism and PKU screening software for infants in Isfahan province. The inputs and outputs of the software were designed at three levels: the Health Care Centers in charge of the screening program, the provincial reference lab, and the health and treatment network of Isfahan province. Immediate registration of sample data at the time and location of sampling; giving the provincial reference laboratory and the Health Centers of different eparchies the ability to instantly observe, monitor, and follow up on the samples at any moment; online verification of samples by the reference lab; creation of a daily schedule for the reference lab; and receipt of the results from the analysis equipment and their entry into the database without the need for user input are among the features of this software. The implementation of the hypothyroidism screening software led to an increase in the quality and efficiency of the screening program, minimized the risk of human error in the process, and solved many of the previous limitations of the screening program, which were the main goals of implementing this software. The implementation of this software also resulted in improvement in the precision and quality of services provided for these two diseases and better accuracy and precision of data inputs, by making it possible to enter the sample data at the place and time of sampling, which in turn made management based on precise data possible, helped develop a comprehensive database, and improved the satisfaction of service recipients.
A 30 MW, 200 MHz Inductive Output Tube for RF Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Lawrence Ives; Michael Read
2008-06-19
This program investigated development of a multiple beam inductive output tube (IOT) to produce 30 MW pulses at 200 MHz. The program was successful in demonstrating feasibility of developing the source to achieve the desired power in microsecond pulses with 70% efficiency. The predicted gain of the device is 24 dB. Consequently, a 200 kW driver would be required for the RF input. Estimated cost of this driver is approximately $1.25 M. Given the estimated development cost of the IOT of approximately $750K and the requirements for a test set that would significantly increase the cost, it was determined that development could not be achieved within the funding constraints of a Phase II program.
Bayesian Processor of Output for Probabilistic Quantitative Precipitation Forecasting
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.; Maranzano, C. J.
2006-05-01
The Bayesian Processor of Output (BPO) is a new, theoretically-based technique for probabilistic forecasting of weather variates. It processes output from a numerical weather prediction (NWP) model and optimally fuses it with climatic data in order to quantify uncertainty about a predictand. The BPO is being tested by producing Probabilistic Quantitative Precipitation Forecasts (PQPFs) for a set of climatically diverse stations in the contiguous U.S. For each station, the PQPFs are produced for the same 6-h, 12-h, and 24-h periods up to 84-h ahead for which operational forecasts are produced by the AVN-MOS (Model Output Statistics technique applied to output fields from the Global Spectral Model run under the code name AVN). The inputs into the BPO are estimated as follows. The prior distribution is estimated from a (relatively long) climatic sample of the predictand; this sample is retrieved from the archives of the National Climatic Data Center. The family of the likelihood functions is estimated from a (relatively short) joint sample of the predictor vector and the predictand; this sample is retrieved from the same archive that the Meteorological Development Laboratory of the National Weather Service utilized to develop the AVN-MOS system. This talk gives a tutorial introduction to the principles and procedures behind the BPO, and highlights some results from the testing: a numerical example of the estimation of the BPO, and a comparative verification of the BPO forecasts and the MOS forecasts. It concludes with a list of demonstrated attributes of the BPO (vis-à-vis the MOS): more parsimonious definitions of predictors, more efficient extraction of predictive information, better representation of the distribution function of the predictand, and equal or better performance (in terms of calibration and informativeness).
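For readers unfamiliar with the fusion step described above, the sketch below illustrates the core idea in a deliberately simplified Gaussian setting: a prior for the predictand is estimated from a long climatic sample, a linear-Gaussian likelihood links the NWP predictor to the predictand using a short joint sample, and the two are combined into a posterior. This is a minimal numpy illustration of the principle only; the variable names, the toy samples, and the Gaussian assumptions are ours, not the BPO's actual meta-Gaussian formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Long climatic sample of the predictand (e.g., 6-h precipitation amounts) -> prior moments
    climate = rng.gamma(shape=2.0, scale=1.5, size=5000)
    prior_mean, prior_var = climate.mean(), climate.var()

    # Short joint sample of (observed predictand, NWP predictor) -> likelihood parameters
    w_obs = rng.gamma(shape=2.0, scale=1.5, size=300)            # predictand
    x_obs = 0.8 * w_obs + rng.normal(0.0, 0.7, size=300)         # model output (predictor)
    b, a = np.polyfit(w_obs, x_obs, 1)                           # linear fit x ~ a + b*w
    sigma2 = np.var(x_obs - (a + b * w_obs))                     # residual variance

    def posterior(x_new):
        """Normal-normal update: combine the climatic prior with the likelihood of the
        current NWP output x_new; returns posterior mean and variance of the predictand."""
        post_prec = 1.0 / prior_var + b * b / sigma2
        post_var = 1.0 / post_prec
        post_mean = post_var * (prior_mean / prior_var + b * (x_new - a) / sigma2)
        return post_mean, post_var

    print(posterior(4.0))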
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Astrophysics Data System (ADS)
Godines, Cody R.; Manteufel, Randall D.
2002-12-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
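The variance-reduction property underlying the comparison above can be demonstrated with a few lines of numpy. The sketch below (our own illustration, not NESSUS code) contrasts plain Monte Carlo with Latin hypercube sampling for estimating the mean of a toy response function; the stratified, one-sample-per-bin construction is the essence of the LHS subroutines described.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    def lhs(n, dim):
        """Latin hypercube sample of size n in [0,1]^dim: one point per stratum in each dimension."""
        u = (rng.random((n, dim)) + np.arange(n)[:, None]) / n   # one jittered point per stratum
        for d in range(dim):
            u[:, d] = rng.permutation(u[:, d])                   # decouple the strata across dimensions
        return u

    def response(u):
        """Toy response of three standard-normal variables (stand-in for a NESSUS response function)."""
        x = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))
        return x[:, 0] ** 2 + 0.5 * x[:, 1] - 0.2 * x[:, 2] ** 3

    n, dim, reps = 200, 3, 500
    mc_means = [response(rng.random((n, dim))).mean() for _ in range(reps)]
    lhs_means = [response(lhs(n, dim)).mean() for _ in range(reps)]
    print("MC  scatter of the mean estimate:", np.std(mc_means))
    print("LHS scatter of the mean estimate:", np.std(lhs_means))   # typically noticeably smaller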
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Technical Reports Server (NTRS)
Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)
2002-01-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
Simulation of Distributed PV Power Output in Oahu Hawaii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lave, Matthew Samuel
2016-08-01
Distributed solar photovoltaic (PV) power generation in Oahu has grown rapidly since 2008. For applications such as determining the value of energy storage, it is important to have PV power output timeseries. Since these timeseries are not typically measured, here we produce simulated distributed PV power output for Oahu. Simulated power output is based on (a) satellite-derived solar irradiance, (b) PV permit data by neighborhood, and (c) population data by census block. Permit and population data were used to model locations of distributed PV, and irradiance data was then used to simulate power output. PV power output simulations are presented by sub-neighborhood polygons, neighborhoods, and for the whole island of Oahu. Summary plots of annual PV energy and a sample week timeseries of power output are shown, and the files containing the entire timeseries are described.
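As a toy illustration of the aggregation described above (not the report's actual method, data, or numbers), the sketch below scales an irradiance timeseries by a per-neighborhood installed-capacity estimate built from permit counts and then sums to an island-wide power timeseries. All names and values are placeholders.

    import numpy as np

    hours = 24 * 7                                   # one sample week
    irradiance = np.clip(np.sin(np.linspace(0, 7 * 2 * np.pi, hours)), 0, None)  # normalized irradiance proxy

    # Hypothetical permit counts per neighborhood and an assumed average system size (kW)
    permits = {"NeighborhoodA": 1200, "NeighborhoodB": 2100, "NeighborhoodC": 650}
    avg_system_kw = 5.0
    derate = 0.8                                     # losses (inverter, soiling, wiring)

    def neighborhood_power(n_permits):
        capacity_kw = n_permits * avg_system_kw
        return capacity_kw * derate * irradiance     # kW, with irradiance normalized to a peak of 1

    island_power = sum(neighborhood_power(n) for n in permits.values())
    print("Peak simulated island PV output: %.1f MW" % (island_power.max() / 1000.0))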
As-built design specification for segment map (Sgmap) program
NASA Technical Reports Server (NTRS)
Tompkins, M. A. (Principal Investigator)
1981-01-01
The segment map program (SGMAP), which is part of the CLASFYT package, is described in detail. This program is designed to output symbolic maps or numerical dumps from LANDSAT cluster/classification files or aircraft ground truth/processed ground truth files which are in 'universal' format.
MULGRES: a computer program for stepwise multiple regression analysis
A. Jeff Martin
1971-01-01
MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
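Stepwise deletion, as used by MULGRES, repeatedly fits the full model and removes the least significant variable until every remaining variable passes a significance threshold. The numpy sketch below is a generic illustration of that procedure, not a transcription of the FORTRAN source deck.

    import numpy as np
    from scipy import stats

    def stepwise_deletion(X, y, alpha=0.05, names=None):
        """Backward elimination: drop the predictor with the largest p-value until all are significant."""
        n = X.shape[0]
        keep = list(range(X.shape[1]))
        names = names or [f"x{j}" for j in keep]
        while keep:
            Xk = np.column_stack([np.ones(n), X[:, keep]])        # intercept + remaining predictors
            beta = np.linalg.lstsq(Xk, y, rcond=None)[0]
            dof = n - Xk.shape[1]
            sigma2 = np.sum((y - Xk @ beta) ** 2) / dof
            se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xk.T @ Xk)))
            p = 2 * stats.t.sf(np.abs(beta / se), dof)            # two-sided p-values
            worst = np.argmax(p[1:])                              # ignore the intercept
            if p[1:][worst] <= alpha:
                break
            keep.pop(worst)                                       # delete the least significant variable
        return [names[j] for j in keep]

    # Example: x3 is pure noise and should be deleted
    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 3))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=200)
    print(stepwise_deletion(X, y, names=["x1", "x2", "x3"]))      # expected: ['x1', 'x2']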
The Greenfoot Programming Environment
ERIC Educational Resources Information Center
Kolling, Michael
2010-01-01
Greenfoot is an educational integrated development environment aimed at learning and teaching programming. It is aimed at a target audience of students from about 14 years old upwards, and is also suitable for college- and university-level education. Greenfoot combines graphical, interactive output with programming in Java, a standard, text-based…
7 CFR 550.52 - Reporting program performance.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 6 2010-01-01 2010-01-01 false Reporting program performance. 550.52 Section 550.52 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL RESEARCH SERVICE, DEPARTMENT... appropriate and the output of programs or projects can be readily quantified, such quantitative data should be...
7 CFR 550.52 - Reporting program performance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 6 2013-01-01 2013-01-01 false Reporting program performance. 550.52 Section 550.52 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL RESEARCH SERVICE, DEPARTMENT... appropriate and the output of programs or projects can be readily quantified, such quantitative data should be...
7 CFR 550.52 - Reporting program performance.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 6 2012-01-01 2012-01-01 false Reporting program performance. 550.52 Section 550.52 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL RESEARCH SERVICE, DEPARTMENT... appropriate and the output of programs or projects can be readily quantified, such quantitative data should be...
7 CFR 550.52 - Reporting program performance.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 6 2011-01-01 2011-01-01 false Reporting program performance. 550.52 Section 550.52 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL RESEARCH SERVICE, DEPARTMENT... appropriate and the output of programs or projects can be readily quantified, such quantitative data should be...
7 CFR 550.52 - Reporting program performance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 6 2014-01-01 2014-01-01 false Reporting program performance. 550.52 Section 550.52 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL RESEARCH SERVICE, DEPARTMENT... appropriate and the output of programs or projects can be readily quantified, such quantitative data should be...
Brinkhus, H B; Klinkenborg, H; Estorf, R; Weber, R
1983-01-01
A new programming language, SORCA, has been defined and a compiler has been written for Z80-based microcomputer systems with the CP/M operating system. The language was developed to control behavioral experiments by external stimuli and by a time schedule in real time. Eight binary hardware input lines are sampled cyclically by the computer and can be used to sense switches, level detectors, and other binary information, while eight binary hardware output lines, which are cyclically updated, can be used to control relays and lamps, to generate tones, or for other purposes. The typical reaction time (cycle time) of a SORCA program is 500 microseconds to 1 ms. All functions can be programmed as often as necessary. Included are the basic logic functions, counters, timers, majority gates, and other complex functions. Parameters can be given as constants or as the result of a step function or of a random process (with Gaussian or uniform distribution). Several tasks can be performed simultaneously. In addition, results of an experiment (e.g., number of reactions or latencies) can be measured and printed out on request or automatically. The language is easy to learn and can also be used for many other control purposes.
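SORCA itself is not reproduced here, but the control structure it implements (a fixed-rate cycle that samples the input lines, evaluates logic and timer functions, and updates the output lines) can be sketched as follows. This is a Python analogue for illustration only; the I/O functions are placeholders, whereas a real Z80/CP/M system would read and write hardware ports directly.

    import time

    CYCLE_S = 0.001                      # ~1 ms cycle, comparable to the reaction time quoted above

    def read_inputs():
        """Placeholder: return the state of the 8 binary input lines as a list of 0/1."""
        return [0] * 8

    def write_outputs(bits):
        """Placeholder: drive the 8 binary output lines (relays, lamps, tone generators)."""
        pass

    timer_ticks = 0
    for _ in range(10_000):              # bounded here; a real controller would loop indefinitely
        start = time.monotonic()
        inputs = read_inputs()

        # Example logic functions: AND of lines 0 and 1, and a simple timer on line 2
        out = [0] * 8
        out[0] = inputs[0] & inputs[1]
        timer_ticks = timer_ticks + 1 if inputs[2] else 0
        out[1] = 1 if timer_ticks >= 500 else 0      # ~0.5 s timer at 1 ms per cycle

        write_outputs(out)
        time.sleep(max(0.0, CYCLE_S - (time.monotonic() - start)))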
HZETRN: Description of a free-space ion and nucleon transport and shielding computer program
NASA Technical Reports Server (NTRS)
Wilson, John W.; Badavi, Francis F.; Cucinotta, Francis A.; Shinn, Judy L.; Badhwar, Gautam D.; Silberberg, R.; Tsao, C. H.; Townsend, Lawrence W.; Tripathi, Ram K.
1995-01-01
The high-charge-and-energy (HZE) transport computer program HZETRN has been developed to address the problems of free-space radiation transport and shielding. The HZETRN program is intended specifically for the design engineer who is interested in obtaining fast and accurate dosimetric information for the design and construction of space modules and devices. The program is based on a one-dimensional space-marching formulation of the Boltzmann transport equation with a straight-ahead approximation. The effect of the long-range Coulomb force and electron interaction is treated as a continuous slowing-down process. Atomic (electronic) stopping power coefficients with energies above a few A MeV are calculated by using Bethe's theory including Bragg's rule, Ziegler's shell corrections, and effective charge. Nuclear absorption cross sections are obtained from fits to quantum calculations and total cross sections are obtained with a Ramsauer formalism. Nuclear fragmentation cross sections are calculated with a semiempirical abrasion-ablation fragmentation model. The relation of the final computer code to the Boltzmann equation is discussed in the context of simplifying assumptions. A detailed description of the flow of the computer code, input requirements, sample output, and compatibility requirements for non-VAX platforms are provided.
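For background, the electronic stopping power referred to above is, in its basic non-relativistic Bethe form, combined across target constituents with Bragg's additivity rule. The expressions below are the textbook forms, quoted only as context and not as the exact parameterization (shell corrections, effective charge) used in HZETRN.

    % Bethe stopping power (non-relativistic form) for a projectile of charge ze and speed v
    % in a medium with electron number density n_e and mean excitation energy I:
    -\frac{dE}{dx} \;=\; \frac{4\pi n_e z^{2} e^{4}}{m_e v^{2}}\,
                         \ln\!\left(\frac{2 m_e v^{2}}{I}\right)
    % Bragg's rule: the mass stopping power of a compound or mixture is the
    % mass-fraction-weighted sum of the stopping powers of its constituents:
    \left(\frac{dE}{\rho\,dx}\right)_{\mathrm{mix}} \;=\; \sum_j w_j \left(\frac{dE}{\rho\,dx}\right)_j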
Output orientation in R and D: A better approach?. [decision making in R and D
NASA Technical Reports Server (NTRS)
Black, G.
1974-01-01
Research and development management is examined as it might be performed under an output-oriented approach in which the company's needs for innovations in various product and production areas were identified. It is shown that a company's R and D program is the aggregate of its needs in various areas of its business. The planning, programming and budgeting approach is applied to R and D. The state of theory on R and D decision making in economics is summarized. Abstracts of articles concerning R and D in industry are included.
NASA Technical Reports Server (NTRS)
Giddings, L.; Boston, S.
1976-01-01
A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.
Exact sampling hardness of Ising spin models
NASA Astrophysics Data System (ADS)
Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.
2017-09-01
We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems capable of achieving problem-size instances (i.e., qubit numbers) large enough so that classical sampling of the output distribution is classically difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.
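The connection to matrix permanents is what makes classical simulation costly: the best general exact algorithms for the permanent, such as Ryser's formula sketched below, run in time exponential in the matrix dimension. The snippet is a generic illustration of that cost, not code from the paper.

    from itertools import combinations

    def permanent_ryser(A):
        """Ryser's formula: per(A) = (-1)^n * sum over column subsets S of
        (-1)^|S| * prod_i sum_{j in S} A[i][j].  Runs in O(2^n * n^2) time."""
        n = len(A)
        total = 0
        for k in range(1, n + 1):
            for cols in combinations(range(n), k):
                prod = 1
                for i in range(n):
                    prod *= sum(A[i][j] for j in cols)
                total += (-1) ** k * prod
        return (-1) ** n * total

    A = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
    print(permanent_ryser(A))   # 450 for this matrix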
A Digital Control Algorithm for Magnetic Suspension Systems
NASA Technical Reports Server (NTRS)
Britton, Thomas C.
1996-01-01
An ongoing program exists to investigate and develop magnetic suspension technologies and modelling techniques at NASA Langley Research Center. Presently, there is a laboratory-scale large air-gap suspension system capable of five degree-of-freedom (DOF) control that is operational and a six DOF system that is under development. Those systems levitate a cylindrical element containing a permanent magnet core above a planar array of electromagnets, which are used for levitation and control purposes. In order to evaluate various control approaches with those systems, the Generic Real-Time State-Space Controller (GRTSSC) software package was developed. That control software package allows the user to implement multiple control methods and allows for varied input/output commands. The development of the control algorithm is presented. The desired functionality of the software is discussed, including the ability to inject noise on sensor inputs and/or actuator outputs. Various limitations, common issues, and trade-offs are discussed including data format precision; the drawbacks of using either Direct Memory Access (DMA), interrupts, or program control techniques for data acquisition; and platform dependent concerns related to the portability of the software, such as memory addressing formats. Efforts to minimize overall controller loop-rate and a comparison of achievable controller sample rates are discussed. The implementation of a modular code structure is presented. The format for the controller input data file and the noise information file is presented. Controller input vector information is available for post-processing by mathematical analysis software such as MATLAB1.
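To make the noise-injection feature concrete, the toy loop below runs a discrete state-feedback controller on a simple second-order plant and optionally adds Gaussian noise to the sensor reading and the actuator command, which is the kind of test the GRTSSC package supports. The plant, gains, and noise levels are invented for illustration and are not the laboratory system's parameters.

    import numpy as np

    rng = np.random.default_rng(3)
    dt = 0.001                                   # 1 kHz controller loop
    A = np.array([[1.0, dt], [0.0, 1.0]])        # double-integrator plant (position, velocity)
    B = np.array([[0.0], [dt]])
    K = np.array([[400.0, 40.0]])                # hypothetical state-feedback gains

    sensor_noise_std = 1e-4                      # injected on the measurement
    actuator_noise_std = 1e-3                    # injected on the command

    x = np.array([[0.01], [0.0]])                # initial offset from the setpoint
    for step in range(5000):
        y = x + sensor_noise_std * rng.standard_normal((2, 1))   # noisy measurement
        u = -K @ y                                               # control law
        u_applied = u + actuator_noise_std * rng.standard_normal((1, 1))
        x = A @ x + B @ u_applied                                # plant update

    print("final position error:", float(x[0, 0]))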
Gallo, Stephen A; Carpenter, Afton S; Irwin, David; McPartland, Caitlin D; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A; Glisson, Scott R
2014-01-01
There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures.
Gallo, Stephen A.; Carpenter, Afton S.; Irwin, David; McPartland, Caitlin D.; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A.; Glisson, Scott R.
2014-01-01
There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures. PMID:25184367
A low-power, high-efficiency Ka-band TWTA
NASA Technical Reports Server (NTRS)
Curren, A. N.; Dayton, J. A., Jr.; Palmer, R. W.; Force, D. A.; Tamashiro, R. N.; Wilson, J. F.; Dombro, L.; Harvey, W. L.
1991-01-01
A NASA-sponsored program is described for developing a high-efficiency low-power TWTA operating at 32 GHz and meeting the requirements for the Cassini Mission to study Saturn. The required RF output power of the helix TWT is 10 watts, while the dc power from the spacecraft is limited to about 30 watts. The performance level permits the transmission to earth of all mission data. Several novel technologies are incorporated into the TWT to achieve this efficiency including an advanced dynamic velocity taper characterized by a nonlinear reduction in pitch in the output helix section and a multistage depressed collector employing copper electrodes treated for secondary electron-emission suppression. Preliminary program results are encouraging: RF output power of 10.6 watts is obtained at 14-mA beam current and 5.2-kV helix voltage with overall TWT efficiency exceeding 40 percent.
Wang, Jun-Sheng; Yang, Guang-Hong
2017-07-25
This paper studies the optimal output-feedback control problem for unknown linear discrete-time systems with stochastic measurement and process noise. A dithered Bellman equation with the innovation covariance matrix is constructed via the expectation operator given in the form of a finite summation. On this basis, an output-feedback-based approximate dynamic programming method is developed, where the terms depending on the innovation covariance matrix are available with the aid of the innovation covariance matrix identified beforehand. Therefore, by iterating the Bellman equation, the resulting value function can converge to the optimal one in the presence of the aforementioned noise, and the nearly optimal control laws are delivered. To show the effectiveness and the advantages of the proposed approach, a simulation example and a velocity control experiment on a dc machine are employed.
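As background for the iteration described above, the standard stochastic linear-quadratic Bellman equation that approximate-dynamic-programming schemes of this kind iterate on has the form below; the paper's output-feedback, "dithered" variant replaces the state with measured input/output data and the expectation with the identified innovation covariance, so this is only the textbook starting point.

    V(x_k) \;=\; \min_{u_k}\; \mathbb{E}\!\left[\, x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k
                  + V(x_{k+1}) \;\middle|\; x_k, u_k \right],
    \qquad x_{k+1} = A x_k + B u_k + w_k .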
A low-power, high-efficiency Ka-band TWTA
NASA Astrophysics Data System (ADS)
Curren, A. N.; Dayton, J. A., Jr.; Palmer, R. W.; Force, D. A.; Tamashiro, R. N.; Wilson, J. F.; Dombro, L.; Harvey, W. L.
1991-11-01
A NASA-sponsored program is described for developing a high-efficiency low-power TWTA operating at 32 GHz and meeting the requirements for the Cassini Mission to study Saturn. The required RF output power of the helix TWT is 10 watts, while the dc power from the spacecraft is limited to about 30 watts. The performance level permits the transmission to earth of all mission data. Several novel technologies are incorporated into the TWT to achieve this efficiency including an advanced dynamic velocity taper characterized by a nonlinear reduction in pitch in the output helix section and a multistage depressed collector employing copper electrodes treated for secondary electron-emission suppression. Preliminary program results are encouraging: RF output power of 10.6 watts is obtained at 14-mA beam current and 5.2-kV helix voltage with overall TWT efficiency exceeding 40 percent.
Development of a 402.5 MHz 140 kW Inductive Output Tube
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Lawrence Ives; Michael Read, Robert Jackson
2012-05-09
This report contains the results of Phase I of an SBIR to develop a Pulsed Inductive Output Tube (IOT) with 140 kW at 400 MHz for powering H-proton beams. A number of sources, including single beam and multiple beam klystrons, can provide this power, but the IOT provides higher efficiency. Efficiencies exceeding 70% are routinely achieved. The gain is typically limited to approximately 24 dB; however, the availability of highly efficient, solid state drivers reduces the significance of this limitation, particularly at lower frequencies. This program initially focused on developing a 402 MHz IOT; however, the DOE requirement for this device was terminated during the program. The SBIR effort was refocused on improving the IOT design codes to more accurately simulate the time dependent behavior of the input cavity, electron gun, output cavity, and collector. Significant improvement was achieved in modeling capability and simulation accuracy.
Memory-based frame synchronizer. [for digital communication systems
NASA Technical Reports Server (NTRS)
Stattel, R. J.; Niswander, J. K. (Inventor)
1981-01-01
A frame synchronizer for use in digital communications systems wherein data formats can be easily and dynamically changed is described. The use of memory array elements provides increased flexibility in format selection and sync word selection in addition to real time reconfiguration ability. The frame synchronizer comprises a serial-to-parallel converter which converts a serial input data stream to a constantly changing parallel data output. This parallel data output is supplied to programmable sync word recognizers, each consisting of a multiplexer and a random access memory (RAM). The multiplexer is connected to both the parallel data output and an address bus which may be connected to a microprocessor or computer for purposes of programming the sync word recognizer. The RAM is used as an associative memory or decoder and is programmed to identify a specific sync word. Additional programmable RAMs are used as counter decoders to define word bit length, frame word length, and paragraph frame length.
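The recognizer logic described above, a serial-to-parallel shift register whose parallel word addresses a programmable lookup that flags the sync pattern, can be mirrored in software as follows. This Python fragment only reproduces the data flow for illustration; in the actual device the lookup is a RAM programmed by the host computer over the address bus.

    SYNC_WORD = 0b11101101                     # example 8-bit sync pattern (hypothetical)
    WORD_BITS = 8

    # "Program" the recognizer: one lookup entry per possible parallel word, playing the
    # role of the RAM written by the microprocessor over the address bus.
    recognizer = [0] * (1 << WORD_BITS)
    recognizer[SYNC_WORD] = 1

    def find_sync(bitstream):
        """Shift bits in serially; report positions where the parallel word matches the sync word."""
        hits, window = [], 0
        for i, bit in enumerate(bitstream):
            window = ((window << 1) | bit) & ((1 << WORD_BITS) - 1)   # serial-to-parallel register
            if i >= WORD_BITS - 1 and recognizer[window]:
                hits.append(i - WORD_BITS + 1)                        # start index of the sync word
        return hits

    frame = [0, 1, 0] + [1, 1, 1, 0, 1, 1, 0, 1] + [1, 0, 0, 1, 0]    # sync word embedded at offset 3
    print(find_sync(frame))                                           # -> [3]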
Low-cost USB interface for operant research using Arduino and Visual Basic.
Escobar, Rogelio; Pérez-Herrera, Carlos A
2015-03-01
This note describes the design of a low-cost interface using Arduino microcontroller boards and Visual Basic programming for operant conditioning research. The board executes one program in Arduino programming language that polls the state of the inputs and generates outputs in an operant chamber. This program communicates through a USB port with another program written in Visual Basic 2010 Express Edition running on a laptop, desktop, netbook computer, or even a tablet equipped with Windows operating system. The Visual Basic program controls schedules of reinforcement and records real-time data. A single Arduino board can be used to control a total of 52 inputs/output lines, and multiple Arduino boards can be used to control multiple operant chambers. An external power supply and a series of micro relays are required to control 28-V DC devices commonly used in operant chambers. Instructions for downloading and using the programs to generate simple and concurrent schedules of reinforcement are provided. Testing suggests that the interface is reliable, accurate, and could serve as an inexpensive alternative to commercial equipment. © Society for the Experimental Analysis of Behavior.
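The host side of such an interface need not be Visual Basic; as a hedged illustration of the same serial-polling idea, the sketch below uses Python with the pyserial package to poll the board and deliver reinforcers on a fixed-ratio schedule. The port name, baud rate, and one-character message format are assumptions for illustration, not the protocol used in the published interface.

    import time
    import serial   # pyserial

    PORT, BAUD = "/dev/ttyACM0", 9600            # adjust for your system (e.g., "COM3" on Windows)

    with serial.Serial(PORT, BAUD, timeout=0.1) as board:
        time.sleep(2.0)                          # allow the Arduino to reset after the port opens
        session_start = time.time()
        events = []
        while time.time() - session_start < 600: # run a 10-minute session
            line = board.readline().decode(errors="ignore").strip()
            if line == "L":                      # hypothetical message: lever press detected
                events.append(time.time() - session_start)
                if len(events) % 5 == 0:         # fixed-ratio 5 schedule of reinforcement
                    board.write(b"R")            # hypothetical command: operate the feeder relay
        print("responses:", len(events), "reinforcers:", len(events) // 5)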
Two demonstrators and a simulator for a sparse, distributed memory
NASA Technical Reports Server (NTRS)
Brown, Robert L.
1987-01-01
Described are two programs demonstrating different aspects of Kanerva's Sparse, Distributed Memory (SDM). These programs run on Sun 3 workstations, one using color, and have straightforward graphically oriented user interfaces and graphical output. Presented are descriptions of the programs, how to use them, and what they show. Additionally, this paper describes the software simulator behind each program.
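For readers who have not seen Kanerva's model, the short sketch below implements the basic read/write mechanics behind such demonstrators: a fixed set of random hard locations, activation of every location within a Hamming radius of the address, counter updates on write, and majority voting on read. It is a minimal textbook rendering in Python, not the Sun workstation code described above, and the sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(4)
    N_BITS, N_HARD, RADIUS = 256, 2000, 111      # address width, hard locations, activation radius

    hard_addresses = rng.integers(0, 2, size=(N_HARD, N_BITS), dtype=np.int8)
    counters = np.zeros((N_HARD, N_BITS), dtype=np.int32)

    def _active(address):
        distances = np.count_nonzero(hard_addresses != address, axis=1)   # Hamming distances
        return distances <= RADIUS

    def write(address, data):
        counters[_active(address)] += np.where(data == 1, 1, -1)          # update counters at active locations

    def read(address):
        return (counters[_active(address)].sum(axis=0) > 0).astype(np.int8)  # majority vote

    pattern = rng.integers(0, 2, size=N_BITS, dtype=np.int8)
    write(pattern, pattern)                                                # autoassociative store
    noisy = pattern.copy()
    noisy[:20] ^= 1                                                        # flip 20 bits of the cue
    print("recovered bits:", int(np.sum(read(noisy) == pattern)), "of", N_BITS)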
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
Solid rocket booster performance evaluation model. Volume 2: Users manual
NASA Technical Reports Server (NTRS)
1974-01-01
This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.
Emulation of simulations of atmospheric dispersion at Fukushima for Sobol' sensitivity analysis
NASA Astrophysics Data System (ADS)
Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien
2015-04-01
Polyphemus/Polair3D, from which IRSN's operational model ldX derives, was used to simulate the atmospheric dispersion of radionuclides at the scale of Japan after the Fukushima disaster. A previous study with the screening method of Morris had shown that (1) the sensitivities depend strongly on the considered output; (2) only a few of the inputs are non-influential on all considered outputs; and (3) most influential inputs have either non-linear effects or are interacting. These preliminary results called for a more detailed sensitivity analysis, especially regarding the characterization of interactions. The method of Sobol' allows for a precise evaluation of interactions but requires large simulation samples. Gaussian process emulators for each considered output were built in order to relieve this computational burden. Globally aggregated outputs proved to be easy to emulate with high accuracy, and the associated Sobol' indices are in broad agreement with previous results obtained with the Morris method. More localized outputs, such as temporal averages of gamma dose rates at measurement stations, resulted in poorer emulator performance: test simulations could not be satisfactorily reproduced by some emulators. These outputs are of special interest because they can be compared to available observations, for instance for calibration purposes. A thorough inspection of prediction residuals hinted that the model response to wind perturbations often behaved in very distinct regimes relative to some thresholds. Complementing the initial sample with wind perturbations set to the extreme values allowed for a sensible improvement of some of the emulators, while others remained too unreliable to be used in a sensitivity analysis. Adaptive sampling or regime-wise emulation could be tried to circumvent this issue. Sobol' indices for local outputs revealed interesting patterns, mostly dominated by the winds, with very high interactions. The emulators will be useful for subsequent studies. Indeed, our goal is to characterize the model output uncertainty, but too little information is available about input uncertainties. Hence, calibration of the input distributions with observations and a Bayesian approach seems necessary. This would probably involve methods such as MCMC, which would be intractable without emulators.
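To show what "emulation" means concretely here, the sketch below fits a minimal Gaussian process (kriging) emulator with a squared-exponential kernel to a handful of expensive-model runs and predicts the output at new input points. It is a bare-bones, numpy-only illustration with invented hyperparameters and a stand-in model, not the emulators built for the Polyphemus/Polair3D outputs.

    import numpy as np

    def sq_exp_kernel(X1, X2, length=0.3, variance=1.0):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / length ** 2)

    def expensive_model(x):                       # stand-in for one dispersion-model run
        return np.sin(6 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    rng = np.random.default_rng(5)
    X_train = rng.random((30, 2))                 # 30 "simulations" of a 2-input model
    y_train = expensive_model(X_train)

    noise = 1e-8                                  # jitter for numerical stability
    K = sq_exp_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)

    def emulate(X_new):
        """GP posterior mean at new inputs: k(X_new, X_train) @ K^{-1} y."""
        return sq_exp_kernel(X_new, X_train) @ alpha

    X_test = rng.random((5, 2))
    print(np.c_[emulate(X_test), expensive_model(X_test)])   # emulator vs. true model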
Rail-RNA: scalable analysis of RNA-seq splicing and coverage.
Nellore, Abhinav; Collado-Torres, Leonardo; Jaffe, Andrew E; Alquicira-Hernández, José; Wilks, Christopher; Pritt, Jacob; Morton, James; Leek, Jeffrey T; Langmead, Ben
2017-12-15
RNA sequencing (RNA-seq) experiments now span hundreds to thousands of samples. Current spliced alignment software is designed to analyze each sample separately. Consequently, no information is gained from analyzing multiple samples together, and it requires extra work to obtain analysis products that incorporate data from across samples. We describe Rail-RNA, a cloud-enabled spliced aligner that analyzes many samples at once. Rail-RNA eliminates redundant work across samples, making it more efficient as samples are added. For many samples, Rail-RNA is more accurate than annotation-assisted aligners. We use Rail-RNA to align 667 RNA-seq samples from the GEUVADIS project on Amazon Web Services in under 16 h for US$0.91 per sample. Rail-RNA outputs alignments in SAM/BAM format; but it also outputs (i) base-level coverage bigWigs for each sample; (ii) coverage bigWigs encoding normalized mean and median coverages at each base across samples analyzed; and (iii) exon-exon splice junctions and indels (features) in columnar formats that juxtapose coverages in samples in which a given feature is found. Supplementary outputs are ready for use with downstream packages for reproducible statistical analysis. We use Rail-RNA to identify expressed regions in the GEUVADIS samples and show that both annotated and unannotated (novel) expressed regions exhibit consistent patterns of variation across populations and with respect to known confounding variables. Rail-RNA is open-source software available at http://rail.bio. anellore@gmail.com or langmea@cs.jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Schuchert, Andreas; Frese, Jens; Stammwitz, Ekkehard; Novák, Miroslav; Schleich, Arthur; Wagner, Stefan M; Meinertz, Thomas
2002-06-01
It is generally acknowledged that pacemaker output must be adjusted with a 100% voltage safety margin above the pacing threshold to avoid ineffective pacing, especially in patients dependent on pacemakers. The aim of this prospective crossover study was to assess the beat-to-beat safety of low outputs in patients who are dependent on a pacemaker between 2 follow-up examinations. The study included 12 patients who had received a DDD pacemaker with an automatic beat-to-beat capture verification function. The ventricular output at 0.4 milliseconds pulse duration was programmed independently of the actual pacing threshold in a crossover randomization to 1.0 V, 1.5 V, and 2.5 V for 6 weeks each. At each follow-up, the diagnostic counters were interrogated and the pacing threshold at 0.4 milliseconds was determined in 0.1-V steps. The diagnostic pacemaker counters depict the frequency of back-up pulses delivered because of a loss of capture. During the randomization to 1.0-V output, we evaluated whether the adjustment of the output under consideration of the >100% voltage safety margin reduced the frequency of back-up pulses. Pacing thresholds at the randomization to 1.0-V, 1.5-V, and 2.5-V output were not significantly different, with 0.7 +/- 0.3 V at 2.5-V output, 0.6 +/- 0.2 V at 1.5-V output, and 0.6 +/- 0.2 V at 1.0-V output. The frequency of back-up pulses was similar at 2.5-V and 1.5-V output, 2.2% +/- 1.9% and 2.0% +/- 2.0%, respectively. The frequency of back-up pulses significantly increased at 1.0-V output to 5.8% +/- 6.4% (P <.05). Back-up pulses >5% of the time between the 2 follow-ups were observed in no patient at 2.5 V, in 1 patient at 1.5 V, and in 5 patients at 1.0 V. At the randomization to the 1.0-V output, 6 patients had pacing thresholds of 0.5 V or less, and 6 patients had pacing thresholds >0.5 V. The frequency of back-up pulses in the 2 groups was not significantly different, 6.4% +/- 8.6% and 5.7% +/- 2.6%. The frequency of back-up pulses was significantly higher at 1.0-V output than at 1.5-V and 2.5-V output. This also applied to patients with pacing thresholds of < or =0.5 V. Fixed low outputs seem not to be absolutely safe between 2 follow-ups in patients who are dependent on a pacemaker, even when the output has a 100% voltage safety margin above the pacing threshold. When patients with pacemakers programmed to a low ventricular output have symptoms of ineffective pacing, an intermittent increase of the pacing threshold should be carefully ruled out.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Johnson, J.D.; Blond, R.M.
The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.
Chirp Scaling Algorithms for SAR Processing
NASA Technical Reports Server (NTRS)
Jin, M.; Cheng, T.; Chen, M.
1993-01-01
The chirp scaling SAR processing algorithm is both accurate and efficient. Successful implementation requires proper selection of the interval of output samples, which is a function of the chirp interval, signal sampling rate, and signal bandwidth. Analysis indicates that for both airborne and spaceborne SAR applications in the slant range domain, a linear chirp scaling is sufficient. To perform a nonlinear interpolation, such as producing ground range SAR images, one can use the nonlinear chirp scaling interpolator presented in this paper.
Ames Research Center Publications: A Continuing Bibliography
NASA Technical Reports Server (NTRS)
1981-01-01
The Ames Research Center Publications: A Continuing Bibliography contains the research output of the Center indexed during 1981 in Scientific and Technical Aerospace Reports (STAR), Limited Scientific and Technical Aerospace Reports (LSTAR), International Aerospace Abstracts (IAA), and Computer Program Abstracts (CPA). This bibliography is published annually in an attempt to effect greater awareness and distribution of the Center's research output.
Heat simulation via Scilab programming
NASA Astrophysics Data System (ADS)
Hasan, Mohammad Khatim; Sulaiman, Jumat; Karim, Samsul Arifin Abdul
2014-07-01
This paper discusses the use of an open-source software package called Scilab to develop a heat simulator. The heat equation was used to simulate heat behavior in an object, and the simulator was developed using the finite difference method. Numerical experiments show that Scilab can produce a good heat behavior simulation with excellent visual output from only simple computer code.
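The finite-difference scheme behind such a simulator is compact. An explicit forward-time, centred-space update for the 1-D heat equation is sketched below in Python rather than Scilab, with an arbitrary bar length, diffusivity, and grid chosen to satisfy the usual explicit-scheme stability condition (alpha*dt/dx^2 <= 0.5); none of these values come from the paper.

    import numpy as np

    alpha = 1e-4            # thermal diffusivity (m^2/s), illustrative value
    L, nx = 1.0, 51         # bar length (m) and number of grid points
    dx = L / (nx - 1)
    dt = 0.4 * dx * dx / alpha          # respects the stability limit of 0.5

    u = np.zeros(nx)
    u[0], u[-1] = 100.0, 0.0            # fixed temperatures at the two ends (boundary conditions)

    for _ in range(2000):
        # u_t = alpha * u_xx : forward difference in time, central difference in space
        u[1:-1] = u[1:-1] + alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

    print(u[::10])          # temperature profile approaching the linear steady state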
The Correlation of Human Capital on Costs of Air Force Acquisition Programs
2009-03-01
Abstract fragment (extraction residue): ... 6.78, so the model does not exhibit multicollinearity. Heteroskedasticity was tested empirically using the Breusch-Pagan-Godfrey test ... inputs to outputs. The output in this study is the average cost overrun of Aeronautical Systems Center research, development, test, and evaluation ... The remainder of the record is table-of-contents residue (Pre-Estimation Specification Tests; Post- ...).
Modifications to the accuracy assessment analysis routine SPATL to produce an output file
NASA Technical Reports Server (NTRS)
Carnes, J. G.
1978-01-01
SPATL is an analysis program in the Accuracy Assessment Software System which makes comparisons between ground truth information and dot labeling for an individual segment. In order to facilitate the aggregation of this information, SPATL was modified to produce a disk output file containing the necessary information about each segment.
Isolated thermocouple amplifier system for stirred fixed-bed gasifier
Fasching, George E.
1992-01-01
A sensing system is provided for determining the bed temperature profile of the bed of a stirred, fixed-bed gasifier including a plurality of temperature sensors for sensing the bed temperature at different levels, a transmitter for transmitting data based on the outputs of the sensors to a remote operator's station, and a battery-based power supply. The system includes an isolation amplifier system comprising a plurality of isolation amplifier circuits for amplifying the outputs of the individual sensors. The isolation amplifier circuits each comprise an isolation operational amplifier connected to a sensor; a first "flying capacitor" circuit for, in operation, controlling the application of power from the power supply to the isolation amplifier; an output sample and hold circuit connected to the transmitter; a second "flying capacitor" circuit for, in operation, controlling the transfer of the output of the isolation amplifier to the sample and hold circuit; and a timing and control circuit for activating the first and second capacitor circuits in a predetermined timed sequence.
CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between loops in the graphs. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same techniques as the fault tree cut set code, except it includes all upstream digraph nodes in the cut sets for a given node and checks for cycles in the digraph during the solution process. CUTSETS solves for specified nodes and will not automatically solve for all upstream digraph nodes. The cut sets will be output as a text file. CUTSETS includes a utility program that will convert the popular COD format digraph model description files into text input files suitable for use with the CUTSETS programs. FEAT (MSC-21873) and FIRM (MSC-21860) available from COSMIC are examples of programs that produce COD format digraph model description files that may be converted for use with the CUTSETS programs. CUTSETS is written in C-language to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. CUTSETS is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is included on the distribution medium. Sun and SunOS are trademarks of Sun Microsystems, Inc.
DEC, DeCstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc.
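The top-down combination of child cut sets described above is easy to state in code: at an OR gate the cut sets of the children are unioned, at an AND gate one cut set from each child is combined in every possible way, and non-minimal sets are discarded. The sketch below is a generic fault-tree illustration of that algorithm in Python, not the C source distributed with CUTSETS.

    def minimize(cutsets):
        """Drop any cut set that is a superset of another (keep only minimal cut sets)."""
        return [c for c in cutsets if not any(o < c for o in cutsets)]

    def cut_sets(node, tree):
        """Recursive top-down solve: 'tree' maps gate name -> (type, [children]); leaves are basic events."""
        if node not in tree:
            return [frozenset([node])]
        gate, children = tree[node]
        child_sets = [cut_sets(c, tree) for c in children]
        if gate == "OR":
            combined = [s for sets in child_sets for s in sets]
        else:  # "AND": every combination of one cut set per child
            combined = [frozenset()]
            for sets in child_sets:
                combined = [a | b for a in combined for b in sets]
        return minimize(list(set(combined)))

    # TOP fails if pump P1 fails, or if both valve V1 and its backup V2 fail
    tree = {
        "TOP": ("OR", ["P1", "G1"]),
        "G1":  ("AND", ["V1", "V2"]),
    }
    print([sorted(c) for c in cut_sets("TOP", tree)])   # minimal cut sets: {P1} and {V1, V2}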
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... Information Collection for Public Comment; Office of Native American Programs (ONAP) Training and Technical... subject proposal. The data required by Office of Native American Programs Training and Technical... progress. The data identifies needs, outputs and outcomes of the training and technical assistance. DATES...
A New Standard for Measuring Doctoral Programs
ERIC Educational Resources Information Center
Fogg, Piper
2007-01-01
This article discusses a new standard for measuring graduate programs in the United States. The Faculty Scholarly Productivity Index, produced by Academic Analytics, a for-profit company, rates faculty members' scholarly output at nearly 7,300 doctoral programs around the country. It examines the number of book and journal articles published by…
Confidence Region for the Evaluation of HF DF Single Site Location Systems.
1983-09-02
Report fragment (form and table-of-contents residue): authors M. H. Reilly and J. Coran; performing organization Naval Research ... . Contents include: Determination of the Confidence Region; Computer Program for the Confidence Ellipse; Examples of Computer Program Output; Discussion; Acknowledgments.
FIREFAMILY: Fire planning with historic weather data.
William A. Main; Robert J. Straub; Donna M. Paananen
1982-01-01
This user's guide will help fire managers interpret the output from FIREFAMILY, a computer program that uses historic weather data for fire planning. The guide describes options within the program and explains various tables and graphs necessary for planning. It also provides details which computer specialists need to run the program.
Spectrum/Orbit-Utilization Program
NASA Technical Reports Server (NTRS)
Miller, Edward F.; Sawitz, Paul; Zusman, Fred
1988-01-01
Interferences among geostationary satellites determine allocations. Spectrum/Orbit Utilization Program (SOUP) is analytical computer program for determining mutual interferences among geostationary-satellite communication systems operating in given scenario. Major computed outputs are carrier-to-interference ratios at receivers at specified stations on Earth. Information enables determination of acceptability of planned communication systems. Written in FORTRAN.
Computer program documentation: CYBER to Univac binary conversion user's guide
NASA Technical Reports Server (NTRS)
Martin, E. W.
1980-01-01
A user's guide for a computer program which will convert SINDA temperature history data from CDC (Cyber) binary format to UNIVAC 1100 binary format is presented. The various options available, the required input, the optional output, file assignments, and the restrictions of the program are discussed.
Calculating far-field radiated sound pressure levels from NASTRAN output
NASA Technical Reports Server (NTRS)
Lipman, R. R.
1986-01-01
FAFRAP is a computer program which calculates far field radiated sound pressure levels from quantities computed by a NASTRAN direct frequency response analysis of an arbitrarily shaped structure. Fluid loading on the structure can be computed directly by NASTRAN or an added-mass approximation to fluid loading on the structure can be used. Output from FAFRAP includes tables of radiated sound pressure levels and several types of graphic output. FAFRAP results for monopole and dipole sources compare closely with an explicit calculation of the radiated sound pressure level for those sources.
NASA Technical Reports Server (NTRS)
1993-01-01
The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert the program to computers other than the VAX computer. Documentation for each subroutine or function, consisting of variable definitions and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included which provide a listing of the Root-Sum-Square (RSS) program, a listing of subroutine names and definitions used in the MASTRE User Friendly Interface (UFI) Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program is used to aid in the performance of dispersion analyses. The RSS program reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.
Timeline Analysis Program (TLA-1)
NASA Technical Reports Server (NTRS)
Miller, K. H.
1976-01-01
The Timeline Analysis Program (TLA-1) is described. This program is a crew workload analysis computer program that was developed and expanded from previous workload analysis programs, and is designed to be used on the NASA terminal controlled vehicle program. The following information is described: derivation of the input data, processing of the data, and form of the output data. Eight scenarios that were created, programmed, and analyzed as verification of this model are also described.
45 CFR 2522.580 - What performance measures am I required to submit to the Corporation?
Code of Federal Regulations, 2010 CFR
2010-10-01
....590. (b) For example, a tutoring program might use the following aligned performance measures: (1) Output: Number of students that participated in a tutoring program; (2) Intermediate-Outcome: Percent of...
A computer program to determine the possible daily release window for sky target experiments
NASA Technical Reports Server (NTRS)
Michaud, N. H.
1973-01-01
A computer program is presented which is designed to determine the daily release window for sky target experiments. Factors considered in the program include: (1) target illumination by the sun at release time and during the tracking period; (2) look angle elevation above local horizon from each tracking station to the target; (3) solar depression angle from the local horizon of each tracking station during the experimental period after target release; (4) lunar depression angle from the local horizon of each tracking station during the experimental period after target release; and (5) total sky background brightness as seen from each tracking station while viewing the target. Program output is produced in both graphic and data form. Output data can be plotted for a single calendar month or year. The numerical values used to generate the plots are furnished to permit a more detailed review of the computed daily release windows.
Mars Global Reference Atmospheric Model (Mars-GRAM 3.34): Programmer's Guide
NASA Technical Reports Server (NTRS)
Justus, C. G.; James, Bonnie F.; Johnson, Dale L.
1996-01-01
This is a programmer's guide for the Mars Global Reference Atmospheric Model (Mars-GRAM 3.34). Included are a brief history and review of the model since its origin in 1988 and a technical discussion of recent additions and modifications. Examples of how to run both the interactive and batch (subroutine) forms are presented. Instructions are provided on how to customize output of the model for various parameters of the Mars atmosphere. Detailed descriptions are given of the main driver programs, subroutines, and associated computational methods. Lists and descriptions include input, output, and local variables in the programs. These descriptions give a summary of program steps and 'map' of calling relationships among the subroutines. Definitions are provided for the variables passed between subroutines through common lists. Explanations are provided for all diagnostic and progress messages generated during execution of the program. A brief outline of future plans for Mars-GRAM is also presented.
Burner liner thermal/structural load modeling: TRANCITS program user's manual
NASA Technical Reports Server (NTRS)
Maffeo, R.
1985-01-01
The Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) is discussed. TRANCITS satisfies all the objectives for transferring thermal data between heat transfer and structural models of combustor liners, and it can be used as a generic thermal translator between heat transfer and stress models of any component, regardless of geometry. It accurately and efficiently converts the temperature distributions predicted by heat transfer programs to those required by stress codes. It supports both linear and nonlinear structural codes and can produce nodal temperatures, elemental centroid temperatures, or elemental Gauss point temperatures. The thermal output of both the MARC and SINDA heat transfer codes can be interfaced directly with TRANCITS, and the code automatically produces stress model input formatted for NASTRAN and MARC. Any thermal program and structural program can be interfaced by using the neutral input and output formats supported by TRANCITS.
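The kind of mesh-to-mesh temperature transfer TRANCITS performs can be sketched generically; the inverse-distance scheme below is an assumption for illustration, not the code's actual interpolation algorithm:

```python
import numpy as np
from scipy.spatial import cKDTree

def map_temperatures(thermal_nodes, thermal_temps, structural_points, k=4):
    """Interpolate nodal temperatures from a heat-transfer mesh onto structural
    node or Gauss-point locations using inverse-distance weighting of the k
    nearest thermal nodes (a generic sketch of a thermal translator step)."""
    dist, idx = cKDTree(thermal_nodes).query(structural_points, k=k)
    dist = np.maximum(dist, 1e-12)            # guard against coincident nodes
    w = 1.0 / dist
    return (w * thermal_temps[idx]).sum(axis=1) / w.sum(axis=1)

# Usage with fabricated coordinates and temperatures:
T_struct = map_temperatures(np.random.rand(500, 3), np.random.rand(500) * 400 + 300,
                            np.random.rand(200, 3))
```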
System and method for resolving gamma-ray spectra
Gentile, Charles A.; Perry, Jason; Langish, Stephen W.; Silber, Kenneth; Davis, William M.; Mastrovito, Dana
2010-05-04
A system for identifying radionuclide emissions is described. The system includes at least one processor for processing output signals from a radionuclide detecting device; at least one training algorithm, run by the processor, for analyzing data derived from at least one set of known sample data in the output signals; and at least one classification algorithm, derived from the training algorithm, for classifying unknown sample data. The training algorithm analyzes the known sample data set to derive at least one rule used by the classification algorithm to identify at least one radionuclide emission detected by the detecting device.
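The train-then-classify scheme the patent describes can be sketched with any standard supervised learner; the random forest, data shapes, and labels below are illustrative assumptions, not the patent's specific algorithms:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stand-in data: each row is a binned gamma-ray spectrum (counts per channel),
# each label the emitting radionuclide known for that training spectrum.
rng = np.random.default_rng(0)
known_spectra = rng.poisson(5.0, size=(200, 1024))
known_labels = rng.choice(["Cs-137", "Co-60", "K-40"], size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(known_spectra, known_labels)          # "training algorithm" on known sample data

unknown_spectrum = rng.poisson(5.0, size=(1, 1024))
print(clf.predict(unknown_spectrum))          # "classification algorithm" on unknown data
```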
Xinyinqin: a computer-based heart sound simulator.
Zhan, X X; Pei, J H; Xiao, Y H
1995-01-01
"Xinyinqin" is the Chinese phoneticized name of the Heart Sound Simulator (HSS). The "qin" in "Xinyinqin" is the Chinese name of a category of musical instruments, which means that the operation of HSS is very convenient--like playing an electric piano with the keys. HSS is connected to the GAME I/O of an Apple microcomputer. The generation of sound is controlled by a program. Xinyinqin is used as a teaching aid of Diagnostics. It has been applied in teaching for three years. In this demonstration we will introduce the following functions of HSS: 1) The main program has two modules. The first one is the heart auscultation training module. HSS can output a heart sound selected by the student. Another program module is used to test the student's learning condition. The computer can randomly simulate a certain heart sound and ask the student to name it. The computer gives the student's answer an assessment: "correct" or "incorrect." When the answer is incorrect, the computer will output that heart sound again for the student to listen to; this process is repeated until she correctly identifies it. 2) The program is convenient to use and easy to control. By pressing the S key, it is able to output a slow heart rate until the student can clearly identify the rhythm. The heart rate, like the actual rate of a patient, can then be restored by hitting any key. By pressing the SPACE BAR, the heart sound output can be stopped to allow the teacher to explain something to the student. The teacher can resume playing the heart sound again by hitting any key; she can also change the content of the training by hitting RETURN key. In the future, we plan to simulate more heart sounds and incorporate relevant graphs.
User's Manual: Thermal Radiation Analysis System TRASYS 2
NASA Technical Reports Server (NTRS)
Jensen, C. L.
1981-01-01
A digital computer software system with generalized capability to solve the radiation-related aspects of thermal analysis problems is presented. When used in conjunction with a generalized thermal analysis program such as the Systems Improved Numerical Differencing Analyzer (SINDA) program, any thermal problem that can be expressed in terms of a lumped-parameter R-C thermal network can be solved. The function of TRASYS is twofold. It provides: (a) internode radiation interchange data; and (b) incident and absorbed heat rate data from environmental radiant heat sources. Data of both types are provided in a format directly usable by the thermal analyzer programs. The system allows the user to write his own executive or driver program, which organizes and directs the program library routines toward solution of each specific problem in the most expeditious manner. The user may also write his own output routines, so that the system data output can directly interface with any thermal analyzer using the R-C network concept.
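A minimal sketch of the lumped-parameter R-C network that such a thermal analyzer solves, with the absorbed environmental heat rates playing the role of the TRASYS-supplied data; the node count, couplings, and values are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import solve_ivp

C = np.array([500.0, 800.0, 300.0])        # nodal capacitances, J/K
G = np.array([[0.0, 2.0, 0.5],             # conductive couplings G[i, j] = 1/R_ij, W/K
              [2.0, 0.0, 1.0],
              [0.5, 1.0, 0.0]])
q = np.array([40.0, 0.0, 10.0])            # absorbed environmental heat rates, W

def dTdt(t, T):
    # C_i dT_i/dt = sum_j G_ij (T_j - T_i) + q_i
    return (G @ T - G.sum(axis=1) * T + q) / C

sol = solve_ivp(dTdt, (0.0, 3600.0), y0=[290.0, 290.0, 290.0], max_step=10.0)
print(sol.y[:, -1])                        # node temperatures after one hour
```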
Splines and polynomial tools for flatness-based constrained motion planning
NASA Astrophysics Data System (ADS)
Suryawan, Fajar; De Doná, José; Seron, María
2012-08-01
This article addresses the problem of trajectory planning for flat systems with constraints. Flat systems have the useful property that the input and the state can be completely characterised by the so-called flat output. We propose a spline parametrisation for the flat output, the performance output, the states and the inputs. Using this parametrisation, the problem of constrained trajectory planning can be cast into a simple quadratic programming problem. An important result is that the B-spline parametrisation used gives exact results for constrained linear continuous-time systems. The result is exact in the sense that the constrained signal can be made arbitrarily close to the boundary without intersampling issues (as one would have in sampled-data systems). Simulation examples are presented, involving the generation of rest-to-rest trajectories. In addition, an experimental result of the method is presented, in which two methods to generate trajectories for a magnetic-levitation (maglev) system in the presence of constraints are compared and each method's performance is discussed. The first method uses the nonlinear model of the plant, which turns out to belong to the class of flat systems. The second method uses a linearised version of the plant model around an operating point. In every case, a continuous-time description is used. The experimental results on a real maglev system reported here show that, in most scenarios, the nonlinear and linearised models produce nearly indistinguishable trajectories.
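A minimal sketch of the B-spline/quadratic-programming idea, not the authors' implementation: here the flat output is parametrised by clamped cubic B-splines and the constraints are enforced on a sampling grid (the article's result avoids such intersampling approximations); the horizon, limits, and solver choice are illustrative:

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

k, T, n_ctrl = 3, 1.0, 10                         # clamped cubic B-spline basis on [0, T]
knots = np.concatenate(([0.0] * k, np.linspace(0.0, T, n_ctrl - k + 1), [T] * k))
ts = np.linspace(0.0, T, 200)                     # sampling grid for cost and constraints

def basis(x, deriv=0):
    """Matrix whose columns are the B-spline basis functions (or derivatives) at x."""
    B = np.zeros((len(x), n_ctrl))
    for j in range(n_ctrl):
        c = np.zeros(n_ctrl); c[j] = 1.0
        spl = BSpline(knots, c, k)
        B[:, j] = (spl.derivative(deriv) if deriv else spl)(x)
    return B

B0, B2 = basis(ts), basis(ts, 2)
y0, yT, ymax = 0.0, 1.0, 1.05                     # rest-to-rest transfer with amplitude limit
H = B2.T @ B2 * (ts[1] - ts[0])                   # approx. integral of squared acceleration
ends = np.array([0.0, T])
Aeq = np.vstack([basis(ends), basis(ends, 1)])    # endpoint positions and zero end velocities
beq = np.array([y0, yT, 0.0, 0.0])

res = minimize(lambda c: c @ H @ c, np.linspace(y0, yT, n_ctrl), method="SLSQP",
               constraints=[{"type": "eq", "fun": lambda c: Aeq @ c - beq},
                            {"type": "ineq", "fun": lambda c: ymax - B0 @ c}])
trajectory = B0 @ res.x                           # constrained flat-output samples on ts
```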
Alpha1 LASSO data bundles Lamont, OK
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Krishna, Bhargavi (ORCID:0000-0001-8828-528X)
2016-08-03
A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES input includes model configuration information and forcing data. LES output includes profile statistics and full domain fields of cloud and environmental variables. Model evaluation data consists of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
Jenkins, Clinton N.; Flocks, J.; Kulp, M.; ,
2006-01-01
Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric information or, where possible, from word-based relative locational information such as 'core catcher' in reference to the sampler device, and from recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs, including an in-built facility that better suits the requirements of sea-floor data. Finally, a suite of stratigraphic statistics is computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.
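A minimal sketch of the event-generation step, under a hypothetical record layout (the authors' structured-document format is richer and not reproduced here):

```python
# Hypothetical written core log: numeric depths where available, otherwise a
# word-based 'situation' such as 'core catcher' plus the recovery length.
core = {
    "site": "MR-12", "recovery_m": 3.2,
    "observations": [
        {"top_mbsf": 0.0, "base_mbsf": 1.4, "situation": "bulk", "grain_size_um": 180},
        {"top_mbsf": 1.4, "base_mbsf": 3.0, "situation": "bulk", "grain_size_um": 95},
        {"situation": "core catcher", "grain_size_um": 250},
    ],
}

def to_events(core):
    """Emit one output event per observation, assigning top/base mbsf from numeric
    values or, for word-based locations, from the recovery length."""
    events = []
    for obs in core["observations"]:
        top, base = obs.get("top_mbsf"), obs.get("base_mbsf")
        if top is None and obs["situation"] == "core catcher":
            top = base = core["recovery_m"]       # bottom of the recovered section
        events.append({"site": core["site"], "top_mbsf": top, "base_mbsf": base,
                       "situation": obs["situation"], "grain_size_um": obs["grain_size_um"]})
    return events

print(to_events(core))   # rows ready for a database/GIS table or a sparse property matrix
```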
NASA Astrophysics Data System (ADS)
Vo, Martin
2017-08-01
Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished using attributes of light curves or any time series, including shapes, histograms, or variograms, or using other available information about the inspected objects, such as color indices, temperatures, and abundances. After the user specifies the features that describe the objects to be searched, the software trains on a given training sample and can then be used for unsupervised clustering to visualize the natural separation of the sample. The package can also be used for automatic tuning of method parameters (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. Light Curves Classifier can also be used for simple downloading of light curves and all available information about queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and a command-line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases, and filtering outputs with trained filters. Preimplemented descriptors, classifiers and connectors can be selected with simple clicks, and their parameters can be tuned by giving ranges of values; all combinations are then evaluated and the best one is used to create the filter. Natural separation of the data can be visualized by unsupervised clustering.
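A generic sketch of the train-tune-filter workflow described above, using ordinary scikit-learn components rather than the package's own descriptors and API (the feature choices, data, and parameter ranges are fabricated for illustration):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

def features(curve):
    """Toy light-curve descriptors (histogram plus simple shape statistics)."""
    hist, _ = np.histogram(curve, bins=10, density=True)
    return np.concatenate([hist, [curve.std(), np.ptp(curve)]])

# Fabricated training sample: two classes of light curves with different variability.
rng = np.random.default_rng(1)
curves = [rng.normal(scale=1.0 + cls, size=200) for cls in (0, 1) for _ in range(50)]
labels = [cls for cls in (0, 1) for _ in range(50)]
X = np.array([features(c) for c in curves])

# Tune the number of hidden neurons over a given range, analogous to the package's
# automatic parameter tuning, then use the best classifier as a filter.
search = GridSearchCV(MLPClassifier(max_iter=2000, random_state=0),
                      {"hidden_layer_sizes": [(5,), (10,), (20,)]})
search.fit(X, labels)
print(search.best_params_, search.predict(X[:3]))
```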
Chang, Chia-Ling; Trimbuch, Thorsten; Chao, Hsiao-Tuan; Jordan, Julia-Christine; Herman, Melissa A; Rosenmund, Christian
2014-01-15
Neural circuits are composed of mainly glutamatergic and GABAergic neurons, which communicate through synaptic connections. Many factors instruct the formation and function of these synapses; however, it is difficult to dissect the contribution of intrinsic cell programs from that of extrinsic environmental effects in an intact network. Here, we perform paired recordings from two-neuron microculture preparations of mouse hippocampal glutamatergic and GABAergic neurons to investigate how synaptic input and output of these two principal cells develop. In our reduced preparation, we found that glutamatergic neurons showed no change in synaptic output or input regardless of partner neuron cell type or neuronal activity level. In contrast, we found that glutamatergic input caused the GABAergic neuron to modify its output by way of an increase in synapse formation and a decrease in synaptic release efficiency. These findings are consistent with aspects of GABAergic synapse maturation observed in many brain regions. In addition, changes in GABAergic output are cell wide and not target-cell specific. We also found that glutamatergic neuronal activity determined the AMPA receptor properties of synapses on the partner GABAergic neuron. All modifications of GABAergic input and output required activity of the glutamatergic neuron. Because our system has reduced extrinsic factors, the changes we saw in the GABAergic neuron due to glutamatergic input may reflect initiation of maturation programs that underlie the formation and function of in vivo neural circuits.
NASA Technical Reports Server (NTRS)
Harrison, B. A.; Richard, M.
1979-01-01
The information necessary for execution of the digital computer program L216 on the CDC 6600 is described. L216 characteristics are based on the doublet lattice method. Arbitrary aerodynamic configurations may be represented with combinations of nonplanar lifting surfaces composed of finite constant-pressure panel elements, and axially symmetric slender bodies composed of constant-pressure line elements. Program input consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.