Sample records for minicomputers

  1. Overview of a minicomputer network

    SciTech Connect

    Vahle, M. O.; Tolendino, L. F.

    1980-08-01

    A computer network was developed to support minicomputers used at a number of locations within Sandia National Laboratories. This report describes the control strategies, capabilities, and design philosophies of the minicomputer network. 2 figures.

  2. Finite element software for mini-computers

    Microsoft Academic Search

    Sharaf A. Eldin; D. J. Evans

    1990-01-01

    In this paper the efficient implementation of the Finite Element Method (FEM) on mini-computers is considered. The main limitations of memory and address space are overcome and a software solution is proposed using a Virtual Stack Facility (VSF), which is implemented and tested. A new replacement algorithm for a virtual stack is presented and shown to be more efficient than other
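
    The record mentions a Virtual Stack Facility (VSF) with a replacement algorithm for stack segments that do not fit in core. The paper's actual algorithm is not reproduced in the record; as a rough illustration only, the following Python sketch shows one conventional policy, least-recently-used eviction of fixed-size stack segments, with all names and sizes hypothetical.

      from collections import OrderedDict

      class VirtualStack:
          """Illustrative virtual stack: keeps only a few segments resident in
          'core', swapping the rest to 'disk' (a dict standing in for backing
          store). Not the VSF described in the paper."""

          def __init__(self, resident_limit=4, segment_size=256):
              self.resident_limit = resident_limit
              self.segment_size = segment_size
              self.core = OrderedDict()   # segment number -> data, kept in LRU order
              self.disk = {}              # evicted segments

          def _segment(self, index):
              seg = index // self.segment_size
              if seg in self.core:
                  self.core.move_to_end(seg)                    # mark as most recently used
              else:
                  if len(self.core) >= self.resident_limit:
                      old, data = self.core.popitem(last=False) # evict the LRU segment
                      self.disk[old] = data
                  self.core[seg] = self.disk.pop(seg, [0] * self.segment_size)
              return self.core[seg]

          def read(self, index):
              return self._segment(index)[index % self.segment_size]

          def write(self, index, value):
              self._segment(index)[index % self.segment_size] = value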

  3. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economics in quantity purchases, interchangeability of minicomputers, software, storage and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  4. Minicomputer Capabilities Related to Meteorological Aspects of Emergency Response

    SciTech Connect

    Ramsdell, J. V.; Athey, G. F.; Ballinger, M. Y.

    1982-02-01

    The purpose of this report is to provide the NRC staff involved in reviewing licensee emergency response plans with background information on the capabilities of minicomputer systems that are related to the collection and dissemination of meteorological information. The treatment of meteorological information by organizations with existing emergency response capabilities is described, and the capabilities, reliability and availability of minicomputers and minicomputer systems are discussed.

  5. Selecting an image analysis minicomputer system

    NASA Technical Reports Server (NTRS)

    Danielson, R.

    1981-01-01

    Factors to be weighed when selecting a minicomputer system as the basis for an image analysis computer facility vary depending on whether the user organization procures a new computer or selects an existing facility to serve as an image analysis host. Some conditions not directly related to hardware or software should be considered, such as the flexibility of the computer center staff, their encouragement of innovation, and the availability of the host processor to a broad spectrum of potential user organizations. Particular attention must be given to: image analysis software capability; the facilities of a potential host installation; the central processing unit; the operating system and languages; main memory; disk storage; tape drives; hardcopy output; and other peripherals. The operational environment, accessibility, resource limitations, and operational supports are important. Charges made for program execution and data storage must also be examined.

  6. CAISYS-8- A CAI Language Developed For A Minicomputer.

    ERIC Educational Resources Information Center

    Holm, Cheryl; And Others

    The University of Texas Medical Branch developed a minicomputer-based computer-assisted instruction (CAI) system which employed a teacher oriented software package called CAISYS-8, consisting of a highly modularized teaching compiler and operating system. CAISYS-8 used instructional quanta which generalized the flow of information to and from the…

  7. If Minicomputers Are the Answer, What Was the Question?

    ERIC Educational Resources Information Center

    GRI Computer Corp., Newton, MA.

    The availability of low-cost minicomputers in the last few years has opened up many new control and special purpose applications for computers. However, using general purpose computers for these specialized applications often leads to inefficiencies in programming and operation. GRI Computer Corporation has developed a common-sense approach called…

  8. Recent Trends in Minicomputer-Based Integrated Learning Systems for Reading and Language Arts Instruction.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    This paper discusses minicomputer-based ILSs (integrated learning systems), i.e., computer-based systems of hardware and software. An example of a minicomputer-based system in a school district (a composite of several actual districts) considers hardware, staffing, scheduling, reactions, problems, and training for a subskill-oriented reading…

  9. Distributing structural optimization software between a mainframe and a minicomputer

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Dovi, A. R.; Riley, K. M.

    1981-01-01

    This paper describes a distributed software system for solving large-scale structural optimization problems. Distributing the software between a mainframe computer and a minicomputer takes advantage of some of the best features available on each computer. The described software system consists of a finite element structural analysis computer program, a general purpose optimizer program, and several small user-supplied problem dependent programs. Comparison with a similar system executing entirely on the mainframe computer reveals that the distributed system costs less, uses computer resources more efficiently and improves production through faster turnaround and improved user control. The system interfaces with interactive graphics software for generating models and displaying the intermediate and final results

  10. Application of split-film anemometer and mini-computer for measurement in turbulent separated flow

    NASA Technical Reports Server (NTRS)

    Wentz, W. H.; Habluetzel, T.; Howe, D. C.; Fiscko, K. A.

    1979-01-01

    A split-film anemometer has been adapted for measurement of highly turbulent intermittently reversing flows in regions of local separation around airfoils and flaps. Analog signals from the split-film anemometer are fed directly to a mini-computer for processing and analysis. Mean velocity magnitude and direction, intermittency of reversal, turbulence intensity and histograms of the velocity are obtained as outputs of the system.
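
    As a simple illustration of the post-processing the record describes (not the authors' code), the sketch below derives mean velocity, turbulence intensity, intermittency of reversal, and a velocity histogram from digitized anemometer samples; the sign convention (negative samples denote reversed flow) is an assumption.

      import numpy as np

      def reversal_statistics(u, bins=20):
          """Reduce digitized velocity samples u (m/s) to the quantities named in
          the abstract; negative samples are taken to mean locally reversed flow."""
          u = np.asarray(u, dtype=float)
          mean_velocity = u.mean()
          turbulence_intensity = u.std() / abs(mean_velocity)   # rms fluctuation over mean
          intermittency = float(np.mean(u < 0.0))               # fraction of reversed samples
          histogram, bin_edges = np.histogram(u, bins=bins)
          return mean_velocity, turbulence_intensity, intermittency, histogram, bin_edges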

  11. A brief description of the Medical Information Computer System (MEDICS). [real time minicomputer system]

    NASA Technical Reports Server (NTRS)

    Moseley, E. C.

    1974-01-01

    The Medical Information Computer System (MEDICS) is a time shared, disk oriented minicomputer system capable of meeting storage and retrieval needs for the space- or non-space-related applications of at least 16 simultaneous users. At the various commercially available low cost terminals, the simple command and control mechanism and the generalized communication activity of the system permit multiple form inputs, real-time updating, and instantaneous retrieval capability with a full range of options.

  12. Interface for 15VSM-5 and Elektronika D3-28 minicomputers with digital measuring instruments

    SciTech Connect

    Udovichenko, N.A.; Polikarpov, Yu.I.; Makushkin, B.V.

    1987-07-01

    A device is described for data input (up to 8 decimal digits in 8421 code) into 15VSM-5 and Elektronika D3-28 minicomputers from four measuring instruments: a V7-21 voltmeter and three Ch3-54 frequency counters. Data from the voltmeter are entered by software interrogation and data from the frequency counters are entered by software interrupts. The device is implemented by TTL integrated circuits.
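
    For readers unfamiliar with 8421 code, the hypothetical sketch below shows the host-side conversion such an interface implies: each decimal digit arrives as a 4-bit binary-coded-decimal (8421) nibble, most significant digit first, and the digits are assembled into a numeric reading. It is illustrative only, not the device's logic.

      def decode_8421(nibbles):
          """Decode a sequence of 4-bit 8421 (BCD) digit codes into an integer."""
          value = 0
          for nibble in nibbles:
              digit = 8 * ((nibble >> 3) & 1) + 4 * ((nibble >> 2) & 1) \
                    + 2 * ((nibble >> 1) & 1) + (nibble & 1)
              if digit > 9:
                  raise ValueError("not a valid 8421 digit code")
              value = 10 * value + digit
          return value

      # e.g. decode_8421([0b0001, 0b0010, 0b0111]) == 127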

  13. Prickett and Lonnquist aquifer simulation program for the Apple II minicomputer

    SciTech Connect

    Hull, L.C.

    1983-02-01

    The Prickett and Lonnquist two-dimensional groundwater model has been programmed for the Apple II minicomputer. Both leaky and nonleaky confined aquifers can be simulated. The model was adapted from the FORTRAN version of Prickett and Lonnquist. In the configuration presented here, the program requires 64 K bits of memory. Because of the large number of arrays used in the program, and memory limitations of the Apple II, the maximum grid size that can be used is 20 rows by 20 columns. Input to the program is interactive, with prompting by the computer. Output consists of predicted head values at the row-column intersections (nodes).
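
    The Prickett and Lonnquist code itself is not reproduced in the record; as a much-simplified stand-in for a two-dimensional finite-difference head calculation on a small grid (e.g. 20 rows by 20 columns), the sketch below relaxes each interior node toward the average of its four neighbours while constant-head nodes stay fixed. The real model also handles storage, leakage, and pumping terms that this toy omits.

      import numpy as np

      def relax_heads(head, fixed, iterations=200):
          """Toy steady-state relaxation: `head` is a 2-D array of starting heads,
          `fixed` a boolean mask marking constant-head (boundary) nodes."""
          h = head.astype(float).copy()
          for _ in range(iterations):
              new = h.copy()
              new[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                        h[1:-1, :-2] + h[1:-1, 2:])
              h = np.where(fixed, head, new)       # hold constant-head nodes
          return h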

  14. Application of a ground based minicomputer system for real time, closed loop control of remotely piloted aircraft models used in stall/spin research

    NASA Technical Reports Server (NTRS)

    Montoya, R. J.; Jai, A. R.

    1979-01-01

    The paper describes a minicomputer-based, real-time closed loop remote control system at NASA Langley outdoor facility which is used to determine the stall/departure/spin characteristics of high-performance aircraft. The experiments are conducted with 15% dynamically scaled, unpowered models that are dropped from 3000 m and ground controlled. The effects of time delays and sampling rates on the stability of the control system and the selection of digital algorithms to meet frequency response and real time constraints are examined. Also described is the implementation of the modular software for the flexible programming of multi-axis control laws.

  15. Programmable Calculators and Minicomputers in Agriculture. A Symposium Exploring Computerized Decision-Making Aids and Their Extension to the Farm Level. Proceedings of a Symposium (Hot Springs, Arkansas, February 6-7, 1980)

    ERIC Educational Resources Information Center

    Bentley, Ernest, Ed.

    Ten papers presented at a symposium discuss the array of computerized decision-making aids currently available to farmers and ways to speed up the rate of adoption of computers by agriculturalists. Topics presented include the development of software for agricultural decision-making; the role of programmable calculators and minicomputers in…

  16. Where Will the Minicomputer Lead Us?

    ERIC Educational Resources Information Center

    Leeson, Marjorie M.

    1975-01-01

    The responsibility of individuals working with computers, both the mini and the maxi, must not only be envisioned but must also be accepted as one of the greatest challenges ever to face mankind. (Author)

  17. CMC: a machine independent macro preprocessor for minicomputers

    E-print Network

    Crews, Phillip Lee

    1973-01-01

    which separates arguments with commas and appends a universal terminator character to the last element of the list, or the delimiter and separator characters may be specified by the systems programmer as almost any characters he desires. One...

  18. Use of minicomputer for medical instruction: radiologic technology

    Microsoft Academic Search

    S M Brahmavar

    1974-01-01

    A PDP-8 computer obtained primarily for radiation treatment planning of cancer patients and presently used as a hospital-based tumour registry system, is further developed as a teaching aid in medical instruction of radiologic technologists. The text of the radiologic physics course is stored on Dectape in the form of basic questions with an answer to be selected from the multiple

  19. Mini-computers in a social science instructional context

    Microsoft Academic Search

    Ronald E. Anderson; Jonathan Gross

    1972-01-01

    Technological change is rapidly moving us into the age of the pocket computer, but in the early 70s we can be content to work with computers no smaller than typewriters. Little computers have long been popular in behavioral laboratories; only recently have they been seriously considered for broader purposes such as instruction or modeling. In this paper we will examine

  20. The addition of multichannel analog-to-digital conversion capability to a minicomputer facility

    E-print Network

    Malek-Shahmirzadi, Homayoun

    1972-01-01

    [Snippet garbled in extraction: it reproduces Fig. 12, the ADC digital output connections (bits 15-9) to the microinterface card and the 7-segment decoder driver, followed by the Boolean logic equations for the multiplexer control inputs.]

  1. Retrieving Records from a Gigabyte of Text on a Mini-Computer Using Statistical Ranking

    Microsoft Academic Search

    Donna Harman; Gerald Candela

    1990-01-01

    Statistically based ranked retrieval of records using keywords provides many advantages over traditional Boolean retrieval methods, especially for end users. This approach to retrieval, however, has not seen widespread use in large operational retrieval systems. To show the feasibility of this retrieval methodology, research was done to produce very fast search techniques using these ranking algorithms, and
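
    As a generic illustration of statistically based ranking (not Harman and Candela's algorithm or weighting scheme), the sketch below scores documents with a simple tf-idf sum over the query terms and returns them best first.

      import math
      from collections import Counter

      def rank(documents, query_terms):
          """Return document indices ordered by a tf-idf score for the query."""
          doc_terms = [Counter(doc.lower().split()) for doc in documents]
          n_docs = len(documents)
          scores = []
          for terms in doc_terms:
              score = 0.0
              for q in (t.lower() for t in query_terms):
                  df = sum(1 for d in doc_terms if q in d)      # document frequency
                  if df == 0 or q not in terms:
                      continue
                  tf = 1.0 + math.log(terms[q])                 # dampened term frequency
                  idf = math.log(n_docs / df)                   # rarity weight
                  score += tf * idf
              scores.append(score)
          return sorted(range(n_docs), key=lambda i: scores[i], reverse=True)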

  2. A stand-alone alphanumeric CRT teleprocessor unit for a Hewlett-Packard 2114B minicomputer

    E-print Network

    Burrage, George Richard

    1973-01-01

    [Snippet garbled in extraction: a list of figures covering the DAC summer network designs, the system block diagram, the memory section block diagram, the display processor block diagram, and the CRT beam racetrack.] The display processor can be broken down into four functional blocks. As shown in Fig. 2.3, they are a character generator, a video amplifier, a carriage control section, and a digital-to-analog conversion (DAC) section...

  3. Minicomputers and microprocessors in optical systems; Proceedings of the Seminar, Washington, DC, April 8, 9, 1980

    NASA Astrophysics Data System (ADS)

    Koliopoulos, C. L.; Zweibaum, F. M.

    1980-01-01

    Spectroscopic applications are considered, taking into account microprocessors in spectroscopic applications of imaging devices, design considerations for a microprocessor-based multipurpose spectral radiometer system, applying a microprocessor-controlled spectral radiometer system to field measurements, a spectral analysis microcomputer, calculator-assisted evaluation of reflectance characteristics of fluorescent films, and a micro-computerized facility for on-line spectroscopic plasma diagnostics. Attention is also given to sensors, control applications, and instruments. Design considerations for a solid-state image sensing system are discussed along with system requirements for computer-aided testing and evaluation of solid-state imaging devices, a computer-controlled laser bore scanner, a microprocessor-controlled photodetector test console, an experimental image alignment system, a microcomputer system for controlling an infrared scanning camera, closed-loop active optical system control, a fully integrated microprocessor-controlled surveying instrument, and a new approach to high-precision phase measurement interferometry.

  4. Mini-Computers and the Building Trades: A Guide for Teachers of Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    Asplen, Donald; And Others

    These training materials are designed to help vocational education teachers introduce students to the utilization and installation of mini- and microcomputers in residential and small business buildings. The guide consists of two chapters. Chapter 1 contains general materials, designed to promote awareness, and chapter 2 contains materials which are…

  5. Study of calculated and measured time dependent delayed neutron yields. [TX, for calculating delayed neutron yields; MATINV, for matrix inversion; in FORTRAN for LSI-11 minicomputer]

    SciTech Connect

    Waldo, R.W.

    1980-05-01

    Time-dependent delayed neutron emission is of interest in reactor design, reactor dynamics, and nuclear physics studies. The delayed neutrons from neutron-induced fission of ²³²U, ²³⁷Np, ²³⁸Pu, ²⁴¹Am, ²⁴²ᵐAm, ²⁴⁵Cm, and ²⁴⁹Cf were studied for the first time. The delayed neutron emission from ²³²Th, ²³³U, ²³⁵U, ²³⁸U, ²³⁹Pu, ²⁴¹Pu, and ²⁴²Pu was measured as well. The data were used to develop an empirical expression for the total delayed neutron yield. The expression gives accurate results for a large variety of nuclides from ²³²Th to ²⁵²Cf. The data measuring the decay of delayed neutrons with time were used to derive another empirical expression predicting the delayed neutron emission with time. It was found that nuclides with similar mass-to-charge ratios have similar decay patterns. Thus the relative decay pattern of one nuclide can be established by any measured nuclide with a similar mass-to-charge ratio. A simple fission product yield model was developed and applied to delayed neutron precursors. It accurately predicts observed yield and decay characteristics. In conclusion, it is possible to estimate not only the total delayed neutron yield for a given nuclide but the time-dependent nature of the delayed neutrons as well. Reactors utilizing recycled fuel or burning actinides are likely to have inventories of fissioning nuclides that have not been studied until now. The delayed neutrons from these nuclides can now be incorporated so that their influence on the stability and control of reactors can be delineated. 8 figures, 39 tables.

  6. 32 CFR Appendix A to Part 310 - Safeguarding Personally Identifiable Information (PII)

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...outside the data processing installation (such as, remote job entry stations, terminal stations, minicomputers, microprocessors, and similar activities). 3. IT facilities authorized to process classified material have adequate...

  7. 32 CFR Appendix A to Part 310 - Safeguarding Personally Identifiable Information (PII)

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...outside the data processing installation (such as, remote job entry stations, terminal stations, minicomputers, microprocessors, and similar activities). 3. IT facilities authorized to process classified material have adequate...

  8. 32 CFR Appendix A to Part 310 - Safeguarding Personally Identifiable Information (PII)

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...outside the data processing installation (such as, remote job entry stations, terminal stations, minicomputers, microprocessors, and similar activities). 3. IT facilities authorized to process classified material have adequate...

  9. 32 CFR Appendix A to Part 310 - Safeguarding Personally Identifiable Information (PII)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...outside the data processing installation (such as, remote job entry stations, terminal stations, minicomputers, microprocessors, and similar activities). 3. IT facilities authorized to process classified material have adequate...

  10. 32 CFR Appendix A to Part 310 - Safeguarding Personally Identifiable Information (PII)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...outside the data processing installation (such as, remote job entry stations, terminal stations, minicomputers, microprocessors, and similar activities). 3. IT facilities authorized to process classified material have adequate...

  11. Computer program and user documentation medical data tape retrieval system

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    This volume provides several levels of documentation for the program module of the NASA medical directorate mini-computer storage and retrieval system. A biomedical information system overview describes some of the reasons for the development of the mini-computer storage and retrieval system. It briefly outlines all of the program modules which constitute the system.

  12. Automation of a solar domestic hot water heating system test

    Microsoft Academic Search

    G. T. Meares; W. W. Youngblood

    1982-01-01

    A minicomputer-based system for the automated testing of solar domestic hot water systems is described. This system was developed for use with the ASHRAE 95-P test method and the associated rating standard developed by the Tennessee Valley Authority for Middle Tennessee. The minicomputer controls the test and, currently, scans 12 channels of data every four seconds for each of up

  13. Downsizing a database platform for increased performance and decreased costs

    SciTech Connect

    Miller, M.M.; Tolendino, L.F.

    1993-06-01

    Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.

  14. Cooperative processing data bases

    NASA Technical Reports Server (NTRS)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  15. Table-lookup algorithm for pattern recognition: ELLTAB (Elliptical Table)

    NASA Technical Reports Server (NTRS)

    Jones, W. C., III; Eppler, W. G.

    1975-01-01

    Remotely sensed unit is assigned to category by merely looking up its channel readings in four-dimensional table. Approach makes it possible to process multispectral scanner data using a minicomputer.
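
    A hedged sketch of the idea (not the ELLTAB code): quantize the four channel readings to bin indices and read the category directly from a precomputed four-dimensional table. The table size, bin counts, and scaling below are hypothetical.

      import numpy as np

      def classify(table, readings, bins_per_channel):
          """Assign a category by table lookup: clamp each channel reading to a
          bin index and index the 4-D category table with the resulting tuple."""
          idx = tuple(max(0, min(int(r), b - 1))
                      for r, b in zip(readings, bins_per_channel))
          return table[idx]

      # Hypothetical 32-level table, readings already scaled to bin units:
      bins = (32, 32, 32, 32)
      table = np.zeros(bins, dtype=np.uint8)        # category codes filled elsewhere
      category = classify(table, (12.3, 4.0, 30.9, 7.5), bins)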

  16. Transcription of the Workshop on General Aviation Advanced Avionics Systems

    NASA Technical Reports Server (NTRS)

    Tashker, M. (editor)

    1975-01-01

    Papers are presented dealing with the design of reliable, low cost, advanced avionics systems applicable to general aviation in the 1980's and beyond. Sensors, displays, integrated circuits, microprocessors, and minicomputers are among the topics discussed.

  17. Design and performance of a large vocabulary discrete word recognition system. Volume 2: Appendixes. [flow charts and users manual

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The users manual for the word recognition computer program contains flow charts of the logical diagram, the memory map for templates, the speech analyzer card arrangement, minicomputer input/output routines, and assembly language program listings.

  18. Precise Determination of the Absorption Maximum in Wide Bands

    ERIC Educational Resources Information Center

    Eriksson, Karl-Hugo; And Others

    1977-01-01

    A precise method of determining absorption maxima where Gaussian functions occur is described. The method is based on a logarithmic transformation of the Gaussian equation and is suited for a mini-computer. (MR)
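
    The record's method rests on the fact that the logarithm of a Gaussian band is a parabola, so the band maximum sits at the vertex of a quadratic fitted to ln(y). A minimal sketch of that calculation (the fit here is ordinary least squares; the paper's exact procedure may differ):

      import numpy as np

      def absorption_maximum(x, y):
          """ln(A exp(-(x - x0)**2 / (2 s**2))) = a*x**2 + b*x + c, a parabola,
          so the peak position is the vertex x0 = -b / (2a)."""
          a, b, _ = np.polyfit(x, np.log(y), 2)
          return -b / (2.0 * a)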

  19. Automated chromosome analysis

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Frieden, H. J.; Johnson, E. T.; Rennie, P. A.; Wall, R. J.

    1979-01-01

    Minicomputer-controlled system automatically prepares and analyses blood samples and displays karyotype in pictorial form as primary output. System accuracy is assured by operator interaction at key points during process. System can process up to 576 specimens per day.

  20. The Electronic Hermit: Trends in Library Automation.

    ERIC Educational Resources Information Center

    LaRue, James

    1988-01-01

    Reviews trends in library software development including: (1) microcomputer applications; (2) CD-ROM; (3) desktop publishing; (4) public access microcomputers; (5) artificial intelligence; (6) mainframes and minicomputers; and (7) automated catalogs. (MES)

  1. Vault Safety and Inventory System users manual, PRIME 2350. Revision 1

    SciTech Connect

    Downey, N.J.

    1994-12-14

    This revision is issued to request review of the attached document: VSIS User Manual, PRIME 2350, which provides user information for the operation of the VSIS (Vault Safety and Inventory System). It describes operational aspects of the Prime 2350 minicomputer and vault data acquisition equipment. It also describes the User's Main Menu and menu functions, including REPORTS. Also, system procedures for the Prime 2350 minicomputer are covered.

  2. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology that transformed the central computer system and led, finally, to the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro and minicomputers.

  3. Automation of a guarded hot plate thermal conductivity instrument

    SciTech Connect

    Holland, L.L.

    1980-06-01

    The Thermo-Physics Corporation's GP-1800 guarded hot plate thermal conductivity instrument has been automated using a Digital Equipment Corporation PDP 11/35 minicomputer with an Industrial Control Subsystem Remote. Automation included constructing a hardware link between the instrument and the minicomputer system and designing, writing, and documenting software to perform equipment control, data acquisition, data reduction, and report generation. The software was designed and written so that non-programmers can run the thermal conductivity experiment.

  4. A Computer Based System for Data Acquisition and Control of Scientific Experiments on Remote Platforms

    Microsoft Academic Search

    K. Birch; E. Harrison; S. Beal

    1976-01-01

    The observation and measurement of environmental processes is made simple and economical using a minicomputer based system which controls activities and logs data from a remote unattended platform by 20 000 baud, digital, cable telemetry. The remote station electronics is bus oriented and provides A/D and D/A conversion, logical input/output and servo motor control. The minicomputer, an Interdata 74 with

  5. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.

  6. EZPIX: a tablet entry method for computer-generated slides, drawings, and graphs

    SciTech Connect

    Williams, J.M.

    1984-09-01

    This manual outlines a method for making slides, drawings, maps, graphs, etc., with the touch of a button. A minicomputer-based program, called EZPIX, connects the host computer graphics system to a tablet on which one can digitize input and specify graphics commands from a menu. A minicomputer terminal serves as editor and provides local graphics output. The magic button does practically everything else. It signs you on, starts, builds, ends, executes and saves your command file, and signs you off. It even allows you to make a composite picture from an assortment of inappropriately sized originals. The graph and pie chart modes are handy, too.

  7. A program for mass spectrometer control and data processing analyses in isotope geology; written in BASIC for an 8K Nova 1210 computer

    USGS Publications Warehouse

    Stacey, J.S.; Hope, J.

    1975-01-01

    A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. Assembly language driver program and interface hardware descriptions for the Nova 1210 are included.

  8. [Current methods in automated statistical analysis].

    PubMed

    Praganov, D; Kalpazanov, I; Simeonov, G

    1983-01-01

    The advantages and disadvantages of the so-called minicomputers (e.g. type NOVA) and microcomputers (e.g. type NR 95) are compared for their use in statistical analysis. In spite of some advantages of microcomputers - autonomy, possibilities for immediate use, etc. - the minicomputers of type NOVA were found better suited to the organized statistical processing existing at the Institute of Hygiene and Occupational Diseases: they relieve the non-mathematical specialists of non-specific work and direct them to the most appropriate statistical methods, so that expenditures are lower and reliability higher. PMID:6672822

  9. Design of a microprocessor-based Control, Interface and Monitoring (CIM) unit for turbine engine controls research

    NASA Technical Reports Server (NTRS)

    Delaat, J. C.; Soeder, J. F.

    1983-01-01

    High speed minicomputers were used in the past to implement advanced digital control algorithms for turbine engines. These minicomputers are typically large and expensive. It is desirable for a number of reasons to use microprocessor-based systems for future controls research. They are relatively compact, inexpensive, and are representative of the hardware that would be used for actual engine-mounted controls. The Control, Interface, and Monitoring Unit (CIM) contains a microprocessor-based controls computer, the necessary interface hardware, and a system for monitoring the controls computer while it is running an engine. It is presently being used to evaluate an advanced turbofan engine control algorithm.

  10. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low cost computer systems for information retrieval and analysis, the testing of minicomputer based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  11. Heliostat Beam Characterization System

    Microsoft Academic Search

    E. D. Thalhammer; G. S. Phipps

    1979-01-01

    The Beam Characterization System utilizes video radiometer techniques to quantitatively describe the solar energy projected by a heliostat. This system is designed to evaluate prototype heliostats and to improve the performance of the Central Receiver Test Facility heliostats. The system consists of a beam target, video camera, analog image analyzer, calibration system, video digitizer and a minicomputer system. The calibration

  12. Telescopes: Control by Software for Amateurs

    NASA Astrophysics Data System (ADS)

    Bartels, M.; Murdin, P.

    2003-04-01

    In 1970 an amateur might dream of access to a state-of-the-art small institutional telescope run by a $150 000 minicomputer with 4K memory using assembly language programming. By the mid-1980s, small numbers of amateurs were building computer-operated telescopes thanks to hardware advances in the form of the personal computer....

  13. Microprocessors in U.S. Electrical Engineering Departments, 1974-1975.

    ERIC Educational Resources Information Center

    Sloan, M. E.

    Drawn from a survey of engineering departments known to be teaching microprocessor courses, this paper shows that the adoption of microprocessors by Electrical Engineering Departments has been rapid compared with their adoption of minicomputers. The types of courses that are being taught can be categorized as: surveys of microprocessors, intensive…

  14. Synchronous multi-microprocessor system for implementing digital signal processing algorithms

    SciTech Connect

    Barnwell, T.P. III; Hodges, C.J.M.

    1982-01-01

    This paper discusses the details of a multi-microprocessor system design as a research facility for studying multiprocessor implementation of digital signal processing algorithms. The overall system, which consists of a control microprocessor, eight satellite microprocessors, a control minicomputer, and extensive distributed software, has proven to be an effective tool in the study of multiprocessor implementations. 5 references.

  15. Modern programming language

    NASA Technical Reports Server (NTRS)

    Feldman, G. H.; Johnson, J. A.

    1980-01-01

    Structural-programming language is especially tailored for producing assembly language programs for MODCOMP II and IV mini-computers. Modern programming language consists of a set of simple and powerful control structures that include sequencing, alternative selection, looping, sub-module linking, comment insertion, statement continuation, and compilation termination capabilities.

  16. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  17. The application of microprocessors to strapdown inertial navigation

    NASA Technical Reports Server (NTRS)

    Napjus, G. A.

    1974-01-01

    The paper describes the nature of strapdown navigators and the computational requirements associated with them. A current system design is then described in which three limited-capability microcomputers perform the tasks previously assigned to a powerful minicomputer. In addition, a technique employing dedicated microprocessors in place of conventional analog electronics in the gyroscope control loops is discussed.

  18. CEBAF control system

    SciTech Connect

    Bork, R.; Grubb, C.; Lahti, G.; Navarro, E.; Sage, J.

    1989-01-01

    A logic-based computer control system is in development at CEBAF. This Unix/C language software package, running on a distributed, hierarchical system of workstation and supervisory minicomputers, interfaces to hardware via CAMAC. Software aspects to be covered are ladder logic, interactive database generation, networking, and graphic user interfaces. 1 fig.

  19. CEBAF Control System

    SciTech Connect

    Bork, Rolf; Lahti, George; Navarro, Edwin; Grubb, Caroline; Sage, Joan; Moore, T.

    1988-10-01

    A logic-based computer control system is in development at CEBAF. This Unix/C language software package, running on a distributed, hierarchical system of workstation and supervisory minicomputers, interfaces to hardware via CAMAC. Software aspects to be covered are ladder logic, interactive database generation, networking, and graphic user interfaces.

  20. An Experimental CMI System on the PDP 11/20.

    ERIC Educational Resources Information Center

    Espeland, L. Roger; Walker, Gerald S.

    A computer-managed instructional (CMI) system is being developed for use in investigating a CMI environment for Air Force technical training using the PDP 11/20 minicomputer. Software and hardware interfaces are now available for 24k core memory with an additional 128k random access disc storage. Hardware interfaces are complete for the student…

  1. Interactive computer graphics at Ford Motor Company

    Microsoft Academic Search

    Frank W. Bliss

    1980-01-01

    Over the past fourteen years the Ford Motor Company actively pursued the development and use of interactive computer graphics in the design and manufacture of automobiles. The success of Ford's pioneering efforts in the area of minicomputer graphics led to widespread acceptance and support for graphics within the corporation. This in turn led to a rapid expansion of computer graphics

  2. The APPS-IV analytical stereoplotter with superposition graphics: a unique tool to build and revise GIS data bases

    Microsoft Academic Search

    DAVID C. GOODRICH

    The APPS-IV with Superposition Graphics (SUPER-P) interfaced with a minicomputer and the AUTOGIS geographic information system (GIS) provides an ideal facility to collect, update and analyze digital database information. The database information can be compiled from stereophotography obtained from a large number of sensing systems ranging from frame photography to some of the most complex dynamic imaging systems including

  3. A Report on the Loading of MARC Format Bibliographic Records into HyperCard.

    ERIC Educational Resources Information Center

    Rosenberg, Jason B.; Borgman, Christine L.

    1991-01-01

    Outlines a process for downloading MARC format bibliographic data into a form readable for an Apple Macintosh computer running HyperCard software. Loading procedures for two data sources--an OCLC format tape and records from UCLA's ORION public access catalog--are discussed, and the use of a minicomputer system is considered. (eight references)…

  4. Measurement of the absolute value of gravitational acceleration by a ballistic laser gravimeter

    Microsoft Academic Search

    G. P. Arnautov; E. N. Kalish; F. I. Kokoulin; V. P. Koronkevich; A. I. Lokhmatov; I. S. Malyshev; Iu. E. Nesterikhin; L. A. Petrashevich; M. G. Smirnov; Iu. F. Stus

    1979-01-01

    Description of a unique device for high-accuracy measurements of absolute gravitational acceleration using a ballistic technique. The use of frequency stabilized lasers, a rubidium frequency standard, a digital electronic counter, and a minicomputer made it possible to attain a relative error of 4 parts in a billion over a 24-hr interval of 5000 measurements. Experimental data are shown for measurements

  5. The Use of a Microcomputer Based Array Processor for Real Time Laser Velocimeter Data Processing

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1990-01-01

    The application of an array processor to laser velocimeter data processing is presented. The hardware is described along with the method of parallel programming required by the array processor. A portion of the data processing program is described in detail. The increase in computational speed of a microcomputer equipped with an array processor is illustrated by comparative testing with a minicomputer.

  6. DATMAN FORTRAN USER'S GUIDE VERSION 3.0

    EPA Science Inventory

    DATMAN is a data management system which runs on a variety of minicomputers. Currently, versions are supported on the following computers: PRIME and PDP 11/70 under IAS. DATMAN has facilities for creating data bases, retrieving selected data from data bases, retrieving se...

  7. Computer-processed potentiometric titration for the determination of calcium and magnesium in sea water

    Microsoft Academic Search

    Satoru Kanamori; Hisashi Ikegami

    1980-01-01

    An improved potentiometric titration method for the determination of calcium and magnesium in sea water has been newly devised. In this method, a mini-computer is used for the automation of titrations, and ion-selective electrodes are used as an end-point detector. Calcium is determined by titration with EGTA, and total alkaline earth metals (magnesium + calcium + strontium) by titration with
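
    Assuming magnesium is then obtained by difference, as the record implies (calcium from the EGTA titration, total alkaline earths from the second titration, strontium from an independent estimate), the final step is a subtraction; a sketch under that assumption:

      def magnesium_by_difference(total_alkaline_earths, calcium, strontium):
          """All concentrations in the same units (e.g. mmol per kg of sea water):
          Mg = (Mg + Ca + Sr) - Ca - Sr."""
          return total_alkaline_earths - calcium - strontium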

  8. Operating manual for the RRL 8 channel data logger

    NASA Technical Reports Server (NTRS)

    Paluch, E. J.; Shelton, J. D.; Gardner, C. S.

    1979-01-01

    A data collection device which takes measurements from external sensors at user specified time intervals is described. Three sensor ports are dedicated to temperature, air pressure, and dew point. Five general purpose sensor ports are provided. The user specifies when the measurements are recorded as well as when the information is read or stored in a minicomputer or a paper tape.

  9. Technological Discontinuities and Organizational Environments.

    ERIC Educational Resources Information Center

    Tushman, Michael L.; Anderson, Philip

    1986-01-01

    Technological effects on environmental conditions are analyzed using longitudinal data from the minicomputer, cement, and airline industries. Technology evolves through periods of incremental change punctuated by breakthroughs that enhance or destroy the competence of firms. Competence-destroying discontinuities increase environmental turbulence;…

  10. Technological Discontinuities and Dominant Designs: A Cyclical Model of Technological Change.

    ERIC Educational Resources Information Center

    Anderson, Philip; Tushman, Michael L.

    1990-01-01

    Based on longitudinal studies of the cement, glass, and minicomputer industries, this article proposes a technological change model in which a technological breakthrough, or discontinuity, initiates an era of intense technical variation and selection, culminating in a single dominant design and followed by a period of incremental technical…

  11. A very large PC LAN as the basis for a hospital information system

    Microsoft Academic Search

    John P. Glaser; Robert F. Beckley; Pasha Roberts; James K. Marra; Frederick L. Hiltz; Jean Hurley

    1991-01-01

    Brigham and Women's Hospital is converting its financial, administrative and clinical information systems from a mini-computer environment to a platform based on MUMPS and a network of several thousand personal computers. This article describes the project rationale and status and provides an overview of the architecture of the new system. The initial results of the project indicate that the personal

  12. Computer Series, 51: Bits and Pieces, 20.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1984-01-01

    Describes: Apple stereochemistry program; CNDO/2-INDO mini-computer calculations; direct linear plot procedure for enzyme kinetics calculations; construction of nonlinear Scatchard plots; simulation of mass spectral envelopes of polyisotopic elements; graphics with a dot-matrix printer; MINC computer in the physical chemistry laboratory; hallway…

  13. NEUTRON ACTIVATION ANALYSIS FOR SIMULTANEOUS DETERMINATION OF TRACE ELEMENTS IN AMBIENT AIR COLLECTED ON GLASS-FIBER FILTERS

    EPA Science Inventory

    Arsenic and 25 other elements are simultaneously determined in ambient air samples collected on glass-fiber filter composites at 250 United States sites. The instrumental neutron activation analysis (NAA) technique combined with the power of a dedicated mini-computer resulted in...

  14. Optimizing remote offshore drilling operations

    Microsoft Academic Search

    W. F. Deerhake; F. Khalaf; J. A. Seehafer

    1981-01-01

    Gulf of Suez Petroleum Company's experience in using mini-computers as an aid in controlling drilling operations has been an unqualified success. Current uses include optimization of drilling operations, storage and retrieval of well data and word processing of standard programs. As a result, overall drilling costs, problems and manpower requirements have been lessened. This work discusses the computer system, its

  15. COMPUTER-CONTROLLED, REAL-TIME AUTOMOBILE EMISSIONS MONITORING SYSTEM

    EPA Science Inventory

    A minicomputer controlled automotive emissions sampling and analysis system (the Real-Time System) was developed to determine vehicular modal emissions over various test cycles. This data acquisition system can sample real-time emissions at a rate of 10 samples/s. A buffer utiliz...

  16. An Interactive, Interdisciplinary, On-Line Graphics System for Presenting and Manipulating Directed Graphs.

    ERIC Educational Resources Information Center

    Beazley, William; And Others

    An interactive graphics system has been implemented for tutorial purposes and for research in man-machine communication of structural digraphs. An IMLAC intelligent terminal with lightpen input is used in conjunction with a NOVA minicomputer. Successful applications in linguistics and engineering problem solving are discussed, the latter in detail.…

  17. Digital Image Acquisition and Processing Using a Scanning Transmission Electron Microscope

    Microsoft Academic Search

    Jeffrey Harold Butler

    1981-01-01

    The aim of this work is two-fold: to help develop a digital interface linking a Scanning Transmission Electron Microscope to a minicomputer, and to apply this system, in conjunction with a sophisticated detector assembly, in a variety of new imaging modes. The construction of the interface represents a cooperative effort in design between Martin Strahm, a first-class digital electronics

  18. An inexpensive vehicle speed detector

    NASA Technical Reports Server (NTRS)

    Broussard, P., Jr.

    1973-01-01

    Low-power minicomputer can plug into automobile cigarette lighter. It measures time it takes observed car to travel premeasured distance and provides immediate readout of speed. Potentially, detector could be manufactured for less than $200 per unit and would have very low maintenance cost.
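
    The underlying arithmetic is simply speed equal to the premeasured distance divided by the elapsed time; a one-line sketch with a metric conversion (the 50 m trap length in the example is hypothetical):

      def speed_kmh(distance_m, elapsed_s):
          """v = d / t, converted from m/s to km/h."""
          return (distance_m / elapsed_s) * 3.6

      # e.g. a car covering a 50 m trap in 3.0 s gives speed_kmh(50, 3.0) ≈ 60 km/h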

  19. The Carlsberg automatic meridian circle and the plans for Anglo-Danish collaboration

    Microsoft Academic Search

    H. J. F. Olsen; L. Helmer

    1978-01-01

    The Carlsberg automatic meridian circle at Brorfelde is controlled by an HP 2100 minicomputer and incorporates an active photoelectric slit micrometer, a meteorological data reading system with rain detector, a photoelectric collimation micrometer, and a photoelectric circle reading system accurate to 0.04 arc sec which includes a standard A/D computer interface board and stepping motors. Furthermore, the telescope setting system

  20. Cloud Computing and the Power to Choose

    ERIC Educational Resources Information Center

    Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo

    2010-01-01

    Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…

  1. Introduction to acoustic emission

    NASA Technical Reports Server (NTRS)

    Possa, G.

    1983-01-01

    Typical acoustic emission signal characteristics are described and techniques which localize the signal source by processing the acoustic delay data from multiple sensors are discussed. The instrumentation, which includes sensors, amplifiers, pulse counters, a minicomputer and output devices is examined. Applications are reviewed.

  2. Mini or Maxi: Which Computer Is Right for You?

    ERIC Educational Resources Information Center

    Perry, Nancy N.

    1979-01-01

    Selecting a type of computer system for instructional purposes does not depend upon the kind of instruction to be provided but on the features of the maxi- and minicomputer systems themselves. The features of each system must be considered in the context in which it will be used. (Author/CMV)

  3. A practical Hadamard transform spectrometer for astronomical application

    NASA Technical Reports Server (NTRS)

    Tai, M. H.

    1977-01-01

    The mathematical properties of Hadamard matrices and their application to spectroscopy are discussed. A comparison is made between Fourier and Hadamard transform encoding in spectrometry. The spectrometer is described and its laboratory performance evaluated. The algorithm and programming of inverse transform are given. A minicomputer is used to recover the spectrum.
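
    As a simplified numerical illustration of the inverse-transform step (practical instruments usually use 0/1 S-matrix masks derived from Hadamard matrices, which this sketch ignores): with encoded measurements y = H x and the orthogonality H Hᵀ = n I, the spectrum is recovered as x = Hᵀ y / n.

      import numpy as np

      def hadamard(n):
          """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
          H = np.array([[1]])
          while H.shape[0] < n:
              H = np.block([[H, H], [H, -H]])
          return H

      def recover_spectrum(measurements):
          """Invert y = H x using H @ H.T == n * I, i.e. x = H.T @ y / n."""
          y = np.asarray(measurements, dtype=float)
          n = y.size
          H = hadamard(n)
          return H.T @ y / n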

  4. A Reprogrammable Industrial Robot Control System

    Microsoft Academic Search

    Shyh J. Wang

    1976-01-01

    Aspects of a control system that employs a minicomputer for the programming and control of industrial robots are described. The control system hierarchy, the system interface, and the software controller design are presented along with the mode of operation and the methods of robot training and on-line testing. The features of the robot control language (Alfa) and a typical program

  5. The Rise of K-12 Blended Learning: Profiles of Emerging Models

    ERIC Educational Resources Information Center

    Staker, Heather

    2011-01-01

    Some innovations change everything. The rise of personal computers in the 1970s decimated the mini-computer industry. TurboTax forever changed tax accounting, and MP3s made libraries of compact discs obsolete. These innovations bear the traits of what Harvard Business School Professor Clayton M. Christensen terms a "disruptive innovation."…

  6. The prediction of acoustical particle motion using an efficient polynomial curve fit procedure

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A procedure is examined whereby the acoustic modal parameters, natural frequencies and mode shapes, in the cavities of transportation vehicles are determined experimentally. The acoustic mode shapes are described in terms of the particle motion. The acoustic modal analysis procedure is tailored to existing minicomputer based spectral analysis systems.

  7. A system for the management of requests at an image data bank. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Debarrosaguirre, J. L. (principal investigator)

    1984-01-01

    An automated system was implemented to supersede existing manual procedures in fulfilling user requests made to a remote sensing data bank, concerning specifically LANDSAT imagery. The system controls the several production steps from request entry to the shipment of each final product. Special solutions and techniques were employed due to the severe limitations, in both hardware and software of the host minicomputer system.

  8. A Computer-Based Instructional Management System for General Psychology.

    ERIC Educational Resources Information Center

    Halcomb, Charles G.; And Others

    1989-01-01

    Describes the development and impact of minicomputer-based system of instructional management for the general psychology program at Texas Tech University (Lubbock). Discusses how a program might be used for educational research in other university settings. Suggests the program can be used in either large or small departments. (KO)

  9. A Retrospective on the Dorado, A High-Performance Personal Computer

    Microsoft Academic Search

    Kenneth A

    In late 1975, members of the Xerox Palo Alto Research Center embarked on the specification of a high-performance successor to the Alto personal minicomputer, in use since 1973. After four years, the resulting machine, called the Dorado, was in use within the research community at PARC. This paper begins with an overview of the design goals, architecture, and implementation of

  10. Curriculum development in Informatics in Lithuania David Gilbert

    E-print Network

    Gilbert, David

    was predominantly rural with an agricultural economy and a standard of living comparable to that of Denmark. During its enforced membership of the Soviet Union, Lithuania became dependent on light and medium industry, exporting minicomputers and machine tools as well as agricultural goods. During the Soviet rule

  11. Supervisors with Micros: Trends and Training Needs.

    ERIC Educational Resources Information Center

    Bryan, Leslie A., Jr.

    1986-01-01

    Results of a study conducted by Purdue University concerning the use of computers by supervisors in manufacturing firms are presented and discussed. Examines access to computers, minicomputers versus mainframes, training time on computers, replacement of staff, creation of personnel problems, and training methods. (CT)

  12. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  13. Hierarchical function-structured and autonomous control systems for fossil power plants

    Microsoft Academic Search

    M. Iioka; A. Sugano; S. Nigawara

    1988-01-01

    A distributed digital monitoring and control system suitable for fossil-fueled power plants is described along with the associated minicomputers, communication devices, and man-machine interface devices. The corresponding hardware and software are also discussed. The system covers the plant-wide (distributed) monitoring and control functions including data acquisition, boiler control, burner management, turbine control, auxiliaries control, and protection

  14. Real time computer control of 5 megawatts of solar thermal energy

    Microsoft Academic Search

    E. D. Thalhammer

    1978-01-01

    The Solar Thermal Test Facility (STTF) operates under the control of a nine-machine distributed minicomputer network. The prime functions of this network are heliostat controls, heat rejection system controls, and data acquisition. This paper describes the control computer. This computer's main tasks are: (1) the sun position calculation; (2) automatic heliostat command execution; (3) graphic display of heliostat status and

  15. Sun Series program for the REEDA System. [predicting orbital lifetime using sunspot values

    NASA Technical Reports Server (NTRS)

    Shankle, R. W.

    1980-01-01

    Modifications made to data bases and to four programs in a series of computer programs (Sun Series) which run on the REEDA HP minicomputer system to aid NASA's solar activity predictions used in orbital lifetime predictions are described. These programs utilize various mathematical smoothing techniques and perform statistical and graphical analysis of various solar activity data bases residing on the REEDA System.

  16. ULTRASONIC BIOLOGICAL EFFECT EXPOSURE SYSTEM

    E-print Network

    O'Brien, W. D., Jr.; Christman, C. L.; Yarrow, S. (University of Illinois at Urbana-Champaign)

    [Snippet garbled in extraction; recoverable content:] The signal is linearly amplified to the desired level by a high-power, broadband amplifier, and a minicomputer controls the exposure time and the net electrical power delivered to the ultrasonic transducer assembly.

  17. Mobile Computer-Assisted-Instruction in Rural New Mexico.

    ERIC Educational Resources Information Center

    Gittinger, Jack D., Jr.

    The University of New Mexico's three-year Computer Assisted Instruction Project established one mobile and five permanent laboratories offering remedial and vocational instruction in winter, 1984-85. Each laboratory has a Degem learning system with minicomputer, teacher terminal, and 32 student terminals. A Digital PDP-11 host computer runs the…

  18. Maxicalculators for the Whole School. Illinois Series on Educational Applications of Computers. Number 5.

    ERIC Educational Resources Information Center

    Doring, Richard; Hicks, Bruce

    A brief review is presented of the characteristics of four maxicalculators (HP 9830, Wang 2200, IBM 5100, MCM/700) and two minicomputers (Classic, Altair 8800). The HP 9830 and the Wang 2200 are thought to be the best adapted to serve entire schools and their unique properties are discussed. Some criteria of what should be taken into account in…

  19. Visible and infrared spin scanning radiometer /VISSR/ atmospheric sounder /VAS/ ground data system

    Microsoft Academic Search

    J. T. Dalton; R. K. Jamros; D. P. Helfer; D. R. Howell

    1981-01-01

    The interactive system developed at NASA/Goddard Space Flight Center to receive data from the infrared radiometer on GOES-4 in near real time and to perform interactive display and analysis of the 12-channel infrared imagery is described. The system is minicomputer based and uses a menu approach in guiding the analyst through spacecraft instrument programming, area and band selection, image acquisition,

  20. Mission of the Future. Proceedings of the Annual Convention of the Association for the Development of Computer-Based Instructional Systems. Volume III: Users Interest Groups (San Diego, California, February 27 to March 1, 1979).

    ERIC Educational Resources Information Center

    Association for the Development of Computer-based Instructional Systems.

    The third of three volumes of papers presented at the 1979 ADCIS convention, this collection includes 30 papers presented to special interest groups--implementation, minicomputer users, National Consortium for Computer Based Music Instruction, and PLATO users. Papers presented to the implementation interest group were concerned with faculty…

  1. The Use of Computer Networks in Data Gathering and Data Analysis.

    ERIC Educational Resources Information Center

    Yost, Michael; Bremner, Fred

    This document describes the review, analysis, and decision-making process that Trinity University, Texas, went through to develop the three-part computer network that they use to gather and analyze EEG (electroencephalography) and EKG (electrocardiogram) data. The data are gathered in the laboratory on a PDP-1124, an analog minicomputer. Once…

  2. A High Resolution Graphic Input System for Interactive Graphic Display Terminals. Appendix B.

    ERIC Educational Resources Information Center

    Van Arsdall, Paul Jon

    The search for a satisfactory computer graphics input system led to this version of an analog sheet encoder which is transparent and requires no special probes. The goal of the research was to provide high resolution touch input capabilities for an experimental minicomputer based intelligent terminal system. The technique explored is compatible…

  3. DATA ACQUISITION SYSTEM FOR RAPID KINETIC EXPERIMENTS

    EPA Science Inventory

    A data acquisition system has been developed to collect, analyze and store large volumes of rapid kinetic data measured from a stopped-flow spectrophotometer. A digital minicomputer, with an A/D converter, tape drive unit and formatter, analog recorder, oscilloscope, and input/ou...

  4. Radioactivities in returned lunar materials and in meteorites

    NASA Technical Reports Server (NTRS)

    Fireman, E. L.

    1984-01-01

    Carbon 14 terrestrial ages were determined with low-level minicomputers and accelerator mass spectrometry on 1 Yamato and 18 meteorites from Allan Hills and nearby sites. Techniques for an accelerator mass spectrometer which make C(14) measurements on small samples were developed. Also Be(10) concentrations were measured in Byrd core and Allan Hills ice samples.

  5. Attitude control of a triple inverted pendulum

    Microsoft Academic Search

    K. FURUTA; T. OCHIAI; N. ONO

    1984-01-01

    The paper is concerned with the attitude control of a triple inverted pendulum. The lowest hinge is free for rotation and the torques of the upper two hinges are manipulated not only to stabilize the pendulum but also to control its attitude. The control system is designed by using CAD developed by the author and is realized by a minicomputer.

  6. Instructions for using the U.S. Geological Survey data base of wells on Long Island, New York

    USGS Publications Warehouse

    Hawkins, George W.; Terlecki, Gregory M.

    1983-01-01

    The population of central and eastern Long Island, New York depends on ground water for its supply of fresh water. Data on more than 7,500 wells on the island have been collected by various State and local agencies and compiled by the U.S. Geological Survey since 1906. During 1975-81, the Geological Survey developed a data base for its Data General Nova 1220 minicomputer to store and process the well information. The data base is composed of seven sections, each of which may be revised and updated. Three types of magnetic devices with limited capacity are used for data storage--disk, Linctape, and 9-track tape. This breakdown makes each section small enough to store and update on a small minicomputer while allowing simultaneous data retrieval from all sections. This manual gives complete instructions for revising, storing, and retrieving well data. Most programming is in FORTRAN, but some is in assembly language. (USGS)

  7. Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1984-01-01

    The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Digital data stored on computer compatible tapes (CCT's) are particularly convenient for evaluation. The major advantages of CCT's are the quality of the data and the accessibility to computer manipulation. Minicomputer systems are widely used for the required computer processing operations. However, microprocessor-related technological advances now make it possible to process CCT data with computing systems which can be obtained at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
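    The enhancement operations named in the last sentence reduce to small-neighborhood convolutions over a band of pixel values, which is why they are feasible on a microcomputer. The Python sketch below illustrates the idea with a mean-smoothing kernel and a Laplacian-sharpening kernel; these are common textbook kernels chosen for illustration and are not necessarily the ones implemented in DIAS.

      import numpy as np

      def convolve3x3(band, kernel):
          """Apply a 3x3 kernel to a 2-D image band (border pixels left unchanged)."""
          out = band.astype(float).copy()
          rows, cols = band.shape
          for r in range(1, rows - 1):
              for c in range(1, cols - 1):
                  window = band[r - 1:r + 2, c - 1:c + 2]
                  out[r, c] = float(np.sum(window * kernel))
          return out

      smooth = np.full((3, 3), 1.0 / 9.0)              # simple mean (smoothing) filter
      sharpen = np.array([[ 0, -1,  0],
                          [-1,  5, -1],
                          [ 0, -1,  0]], dtype=float)  # Laplacian-style edge enhancement

      band = np.random.randint(0, 128, size=(16, 16))  # stand-in for one MSS band
      smoothed = convolve3x3(band, smooth)
      enhanced = convolve3x3(band, sharpen)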

  8. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP1000 series E with a RTE 2 disc operating system and 32K words of memory. A Grinnel 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on a 800 BPI, 9 track tape. Four 512 x 512 images representing 4 spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.
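    In a two dimensional recursive (IIR) filter, previously computed output pixels feed back into the current one, so a handful of coefficients can produce strong smoothing in a single raster-order pass, which is what made the approach attractive for rapid turnaround on a minicomputer. The Python sketch below is a minimal first-quadrant recursive low-pass filter with arbitrary coefficients; it illustrates the filter class only and is not one of the enhancement filters developed in this effort.

      import numpy as np

      def recursive_lowpass_2d(img, a=0.7, b=0.7):
          """First-quadrant 2-D recursive filter with unity DC gain:
             y[r,c] = (1-a)(1-b)x[r,c] + a*y[r-1,c] + b*y[r,c-1] - a*b*y[r-1,c-1]
          Coefficients a, b in [0, 1) set the amount of smoothing."""
          x = img.astype(float)
          y = np.zeros_like(x)
          rows, cols = x.shape
          for r in range(rows):
              for c in range(cols):
                  y[r, c] = (1 - a) * (1 - b) * x[r, c]
                  if r > 0:
                      y[r, c] += a * y[r - 1, c]
                  if c > 0:
                      y[r, c] += b * y[r, c - 1]
                  if r > 0 and c > 0:
                      y[r, c] -= a * b * y[r - 1, c - 1]
          return y

      image = np.random.rand(64, 64) * 255.0    # small stand-in for one 512 x 512 band
      filtered = recursive_lowpass_2d(image)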

  9. TDRSS data handling and management system study. Ground station systems for data handling and relay satellite control

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a two-phase study of the Data Handling and Management System (DHMS) are presented. An original baseline DHMS is described. Its estimated costs are presented in detail. The DHMS automates the Tracking and Data Relay Satellite System (TDRSS) ground station's functions and handles both the forward and return link user and relay satellite data passing through the station. Direction of the DHMS is effected via a TDRSS Operations Control Central (OCC) that is remotely located. A composite ground station system, a modified DHMS (MDHMS), was conceptually developed. The MDHMS performs both the DHMS and OCC functions. Configurations and costs are presented for systems using minicomputers and midicomputers. It is concluded that a MDHMS should be configured with a combination of the two computer types. The midicomputers provide the system's organizational direction and computational power, and the minicomputers (or interface processors) perform repetitive data handling functions that relieve the midicomputers of these burdensome tasks.

  10. Cactus

    SciTech Connect

    Sexton, R.L.

    1983-03-01

    The CACTUS project (computer-aided control, tracking, and updating system) was initiated by the Bendix Kansas City Division to address specific work-in-process problems encountered in a cable department. Since then, the project has been expanded to additional electrical manufacturing departments because of potential productivity gains from the system. The philosophy of CACTUS is to add an element of distributed data processing to the centralized data processing system currently in use for control of work in process. Under this system, the existing chain of communications between the host computer and the CRT terminals in a department is severed. A mini-computer established in the department communicates directly with the central system, and departmental communication is then established with the mini-computer. The advantages, disadvantages, operation performance, and economics of the system are discussed.

  11. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  12. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  13. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  14. The application of microprocessors to strapdown inertial navigation

    NASA Technical Reports Server (NTRS)

    Napjus, G. A.

    1974-01-01

    The fundamental concepts of inertial navigation are briefly examined. In a strapdown inertial navigator the accelerometers and gyros are mounted directly on the vehicle frame. The development of strapdown systems, which have important advantages over gimbal systems, has been mainly retarded by the computational requirements involved. However, the current availability of suitable minicomputers combined with other technological advances has now opened the way for a more widespread use of strapdown inertial navigators.

  15. Locomotive Data Acquisition Package Phase II system development. Final report. Volume 1. System overview

    SciTech Connect

    Abbott, R.K.; Kirsten, F.A.; Mullen, D.R.; Sidman, S.B.; Miller, J.G.; Ng, L.S.; Scalise, D.T.

    1980-03-01

    An examination of the problems associated with railroad locomotive data acquisition is presented. The design of a minicomputer based locomotive data acquisition system is also presented. Special attention is placed on meeting the functional characteristics and environmental specifications required for the system. The system described consists of a magnetic tape digital data recorder, an ensemble of transducers, and analysis software. The system described is designed as a research tool. The environmental test program and the field test program are also described.

  16. Robots in space exploration

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.

    1974-01-01

    A brief outline of NASA's current robotics program is presented. Efforts are being concentrated on a roving surface vehicle for Mars exploration. This vehicle will integrate manipulative, locomotive, and visual functions and will feature an electromechanical manipulator, stereo TV cameras, a laser rangefinder, a minicomputer, and a remote off-line computer. The program hinges on the iterative development of complex scenarios describing the robot's mission and the interrelationships among its various subsystems.

  17. Which calibration-pulse location method is robust?

    NASA Astrophysics Data System (ADS)

    Gunther, F. J.

    The Threshold method, with high threshold and contiguous-block parameter values, was found to be a robust method of locating calibration pulses in the presence of light-leak and shutter-edge pulses within the calibration window. Tests used digitized calibration-window background and light-pulse data from the Landsat-5 Thematic Mapper (TM) instrument, analyzed by special software on an Apple II+ personal computer and on a VAX 11/780 minicomputer.

  18. Which calibration-pulse location method is robust?. [for satellite imaging instruments

    NASA Technical Reports Server (NTRS)

    Gunther, F. J.

    1985-01-01

    The Threshold method, with high threshold and contiguous-block parameter values, was found to be a robust method of locating calibration pulses in the presence of light-leak and shutter-edge pulses within the calibration window. Tests used digitized calibration-window background and light-pulse data from the Landsat-5 Thematic Mapper (TM) instrument, analyzed by special software on an Apple II+ personal computer and on a VAX 11/780 minicomputer.

  19. Data acquisition user's guide-1 for fuel/engine evaluation system applied to an experimental air stirling engine. Technical note

    Microsoft Academic Search

    I. R. Bingham; G. D. Webster

    1981-01-01

    This technical note describes the Data Acquisition (DA) System used in the evaluation of Experimental Air Stirling Engine No. 1 which had previously been designed and built as a part of the Advanced Engines studies for the Fuels/Powerplants Technical Subprogram 25B. The DA system and capability is presented. Brief programming guidelines for controlling various peripheral electronic equipment through a mini-computer

  20. Harrison Radiator Division's Energy Management, Reporting and Accounting System

    E-print Network

    Goubeaux, R. J.

    The energy management, reporting, and accounting system that is covered in this paper is operating in Harrison's West Complex of the New York Operations located in Lockport, Western New York. Harrison Radiator's energy management system is run on two IBM Series/1 minicomputers. Communication throughout the Complex is accomplished using Towne Applied Technology multiplexors. The system is used to turn equipment on and off and...

  1. Flight simulators. Part 1: Present situation and trends. Part 2: Implications for training

    NASA Technical Reports Server (NTRS)

    Hass, D.; Volk, W.

    1977-01-01

    The present situation and developments in the technology of flight simulators based on digital computers are evaluated from the standpoint of training airline flight crews. Areas covered are minicomputers and their advantages in terms of cost, space and time savings, software data packets, motion simulation, visual simulation and instructor aids. The division of training time between aircraft and simulator training and the possible advantages from increased use of simulators are evaluated.

  2. FTIR (Fourier Transform Infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    NASA Astrophysics Data System (ADS)

    Cox, J. N.; Sedayao, J.; Shergill, G.; Villasol, R.; Haaland, D. M.

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three classes of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the result to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a Partial Least Squares analysis model. This model is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring will be discussed.
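    The third class of enhancement, building a calibration model remotely and shipping it back to the fab data station, can be illustrated with a Partial Least Squares fit. The Python sketch below uses scikit-learn's PLSRegression on synthetic spectra and made-up boron and phosphorus reference values; it shows the generic calibrate-then-predict flow, not the authors' software, data, or component count.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # Synthetic stand-ins: 40 calibration spectra of 200 points each, with
      # boron and phosphorus weight-percent as the reference values.
      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(40, 200))
      reference = rng.uniform(low=[1.0, 2.0], high=[5.0, 8.0], size=(40, 2))  # [B, P] wt%

      # Build the calibration model (the step run on the remote computer).
      pls = PLSRegression(n_components=4)
      pls.fit(spectra, reference)

      # The fitted model is then applied to new production spectra in the fab.
      new_spectrum = rng.normal(size=(1, 200))
      print(pls.predict(new_spectrum))   # predicted [B, P] wt%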

  3. FTIR (Fourier transform infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    SciTech Connect

    Cox, J.N.; Sedayao, J.; Shergill, G.; Villasol, R. (Intel Corp., Santa Clara, CA (USA)); Haaland, D.M. (Sandia National Labs., Albuquerque, NM (USA))

    1990-01-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three classes of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the result to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a Partial Least Squares analysis model. This model is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring will be discussed. 10 refs., 4 figs.

  4. Kinetic and morphometric measurements of enzyme reactions in tissue sections with a new instrumental setup

    Microsoft Academic Search

    P. Kugler

    1981-01-01

    An instrumental setup is described for the measurement of enzyme kinetics and morphometry in tissue sections. It consists of a Vickers M85 microdensitometer and computer-assisted Kontron Videoplan system. The Videoplan system consists of a minicomputer with two mini-floppy disks, a keyboard, a graphic tablet, a TV monitor and a printer/plotter. The measuring component of the M85 is linked to the

  5. The 1983-84 Connecticut 45-Hz-band field-strength measurements

    Microsoft Academic Search

    P. R. Bannister

    1986-01-01

    Extremely low frequency (ELF) measurements are made of the transverse horizontal magnetic field strength received in Connecticut. The AN/BSR-1 receiver consists of an AN/UYK-20 minicomputer, a signal timing and interface unit (STIU), a rubidium frequency time standard, two magnetic tape recorders, and a preamplifier. The transmission source of these farfield (1.6-Mm range) measurements is the U.S. Navy's ELF Wisconsin Test

  6. Fast digital data acquisition and on-line processing system for an HB5 scanning transmission electron microscope

    Microsoft Academic Search

    M. Strahm; J. H. Butler

    1981-01-01

    The Fast Digital Data Acquisition System (FDDAS) links a Vacuum Generator’s HB5 Scanning Transmission Electron Microscope and a PDP-11/34 minicomputer, to enable high-speed collection and storage of digitized images for analysis, processing, and display, either in real time (using hardware) or later (via software). The FDDAS hardware consists of a digital scan generator, a programmable quad scaler with quad discriminator,

  7. Microfiche as a medium for the long-term storage of laboratory computer records.

    PubMed Central

    McVittie, J D; Whitehouse, C; Wilkinson, R H

    1981-01-01

    The chemical pathology requests on 180 000 patients a year are stored on microfiche, occupying 72 mm of shelf space. They are produced by a sequence of three computer programs which remove data from disc on to magnetic tape using the laboratory's Digital Equipment Corporation PDP 11/34 minicomputer. Processing on to microfiche is performed by a bureau. The magnetic tape is available for retrospective research and management studies in one-month periods. PMID:7462438

  8. Instrumentation, techniques and data reduction associated with airfoil testing programs at Wichita State University

    NASA Technical Reports Server (NTRS)

    Rodgers, E. J.; Wentz, W. H., Jr.; Seetharam, H. C.

    1978-01-01

    Two dimensional airfoil testing was conducted at the Wichita State University Beech Wind Tunnel for a number of years. The instrumentation developed and adapted during this period for determining flow fields, along with the traversing mechanisms for the probes, is discussed. In addition, some of the techniques used to account for interference effects associated with the apparatus used for this two dimensional testing are presented. The application of a minicomputer to the data reduction and presentation is discussed.

  9. Laser velocimeter (autocovariance) buffer interface

    NASA Technical Reports Server (NTRS)

    Clemmons, J. I., Jr.

    1981-01-01

    A laser velocimeter (autocovariance) buffer interface (LVABI) was developed to serve as the interface between three laser velocimeter high speed burst counters and a minicomputer. A functional description is presented of the instrument and its unique features which allow the studies of flow velocity vector analysis, turbulence power spectra, and conditional sampling of other phenomena. Typical applications of the laser velocimeter using the LVABI are presented to illustrate its various capabilities.

  10. Wide-ranging UNIX pervades the operating system world

    SciTech Connect

    Schindler, M.

    1984-02-23

    UNIX has extended its sphere of influence beyond minicomputer systems like the PDP-11 or VAX-11 series; it now ranges from micros to mainframes, from VLSI designers to office managers. In fact, the UNIX bandwagon is gaining momentum so fast that no system manufacturer wants to be caught on the sidelines. Not only are customized versions of UNIX proliferating, but the system is finding its way into the boxes of new computers.

  11. SIFT: Design and analysis of a fault-tolerant computer for aircraft control

    Microsoft Academic Search

    John H. Wensley; L. Lamport; J. Goldberg; M. W. Green; K. N. Levitt; P. M. Melliar-Smith; R. E. Shostak; C. B. Weinstock

    1978-01-01

    SIFT (Software Implemented Fault Tolerance) is an ultrareliable computer for critical aircraft control applications that achieves fault tolerance by the replication of tasks among processing units. The main processing units are off-the-shelf minicomputers, with standard microcomputers serving as the interface to the I/O system. Fault isolation is achieved by using a specially designed redundant bus system to interconnect the processing

  12. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  13. Dose rates for several human organs using QAD

    Microsoft Academic Search

    K. Phillips; N. Tsoulfanidis

    1990-01-01

    The objective of this study is to determine dose rates for various organs of a human standing on a plane uniformly contaminated with gamma-emitting isotopes. The calculation was performed using the computer program QAD-CGGP, a code that can be run on a mainframe, a minicomputer, or a personal computer. The QAD-CGGP is a point kernel code using gamma-ray buildup factors
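    QAD-CGGP belongs to the point-kernel family: each source point's contribution to the detector is attenuated exponentially along the connecting ray and multiplied by a buildup factor that accounts for scattered photons. The Python sketch below shows that kernel for a single point source; the attenuation coefficient and buildup factor are placeholder numbers, whereas the code interpolates them from tabulated data for the actual photon energy and material.

      import math

      def point_kernel_flux(source_strength, distance_cm, mu_per_cm, buildup):
          """Point-kernel flux with a buildup factor:
             phi = S * B(mu*r) * exp(-mu*r) / (4 * pi * r**2)
          S in photons/s, r in cm, mu in 1/cm -> flux in photons/(cm**2 s)."""
          mfp = mu_per_cm * distance_cm          # path length in mean free paths
          return (source_strength * buildup * math.exp(-mfp)
                  / (4.0 * math.pi * distance_cm ** 2))

      # Illustrative numbers only: a 1e9 photon/s point source 100 cm away.
      flux = point_kernel_flux(source_strength=1.0e9, distance_cm=100.0,
                               mu_per_cm=1.0e-4, buildup=1.05)
      print(f"{flux:.3e} photons/cm^2/s")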

  14. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. Two dimensional graphic display of telemetric information is provided, together with interactive computer analysis and processing of the telemetric parameters displayed on the screen. The running-parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen and the user language are discussed and illustrated.

  15. Dedicated multiprocessor system for calculating Josephson-junction noise thermometer frequency variances at high speed

    SciTech Connect

    Cutkosky, R.D.

    1983-07-01

    A Josephson-junction noise thermometer produces a sequence of frequency readings from whose variations the temperature of the thermometer may be calculated. A preprocessor system has been constructed to collect the frequency readings delivered to an IEEE 488 bus by an ordinary counter operating at up to 1000 readings per second, perform the required calculations, and send summary information to a desk calculator or minicomputer on another 488 bus at a more convenient rate.
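    The preprocessor's essential job is data reduction: turn a fast stream of counter readings into occasional summary statistics, chiefly the variance of the frequency over a block of readings, which for a Josephson-junction noise thermometer is proportional to absolute temperature. The Python sketch below shows that reduction; the block size, synthetic readings, and calibration constant are illustrative assumptions, not values from the paper.

      import statistics

      def block_variances(frequency_readings, block_size=100):
          """Reduce a stream of counter readings (Hz) to per-block sample
          variances, the summary quantity forwarded to the host computer."""
          variances = []
          for start in range(0, len(frequency_readings) - block_size + 1, block_size):
              block = frequency_readings[start:start + block_size]
              variances.append(statistics.variance(block))
          return variances

      # Synthetic stand-in for 1000 counter readings near 10 kHz.
      readings = [1.0e4 + 0.5 * ((i * 37) % 11 - 5) for i in range(1000)]

      # Mean frequency variance is proportional to absolute temperature;
      # CAL is a placeholder calibration constant (K per Hz**2), not a real value.
      CAL = 1.0e-3
      temperature_estimate = CAL * statistics.mean(block_variances(readings))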

  16. Conversion of radionuclide transport codes to IBM-compatible microcomputers

    SciTech Connect

    Pon, W.D.; Marschke, S.F.

    1986-01-01

    This paper presents an approach for modifying select US Nuclear Regulatory Commission (NRC) computer codes for use on an IBM microcomputer and/or compatibles. The three codes selected are GALE, GASPAR, and LADTAP. These codes are used by utilities, architect/engineering firms, and various governing bodies for estimating normal operational radiological releases and impacts. Originally written for large mainframe computers, they have been transferred to smaller minicomputer systems without any real conversion of the structure of the codes.

  17. A new theory for rapid calculation of the ground pattern of the incident sound intensity produced by a maneuvering jet airplane

    NASA Technical Reports Server (NTRS)

    Barger, R. L.

    1980-01-01

    An approximate method for computing the jet noise pattern of a maneuvering airplane is described. The method permits one to relate the noise pattern individually to the influences of airplane speed and acceleration, jet velocity and acceleration, and the flight path curvature. The analytic formulation determines the ground pattern directly without interpolation and runs rapidly on a minicomputer. Calculated examples including a climbing turn and a simple climb pattern with a gradual throttling back are presented.

  18. Topics in programmable automation. [for materials handling, inspection, and assembly

    NASA Technical Reports Server (NTRS)

    Rosen, C. A.

    1975-01-01

    Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.

  19. Seasat Synthetic-Aperture Radar Data Reduction Using Parallel Programmable Array Processors

    Microsoft Academic Search

    Chialin Wu; Budak Barkan; Walter J. Karplus; Dennis Caswell

    1982-01-01

    This paper presents a digital signal processing system that produces the SEASAT synthetic-aperture radar (SAR) imagery. The system consists of a SEL 32/77 host minicomputer and three AP-120B array processors. The partitioning of the SAR processing functions and the design of software modules is described. The rationale for selecting the parallel array processor architecture and the methodology for developing the

  20. A multiparameter multichannel pulse height analyzer for nuclear data acquisition

    E-print Network

    Chenoweth, John Stephen

    1978-01-01

    The design and implementation of the hardware and diagnostic software for a Nova minicomputer-based pulse height analyzer is developed in this thesis. The hardware provides for up to three pulse height analog-to-digital converters... to be interfaced to the Nova input/output bus data channel in either of two memory modes. The hardware also provides for interfacing a storage oscilloscope and general purpose output register using program controlled output. The diagnostic software to test...

  1. Thicket manipulation and static inventory control

    E-print Network

    Jones, Marvin Rex

    1974-01-01

    The system now working, which is outlined in this section, is a response to that administrative request. The NOVA minicomputer laboratory operates as a student-oriented teaching and research facility, providing hands-on acquaintance and experience with a functional, active and growing computing installation. Hardware includes an 8K NOVA interfaced to and servicing magnetic tape facilities and a large Gerber plotter, and a 52K NOVA 1200 with three moveable-head disk drives, each supporting a 1.25 mega...

  2. TMS communications hardware. Volume 2: Bus interface unit

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Hopkins, G. T.

    1979-01-01

    A prototype coaxial cable bus communication system used in the Trend Monitoring System to interconnect intelligent graphics terminals to a host minicomputer is described. The terminals and host are connected to the bus through a microprocessor-based RF modem termed a Bus Interface Unit (BIU). The BIU hardware and the Carrier Sense Multiple Access Listen-While-Talk protocol used on the network are described.

  3. TMS communications hardware. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Weinrich, S. S.

    1979-01-01

    A prototype coaxial cable bus communications system was designed to be used in the Trend Monitoring System (TMS) to connect intelligent graphics terminals (based around a Data General NOVA/3 computer) to a MODCOMP IV host minicomputer. The direct memory access (DMA) interfaces which were utilized for each of these computers are identified. It is shown that for the MODCOMP, an off-the-shelf board was suitable, while for the NOVAs, custom interface circuitry was designed and implemented.

  4. TMS communications software. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  5. Clinical percutaneous imaging of coronary anatomy using an over-the-wire ultrasound catheter system

    Microsoft Academic Search

    J. B. Hodgson; S. P. Graham; A. D. Savakus; S. G. Dame; D. N. Stephens; P. S. Dhillon; D. Brands; H. Sheehan; M. J. Eberle

    1989-01-01

    This manuscript describes initial applications of a unique new intravascular ultrasound imaging catheter. This 5.5F catheter uses an over-the-wire design and incorporates a phased array transducer at its tip. There are no moving parts. A 360° image is produced perpendicular to the catheter axis using a 20 MHz center frequency. A dedicated minicomputer is used for initial image processing, as

  6. Electromechanical three-axis development for remote handling in the Hot Experimental Facility

    SciTech Connect

    Garin, J.; Bolfing, B.J.; Satterlee, P.E.; Babcock, S.M.

    1981-01-01

    A three-axis closed-loop position control system has been designed and installed on an overhead bridge, carriage, and tube hoist for automatic positioning of a manipulator at a remotely maintained work site. The system provides accurate (within 3 min) and repeatable three-axis positioning of the manipulator. The position control system has been interfaced to a supervisory minicomputer system that provides teach-playback capability of manipulator positioning and color graphic display of the three-axis system position.

  7. Experience using the 168/E microprocessor for off-line data analysis

    Microsoft Academic Search

    P. F. Kunz; R. N. Fall; M. F. Gravina; J. H. Halperin; L. J. Levinson; G. J. Oxoby; Q. H. Trang

    1979-01-01

    The 168/E is a SLAC developed microprocessor which emulates the IBM 360/370 computers with an execution speed of about one half of an IBM 370/168. These processors are used in parallel for the track finding and geometry programs of the LASS spectrometer. The system is controlled by a PDP-11 minicomputer via a three port interface which we call the Bermuda

  8. Fourier transform infrared spectrophotometry for thin film monitors: computer and equipment integration for enhanced capabilities

    NASA Astrophysics Data System (ADS)

    Cox, J. Neal; Sedayao, J.; Shergill, Gurmeet S.; Villasol, R.; Haaland, David M.

    1991-03-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. Also, the drive for greater accuracy and tighter precision is leading to the development of increasingly sophisticated data processing software that taxes the computing abilities of most instrument local data stations. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three classes of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the results to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed

  9. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
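    A lumped mass formulation reduces the blade or counterweight to point masses joined by stiffness elements, so the natural frequencies and mode shapes come from the generalized eigenvalue problem K*phi = omega**2 * M*phi. The Python sketch below solves that problem for a three-mass chain with arbitrary stiffness and mass values; it illustrates the formulation only and does not use the wind turbine properties from the report.

      import numpy as np
      from scipy.linalg import eigh

      # Minimal lumped-mass sketch: three equal lumps on identical springs,
      # fixed at one end and free at the other. The values are arbitrary
      # illustrations, not the blade or counterweight properties.
      k = 1.0e5   # N/m per spring
      m = 10.0    # kg per lump

      K = k * np.array([[ 2.0, -1.0,  0.0],
                        [-1.0,  2.0, -1.0],
                        [ 0.0, -1.0,  1.0]])
      M = m * np.eye(3)

      # Generalized eigenvalue problem K*phi = w**2 * M * phi
      eigvals, modes = eigh(K, M)
      natural_freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)
      print(natural_freqs_hz)   # natural frequencies, lowest first
      print(modes)              # columns are the corresponding mode shapes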

  10. Feasibility study: Replacement of the inoperative decommutating buffer subsystem for the instrumentation checkout complex in the Quality and Reliability Assurance Laboratory

    NASA Technical Reports Server (NTRS)

    Hilliard, J. W.

    1972-01-01

    A general purpose computer system that is necessary for replacement of the present inoperative signal decommutator special purpose computer subsystem is described. The present decommutator subsystem has a very poor history of reliability and since April 1970, it has become inoperative because the core memory cannot be repaired. Functions of the present signal decommutator subsystem are to receive, demultiplex, record in real time, playback in real time, and output to the SDS-930 control computer for analysis of the telemetry data. Recommendations for replacement of the inoperative telemetry decommutator subsystem are for the purchase of a mini-computer.

  11. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan is described for the integrated testing and benchmarking of Phase 1 of the Ada based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX mini-computers and VAX stations running the VMS operating system.

  12. Recent advances in strapdown inertial navigation

    NASA Technical Reports Server (NTRS)

    Napjus, G. A.

    1975-01-01

    The computational requirements and basic features of strapdown gyroscopes for inertial navigation are discussed. Strapdown navigators currently require 20 to 50% of the available time of a minicomputer with a capability of several hundred thousand operations per second; memory requirements are 2000 to 3000 16-bit words. A system in which these computational demands are met by three limited capability microcomputers is described. A technique using dedicated microprocessors in the place of analog electronics in the gyroscope control loops is discussed, and attention is given to applications of microprocessor technology in redundant strapdown navigation systems and associated flight control systems.

  13. Locomotive data acquisition package phase II system development. Final report. Volume 2. LDR operations and maintenance

    SciTech Connect

    Abbott, R.K.; Kirsten, F.A.; Mullen, D.R.; Sidman, S.B.; Miller, J.G.; Ng, L.S.

    1980-03-01

    An examination of the problems associated with railroad locomotive data acquisition is presented. The design of a minicomputer based locomotive data acquisition system is also presented. Special attention is placed on meeting the functional characteristics and environmental specifications required for the system. The system described consists of a magnetic tape digital data recorder, an ensemble of transducers, and analysis software. The system described is to be used as a research tool. This volume discusses the operation and maintenance of the Locomotive Data Recorder (LDR).

  14. A large matrix sorting engine

    SciTech Connect

    Cresswell, J.R.; Sampson, J.A.

    1987-08-01

    This paper describes the development of cost effective hardware and software for analysing large volumes of event-by-event data produced from the TESSA gamma-ray detector array at the Nuclear Structure Facility (NSF) at Daresbury Laboratory. The system will look like an intelligent I/O peripheral of the mini-computer in current use for data analysis. The data will be sorted into large 2D matrices from magnetic tape considerably faster than present techniques allow. The sorting software will utilise compiler generation techniques. The hardware will be based on multiprocessing in VMEbus.

  15. Data acquisition user's guide-1 for fuel/engine evaluation system applied to an experimental air stirling engine. Technical note

    SciTech Connect

    Bingham, I.R.; Webster, G.D.

    1981-07-01

    This technical note describes the Data Acquisition (DA) System used in the evaluation of Experimental Air Stirling Engine No. 1 which had previously been designed and built as a part of the Advanced Engines studies for the Fuels/Powerplants Technical Subprogram 25B. The DA system and capability is presented. Brief programming guidelines for controlling various peripheral electronic equipment through a mini-computer are given. The program software used in testing the Stirling engine is described. Finally, some limitations of the DA system are listed.

  16. Data base management study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 are presented. A test of a typical data base management system is described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  17. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transformation analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the data base are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
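    Power spectral density is the workhorse product of such a system: the transducer record is split into overlapping, windowed segments whose FFT magnitudes are averaged. The Python sketch below computes a PSD with Welch's method on a synthetic tone-plus-noise record; the sample rate and signal are stand-ins, and the leakage correction step described above (recovering exact sinusoid amplitudes) is not shown.

      import numpy as np
      from scipy.signal import welch

      # Synthetic stand-in for a wideband transducer record: 1 kHz tone plus noise.
      fs = 20480.0                              # sample rate in Hz (illustrative)
      t = np.arange(0, 2.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * 1000.0 * t) + 0.3 * np.random.randn(t.size)

      # Welch's method: average the spectra of overlapping, Hann-windowed segments.
      freqs, psd = welch(signal, fs=fs, window="hann", nperseg=2048, noverlap=1024)
      peak_hz = freqs[np.argmax(psd)]
      print(f"dominant component near {peak_hz:.0f} Hz")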

  18. Automated emergency meteorological response system

    SciTech Connect

    Pepper, D W

    1980-01-01

    A sophisticated emergency response system was developed to aid in the evaluation of accidental releases of hazardous materials from the Savannah River Plant to the environment. A minicomputer system collects and archives data from both onsite meteorological towers and the National Weather Service. In the event of an accidental release, the computer rapidly calculates the trajectory and dispersion of pollutants in the atmosphere. Computer codes have been developed which provide a graphic display of predicted concentration profiles downwind from the source, as functions of time and distance.
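    Downwind concentration profiles of the kind displayed by such codes are commonly computed with a Gaussian plume model, in which the release rate is spread laterally and vertically according to dispersion parameters chosen from the distance and atmospheric stability. The Python sketch below gives a minimal ground-level version with a ground-reflection term; the numbers are purely illustrative and this is not the Savannah River Plant code.

      import math

      def ground_level_concentration(q, u, y_m, h_m, sigma_y, sigma_z):
          """Gaussian plume concentration (g/m**3) at ground level.
          q: release rate (g/s), u: wind speed (m/s), y_m: crosswind offset (m),
          h_m: effective release height (m), sigma_y/sigma_z: dispersion
          parameters (m) evaluated at the downwind distance of interest."""
          lateral = math.exp(-0.5 * (y_m / sigma_y) ** 2)
          vertical = 2.0 * math.exp(-0.5 * (h_m / sigma_z) ** 2)  # ground reflection
          return (q / (2.0 * math.pi * u * sigma_y * sigma_z)) * lateral * vertical

      # Illustrative values for a point roughly 1 km downwind; real dispersion
      # parameters come from the measured stability class, not fixed numbers.
      conc = ground_level_concentration(q=10.0, u=3.0, y_m=0.0, h_m=60.0,
                                        sigma_y=80.0, sigma_z=35.0)
      print(f"{conc:.2e} g/m^3")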

  19. Microcontroller interface for diode array spectrometry

    NASA Astrophysics Data System (ADS)

    Aguo, L.; Williams, R. R.

    An alternative to bus-based computer interfacing is presented using diode array spectrometry as a typical application. The new interface consists of an embedded single-chip microcomputer, known as a microcontroller, which provides all necessary digital I/O and analog-to-digital conversion (ADC) along with an unprecedented amount of intelligence. Communication with a host computer system is accomplished by a standard serial interface, so this type of interfacing is applicable to a wide range of personal and minicomputers and can be easily networked. Data are acquired asynchronously and sent to the host on command. New operating modes which have no traditional counterparts are presented.

  20. A computer system to analyze showers in nuclear emulsions: Center Director's discretionary fund report

    NASA Technical Reports Server (NTRS)

    Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.

    1987-01-01

    A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions through a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.

  1. U. S. GEOLOGICAL SURVEY'S NATIONAL REAL-TIME HYDROLOGIC INFORMATION SYSTEM USING GOES SATELLITE TECHNOLOGY.

    USGS Publications Warehouse

    Shope, William G., Jr.

    1987-01-01

    The U. S. Geological Survey maintains the basic hydrologic data collection system for the United States. The Survey is upgrading the collection system with electronic communications technologies that acquire, telemeter, process, and disseminate hydrologic data in near real-time. These technologies include satellite communications via the Geostationary Operational Environmental Satellite, Data Collection Platforms in operation at over 1400 Survey gaging stations, Direct-Readout Ground Stations at nine Survey District Offices, and a network of powerful minicomputers that allows data to be processed and disseminated quickly.

  2. Transfer-function-parameter estimation from frequency response data: A FORTRAN program

    NASA Technical Reports Server (NTRS)

    Seidel, R. C.

    1975-01-01

    A FORTRAN computer program designed to fit a linear transfer function model to given frequency response magnitude and phase data is presented. A conjugate gradient search is used that minimizes the integral of the absolute value of the error squared between the model and the data. The search is constrained to insure model stability. A scaling of the model parameters by their own magnitude aids search convergence. Efficient computer algorithms result in a small and fast program suitable for a minicomputer. A sample problem with different model structures and parameter estimates is reported.
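    The fitting problem described reduces to minimizing a squared-error cost over the model parameters with a conjugate gradient search. The Python sketch below fits a first-order-lag model to synthetic frequency response data using scipy's CG minimizer; the model structure and data are assumptions for illustration, and the original program's parameter scaling and stability constraint are omitted.

      import numpy as np
      from scipy.optimize import minimize

      # Synthetic frequency-response data: a first-order lag K/(tau*s + 1)
      # with K = 2 and tau = 0.5, plus a little measurement noise.
      w = np.logspace(-1, 2, 60)                       # frequencies, rad/s
      data = 2.0 / (0.5j * w + 1.0)
      data = data * (1.0 + 0.01 * np.random.randn(w.size))

      def model(params, w):
          gain, tau = params
          return gain / (1j * tau * w + 1.0)

      def cost(params):
          err = model(params, w) - data
          return float(np.sum(np.abs(err) ** 2))        # squared-error criterion

      # Conjugate-gradient search from a rough initial guess.
      result = minimize(cost, x0=[1.0, 1.0], method="CG")
      print(result.x)   # estimated [gain, time constant]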

  3. Airborne differential absorption lidar system for water vapor investigations

    NASA Technical Reports Server (NTRS)

    Browell, E. V.; Carter, A. F.; Wilkerson, T. D.

    1981-01-01

    Range-resolved water vapor measurements using the differential-absorption lidar (DIAL) technique are described in detail. The system uses two independently tunable optically pumped lasers operating in the near infrared, with laser pulses of less than 100 microseconds separation, to minimize concentration errors caused by atmospheric scattering. Water vapor concentration profiles are calculated for each measurement by a minicomputer, in real time. The work is needed in the study of atmospheric motion and thermodynamics as well as in forestry and agriculture problems.
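    The real-time computation the minicomputer performs is, in essence, the standard two-wavelength DIAL retrieval: the mean number density in a range cell follows from the logarithm of the ratio of on-line and off-line returns at the cell boundaries. The Python sketch below shows that relation; the signal levels, cross-section difference, and range-cell size are illustrative numbers, not values from this system.

      import math

      def dial_number_density(p_on_r1, p_on_r2, p_off_r1, p_off_r2,
                              delta_sigma_cm2, delta_r_cm):
          """Two-wavelength DIAL retrieval of the mean number density
          (molecules/cm**3) in the range cell between R1 and R2:
              N = ln[(P_off(R2) * P_on(R1)) / (P_on(R2) * P_off(R1))]
                  / (2 * delta_sigma * delta_R)
          delta_sigma: on-line minus off-line absorption cross section (cm**2)."""
          ratio = (p_off_r2 * p_on_r1) / (p_on_r2 * p_off_r1)
          return math.log(ratio) / (2.0 * delta_sigma_cm2 * delta_r_cm)

      # Illustrative signal levels and cross section for a 300 m range cell.
      n = dial_number_density(p_on_r1=1.00, p_on_r2=0.60,
                              p_off_r1=1.00, p_off_r2=0.80,
                              delta_sigma_cm2=1.0e-23, delta_r_cm=3.0e4)
      print(f"{n:.2e} molecules/cm^3")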

  4. Applications of intelligent-measurement systems in controlled-fusion research

    SciTech Connect

    Owen, E.W.; Shimer, D.W.; Lindquist, W.B.; Peterson, R.L.; Wyman, R.H.

    1981-06-22

    The paper describes the control and instrumentation for the Mirror Fusion Test Facility at the Lawrence Livermore National Laboratory, California, USA. This large-scale scientific experiment in controlled thermonuclear fusion, which is currently being expanded, originally had 3000 devices to control and 7000 sensors to monitor. A hierarchical computer control system is used, with nine minicomputers forming the supervisory system. There are approximately 55 local control and instrumentation microcomputers. In addition, each device has its own monitoring equipment, which in some cases consists of a small computer. After describing the overall system a more detailed account is given of the control and instrumentation for two large superconducting magnets.

  5. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine by using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated by using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.
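    At the heart of linear quadratic regulator synthesis is a Riccati equation: given a linear model x' = Ax + Bu and weighting matrices Q and R, solving the continuous algebraic Riccati equation yields the state feedback gain K. The Python sketch below performs that step with scipy on a toy two-state model; the matrices are arbitrary and are not the F100 engine dynamics.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Toy 2-state, 1-input linear model (not the F100 engine model).
      A = np.array([[ 0.0,  1.0],
                    [-2.0, -3.0]])
      B = np.array([[0.0],
                    [1.0]])
      Q = np.diag([10.0, 1.0])    # state weighting
      R = np.array([[1.0]])       # control weighting

      # LQR synthesis: solve A'P + PA - P B R^-1 B' P + Q = 0, then K = R^-1 B' P.
      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)
      print(K)                                  # state-feedback gain, u = -K x
      print(np.linalg.eigvals(A - B @ K))       # closed-loop poles (stable)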

  6. Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems

    NASA Astrophysics Data System (ADS)

    Dewey, D.; Ricker, G. R.

    The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

  7. Computers for artificial intelligence a technology assessment and forecast

    SciTech Connect

    Miller, R.K.

    1986-01-01

    This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

  8. An object oriented software bus

    SciTech Connect

    McGirt, F. [Los Alamos National Lab., NM (United States); Wilkerson, J.F. [Washington Univ., Seattle, WA (United States). Dept. of Physics

    1995-12-31

    This paper describes a new approach to development of software for highly integrated software-hardware systems such as those used for data acquisition and control. This approach, called the Object Oriented Software Bus (OSB), is a way to develop software according to a common specification, similar to the way interface hardware has been developed since the advent of bus structures for minicomputers and microcomputers. A key concept of the OSB is the extension of the common use of objects to support user interface and data analysis functions to the development of software objects that directly correspond to real-world hardware interfaces and modules.

  9. An accelerated forth data-acquisition system

    NASA Technical Reports Server (NTRS)

    Bowhill, S. A.; Rennier, A. D.

    1986-01-01

    A new data acquisition system was put into operation at Urbana in August 1984. It uses a standard Apple 2 microcomputer with 48 k RAM and a standard 5 1/4 inch floppy disk. Design criteria for the system are given. The system was implemented using fig-FORTH, a threaded interpretive language which permits easy interfacing to machine code. The throughput of this system is better by a factor of 6 than the PDP-15 minicomputer system previously used, and it has a real time display feature and provides the data in a much more convenient form. The features which contribute to this improved performance are listed.

  10. The library and its home computer: automation as if people mattered.

    PubMed

    Avriel, D

    1983-07-01

    To provide its users with quick and easy access to library resources, the Muriel and Philip Berman National Medical Library, Jerusalem, developed an integrated library system (MAIMON) on a minicomputer between 1978 and 1982. Because humans are the most important element of the library system, MAIMON's performance was evaluated in terms of the benefits provided to patrons, library management, and library staff. Since the system was successfully adopted, users' needs and expectations have grown. How the existing system will be used and expanded to meet the new information demands at the library is discussed. PMID:6626802

  11. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determining the concurrent validity of computer-graphic models is real-time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, a minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer-assisted techniques for interface design and evaluation have the potential to improve the capability for human factors engineering.

  12. Distributed data base systems with special emphasis toward POREL

    NASA Technical Reports Server (NTRS)

    Neuhold, E. J.

    1984-01-01

    In the last few years a number of research and advanced development projects have resulted in distributed data base management prototypes. POREL, developed at the University of Stuttgart, is a multiuser, distributed, relational system developed for wide and local area networks of minicomputers and advanced micros. The general objectives of such data base systems and the architecture of POREL are discussed. In addition, a comparison of some of the existing distributed DBMSs is included to provide the reader with information about the current state of the art.

  13. INEL personal computer version of MACCS 1.5

    SciTech Connect

    Jones, K.R.; Dobbe, C.A.; Knudson, D.L. (EG and G Idaho, Inc., Idaho Falls, ID (USA))

    1991-03-01

    The MELCOR Accident Consequence Code System, Version 1.5, (MACCS 1.5) calculates potential consequences resulting from atmospheric releases of radioactive materials. Sandia National Laboratories developed the code for the US Nuclear Regulatory Commission on a VAX/VMS mini-computer. This report documents the Idaho National Engineering Laboratory conversion of MACCS 1.5 for compilation and execution on an 80386-based IBM or IBM-compatible personal computer (PC). The resulting PC version of the code is available through the National Energy Software Center, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL, 60439. 2 refs., 1 fig., 1 tab.

  14. High resolution color raster computer animation of space filling molecular models

    SciTech Connect

    Max, N.L.

    1981-01-01

    The ATOMLLL system efficiently produces realistic photographs of ball-and-stick or space-filling molecular models, with color shading, highlights, shadows, and transparency. The hidden surface problem for a scene composed of intersecting spheres and cylinders is solved on a CDC-7600, which outputs onto magnetic tape the outlines of the visible parts of each object. The outlines are then rendered, at up to 4096 x 4096 resolution, by a Dicomed D-48 color film recorder, controlled by a Varian V-75 minicomputer. The Varian computes the shading and highlights for each pixel in a fast microcoded loop. Recent modifications to give shadows and transparency are described.

  15. Application of image processing techniques to fluid flow data analysis

    NASA Technical Reports Server (NTRS)

    Giamati, C. C.

    1981-01-01

    The application of color coding techniques used in processing remote sensing imagery to analyze and display fluid flow data is discussed. A minicomputer-based color film recording and color CRT display system is described. High quality, high resolution images of two-dimensional data are produced on the film recorder. Three dimensional data, in large volume, are used to generate color motion pictures in which time is used to represent the third dimension. Several applications and examples are presented. System hardware and software are described.

  16. Catastrophe theory as a tool for determining synchronous power system dynamic stability

    SciTech Connect

    Sallam, A.A.; Dineley, J.L.

    1983-03-01

    A mathematical method, Catastrophe Theory, is applied to the problem of electrical power system dynamic stability. It is suggested that this offers a method for the continual monitoring of power system stability margins through a visual graphic display produced by a dedicated minicomputer using information monitored from the power system. The approach arises from long experience in the field of power system stability and a pre-occupation with visualising this multi-dimensional dynamic problem in such a way as to enhance comprehension, both as an aid to understanding and as a method for rapid assimilation of the significance of changes in the system.

  17. APSAS; an Automated Particle Size Analysis System

    USGS Publications Warehouse

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX(HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
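
    As an illustration of the method-of-moments statistics mentioned above, the short sketch below computes mean, sorting, skewness, and kurtosis from a binned grain-size distribution in phi units; the class midpoints and weight percentages are invented, not APSAS output.

        # Method-of-moments grain-size statistics for a hypothetical sample.
        import numpy as np

        phi_mid    = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])    # phi = -log2(grain diameter / mm)
        weight_pct = np.array([ 5.0, 15.0, 30.0, 25.0, 15.0, 10.0])

        f = weight_pct / weight_pct.sum()
        mean = np.sum(f * phi_mid)                        # first moment
        sort = np.sqrt(np.sum(f * (phi_mid - mean)**2))   # standard deviation (sorting)
        skew = np.sum(f * (phi_mid - mean)**3) / sort**3
        kurt = np.sum(f * (phi_mid - mean)**4) / sort**4
        print(f"mean={mean:.2f} phi, sorting={sort:.2f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")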

  18. MORPH-I (Ver 1.0) a software package for the analysis of scanning electron micrograph (binary formatted) images for the assessment of the fractal dimension of enclosed pore surfaces

    USGS Publications Warehouse

    Mossotti, Victor G.; Eldeeb, A. Raouf; Oscarson, Robert

    1998-01-01

    MORPH-I is a set of C-language computer programs for the IBM PC and compatible minicomputers. The programs in MORPH-I are used for the fractal analysis of scanning electron microscope and electron microprobe images of pore profiles exposed in cross-section. The program isolates and traces the cross-sectional profiles of exposed pores and computes the Richardson fractal dimension for each pore. Other programs in the set provide for image calibration, display, and statistical analysis of the computed dimensions for highly complex porous materials. Requirements: IBM PC or compatible; minimum 640 K RAM; math coprocessor; SVGA graphics board providing mode 103 display.
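
    The sketch below illustrates the Richardson (divider) estimate of a profile's fractal dimension: measure the apparent perimeter with rulers of decreasing length and take the slope of the log-log relation. The synthetic circular outline stands in for a traced pore profile; this is not MORPH-I code.

        # Richardson/divider fractal dimension of a digitized outline (illustrative).
        import numpy as np

        def divider_length(points, ruler):
            """Walk along the outline taking steps of at least `ruler`."""
            total, i = 0.0, 0
            while i < len(points) - 1:
                j = i + 1
                while j < len(points) and np.hypot(*(points[j] - points[i])) < ruler:
                    j += 1
                if j == len(points):
                    break
                total += np.hypot(*(points[j] - points[i]))
                i = j
            return total

        t = np.linspace(0.0, 2.0 * np.pi, 2000)
        outline = np.column_stack([np.cos(t), np.sin(t)])     # smooth profile, D close to 1

        rulers  = np.array([0.4, 0.2, 0.1, 0.05, 0.025])
        lengths = np.array([divider_length(outline, r) for r in rulers])
        slope = np.polyfit(np.log(rulers), np.log(lengths), 1)[0]
        print("Richardson dimension D =", 1.0 - slope)        # near 1.0 for a smooth outline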

  19. User's operating procedures. Volume 1: Scout project information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is given. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. The instructions to operate the Scout Project Information programs in data retrieval and file maintenance via the user-friendly menu drivers are presented.

  20. Industrial linguistic control

    SciTech Connect

    King, R.E.; Karonis, F.

    1983-01-01

    The use of various types of controllers and control techniques for industrial processes is discussed. An ongoing research and development project is reported on the application of intelligent linguistic controllers to processes in the cement industry in Greece which have, in the past, been controllable only by human operators. Prototype linguistic controllers using fuzzy logic have been implemented and tested on a rotary kiln precalciner flash furnace (3-input, 3-output) and on a cement mill separator (3-input, 2-output) with good results. Originally implemented on a supervisory minicomputer, the algorithms have been transferred to microcomputers which form the heart of this class of intelligent linguistic controllers. 6 references.
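
    A minimal sketch of the fuzzy-logic control idea referred to above: fuzzify an error signal with triangular membership functions, apply a small linguistic rule base, and defuzzify to a crisp corrective action. The single-input rule set and scalings are illustrative assumptions; the kiln and separator controllers described in the paper were multivariable.

        # Toy single-input fuzzy (linguistic) controller with Sugeno-style defuzzification.
        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_action(error):
            mu = {                                   # fuzzify the error
                "negative": tri(error, -2.0, -1.0, 0.0),
                "zero":     tri(error, -1.0,  0.0, 1.0),
                "positive": tri(error,  0.0,  1.0, 2.0),
            }
            consequent = {"negative": +1.0, "zero": 0.0, "positive": -1.0}  # rule outputs
            weight = sum(mu.values())
            return sum(mu[k] * consequent[k] for k in mu) / weight if weight else 0.0

        for e in (-1.5, -0.3, 0.0, 0.8):
            print(f"error={e:+.1f} -> action={fuzzy_action(e):+.2f}")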

  1. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  2. Gait Analysis Laboratory

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A complete motion analysis laboratory has evolved out of analyzing the walking patterns of crippled children at Stanford Children's Hospital. Data are collected by placing tiny electrical sensors over muscle groups of the child's legs and inserting step-sensing switches in the soles of shoes. Miniature radio transmitters send signals to a receiver for continuous recording of the abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to further reduce the size and weight of the telemetry system, as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  3. UNIX-based data management system for the Mobile Satellite Propagation Experiment (PiFEx)

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.

    1987-01-01

    A new method is presented for handling data resulting from Mobile Satellite propagation experiments such as the Pilot Field Experiment (PiFEx) conducted by JPL. This method uses the UNIX operating system and C programming language. The data management system is implemented on a VAX minicomputer. The system automatically divides the large data file housing data from various experiments under a predetermined format into various individual files containing data from each experiment. The system also has a number of programs written in C and FORTRAN languages to allow the researcher to obtain meaningful quantities from the data at hand.

  4. Integrating the university medical center. Phase one: providing an information backbone.

    PubMed Central

    Berry, S. J.; Reber, E.; Offeman, W. E.

    1991-01-01

    The UCLA School of Medicine represents a diverse computing community where the creation of each individual network has been driven by applications, price/performance, and functionality. Indeed, the ability to connect to other computers has had no bearing on selection. Yet there exists a need to seamlessly connect the individual networks to other minicomputers, mainframes, and remote computers. We have created a school-wide backbone network that will enable an individual from a single workstation to access a wide variety of services residing on any number of machines. PMID:1807658

  5. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, Berthold W. [Livermore, CA; Willenborg, David L. [Livermore, CA

    1980-02-12

    A manipulator which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, allowing inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a mini-computer which can be programmed to generate any desired scan pattern.

  6. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, B.W.; Willenborg, D.L.

    1980-02-12

    A manipulator is disclosed which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, and which allows inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a mini-computer which can be programmed to generate any desired scan pattern. 8 figs.

  7. Alternatives in the complement and structure of NASA teleprocessing resources

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results are presented of a program to identify technical innovations which would have an impact on NASA data processing and describe as fully as possible the development work necessary to exploit them. Seven of these options for NASA development, as the opportunities to participate in and enhance the advancing information system technology were called, are reported. A detailed treatment is given of three of the options, involving minicomputers, mass storage devices and software development techniques. These areas were picked by NASA as having the most potential for improving their operations.

  8. User's operating procedures. Volume 2: Scout project financial analysis program

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  9. Energy Conservation and BP

    E-print Network

    Partridge, R. W.

    1982-01-01

    , refinery fuel gas as an atomising medium, combustion of low BTU gas and combustion of difficult fuels (e.g. decanted oil containing catalyst). IMPROVED PROCESS CONTROL The advent of small and relatively inexpensive minicomputer systems has led... of the dynamic pressure of a distillation column is achieved by operating the air condenser for maximum cooling, resulting in a day-to-night variation in the overhead receiver temperature. The computer calculates the optimum pressure corresponding...

  10. Programming for energy monitoring/display system in multicolor lidar system research

    NASA Technical Reports Server (NTRS)

    Alvarado, R. C., Jr.; Allen, R. J.

    1982-01-01

    The Z80 microprocessor based computer program that directs and controls the operation of the six channel energy monitoring/display system that is a part of the NASA Multipurpose Airborne Differential Absorption Lidar (DIAL) system is described. The program is written in the Z80 assembly language and is located on EPROM memories. All source and assembled listings of the main program, five subroutines, and two service routines along with flow charts and memory maps are included. A combinational block diagram shows the interfacing (including port addresses) between the six power sensors, displays, front panel controls, the main general purpose minicomputer, and this dedicated microcomputer system.

  11. Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data

    NASA Technical Reports Server (NTRS)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

    Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.

  12. Close to real life. [solving for transonic flow about lifting airfoils using supercomputers]

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Bailey, F. Ron

    1988-01-01

    NASA's Numerical Aerodynamic Simulation (NAS) facility for CFD modeling of highly complex aerodynamic flows employs as its basic hardware two Cray-2s, an ETA-10 Model Q, an Amdahl 5880 mainframe computer that furnishes both support processing and access to 300 Gbytes of disk storage, several minicomputers and superminicomputers, and a Thinking Machines 16,000-device 'connection machine' processor. NAS, which was the first supercomputer facility to standardize operating-system and communication software on all processors, has done important Space Shuttle aerodynamics simulations and will be critical to the configurational refinement of the National Aerospace Plane and its integrated powerplant, which will involve complex, high temperature reactive gasdynamic computations.

  13. Development and Implementation of Kumamoto Technopolis Regional Database T-KIND

    NASA Astrophysics Data System (ADS)

    Onoue, Noriaki

    T-KIND (Techno-Kumamoto Information Network for Data-Base) is a system for effectively searching information on the technology, human resources, and industries which are necessary to realize Kumamoto Technopolis. It is composed of a coded database, an image database, and a LAN inside the technoresearch park which is the center of R & D in the Technopolis. It constructs an on-line system by networking general-purpose computers, minicomputers, optical disk file systems and so on, and provides the service through the public telephone line. Two databases are now available, on enterprise information and human resource information. The former covers about 4,000 enterprises, and the latter about 2,000 persons.

  14. An experimental study of a hybrid adaptive control system

    NASA Technical Reports Server (NTRS)

    Lizewski, E. F.; Monopoli, R. V.

    1974-01-01

    A Liapunov type model reference adaptive control system with five adjustable gains is implemented using a PDP-11 digital computer and an EAI 380 analog computer. The plant controlled is a laboratory-type dc servo system. It is made to follow closely a second order linear model. The experimental results demonstrate the feasibility of implementing this rather complex design using only a minicomputer and a reasonable number of operational amplifiers. They also point out that satisfactory performance can be achieved even when certain assumptions necessary for the theory are not satisfied.

  15. Simplified extension of the LSI-11 Q-Bus for a high energy laser control application

    SciTech Connect

    Burczyk, L.

    1981-01-01

    Antares, a large, experimental laser fusion facility under construction at Los Alamos National Laboratory in New Mexico, is controlled by a network of PDP-11 minicomputers and microprocessors. The remote nodes of the Antares control network are based on an LSI-11/2 microcomputer interfaced to an STD Bus. This machine interface or MI forms the intelligent process controller located directly adjacent to the many diverse laser subsystem devices. The STD Bus, linked to the LSI-11/2 microcomputer, offers a standardized, cost effective means for the development of the specialized interface functions required for the high energy laser environment.

  16. A new system for observing solar oscillations at the Mount Wilson Observatory. I - System design and installation

    NASA Technical Reports Server (NTRS)

    Rhodes, E. J., Jr.; Howard, R. F.; Ulrich, R. K.; Smith, E. J.

    1983-01-01

    An observation system designed to obtain daily measurements of solar photospheric and subphotospheric rotational velocities, from the frequency splitting of nonradial solar p-mode oscillations of degree greater than 150, is nearing completion at the Mount Wilson Observatory. The system will combine a 244 x 248 pixel CID camera with a high speed floating point array processor, a 32-bit minicomputer, and a large capacity disk storage system. These components will be integrated into the spectrograph of the 60-foot solar tower telescope at Mount Wilson.

  17. Digital system for structural dynamics simulation

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.

    1982-01-01

    State-of-the-art digital hardware and software were developed for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems). They were incorporated in a system designed to use an array of processors in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communication busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest neighbor node processors to simulate the actual physical interface of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  18. Technology innovation and management in the US Bureau of the Census: Discussion and recommendations

    SciTech Connect

    Tonn, B.; Edwards, R.; Goeltz, R.; Hake, K.

    1990-09-01

    This report contains a set of recommendations prepared by Oak Ridge National Laboratory (ORNL) for the US Bureau of the Census pertaining to technology innovation and management. Technology has the potential to benefit the Bureau's data collection, capture, processing, and analysis activities. The entire Bureau was represented, from the Decennial Census to Economic Programs, along with various levels of Bureau management and numerous experts in technology. Throughout the Bureau, workstations, minicomputers, and microcomputers have found their place alongside the Bureau's mainframes. The Bureau's new computer file structure, called the Topologically Integrated Geographic Encoding and Referencing data base (TIGER), represents a major innovation in geographic information systems, and impressive progress has been made with Computer Assisted Telephone Interviewing (CATI). Other innovations, such as SPRING, which aims to provide Bureau demographic analysts with the capability of interactive data analysis on minicomputers, are in the initial stages of development. The recommendations fall into five independent, but mutually beneficial, categories: (1) disband the ADP Steering Committee and replace it with The Technology Forum; (2) establish a Technology Review Committee (TRC), to be composed of technology experts from outside the Bureau; (3) designate technological gurus, individuals who will be the Bureau's experts in new and innovative technologies; (4) adopt a technology innovation process; and (5) establish an Advanced Technology Studies Staff (ATSS) to promote technology transfer, obtain funding for technological innovation, manage innovation projects unable to find a home in other divisions, evaluate innovations that cut across Bureau organizational boundaries, and provide input into Bureau technology analyses. (JF)

  19. Distributed information system (water fact sheet)

    USGS Publications Warehouse

    Harbaugh, A.W.

    1986-01-01

    During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management, hydrologic analysis, and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis accounts for 40% of the computational workload. Cost accounting, payroll, personnel records, and planning for WRD programs occupy an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)

  20. A multiprocessor airborne lidar data system

    NASA Technical Reports Server (NTRS)

    Wright, C. W.; Bailey, S. A.; Heath, G. E.; Piazza, C. R.

    1988-01-01

    A new multiprocessor data acquisition system was developed for the existing Airborne Oceanographic Lidar (AOL). This implementation simultaneously utilizes five single-board 68010 microcomputers, the UNIX System V operating system, and the real-time executive VRTX. The original data acquisition system was implemented on a Hewlett Packard HP 21-MX 16-bit minicomputer using a multi-tasking real-time operating system and a mixture of assembly and FORTRAN languages. The present collection of data sources produces data at widely varied rates and requires varied amounts of burdensome real-time processing and formatting, so it was decided to replace the aging HP 21-MX minicomputer with a multiprocessor system. A new and flexible recording format was devised and implemented to accommodate the constantly changing sensor configuration. A central feature of this data system is the minimization of non-remote-sensing bus traffic; it is therefore highly desirable that each micro be capable of functioning as much as possible on-card or via private peripherals. The bus is used primarily for the transfer of remote sensing data to or from the buffer queue.

  1. Quality control in a deterministic manufacturing environment

    SciTech Connect

    Barkman, W.E.; Babelay, E.F.; De Mint, P.D.; Lewis, J.C.; Woodard, L.M.

    1985-01-24

    An approach for establishing quality control in processes which exhibit undesired continual or intermittent excursions in key process parameters is discussed. The method is called deterministic manufacturing, and it is designed to employ automatic monitoring of the key process variables for process certification, but utilizes only sample certification of the process output to verify the validity of the measurement process. The system utilizes a local minicomputer to sample the appropriate process parameters that describe the condition of the machine tool, the cutting process, and the computer numerical control system. Sampled data are pre-processed by the minicomputer and then sent to a host computer that maintains a permanent data base describing the manufacturing conditions for each work piece. Parts are accepted if the various parameters remain within the required limits during the machining cycle. The need for additional actions is flagged if limits are exceeded. With this system it is possible to retrospectively examine the process status just prior to the occurrence of a problem. (LEW)
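
    The certification logic described above reduces to a simple check: accept a part only if every monitored parameter stayed within its limits for the entire machining cycle, otherwise flag it for review. The sketch below illustrates that idea; the parameter names and limits are invented, not the actual process variables.

        # Hypothetical accept/flag certification over sampled process parameters.
        LIMITS = {
            "spindle_temp_C":     (20.0, 45.0),
            "tool_vibration_g":   (0.0, 1.5),
            "following_error_mm": (0.0, 0.01),
        }

        def certify(samples):
            """samples: list of {parameter: value} dicts taken during one cycle."""
            violations = [(i, name, s[name])
                          for i, s in enumerate(samples)
                          for name, (lo, hi) in LIMITS.items()
                          if not lo <= s[name] <= hi]
            return ("ACCEPT", violations) if not violations else ("FLAG", violations)

        status, details = certify([
            {"spindle_temp_C": 30.2, "tool_vibration_g": 0.4, "following_error_mm": 0.002},
            {"spindle_temp_C": 46.1, "tool_vibration_g": 0.5, "following_error_mm": 0.002},
        ])
        print(status, details)   # FLAG, with the out-of-limit sample identified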

  2. Computer network that assists in the planning, execution and evaluation of in-reactor experiments

    SciTech Connect

    Bauer, T.H.; Froehle, P.H.; August, C.; Baldwin, R.D.; Johanson, E.W.; Kraimer, M.R.; Simms, R.; Klickman, A.E.

    1985-01-01

    For over 20 years complex, in-reactor experiments have been performed at Argonne National Laboratory (ANL) to investigate the performance of nuclear reactor fuel and to support the development of large computer codes that address questions of reactor safety in full-scale plants. Not only are computer codes an important end-product of the research, but computer analysis is also involved intimately at most stages of experiment planning, data reduction, and evaluation. For instance, many experiments are of sufficiently long duration or, if they are of brief duration, occur in such a purposeful sequence that the need for speedy availability of on-line data is paramount. This is made possible most efficiently by computer-assisted displays and evaluation. A purposeful linking of mainframe, mini-, and microcomputers has been effected over the past eight years which greatly enhances the speed with which experimental data are reduced to useful forms and applied to the relevant technological issues. This greater efficiency in data management has also led to improvements in the planning and execution of subsequent experiments. Raw data from experiments performed at INEL are stored directly on disk and tape with the aid of minicomputers. Either during or shortly after an experiment, data may be transferred, via a direct link, to the Illinois offices of ANL where the data base is stored on a minicomputer system. This Idaho-to-Illinois link has both enhanced experiment performance and allowed rapid dissemination of results.

  3. Prescriptive concepts for advanced nuclear materials control and accountability systems

    SciTech Connect

    Whitty, W.J.; Strittmatter, R.B.; Ford, W.; Tisinger, R.M.; Meyer, T.H.

    1987-06-01

    Networking- and distributed-processing hardware and software have the potential of greatly enhancing nuclear materials control and accountability (MC and A) systems, from both safeguards and process operations perspectives, while allowing timely integrated safeguards activities and enhanced computer security at reasonable cost. A hierarchical distributed system is proposed consisting of groups of terminals and instruments in plant production and support areas connected to microprocessors that are connected to either larger microprocessors or minicomputers. These micros and/or minis are in turn connected to a main machine, which might be either a mainframe or a super minicomputer. Data acquisition, preliminary input data validation, and transaction processing occur at the lowest level. Transaction buffering, resource sharing, and selected data processing occur at the intermediate level. The host computer maintains overall control of the data base and provides routine safeguards and security reporting and special safeguards analyses. The research described outlines the distribution of MC and A system requirements in the hierarchical system and distributed processing applied to MC and A. Implications of integrated safeguards and computer security concepts for the distributed system design are discussed. 10 refs., 4 figs.

  4. Refractive index and absorption detector for liquid chromatography based on Fabry-Perot interferometry

    DOEpatents

    Yeung, Edward S. (Ames, IA); Woodruff, Steven D. (Ames, IA)

    1984-06-19

    A refractive index and absorption detector for liquid chromatography. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow-cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a mini-computer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, and then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded.

  5. Automated search for supernovae

    SciTech Connect

    Kare, J.T.

    1984-11-15

    This thesis describes the design, development, and testing of a search system for supernovae, based on the use of current computer and detector technology. This search uses a computer-controlled telescope and charge coupled device (CCD) detector to collect images of hundreds of galaxies per night of observation, and a dedicated minicomputer to process these images in real time. The system is now collecting test images of up to several hundred fields per night, with a sensitivity corresponding to a limiting magnitude (visual) of 17. At full speed and sensitivity, the search will examine some 6000 galaxies every three nights, with a limiting magnitude of 18 or fainter, yielding roughly two supernovae per week (assuming one supernova per galaxy per 50 years) at 5 to 50 percent of maximum light. An additional 500 nearby galaxies will be searched every night, to locate about 10 supernovae per year at one or two percent of maximum light, within hours of the initial explosion.
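
    The quoted discovery rate follows from simple arithmetic, sketched below under the abstract's stated assumption of one supernova per galaxy per 50 years and continuous monitoring of the same sample of galaxies.

        # Back-of-envelope check of the expected supernova yield.
        monitored_galaxies = 6000           # galaxies examined every three nights
        sn_per_galaxy_per_year = 1 / 50     # assumed supernova rate
        sn_per_year = monitored_galaxies * sn_per_galaxy_per_year   # 120 per year
        print(sn_per_year / 52, "supernovae per week")              # about 2.3, i.e. 'roughly two'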

  6. An image-processing system applied to earth-resource imagery

    NASA Technical Reports Server (NTRS)

    Carter, P.; Gardner, W. E.

    1977-01-01

    The Harwell Image Processing System (HIPS) has been adapted for processing earth-resource imagery in either film or tape format. Data from film are obtained using a computer-controlled flying-spot scanner. Local rapid interactive processing is based on a PDP 11/20 minicomputer which has suitable display facilities for immediate visual appraisal of results and also a fast data link to an IBM 370/168 computer complex. An extensive subroutine library is being assembled for data preprocessing and feature extraction. This chapter includes a discussion of the basic principles of image analysis, a description of the HIPS system, and finally, for illustrative purposes, a description of several simple software routines.

  7. State-of-the-art Monte Carlo 1988

    SciTech Connect

    Soran, P.D.

    1988-06-28

    Particle transport calculations in highly dimensional and physically complex geometries, such as detector calibration, radiation shielding, space reactors, and oil-well logging, generally require Monte Carlo transport techniques. Monte Carlo particle transport can be performed on a variety of computers ranging from APOLLOs to VAXs. Some of the hardware and software developments, which now permit Monte Carlo methods to be routinely used, are reviewed in this paper. The development of inexpensive, large, fast computer memory, coupled with fast central processing units, permits Monte Carlo calculations to be performed on workstations, minicomputers, and supercomputers. The Monte Carlo renaissance is further aided by innovations in computer architecture and software development. Advances in vectorization and parallelization architecture have resulted in the development of new algorithms which have greatly reduced processing times. Finally, the renewed interest in Monte Carlo has spawned new variance reduction techniques which are being implemented in large computer codes. 45 refs.

  8. Observer performance measured against hybrid compressed video imagery

    NASA Astrophysics Data System (ADS)

    Swistak, J. E.

    1981-03-01

    A study was conducted in which the effects of bits per pixel reductions on target detection and recognition were measured. A hybrid DCT/DPCM compression algorithm was used to manipulate the bit-per-pixel reduction rate. The algorithm was implemented in the form of a software simulation package on a general-purpose minicomputer facility. Using this facility, real-time imagery was reduced from analog to 8- and 2-bit-per-pixel levels at 1-frame-per-second update rates. Observers were asked to detect and recognize targets located in the imagery. Average detection and recognition slant ranges were calculated for each target and compression level. No significant differences in performance were noted due to the different compression levels.

  9. Development of a multiplane multispeed balancing system for turbine systems

    NASA Technical Reports Server (NTRS)

    Martin, M. R.

    1984-01-01

    A prototype high speed balancing system was developed for assembled gas turbine engine modules. The system permits fully assembled gas turbine modules to be operated and balanced at selected speeds up to full turbine speed. The balancing system is a complete stand-alone system providing all necessary lubrication and support hardware for full speed operation. A variable speed motor provides the drive power. A drive belt and gearbox provide rotational speeds up to 21,000 rpm inside a vacuum chamber. The heart of the system is a dedicated minicomputer with attendant data acquisition, storage, and I/O devices. The computer is programmed to be completely interactive with the operator. The system was installed at CCAD and evaluated by testing 20 T55 power turbines and 20 T53 power turbines. Engine test results verified the performance of the high speed balanced turbines.

  10. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.

  11. Composite structural materials. [aircraft structures]

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1980-01-01

    The use of filamentary composite materials in the design and construction of primary aircraft structures is considered with emphasis on efforts to develop advanced technology in the areas of physical properties, structural concepts and analysis, manufacturing, and reliability and life prediction. The redesign of a main spar/rib region on the Boeing 727 elevator near its actuator attachment point is discussed. A composite fabrication and test facility is described, as well as the use of minicomputers for computer aided design. Other topics covered include (1) advanced structural analysis methods for composites; (2) ultrasonic nondestructive testing of composite structures; (3) optimum combination of hardeners in the cure of epoxy; (4) fatigue in composite materials; (5) resin matrix characterization and properties; (6) postbuckling analysis of curved laminate composite panels; and (7) acoustic emission testing of composite tensile specimens.

  12. Selecting a labor information system. What to ask, what to avoid.

    PubMed

    Garcia, L

    1990-12-01

    Payroll expenses may account for over half of all of a hospital's expenses. Manual time card processing requires an abundance of staff time and can often result in costly errors. To alleviate this problem, many healthcare facilities are implementing computerized labor information systems. To minimize the risk of selecting the wrong system, hospital administrators should ask the following questions before committing to any computerized labor information system: Is the software designed for hospital use and easily adaptable to each hospital's unique policies? How flexible is the software's reporting system? Does it include automatic scheduling that creates generic schedules? Does the system have the capability of securing time and attendance records and documenting the audit trail? Does the system include an accurate and reliable badge reader? What type of hardware is best for the particular hospital--microcomputer, minicomputer, or mainframe? Finally, to guarantee successful software installation, the vendor should have extensive experience and documentation in the system's implementation. PMID:10108009

  13. Safeguards instrumentation: past, present, future

    SciTech Connect

    Higinbotham, W.A.

    1982-01-01

    Instruments are essential for accounting, for surveillance, and for protection of nuclear materials. The development and application of such instrumentation are reviewed, with special attention to international safeguards applications. Active and passive nondestructive assay techniques are some 25 years of age. The important advances have been in learning how to use them effectively for specific applications, accompanied by major advances in radiation detectors, electronics, and, more recently, in mini-computers. Progress in seals has been disappointingly slow. Surveillance cameras have been widely used for many applications other than safeguards. The revolution in TV technology will have important implications. More sophisticated containment/surveillance equipment is being developed but has yet to be exploited. On the basis of this history, some expectations for instrumentation in the near future are presented.

  14. Correction factors for on-line microprobe analysis of multielement alloy systems

    NASA Technical Reports Server (NTRS)

    Unnam, J.; Tenney, D. R.; Brewer, W. D.

    1977-01-01

    An on-line correction technique was developed for the conversion of electron probe X-ray intensities into concentrations of emitting elements. This technique consisted of off-line calculation and representation of binary interaction data which were read into an on-line minicomputer to calculate variable correction coefficients. These coefficients were used to correct the X-ray data without significantly increasing computer core requirements. The binary interaction data were obtained by running Colby's MAGIC 4 program in the reverse mode. The data for each binary interaction were represented by polynomial coefficients obtained by least-squares fitting a third-order polynomial. Polynomial coefficients were generated for most of the common binary interactions at different accelerating potentials and are included. Results are presented for the analyses of several alloy standards to demonstrate the applicability of this correction procedure.
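
    As an illustration of the abstract's data-reduction step, the sketch below fits a third-order polynomial to a binary-interaction curve by least squares and then uses the four coefficients on-line to convert a measured intensity ratio to concentration. The sample points are synthetic, not MAGIC 4 output.

        # Third-order least-squares fit of a hypothetical binary-interaction curve.
        import numpy as np

        k_ratio       = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])       # measured X-ray intensity ratio
        concentration = np.array([0.0, 0.17, 0.36, 0.57, 0.78, 1.0])   # corresponding mass fraction

        coeffs = np.polyfit(k_ratio, concentration, deg=3)   # four coefficients, highest power first
        print("cubic coefficients:", coeffs)
        print("C(k=0.5) ~", np.polyval(coeffs, 0.5))         # on-line correction lookup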

  15. Remote sensing information sciences research group: Browse in the EOS era

    NASA Technical Reports Server (NTRS)

    Estes, John E.; Star, Jeffrey L.

    1989-01-01

    The problem of science data browse was examined. Given the tremendous data volumes that are planned for future space missions, particularly the Earth Observing System in the late 1990's, the need for access to large spatial databases must be understood. Work was continued to refine the concept of data browse. Further, software was developed to provide a testbed of the concepts, both to locate possibly interesting data and to view a small portion of the data. Build II was placed on a minicomputer and a PC in the laboratory, and accounts were provided for use in the testbed. Consideration of the testbed software as an element of in-house data management plans was begun.

  16. High-performance control system for a heavy-ion medical accelerator

    SciTech Connect

    Lancaster, H.D.; Magyary, S.B.; Sah, R.C.

    1983-03-01

    A high performance control system is being designed as part of a heavy ion medical accelerator. The accelerator will be a synchrotron dedicated to clinical and other biomedical uses of heavy ions, and it will deliver fully stripped ions at energies up to 800 MeV/nucleon. A key element in the design of an accelerator which will operate in a hospital environment is to provide a high performance control system. This control system will provide accelerator modeling to facilitate changes in operating mode, provide automatic beam tuning to simplify accelerator operations, and provide diagnostics to enhance reliability. The control system being designed utilizes many microcomputers operating in parallel to collect and transmit data; complex numerical computations are performed by a powerful minicomputer. In order to provide the maximum operational flexibility, the Medical Accelerator control system will be capable of dealing with pulse-to-pulse changes in beam energy and ion species.

  17. Assessment of the Accuracy and Computing Speed of Simplified Saturation Vapor Equations Using a New Reference Dataset.

    NASA Astrophysics Data System (ADS)

    Gueymard, Christian

    1993-07-01

    A revised saturation vapor dataset is proposed for use in meteorology. Based on new engineering data of the American Society of Heating, Refrigerating, and Air-Conditioning Engineers for temperatures above 0°C, it should supersede the older Smithsonian and World Meteorological Organization meteorological tables. Simple new equations are proposed to compute the saturation vapor pressure over water between -50° and 50°C. Their accuracy is shown to be excellent over this range, with an rms error of 3 × 10⁻³ mb and an average relative error of 0.02%. Detailed statistics describing the accuracy performance of 22 other equations are presented, and the speed performance of all these equations is assessed. Nested polynomials are shown to provide both good accuracy and computational speed. On a modern minicomputer, a single evaluation of saturation vapor pressure may take less than 1 µs of CPU time, 15 times less than required by the Goff-Gratch equations that were used to construct the meteorological tables.
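
    The speed advantage of nested polynomials comes from Horner's rule: one multiply and one add per coefficient, with no explicit powers. The sketch below evaluates a saturation vapor pressure polynomial in that form; the coefficients are illustrative values chosen only to roughly reproduce es over 0-40°C, not the paper's fitted equations.

        # Nested (Horner) evaluation of a polynomial es(t), t in deg C, es in mb.
        def es_nested(t, coeffs):
            """coeffs ordered from highest degree down to the constant term."""
            result = 0.0
            for c in coeffs:
                result = result * t + c      # one multiply + one add per coefficient
            return result

        # Illustrative 4th-degree coefficients (hypothetical, for demonstration only)
        C = [3.0e-6, 2.6e-4, 1.45e-2, 4.4e-1, 6.11]
        print(es_nested(20.0, C), "mb at 20 °C (about 23 mb; the accepted value is ~23.4 mb)")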

  18. Total ozone determination by spectroradiometry in the middle ultraviolet

    NASA Technical Reports Server (NTRS)

    Garrison, L. M.; Doda, D. D.; Green, A. E. S.

    1979-01-01

    A method has been developed to determine total ozone from multispectral measurements of the direct solar irradiance. The total ozone is determined by a least squares fit to the spectrum between 290 nm and 380 nm. The aerosol extinction is accounted for by expanding it in a power series in wavelength; use of the linear term proved adequate. A mobile laboratory incorporating a sky scanner has been developed and used to obtain data to verify the method. Sun tracking, wavelength setting of the double monochromator, and data acquisition are under control of a minicomputer. Results obtained at Wallops Island, Virginia, and Palestine, Texas, agree well with simultaneous Dobson and Canterbury spectrometer and balloon ECC ozonesonde values. The wavelength calibration of the monochromator and the values for the normalized ozone absorption are the most important factors in an accurate determination of total ozone.
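
    The retrieval described above is linear once the Beer-Lambert attenuation is written out: the log ratio of extraterrestrial to measured irradiance, divided by air mass, equals the ozone amount times its absorption coefficient plus a constant-plus-linear aerosol term, so total ozone follows from a linear least-squares fit. The absorption curve and simulated observations below are synthetic assumptions, not the instrument's calibration data.

        # Least-squares retrieval of total ozone with a linear aerosol term (synthetic data).
        import numpy as np

        wav = np.linspace(0.30, 0.38, 20)                    # wavelength, micrometres
        alpha_o3 = 40.0 * np.exp(-(wav - 0.30) / 0.012)      # assumed ozone absorption, per atm-cm
        X_true, b0_true, b1_true = 0.32, 0.25, -0.40         # "true" ozone (atm-cm) and aerosol terms

        y = X_true * alpha_o3 + b0_true + b1_true * wav      # y = ln(I0/I) / air mass
        y += np.random.default_rng(0).normal(0.0, 0.002, y.size)

        G = np.column_stack([alpha_o3, np.ones_like(wav), wav])
        X_fit, b0_fit, b1_fit = np.linalg.lstsq(G, y, rcond=None)[0]
        print(f"retrieved total ozone: {X_fit:.3f} atm-cm (true {X_true})")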

  19. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  20. Oxygen analyzer

    DOEpatents

    Benner, William H. (Danville, CA)

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  1. Computerized data acquisition and analysis for measuring thermal diffusivity. [in thermoelectric space applications materials]

    NASA Technical Reports Server (NTRS)

    Chmielewski, A.; Wood, C.; Vandersande, J.

    1985-01-01

    JPL has been leading a concentrated effort to develop improved thermoelectric materials for space applications. Thermoelectric generators are an attractive source of electrical energy for space power because of their lack of moving parts and slow degradation of performance. A thermoelectric material is characterized by its Seebeck coefficient, electrical resistivity, and thermal conductivity. Measuring the high temperature thermal conductivity directly is experimentally very difficult; however, it can be calculated from the specific heat and thermal diffusivity, which are easier to measure at high temperatures, especially using the flash method. Data acquisition and analysis for this experiment were automated at JPL using inexpensive microcomputer equipment. This approach is superior to tedious and less accurate manual analysis of data. It is also preferred to previously developed systems utilizing expensive minicomputers or mainframes.
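
    A small worked example of the relation the abstract relies on: the flash method yields thermal diffusivity from the sample thickness and the half-rise time of the rear-face temperature, and conductivity then follows as k = alpha * rho * cp. The numerical values below are invented for illustration, not JPL material data.

        # Flash-method diffusivity and the conductivity it implies (hypothetical numbers).
        L = 2.0e-3                         # sample thickness, m
        t_half = 0.050                     # time to half of rear-face temperature rise, s
        alpha = 0.1388 * L**2 / t_half     # Parker flash-method diffusivity, m^2/s

        rho = 7.0e3                        # density, kg/m^3
        cp = 250.0                         # specific heat, J/(kg K)
        k = alpha * rho * cp               # thermal conductivity, W/(m K)
        print(f"alpha = {alpha:.2e} m^2/s, k = {k:.1f} W/(m K)")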

  2. Binary chromatographic data and estimation of adsorbent porosities. [data for system n-heptane/n-pentane

    NASA Technical Reports Server (NTRS)

    Meisch, A. J.

    1972-01-01

    Data for the system n-pentane/n-heptane on porous Chromosorb-102 adsorbent were obtained at 150, 175, and 200 C for mixtures containing zero to 100% n-pentane by weight. Prior results showing limitations on superposition of pure component data to predict multicomponent chromatograms were verified. The thermodynamic parameter MR0 was found to be a linear function of sample composition. A nonporous adsorbent failed to separate the system because of large input sample dispersions. A proposed automated data processing scheme involving magnetic tape recording of the detector signals and processing by a minicomputer was rejected because of resolution limitations of the available a/d converters. Preliminary data on porosity and pore size distributions of the adsorbents were obtained.

  3. Thermal systems analysis for the Space Infrared Telescope Facility dewar

    NASA Technical Reports Server (NTRS)

    Bhandari, Pradeep; Petrick, S. W.; Schember, Helene

    1991-01-01

    Thermal systems analysis models were used to design the SFHe-cooled dewar for the Space Infrared Telescope Facility (SIRTF), a 1 m class cryogenically cooled observatory for IR astronomy. The models are capable of computing both the heat leaks into the dewar and the operating temperature of the SFHe tank. They are aimed at predicting the ability of the SIRTF cryogenic system to satisfy a five-year mission lifetime requirement and to maintain the SFHe tank operating temperature of 1.25 K, which provides sufficient cooling for the science instruments and the optical system. The thermal models are very detailed and very fast, with a typical steady-state run taking about 20 sec on a VAX minicomputer.

  4. FINDS: A fault inferring nonlinear detection system. User's guide

    NASA Technical Reports Server (NTRS)

    Lancraft, R. E.; Caglayan, A. K.

    1983-01-01

    The computer program FINDS is written in FORTRAN-77, and is intended for operation on a VAX 11-780 or 11-750 super minicomputer, using the VMS operating system. The program detects, isolates, and compensates for failures in navigation aid instruments and onboard flight control and navigation sensors of a Terminal Configured Vehicle aircraft in a Microwave Landing System environment. In addition, FINDS provides sensor fault tolerant estimates for the aircraft states which are then used by an automatic guidance and control system to land the aircraft along a prescribed path. FINDS monitors for failures by evaluating all sensor outputs simultaneously using the nonlinear analytic relationships between the various sensor outputs arising from the aircraft point mass equations of motion. Hence, FINDS is an integrated sensor failure detection and isolation system.

  5. Tracing technology in the Association of Academic Health Sciences Libraries

    PubMed Central

    Guard, J. Roger; Peay, Wayne J.

    2003-01-01

    From the beginning of the association, technology and the Association of Academic Health Sciences Libraries (AAHSL) have been intertwined. Technology was the focus of one of the first committees. Innovative applications of technology have been employed in the operations of the association. Early applications of mini-computers were used in preparing the Annual Statistics. The association's use of network communications was among the first in the country and later applications of the Web have enhanced association services. For its members, technology has transformed libraries. The association's support of the early development of Integrated Advanced Information Management Systems (IAIMS) and of its recent reconceptualization has contributed to the intellectual foundation for this revolution. PMID:12883580

  6. Automation in photogrammetry: Recent developments and applications (1972-1976)

    USGS Publications Warehouse

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

    An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system. ?? 1976.

  7. Computer-controlled system for rapid soil analysis of ²²⁶Ra

    SciTech Connect

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the ²²⁶Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a mini-computer with a dual floppy disk storage medium. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer and BASIC language is used for data processing.
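
    The abstract does not give the counting equations; a rapid estimate of this kind is typically obtained by converting net counts in a gamma-ray region of interest to an activity concentration with a calibration factor. The sketch below illustrates that generic reduction only; the efficiency, mass, and count values are placeholders, not values from the ORNL system.

        def ra226_concentration(gross_counts, background_counts, live_time_s,
                                sample_mass_g, counts_per_bq_s):
            """Estimate the 226Ra activity concentration (Bq/g) from well-counter
            data: net count rate divided by detection efficiency and sample mass."""
            net_rate = (gross_counts - background_counts) / live_time_s   # counts/s
            activity_bq = net_rate / counts_per_bq_s                      # Bq
            return activity_bq / sample_mass_g                            # Bq/g

        # Placeholder numbers for illustration only.
        print(ra226_concentration(gross_counts=52000, background_counts=4000,
                                  live_time_s=600, sample_mass_g=500,
                                  counts_per_bq_s=0.25))                  # -> 0.64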

  8. Interactive graphical data analysis. Progress report, March 25, 1981-March 24, 1982

    SciTech Connect

    Bloomfield, P.; Tukey, J.W.

    1982-03-01

    Efforts during the first year of this project have emphasized developing a new data-analysis system, developing new algorithms for analyzing multidimensional data, and installing a new minicomputer to support both of these methodological developments. The grammar of the user language for the data-analysis system has been defined, and the specification of the data-structures that it will manipulate has been partly completed. Work on data analysis algorithms has focused on two areas: an algorithm to assist a data analyst in finding interesting projections of multidimensional data, and an application of canonical correlations to investigating the structure of a time series. A philosophy for data-modification display systems, focused on PRIM-81, has been developed. A class of techniques for curve-isolation in the presence of a noise background has been considered. The use of simple functions in fitting non-linear behavior remains to be expanded and improved.

  9. A method for diagnosing surface parameters using geostationary satellite imagery and a boundary-layer model. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Polansky, A. C.

    1982-01-01

    A method for diagnosing surface parameters on a regional scale via geosynchronous satellite imagery is presented. Moisture availability, thermal inertia, atmospheric heat flux, and total evaporation are determined from three infrared images obtained from the Geostationary Operational Environmental Satellite (GOES). Three GOES images (early morning, midafternoon, and night) are obtained from computer tape. Two temperature-difference images are then created. The boundary-layer model is run, and its output is inverted via cubic regression equations. The satellite imagery is efficiently converted into output-variable fields. All computations are executed on a PDP 11/34 minicomputer. Output fields can be produced within one hour of the availability of aligned satellite subimages of a target area.
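
    The inversion step described above can be illustrated with a small sketch: run the boundary-layer model over a range of a surface parameter, fit a cubic regression mapping the model's predicted temperature difference back to that parameter, and apply the fit to every pixel of the GOES temperature-difference image. The model stand-in and coefficients below are hypothetical, not the thesis model.

        import numpy as np

        # Step 1: exercise the boundary-layer model over a range of moisture
        # availabilities and record the day-night temperature difference it
        # predicts (a toy quadratic stands in for the real model here).
        moisture = np.linspace(0.05, 1.0, 20)
        delta_t_model = 18.0 - 14.0 * moisture + 2.0 * moisture**2     # kelvin

        # Step 2: fit a cubic regression that inverts the model,
        # giving moisture availability as a function of temperature difference.
        invert = np.poly1d(np.polyfit(delta_t_model, moisture, deg=3))

        # Step 3: apply the regression pixel by pixel to a GOES
        # temperature-difference image (a toy 2x2 "image" here).
        delta_t_image = np.array([[15.2, 12.4],
                                  [9.8, 6.1]])
        print(invert(delta_t_image))        # moisture-availability field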

  10. Wind tunnel evaluation of air-foil performance using simulated ice shapes

    NASA Technical Reports Server (NTRS)

    Bragg, M. B.; Zaguli, R. J.; Gregorek, G. M.

    1982-01-01

    A two-phase wind tunnel test was conducted in the 6 by 9 foot Icing Research Tunnel (IRT) at NASA Lewis Research Center to evaluate the effect of ice on the performance of a full scale general aviation wing. In the first IRT tests, rime and glaze shapes were carefully documented as functions of angle of attack and free stream conditions. Next, simulated ice shapes were constructed for two rime and two glaze shapes and used in the second IRT tunnel entry. The ice shapes and the clean airfoil were tapped to obtain surface pressures and a probe used to measure the wake characteristics. These data were recorded and processed, on-line, with a minicomputer/digital data acquisition system. The effect of both rime and glaze ice on the pressure distribution, Cl, Cd, and Cm are presented.
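
    Surface-pressure data of this kind are commonly reduced to a normal-force coefficient by integrating the difference between lower- and upper-surface pressure coefficients over the chord, which approximates Cl at small angles of attack; the wake probe supplies Cd through a momentum-deficit integral. The sketch below shows only the first of these reductions, with made-up tap data rather than values from the IRT test.

        def normal_force_coefficient(x_over_c, cp_upper, cp_lower):
            """Trapezoidal integration of (Cp_lower - Cp_upper) over the chord;
            at small angles of attack this approximates the lift coefficient Cl."""
            cn = 0.0
            for k in range(1, len(x_over_c)):
                dcp_left = cp_lower[k - 1] - cp_upper[k - 1]
                dcp_right = cp_lower[k] - cp_upper[k]
                cn += 0.5 * (dcp_left + dcp_right) * (x_over_c[k] - x_over_c[k - 1])
            return cn

        # Made-up pressure-tap data for illustration (not from the IRT entry).
        x_over_c = [0.0, 0.1, 0.3, 0.6, 1.0]
        cp_upper = [1.0, -1.2, -0.8, -0.4, 0.1]
        cp_lower = [1.0, 0.3, 0.2, 0.1, 0.1]
        print(normal_force_coefficient(x_over_c, cp_upper, cp_lower))   # ~0.65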

  11. Implementation of a Prototype Generalized Network Technology for Hospitals *

    PubMed Central

    Tolchin, S. G.; Stewart, R. L.; Kahn, S. A.; Bergan, E. S.; Gafke, G. P.; Simborg, D. W.; Whiting-O'Keefe, Q. E.; Chadwick, M. G.; McCue, G. E.

    1981-01-01

    A demonstration implementation of a distributed data processing hospital information system using an intelligent local area communications network (LACN) technology is described. This system is operational at the UCSF Medical Center and integrates four heterogeneous, stand-alone minicomputers. The applications systems are PID/Registration, Outpatient Pharmacy, Clinical Laboratory and Radiology/Medical Records. Functional autonomy of these systems has been maintained, and no operating system changes have been required. The LACN uses a fiber-optic communications medium and provides extensive communications protocol support within the network, based on the ISO/OSI Model. The architecture is reconfigurable and expandable. This paper describes system architectural issues, the applications environment and the local area network.

  12. The experimental computer control of a two-dimensional hyperbolic system

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Lang, J. H.; Staelin, D. H.; Johnson, T. L.

    1985-01-01

    The experimental computer control of a two-dimensional hyperbolic system is described. The system consists of a 5-foot gold-coated rubber membrane mounted on a circular cylindrical drum. Seven electrodes reside on a command surface located behind the membrane inside the drum. These electrodes serve as capacitive sensors and electrostatic force actuators of transverse membrane deflection. The membrane was modelled as flat, isotropic, and uniformly tensioned. Transverse membrane deflections were expanded in normal modes. Controllers regulating membrane deflection were designed using aggregation and design procedures based upon sensor and actuator influence functions. The resulting control laws were implemented on a minicomputer in two sets of experiments. The experimental study confirms the theoretically predicted behavior of the system, the usefulness of the aggregation and design procedures, and the expectation that spillover can be made a beneficial source of damping in residual systems.

  13. Using CLIPS in a distributed system: The Network Control Center (NCC) expert system

    NASA Technical Reports Server (NTRS)

    Wannemacher, Tom

    1990-01-01

    This paper describes an intelligent troubleshooting system for the Help Desk domain. It was developed on an IBM-compatible 80286 PC using Microsoft C and CLIPS, and on an AT&T 3B2 minicomputer using the UNIFY database and a combination of shell script, C programs, and SQL queries. The two computers are linked by a LAN. The functions of this system are to help non-technical NCC personnel handle trouble calls, to keep a log of problem calls with complete, concise information, and to keep a historical database of problems. The database helps identify hardware and software problem areas and provides a source of new rules for the troubleshooting knowledge base.

  14. An optical/digital processor - Hardware and applications

    NASA Technical Reports Server (NTRS)

    Casasent, D.; Sterling, W. M.

    1975-01-01

    A real-time two-dimensional hybrid processor consisting of a coherent optical system, an optical/digital interface, and a PDP-11/15 control minicomputer is described. The input electrical-to-optical transducer is an electron-beam addressed potassium dideuterium phosphate (KD2PO4) light valve. The requirements and hardware for the output optical-to-digital interface, which is constructed from modular computer building blocks, are presented. Initial experimental results demonstrating the operation of this hybrid processor in phased-array radar data processing, synthetic-aperture image correlation, and text correlation are included. The applications chosen emphasize the role of the interface in the analysis of data from an optical processor and possible extensions to the digital feedback control of an optical processor.

  15. A computer-controlled poppet-valve actuation system for application on research engines

    SciTech Connect

    Richman, R.M.; Reynolds, W.C.

    1984-01-01

    A computer-controlled valve actuation system for research engines is described. The system can be used for the complete real-time control of valve motion, including valve lift, dwell, valve overlap and valve timing. The valve actuator is a fast electro-hydraulic unit, incorporating a small hydraulic actuator and a high-performance servovalve. The actuator assembly mounts over the engine valve and replaces the existing camshaft and rocker arm. An analog position-feedback controller coupled to a laboratory minicomputer provides the system control. The closed-loop step response time is 3 milliseconds, with a 0.5 millisecond initial delay. The system currently operates at engine speeds less than 1000 rpm and with appropriate control software is expected to operate at engine speeds approaching 3000 rpm. The actuation system is suitable for a research environment and has many potential applications in piston engine research and development.

  16. Digital resolver for helicopter model blade motion analysis

    NASA Technical Reports Server (NTRS)

    Daniels, T. S.; Berry, J. D.; Park, S.

    1992-01-01

    The paper reports the development and initial testing of a digital resolver to replace existing analog signal processing instrumentation. Radiometers, mounted directly on one of the fully articulated blades, are electrically connected through a slip ring to analog signal processing circuitry. The measured signals are periodic with azimuth angle and are resolved into harmonic components, with 0 deg over the tail. The periodic nature of the helicopter blade motion restricts the frequency content of each flapping and yaw signal to the fundamental and harmonics of the rotor rotational frequency. A minicomputer is employed to collect these data and then plot them graphically in real time. With this and other information generated by the instrumentation, a helicopter test pilot can then adjust the helicopter model's controls to achieve the desired aerodynamic test conditions.

  17. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion - two Pascal programs run on a personal computer - and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  18. An autonomous Nimbus Doppler positioning system

    NASA Technical Reports Server (NTRS)

    Garza-Robles, R.; Argentiero, P.

    1981-01-01

    A single-pass Doppler positioning system was developed at the GSFC in support of the Nimbus-6 mission. The system was designed to satisfy the following requirements: compatibility with the PDP 11/70 minicomputer; single-pass recovery of up to 250 positions with a two-sigma accuracy of 5 km; high reliability with a minimum of human interaction; and near real-time responsiveness. The system consists of a numerical integrator which includes only the J2 term (earth flattening) of the earth's gravity field, an editing routine, a first-guess algorithm, a least squares position recovery program, and a routine for generating a 95% confidence circle based on a computed covariance matrix. An analysis and quantification of the following error sources that limit system accuracy are presented: orbit error, radial aliasing, computational difficulties with overhead passes, oscillator drift, and ambiguity problems. Results of system reliability tests and sample outputs are included.
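
    The 95% confidence circle can be derived from the position covariance in several ways; one conservative and common approach (assumed here, not necessarily the GSFC routine) is to take the horizontal 2x2 covariance block, scale its largest eigenvalue by the chi-square value for two degrees of freedom at 95%, and use the resulting error-ellipse semi-major axis as the circle radius:

        import numpy as np

        def confidence_circle_radius(cov_xy, chi2_95_2dof=5.991):
            """Conservative 95% confidence-circle radius from a 2x2 horizontal
            position covariance: the semi-major axis of the 95% error ellipse."""
            eigvals = np.linalg.eigvalsh(cov_xy)    # variances along principal axes
            return float(np.sqrt(chi2_95_2dof * eigvals.max()))

        # Hypothetical covariance in km^2 (about 1.2 km and 0.8 km one-sigma).
        cov = np.array([[1.44, 0.30],
                        [0.30, 0.64]])
        print(confidence_circle_radius(cov))        # radius in km, ~3.0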

  19. Software for Digital Acquisition System and Application to Environmental Monitoring

    NASA Technical Reports Server (NTRS)

    Copeland, G. E.

    1975-01-01

    Criteria for selection of a minicomputer for use as a core-resident acquisition system were developed for the ODU Mobile Air Pollution Laboratory. A comprehensive data acquisition program named MONARCH was implemented on a DEC 8/E 8K 12-bit computer. Up to 32 analog voltage inputs are scanned sequentially, converted to BCD, and then to actual numbers. As many as 16 external devices (valves or any other two-state device) are controlled independently. MONARCH is written as a foreground-background program, controlled by an external clock which interrupts once per minute. Transducer voltages are averaged over user-specified time intervals and, upon completion of any desired time sequence, the output includes: day, hour, minute, and second; the state of the external valves; the average value of each analog voltage (E format); and the standard deviations of these values. Output is compatible with any serially addressed medium.
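
    MONARCH itself is a PDP-8 foreground-background program; as a language-neutral sketch of the interval-averaging step it performs (accumulate the per-interrupt channel readings, then report the mean and standard deviation of each channel at the end of the interval), one might write something like the following. The channel names and readings are hypothetical.

        import statistics

        def average_interval(samples_per_channel):
            """Return (mean, standard deviation) for each channel over one
            user-specified averaging interval of raw readings."""
            report = {}
            for channel, readings in samples_per_channel.items():
                mean = statistics.fmean(readings)
                sd = statistics.stdev(readings) if len(readings) > 1 else 0.0
                report[channel] = (mean, sd)
            return report

        # A few once-per-minute readings for two hypothetical channels.
        interval = {
            "ozone_ppb": [41.2, 40.8, 41.5, 41.1, 40.9],
            "co_ppm":    [1.02, 1.05, 0.99, 1.01, 1.03],
        }
        for channel, (mean, sd) in average_interval(interval).items():
            print(f"{channel}: mean={mean:.3f}, sd={sd:.3f}")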

  20. Diagnosis of alcoholic cirrhosis with the right-to-left hepatic lobe ratio: concise communication

    SciTech Connect

    Shreiner, D.P.; Barlai-Kovach, M.

    1981-02-01

    Since scans of cirrhotic livers commonly show a reduction in size and colloid uptake of the right lobe, a quantitative measure of uptake was made using a minicomputer to determine total counts in regions of interest defined over each lobe. Right-to-left ratios were then compared in 103 patients. For normal patients the mean ratio ± 1 s.d. was 2.85 ± 0.65, and the mean for patients with known cirrhosis was 1.08 ± 0.33. Patients with other liver diseases had ratios similar to the normal group. The normal range of the right-to-left lobe ratio was 1.55 to 4.15. The sensitivity of the ratio for alcoholic cirrhosis was 85.7% and the specificity was 100% in this patient population. The right-to-left lobe ratio was more sensitive and specific for alcoholic cirrhosis than any other criterion tested. A hypothesis is described to explain these results.
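
    The quoted sensitivity and specificity follow from counting how many cirrhotic and non-cirrhotic patients fall below a ratio cutoff such as the lower limit of the normal range (1.55). The sketch below shows that calculation with made-up ratios, not the study data:

        def sensitivity_specificity(ratios_cirrhosis, ratios_other, cutoff=1.55):
            """A right-to-left lobe ratio below `cutoff` is called positive for
            alcoholic cirrhosis; return (sensitivity, specificity)."""
            true_pos = sum(r < cutoff for r in ratios_cirrhosis)
            true_neg = sum(r >= cutoff for r in ratios_other)
            return true_pos / len(ratios_cirrhosis), true_neg / len(ratios_other)

        # Made-up ratios for illustration only.
        cirrhosis = [0.9, 1.1, 1.3, 1.0, 1.7]
        others = [2.9, 3.4, 2.1, 1.8, 2.6]
        print(sensitivity_specificity(cirrhosis, others))   # -> (0.8, 1.0)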

  1. Microcomputer computation of water quality discharges

    USGS Publications Warehouse

    Helsel, Dennis R.

    1983-01-01

    A fully prompted program (SEDQ) has been developed to calculate daily and instantaneous water quality (QW) discharges. It is written in a version of BASIC, and requires inputs of gage heights, discharge rating curve, shifts, and water quality concentration information. Concentration plots may be modified interactively using the display screen. Semi-logarithmic plots of concentration and water quality discharge are output to the display screen, and optionally to plotters. A summary table of data is also output. SEDQ could be a model program for micro and minicomputer systems likely to be in use within the Water Resources Division, USGS, in the near future. The daily discharge-weighted mean concentration is one output from SEDQ. It is defined in this report, differentiated from the currently used mean concentration, and designated the 'equivalent concentration.' (USGS)
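
    Under the usual definition (an assumption here; the report gives the formal one), the daily discharge-weighted mean concentration weights each unit-value concentration by the concurrent discharge, so it differs from a simple time-averaged concentration whenever flow and concentration vary together:

        def discharge_weighted_mean(discharges_cfs, concentrations_mg_l):
            """Daily discharge-weighted mean concentration: sum(Q_i*C_i) / sum(Q_i)."""
            weighted = sum(q * c for q, c in zip(discharges_cfs, concentrations_mg_l))
            return weighted / sum(discharges_cfs)

        def simple_mean(concentrations_mg_l):
            return sum(concentrations_mg_l) / len(concentrations_mg_l)

        # Hypothetical unit values for one day: high flow carries high sediment.
        q = [100, 400, 900, 300]     # discharge, cfs
        c = [20, 80, 150, 60]        # concentration, mg/L
        print(discharge_weighted_mean(q, c))   # 110.0 mg/L
        print(simple_mean(c))                  # 77.5 mg/L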

  2. Cryogenic system for a superconducting spectrometer

    SciTech Connect

    Porter, J.

    1983-03-01

    The Heavy Ion Spectrometer System (HISS) relies upon superconducting coils of cryostable, pool boiling design to provide a maximum particle bending field of 3 tesla. This paper describes the cryogenic facility including helium refrigeration, gas management, liquid nitrogen system, and the overall control strategy. The system normally operates with a 4 K heat load of 150 watts; the LN₂ circuits absorb an additional 4000 watts. 80 K intercept control is by an LSI-11 computer. Total available refrigeration at 4 K is 400 watts using reciprocating expanders at the 20 K and 4 K level. The minicomputer has the capability of optimizing overall utility input cost by varying operating points. A hybrid of pneumatic, analog, and digital control is successful in providing full time unattended operation. The 7 m diameter magnet/cryostat assembly is rotatable through 180 degrees to provide a variety of spectrometer orientations.

  3. Investigation of creep by use of closed loop servo-hydraulic test system

    NASA Technical Reports Server (NTRS)

    Wu, H. C.; Yao, J. C.

    1981-01-01

    Creep tests were conducted by means of a closed loop servo-controlled materials test system. These tests are different from the conventional creep tests in that the strain history prior to creep may be carefully monitored. Tests were performed for aluminum alloy 6061-0 at 150 C and monitored by a PDP 11/04 minicomputer at a preset constant plastic-strain rate prehistory. The results show that the plastic-strain rate prior to creep plays a significant role in creep behavior. The endochronic theory of viscoplasticity was applied to describe the observed creep curves. The concepts of intrinsic time and strain rate sensitivity function are employed and modified according to the present observation.

  4. Networking and AI systems: Requirements and benefits

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The price/performance benefits of network systems are well documented. The ability to share expensive resources sold timesharing for mainframes, departmental clusters of minicomputers, and now local area networks of workstations and servers. In the process, other fundamental system requirements emerged. These have now been generalized into open-system requirements for hardware, software, applications, and tools. The ability to interconnect a variety of vendor products has led to a specification of interfaces that allow new techniques to extend existing systems for new and exciting applications. As an example of a message-passing system, local area networks provide a testbed for many of the issues addressed by future concurrent architectures: synchronization, load balancing, fault tolerance, and scalability. Gold Hill has been working with a number of vendors on distributed architectures that range from a network of workstations to a hypercube of microprocessors with distributed memory. Results from early applications are promising both for performance and scalability.

  5. High speed file transfer - Point to point and multipoint, using satellite links

    NASA Astrophysics Data System (ADS)

    Valet, I.

    Techniques developed for simulation trials of high-speed file transfer via the Telecom-1 satellite system (using the ANIS simulator) by the French NADIR project are characterized. The choice of frame length, error-correction procedure, numbering scheme, and flow-control technique is discussed, and the problems encountered in applying classical protocols such as HDLC are indicated. A 32-bit numbering field and a selective-acknowledgement error-recovery algorithm with minimal flow control will be implemented in the point-to-point simulation, using minicomputers linked by ANIS. The multipoint 'file broadcasting' simulation will be conducted with two different configurations (sending directly to all stations, either with AND-forwarded random-access return channels or with only virtual packet-switched return channels) and the selective-acknowledgement algorithm. The goal of both simulations is efficient transmission of bulk files of up to 100 Mbytes.

  6. Vibration in Planetary Gear Systems with Unequal Planet Stiffnesses

    NASA Technical Reports Server (NTRS)

    Frater, J. L.; August, R.; Oswald, F. B.

    1982-01-01

    An algorithm suitable for a minicomputer was developed for finding the natural frequencies and mode shapes of a planetary gear system which has unequal stiffnesses between the Sun/planet and planet/ring gear meshes. Mode shapes are represented in the form of graphical computer output that illustrates the lateral and rotational motion of the three coaxial gears and the planet gears. This procedure permits the analysis of gear trains utilizing nonuniform mesh conditions and user specified masses, stiffnesses, and boundary conditions. Numerical integration of the equations of motion for planetary gear systems indicates that this algorithm offers an efficient means of predicting operating speeds which may result in high dynamic tooth loads.
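
    The frequencies and mode shapes described come from the undamped free-vibration problem M x'' + K x = 0, i.e. the generalized eigenvalue problem K φ = ω² M φ built from the system's mass and mesh-stiffness matrices. The original algorithm targeted a minicomputer; the sketch below shows the same computation in modern terms, with a toy two-degree-of-freedom model standing in for the actual gear-train matrices.

        import numpy as np
        from scipy.linalg import eigh

        # Toy 2-DOF stand-in for the gear-train mass and stiffness matrices
        # (the real model couples lateral and rotational motion of the sun,
        # ring, and planet gears with unequal mesh stiffnesses).
        M = np.diag([2.0, 1.0])                     # kg
        K = np.array([[5.0e4, -2.0e4],
                      [-2.0e4, 3.0e4]])             # N/m

        # Solve K*phi = w^2 * M*phi for natural frequencies and mode shapes.
        eigvals, modes = eigh(K, M)
        omega = np.sqrt(eigvals)                    # rad/s
        print("natural frequencies (Hz):", omega / (2 * np.pi))
        print("mode shapes (columns):\n", modes)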

  7. ANNIE - INTERACTIVE PROCESSING OF DATA BASES FOR HYDROLOGIC MODELS.

    USGS Publications Warehouse

    Lumb, Alan M.; Kittle, John L.

    1985-01-01

    ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.

  8. Improved techniques to monitor thin film characteristics for reliable hybrid microcircuit fabrication

    SciTech Connect

    Zawicki, L.R.; Thomas, N.C.

    1980-06-01

    The development, testing, and uses of an edge monitor for monitoring the characteristics of thin metallic films used in the fabrication of hybrid microcircuits are described. The data obtained on this quality control equipment demonstrated improved techniques for monitoring thin film characteristics for reliable hybrid microcircuit fabrication. This was accomplished through the design of a specialized circuit, an integral part of the product substrate to be tested, that enhances the effects of degradation mechanisms, both mechanically and electrically. Correlation of mechanical and/or physical characteristics with electrical data was demonstrated. When coupled with a minicomputer test set with controlling and data retention capabilities, the system offers a useful tool for production process monitoring, diagnostic evaluation, or developmental efforts. (LCL)

  9. Improved techniques to monitor thin film characteristics for reliable hybrid microcircuit fabrication

    SciTech Connect

    Thomas, N.C.; Zawicki, L.R.

    1980-08-01

    Improved testing techniques for monitoring thin film product characteristics and process control were developed primarily to provide a real-time, full-process monitor to test thin film adhesion and to evaluate thin film properties under accelerated aging. Additionally, more cost-effective methods were developed to monitor process control through measurement of film properties that relate directly to key process variables. A special circuit pattern was designed to offer improved information-gathering capabilities. This pattern is now included on the corner of production substrates to monitor thin film adhesion, via resistance, and resistor film temperature coefficient of resistance (TCR), and life stability. Minicomputer hardware and software were developed to monitor thin film interface resistance changes under accelerated aging conditions of elevated temperature, high humidity, and corrosive environments. Curves for different aging characteristics were developed for different thin film metallization systems and for some process and substrate variables. Interface aging characteristics were compared with lead frame bond pull-test data.

  10. A microprocessor based communications multiplexer/concentrator

    NASA Technical Reports Server (NTRS)

    Brown, D. W.; Arozullah, M.

    1978-01-01

    In the past, communications concentrators have been designed using mini-computers. In this work a design for a high speed concentrator (in the megabits range) is developed using the 3000 series bit-slice microprocessor. The proposed concentrator realizes the functions of multiplexing of the data arriving on the low speed lines and demultiplexing of the data arriving on the high speed line, including canned responses and code conversion. The basic system hardware configuration and the principles of operation of the multiplexing and demultiplexing subsystems are presented. The software for these two functions is also presented. Using queueing theory, an estimate of the required buffer size is provided. Possible areas of further improvement are also indicated.
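
    The abstract does not say which queueing model was used; a simple illustration of such a buffer-size estimate (an assumption, not necessarily the authors' model) treats the concentrator buffer as an M/M/1 queue, for which P[N > n] = ρ^(n+1), and sizes the buffer so that the overflow probability stays below a target:

        import math

        def buffer_size_mm1(arrival_rate, service_rate, overflow_prob=1e-6):
            """Smallest buffer size n (in packets) such that an M/M/1 queue holds
            more than n packets with probability at most overflow_prob."""
            rho = arrival_rate / service_rate
            if rho >= 1.0:
                raise ValueError("unstable queue: arrival rate >= service rate")
            # P[N > n] = rho**(n + 1) <= overflow_prob  =>  n >= log(p)/log(rho) - 1
            return max(math.ceil(math.log(overflow_prob) / math.log(rho) - 1), 0)

        # Hypothetical rates: 800 packets/s in from the low-speed lines,
        # 1000 packets/s drained onto the megabit-range high-speed link.
        print(buffer_size_mm1(arrival_rate=800.0, service_rate=1000.0))   # -> 61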

  11. University of Missouri-Rolla cloud simulation facility - Proto II chamber

    NASA Technical Reports Server (NTRS)

    White, Daniel R.; Carstens, John C.; Hagen, Donald E.; Schmitt, John L.; Kassner, James L.

    1987-01-01

    The design and supporting systems for the cooled-wall expansion cloud chamber, designated Proto II, are described. The chamber is a 10-sided vertical cylinder designed to be operated with interior wall temperatures between +40 and -40 C, and is to be utilized to study microphysical processes active in atmospheric clouds and fogs. Temperatures are measured using transistor thermometers which have a range of ±50 C and a resolution of about ±0.001 C; and pressures are measured in the chamber by a differential strain gauge pressure transducer. The methods used for temperature and pressure control are discussed. Consideration is given to the chamber windows, optical table, photographic/video, optical attenuation, Mie scattering, and the scanning system for the chamber. The system's minicomputer and humidifier, sample preparation, and chamber flushing are examined.

  12. Automation software for a materials testing laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1986-01-01

    A comprehensive software system for automating much of the experimental process has recently been completed at the Lewis Research Center's high-temperature fatigue and structures laboratory. The system was designed to support experiment definition and conduct, results analysis and archiving, and report generation activities. This was accomplished through the design and construction of several software systems, as well as through the use of several commercially available software products, all operating on a local, distributed minicomputer system. Experimental capabilities currently supported in an automated fashion include both isothermal and thermomechanical fatigue and deformation testing capabilities. The future growth and expansion of this system will be directed toward providing multiaxial test control, enhanced thermomechanical test control, and higher test frequency (hundreds of hertz).

  13. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1974-01-01

    The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.

  14. NEC's traffic terminal equipment for the Intelsat TDMA system

    NASA Astrophysics Data System (ADS)

    Saburi, A.; Oshima, G.; Saga, R.; Atobe, M.; Shigaki, S.

    1985-06-01

    The components of the traffic terminal equipment (TTE) for the Intelsat TDMA system are discussed. The TTE consists of: (1) common TDMA terminal equipment (CTTE), (2) DSI equipment, and (3) an operation and maintenance center (OMC). The CTTE contains two BASIC CTTE controllers and two QPSK modems; their composition and functions are described. The DSI equipment contains numerous DSI/digital noninterpolation (DNI) units, which use four microprocessors and two types of speech detectors to provide dynamic signal mapping between incoming and outgoing channels and reliable speech detection. A description of the DSI/DNI units, the microprocessors, and speech detectors is presented. The OMC is the man-machine interface for all the TTE. The functions of the OMC, which is composed of a minicomputer system and a status, alarm control (SAC) panel, are explained.

  15. Elemental composition of two cumulate rocks

    SciTech Connect

    Naeem, A.; Almohandis, A.A.

    1983-04-01

    Two cumulate rock samples, K-185 and K-250, from the Kapalagulu intrusion, W. Tanzania, were analyzed using X-ray fluorescence (XRF), wet chemical, and neutron activation analysis (NAA) techniques. Major element oxides were determined by XRF and wet chemical methods, while the concentrations of trace elements were measured by NAA, using a high-resolution Ge(Li) detector, a minicomputer-based data acquisition system, and an off-line computer. The percentages of major oxides and sixteen trace elements are reported. It has been found that Cr, Ni, and Co are highly concentrated in K-250, while Sc and most of the major elements are more concentrated in K-185. The variation of major and trace elements in these two samples is discussed.

  16. Data reduction programs for a laser radar system

    NASA Technical Reports Server (NTRS)

    Badavi, F. F.; Copeland, G. E.

    1984-01-01

    The listing and description of software routines which were used to analyze the analog data obtained from the LIDAR system are given. All routines are written in FORTRAN IV on an HP-1000/F minicomputer which serves as the heart of the data acquisition system for the LIDAR program. This particular system has 128 kilobytes of high-speed memory and is equipped with a Vector Instruction Set (VIS) firmware package, which is used in all the routines to handle quick execution of the various long loops. The system handles floating point arithmetic in hardware in order to enhance the speed of execution. This computer is a 2177 C/F series version of the HP-1000 RTE-IVB data acquisition computer system, which is designed for real-time data capture/analysis and a disk/tape mass storage environment.

  17. CAMAPPLE: CAMAC interface to the Apple computer

    SciTech Connect

    Williams, S.H.; Oxoby, G.J.; Trang, Q.H.

    1981-09-01

    The advent of the personal microcomputer provides a new tool for the debugging, calibration and monitoring of small scale physics apparatus; e.g., a single detector being developed for a larger physics apparatus. With an appropriate interface these microcomputer systems provide a low cost (1/3 the cost of a comparable minicomputer system), convenient, dedicated, portable system which can be used in a fashion similar to that of portable oscilloscopes. Here we describe an interface between the Apple computer and CAMAC which is now being used to study the detector for a Cerenkov ring-imaging device. The Apple is particularly well-suited to this application because of its ease of use, hi-resolution graphics, peripheral bus, and documentation support.

  18. Advances in automatic extraction of information from multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.

    1975-01-01

    The state-of-the-art of automatic multispectral scanner data analysis and interpretation is reviewed. Sources of system variability which tend to obscure the spectral characteristics of the classes under consideration are discussed, and examples of the application of spatial and temporal discrimination bases are given. Automatic processing functions, techniques and methods, and equipment are described with particular attention to those that are applicable to large land surveys using satellite data. The development and characteristics of the Multivariate Interactive Digital Analysis System (MIDAS) for processing aircraft or satellite multispectral scanning data are discussed in detail. The MIDAS system combines the parallel digital implementation capabilities of a low-cost processor with a general purpose PDP-11/45 minicomputer to provide near-real-time data processing. The preprocessing functions are user-selectable. The input subsystem accepts data stored on high density digital tape, computer compatible tape, and analog tape.

  19. The microcomputer workstation - An alternate hardware architecture for remotely sensed image analysis

    NASA Technical Reports Server (NTRS)

    Erickson, W. K.; Hofman, L. B.; Donovan, W. E.

    1984-01-01

    Difficulties regarding the digital image analysis of remotely sensed imagery can arise in connection with the extensive calculations required. In the past, an expensive large to medium mainframe computer system was needed for performing these calculations. For image-processing applications smaller minicomputer-based systems are now used by many organizations. The costs for such systems are still in the range from $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appeared to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and the design and implementation of a workstation network.

  20. A signal processing method based on a homotopic correlation product applied to speech recognition problems

    NASA Astrophysics Data System (ADS)

    Bianchi, F.; Pocci, P.; Prina-Ricotti, L.

    1981-02-01

    The assumptions used to formulate the processing method, the proposed algorithm, and phoneme recognition test results of a homotopic signal processing method are presented. The hearing system is considered as a box with one input, to which a signal with an information content of about 500 kbit/sec is applied, and many thousand outputs, the nerve fibers, each having a transmission rate variable between 30 and 400 bit/sec. The signal transmitted by any one fiber is a series of equal impulses. The homotopic representation of a phoneme is available in steady state after 2 to 3 msec. The phoneme patterns are very different, although patterns for the same phoneme from different speakers are similar. Transition patterns between phonemes change rapidly. The recognition rate, using a minicomputer, for all possible combinations of 'a', 'e', 'r', and 'm' is 95.2%.

  1. The spatial and logical organization of devices in an advanced industrial robot system

    NASA Technical Reports Server (NTRS)

    Ruoff, C. F.

    1980-01-01

    This paper describes the geometrical and device organization of a robot system which is based in part upon transformations of Cartesian frames and exchangeable device tree structures. It discusses coordinate frame transformations, geometrical device representation, and solution degeneracy, along with the data structures which support the exchangeable logical-physical device assignments. The system, which has been implemented on a minicomputer, supports vision, force, and other sensors. It allows tasks to be instantiated with logically equivalent devices and it allows tasks to be defined relative to appropriate frames. Since these frames are, in turn, defined relative to other frames, this organization provides a significant simplification in task specification and a high degree of system modularity.

  2. AOIPS data base management systems support for GARP data sets

    NASA Technical Reports Server (NTRS)

    Gary, J. P.

    1977-01-01

    A data base management system is identified, developed to provide flexible access to data sets produced by GARP during its data systems tests. The content and coverage of the data base are defined and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user specified data subsets, is described. The computer programs developed to provide the capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user interactive but menu guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user selected window defined by time and latitude/longitude boundaries. The DRS permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.

  3. The Mount Wilson solar magnetograph - Scanning and data system

    NASA Technical Reports Server (NTRS)

    Howard, R.

    1976-01-01

    The paper describes a computer-operated image-scanning and data-collection system for the magnetograph at the Mt. Wilson 150-foot Tower telescope. The system is based on a minicomputer with a 32K word core memory and a generalized interface unit for controlling image motion, a keyboard, and an associated television screen. Operation of the solar image guider and the data-collection assembly is outlined along with the observation and data-reduction procedures. Advantages of the system include the ability to move the image in almost any conceivable fashion, a wide choice of integration times, and increased accuracy in magnetic and Doppler calibrations as well as in setting of the magnetic zero level.

  4. CAMAPPLE: CAMAC interface to the Apple computer

    SciTech Connect

    Oxoby, G.J.; Trang, Q.H.; Williams, S.H.

    1981-04-01

    The advent of the personal microcomputer provides a new tool for the debugging, calibration and monitoring of small scale physics apparatus, e.g., a single detector being developed for a larger physics apparatus. With an appropriate interface these microcomputer systems provide a low cost (1/3 the cost of a comparable minicomputer system), convenient, dedicated, portable system which can be used in a fashion similar to that of portable oscilloscopes. Here, an interface between the Apple computer and CAMAC which is now being used to study the detector for a Cerenkov ring-imaging device is described. The Apple is particularly well-suited to this application because of its ease of use, hi-resolution graphics, peripheral bus and documentation support.

  5. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 3: Wiring diagrams

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.

  6. MIDAS, prototype Multivariate Interactive Digital Analysis System, Phase 1. Volume 2: Diagnostic system

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10⁵ pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.

  7. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.

  8. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses on the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss of sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  9. Speech as a pilot input medium

    NASA Technical Reports Server (NTRS)

    Plummer, R. P.; Coler, C. R.

    1977-01-01

    The speech recognition system under development is a trainable pattern classifier based on a maximum-likelihood technique. An adjustable uncertainty threshold allows the rejection of borderline cases for which the probability of misclassification is high. The syntax of the command language spoken may be used as an aid to recognition, and the system adapts to changes in pronunciation if feedback from the user is available. Words must be separated by .25 second gaps. The system runs in real time on a mini-computer (PDP 11/10) and was tested on 120,000 speech samples from 10- and 100-word vocabularies. The results of these tests were 99.9% correct recognition for a vocabulary consisting of the ten digits, and 99.6% recognition for a 100-word vocabulary of flight commands, with a 5% rejection rate in each case. With no rejection, the recognition accuracies for the same vocabularies were 99.5% and 98.6% respectively.
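
    The recognizer described is a trainable maximum-likelihood classifier with an adjustable uncertainty threshold for rejecting borderline utterances. A minimal sketch of that decision rule, assuming Gaussian class models over a hypothetical two-dimensional feature vector (not the flight system's actual features or implementation):

        import numpy as np

        def classify(x, class_means, class_covs, priors, reject_margin=5.0):
            """Pick the class with the highest Gaussian log-likelihood (plus log
            prior); reject the utterance if the best class does not beat the
            runner-up by `reject_margin` (an adjustable uncertainty threshold)."""
            scores = {}
            for name in class_means:
                mu, cov = class_means[name], class_covs[name]
                diff = x - mu
                mahal = diff @ np.linalg.solve(cov, diff)
                log_det = np.linalg.slogdet(cov)[1]
                scores[name] = np.log(priors[name]) - 0.5 * (mahal + log_det)
            ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
            best, runner_up = ranked[0], ranked[1]
            if best[1] - runner_up[1] < reject_margin:
                return "reject"
            return best[0]

        # Two hypothetical word classes described by 2-D feature statistics.
        means  = {"one": np.array([1.0, 0.0]), "two": np.array([-1.0, 0.5])}
        covs   = {"one": np.eye(2) * 0.1,      "two": np.eye(2) * 0.1}
        priors = {"one": 0.5, "two": 0.5}
        print(classify(np.array([0.9, 0.1]), means, covs, priors))   # -> "one"
        print(classify(np.array([0.0, 0.25]), means, covs, priors))  # -> "reject"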

  10. Procedure for editing the fluxgate magnetometer data of the AFGL (Air Force Geophysics Laboratory) magnetometer network

    NASA Astrophysics Data System (ADS)

    Paboojian, A. J.

    1984-10-01

    The procedure for producing the edited data base of the AFGL Magnetometer Network is described in detail. The input to the procedure is the series of archive tapes on which the raw data from the seven network stations are recorded; the output is several series of tapes containing the edited data from the fluxgate magnetometer only. Each series has either a one-second or a one-minute (averaged) sampling interval and is written in a tape format selected for compatibility with one or more specific computer types used at the Air Force Geophysics Laboratory, The World Data Center, and other scientific institutions. Detailed instructions are given for the execution of each of the computer programs employed in the procedure as well as for the basic operation of the network minicomputer on which the procedure is carried out. The procedure is highly automated and the description provided is sufficient to permit its being carried out by an untrained operator.

  11. Flexible missile autopilot design studies with PC-MATLAB/386

    NASA Technical Reports Server (NTRS)

    Ruth, Michael J.

    1989-01-01

    Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well-suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (MATLAB's version for 80386-based PCs offers benchmarks approaching minicomputer and mainframe performance); (2) the ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB capabilities, a simplified design example is presented. This involves interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB capabilities and limitations, in the context of this design task, are then summarized.

  12. Modern control techniques for accelerators

    SciTech Connect

    Goodwin, R.W.; Shea, M.F.

    1984-05-01

    Beginning in the mid to late sixties, most new accelerators were designed to include computer based control systems. Although each installation differed in detail, the technology of the sixties and early to mid seventies dictated an architecture that was essentially the same for the control systems of that era. A mini-computer was connected to the hardware and to a console. Two developments have changed the architecture of modern systems: (a) the microprocessor and (b) local area networks. This paper discusses these two developments and demonstrates their impact on control system design and implementation by way of describing a possible architecture for any size of accelerator. Both hardware and software aspects are included.

  13. Interfacing a torsion pendulum with a microcomputer

    SciTech Connect

    Bush, J.A.; Newby, J.W.

    1983-02-24

    Shear modulus testing is performed on the torsion pendulum at the General Electric Neutron Devices Department (GEND) as a means of gauging the state of cure for a polymer system. However, collection and reduction of the data to obtain the elastic modulus necessitated extensive operator involved measurements and calculations, which were subject to errors. To improve the reliability of the test, an analog-to-digital interface was designed and built to connect the torsion pendulum with a minicomputer. After the necessary programming was prepared, the system was tested and found to be an improvement over the old procedure in both quality and time of operation. An analysis of the data indicated that the computer generated modulus data were equivalent to the hand method data, but potential operator errors in frequency measurements and calculations were eliminated. The interfacing of the pendulum with the computer resulted in an overall time savings of 52 percent.
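
    The abstract does not give the data-reduction formula, but for a torsion pendulum with a solid circular specimen the shear modulus is commonly recovered from the measured period as G = 8·π·I·L / (T²·r⁴), with I the pendulum inertia, L the specimen length, and r its radius. A sketch of that reduction under those assumptions (placeholder dimensions, damping correction neglected):

        import math

        def shear_modulus(period_s, inertia_kg_m2, length_m, radius_m):
            """Shear modulus (Pa) of a solid circular specimen from the
            torsion-pendulum period: G = 8*pi*I*L / (T**2 * r**4).
            Logarithmic-decrement (damping) corrections are neglected."""
            return (8.0 * math.pi * inertia_kg_m2 * length_m) / (period_s**2 * radius_m**4)

        # Placeholder values for illustration only.
        G = shear_modulus(period_s=0.8, inertia_kg_m2=2.0e-4,
                          length_m=0.05, radius_m=0.004)
        print(f"G = {G / 1e6:.2f} MPa")      # ~1.5 MPa, a partially cured polymer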

  14. The History of the Data Systems AutoChemist® (ACH) and AutoChemist-PRISMA (PRISMA®): from 1964 to 1986

    PubMed Central

    2014-01-01

    Summary Objectives This paper presents the history of the data system development steps (1964–1986) for the clinical analyzers AutoChemist® and its successor AutoChemist PRISMA® (PRogrammable Individually Selective Modular Analyzer). The paper also partly recounts the history of the development steps of the minicomputer PDP 8 from Digital Equipment. The first PDP 8 had 4 core memory boards of 1 K each and was as large as a typical oven baking sheet; about 10 years later, the PDP 8 was a "one chip microcomputer" with a 32 K memory chip. The fast development of the PDP 8 came to have a strong influence on the development of the data system for AutoChemist. Five major releases of the software were made during this period (1-5 MIACH). Results The most important aims were not only to calculate the results, but also to be able to monitor their quality, automatically manage the orders, store the results in digital form for later statistical analysis, and distribute the results to the physician in charge of the patient using the same computer as the analyzer. Another result of the data system was the ability to customize AutoChemist to handle sample identification by using bar codes and to tailor the presentation of results to different types of laboratories. Conclusions Digital Equipment launched the PDP 8 just as a new minicomputer was desperately needed. No other known alternatives were available at the time. This was to become a key success factor for AutoChemist. That the AutoChemist, with such a high capacity, required a computer for data collection was obvious already in the early 1960s. That computer development would be so rapid and that one would be able to accomplish so much with a data system was hardly suspected at the time. In total, 75 systems were delivered worldwide: 31 AutoChemist and 44 PRISMA. The last PRISMA was delivered in 1987 to the Veterans Hospital, Houston, TX, USA. PMID:24853032

  15. Evolution of the Mobile Information SysTem (MIST)

    NASA Technical Reports Server (NTRS)

    Litaker, Harry L., Jr.; Thompson, Shelby; Archer, Ronald D.

    2008-01-01

    The Mobile Information SysTem (MIST) had its origins in the need to determine whether commercial off the shelf (COTS) technologies could improve intravehicular activity (IVA) crew maintenance productivity on the International Space Station (ISS). It began with an exploration of head mounted displays (HMDs), but quickly evolved to include voice recognition, mobile personal computing, and data collection. The unique characteristic of the MIST lies within its mobility: a vest is worn that contains a mini-computer and supporting equipment, along with a headband with attachments for a HMD, lipstick camera, and microphone. Data is then captured directly by the computer running Morae(TM) or similar software for analysis. To date, the MIST system has been tested in numerous environments, such as two parabolic flights on NASA's C-9 microgravity aircraft and several mockup facilities ranging from the ISS to the Altair Lunar Sortie Lander. Functional capabilities have included its lightweight and compact design, commonality across systems and environments, and usefulness in remote collaboration. Human factors evaluations of the system have proven the MIST's ability to be worn for long durations of time (approximately four continuous hours), with no adverse physical deficits, moderate operator compensation, and low workload being reported, as measured by the Corlett Bishop Discomfort Scale, Cooper-Harper Ratings, and the NASA Task Load Index (TLX), respectively. Additionally, through development of the system, it has spawned several new applications useful in research. For example, by employing only the lipstick camera, microphone, and a compact digital video recorder (DVR), we created a portable, lightweight data collection device. Video is recorded from the participant's point of view (POV) through the use of the camera mounted on the side of the head. Both the video and audio are recorded directly into the DVR located on a belt around the waist. This data is then transferred to another computer for video editing and analysis. Another application has been discovered using simulated flight, in which a kneeboard is replaced with the mini-computer and the HMD to project flight paths and glide slopes for lunar ascent. As technologies evolve, so will the system and its application for research and space system operations.

  16. Pulse code modulation telemetry in ski injury research. I. Instrumentation.

    PubMed

    Hull, M L; Mote, C D

    1974-01-01

    Measurement problems can be classified into instrumentation, data transmission and recording, and analysis. This paper focuses on the transmission of multichannel, high-volume, high-frequency, high-accuracy data. Boot-ski dynamometer and skier velocity anemometer data provide 13 channels of max. 8-mV signals requiring 8-microvolt resolution or 4.45-Newton dynamometer resolution. The data transmission system features durability, power consumption of approx. 10 watts, weight of 4.54 kp, range greater than 3,500 m, frequency response of 250 Hz, accuracy of 1 per cent, temperature stability, and dynamic range of ±2 inches. The transducer signals are amplified to ±10 V for the 100-kbps PCM system. Special AC amplifiers, driven by an amplitude-stabilized power oscillator, were designed for elimination of radio frequency interference (RFI), improved stability, and a high signal-to-noise ratio. Sixteen words are sequentially sampled at 521/sec: 13 data, 2 frame counters, and 1 sync. The ground station consists of the PCM decoder with real-time capability and an analog tape recorder. Data is subsequently buffered and formatted onto digital tape by mini-computer. PMID:4469157

  17. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectoring and 'in-line' capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.

  18. ANL statement of site strategy for computing workstations

    SciTech Connect

    Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O'Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  19. Discounts for dynamic programming with applications in VLSI processor arrays

    SciTech Connect

    Lopresti, D.P.

    1987-01-01

    This dissertation introduces a method for transforming certain dynamic programming problems into ones that require less space and time to solve under the logarithmic cost criterion, an appropriate complexity measure for flexible word-length machines. The mapping is based on discounts that change the costs but not the identities of optimal policies. Under the proper circumstances, the structure present in the original problem is preserved in the image so that the functional equations of dynamic programming still apply. Practical value of the theory is illustrated by demonstrating that a previously published VLSI processor array can be made asymptotically smaller and faster. The second half of this work addresses issues that arise in parallel sequence comparison. The paradigm here is deoxyribonucleic acid (DNA), which may be considered a string over a four-character alphabet. It is shown how a number of popular sequence matching algorithms can be mapped onto linear arrays of processors. One of these, the Princeton Nucleic Acid Comparator (P-NAC), has been fabricated, tested, and found to work perfectly. Its efficient implementation is due entirely to an application of discounts; benchmark results prove that it is several hundred times faster than a minicomputer.
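
    The dynamic-programming recurrence underlying such sequence comparison can be sketched with the classic unit-cost edit distance; the systolic arrays discussed above evaluate the same recurrence one anti-diagonal at a time, one cell per processor. This is an illustration, not the P-NAC design itself.

```python
# Unit-cost edit distance between two DNA strings, row by row.
def edit_distance(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution or match
        prev = curr
    return prev[-1]

print(edit_distance("GATTACA", "GCATGCA"))   # -> 3
```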

  20. Pneumatic sample-transfer system for use with the Lawrence Livermore National Laboratory rotating target neutron source (RTNS-I)

    SciTech Connect

    Williams, R.E.

    1981-07-01

    A pneumatic sample-transfer system is needed to be able to rapidly retrieve samples irradiated with 14-MeV neutrons at the Rotating Target Neutron Source (RTNS-I). The rabbit system, already in place for many years, has been refurbished with modern system components controlled by an LSI-11 minicomputer. Samples can now be counted three seconds after an irradiation. There are many uses for this expanded 14-MeV neutron activation capability. Several fission products difficult to isolate from mixed fission fragments can be produced instead through (n,p) or (n,α) reactions with stable isotopes. Mass-separated samples of Nd, Mo, and Se, for example, can be irradiated to produce Pr, Nb, and As radionuclides sufficient for decay scheme studies. The system may also be used for multielement fast-neutron activation analysis because the neutron flux is greater than 2 × 10^11 n/cm^2-sec. Single element analyses of Si and O are also possible. Finally, measurements of fast-neutron cross sections producing short-lived activation products can be performed with this system. A description of the rabbit system and instructions for its use are presented in this report.
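
    A minimal sketch of the standard activation and decay bookkeeping such a rabbit system supports is shown below; the nuclide, cross section, and irradiation time are placeholders for illustration, not data from the report.

```python
# Induced activity after irradiation and a short pneumatic transfer,
# using the generic activation equation A = N * sigma * phi * (1 - e^(-lambda*t_irr))
# followed by decay during the transfer.  All numeric inputs are hypothetical.
import math

def activity_after_transfer(n_atoms, sigma_cm2, flux, half_life_s,
                            t_irr_s, t_transfer_s):
    """Induced activity (decays/s) at the start of counting."""
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)      # build-up during irradiation
    a0 = n_atoms * sigma_cm2 * flux * saturation     # activity at end of irradiation
    return a0 * math.exp(-lam * t_transfer_s)        # decay during the 3-s transfer

# hypothetical short-lived product: 10-s half-life, 100-mb cross section
print(activity_after_transfer(n_atoms=1e20, sigma_cm2=100e-27,
                              flux=2e11, half_life_s=10.0,
                              t_irr_s=30.0, t_transfer_s=3.0))
```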

  1. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and insure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  2. ATS-6 - Radio Beacon Experiment: The first years. [ionospheric and satellite-to-ground electron content

    NASA Technical Reports Server (NTRS)

    Davies, K.; Fritz, R. B.; Grubb, R. N.; Jones, J. E.

    1975-01-01

    The Radio Beacon Experiment aboard Applications Technology Satellite-6 (ATS-6) is designed to measure the total electron content and the ionospheric electron content between the satellite and ground. The spaceborne beacon transmits signals on frequencies of 40, 140, and 360 MHz with amplitude modulations of 1 MHz and/or 0.1 MHz for the measurement of modulation phase, Faraday rotation, and amplitude. The modulation phase delays are calibrated in the satellite and in the ground equipment, and the polarization of the emitted signals is predetermined by standard antenna range techniques. The design of the ATS-6 receiver in Boulder, Colorado, is discussed. The antennae are of the short backfire type described by Ehrenspeck (1967), with nominal gains of 13, 19, and 22 dB at 40, 140, and 360 MHz, respectively. Data recording and overall supervision of the receiver are carried out by a 16-bit minicomputer with 8 k of memory. Overall performance of the system is satisfactory. Sample data on the monthly median hourly values of the total electron content, plasmospheric content, and shape factor show distinct seasonal and diurnal variations.
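
    A minimal sketch of how a dual-frequency modulation (group) delay difference converts to total electron content is given below, using the textbook first-order 40.3/f^2 ionospheric approximation; this is not the experiment's exact data-reduction procedure, and the delay value is hypothetical.

```python
# Total electron content from a differential group delay between two beacon
# frequencies, first-order ionospheric approximation only.
C = 299_792_458.0        # m/s
K = 40.3                 # m^3/s^2, first-order ionospheric constant

def tec_from_delay_difference(dt_seconds, f1_hz, f2_hz):
    """TEC (electrons/m^2) from the extra group delay at f1 relative to f2."""
    return dt_seconds * C / (K * (1.0 / f1_hz**2 - 1.0 / f2_hz**2))

# hypothetical 100-ns differential delay between the 140- and 360-MHz signals
print(tec_from_delay_difference(100e-9, 140e6, 360e6) / 1e16, "TEC units")
```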

  3. Modular on-board adaptive imaging

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Williams, D. S.

    1978-01-01

    Feature extraction involves the transformation of a raw video image to a more compact representation of the scene in which relevant information about objects of interest is retained. The task of the low-level processor is to extract object outlines and pass the data to the high-level process in a format that facilitates pattern recognition tasks. Due to the immense computational load caused by processing a 256x256 image, even a fast minicomputer requires a few seconds to complete this low-level processing. It is, therefore, necessary to consider hardware implementation of these low-level functions to achieve real-time processing speeds. The objective of the project was to implement a system in which the continuous feature extraction process is not affected by dynamic changes in the scene, varying lighting conditions, or object motion relative to the cameras. Due to the high bandwidth (3.5 MHz) and serial nature of the TV data, a pipeline processing scheme was adopted as the overall architecture of this system. Modularity in the system is achieved by designing circuits that are generic within the overall system.
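
    As a software-only sketch of the kind of low-level operation the hardware pipeline implements, a gradient-magnitude threshold marking object outlines in a 256x256 frame might look as follows; the threshold and test image are arbitrary.

```python
# Outline extraction by thresholding the gradient magnitude of a frame.
import numpy as np

def outline(frame: np.ndarray, threshold: float) -> np.ndarray:
    gy, gx = np.gradient(frame.astype(float))   # finite-difference gradients
    return np.hypot(gx, gy) > threshold          # boolean edge map

frame = np.zeros((256, 256))
frame[64:192, 64:192] = 255.0                    # synthetic bright square
edges = outline(frame, threshold=50.0)
print(edges.sum(), "edge pixels")
```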

  4. Computer card morphometry of jejunal biopsies in childhood coeliac disease.

    PubMed

    Meinhard, E A; Wadbrook, D G; Risdon, R A

    1975-02-01

    The histological changes in 95 jejunal biopsy specimens from children have been analyzed by a new morphometric technique. The microscope image of the specimen is traced directly onto computer data cards. A simple sketch records accurate quantitative data in a matrix of 840 points, retaining the spatial arrangement of the tissue components. The data are fed, via an optical mark data card reader, into a mini-computer. FORTRAN IV programs allow calculation of surface area, villous heights, and component volumes in metric units, and of volume proportions, volume-to-volume ratios, and surface-to-volume ratios. Pictorial and numerical printouts are produced, which are suitable for inclusion in the patient's notes. Jejunal biopsies from 37 controls and 26 untreated coeliac patients were clearly distinguished morphometrically. Sixteen pairs of biopsies from coeliac patients on long-term gluten-free diets before, and 12 weeks after, the reintroduction of dietary gluten significantly reflected the effects of gluten challenge. Comparison of control and abnormal biopsies showed a spatial redistribution of the components, more than a change in their absolute amounts. There were no significant differences in the total epithelial volumes in controls, treated, or untreated patients, suggesting that the mucosal lesion in coeliac disease is not a true atrophy. PMID:1127115
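
    A minimal sketch of the point-counting arithmetic behind the card method is shown below: each of the 840 matrix points is labelled with the tissue component it falls on, and volume proportions follow from simple point fractions (the Delesse principle). The component names and proportions are illustrative, not published data.

```python
# Volume fractions and a volume-to-volume ratio from labelled grid points.
import numpy as np

rng = np.random.default_rng(0)
labels = rng.choice(["epithelium", "lamina_propria", "lumen"],
                    size=840, p=[0.4, 0.35, 0.25])   # stand-in for a traced card

def volume_fractions(point_labels):
    values, counts = np.unique(point_labels, return_counts=True)
    return dict(zip(values, counts / point_labels.size))

fractions = volume_fractions(labels)
print(fractions)
print("epithelium : lamina propria volume ratio =",
      fractions["epithelium"] / fractions["lamina_propria"])
```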

  5. The graphics and data acquisition software package

    NASA Technical Reports Server (NTRS)

    Crosier, W. G.

    1981-01-01

    A software package was developed for use with micro and minicomputers, particularly the LSI-11/PDP-11 series. The package has a number of Fortran-callable subroutines which perform a variety of frequently needed tasks for biomedical applications. All routines are well documented, flexible, easy to use and modify, and require minimal programmer knowledge of peripheral hardware. The package is also economical of memory and CPU time. A single subroutine call can perform any one of the following functions: (1) plot an array of integer values from sampled A/D data; (2) plot an array of Y values versus an array of X values; (3) draw horizontal and/or vertical grid lines of selectable type; (4) annotate grid lines with user units; (5) get coordinates of user controlled crosshairs from the terminal for interactive graphics; (6) sample any analog channel with program selectable gain; (7) wait a specified time interval; and (8) perform random access I/O of one or more blocks of a sequential disk file. Several miscellaneous functions are also provided.

  6. GEEF: a geothermal engineering and economic feasibility model. Description and user's manual

    SciTech Connect

    Not Available

    1982-09-01

    The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV language and can be run on a mainframe or a mini-computer system. An abbreviated version of the model is being developed for use in conjunction with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, the requirement for a supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single-stage systems as well as two-stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.
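
    A minimal sketch of the kind of sizing arithmetic the engineering segment performs is shown below: the number of production wells needed to deliver a target thermal load from an assumed per-well flow rate and temperature drop. All values are placeholders, not GEEF inputs.

```python
# Well-count sizing from per-well thermal output: P = m_dot * c_p * delta_T.
import math

def production_wells(load_mw_thermal, flow_kg_s, delta_t_c, cp=4186.0):
    per_well_w = flow_kg_s * cp * delta_t_c          # thermal power per well, W
    return math.ceil(load_mw_thermal * 1e6 / per_well_w)

# e.g. a 30-MWt load, 50 kg/s per well, 40 C usable temperature drop
print(production_wells(30.0, 50.0, 40.0), "production wells")
```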

  7. Text processing for technical reports (direct computer-assisted origination, editing, and output of text)

    SciTech Connect

    De Volpi, A.; Fenrick, M. R.; Stanford, G. S.; Fink, C. L.; Rhodes, E. A.

    1980-10-01

    Documentation often is a primary residual of research and development. Because of this important role and because of the large amount of time consumed in generating technical reports, particularly those containing formulas and graphics, an existing data-processing computer system has been adapted so as to provide text-processing of technical documents. Emphasis has been on accuracy, turnaround time, and time savings for staff and secretaries, for the types of reports normally produced in the reactor development program. The computer-assisted text-processing system, called TXT, has been implemented to benefit primarily the originator of technical reports. The system is of particular value to professional staff, such as scientists and engineers, who have responsibility for generating much correspondence or lengthy, complex reports or manuscripts - especially if prompt turnaround and high accuracy are required. It can produce text that contains special Greek or mathematical symbols. Written in FORTRAN and MACRO, the program TXT operates on a PDP-11 minicomputer under the RSX-11M multitask multiuser monitor. Peripheral hardware includes videoterminals, electrostatic printers, and magnetic disks. Either data- or word-processing tasks may be performed at the terminals. The repertoire of operations has been restricted so as to minimize user training and memory burden. Secretarial staff may be readily trained to make corrections from annotated copy. Some examples of camera-ready copy are provided.

  8. Three-axis electron-beam test facility

    NASA Technical Reports Server (NTRS)

    Dayton, J. A., Jr.; Ebihara, B. T.

    1981-01-01

    An electron beam test facility, which consists of a precision multidimensional manipulator built into an ultra-high-vacuum bell jar, was designed, fabricated, and operated at Lewis Research Center. The position within the bell jar of a Faraday cup, which samples current in the electron beam under test, is controlled by the manipulator. Three orthogonal axes of motion are controlled by stepping motors driven by digital indexers, and the positions are displayed on electronic totalizers. In the transverse directions, the limits of travel are approximately + or - 2.5 cm from the center with a precision of 2.54 micron (0.0001 in.); in the axial direction, approximately 15.0 cm of travel are permitted with an accuracy of 12.7 micron (0.0005 in.). In addition, two manually operated motions are provided: the pitch and yaw of the Faraday cup with respect to the electron beam can be adjusted to within a few degrees. The current is sensed by pulse transformers and the data are processed by a dual channel box car averager with a digital output. The beam tester can be operated manually or it can be programmed for automated operation. In the automated mode, the beam tester is controlled by a microcomputer (installed at the test site) which communicates with a minicomputer at the central computing facility. The data are recorded and later processed by computer to obtain the desired graphical presentations.
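
    A minimal sketch of the position-to-step conversion implied by the quoted transverse precision is shown below; the step size is taken from the stated 2.54-micron figure, while the indexer interface itself is not represented.

```python
# Convert a commanded transverse move into whole stepper-motor steps.
STEP_UM = 2.54           # assumed transverse step size, micrometres (from stated precision)

def steps_for_move(delta_mm: float) -> int:
    return round(delta_mm * 1000.0 / STEP_UM)

print(steps_for_move(1.0))   # a 1-mm move -> 394 steps (actual travel about 1.0008 mm)
```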

  9. A five-collector system for the simultaneous measurement of argon isotope ratios in a static mass spectrometer

    USGS Publications Warehouse

    Stacey, J.S.; Sherrill, N.D.; Dalrymple, G.B.; Lanphere, M.A.; Carpenter, N.V.

    1981-01-01

    A system is described that utilizes five separate Faraday-cup collector assemblies, aligned along the focal plane of a mass spectrometer, to collect simultaneous argon ion beams at masses 36-40. Each collector has its own electrometer amplifier and analog-to-digital measuring channel, the outputs of which are processed by a minicomputer that also controls the mass spectrometer. The mass spectrometer utilizes a 90° sector magnetic analyzer with a radius of 23 cm, in which some degree of z-direction focussing is provided for all the ion beams by the fringe field of the magnet. Simultaneous measurement of the ion beams helps to eliminate mass-spectrometer memory as a significant source of measurement error during an analysis. Isotope ratios stabilize between 7 and 9 s after sample admission into the spectrometer, and thereafter changes in the measured ratios are linear, typically to within ±0.02%. Thus the multi-collector arrangement permits very short extrapolation times for computation of initial ratios, and also provides the advantages of simultaneous measurement of the ion currents in that errors due to variations in ion beam intensity are minimized. A complete analysis takes less than 10 min, so that sample throughput can be greatly enhanced. In this instrument, the factor limiting analytical precision now lies in short-term apparent variations in the interchannel calibration factors. © 1981.
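
    The back-extrapolation the multicollector makes possible can be sketched as follows: once the measured ratio becomes linear in time, a least-squares line is fit and evaluated at t = 0 to estimate the initial ratio. The ratio values below are synthetic.

```python
# Least-squares extrapolation of a measured isotope ratio back to sample admission.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(8.0, 60.0, 2.0)                      # seconds after sample admission
ratio = 295.5 * (1.0 + 2e-4 * t) + rng.normal(0, 0.02, t.size)   # synthetic 40Ar/36Ar

slope, intercept = np.polyfit(t, ratio, 1)         # linear fit over the stable region
print(f"initial ratio (t = 0 intercept): {intercept:.2f}")
```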

  10. In-phase and out-of-phase axial-torsional fatigue behavior of Haynes 188 at 760 C

    NASA Technical Reports Server (NTRS)

    Kalluri, Sreeramesh; Bonacuse, Peter J.

    1991-01-01

    Isothermal, in-phase and out-of-phase axial-torsional fatigue experiments have been conducted at 760 C on uniform gage section, thin-walled tubular specimens of a wrought cobalt-base superalloy, Haynes 188. Test-control and data acquisition were accomplished with a minicomputer. Fatigue lives of the in- and out-of-phase axial-torsional fatigue tests have been estimated with four different multiaxial fatigue life prediction models that were developed primarily for predicting axial-torsional fatigue lives at room temperature. The models investigated were: (1) the von Mises equivalent strain range; (2) the Modified Multiaxiality Factor Approach; (3) the Modified Smith-Watson-Topper Parameter; and (4) the critical shear plane method of Fatemi, Socie, and Kurath. In general, life predictions by the von Mises equivalent strain range model were within a factor of 2 for a majority of the tests and the predictions by the Modified Multiaxiality Factor Approach were within a factor of 2, while predictions of the Modified Smith-Watson-Topper Parameter and of the critical shear plane method of Fatemi, Socie, and Kurath were unconservative and conservative, respectively, by up to factors of 4. In some of the specimens tested under combined axial-torsional loading conditions, fatigue cracks initiated near extensometer indentations. Two design modifications have been proposed to the thin-walled tubular specimen to overcome this problem.
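
    A minimal sketch of the first parameter named above, the von Mises equivalent strain range, is given below for the in-phase, fully plastic (effective Poisson's ratio 0.5) case; the strain ranges are illustrative and the strain-life correlation itself is not reproduced.

```python
# In-phase von Mises equivalent strain range from axial and engineering shear
# strain ranges, fully plastic form: sqrt(d_eps^2 + d_gamma^2 / 3).
import math

def von_mises_equivalent_strain_range(d_eps_axial, d_gamma_shear):
    return math.sqrt(d_eps_axial ** 2 + d_gamma_shear ** 2 / 3.0)

# hypothetical 0.4% axial and 0.6% engineering shear strain ranges
print(f"{von_mises_equivalent_strain_range(0.004, 0.006):.5f}")
```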

  11. AESOP XX: summary of proceedings. [Gatlinburg, Tennessee, April 24 to 26, 1979

    SciTech Connect

    none,

    1980-03-01

    The 20th meeting of the Association for Energy Systems, Operations, and Programming (AESOP) was held in Gatlinburg, Tennessee, on April 24 to 26, 1979. Representatives of DOE Headquarters discussed the effects that new security and privacy regulations will have on automatic data processing operations. The status and future possibilities of the Business Management Information System (BMIS) were also discussed. Then representatives of various DOE offices and contractors presented reports on various topics. This report contains two-page summaries of the papers presented at the meeting. Session topics and titles of papers were as follows: Washington report (New ADP issues; BMIS: the Business Management Information System; Nuclear weapons and the computer); Improving the productivity of the computing analyst/programer (What productivity improvement tools are available; Rocky Flats experience with SDM/70; Albuquerque Operations Office experience with SDM/70; Planning and project management; Minicomputer standards and programer productivity; MRC productivity gains through applications development tools); User viewpoints and expectations of data processing (User perspectives on computer applications; User viewpoints on environmental studies; Planning and implementing a procurement system; Two sides of the DP coin); Data base management (Use of data base systems within DOE; Future trends in data base hardware; Future trends in data base software; Toward automating the data base design process); and Management discussions. Complete versions of three of the papers have already been cited in ERA. These can be located by reference to the entry CONF-790431-- in the Report Number Index. (RWR)

  12. Determination of physical and chemical states of lubricants in concentrated contacts, part 1

    NASA Technical Reports Server (NTRS)

    Lauer, J. L.

    1979-01-01

    A Fourier emission infrared microspectrometer, set up on a vibration-proof optical table and interfaced to a dedicated minicomputer, was used to record infrared emission spectra from elastohydrodynamic bearing contacts. Its range was extended to cover the entire mid-infrared from 2 to 15 micron. A series of experiments with 5P4E polyphenyl ether showed the existence of a temperature gradient through the lubricant in an ehd contact, which is perpendicular to the flow direction. The experiments also show marked polarization of some of the spectral bands, indicating a molecular alignment. Alignment is less evident at high pressure than at low pressure. To account for this behavior, a model is suggested along the lines developed for the conformational changes observed in long-chain polymers when subjected to increased pressure--to accommodate closer packing, molecules become kinked and curl up. Experiments with a traction fluid showed periodic changes of flow pattern associated with certain spectral changes. These observations will be studied further. A study by infrared attenuated total reflection spectrophotometry was undertaken to determine whether gamma irradiation would change polyethylene wear specimens. The results were negative.

  13. A new approach for data acquisition at the JPL space simulators

    NASA Technical Reports Server (NTRS)

    Fisher, Terry C.

    1992-01-01

    In 1990, a personal computer based data acquisition system was put into service for the Space Simulators and Environmental Test Laboratory at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The new system replaced an outdated minicomputer system which had been in use since 1980. This new data acquisition system was designed and built by JPL for the specific task of acquiring thermal test data in support of space simulation and thermal vacuum testing at JPL. The data acquisition system was designed using powerful personal computers and local-area-network (LAN) technology. Reliability, expandability, and maintainability were some of the most important criteria in the design of the data system and in the selection of hardware and software components. The data acquisition system is used to record both test chamber operational data and thermal data from the unit under test. Tests are conducted in numerous small thermal vacuum chambers and in the large solar simulator and range in size from individual components using only 2 or 3 thermocouples to entire planetary spacecraft requiring in excess of 1200 channels of test data. The system supports several of these tests running concurrently. The previous data system is described along with reasons for its replacement, the types of data acquired, the new data system, and the benefits obtained from the new system including information on tests performed to date.

  14. Managing for the next big thing. Interview by Paul Hemp.

    PubMed

    Ruettgers, M

    2001-01-01

    In this HBR interview, CEO Michael Ruettgers speaks in detail about the managerial practices that have allowed EMC to anticipate and exploit disruptive technologies, market opportunities, and business models ahead of its competitors. He recounts how the company repeatedly ventured into untested markets, ultimately transforming itself from a struggling maker of minicomputer memory boards into a data storage powerhouse and one of the most successful companies of the past decade. The company has achieved sustained and nearly unrivaled revenue, profit, and share-price growth through a number of means. Emphasizing timing and speed, Ruettgers says, is critical. That's meant staggering products rather than developing them sequentially and avoiding the excessive refinements that slow time to market. Indeed, a sense of urgency, Ruettgers explains, has been critical to EMC's success. Processes such as quarterly goal setting and monthly forecasting meetings help maintain a sense of urgency and allow managers to get early glimpses of changes in the market. So does an environment in which personal accountability is stressed and the corporate focus is single-minded. Perhaps most important, the company has procedures to glean insights from customers. Intensive forums involving EMC engineers and leading-edge customers, who typically push for unconventional solutions to their problems, often yield new product features. Similarly, a customer service system that includes real-time monitoring of product use enables EMC to understand customer needs firsthand. PMID:11189457

  15. The 1983-84 Connecticut 45-Hz-band field-strength measurements

    NASA Astrophysics Data System (ADS)

    Bannister, P. R.

    1986-03-01

    Extremely low frequency (ELF) measurements are made of the transverse horizontal magnetic field strength received in Connecticut. The AN/BSR-1 receiver consists of an AN/UYK-20 minicomputer, a signal timing and interface unit (STIU), a rubidium frequency time standard, two magnetic tape recorders, and a preamplifier. The transmission source of these farfield (1.6-Mm range) measurements is the U.S. Navy's ELF Wisconsin Test Facility (WTF), located in the Chequamegon National Forest in north central Wisconsin, about 8 km south of the village of Clam Lake. The WTF consists of two 22.5-km antennas, one of which is situated approximately in the north-south (NS) direction and the other approximately in the east-west (EW) direction. Each antenna is grounded at both ends. The electrical axis of the WTF EW antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The electrical axis of the WTF NS antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The WTF array can be steered electrically. Its radiated power is approximately 0.5 W at 45 Hz and 1 W at 75 Hz. This report will compare results of 45 Hz band data taken during 1983 to 1984 with previous 45 Hz band measurements.

  16. Test plan for 32-bit microcomputers for the Water Resources Division; Chapter A, Test plan for acquisition of prototype 32-bit microcomputers

    USGS Publications Warehouse

    Hutchison, N.E.; Harbaugh, A.W.; Holloway, R.A.; Merk, C.F.

    1987-01-01

    The Water Resources Division (WRD) of the U.S. Geological Survey is evaluating 32-bit microcomputers to determine how they can complement, and perhaps later replace, the existing network of minicomputers. The WRD is also designing a National Water Information System (NWIS) that will combine and integrate the existing National Water Data Storage and Retrieval System (WATSTORE), National Water Data Exchange (NAWDEX), and components of several other existing systems. The procedures and testing done in a market evaluation of 32-bit microcomputers are documented. The results of the testing are documented in the NWIS Project Office. The market evaluation was done to identify commercially available hardware and software that could be used for implementing early NWIS prototypes to determine the applicability of 32-bit microcomputers for data base and general computing applications. Three microcomputers will be used for these prototype studies. The results of the prototype studies will be used to compile requirements for a Request for Procurement (RFP) for hardware and software to meet the WRD's needs in the early 1990's. The identification of qualified vendors to provide the prototype hardware and software included reviewing industry literature, and making telephone calls and personal visits to prospective vendors. Those vendors that appeared to meet general requirements were required to run benchmark tests. (Author's abstract)

  17. Spent fuel test. Climax data acquisition system integration report

    SciTech Connect

    Nyholm, R.A.; Brough, W.G.; Rector, N.L.

    1982-06-01

    The Spent Fuel Test - Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granitic rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. This multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system collects and processes data from more than 900 analog instruments. This report documents the design and functions of the hardware and software elements of the Data Acquisition System and describes the supporting facilities which include environmental enclosures, heating/air-conditioning/humidity systems, power distribution systems, fire suppression systems, remote terminal stations, telephone/modem communications, and workshop areas. 9 figures.

  18. Spent Fuel Test - Climax data acquisition system operations manual

    SciTech Connect

    Nyholm, R.A.

    1983-01-01

    The Spent Fuel Test-Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granite rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the US Department of Energy Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. The multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system (DAS) collects and processes data from more than 900 analog instruments. This report documents the software element of the LLNL developed SFT-C Data Acquisition System. It defines the operating system and hardware interface configurations, the special applications software and data structures, and support software.

  19. [Analysis of fatty acid composition of spotted fever group rickettsiae isolated in China by gas chromatography].

    PubMed

    Zhou, L; Fan, M; Chen, J; Cai, H; Zhou, F; Zhu, H

    1993-08-01

    In the present paper, the fatty acid composition of seven Chinese isolates of SFG rickettsiae and six prototype strains of SFG rickettsiae was analyzed by GC-MS. The tested prototype strains of SFG rickettsiae were R. sibirica (strains 232 and 246), R. conorii (Simko), R. rickettsi (R), R. akari (Kaplan), and R. australis (W58); the Chinese isolates were An-84, Se-85, W-88 (human strain), MT-84, FT-84 (D. nuttalli strain), TO-85 (ova of nuttalli), and the Chinese reference strain JH-74 (D. nuttalli). They were propagated in yolk sacs of embryonated hen eggs and purified by centrifugation in a 30%-36%-42% discontinuous renografin density gradient. The fatty acid composition of selected strains of SFG rickettsiae was analyzed by gas chromatography, and comparison was then carried out by single-linkage clustering on a mini-computer. Identification of the strains was performed based on the results obtained from GC-MS. Results showed that the fatty acid profiles of all the isolates from China were quantitatively similar to that of R. sibirica and quite different from the other prototype strains of SFG rickettsiae. PMID:8256442
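
    A minimal sketch of single-linkage grouping of fatty-acid profiles is given below; the four five-component profiles are synthetic stand-ins, not the published chromatography data.

```python
# Single-linkage hierarchical clustering of fatty-acid composition vectors.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

profiles = np.array([
    [22.1, 35.4, 12.3,  8.8, 21.4],   # isolate A (hypothetical % composition)
    [21.8, 35.9, 12.0,  9.1, 21.2],   # isolate B
    [30.5, 25.1, 15.6, 10.2, 18.6],   # prototype strain X
    [22.4, 34.8, 12.6,  8.5, 21.7],   # isolate C
])
tree = linkage(profiles, method="single", metric="euclidean")
print(fcluster(tree, t=2, criterion="maxclust"))   # the isolates group apart from X
```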

  20. Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function

    SciTech Connect

    Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

    1987-06-01

    A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
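
    The deconvolution idea can be sketched as follows: the liver time-activity curve is the convolution of the blood (input) curve with the organ's impulse response, so a least-squares deconvolution recovers that response, and the mean transit time follows as its first moment. The curves below are synthetic; the clinical processing chain is more elaborate (smoothing, regularization).

```python
# Matrix deconvolution of a synthetic liver curve, then MTT as a first moment.
import numpy as np

dt = 0.5                                   # minutes per frame
t = np.arange(0, 30, dt)
blood = np.exp(-t / 3.0)                   # synthetic input (blood) curve
h_true = np.exp(-t / 8.0)
h_true /= h_true.sum()                     # normalized impulse response
liver = np.convolve(blood, h_true)[: t.size]          # what the camera "sees"

# lower-triangular convolution matrix: liver[i] = sum_j blood[i - j] * h[j]
A = np.array([[blood[i - j] if i >= j else 0.0
               for j in range(t.size)] for i in range(t.size)])
h_est, *_ = np.linalg.lstsq(A, liver, rcond=None)

mtt = (t * h_est).sum() / h_est.sum()
print(f"estimated mean transit time: {mtt:.1f} min")
```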

  1. CAD application to power plant design, construction and operation

    SciTech Connect

    Pandya, J.M.; Marinkovich, P.S.; Mysore, R.K.

    1982-05-01

    The nuclear industry, beset with increasing costs of constructing new nuclear plants, needs new initiatives. One, recently developed, is the use of CAD-IGS techniques that have proven beneficial in reducing construction costs. This methodology has been used successfully in the aerospace and automotive industries for many years. Westinghouse CAD-IGS experience with an overseas nuclear plant, now under construction, demonstrates this to be an effective tool for design verification and project management with excellent capabilities for application to new designs and operating plant support. This is accomplished through the CAD plant model and associated data base, which results in reduced human error, more complete preengineering before the start of construction, and effective space utilization. Furthermore, the data base accurately represents the as-built plant, which is essential for expediting future plant upgrades. Computer-based modeling is less expensive than conventional scale modeling, and current technology developments such as optical scanners, photogrammetry, IGES, and advanced minicomputers will further improve its cost-effectiveness.

  2. Pacific Missile Test Center Information Resources Management Organization (code 0300): The ORACLE client-server and distributed processing architecture

    SciTech Connect

    Beckwith, A. L.; Phillips, J. T.

    1990-06-10

    Computing architectures using distributed processing and distributed databases are increasingly considered acceptable solutions for advanced data processing systems. This is occurring even though there is still considerable professional debate as to what "truly" distributed computing actually is and despite the relative lack of advanced relational database management software (RDBMS) capable of meeting database and system integrity requirements for developing reliable integrated systems. This study investigates the functionality of ORACLE database management software performing distributed processing between a MicroVAX/VMS minicomputer and three MS-DOS-based microcomputers. The ORACLE database resides on the MicroVAX and is accessed from the microcomputers with ORACLE SQL*NET, DECnet, and ORACLE PC TOOL PACKS. Data gathered during the study reveal that there is a demonstrable decrease in CPU demand on the MicroVAX, due to "distributed processing", when the ORACLE PC Tools are used to access the database as opposed to database access from "dumb" terminals. Also discovered were several hardware/software constraints that must be considered in implementing various software modules. The results of the study indicate that this distributed data processing architecture is becoming sufficiently mature and reliable, and should be considered for developing applications that reduce processing on central hosts. 33 refs., 2 figs.

  3. PSA: A program to streamline orbit determination for launch support operations

    NASA Technical Reports Server (NTRS)

    Legerton, V. N.; Mottinger, N. A.

    1988-01-01

    An interactive, menu-driven computer program was written to streamline the orbit determination process during the critical launch support phase of a mission. Residing on a virtual memory minicomputer, this program retains in core the quantities needed to obtain a least-squares estimate of the spacecraft trajectory, with interactive displays to assist in rapid radio metric data evaluation. Menu-driven displays allow real-time filter and data strategy development. Graphical and tabular displays can be sent to a laser printer for analysis without exiting the program. Products generated by this program feed back to the main orbit determination program in order to further refine the estimate of the trajectory. The final estimate provides a spacecraft ephemeris which is transmitted to the mission control center and used for antenna pointing and frequency predict generation by the Deep Space Network. The development and implementation process of this program differs from that used for most other navigation software by allowing the users to check important operating features during development and have changes made as needed.
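
    As a minimal sketch of the batch weighted least-squares step at the core of such an orbit-determination filter, given partials H of the radio-metric observations with respect to the state corrections and the observation residuals y, the estimated correction minimizes the weighted residual norm. This is textbook machinery under stated assumptions, not the PSA program itself; all numbers are synthetic.

```python
# Batch weighted least squares: x_hat = (H^T W H)^-1 H^T W y.
import numpy as np

def batch_least_squares(H, y, weights):
    W = np.diag(weights)
    cov = np.linalg.inv(H.T @ W @ H)          # formal covariance of the estimate
    return cov @ H.T @ W @ y, cov

rng = np.random.default_rng(0)
x_true = np.array([1.2, -0.4, 0.7])           # hypothetical 3-parameter correction
H = rng.normal(size=(50, 3))                  # observation partials
y = H @ x_true + rng.normal(scale=0.01, size=50)
x_hat, cov = batch_least_squares(H, y, np.full(50, 1e4))
print(x_hat)
```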

  4. Engineering information management in a distributed environment

    SciTech Connect

    Trost, S.R.

    1986-07-10

    Lawrence Livermore National Laboratory's (LLNL) Computer Integrated Manufacturing (CIM) project's goal is to implement a wide variety of Computer Aided Engineering (CAE) systems to support our engineering staff. As we move to routine operation, we are addressing the problems of integrated information flow. This paper describes how Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), analysis, and information systems interact and provide vital information, such as drawing release status, production job information, and analytical data. LLNL's information systems must handle a wide spectrum of classified and unclassified data in both paper and electronic form. The range of systems includes terminals, PC's, minicomputers, networks, and mainframe supercomputers. A natural progression toward stand-alone engineering workstations, PC-based CAD systems, and multiple vendors is occurring. Thus, we are taking steps to ensure that we retain system compatibility. Many such information systems have been attempted. Because results have not always been positive, we are using a pragmatic bottom-up approach to assure success. By beginning with small subsystems, and progressing to full integration, we ensure smooth information flow and provide users with information necessary for decision making. The path to data integration is strewn with obstacles and hazards. We describe many of these and the steps we are taking to remove them.

  5. Software for device-independent graphical input

    SciTech Connect

    Hamlin, G.

    1982-01-01

    A three-level model and a graphics software structure based on the model that was developed with the goal of making graphical applications independent of the input devices are described. The software structure makes graphical applications independent of the input devices in a manner similar to the way the SIGGRAPH CORE proposal makes them independent of the output devices. A second goal was to provide a convenient means for application programmers to specify the user-input language for their applications. The software consists of an input handler and a table-driven parser. The input handler manages a CORE-like event queue, changing input events into terminal symbols and making their terminal symbols available to the parser in a uniform manner. It also removes most device dependencies. The parser is table driven from a Backus-Naur form (BNF) grammar that specifies the user-input language. The lower level grammar rules remove the remaining device dependencies from the input, and the higher level grammar rules specify legal sentences in the user-input language. Implementation of this software is on a table-top minicomputer. Experience with retrofitting existing applications indicates that one can find a grammar that removes essentially all the device dependencies from the application proper.
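
    A minimal sketch of the two layers described above is shown below: an input handler that maps raw device events onto device-independent terminal symbols, and a small table-driven recognizer whose transition table plays the role of the BNF-derived parse tables. The event names and grammar are illustrative only.

```python
# Input-handler level: removes device dependence by mapping raw events to terminals.
RAW_TO_TERMINAL = {
    "tablet_button_1":   "PICK",
    "mouse_button_left": "PICK",
    "keyboard_return":   "CONFIRM",
    "keyboard_escape":   "CANCEL",
}

# Parser level: (state, terminal) -> (next_state, action), standing in for parse tables.
TABLE = {
    ("idle",        "PICK"):    ("have_object", "select object under cursor"),
    ("have_object", "PICK"):    ("have_target", "select destination"),
    ("have_target", "CONFIRM"): ("idle",        "execute move command"),
    ("have_object", "CANCEL"):  ("idle",        "abandon selection"),
    ("have_target", "CANCEL"):  ("idle",        "abandon selection"),
}

def run(raw_events):
    state = "idle"
    for ev in raw_events:
        terminal = RAW_TO_TERMINAL.get(ev)
        if terminal is None:
            continue                               # unknown device event: ignored
        state, action = TABLE.get((state, terminal), (state, "no-op"))
        print(f"{ev:18s} -> {terminal:8s} : {action}")

run(["tablet_button_1", "mouse_button_left", "keyboard_return"])
```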

  6. An interferometric strain-displacement measurement system

    NASA Technical Reports Server (NTRS)

    Sharpe, William N., Jr.

    1989-01-01

    A system for measuring the relative in-plane displacement over a gage length as short as 100 micrometers is described. Two closely spaced indentations are placed in a reflective specimen surface with a Vickers microhardness tester. Interference fringes are generated when they are illuminated with a He-Ne laser. As the distance between the indentations expands or contracts with applied load, the fringes move. This motion is monitored with a minicomputer-controlled system using linear diode arrays as sensors. Characteristics of the system are: (1) gage length ranging from 50 to 500 micrometers, but 100 micrometers is typical; (2) least-count resolution of approximately 0.0025 micrometer; and (3) sampling rate of 13 points per second. In addition, the measurement technique is non-contacting and non-reinforcing. It is useful for strain measurements over small gage lengths and for crack opening displacement measurements near crack tips. This report is a detailed description of a new system recently installed in the Mechanisms of Materials Branch at the NASA Langley Research Center. The intent is to enable a prospective user to evaluate the applicability of the system to a particular problem and assemble one if needed.
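
    A minimal sketch of how fringe motion converts to displacement and strain is given below, assuming the usual two-indentation interference relation d·sin(alpha) = m·lambda; averaging the fringe-order changes seen in the two patterns (at +alpha and -alpha) cancels rigid-body motion. The observation angle and fringe shifts are illustrative values, not data from the report.

```python
# Strain from fringe-order shifts over a short gage length (illustrative numbers).
import math

WAVELENGTH_UM = 0.6328        # He-Ne laser wavelength
ALPHA_RAD = math.radians(42)  # assumed fringe observation angle

def strain_from_fringes(d_m_plus, d_m_minus, gage_length_um):
    delta_d = 0.5 * (d_m_plus + d_m_minus) * WAVELENGTH_UM / math.sin(ALPHA_RAD)
    return delta_d / gage_length_um

# 0.5-fringe shift in each pattern over a 100-micrometre gage length
print(f"strain = {strain_from_fringes(0.5, 0.5, 100.0):.6f}")
```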

  7. Kinesthetic coupling between operator and remote manipulator

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.; Salisbury, J. K., Jr.

    1980-01-01

    A universal force-reflecting hand controller has been developed which allows the establishment of a kinesthetic coupling between the operator and a remote manipulator. The six-degree-of-freedom controller was designed to generate forces and torques on its three positional and three rotational axes in order to permit the operator to accurately feel the forces encountered by the manipulator and be as transparent to operate as possible. The universal controller has been used in an application involving a six-degree-of-freedom mechanical arm equipped with a six-dimensional force-torque sensor at its base. In this application, the hand controller acts as a position control input device to the arm, while forces and torques sensed at the base of the mechanical hand back drive the hand controller. The positional control relation and the back driving of the controller according to inputs experienced by the force-torque sensor are established through complex mathematical transformations performed by a minicomputer. The hand controller is intended as a development tool for investigating force-reflecting master-slave manipulator control technology.
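
    A minimal sketch of the force-reflection idea follows: the wrench (force, torque) sensed at the base of the mechanical hand is rotated into the hand-controller frame and scaled before back-driving the controller's six axes. The rotation and gains are placeholders; the real transformations involve full kinematic chains computed by the minicomputer.

```python
# Rotate and scale a sensed wrench into hand-controller axis commands.
import numpy as np

def reflect_wrench(force_sensor, torque_sensor, R_sensor_to_hc,
                   force_gain=0.1, torque_gain=0.05):
    f_hc = force_gain * R_sensor_to_hc @ force_sensor
    t_hc = torque_gain * R_sensor_to_hc @ torque_sensor
    return np.concatenate([f_hc, t_hc])      # commands for the six controller axes

R = np.eye(3)                                 # placeholder frame alignment
print(reflect_wrench(np.array([2.0, 0.0, -5.0]), np.array([0.1, 0.0, 0.0]), R))
```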

  8. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  9. Real-time measurement of plutonium in air by direct-inlet surface ionization mass spectrometry. Status report

    SciTech Connect

    Stoffels, J.J.

    1980-04-01

    A new technique is being developed for monitoring low-level airborne plutonium on a real-time basis. The technique is based on surface ionization mass spectrometry of airborne particles. It will be capable of measuring plutonium concentrations below the maximum permissible concentration (MPC) level. A complete mass spectrometer was designed and constructed for this purpose. Major components which were developed and made operational for the instrument include an efficient inlet for directly sampling particles in air, a wide dynamic range ion detector and a minicomputer-based ion-burst measurement system. Calibration of the direct-inlet mass spectrometer (DIMS) was initiated to establish the instrument's response to plutonium dioxide as a function of concentration and particle size. This work revealed an important problem - bouncing of particles upon impact with the ionizing filament. Particle bounce results in a significant loss of measurement sensitivity. The feasibility of using an oven ionizer to overcome the particle bounce problem has been demonstrated. A rhenium oven ionizer was designed and constructed for the purpose of trapping particles which enter via the direct inlet. High-speed particles were trapped in the oven yielding a measurement sensitivity comparable to that for particles which are preloaded. Development of the Pu DIMS can now be completed by optimizing the oven design and calibrating the instrument's performance with UO/sub 2/ and CeO/sub 2/ particles as analogs to PuO/sub 2/ particles.

  10. Hardware requirements: A new generation partial reflection radar for studies of the equatorial mesosphere

    NASA Technical Reports Server (NTRS)

    Vincent, R. A.

    1986-01-01

    A new partial reflection (PR) radar is being developed for operation at the proposed Equatorial Observatory. The system is being designed to make maximum use of recent advances in solid-state technology in order to minimize the power requirements. In particular, it is planned to use a solid-state transmitter in place of the tube transmitters previously used in PR systems. Solid-state transmitters have the advantages that they do not need high voltage supplies, they do not require cathode heaters with a corresponding saving in power consumption, and parts are readily available and inexpensive. It should be possible to achieve 15 kW peak powers with recently announced fast switching transistors. Since high mean powers are desirable for obtaining good signal-to-noise ratios, it is also planned to phase code the transmitted pulses and decode after coherent integration. All decoding and signal processing will be carried out in dedicated microprocessors before the signals are passed to a microcomputer for on-line analysis. Recent tests have shown that an Olivetti M24 micro (an IBM compatible) running an 8-MHz clock with an 8087 coprocessor can analyze data at least as fast as the minicomputers presently being used with the Adelaide PR radar and at a significantly lower cost. The processed winds data will be stored in nonvolatile CMOS RAM modules; about 0.5 to 1 Mbyte is required to store one week's information.
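
    A minimal sketch of pulse coding and decoding of the kind planned for the radar is shown below: the transmitted pulse carries a binary phase code, and after coherent integration the receiver cross-correlates with the same code to recover range resolution with the full pulse energy. A 13-bit Barker code is used here only as an example; the actual code choice is not specified above, and the echo parameters are synthetic.

```python
# Phase-coded echoes, coherent integration, then matched-filter (correlation) decoding.
import numpy as np

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

rng = np.random.default_rng(1)
n_range_gates, target_gate, n_pulses = 200, 80, 64
integrated = np.zeros(n_range_gates)
for _ in range(n_pulses):                                   # coherent integration
    echo = np.zeros(n_range_gates)
    echo[target_gate:target_gate + barker13.size] += 0.2 * barker13   # weak coded echo
    integrated += echo + rng.normal(scale=1.0, size=n_range_gates)    # plus noise

decoded = np.correlate(integrated, barker13, mode="valid")  # matched-filter decode
print("detected range gate:", int(np.argmax(decoded)))      # near gate 80
```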

  11. A real-time electronic imaging system for solar X-ray observations from sounding rockets

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Ting, J. W.; Gerassimenko, M.

    1979-01-01

    A real-time imaging system for displaying the solar coronal soft X-ray emission, focussed by a grazing incidence telescope, is described. The design parameters of the system, which is to be used primarily as part of a real-time control system for a sounding rocket experiment, are identified. Their achievement with a system consisting of a microchannel plate, for the conversion of X-rays into visible light, and a slow-scan vidicon, for recording and transmission of the integrated images, is described in detail. The system has a quantum efficiency better than 8% above 8 Å, a dynamic range of 1000 coupled with a sensitivity to single photoelectrons, and provides a spatial resolution of 15 arc seconds over a field of view of 40 x 40 square arc minutes. The incident radiation is filtered to eliminate wavelengths longer than 100 Å. Each image contains 3.93 x 10^5 bits of information and is transmitted to the ground where it is processed by a mini-computer and displayed in real-time on a standard TV monitor.

  12. WATEQ4F - a personal computer Fortran translation of the geochemical model WATEQ2 with revised data base

    USGS Publications Warehouse

    Ball, J.W.; Nordstrom, D.K.; Zachmann, D.W.

    1987-01-01

    A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available, one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE also have been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)
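
    A minimal sketch of the mineral-saturation computation at the heart of such speciation codes follows: once the ion activities are known, each mineral's saturation index is log10 of the ion activity product over its equilibrium constant. The activities below are placeholders, not WATEQ4F data; the calcite log K is the commonly cited 25 C value.

```python
# Saturation index: SI = log10(IAP) - log10(K).
import math

def saturation_index(ion_activity_product, log_k):
    return math.log10(ion_activity_product) - log_k

# hypothetical calcite check: IAP = a(Ca++) * a(CO3--)
a_ca, a_co3 = 10 ** -3.2, 10 ** -5.1
print(f"SI(calcite) = {saturation_index(a_ca * a_co3, log_k=-8.48):.2f}")
```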

  13. In-phase and out-of-phase axial-torsional fatigue behavior of Haynes 188 at 760 C

    SciTech Connect

    Kalluri, S.; Bonacuse, P.J.

    1991-10-01

    Isothermal, in-phase and out-of-phase axial-torsional fatigue experiments have been conducted at 760 C on uniform gage section, thin-walled tubular specimens of a wrought cobalt-base superalloy, Haynes 188. Test-control and data acquisition were accomplished with a minicomputer. Fatigue lives of the in- and out-of-phase axial-torsional fatigue tests have been estimated with four different multiaxial fatigue life prediction models that were developed primarily for predicting axial-torsional fatigue lives at room temperature. The models investigated were: (1) the von Mises equivalent strain range; (2) the Modified Multiaxiality Factor Approach; (3) the Modified Smith-Watson-Topper Parameter; and (4) the critical shear plane method of Fatemi, Socie, and Kurath. In general, life predictions by the von Mises equivalent strain range model were within a factor of 2 for a majority of the tests and the predictions by the Modified Multiaxiality Factor Approach were within a factor of 2, while predictions of the Modified Smith-Watson-Topper Parameter and of the critical shear plane method of Fatemi, Socie, and Kurath were unconservative and conservative, respectively, by up to factors of 4. In some of the specimens tested under combined axial-torsional loading conditions, fatigue cracks initiated near extensometer indentations. Two design modifications have been proposed to the thin-walled tubular specimen to overcome this problem.

  14. Simulation and analyses of the aeroassist flight experiment attitude update method

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.

    1991-01-01

    A method which will be used to update the alignment of the Aeroassist Flight Experiment's Inertial Measuring Unit is simulated and analyzed. This method, the Star Line Maneuver, uses measurements from the Space Shuttle Orbiter star trackers along with an extended Kalman filter to estimate a correction to the attitude quaternion maintained by an Inertial Measuring Unit in the Orbiter's payload bay. This quaternion is corrupted by on-orbit bending of the Orbiter payload bay with respect to the Orbiter navigation base, which is incorporated into the payload quaternion when it is initialized via a direct transfer of the Orbiter attitude state. The method of updating this quaternion is examined through verification of baseline cases and Monte Carlo analysis using a simplified simulation. The simulation uses nominal state dynamics and measurement models from the Kalman filter as its real-world models, and is programmed on a MicroVAX minicomputer using Matlab, an interactive matrix analysis tool. Results are presented which confirm and augment previous performance studies, thereby enhancing confidence in the Star Line Maneuver design methodology.
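
    A minimal sketch of the multiplicative correction such an attitude filter applies is shown below: the filter estimates a small rotation (expressed here as a rotation vector) and composes it with the current IMU quaternion. This illustrates only the quaternion bookkeeping, not the AFE Star Line Maneuver filter itself; the correction values are arbitrary.

```python
# Compose a small-angle attitude correction with a quaternion (scalar-last [x, y, z, w]).
import numpy as np

def quat_multiply(q, r):
    x1, y1, z1, w1 = q
    x2, y2, z2, w2 = r
    return np.array([w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2,
                     w1*w2 - x1*x2 - y1*y2 - z1*z2])

def apply_correction(q_imu, delta_theta):
    """Compose a small-angle correction (rotation vector, rad) with q_imu."""
    dq = np.concatenate([0.5 * delta_theta, [1.0]])   # small-angle quaternion
    q = quat_multiply(dq, q_imu)
    return q / np.linalg.norm(q)

q0 = np.array([0.0, 0.0, 0.0, 1.0])
print(apply_correction(q0, np.array([0.001, -0.002, 0.0005])))
```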

  15. Definition study for photovoltaic residential prototype system

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Hulstrom, R. L.; Cookson, C.; Waldman, B. H.; Lane, R. A.

    1976-01-01

    A parametric sensitivity study and definition of the conceptual design is presented. A computer program containing the solar irradiance, solar array, and energy balance models was developed to determine the sensitivities of solar insolation and the corresponding solar array output at five sites selected for this study as well as the performance of several solar array/battery systems. A baseline electrical configuration was chosen, and three design options were recommended. The study indicates that the most sensitive parameters are the solar insolation and the inverter efficiency. The baseline PST selected comprises a 133 sq m solar array, a 250 ampere-hour battery, one to three inverters, and a full shunt regulator to limit the upper solar array voltage. A minicomputer controlled system is recommended to provide the overall control, display, and data acquisition requirements. Architectural renderings of two photovoltaic residential concepts, one above ground and the other underground, are presented. The institutional problems were defined in the areas of legal liabilities during and after installation of the PST, labor practices, building restrictions and architectural guides, and land use.

  16. Evaluation Of Photovoltaic Panels With Ir Thermography

    NASA Astrophysics Data System (ADS)

    Tscharner, R.; Rao, K. H. S.; Schwarz, R.; Shah, A. V.

    1985-03-01

    Electronic infrared thermography allows fast and non-destructive measurements of temperature distributions of encapsulated solar cells on a panel under various operating conditions. Differences in the performance of the individual cells can be visualized and so-called "hot spots" in the panel (due to partial shadowing, etc.) can be analyzed. IR image analysis was also used to determine the IR emissivity of different types of solar cells. This is one of the important criteria for the evaluation of the electrical-thermal behaviour of a cell for photovoltaic or combined photovoltaic-thermal applications. We designed a portable IR measuring system having both on-line and off-line interfacing possibilities with a minicomputer for image processing and analysis. During field measurements, the composite video signal of the IRT system is recorded on a video cassette. For analysis, the thermal images can be visualized on a TV monitor and, at the same time, they are digitized and transferred onto a PDP-11 computer for processing. A suitable software package was developed to obtain different types of thermograms such as simple line profiles, 3-dimensional relief images and isothermal maps.
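
    A minimal sketch of the kind of post-processing named at the end of the abstract (line profiles and isothermal maps from a digitized thermal frame) is given below; the array names and thresholds are illustrative assumptions, not the PDP-11 software package itself.

        import numpy as np

        def line_profile(frame, row):
            """Temperature profile along one scan line of a digitized frame."""
            return frame[row, :]

        def isotherm_mask(frame, t_low, t_high):
            """Boolean map of pixels inside an isothermal band, e.g. for
            locating 'hot spots' on a panel."""
            return (frame >= t_low) & (frame <= t_high)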

  17. A C Language Implementation of the SRO (Murdock) Detector/Analyzer

    USGS Publications Warehouse

    Murdock, James N.; Halbert, Scott E.

    1991-01-01

    A signal detector and analyzer algorithm was described by Murdock and Hutt in 1983. The algorithm emulates the performance of a human interpreter of seismograms. It estimates the signal onset, the direction of onset (positive or negative), the quality of these determinations, the period and amplitude of the signal, and the background noise at the time of the signal. The algorithm has been coded in C language for implementation as a 'blackbox' for data similar to that of the China Digital Seismic Network. A driver for the algorithm is included, as are suggestions for other drivers. In all of these routines, plus several FIR filters that are included as well, floating point operations are not required. Multichannel operation is supported. Although the primary use of the code has been for in-house processing of broadband and short period data of the China Digital Seismic Network, provisions have been made to process the long period and very long period data of that system as well. The code for the in-house detector, which runs on a mini-computer, is very similar to that of the field system, which runs on a microprocessor. The code is documented.
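
    A minimal sketch of the kind of integer-only FIR filtering the abstract alludes to is given below; the coefficient-scaling scheme and names are assumptions for illustration, not the published C code.

        def fir_filter_int(samples, coeffs, shift):
            """Integer-only FIR filter: coefficients are pre-scaled by 2**shift
            and the accumulator is shifted back down after summation, so no
            floating point operations are required."""
            out = []
            hist = [0] * len(coeffs)
            for s in samples:
                hist = [s] + hist[:-1]          # shift register of past samples
                acc = sum(c * h for c, h in zip(coeffs, hist))
                out.append(acc >> shift)
            return out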

  18. A geographic information system for resource managers based on multi-level remote sensing data

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.

    1984-01-01

    Procedures followed in developing a test case geographic information system derived primarily from remotely sensed data for the North Cache Soil Conservation District (SCD) in northern Utah are outlined. The North Cache SCD faces serious problems regarding water allocation, flood and geologic hazards, urban encroachment into prime farmland, soil erosion, and wildlife habitat. Four fundamental data planes were initially entered into the geo-referenced data base: (1) land use/land cover information for the agricultural and built-up areas of the valley obtained from various forms of aerial photography; (2) vegetation/land cover in mountains classified digitally from LANDSAT; (3) geomorphic terrain units derived from aerial photography and soil maps; and (4) digital terrain maps obtained from DMA digital data. The land use/vegetation/land cover information from manual photographic and LANDSAT interpretation was joined digitally into a single data plane with an integrated legend, and segmented into quadrangle units. These were merged with the digitized geomorphic units and the digital terrain data using a Prime 400 minicomputer. All data planes were geo-referenced to a UTM coordinate grid.

  19. H-coal fluid dynamics. Final report, August 1, 1977-December 31, 1979

    SciTech Connect

    Not Available

    1980-04-16

    This report presents the results of work aimed at understanding the hydrodynamic behavior of the H-Coal reactor. A summary of the literature search related to the fluid dynamic behavior of gas/liquid/solid systems has been presented. Design details of a cold flow unit were discussed. The process design of this cold flow model followed practices established by HRI in their process development unit. The cold flow unit has been used to conduct experiments with nitrogen, kerosene, or kerosene/coal char slurries, and HDS catalyst, which at room temperature have properties similar to those existing in the H-Coal reactor. Mineral oil, a high-viscosity liquid, was also used. The volume fractions occupied by gas/liquid slurries and catalyst particles were determined by several experimental techniques. The use of a mini-computer for data collection and calculation has greatly accelerated the analysis and reporting of data. Data on nitrogen/kerosene/HDS catalyst and coal char fines are presented in this paper. Correlations identified in the literature search were utilized to analyze the data. From this analysis it became evident that the Richardson-Zaki correlation describes the effect of slurry flow rate on catalyst expansion. Three-phase fluidization data were analyzed with two models.
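
    The Richardson-Zaki correlation mentioned above relates the expanded-bed voidage to the superficial slurry velocity. A minimal sketch follows; the exponent n (which in practice depends on the particle Reynolds number) and the example numbers are illustrative assumptions only.

        def richardson_zaki_voidage(u_superficial, u_terminal, n):
            """Bed voidage from the Richardson-Zaki relation u/u_t = eps**n."""
            return (u_superficial / u_terminal) ** (1.0 / n)

        # Example: u = 0.02 m/s, terminal settling velocity 0.05 m/s, n = 3.0
        print(richardson_zaki_voidage(0.02, 0.05, 3.0))  # voidage ~0.74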

  20. High strain rate properties of unidirectional composites, part 1

    NASA Technical Reports Server (NTRS)

    Daniel, I. M.

    1991-01-01

    Experimental methods were developed for testing and characterization of composite materials at strain rates ranging from quasi-static to over 500 s(sup -1). Three materials were characterized, two graphite/epoxies and a graphite/S-glass/epoxy. Properties were obtained by testing thin rings 10.16 cm (4 in.) in diameter, 2.54 cm (1 in.) wide, and six to eight plies thick under internal pressure. Unidirectional 0 degree, 90 degree, and 10 degree off-axis rings were tested to obtain longitudinal, transverse, and in-plane shear properties. In the dynamic tests internal pressure was applied explosively through a liquid and the pressure was measured with a calibrated steel ring. Strains in the calibration and specimen rings were recorded with a digital processing oscilloscope. The data were processed and the equation of motion solved numerically by the mini-computer attached to the oscilloscope. Results were obtained and plotted in the form of dynamic stress-strain curves. Longitudinal properties which are governed by the fibers do not vary much with strain rate with only a moderate (up to 20 percent) increase in modulus. Transverse modulus and strength increase sharply with strain rate reaching values up to three times the static values. The in-plane shear modulus and shear strength increase noticeably with strain rate by up to approximately 65 percent. In all cases ultimate strains do not vary significantly with strain rates.

  1. Integration and software for thermal test of heat rate sensors. [space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Wojciechowski, C. J.; Shrider, K. R.

    1982-01-01

    A minicomputer controlled radiant test facility is described which was developed and calibrated in an effort to verify analytical thermal models of instrumentation islands installed aboard the space shuttle external tank to measure thermal flight parameters during ascent. Software was provided for the facility as well as for development tests on the SRB actuator tail stock. Additional testing was conducted with the test facility to determine the temperature and heat flux rate and loads required to effect a change of color in the ET tank external paint. This requirement resulted from the review of photographs taken of the ET at separation from the orbiter which showed that 75% of the external tank paint coating had not changed color from its original white color. The paint on the remaining 25% of the tank was either brown or black, indicating that it had degraded due to heating or that the spray-on foam insulation had receded in these areas. The operational capability of the facility as well as the various tests which were conducted and their results are discussed.

  2. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    NASA Astrophysics Data System (ADS)

    Vega, J.; Mollinedo, A.; López, A.; Pacios, L.; Dormido, S.

    1997-01-01

    The data acquisition system for TJ-II will consist of a central computer, containing the data base of the device, and a set of independent systems (personal computers, embedded ones, workstations, minicomputers, PLCs, and microprocessor systems among others), controlling data collection, and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (μs-ms) real time requirements, soft (ms-s) real time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller devices (PLCs) used for diagnostic low level control will be linked among them through a fast serial link, the RS485 Profibus standard. One VME crate, running on the OS-9 real time operating system, will be assigned as a gateway, so as to connect the PLCs based systems with an Ethernet segment.

  3. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    SciTech Connect

    Vega, J.; Mollinedo, A.; Lopez, A.; Pacios, L. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense 22, 28040 Madrid (Spain)]; Dormido, S. [Dpto. Informatica y Automatica, Facultad de Ciencias, UNED, Avda. Senda del Rey s/n, 28040 Madrid (Spain)]

    1997-01-01

    The data acquisition system for TJ-II will consist of a central computer, containing the data base of the device, and a set of independent systems (personal computers, embedded ones, workstations, minicomputers, PLCs, and microprocessor systems among others), controlling data collection, and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (μs-ms) real time requirements, soft (ms-s) real time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller devices (PLCs) used for diagnostic low level control will be linked among them through a fast serial link, the RS485 Profibus standard. One VME crate, running on the OS-9 real time operating system, will be assigned as a gateway, so as to connect the PLCs based systems with an Ethernet segment. © 1997 American Institute of Physics.

  4. Pressure Measurement Systems

    NASA Technical Reports Server (NTRS)

    1990-01-01

    System 8400 is an advanced system for measurement of gas and liquid pressure, along with a variety of other parameters, including voltage, frequency and digital inputs. System 8400 offers exceptionally high speed data acquisition through parallel processing, and its modular design allows expansion from a relatively inexpensive entry level system by the addition of modular Input Units that can be installed or removed in minutes. Douglas Juanarena was on the team of engineers that developed a new technology known as ESP (electronically scanned pressure). The Langley ESP measurement system was based on miniature integrated circuit pressure-sensing transducers that communicated pressure information to a minicomputer. In 1977, Juanarena formed PSI to exploit the NASA technology. In 1978 he left Langley, obtained a NASA license for the technology, and introduced the first commercial product, the 780B pressure measurement system. PSI developed a pressure scanner for automation of industrial processes. Now in its second design generation, the DPT-6400 is capable of making 2,000 measurements a second and can be expanded to 64 channels by the addition of slave units. The new System 8400 represents PSI's bid to further exploit the $600 million U.S. industrial pressure measurement market. It is geared to provide a turnkey solution to physical measurement.

  5. Upgrading NASA/DOSE laser ranging system control computers

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.

    1993-01-01

    Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970's vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of controller computers at a relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are now being designed using IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-windows/Motif; IB and serial interfaces have been chosen. This design supports minimizing short and long term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage with the first systems targeted for production in mid-1993.

  6. Interactive initialization of heat flux parameters for numerical models using satellite temperature measurements. [Kansas and Indiana

    NASA Technical Reports Server (NTRS)

    Carlson, T. N. (principal investigator)

    1982-01-01

    A method for obtaining patterns of moisture availability (and net evaporation) from satellite infrared measurements employs Carlson's boundary layer model and a variety of image processing routines executed by a minicomputer. To test the method with regard to regional scale moisture analyses, two case studies were chosen because of the availability of HCMM data and because of the presence of a large horizontal gradient in antecedent precipitation and crop moisture index. Results show some correlation in both cases between antecedent precipitation and derived moisture availability. Apparently, regional-scale moisture availability patterns can be determined with some degree of fidelity but the values themselves may be useful only in the relative sense and significant to within plus or minus one category of dryness over a range of 4 or 5 categories between absolutely dry and field saturation. Preliminary results suggest that the derived moisture values correlate best with longer-term precipitation totals, suggesting that the infrared temperatures respond more sensitively to a relatively deep substrate layer.

  7. Operational Performance Of Optical Disk Systems

    NASA Astrophysics Data System (ADS)

    Ammon, G. J.; Calabria, J. A.

    1985-04-01

    Two optical disk "jukebox" mass memory storage systems have been developed that provide access to any data in a store of 10^13 bits (1250 Gbytes) within six seconds. These engineering models have been developed under a program sponsored by the Air Force and NASA and have recently been delivered to testbed facilities -- one to NASA Marshall Space Flight Center and one to the AF Rome Air Development Center. Each system contains a library of 125 optical disks with mechanisms for retrieving any disk, and recording or playing digital data at 50 Mb/s. Disks in protective cartridges are moved from the store to a load station, which then mounts the disks onto a precision turntable. Still in the cartridge, they are spun up to speed and data is recorded or played back via focused laser beams. The major emphasis in both the NASA and Air Force jukebox optical disk systems has been reliability of operation. Enhancements of the mechanical, electrical, and software designs have been implemented to minimize the user downtime in an operating scenario. The NASA system will interface to a database management system using a fiber optics data bus, while the Air Force system will interface to a DEC VAX 11/750 minicomputer. Both systems will store digitized imagery and provide fast access to a huge store of such images.
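
    The quoted capacity and transfer figures can be cross-checked with a few lines of arithmetic; the numbers below simply restate values given in the abstract (total store, 125 disks, 50 Mb/s record/playback rate).

        total_bits = 1e13            # total store: 10^13 bits (~1250 Gbytes)
        disks = 125
        rate_bps = 50e6              # record/playback rate: 50 Mb/s

        bits_per_disk = total_bits / disks                    # 8e10 bits, ~10 Gbytes per disk
        minutes_per_disk = bits_per_disk / rate_bps / 60.0    # ~27 min to stream one full disk
        print(bits_per_disk / 8 / 1e9, minutes_per_disk)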

  8. Software used with the flux mapper at the solar parabolic dish test site

    NASA Technical Reports Server (NTRS)

    Miyazono, C.

    1984-01-01

    Software for data archiving and data display was developed on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high radiant energy scanning device designed to measure radiant flux energies expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.

  9. A system for processing Landsat and other georeferenced data for resource management applications

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.

    1979-01-01

    The NASA Earth Resources Laboratory has developed a transferrable system for processing Landsat and disparate data with capabilities for digital data classification, georeferencing, overlaying, and data base management. This system is known as the Earth Resources Data Analysis System. The versatility of the system has been demonstrated with applications in several disciplines. A description is given of a low-cost data system concept that is suitable for transfer to one's available in-house minicomputer or to a low-cost computer purchased for this purpose. Software packages are described that process Landsat data to produce surface cover classifications and that geographically reference the data to the UTM projection. Programs are also described that incorporate several sets of Landsat derived information, topographic information, soils information, rainfall information, etc., into a data base. Selected application algorithms are discussed and sample products are presented. The types of computers on which the low-cost data system concept has been implemented are identified, typical implementation costs are given, and the source where the software may be obtained is identified.

  10. Peripheral processors for high-speed simulation. [helicopter cockpit simulator

    NASA Technical Reports Server (NTRS)

    Karplus, W. J.

    1977-01-01

    This paper describes some of the results of a study directed to the specification and procurement of a new cockpit simulator for an advanced class of helicopters. A part of the study was the definition of a challenging benchmark problem, and detailed analyses of it were made to assess the suitability of a variety of simulation techniques. The analyses showed that a particularly cost-effective approach to the attainment of adequate speed for this extremely demanding application is to employ a large minicomputer acting as host and controller for a special-purpose digital peripheral processor. Various realizations of such peripheral processors, all employing state-of-the-art electronic circuitry and a high degree of parallelism and pipelining, are available or under development. The types of peripheral processors - array processors, simulation-oriented processors, and arrays of processing elements - are analyzed and compared. They are particularly promising approaches which should be suitable for high-speed simulations of all kinds, the cockpit simulator being a case in point.

  11. Research, development and demonstration of nickel-zinc batteries for electric vehicle propulsion. Annual report, 1979. [70 W/lb

    SciTech Connect

    Not Available

    1980-06-01

    This second annual report under Contract No. 31-109-39-4200 covers the period July 1, 1978 through August 31, 1979. The program demonstrates the feasibility of the nickel-zinc battery for electric vehicle propulsion. The program is divided into seven distinct but highly interactive tasks collectively aimed at the development and commercialization of nickel-zinc technology. These basic technical tasks are separator development, electrode development, product design and analysis, cell/module battery testing, process development, pilot manufacturing, and thermal management. A Quality Assurance Program has also been established. Significant progress has been made in the understanding of separator failure mechanisms, and a generic category of materials has been specified for the 300+ deep discharge (100% DOD) applications. Shape change has been reduced significantly. A methodology has been generated with the resulting hierarchy: cycle life cost, volumetric energy density, peak power at 80% DOD, gravimetric energy density, and sustained power. Generation I design full-sized 400-Ah cells have yielded in excess of 70 W/lb at 80% DOD. Extensive testing of cells, modules, and batteries is done in a minicomputer-based testing facility. The best life attained with electric vehicle-size cell components is 315 cycles at 100% DOD (1.0V cutoff voltage), while four-cell (approx. 6V) module performance has been limited to about 145 deep discharge cycles. The scale-up of processes for production of components and cells has progressed to facilitate component production rates of thousands per month. Progress in the area of thermal management has been significant, with the development of a model that accurately represents heat generation and rejection rates during battery operation. For the balance of the program, cycle life of > 500 has to be demonstrated in modules and full-sized batteries. 40 figures, 19 tables. (RWR)

  12. Integrating real-time digital signal processing capability into a large research and development facility

    SciTech Connect

    Manges, W.W.; Mallinak-Glassell, J.T.; Breeding, J.E.; Jansen, J.M. Jr.; Tate, R.M.; Bentz, R.R.

    1992-12-31

    The Instrumentation and Controls Division at Oak Ridge National Laboratory recently developed and installed a large scale, real-time measurement system for the world's largest pressurized water tunnel. This water tunnel, the Large Cavitation Channel (LCC), provides a research and development facility for the study of acoustic phenomena to aid in model testing of new naval ship and submarine designs. The LCC design required the development of a near-field beamformer in addition to extending the range of real-time processing capability to frequencies unavailable at other facilities. The beamformer acquires and processes time-domain acoustic data at 9.5 MB/s from up to 45 hydrophones. The acoustic processing software provides for the real-time analysis of acoustic data. Up to 128 facility sensors are sampled, time stamped, and stored at 600 kB/s. The system generates information for acoustic phenomena and facility measurements in real time so that the operator can make facility adjustments to control the running experiment. This real-time control of facility conditions requires that the measurement system integrate facility and acoustic data for simultaneous display to the operator in engineering units via high-end workstations. A dual-host minicomputer configuration with high-end workstations connected via an Ethernet networking cluster controls and integrates measurement and display subsystems. The system architecture integrates high-performance array processors, matrix switches, signal conditioning amplifiers, antialiasing filter subsystems, high-precision analog-to-digital subsystems, high-performance data disks, and support equipment. The hardware and software architecture with its distributed computers and distributed real-time data base, the signal processing algorithms and architecture, and the flexible user interface for facility and measurements integration are described in this paper.
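
    The core beamforming operation described above can be sketched as a simple time-domain delay-and-sum; the steering delays, channel count, and names below are illustrative assumptions, not the facility's array-processor implementation.

        import numpy as np

        def delay_and_sum(x, delays_samples):
            """Time-domain delay-and-sum beamformer. x has shape
            (n_sensors, n_samples); delays_samples gives the integer steering
            delay per sensor (wrap-around from np.roll is a simplification)."""
            out = np.zeros(x.shape[1])
            for ch, d in enumerate(delays_samples):
                out += np.roll(x[ch], int(d))
            return out / x.shape[0]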

  13. Integrating real-time digital signal processing capability into a large research and development facility

    SciTech Connect

    Manges, W.W.; Mallinak-Glassell, J.T.; Breeding, J.E.; Jansen, J.M. Jr.; Tate, R.M.; Bentz, R.R.

    1992-01-01

    The Instrumentation and Controls Division at Oak Ridge National Laboratory recently developed and installed a large scale, real-time measurement system for the world's largest pressurized water tunnel. This water tunnel, the Large Cavitation Channel (LCC), provides a research and development facility for the study of acoustic phenomena to aid in model testing of new naval ship and submarine designs. The LCC design required the development of a near-field beamformer in addition to extending the range of real-time processing capability to frequencies unavailable at other facilities. The beamformer acquires and processes time-domain acoustic data at 9.5 MB/s from up to 45 hydrophones. The acoustic processing software provides for the real-time analysis of acoustic data. Up to 128 facility sensors are sampled, time stamped, and stored at 600 kB/s. The system generates information for acoustic phenomena and facility measurements in real time so that the operator can make facility adjustments to control the running experiment. This real-time control of facility conditions requires that the measurement system integrate facility and acoustic data for simultaneous display to the operator in engineering units via high-end workstations. A dual-host minicomputer configuration with high-end workstations connected via an Ethernet networking cluster controls and integrates measurement and display subsystems. The system architecture integrates high-performance array processors, matrix switches, signal conditioning amplifiers, antialiasing filter subsystems, high-precision analog-to-digital subsystems, high-performance data disks, and support equipment. The hardware and software architecture with its distributed computers and distributed real-time data base, the signal processing algorithms and architecture, and the flexible user interface for facility and measurements integration are described in this paper.

  14. An improved method for the quantitative analysis of M-mode echocardiograms.

    PubMed

    Brower, R W; van Dorp, W G; Vogel, J A; Roelandt, J R

    1975-10-01

    A computer-assisted system is described which speeds and extends the quantitative interpretation of M-mode echocardiographic recordings. The system consists of a digitizing tablet, minicomputer, TV monitor and a hard copy device. M-mode echocardiograms are placed on the digitizing surface and traced using the digitizing pen. The entered signal includes the endocardial surfaces of the anterior and posterior left ventricular wall for at least one cycle, and two Q waves from a simultaneously recorded ECG to identify end diastole and heart rate. End systole is determined automatically as corresponding to the minimum LV dimension. Results of analysis include continuous plots of estimated volume and circumferential fiber shortening rate (CFSR) vs time. Determinations of special interest are also displayed: end-diastolic volume (EDV) and end-systolic volume (ESV), ejection fraction, cardiac output, mean and peak CFSR. M-mode echocardiograms obtained from 25 normal volunteers are used to evaluate the system. The standard error of the estimate of the computer-assisted system is comparable to the error between observers; furthermore, the computer system adds no significant systematic or random error. Comparison between M-mode estimated volumes and angiographically determined values has been described previously, and Sy.x here is significantly greater. The main advantages of this system are: 1. a continuous plot of estimated LV volume and CFSR is provided; 2. beat-to-beat analyses are facilitated; 3. the automatic determination of end systole removes possible errors in judgement made previously; 4. it is time saving when one considers the amount of data obtained. With these advantages and the generally satisfactory performance in the clinical trials, this system appears to have extended the clinical quantitative capabilities of M-mode echocardiograms. PMID:1102317
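
    The derived quantities named above follow from the traced dimensions in a straightforward way. The sketch below uses a crude cube-formula volume estimate for illustration; the actual volume formula, function names, and example values are assumptions, not the system described in the abstract.

        def lv_volume(d_cm):
            """Crude cube-formula estimate of LV volume from an M-mode dimension."""
            return d_cm ** 3

        def ejection_fraction(edv_ml, esv_ml):
            """Ejection fraction from end-diastolic and end-systolic volumes."""
            return (edv_ml - esv_ml) / edv_ml

        def mean_cfsr(edd_cm, esd_cm, ejection_time_s):
            """Mean circumferential fiber shortening rate (circumferences/s)."""
            return (edd_cm - esd_cm) / (edd_cm * ejection_time_s)

        print(ejection_fraction(lv_volume(5.0), lv_volume(3.5)))  # ~0.66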

  15. Side-scan sonar mapping: Pseudo-real-time processing and mosaicking techniques

    SciTech Connect

    Danforth, W.W.; Schwab, W.C.; O'Brien, T.F. (Geological Survey, Woods Hole, MA (USA)); Karl, H. (Geological Survey, Menlo Park, CA (USA))

    1990-05-01

    The US Geological Survey (USGS) surveyed 1,000 km^2 of the continental shelf off San Francisco during a 17-day cruise, using a 120-kHz side-scan sonar system, and produced a digitally processed sonar mosaic of the survey area. The data were processed and mosaicked in real time using software developed at the Lamont-Doherty Geological Observatory and modified by the USGS, a substantial task due to the enormous amount of data produced by high-resolution side-scan systems. Approximately 33 megabytes of data were acquired every 1.5 hr. The real-time sonar images were displayed on a PC-based workstation and the data were transferred to a UNIX minicomputer where the sonar images were slant-range corrected, enhanced using an averaging method of desampling and a linear-contrast stretch, merged with navigation, geographically oriented at a user-selected scale, and finally output to a thermal printer. The hard-copy output was then used to construct a mosaic of the survey area. The final product of this technique is a UTM-projected map-mosaic of sea-floor backscatter variations, which could be used, for example, to locate appropriate sites for sediment sampling to ground truth the sonar imagery while still at sea. More importantly, reconnaissance surveys of this type allow for the analysis and interpretation of the mosaic during a cruise, thus greatly reducing the preparation time needed for planning follow-up studies of a particular area.
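
    Two of the image-processing steps named above (desampling by averaging and a linear contrast stretch) can be sketched in a few lines; the function names and the 8-bit output range are illustrative assumptions, not the Lamont-Doherty/USGS software.

        import numpy as np

        def average_desample(scan_line, factor):
            """Desample one sonar scan line by averaging consecutive samples."""
            n = (len(scan_line) // factor) * factor
            return scan_line[:n].reshape(-1, factor).mean(axis=1)

        def linear_stretch(img, lo, hi):
            """Linear contrast stretch of backscatter values to an 8-bit display range."""
            scaled = (np.clip(img, lo, hi) - lo) / float(hi - lo)
            return (scaled * 255).astype(np.uint8)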

  16. Acoustic systems for the measurement of streamflow

    USGS Publications Warehouse

    Laenen, Antonius; Smith, Winchell

    1983-01-01

    The acoustic velocity meter (AVM), also referred to as an ultrasonic flowmeter, has been an operational tool for the measurement of streamflow since 1965. Very little information is available concerning AVM operation, performance, and limitations. The purpose of this report is to consolidate information in such a manner as to provide a better understanding about the application of this instrumentation to streamflow measurement. AVM instrumentation is highly accurate and nonmechanical. Most commercial AVM systems that measure streamflow use the time-of-travel method to determine a velocity between two points. The systems operate on the principle that point-to-point upstream travel-time of sound is longer than the downstream travel-time, and this difference can be monitored and measured accurately by electronics. AVM equipment has no practical upper limit of measurable velocity if sonic transducers are securely placed and adequately protected. AVM systems used in streamflow measurement generally operate with a resolution of ±0.01 meter per second but this is dependent on system frequency, path length, and signal attenuation. In some applications the performance of AVM equipment may be degraded by multipath interference, signal bending, signal attenuation, and variable streamline orientation. Presently used minicomputer systems, although expensive to purchase and maintain, perform well. Increased use of AVM systems probably will be realized as smaller, less expensive, and more conveniently operable microprocessor-based systems become readily available. Available AVM equipment should be capable of flow measurement in a wide variety of situations heretofore untried. New signal-detection techniques and communication linkages can provide additional flexibility to the systems so that operation is possible in more river and estuary situations.
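
    The time-of-travel principle stated above reduces to a simple relation between the upstream and downstream travel times along the acoustic path. A minimal sketch, with the path length, travel times, and path angle as illustrative inputs:

        import math

        def path_velocity(path_length_m, t_down_s, t_up_s):
            """Water-velocity component along the acoustic path from the
            downstream and upstream travel times:
            v_path = (L/2) * (1/t_down - 1/t_up)."""
            return (path_length_m / 2.0) * (1.0 / t_down_s - 1.0 / t_up_s)

        def line_velocity(path_length_m, t_down_s, t_up_s, path_angle_deg):
            """Streamwise velocity, resolving the path component through the
            angle between the acoustic path and the flow direction."""
            return path_velocity(path_length_m, t_down_s, t_up_s) / math.cos(
                math.radians(path_angle_deg))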

  17. Galatea - An Interactive Computer Graphics System For Movie And Video Analysis

    NASA Astrophysics Data System (ADS)

    Potel, Michael J.; MacKay, Steven A.; Sayre, Richard E.

    1983-03-01

    Extracting quantitative information from movie film and video recordings has always been a difficult process. The Galatea motion analysis system represents an application of some powerful interactive computer graphics capabilities to this problem. A minicomputer is interfaced to a stop-motion projector, a data tablet, and real-time display equipment. An analyst views a film and uses the data tablet to track a moving position of interest. Simultaneously, a moving point is displayed in an animated computer graphics image that is synchronized with the film as it runs. Using a projection CRT and a series of mirrors, this image is superimposed on the film image on a large front screen. Thus, the graphics point lies on top of the point of interest in the film and moves with it at cine rates. All previously entered points can be displayed simultaneously in this way, which is extremely useful in checking the accuracy of the entries and in avoiding omission and duplication of points. Furthermore, the moving points can be connected into moving stick figures, so that such representations can be transcribed directly from film. There are many other tools in the system for entering outlines, measuring time intervals, and the like. The system is equivalent to "dynamic tracing paper" because it is used as though it were tracing paper that can keep up with running movie film. We have applied this system to a variety of problems in cell biology, cardiology, biomechanics, and anatomy. We have also extended the system using photogrammetric techniques to support entry of three-dimensional moving points from two (or more) films taken simultaneously from different perspective views. We are also presently constructing a second, lower-cost, microcomputer-based system for motion analysis in video, using digital graphics and video mixing to achieve the graphics overlay for any composite video source image.

  18. Design of a real-time wind turbine simulator using a custom parallel architecture

    NASA Technical Reports Server (NTRS)

    Hoffman, John A.; Gluck, R.; Sridhar, S.

    1995-01-01

    The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for analysis of wind energy systems in real time. The new processor has been named: the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel. The modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is very much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CU's) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CU's are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU causes several tasks to be done in each cycle, including an IO operation and the combination of a multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CU's are interfaced to each other and to other portions of the simulation using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CU's can be added in any number to share a given computational load. This flexible bus feature is very different from many other parallel processors which usually have a throughput limit because of rigid bus architecture.

  19. PC-based system for retrospective cardiac and respiratory gating of NMR data.

    PubMed

    Bohning, D E; Carter, B; Liu, S S; Pohost, G M

    1990-11-01

    A method and a means for retrospectively clustering NMR k-space measurement profiles with respect to both cardiac and respiratory phases were developed to explore strategies for (1) reducing cardiovascular and respiratory flow/motion image artifacts and (2) improving T1 and T2 characterization of the heart. The image data are collected at a uniform rate so that echo (TE) and repetition (TR) times are independent of the varying cardiac cycle R-R interval and/or respiratory motions. Cardiac (C) time, respiratory (R) time or diaphragm position, and NMR data acquisition (A) cycle time are collected by microcomputer in parallel with free running (untriggered) image collection on a standard magnetic resonance imager. After the raw data equivalent of multiple images are collected, the C-A-R phase timing data are uploaded from the microcomputer to the scanner's minicomputer for use in a normalized C-R phase plane clustering of the image raw data. Each profile's position in the C-R phase plane is determined and then clustered into a new set of data, one image being equivalent for each desired C-R phase combination. These raw data are then zero-filled and (optionally) filtered to compensate for the nonuniform k-space sampling and, finally, reconstructed. Cardiac "cines" made from these retrospectively gated images are comparable to similarly phased triggered images. When high time resolution is required, retrospective gating can be expected to show improvements over triggering, especially toward the critical latter part of the cardiac cycle, where coronary artery filling occurs. The system described can readily be assembled from generally available components. PMID:2266849
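
    The retrospective clustering described above amounts to assigning each k-space profile a normalized cardiac (and respiratory) phase and binning on it. A minimal one-dimensional sketch follows; the names, binning scheme, and cardiac-only simplification are assumptions for illustration.

        import numpy as np

        def cardiac_phase(t_acq, r_times):
            """Fractional cardiac phase of one acquisition time t_acq, i.e. its
            position within the surrounding R-R interval (t_acq must lie
            between the first and last R-wave times in r_times)."""
            idx = np.searchsorted(r_times, t_acq) - 1
            rr = r_times[idx + 1] - r_times[idx]
            return (t_acq - r_times[idx]) / rr

        def bin_profiles(phases, n_bins):
            """Assign each profile to a phase bin; one image is later
            reconstructed per bin from the profiles it collects."""
            return np.minimum((np.asarray(phases) * n_bins).astype(int), n_bins - 1)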

  20. The surgical pathologist in a client/server computer network: work support, quality assurance, and the graphical user interface.

    PubMed

    Dictor, M

    1997-03-01

    Cympathy is a relational client/server database application designed to integrate the departmental work flow in anatomic pathology, segment information appropriately, and allow flexible interaction with standalone microcomputer programs. The database resides in a minicomputer server connected to 40 client microcomputers. Patient histories on consultation requests are scanned and maintained as bitmapped files; all information is stored on fixed disks. Client microcomputers use a graphical interface to update a patient-related status bar and retrieve any of the nearly 40 data entry tables for accessioning, ordering special stains and studies, block and slide production, reports, gross and microscopic findings, evaluation of analyses, Systematized Nomenclature in Medicine (SNOMED) coding, conference scheduling, logging of borrowed materials, queries, and specialized functions such as electron microscopy, including indexes for blocks, grids, and photonegatives. The application radically reduces the need for administrative personnel. Cympathy allows major refinements in the method of composing, distributing, and storing report information. Quality assurance protocols can be expanded to relate the frequency and quantification of microscopic findings retrieved in a structured query to the individual pathologist for a given type of specimen and diagnosis. Such queries can extend output to include the number of blocks and slides and all analyses, services, and procedures applied to each case. Debiting is automated through a flexible dependence on SNOMED codes, analyses, and number of slides generated. Any number of structured queries can be written and saved within Cympathy, addressed directly to the database server, or composed in microcomputer programs linked to the database by open database connectivity drivers. Within Cympathy, query results are output in a fixed table format, which can be expanded to include, for example, tabulations of gross and microscopic findings and analysis results. The application offers the opportunity for research and quality assurance in anatomic pathology without the need for hard copy output. PMID:9071735

  1. Rapid calculation of functional maps of glucose metabolic rate and individual model rate parameters from serial 2-FDG images

    SciTech Connect

    Koeppe, R.A.; Holden, J.E.; Hutchins, G.D.

    1985-05-01

    The authors have developed a method for the rapid pixel-by-pixel estimation of glucose metabolic rate from a dynamic sequence of PCT images acquired over 40 minutes following venous bolus injection of 2-deoxy-2-fluoro-D-glucose (2-FDG). The calculations are based on the conventional four parameter model. The dephosphorylation rate (k4) cannot be reliably estimated from only 40 minutes of data; however, neglecting dephosphorylation can nonetheless introduce significant biases into the parameter estimation processes. In the authors' method, the rate is constrained to fall within a small range about a presumed value. Computer simulation studies show that this constraint greatly reduces the systematic biases in the other three fitted parameters and in the metabolic rate that arise from the assumption of no dephosphorylation. The parameter estimation scheme used is formally identical to one originally developed for dynamic methods of cerebral blood flow estimation. Estimation of metabolic rate and the individual model rate parameters k1, k2, and k3 can be carried out for each pixel sequence of a 100 x 100 pixel image in less than two minutes on our PDP 11/60 minicomputer with floating point processor. While the maps of k2 and k3 are quite noisy, accurate estimates of average values can be attained for regions of a few cm^2. The maps of metabolic rate offer many advantages in addition to that of direct visualization. These include improved statistical precision and the avoidance of averaging failure in the fitting of heterogeneous regions.
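
    With the four rate parameters in hand, the metabolic rate follows from the conventional model's steady-state expression. The sketch below assumes the usual lumped-constant form; the plasma glucose value Cp, the lumped constant LC, and any example numbers are illustrative, not values from the study.

        def fdg_metabolic_rate(cp_glucose, lumped_constant, k1, k2, k3):
            """Glucose metabolic rate from the conventional FDG model:
            MRglc = (Cp / LC) * k1 * k3 / (k2 + k3); the dephosphorylation rate
            k4 enters only through the constrained fit of k1-k3."""
            return (cp_glucose / lumped_constant) * k1 * k3 / (k2 + k3)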

  2. Large scale contracted MC-CI calculations on acetylene and its dissociation into two CH(²Π) radicals

    NASA Astrophysics Data System (ADS)

    Siegbahn, Per E. M.

    1981-09-01

    Large scale MC-CI calculations with up to 178 000 configurations have been performed on acetylene and its dissociation into two CH(²Π) radicals with a newly developed contracted CI scheme. The geometry obtained for acetylene was R(C-C) = 1.208 Å (exptl. = 1.203 Å) and R(C-H) = 1.061 Å (1.060 Å). The dissociation energy into two CH(²Π) was De = 10.00 eV (10.26 eV). The barrier along the linear dissociation path recently predicted by Raimondi et al. was confirmed but their value of 0.33 eV was raised to 0.57 eV and the location of the barrier was moved from 6.0 to 5.0 a.u. The origin of the barrier is an avoided crossing between the states dissociating into two CH(²Π) and two CH(⁴Σ⁻) radicals. The energy difference between the states at the point of the avoided crossing was computed to be 0.19 eV from second root contracted CI calculations. The energy barrier was increased in going from MCSCF to CI and this unusual behavior is explained by the much larger amount of correlation energy in CH(²Π) than in CH(⁴Σ⁻). The minimum energy path is however found to be nonlinear and has no energy barrier. A simple molecular orbital argument is given for why this should be so. The potential surface for acetylene is further found to exhibit irregular regions with double minima for bending which were not predicted in the surface given recently by Carter et al. All the presently performed calculations were done on a minicomputer VAX-11/780.

  3. FACSIM/MRS-1: Cask receiving and consolidation model documentation and user's guide

    SciTech Connect

    Lotz, T.L.; Shay, M.R.

    1987-06-01

    The Pacific Northwest Laboratory (PNL) has developed a stochastic computer model, FACSIM/MRS, to assist in assessing the operational performance of the Monitored Retrievable Storage (MRS) waste-handling facility. This report provides the documentation and user's guide for the component FACSIM/MRS-1, which is also referred to as the front-end model. The FACSIM/MRS-1 model simulates the MRS cask-receiving and spent-fuel consolidation activities. The results of the assessment of the operational performance of these activities are contained in a second report, FACSIM/MRS-1: Cask Receiving and Consolidation Performance Assessment (Lotz and Shay 1987). The model of MRS canister storage and shipping operations is presented in FACSIM/MRS-2: Storage and Shipping Model Documentation and User's Guide (Huber et al. 1987). The FACSIM/MRS model uses the commercially available FORTRAN-based SIMAN (SIMulation ANalysis language) simulation package (Pegden 1982). SIMAN provides a set of FORTRAN-coded commands, called block operations, which are used to build detailed models of continuous or discrete events that make up the operations of any process, such as the operation of an MRS facility. The FACSIM models were designed to run on either an IBM-PC or a VAX minicomputer. The FACSIM/MRS-1 model is flexible enough to collect statistics concerning almost any aspect of the cask receiving and consolidation operations of an MRS facility. The MRS model presently collects statistics on 51 quantities of interest during the simulation. SIMAN reports the statistics with two forms of output: a SIMAN simulation summary and an optional set of SIMAN output files containing data for use by more detailed post processors and report generators.

  4. Cyclic axial-torsional deformation behavior of a cobalt-base superalloy

    SciTech Connect

    Bonacuse, P.J.; Kalluri, S.

    1992-11-01

    Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic high-temperature, deformation behavior of a wrought cobalt-based superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic, in-phase and out-of-phase, axial torsional tests. For in-phase tests three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73.
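
    The in-phase and out-of-phase loading histories described here are fully specified by the strain amplitude, the proportionality constant lambda, and the phase angle phi. A minimal sketch of the command waveforms for one cycle is given below; the amplitude and sample count are illustrative assumptions.

        import numpy as np

        def axial_torsional_waveforms(eps_amp, lam, phi_deg, n_points=1000):
            """One cycle of axial strain eps(t) and engineering shear strain
            gamma(t), with gamma amplitude lam*eps_amp lagging by phi degrees."""
            wt = np.linspace(0.0, 2.0 * np.pi, n_points)
            eps = eps_amp * np.sin(wt)
            gamma = lam * eps_amp * np.sin(wt - np.radians(phi_deg))
            return eps, gamma

        # In-phase (phi = 0) and 90-deg out-of-phase histories with lambda = 1.73
        eps_ip, gam_ip = axial_torsional_waveforms(0.005, 1.73, 0.0)
        eps_op, gam_op = axial_torsional_waveforms(0.005, 1.73, 90.0)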

  5. Computer-generated speech

    SciTech Connect

    Aimthikul, Y.

    1981-12-01

    This thesis reviews the essential aspects of speech synthesis and distinguishes between the two prevailing techniques: compressed digital speech and phonemic synthesis. It then presents the hardware details of the five speech modules evaluated. FORTRAN programs were written to facilitate message creation and retrieval with four of the modules driven by a PDP-11 minicomputer. The fifth module was driven directly by a computer terminal. The compressed digital speech modules (T.I. 990/306, T.S.I. Series 3D and N.S. Digitalker) each contain a limited vocabulary produced by the manufacturers while both the phonemic synthesizers made by Votrax permit an almost unlimited set of sounds and words. A text-to-phoneme rules program was adapted for the PDP-11 (running under the RSX-11M operating system) to drive the Votrax Speech Pac module. However, the Votrax Type'N Talk unit has its own built-in translator. Comparison of these modules revealed that the compressed digital speech modules were superior in pronouncing words on an individual basis but lacked the inflection capability that permitted the phonemic synthesizers to generate more coherent phrases. These findings were necessarily highly subjective and dependent on the specific words and phrases studied. In addition, the rapid introduction of new modules by manufacturers will necessitate new comparisons. However, the results of this research verified that all of the modules studied do possess reasonable quality of speech that is suitable for man-machine applications. Furthermore, the development tools are now in place to permit the addition of computer speech output in such applications.

  6. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without using extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  7. LOOK- A TEXT FILE DISPLAY PROGRAM

    NASA Technical Reports Server (NTRS)

    Vavrus, J. L.

    1994-01-01

    The LOOK program was developed to permit a user to examine a text file in a pseudo-random access manner. Many engineering and scientific programs generate large amounts of printed output. Often this output needs to be examined in only a few places. On mini-computers (like the DEC VAX) high-speed printers are usually at a premium. One alternative is to save the output in a text file and examine it with a text editor. The slowness of a text editor, the possibility of inadvertently changing the output, and other factors make this an unsatisfactory solution. The LOOK program provides the user with a means of rapidly examining the contents of an ASCII text file. LOOK's basis of operation is to open the text file for input only and then access it in a block-wise fashion. LOOK handles the text formatting and displays the text lines on the screen. The user can move forward or backward in the file by a given number of lines or blocks. LOOK also provides the ability to "scroll" the text at various speeds in the forward or backward directions. The user can perform a search for a string (or a combination of up to 10 strings) in a forward or backward direction. Also, user selected portions of text may be extracted and submitted to print or placed in a file. Additional features available to the LOOK user include: cancellation of an operation with a keystroke, user definable keys, switching mode of operation (e.g. 80/132 column), on-line help facility, trapping broadcast messages, and the ability to spawn a sub-process to carry out DCL functions without leaving LOOK. The LOOK program is written in FORTRAN 77 and MACRO ASSEMBLER for interactive execution and has been implemented on a DEC VAX computer using VAX/VMS with a central memory requirement of approximately 430K of 8 bit bytes. LOOK operation is terminal independent but will take advantage of the features of the DEC VT100 terminal if available. LOOK was developed in 1983.
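
    The block-wise, input-only access that LOOK is built on can be sketched in a few lines; the block size and function name below are illustrative assumptions, not the FORTRAN/MACRO implementation described in the abstract.

        def read_block(path, block_index, block_size=512):
            """Open a text file read-only and fetch one fixed-size block by
            seeking directly to it, instead of reading the whole file."""
            with open(path, 'rb') as f:
                f.seek(block_index * block_size)
                return f.read(block_size).decode('ascii', errors='replace')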

  8. Cyclic Axial-Torsional Deformation Behavior of a Cobalt-Base Superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1995-01-01

    The cyclic, high-temperature deformation behavior of a wrought cobalt-base superalloy, Haynes 188, is investigated under combined axial and torsional loads. This is accomplished through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue database has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gage section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. The fatigue behavior of Haynes 188 at 760 C under axial, torsional, and combined axial-torsional loads and the monotonic and cyclic deformation behaviors under axial and torsional loads have been previously reported. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (the ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 degrees with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase (lambda = 1.73 and phi = 0) and out-of-phase (lambda = 1.73 and phi = 90 deg) axial-torsional fatigue tests. These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for cyclic, axial stress-strain and shear stress-engineering shear strain curves.
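
    The Ramberg-Osgood type stress-strain functions referred to in the last sentence have the usual elastic-plus-plastic form. A minimal sketch follows; the constants E, K, and n are fit parameters not given in the abstract, so the values below are placeholders.

        def ramberg_osgood_strain(stress, E, K, n):
            """Ramberg-Osgood type cyclic stress-strain curve:
            strain = stress/E + (stress/K)**(1/n)."""
            return stress / E + (stress / K) ** (1.0 / n)

        # Placeholder constants (MPa units), illustrative only
        print(ramberg_osgood_strain(600.0, 200e3, 1200.0, 0.15))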

  9. Cyclic axial-torsional deformation behavior of a cobalt-base superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1992-01-01

    Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic high-temperature, deformation behavior of a wrought cobalt-based superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase and out-of-phase axial-torsional fatigue tests. These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for the cyclic axial stress-strain and shear stress-engineering shear strain curves.
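
    A small sketch of the command waveforms implied by lambda and phi as defined above: the engineering shear strain amplitude is lambda times the axial strain amplitude, and phi is the phase angle between the two waveforms (phi = 0 corresponds to an in-phase test). The amplitude and period values are illustrative assumptions, not test parameters from the program.

        import math

        def strain_commands(t, axial_amp=0.004, lam=1.73, phi_deg=90.0, period=10.0):
            """Return (axial strain, engineering shear strain) at time t (illustrative values)."""
            omega = 2.0 * math.pi / period
            axial = axial_amp * math.sin(omega * t)
            shear = lam * axial_amp * math.sin(omega * t - math.radians(phi_deg))
            return axial, shear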

  10. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L., (Edited By)

    1993-01-01

    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, survey respondents recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles.
A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a
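
    A hypothetical sketch of the "linked data sets" idea behind the urban seismic hazards data base described above, expressed as a small relational schema. The table and column names are invented for illustration and do not reproduce the 16 data sets of the USGS design or its dBASE III Plus implementation.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE site (                  -- one row per investigated location
            site_id    INTEGER PRIMARY KEY,
            latitude   REAL,
            longitude  REAL
        );
        CREATE TABLE borehole_log (          -- geological data linked to a site
            log_id     INTEGER PRIMARY KEY,
            site_id    INTEGER REFERENCES site(site_id),
            depth_m    REAL,
            unit_name  TEXT
        );
        CREATE TABLE shear_wave_velocity (   -- geophysical data linked to a site
            vel_id     INTEGER PRIMARY KEY,
            site_id    INTEGER REFERENCES site(site_id),
            depth_m    REAL,
            vs_m_per_s REAL
        );
        """)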

  11. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

    Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. He was then hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly became one of five principal investigators of the original NASA Space Science Working Group. In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope.
Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight, thereby supporting its possible composition as graphite. Code was the recipie

  12. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network; patterns can match on the number of fields in a fact as well as on their contents. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on IBM PC computers operating under DOS, the Macintosh, and DEC VAX series computers operating under VMS or ULTRIX.
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
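
    A minimal sketch of forward chaining, the inference style the abstract describes: rules fire when their condition matches an asserted fact and may assert new facts in turn. This is written in Python for illustration; it is not CLIPS syntax and is far simpler than the Rete network CLIPS actually builds.

        def run(rules, facts):
            """Iterate until no rule adds a new fact (naive forward chaining)."""
            facts = set(facts)
            changed = True
            while changed:
                changed = False
                for condition, action in rules:
                    for fact in list(facts):
                        if condition(fact):
                            new_fact = action(fact)
                            if new_fact not in facts:
                                facts.add(new_fact)
                                changed = True
            return facts

        # One toy rule: anything asserted as a duck also quacks.
        rules = [(lambda f: f[0] == "duck", lambda f: ("quacks", f[1]))]
        print(sorted(run(rules, {("duck", "donald")})))
        # [('duck', 'donald'), ('quacks', 'donald')]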

  13. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network; patterns can match on the number of fields in a fact as well as on their contents. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on IBM PC computers operating under DOS, the Macintosh, and DEC VAX series computers operating under VMS or ULTRIX.
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
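
    A sketch of the "embeddable" usage pattern the abstract mentions: the host program treats the whole expert system as a single subroutine call. This is an illustrative Python stand-in; the function name, rules, and facts are invented and are not part of the CLIPS API.

        def diagnose(symptoms):
            """Host-program entry point: match asserted symptoms against rules and return advice."""
            engine_rules = [
                (lambda f: f == "no-power",    lambda f: "check-power-cord"),
                (lambda f: f == "overheating", lambda f: "clean-fan"),
            ]
            advice = []
            for condition, action in engine_rules:
                for fact in symptoms:
                    if condition(fact):
                        advice.append(action(fact))
            return advice

        print(diagnose(["no-power"]))   # ['check-power-cord']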

  14. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network; patterns can match on the number of fields in a fact as well as on their contents. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on IBM PC computers operating under DOS, the Macintosh, and DEC VAX series computers operating under VMS or ULTRIX.
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.

  15. NASA/FLAGRO - FATIGUE CRACK GROWTH COMPUTER PROGRAM

    NASA Technical Reports Server (NTRS)

    Forman, R. G.

    1994-01-01

    Structural flaws and cracks may grow under fatigue inducing loads and, upon reaching a critical size, cause structural failure to occur. The growth of these flaws and cracks may occur at load levels well below the ultimate load bearing capability of the structure. The Fatigue Crack Growth Computer Program, NASA/FLAGRO, was developed as an aid in predicting the growth of pre-existing flaws and cracks in structural components of space systems. The earlier version of the program, FLAGRO4, was the primary analysis tool used by Rockwell International and the Shuttle subcontractors for fracture control analysis on the Space Shuttle. NASA/FLAGRO is an enhanced version of the program and incorporates state-of-the-art improvements in both fracture mechanics and computer technology. NASA/FLAGRO provides the fracture mechanics analyst with a computerized method of evaluating the "safe crack growth life" capabilities of structural components. NASA/FLAGRO could also be used to evaluate the damage tolerance aspects of a given structural design. The propagation of an existing crack is governed by the stress field in the vicinity of the crack tip. The stress intensity factor is defined in terms of the relationship between the stress field magnitude and the crack size. The propagation of the crack becomes catastrophic when the local stress intensity factor reaches the fracture toughness of the material. NASA/FLAGRO predicts crack growth using a two-dimensional model which predicts growth independently in two directions based on the calculation of stress intensity factors. The analyst can choose to use either a crack growth rate equation or a nonlinear interpolation routine based on tabular data. The growth rate equation is a modified Forman equation which can be converted to a Paris or Walker equation by substituting different values into the exponent. This equation provides accuracy and versatility and can be fit to data using standard least squares methods. Stress-intensity factor numerical values can be computed for making comparisons or checks of solutions. NASA/FLAGRO can check for failure of a part-through crack in the mode of a through crack when net ligament yielding occurs. NASA/FLAGRO has a number of special subroutines and files which provide enhanced capabilities and easy entry of data. These include crack case solutions, cyclic load spectrums, nondestructive examination initial flaw sizes, table interpolation, and material properties. The materials properties files are divided into two types, a user defined file and a fixed file. Data is entered and stored in the user defined file during program execution, while the fixed file contains already coded-in property value data for many different materials. Prompted input from CRT terminals consists of initial crack definition (which can be defined automatically), rate solution type, flaw type and geometry, material properties (if they are not in the built-in tables of material data), load spectrum data (if not included in the loads spectrum file), and design limit stress levels. NASA/FLAGRO output includes an echo of the input with any error or warning messages, the final crack size, whether or not critical crack size has been reached for the specified stress level, and a life history profile of the crack propagation. NASA/FLAGRO is modularly designed to facilitate revisions and operation on minicomputers. The program was implemented on a DEC VAX 11/780 with the VMS operating system. 
NASA/FLAGRO is written in FORTRAN 77 and has a memory requirement of 1.4 MB. The program was developed in 1986.
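
    An illustrative sketch of cycle-by-cycle crack growth integration with the classic Forman relation da/dN = C (dK)^n / ((1-R) Kc - dK), which reduces to a Paris-type law away from instability. The material constants, the through-crack stress-intensity estimate dK = dS * sqrt(pi * a), and the stopping rule below are assumptions for illustration, not NASA/FLAGRO's built-in crack cases or material files.

        import math

        C, N_EXP, KC, R = 1.0e-8, 3.0, 80.0, 0.1   # placeholder material constants
        DELTA_STRESS = 100.0                        # MPa, constant-amplitude loading

        def grow(a0, max_cycles=200000):
            """Integrate crack length a (m) until instability or the cycle limit."""
            a = a0
            for cycle in range(max_cycles):
                dk = DELTA_STRESS * math.sqrt(math.pi * a)     # MPa*sqrt(m)
                if dk >= (1.0 - R) * KC:                       # local K reaches toughness: failure
                    return cycle, a
                a += C * dk**N_EXP / ((1.0 - R) * KC - dk)     # growth increment this cycle
            return max_cycles, a

        print(grow(0.001))   # (cycles to instability or limit, final crack length in m)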

  16. Evaluation Of Digital Unsharp-Mask Filtering For The Detection Of Subtle Mammographic Microcalcifications

    NASA Astrophysics Data System (ADS)

    Chan, Heang-Ping; Vyborny, Carl J.; MacMahon, Heber; Metz, Charles E.; Doi, Kunio; Sickles, Edward A.

    1986-06-01

    We have conducted a study to assess the effects of digitization and unsharp-mask filtering on the ability of observers to detect subtle microcalcifications in mammograms. Thirty-two conventional screen-film mammograms were selected from patient files by two experienced mammographers. Twelve of the mammograms contained a suspicious cluster of microcalcifications in patients who subsequently underwent biopsy. Twenty of the mammograms were normal cases which were initially interpreted as being free of clustered microcalcifications and did not demonstrate such on careful review. The mammograms were digitized with a high-quality Fuji image processing/simulation system. The system consists of two drum scanners with which an original radiograph can be digitized, processed by a minicomputer, and reconstituted on film. In this study, we employed a sampling aperture of 0.1 mm X 0.1 mm and a sampling distance of 0.1 mm. The density range from 0.2 to 2.75 was digitized to 1024 grey levels per pixel. The digitized images were printed on a single-emulsion film with a display aperture having the same size as the sampling aperture. The system was carefully calibrated so that the density and contrast of a digitized image were closely matched to those of the original radiograph. Initially, we evaluated the effects of the weighting factor and the mask size of an unsharp-mask filter on the appearance of mammograms for various types of breasts. Subjective visual comparisons suggested that a mask size of 91 X 91 pixels (9.1 mm X 9.1 mm) enhances the visibility of microcalcifications without excessively increasing the high-frequency noise. Further, a density-dependent weighting factor that increases linearly from 1.5 to 3.0 in the density range of 0.2 to 2.5 enhances the contrast of microcalcifications without introducing many potentially confusing artifacts in the low-density areas. An unsharp-mask filter with these parameters was used to process the digitized mammograms. We conducted observer performance experiments to evaluate the detectability of microcalcifications in three sets of mammograms: the original film images, unprocessed digitized images, and unsharp-masked images. Each set included the same 20 normal cases and 12 abnormal cases. A total of 5 board-certified radiologists and 4 senior radiology residents participated as observers. In the first experiment, the detectability of microcalcifications was measured for the original, unprocessed digitized, and unsharp-masked images. Each observer read all 96 films in one session with the cases arranged in a different random order. A maximum of 15 seconds was allowed to read each image. To facilitate receiver operating characteristic (ROC) analysis, each observer ranked his/her observation regarding the presence or absence of a cluster of 3 or more microcalcifications on a 5-point confidence rating scale (1=definitely no microcalcifications; 2=probably no microcalcifications; 3=microcalcifications possibly present; 4=microcalcifications probably present; 5=microcalcifications definitely present). The observer identified the location of the suspected microcalcifications when the confidence rating was 2 or greater. In the second experiment, we evaluated whether reading the unsharp-masked image and the unprocessed digitized image side by side for each case would reduce false-positive detection rates for microcalcifications and thus improve overall performance.
The observer was again allowed a maximum of 15 seconds to read each pair of images and was instructed to use the unsharp-masked image for primary reading and the unprocessed digitized image for reference. The experimental setting and procedures were otherwise the same as those for the first experiment.
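
    A sketch of the unsharp-mask operation with the parameters reported above: the "mask" is a 91 x 91 local mean, and the weighting factor grows linearly with optical density from 1.5 (at D = 0.2) to 3.0 (at D = 2.5). The exact combination rule (original plus weighted difference) and the array-based handling are assumptions for illustration; the study's Fuji system operated on film, not on numpy arrays.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def unsharp_mask(density_image, mask_size=91):
            """Enhance a digitized mammogram given as a float array of optical densities."""
            blurred = uniform_filter(density_image, size=mask_size)          # 91 x 91 local mean
            weight = np.interp(density_image, [0.2, 2.5], [1.5, 3.0])        # density-dependent factor
            return density_image + weight * (density_image - blurred)        # assumed combination rule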

  17. The ASC Sequoia Programming Model

    SciTech Connect

    Seager, M

    2008-08-06

    In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operations to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally over time. The other attributes of the Cray vector era at LLNL were that we developed, supported, and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it), and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, Cray later developed the COS and UNICOS operating systems and environments on its own. In the late 1970s and early 1980s, two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and the departmental computers and minicomputers built around them, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being developed with multiple 'cores' in them, called Symmetric Multi-Processor or Shared Memory Processor (SMP) systems. The parallel revolution had begun. The Laboratory started a small 'parallel processing project' in 1983 to study the new technology and its application to scientific computing with four people: Tim Axelrod, Pete Eltgroth, Paul Dubois, and Mark Seager. Two years later, Eugene Brooks joined the team. This team focused on Unix and 'killer micro' SMPs. Indeed, Eugene Brooks was credited with coining the 'Killer Micro' term. After several generations of SMP platforms (e.g., the Sequent Balance 8000 with 8 33MHz MC32032s, the Alliant FX/8 with 8 MC68020s and FPGA-based vector units, and finally the BBN Butterfly with 128 cores), it became apparent to us that the killer micro revolution would indeed overtake the Crays and that we definitely needed a new programming and systems model. The model developed by Mark Seager and Dale Nielsen focused on both the system aspects (Slide 3) and the code development aspects (Slide 4). Although now succinctly captured in two attached slides, at the time there was tremendous ferment in the research community as to what parallel programming model would emerge, dominate, and survive. In addition, we wanted a model that would provide portability between platforms of a single generation but also longevity over multiple--and hopefully many--generations. Only after we developed the 'Livermore Model' and worked it out in considerable detail did it become obvious that what we came up with was the right approach.
In a nutshell, the applications programming model of the Livermore Model posited that SMP parallelism would ultimately not scale indefinitely and one would have to bite the bullet and implement MPI parallelism within the Integrated Design Code (IDC). We also had a major emphasis on doing everything in a completely standards-based, portable methodology with POSIX/Unix as the target environment. We decided against specialized libraries like STACKLIB for performance, but kept as many general-purpose, portable math libraries as were needed by the co
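
    A hedged sketch of the core of the model described above: explicit message-passing parallelism inside the application code, written only against the portable MPI standard. It uses mpi4py rather than the Labs' Fortran/C codes, and the domain decomposition and reduction shown are illustrative, not taken from the IDC.

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each MPI rank owns a slice of the problem domain (illustrative decomposition).
        local_cells = list(range(rank * 1000, (rank + 1) * 1000))
        local_sum = float(sum(local_cells))

        # A global reduction is the kind of standards-based primitive the model relies on.
        global_sum = comm.allreduce(local_sum, op=MPI.SUM)
        if rank == 0:
            print("cells:", size * 1000, "global sum:", global_sum)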