Science.gov

Sample records for minicomputers

  1. Introduction to Minicomputers in Federal Libraries.

    ERIC Educational Resources Information Center

    Young, Micki Jo; And Others

    This book for library administrators and Federal library staff covers the application of minicomputers in Federal libraries and offers a review of minicomputer technology. A brief overview of automation explains computer technology, hardware, and software. The role of computers in libraries is examined in terms of the history of computers and…

  2. Towards Everyday Language Information Retrieval Systems via Minicomputers.

    ERIC Educational Resources Information Center

    Bell, Colin; Jones, Kevin P.

    1979-01-01

    Surveys the current state of minicomputer-operated information systems capable of incorporating linguistic features, focusing on the Minicomputer Operated Retrieval (Partially Heuristic) System. Consideration is given to the automation of the indexing process and to available search strategies. (FM)

  3. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases, interchangeability of minicomputers, software, storage and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  4. Minicomputer Capabilities Related to Meteorological Aspects of Emergency Response

    SciTech Connect

Ramsdell, J. V.; Athey, G. F.; Ballinger, M. Y.

    1982-02-01

The purpose of this report is to provide the NRC staff involved in reviewing licensee emergency response plans with background information on the capabilities of minicomputer systems that are related to the collection and dissemination of meteorological information. The treatment of meteorological information by organizations with existing emergency response capabilities is described, and the capabilities, reliability and availability of minicomputers and minicomputer systems are discussed.

  5. Networking: A Solution to the Campus Minicomputer Problem.

    ERIC Educational Resources Information Center

    Fritz, Joseph

    Minicomputer networking can be an alternative solution to the problem of implementing various computer systems in universities. In its simplest case, networking takes the form of multiple small computers communicating over telephone lines to a larger host minicomputer which in turn communicates with the central mainframe. Using computers in this…

  6. Minicomputer Games. Teacher's Guide. Classroom Lessons and Games Centered around the Papy Minicomputer...A Source of Rich Situations That Call for Mental Arithmetic and Quick Strategic Thinking.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Louis, MO.

This material describes two games, Minicomputer Tug-of-War and Minicomputer Golf. The Papy Minicomputer derives its name from Georges Papy, who invented and introduced it in the 1950's. The Minicomputer is seen as an abacus with the flavor of a computer in its schematic representation of numbers. Its manner of representation combines decimal…

  7. On the role of minicomputers in structural design

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1977-01-01

    Results are presented of exploratory studies on the use of a minicomputer in conjunction with large-scale computers to perform structural design tasks, including data and program management, use of interactive graphics, and computations for structural analysis and design. An assessment is made of minicomputer use for the structural model definition and checking and for interpreting results. Included are results of computational experiments demonstrating the advantages of using both a minicomputer and a large computer to solve a large aircraft structural design problem.

  8. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum of 4800 bits/sec transfer rate to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capability, CAD/CAM provides options to produce an automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  9. A small inexpensive minicomputer system for speech research

    NASA Technical Reports Server (NTRS)

    Morris, C. F.

    1975-01-01

    A small but very effective minicomputer-based speech processing system costing just over 30,000 dollars is described here. The hardware and software comprising the system are discussed as well as immediate and future research applications.

  10. Why Use a Minicomputer? Some Factors Affecting Their Selection.

    ERIC Educational Resources Information Center

    Wainwright, Jane

    A study of computer facilities in British libraries highlighted the respective benefits and disadvantages of using the parent institution's central computer or using a dedicated minicomputer. The large computer's technical advantages include greater opportunities for sharing or buying operational software, and the availability of experienced…

  11. CAISYS-8- A CAI Language Developed For A Minicomputer.

    ERIC Educational Resources Information Center

    Holm, Cheryl; And Others

    The University of Texas Medical Branch developed a minicomputer-based computer-assisted instruction (CAI) system which employed a teacher oriented software package called CAISYS-8, consisting of a highly modularized teaching compiler and operating system. CAISYS-8 used instructional quanta which generalized the flow of information to and from the…

  12. The Minicomputer as Support for Academic Computing Requirements. A Preliminary Look at Some Results of an NSF-Supported-Experiment to Evaluate Minicomputer Use.

    ERIC Educational Resources Information Center

    Swoyer, Vincent H.

    The National Science Foundation financed a project enabling 10 small, nonresearch colleges to purchase minicomputers for academic and administrative use. The purpose of the study was to determine whether an owned minicomputer is a better investment for such small colleges than a time-sharing system or a cooperative arrangement. The colleges…

  13. Recent Trends in Minicomputer-Based Integrated Learning Systems for Reading and Language Arts Instruction.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    This paper discusses minicomputer-based ILSs (integrated learning systems), i.e., computer-based systems of hardware and software. An example of a minicomputer-based system in a school district (a composite of several actual districts) considers hardware, staffing, scheduling, reactions, problems, and training for a subskill-oriented reading…

  14. Instruction in Renal Physiology on a Minicomputer-Based Educational System.

    ERIC Educational Resources Information Center

    Wells, C. H.; And Others

    A prototypical minicomputer-based educational system was designed at the University of Texas Medical Branch to determine if it is possible to evolve complex educational programs which are effective and also flexible and of low cost. Freshman medical students using the minicomputer program substantially improved their problem-solving abilities in…

  15. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    SciTech Connect

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  16. Interface board for providing time signals to a super minicomputer

    NASA Astrophysics Data System (ADS)

    Reid, Robert J.

    1991-10-01

    This invention relates generally to signal interface circuit boards and more particularly to an interface board providing timing signals from a timing source to a super minicomputer (Q Bus compatible). In developing acoustic signatures for underwater vehicles, it is necessary to have accurate range and bearing data to the vehicle and have that data time correlated with the acoustic signals. Present equipment used for these purposes includes the Atlantic Undersea Test and Evaluation Center (AUTEC) Tracking System for providing three-dimensional positional information and the Acoustic Measurement Array for receiving the acoustic signals that are generated by the underwater vehicle. The tracking system determines the position of the underwater vehicle by the use of a pinger attached to the vehicle and bottom mounted hydrophones.

  17. Distributing structural optimization software between a mainframe and a minicomputer

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Dovi, A. R.; Riley, K. M.

    1981-01-01

This paper describes a distributed software system for solving large-scale structural optimization problems. Distributing the software between a mainframe computer and a minicomputer takes advantage of some of the best features available on each computer. The described software system consists of a finite element structural analysis computer program, a general purpose optimizer program, and several small user-supplied problem dependent programs. Comparison with a similar system executing entirely on the mainframe computer reveals that the distributed system costs less, uses computer resources more efficiently and improves production through faster turnaround and improved user control. The system interfaces with interactive graphics software for generating models and displaying the intermediate and final results.

  18. Microprocessor assisted real-time harmonic analysis by minicomputer.

    PubMed

    Matheson, T G; Higgins, R J

    1978-12-01

A real-time signal averaging and harmonic analysis technique for low audio frequency waveforms is described that, although based on earlier software emulations of lock-in detection instruments, eliminates problems inherent in the earlier systems. Parallel processing is employed between a data acquisition minicomputer (HP-2116B) and a microcomputer (Z-80) to (1) replace an earlier chopper-type lock-in algorithm with a coherent Fourier transform, (2) digitally produce a pure (0.01% THD) modulation sine wave, (3) simplify system tune-up, and (4) produce a high-quality, flicker-free, real-time display of the averaged waveform and its harmonic content. PMID:18699035
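
The coherent-averaging-plus-Fourier approach described in the abstract can be sketched in a few lines of modern NumPy (all parameters below are invented for illustration; the original ran split between an HP-2116B and a Z-80): the record is folded into one fundamental period to suppress incoherent noise, and the harmonic content is read off a discrete Fourier transform of the averaged period.

```python
import numpy as np

# Sketch of coherent averaging followed by harmonic analysis.
# All numbers are hypothetical illustration values.
fs = 8000            # sample rate, Hz
f0 = 100             # fundamental frequency, Hz
n = fs // f0         # samples per fundamental period (80)
periods = 200        # number of periods to average

t = np.arange(n * periods) / fs
rng = np.random.default_rng(0)
# Test waveform: fundamental + 5% third harmonic + additive noise
x = (np.sin(2 * np.pi * f0 * t)
     + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)
     + 0.3 * rng.standard_normal(t.size))

# Coherent (synchronous) averaging: fold the record into one period,
# which attenuates noise by roughly sqrt(periods)
avg = x.reshape(periods, n).mean(axis=0)

# Harmonic amplitudes via the DFT of the averaged period;
# with a one-period record, bin k corresponds to k * f0
spectrum = np.abs(np.fft.rfft(avg)) * 2 / n
fundamental = spectrum[1]
third = spectrum[3]
print(f"fundamental {fundamental:.3f}, 3rd harmonic {third:.3f}")
```

The same folding trick is what makes the technique cheap enough for a small machine: the transform is taken over one period of the averaged waveform, not the whole record.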

  19. Minicomputer linear programming analysis yields options for gasoline-blending decisions

    SciTech Connect

    Arnold, V.E.

    1984-02-13

    Neither a large mainframe computer nor extensive mathematics background is now necessary to take advantage of linear programs in evaluating gasoline blending options. A minicomputer can handle the task. This article presents a general algorithm for performing linear programming (LP) analysis by the simplex method on a Radio Shack TRS-80 Model I or III (Level Basic) minicomputer with 16K of random access memory (RAM). Application of this general algorithm to gasoline blending studies is presented in this article by an outline of steps necessary for data input and evaluation of several cases to decide between various investment options.
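
The blending LP itself is small enough to state directly. The sketch below solves a hypothetical three-component blend, with `scipy.optimize.linprog` standing in for the article's hand-coded BASIC simplex routine; all costs, octane numbers, and specifications are invented.

```python
from scipy.optimize import linprog

# Toy gasoline-blending LP: minimize blend cost subject to octane and
# volume constraints. All figures are invented for illustration.
cost = [0.90, 1.10, 1.40]      # $/gal: straight-run, cat-cracked, alkylate
octane = [82.0, 92.0, 97.0]    # blending octane numbers

# Variables x_i = gallons of each component in a 1000-gal blend.
A_eq = [[1, 1, 1]]             # total volume constraint
b_eq = [1000]
# Blend octane >= 89:  -octane . x <= -89 * 1000
A_ub = [[-o for o in octane]]
b_ub = [-89.0 * 1000]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3, method="highs")
print(res.x, res.fun)   # optimal gallons per component and blend cost
```

For this data the optimum blends 300 gal of straight-run with 700 gal of cat-cracked stock, never touching the expensive alkylate; the article's contribution was showing that the same simplex arithmetic fits in 16K of TRS-80 RAM.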

  20. Distributed Network and Multiprocessing Minicomputer State-of-the-Art Capabilities.

    ERIC Educational Resources Information Center

    Theis, Douglas J.

    An examination of the capabilities of minicomputers and midicomputers now on the market reveals two basic items which users should evaluate when selecting computers for their own applications: distributed networking systems and multiprocessing architectures. Variables which should be considered in evaluating a distributed networking system…

  1. Mathematical Analysis of Piaget's Grouping Concept. Papy's Minicomputer as a Grouping

    ERIC Educational Resources Information Center

    Steiner, H. G.

    1974-01-01

    Through a mathematical analysis, Piaget's grouping concept can be formally interpreted as being a hybrid between the mathematical concepts of a group and a lattice. Some relevant pedagogical models are presented. Activities with Cuisenaire Rods, Dienes Blocks, and Papy's Minicomputer are shown to take place in groupings. (LS)

  2. Interface for 15VSM-5 and Elektronika D3-28 minicomputers with digital measuring instruments

    SciTech Connect

    Udovichenko, N.A.; Polikarpov, Yu.I.; Makushkin, B.V.

    1987-07-01

    A device is described for data input (up to 8 decimal digits in 8421 code) into 15VSM-5 and Elektronika D3-28 minicomputers from four measuring instruments: a V7-21 voltmeter and three Ch3-54 frequency counters. Data from the voltmeter are entered by software interrogation and data from the frequency counters are entered by software interrupts. The device is implemented by TTL integrated circuits.

  3. Automated weighing procedure for toxicological studies on small animals, using a minicomputer.

    PubMed

    Lewi, P J; Marsboom, R

    1975-08-01

A compact system was designed for weighing procedures in toxicological studies on small animals that integrated 4 basic functions: data acquisition, record keeping, statistical analysis, and report preparation. An electric balance, a minicomputer, and a typewriter were incorporated into the system. Elimination of clerical work and accelerated flow of information between planning, operation, and evaluation of experiments were found to be the main advantages. PMID:1152423

  4. A brief description of the Medical Information Computer System (MEDICS). [real time minicomputer system

    NASA Technical Reports Server (NTRS)

    Moseley, E. C.

    1974-01-01

    The Medical Information Computer System (MEDICS) is a time shared, disk oriented minicomputer system capable of meeting storage and retrieval needs for the space- or non-space-related applications of at least 16 simultaneous users. At the various commercially available low cost terminals, the simple command and control mechanism and the generalized communication activity of the system permit multiple form inputs, real-time updating, and instantaneous retrieval capability with a full range of options.

  5. CDC/1000: a Control Data Corporation remote batch terminal emulator for Hewlett-Packard minicomputers

    SciTech Connect

    Berg, D.E.

    1981-02-01

    The Control Data Corporation Type 200 User Terminal utilizes a unique communications protocol to provide users with batch mode remote terminal access to Control Data computers. CDC/1000 is a software subsystem that implements this protocol on Hewlett-Packard minicomputers running the Real Time Executive III, IV, or IVB operating systems. This report provides brief descriptions of the various software modules comprising CDC/1000, and contains detailed instructions for integrating CDC/1000 into the Hewlett Packard operating system and for operating UTERM, the user interface program for CDC/1000. 6 figures.

  6. Prickett and Lonnquist aquifer simulation program for the Apple II minicomputer

    SciTech Connect

    Hull, L.C.

    1983-02-01

The Prickett and Lonnquist two-dimensional groundwater model has been programmed for the Apple II minicomputer. Both leaky and nonleaky confined aquifers can be simulated. The model was adapted from the FORTRAN version of Prickett and Lonnquist. In the configuration presented here, the program requires 64 K bits of memory. Because of the large number of arrays used in the program, and memory limitations of the Apple II, the maximum grid size that can be used is 20 rows by 20 columns. Input to the program is interactive, with prompting by the computer. Output consists of predicted head values at the row-column intersections (nodes).
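
The finite-difference iteration at the heart of such a model fits in a few lines. The following toy steady-state sketch uses the same 20-by-20 grid, but with plain Jacobi relaxation rather than Prickett and Lonnquist's iterative alternating-direction scheme, and with invented transmissivity and pumping values.

```python
import numpy as np

# Toy steady-state confined-aquifer head solver on a 20 x 20 grid.
# Jacobi relaxation of T * del^2(h) = -q with fixed-head (h = 0) boundaries.
# Transmissivity and pumping rate are invented illustration values;
# grid spacing is taken as 1 m.
nrow = ncol = 20
h = np.zeros((nrow, ncol))      # head (m); boundary cells stay at 0
q = np.zeros((nrow, ncol))      # source/sink term (m^3/day per cell)
T = 100.0                       # transmissivity, m^2/day
q[10, 10] = -500.0              # pumping well (negative = withdrawal)

for _ in range(5000):
    h_new = h.copy()
    # Five-point stencil: h_ij = mean(neighbors) + q * dx^2 / (4 T)
    h_new[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                h[1:-1, :-2] + h[1:-1, 2:] +
                                q[1:-1, 1:-1] / T)
    if np.max(np.abs(h_new - h)) < 1e-8:
        h = h_new
        break
    h = h_new

print(f"drawdown at the well: {-h[10, 10]:.2f} m")
```

Jacobi is the simplest possible scheme and converges slowly; the alternating-direction method in the original program reaches the same solution in far fewer sweeps, which mattered on an Apple II.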

  7. Potential of minicomputer/array-processor system for nonlinear finite-element analysis

    NASA Technical Reports Server (NTRS)

    Strohkorb, G. A.; Noor, A. K.

    1983-01-01

    The potential of using a minicomputer/array-processor system for the efficient solution of large-scale, nonlinear, finite-element problems is studied. A Prime 750 is used as the host computer, and a software simulator residing on the Prime is employed to assess the performance of the Floating Point Systems AP-120B array processor. Major hardware characteristics of the system such as virtual memory and parallel and pipeline processing are reviewed, and the interplay between various hardware components is examined. Effective use of the minicomputer/array-processor system for nonlinear analysis requires the following: (1) proper selection of the computational procedure and the capability to vectorize the numerical algorithms; (2) reduction of input-output operations; and (3) overlapping host and array-processor operations. A detailed discussion is given of techniques to accomplish each of these tasks. Two benchmark problems with 1715 and 3230 degrees of freedom, respectively, are selected to measure the anticipated gain in speed obtained by using the proposed algorithms on the array processor.

  8. Ruggedized minicomputer hardware and software topics, 1981: Proceedings of the 4th ROLM MIL-SPEC Computer User's Group Conference

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Presentations of a conference on the use of ruggedized minicomputers are summarized. The following topics are discussed: (1) the role of minicomputers in the development and/or certification of commercial or military airplanes in both the United States and Europe; (2) generalized software error detection techniques; (3) real time software development tools; (4) a redundancy management research tool for aircraft navigation/flight control sensors; (5) extended memory management techniques using a high order language; and (6) some comments on establishing a system maintenance scheme. Copies of presentation slides are also included.

  9. Programmable Calculators and Minicomputers in Agriculture. A Symposium Exploring Computerized Decision-Making Aids and Their Extension to the Farm Level. Proceedings of a Symposium (Hot Springs, Arkansas, February 6-7, 1980)

    ERIC Educational Resources Information Center

    Bentley, Ernest, Ed.

    Ten papers presented at a symposium discuss the array of computerized decision-making aids currently available to farmers and ways to speed up the rate of adoption of computers by agriculturalists. Topics presented include the development of software for agricultural decision-making; the role of programmable calculators and minicomputers in…

  10. Educational Time-Sharing on a Minicomputer.

    ERIC Educational Resources Information Center

    Tidball, Charles S.; Bon, Bruce B.

    A multi-language, time-sharing system (MTS-12) has been developed for the PDP-12, a Digital Equipment Corporation laboratory computer. This low-cost, core-resident system features program storage on LINC tape (3/4" magnetic tape on a 4" reel), access to the high-level interpreted FOCAL language, and a special variable storage in the user buffer…

  11. Laboratory-Equivalent Minicomputer Experiments: A Kinetic Application

    ERIC Educational Resources Information Center

    Cabrol, D.; And Others

    1975-01-01

    Describes programs that have been developed to allow kinetic experiments to be simulated on a small computer. Reports the principles that have guided the conception of the programs and describes an instance of their application to a complex reaction. (Author/GS)

  12. Gamma-Ray Spectrum Analysis Method for Minicomputers.

    1984-01-24

    Version 00 SAMPO80 is a rapid and accurate analysis program for gamma-ray spectra measured with Ge(Li) or HPGe detectors. SAMPO80 consists of three separate parts, the shape calibration part SAMPOSHAPE, the peak search and fitting part SAMPOFIT, and the nuclide identification part SAMPOID.

  13. LISP machines come out of the lab (minicomputers)

    SciTech Connect

    Creeger, M.

    1983-11-01

    Artificial intelligence is becoming increasingly attractive to commercial users thanks to computer architectures designed to support the LISP language. As an example of the novel features of the new architectures, LISP Machine Inc.'s lambda machine is described.

  14. A General-Purpose Monte Carlo Gamma-Ray Transport Code System for Minicomputers.

    1981-08-27

    Version 00 The OGRE code system was designed to calculate, by Monte Carlo methods, any quantity related to gamma-ray transport. The system is represented by two codes which treat slab geometry. OGRE-P1 computes the dose on one side of a slab for a source on the other side, and HOTONE computes energy deposition in addition. The source may be monodirectional, isotropic, or cosine distributed.
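
The core of such a Monte Carlo calculation is sampling photon free paths from an exponential distribution. The toy sketch below estimates only the uncollided transmission through a slab, something OGRE's full treatment of scattering and energy deposition generalizes; the attenuation coefficient and slab thickness are invented.

```python
import math
import random

# Toy Monte Carlo: fraction of monodirectional photons crossing a slab
# without a collision. Parameter values are invented for illustration.
random.seed(42)
mu = 0.5          # total attenuation coefficient, 1/cm (hypothetical)
thickness = 4.0   # slab thickness, cm (hypothetical)
n = 100_000       # photon histories

# A photon escapes uncollided if its sampled free path -ln(xi)/mu
# exceeds the slab thickness.
escaped = sum(1 for _ in range(n)
              if -math.log(random.random()) / mu > thickness)
estimate = escaped / n
exact = math.exp(-mu * thickness)   # analytic uncollided transmission
print(f"Monte Carlo {estimate:.4f} vs exact {exact:.4f}")
```

For the uncollided component the analytic answer exp(-mu * x) is available, which makes this a standard sanity check before turning on scattering physics.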

  15. The chemical abundances of the Cassiopeia A fast-moving knots - Explosive nucleosynthesis on a minicomputer

    NASA Technical Reports Server (NTRS)

    Johnston, M. D.; Joss, P. C.

    1980-01-01

    A simplified nuclear reaction network for explosive nucleosynthesis calculations is described in which only the most abundant nuclear species and the most important reactions linking these species are considered. This scheme permits the exploration of many cases without excessive computational effort. Good agreement with previous calculations employing more complex reaction networks is obtained. This scheme is applied to the observed chemical abundances of the fast-moving knots in the supernova remnant Cassiopeia A and it is found that a wide range of initial conditions could yield the observed abundances. The abundances of four of the knots with significant and different amounts of elements heavier than oxygen are consistent with an origin in material of the same initial composition but processed at different peak temperatures and densities. Despite the observed high oxygen abundances and low abundances of light elements in the knots, they did not necessarily undergo incomplete oxygen burning; in fact, it is not even necessary that oxygen have been present in the initial composition. The agreement between the calculated and observed chemical abundances in Cas A and similar supernova remnants depends primarily upon the relevant nuclear physics and does not provide strong evidence in favor of any particular model of the supernova event.

  16. Linking of the mini-computer Electronik-100I and NR-9821A

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.; Khromov, V. N.

    1979-01-01

    The means of transmitting digital information from the computer E-100I to the desk top calculator NR-9821A with the help of an intermediate carrier of information (perforated tape) is described. The means of removal of information from the computer E-100I in a form which is understandable for the NR-9821A are given. Instructions for the use and programming of the transcription of information onto magnetic tape from the perforated tape and from the keyboard of the calculator are included.

  17. "A Fast Running Program For Minicomputer Based On Exact Derivative Of Optimization Criterions"

    NASA Astrophysics Data System (ADS)

    Hugues, Edgar; Babolat, Claude; Bacchus, J. M.

    1983-10-01

The very fast evolution of the hardware and the software brings the optical designer to choose between two attitudes: (1) to use the services of a specialized company which is continuously developing optical programs; or (2) to write his own programs and improve them according to the needs. Theory and experience have to support each other to achieve a harmonious balance, so that product improvements come through program improvements. CERCO has chosen the second alternative.

  18. The Impact of Minicomputers and Microcomputers on the Software-Oriented Curriculum.

    ERIC Educational Resources Information Center

    Roehrkasse, Robert C.

    Discussed are the requirements of a software-oriented engineering curriculum that also includes use of computer hardware. Three areas are identified as necessary in such a curriculum: functional area users, systems programming, and mini-micro technology. Each of these areas is discussed in terms of instructional methods and suggested topics.…

  19. Mini-Computers and the Building Trades: A Guide for Teachers of Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    Asplen, Donald; And Others

    These training materials are designed to help vocational education teachers introduce students to the utilization and installation of mini- and microcomputers in residential and small business buildings. It consists of two chapters. Chapter 1 contains general materials, designed to promote awareness, and chapter 2 contains materials which are…

  20. Retrieving Records from a Gigabyte of Text on a Minicomputer Using Statistical Ranking.

    ERIC Educational Resources Information Center

    Harman, Donna; Candela, Gerald

    1990-01-01

    Describes the advantages of a prototype retrieval system that uses statistically based ranked retrieval of records rather than traditional boolean methods, especially for end users. Several new techniques are also discussed including bit mapping, pruning, methods of building inverted files, and types of search engine. (26 references) (EAM)
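
Ranked retrieval over an inverted file can be illustrated compactly. The sketch below uses toy documents and plain tf-idf weighting, which may well differ from the prototype's actual statistical weighting scheme; the point is that scores are accumulated only over posting lists, never over the full gigabyte of text.

```python
import math
from collections import Counter, defaultdict

# Toy ranked retrieval over an inverted file (hypothetical documents).
docs = {
    1: "minicomputer text retrieval with statistical ranking",
    2: "boolean retrieval on a mainframe",
    3: "ranking documents by term statistics on a minicomputer",
}

# Build the inverted file: term -> {doc_id: term frequency}
index = defaultdict(dict)
for doc_id, text in docs.items():
    for term, tf in Counter(text.split()).items():
        index[term][doc_id] = tf

def search(query):
    """Accumulate tf-idf scores only for documents in the posting lists."""
    n = len(docs)
    scores = defaultdict(float)
    for term in query.split():
        postings = index.get(term, {})
        if not postings:
            continue
        idf = math.log(n / len(postings))   # rarer terms weigh more
        for doc_id, tf in postings.items():
            scores[doc_id] += tf * idf
    # Best-scoring documents first
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(search("statistical ranking"))
```

Unlike a boolean system, a document matching only part of the query still appears in the ranking, just with a lower score, which is the end-user advantage the paper emphasizes.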

  1. Computerized Clown Serves as Teaching "Toy"

    ERIC Educational Resources Information Center

    Creative Computing, 1978

    1978-01-01

    One example is given of the success and feasibility of specially designed teaching toys that are really mini-computer terminals connected by telephone lines to a minicomputer which transmits individualized, daily programmed instruction to children. (MN)

  2. Dead reckoner navigation project

    NASA Technical Reports Server (NTRS)

    Ellis, R.; Sweet, L.

    1981-01-01

    A previous dead reckoner involved a classical gyrocompass, a Hewlett-Packard minicomputer, and a true airspeed sensor. In an effort to bring the cost of this system more in line with the realities of general aviation, recent work was done on replacing the minicomputer with a microcomputer and implementing a fluidic rate sensor in the compass system in place of the directional gyro.

  3. Maximizing the Mini for Administrative Computing.

    ERIC Educational Resources Information Center

    Clarkson, Marcia Shonnard

    A medium sized time-sharing minicomputer performs both administrative and academic computing for the University of the South. Some advantages of the system include the opportunity for student access to the terminals, and the creation of new student jobs involving computer activities. Also, the minicomputer system hardware is less expensive to run…

  4. A Successful Transition from Mini- to Microcomputer-Assisted Instruction: The Norfolk Experience.

    ERIC Educational Resources Information Center

    Gull, Randall L.

    1980-01-01

    Reviews reasons for the decision to change from a timeshare minicomputer to microcomputers, financial considerations involved, the purchase of hardware, the problem posed by the lack of compatible software for the microcomputers, and the development of the Assisted Instructional Development System (AIDS) for adapting minicomputer software and…

  5. Computer program and user documentation medical data tape retrieval system

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    This volume provides several levels of documentation for the program module of the NASA medical directorate mini-computer storage and retrieval system. A biomedical information system overview describes some of the reasons for the development of the mini-computer storage and retrieval system. It briefly outlines all of the program modules which constitute the system.

  6. Transcription of the Workshop on General Aviation Advanced Avionics Systems

    NASA Technical Reports Server (NTRS)

    Tashker, M. (Editor)

    1975-01-01

    Papers are presented dealing with the design of reliable, low cost, advanced avionics systems applicable to general aviation in the 1980's and beyond. Sensors, displays, integrated circuits, microprocessors, and minicomputers are among the topics discussed.

  7. Electron Optics Cannot Be Taught through Computation?

    ERIC Educational Resources Information Center

    van der Merwe, J. P.

    1980-01-01

    Describes how certain concepts basic to electron optics may be introduced to undergraduate physics students by calculating trajectories of charged particles through electrostatic fields which can be evaluated on minicomputers with a minimum of programing effort. (Author/SA)

  8. Design and performance of a large vocabulary discrete word recognition system. Volume 2: Appendixes. [flow charts and users manual

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The users manual for the word recognition computer program contains flow charts of the logical diagram, the memory map for templates, the speech analyzer card arrangement, minicomputer input/output routines, and assembly language program listings.

  9. Cooperative processing data bases

    NASA Technical Reports Server (NTRS)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  10. Turnkey CAD/CAM selection and evaluation

    NASA Technical Reports Server (NTRS)

    Moody, T.

    1980-01-01

    The methodology to be followed in evaluating and selecting a computer system for manufacturing applications is discussed. Main frames and minicomputers are considered. Benchmark evaluations, demonstrations, and contract negotiations are discussed.

  11. Vault Safety and Inventory System users manual, PRIME 2350. Revision 1

    SciTech Connect

    Downey, N.J.

    1994-12-14

This revision is issued to request review of the attached document: VSIS User Manual, PRIME 2350, which provides user information for the operation of the VSIS (Vault Safety and Inventory System). It describes operational aspects of the Prime 2350 minicomputer and vault data acquisition equipment. It also describes the User's Main Menu and menu functions, including REPORTS. Also, system procedures for the Prime 2350 minicomputer are covered.

  12. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology which transformed the central computer system and led, finally, to the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro- and minicomputers.

  13. User microprogrammable processors for high data rate telemetry preprocessing

    NASA Technical Reports Server (NTRS)

    Pugsley, J. H.; Ogrady, E. P.

    1973-01-01

    The use of microprogrammable processors for the preprocessing of high data rate satellite telemetry is investigated. The following topics are discussed along with supporting studies: (1) evaluation of commercial microprogrammable minicomputers for telemetry preprocessing tasks; (2) microinstruction sets for telemetry preprocessing; and (3) the use of multiple minicomputers to achieve high data processing rates. The simulation of small microprogrammed processors is discussed, along with examples of microprogrammed processors.

  14. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user-definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is being installed on a third. The MINIS is operational on four different data bases.

  15. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  16. Design of a microprocessor-based Control, Interface and Monitoring (CIM) unit for turbine engine controls research

    NASA Technical Reports Server (NTRS)

    Delaat, J. C.; Soeder, J. F.

    1983-01-01

    High speed minicomputers were used in the past to implement advanced digital control algorithms for turbine engines. These minicomputers are typically large and expensive. It is desirable for a number of reasons to use microprocessor-based systems for future controls research. They are relatively compact, inexpensive, and are representative of the hardware that would be used for actual engine-mounted controls. The Control, Interface, and Monitoring (CIM) Unit contains a microprocessor-based controls computer, the necessary interface hardware, and a system to monitor the controls computer while it is running an engine. It is presently being used to evaluate an advanced turbofan engine control algorithm.

  17. A program for mass spectrometer control and data processing analyses in isotope geology; written in BASIC for an 8K Nova 1210 computer

    USGS Publications Warehouse

    Stacey, J.S.; Hope, J.

    1975-01-01

    A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. An assembly language driver program and interface hardware descriptions for the Nova 1210 are included.
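    The normalization step mentioned in this abstract can be illustrated with a short sketch. This is a generic linear-law mass-fractionation correction for strontium, assuming the conventional reference value 86Sr/88Sr = 0.1194; the function and its form are illustrative, not taken from the report's BASIC program:

```python
# Sketch of a linear-law mass-fractionation correction for Sr isotope
# ratios. Assumes the conventional normalizing value 86Sr/88Sr = 0.1194;
# illustrative only -- not the BASIC program described in the report.

REF_86_88 = 0.1194  # accepted 86Sr/88Sr ratio used for normalization

def normalize_sr(measured_87_86: float, measured_86_88: float) -> float:
    """Correct a measured 87Sr/86Sr ratio for instrumental fractionation."""
    # Fractionation per atomic mass unit, estimated from the deviation of
    # the measured 86/88 ratio from its accepted value (masses differ by 2).
    eps = (REF_86_88 / measured_86_88 - 1.0) / 2.0
    # Masses 87 and 86 differ by one unit, so apply one unit of correction.
    return measured_87_86 * (1.0 + eps)
```

    For example, a run with no apparent fractionation (measured 86/88 equal to 0.1194) leaves the 87/86 ratio unchanged, while a measured 86/88 below the reference value scales the 87/86 ratio up proportionally.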

  18. A digital TV system for the detection of high speed human motion

    NASA Astrophysics Data System (ADS)

    Fang, R. C.

    1981-08-01

    Two array cameras and a force plate were linked to a PDP-11/34 minicomputer for on-line recording of high speed human motion. A microprocessor-based interface system was constructed to allow preprocessing and coordinating of the video data before transfer to the minicomputer. Control programs for the interface system are stored on disk and loaded into the program storage areas of the microprocessor before the interface system begins operation. Software for collecting and processing video and force data has been written. Experiments on the detection of human jumping have been carried out. Normal gait and amputee gait have also been recorded and analyzed.

  19. Computer Program and User Documentation Medical Data Input System

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    Several levels of documentation are presented for the program module of the NASA medical directorate minicomputer storage and retrieval system. The biomedical information system overview gives reasons for the development of the minicomputer storage and retrieval system. It briefly describes all of the program modules which constitute the system. A technical discussion oriented to the programmer is given. Each subroutine is described in enough detail to permit in-depth understanding of the routines and to facilitate program modifications. The program utilization section may be used as a users guide.

  20. Program for Development of Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Culbert, Chris; Lopez, Frank

    1987-01-01

    C Language Integrated Production System (CLIPS) computer program is shell for developing expert systems. Designed to enable research, development, and delivery of artificial intelligence on conventional computers. Primary design goals for CLIPS are portability, efficiency, and functionality. Meets or outperforms most microcomputer- and minicomputer-based artificial-intelligence tools. Written in C.

  1. A Comprehensive Model for the Design of Micro and Mini Computer Systems in School Districts: A Guide for Developing Computer Systems for Local School Districts.

    ERIC Educational Resources Information Center

    Graczyk, Sandra L.; Kiser, Chester

    This administrative and instructional guide offers information and recommendations for computer design techniques based on literature sources and school district applications; design of micro- and mini-computer systems is intended for those with little or no experience. Chapter 1, "Planning for the Computer System: Choosing Purposes and Parts,"…

  2. Computer Series, 51: Bits and Pieces, 20.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1984-01-01

    Describes: Apple stereochemistry program; CNDO/2-INDO mini-computer calculations; direct linear plot procedure for enzyme kinetics calculations; construction of nonlinear Scatchard plots; simulation of mass spectral envelopes of polyisotopic elements; graphics with a dot-matrix printer; MINC computer in the physical chemistry laboratory; hallway…

  3. MODC2 procedures for assembly of MODCOMP-2 programs using the Sigma 5 assembler

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1976-01-01

    A set of programs was written to enable the METASYMBOL macro-assembler of the Sigma 5 to assemble programs for an attached MODCOMP-2 minicomputer. This program set is a follow-on to previously developed program sets which facilitated assemblies for the PDP-11 and SDS-930.

  4. An Interactive, Interdisciplinary, On-Line Graphics System for Presenting and Manipulating Directed Graphs.

    ERIC Educational Resources Information Center

    Beazley, William; And Others

    An interactive graphics system has been implemented for tutorial purposes and for research in man-machine communication of structural digraphs. An IMLAC intelligent terminal with lightpen input is used in conjunction with a NOVA minicomputer. Successful applications in linguistics and engineering problem solving are discussed, the latter in detail.…

  5. Operating manual for the RRL 8 channel data logger

    NASA Technical Reports Server (NTRS)

    Paluch, E. J.; Shelton, J. D.; Gardner, C. S.

    1979-01-01

    A data collection device which takes measurements from external sensors at user specified time intervals is described. Three sensor ports are dedicated to temperature, air pressure, and dew point. Five general purpose sensor ports are provided. The user specifies when the measurements are recorded as well as when the information is read or stored in a minicomputer or a paper tape.

  6. An Examination of the Potential Relationship between Technology and Persistence among At-Risk College Students

    ERIC Educational Resources Information Center

    Hughey, Aaron W.; Manco, Charlene M.

    2012-01-01

    Academically underprepared college students, i.e., those identified as needing developmental (remedial) English, mathematics and reading courses in order to maximize their potential for academic success at college-level studies, were provided with the opportunity to rent, for a minimal, subsidized fee, mini-computers bundled with digital course…

  7. Design and Implementation of Instructional Computer Systems.

    ERIC Educational Resources Information Center

    Graczyk, Sandra L.

    1989-01-01

    Presents an input-process-output (IPO) model that can facilitate the design and implementation of instructional micro and minicomputer systems in school districts. A national survey of school districts with outstanding computer systems is described, a systems approach to develop the model is explained, and evaluation of the system is discussed.…

  8. A Summary and Commentary on D. and S. Premack's "Original Intelligence"

    ERIC Educational Resources Information Center

    Greer, R. Douglas

    2006-01-01

    Some evolutionary cognitive and developmental psychologists propose that the human mind consists of domain-specific modules. These are characterized as self-contained "mini-computers" that process information of a certain kind. In their book, "Original Intelligence," the Premacks set out to provide a synthesis of evidence from various fields in…

  9. The prediction of acoustical particle motion using an efficient polynomial curve fit procedure

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A procedure is examined whereby the acoustic modal parameters, natural frequencies and mode shapes, in the cavities of transportation vehicles are determined experimentally. The acoustic mode shapes are described in terms of the particle motion. The acoustic modal analysis procedure is tailored to existing minicomputer-based spectral analysis systems.

  10. Teaching Tips.

    ERIC Educational Resources Information Center

    Kanervo, Ellen; And Others

    1980-01-01

    Contains teaching ideas from six journalism teachers on the following topics: teaching electronic editing, using minicomputers in an advertising media course, five ways to make grading stories easier, the point and code system of grading, student coverage of state government, and the "guided design" teaching technique. (RL)

  11. COED Transactions, Vol. XI, No. 7 & 8, July/August 1979. A Miniature Automated Warehouse: A Laboratory Teaching Device.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    A do-it-yourself laboratory course in automated systems designed at the University of Florida is described. Using a working model of a warehouse interfaced with a minicomputer as a working laboratory, the student gains hands-on experience in operations programing and applications of scheduling, materials handling, and heuristic optimization. (BT)

  12. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  13. What's Where In Software: An Update.

    ERIC Educational Resources Information Center

    Currents, 1995

    1995-01-01

    A directory lists computer software vendors offering software useful in administering college alumni and development programs. Listings include client/server system vendors and minicomputer and mainframe system vendors. Each listing contains the vendor name and address, contact person, software title(s), cost, hardware requirements, and client…

  14. Commonalities in Pedagogy Situating Cell Phone Use in the Classroom

    ERIC Educational Resources Information Center

    Abend, Laurie Lafer

    2013-01-01

    Technology has become embedded in all aspects of students' lives as they increasingly rely on mobile technology devices such as cell phones to access and share information. Cell phones function as portable, affordable, and ubiquitous mini-computers, yet few teachers have leveraged the benefits of cell phone technology for teaching and learning…

  15. A High Resolution Graphic Input System for Interactive Graphic Display Terminals. Appendix B.

    ERIC Educational Resources Information Center

    Van Arsdall, Paul Jon

    The search for a satisfactory computer graphics input system led to this version of an analog sheet encoder which is transparent and requires no special probes. The goal of the research was to provide high resolution touch input capabilities for an experimental minicomputer based intelligent terminal system. The technique explored is compatible…

  16. Microprocessors in U.S. Electrical Engineering Departments, 1974-1975.

    ERIC Educational Resources Information Center

    Sloan, M. E.

    Drawn from a survey of engineering departments known to be teaching microprocessor courses, this paper shows that the adoption of microprocessors by Electrical Engineering Departments has been rapid compared with their adoption of minicomputers. The types of courses that are being taught can be categorized as: surveys of microprocessors, intensive…

  17. Integrated Computer-Aided Drafting Instruction (ICADI).

    ERIC Educational Resources Information Center

    Chen, C. Y.; McCampbell, David H.

    Until recently, computer-aided drafting and design (CAD) systems were almost exclusively operated on mainframes or minicomputers and their cost prohibited many schools from offering CAD instruction. Today, many powerful personal computers are capable of performing the high-speed calculation and analysis required by the CAD application; however,…

  18. Data Input for Libraries: State-of-the-Art Report.

    ERIC Educational Resources Information Center

    Buckland, Lawrence F.

    This brief overview of new manuscript preparation methods which allow authors and editors to set their own type discusses the advantages and disadvantages of optical character recognition (OCR), microcomputers and personal computers, minicomputers, and word processors for editing and database entry. Potential library applications are also…

  19. Application of Adaptive Decision Aiding Systems to Computer-Assisted Instruction. Final Report, January-December 1974.

    ERIC Educational Resources Information Center

    May, Donald M.; And Others

    The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…

  20. Sunrise to Sunset Lifelong Learning Via Microwave Networks: From a National Heritage.

    ERIC Educational Resources Information Center

    Hart, Russ A.

    Of necessity, adult educators will be turning to technological delivery forms to meet the insistent call for increasing numbers of programs. As teleconferencing, television, microwave, minicomputer, satellite, fiberoptic, and laser technologies continue to expand, they hold promise of educating millions of adult students on and off campus. A…

  1. The Use of Computer Networks in Data Gathering and Data Analysis.

    ERIC Educational Resources Information Center

    Yost, Michael; Bremner, Fred

    This document describes the review, analysis, and decision-making process that Trinity University, Texas, went through to develop the three-part computer network that they use to gather and analyze EEG (electroencephalography) and EKG (electrocardiogram) data. The data are gathered in the laboratory on a PDP-11/24 minicomputer. Once…

  2. Technological Discontinuities and Organizational Environments.

    ERIC Educational Resources Information Center

    Tushman, Michael L.; Anderson, Philip

    1986-01-01

    Technological effects on environmental conditions are analyzed using longitudinal data from the minicomputer, cement, and airline industries. Technology evolves through periods of incremental change punctuated by breakthroughs that enhance or destroy the competence of firms. Competence-destroying discontinuities increase environmental turbulence;…

  3. Technological Discontinuities and Dominant Designs: A Cyclical Model of Technological Change.

    ERIC Educational Resources Information Center

    Anderson, Philip; Tushman, Michael L.

    1990-01-01

    Based on longitudinal studies of the cement, glass, and minicomputer industries, this article proposes a technological change model in which a technological breakthrough, or discontinuity, initiates an era of intense technical variation and selection, culminating in a single dominant design and followed by a period of incremental technical…

  4. An inexpensive vehicle speed detector

    NASA Technical Reports Server (NTRS)

    Broussard, P., Jr.

    1973-01-01

    Low-power minicomputer can plug into automobile cigarette lighter. It measures time it takes observed car to travel premeasured distance and provides immediate readout of speed. Potentially, detector could be manufactured for less than $200 per unit and would have very low maintenance cost.

  5. Industrial robots and robotics

    SciTech Connect

    Kafrissen, S.; Stephens, M.

    1984-01-01

    This book discusses the study of robotics. It provides information on hardware, software, applications, and economics. Eleven chapters examine the following: Minicomputers, Microcomputers, and Microprocessors; The Servo-Control System; The Actuators; Robot Vision Systems; and Robot Workcell Environments. Twelve appendices supplement the data.

  6. Synchronous multi-microprocessor system for implementing digital signal processing algorithms

    SciTech Connect

    Barnwell, T.P. III; Hodges, C.J.M.

    1982-01-01

    This paper discusses the details of a multi-microprocessor system designed as a research facility for studying multiprocessor implementation of digital signal processing algorithms. The overall system, which consists of a control microprocessor, eight satellite microprocessors, a control minicomputer, and extensive distributed software, has proven to be an effective tool in the study of multiprocessor implementations. 5 references.

  7. Computer Managed Instruction in Navy Training.

    ERIC Educational Resources Information Center

    Middleton, Morris G.; And Others

    An investigation was made of the feasibility of computer-managed instruction (CMI) for the Navy. Possibilities were examined regarding a centralized computer system for all Navy training, minicomputers for remote classes, and shipboard computers for on-board training. The general state of the art and feasibility of CMI were reviewed, alternative…

  8. Circulation and Finding System.

    ERIC Educational Resources Information Center

    Pierce, A. R.

    This report describes an online minicomputer-based system, one combining library inventory control with catalog access, that was implemented at Virginia Tech's main library in order to meet the demands of increased circulation activity and rising staff costs. Following overviews of the institutional environment, the systems development department,…

  9. Unix becoming healthcare's standard operating system.

    PubMed

    Gardner, E

    1991-02-11

    An unfamiliar buzzword is making its way into healthcare executives' vocabulary, as well as their computer systems. Unix is being touted by many industry observers as the most likely candidate to be a standard operating system for minicomputers, mainframes and computer networks.

  10. Learning Technologies Prototype Classroom Project

    ERIC Educational Resources Information Center

    Miller, Jo; Janovsky, Kathy

    2003-01-01

    During the 2001 summer holidays, the main Social Science classroom at St Ursula's College, a Catholic Secondary Girls' school of 740 pupils in Toowoomba, Queensland was renovated. A mini-computer laboratory of four nests of computers was incorporated into the traditional teaching space. (See Diagram 1 and photograph). This room was named the…

  11. Microcomputer Page Layout (MicroPLA) Routine for Text-Graphic Materials: User's Guide. Technical Report 162.

    ERIC Educational Resources Information Center

    Galyon, Rosalind; And Others

    Based on an earlier user's guide to a minicomputer page layout system called PLA (Terrell, 1982), this guide is designed for use in the development and production of text-graphic materials for training relatively unskilled technicians to perform complex procedures. A microcomputer version of PLA, MicroPLA uses the Commodore 8032 microcomputer to…

  12. Programable Interface Handles Many Peripherals

    NASA Technical Reports Server (NTRS)

    Jasinski, M.

    1982-01-01

    Microprocessor-based interface simplifies interconnection of peripheral device with common memory of network of minicomputers. Interface consists of microprocessor, bidirectional port that connects to common memory, bidirectional port that connects to user-selected peripheral, and asynchronous serial communications port. Programable interface is based around 6800 microprocessor. It is assembled from 90 integrated circuits.

  13. A system for the management of requests at an image data bank. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Debarrosaguirre, J. L. (Principal Investigator)

    1984-01-01

    An automated system was implemented to supersede existing manual procedures in fulfilling user requests made to a remote sensing data bank, concerning specifically LANDSAT imagery. The system controls the several production steps from request entry to the shipment of each final product. Special solutions and techniques were employed due to the severe limitations, in both hardware and software, of the host minicomputer system.

  14. Radioactivities in returned lunar materials and in meteorites

    NASA Technical Reports Server (NTRS)

    Fireman, E. L.

    1984-01-01

    Carbon 14 terrestrial ages were determined with low level minicomputers and accelerator mass spectrometry on 1 Yamato and 18 Allan Hills and nearby sited meteorites. Techniques for an accelerator mass spectrometer which make C(14) measurements on small samples were developed. Also, Be(10) concentrations were measured in Byrd core and Allan Hills ice samples.

  15. A practical Hadamard transform spectrometer for astronomical application

    NASA Technical Reports Server (NTRS)

    Tai, M. H.

    1977-01-01

    The mathematical properties of Hadamard matrices and their application to spectroscopy are discussed. A comparison is made between Fourier and Hadamard transform encoding in spectrometry. The spectrometer is described and its laboratory performance evaluated. The algorithm and programming of inverse transform are given. A minicomputer is used to recover the spectrum.
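    The encoding and inverse-transform recovery referred to in this abstract can be sketched in a few lines. This sketch assumes a Sylvester-construction Hadamard matrix and uses the identity H^-1 = H^T / n; it illustrates the general Hadamard-multiplexing technique, not the spectrometer's actual program:

```python
# Sketch of Hadamard-transform multiplexing and spectrum recovery
# (Sylvester construction). Illustrative only -- not the instrument's code.

def hadamard(n):
    """Sylvester-construction Hadamard matrix; n must be a power of 2."""
    h = [[1]]
    while len(h) < n:
        # Double the matrix: [[H, H], [H, -H]]
        h = [row + row for row in h] + \
            [row + [-x for x in row] for row in h]
    return h

def encode(spectrum, h):
    """Multiplexed readings: each measurement is a +/- weighted sum of bins."""
    return [sum(hij * s for hij, s in zip(row, spectrum)) for row in h]

def decode(measurements, h):
    """Recover the spectrum using H^-1 = H^T / n."""
    n = len(h)
    return [sum(h[i][j] * measurements[i] for i in range(n)) / n
            for j in range(n)]
```

    Round-tripping a spectrum through encode and decode returns the original values, which is the property the minicomputer exploits when it inverts the multiplexed detector readings.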

  16. Bringing Up an Automated Circulation System: Staffing Needs.

    ERIC Educational Resources Information Center

    Buck, Dayna

    1986-01-01

    This staffing survey of 34 Geac minicomputer-based circulation system installations (public, academic, state, and special libraries, library consortia) was designed to identify the variables which affect staffing needs. Discussion covers differences in responses, types of staff positions, core and ancillary job duties, application training,…

  17. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  18. ARC, State, Private Industry Join to Provide Computer Technician Training in North Georgia.

    ERIC Educational Resources Information Center

    Blanton, Bill

    1982-01-01

    Describes an innovative training program in minicomputer technology at Dalton Junior College (Georgia) funded by the state department of education, private industry and the Appalachian Regional Commission (ARC). Points out reasons for the program's success: shortage of skilled people and the prospect of quick entry into the workworld. (LC)

  19. Local Area Networks.

    ERIC Educational Resources Information Center

    Bullard, David

    1983-01-01

    The proliferation of word processors, micro- and minicomputer systems, and other digital office equipment is causing major design changes in existing networks. Local Area Networks (LANs) which have adequately served terminal users in the past must now be redesigned. Implementation at Clemson is described. (MLW)

  20. Fossil-fuel power plants: Computer systems for power plant control, maintenance, and operation. October 1976-December 1989 (A Bibliography from the COMPENDEX data base). Report for October 1976-December 1989

    SciTech Connect

    Not Available

    1990-02-01

    This bibliography contains citations concerning fossil-fuel power plant computer systems. Minicomputer and microcomputer systems used for monitoring, process control, performance calculations, alarming, and administrative applications are discussed. Topics emphasize power plant control, maintenance and operation. (Contains 240 citations fully indexed and including a title list.)

  1. A Distributed Processing Approach to Payroll Time Reporting for a Large School District.

    ERIC Educational Resources Information Center

    Freeman, Raoul J.

    1983-01-01

    Describes a system for payroll reporting from geographically disparate locations in which data is entered, edited, and verified locally on minicomputers and then uploaded to a central computer for the standard payroll process. Communications and hardware, time-reporting software, data input techniques, system implementation, and its advantages are…

  2. The Rise of K-12 Blended Learning: Profiles of Emerging Models

    ERIC Educational Resources Information Center

    Staker, Heather

    2011-01-01

    Some innovations change everything. The rise of personal computers in the 1970s decimated the mini-computer industry. TurboTax forever changed tax accounting, and MP3s made libraries of compact discs obsolete. These innovations bear the traits of what Harvard Business School Professor Clayton M. Christensen terms a "disruptive innovation."…

  3. Modern programming language

    NASA Technical Reports Server (NTRS)

    Feldman, G. H.; Johnson, J. A.

    1980-01-01

    Structured-programming language is especially tailored for producing assembly-language programs for MODCOMP II and IV minicomputers. Modern programming language consists of set of simple and powerful control structures that include sequencing, alternative selection, looping, sub-module linking, comment insertion, statement continuation, and compilation termination capabilities.

  4. New Information Technologies: Some Observations on What Is in Store for Libraries.

    ERIC Educational Resources Information Center

    Black, John B.

    This outline of new technological developments and their applications in the library and information world considers innovations in three areas: automation, telecommunications, and the publishing industry. There is mention of the growth of online systems, minicomputers, microcomputers, and word processing; the falling costs of automation; the…

  5. A Detailed Comparison of Maxicalculators. Illinois Series on Educational Applications of Computers. Number 6.

    ERIC Educational Resources Information Center

    Doring, Richard; Hicks, Bruce

    A comparison is made of four maxicalculators and two minicomputers with an emphasis on two, the HP 9830 and the Wang 2200. Comparisons are in the form of a table with individual guidelines for analysis followed by the specific characteristics of the particular calculator. Features compared include: manual input facilities, screen, secondary…

  6. Introduction to acoustic emission

    NASA Technical Reports Server (NTRS)

    Possa, G.

    1983-01-01

    Typical acoustic emission signal characteristics are described and techniques which localize the signal source by processing the acoustic delay data from multiple sensors are discussed. The instrumentation, which includes sensors, amplifiers, pulse counters, a minicomputer and output devices is examined. Applications are reviewed.

  7. Automatic processing system for shadowgraph and interference patterns

    NASA Technical Reports Server (NTRS)

    Vereninov, I. A.; Lazarev, V. D.; Popov, S. S.; Tarasov, V. S.

    1987-01-01

    The design and operation of an automatic system for the processing of shadowgraph and interference images are described. The system includes a two-coordinate processing table with an optical system for the projection of transparent images onto the photodetector, an image filter in the photodetector field, and a device for controlling the movement of the table and transmitting information to the minicomputer.

  8. Speak Out and Touch Someone. The OMLTA Yearbook, 1983.

    ERIC Educational Resources Information Center

    Snyder, Barbara, Ed.

    A number of topics of interest to secondary school foreign language teachers are discussed in this issue. The following articles are included: (1) "Teaching and Learning a Foreign Language via Tele(Video)phone: A Futuristic Mini-Computer Design," by G. Harewood; (2) "Meeting Students' Communication Needs," by B. Marckel and "Functional/Notional…

  9. Organizational Strategies for End-User Computing Support.

    ERIC Educational Resources Information Center

    Blackmun, Robert R.; And Others

    1988-01-01

    Effective support for end users of computers has been an important issue in higher education from the first applications of general purpose mainframe computers through minicomputers, microcomputers, and supercomputers. The development of end user support is reviewed and organizational models are examined. (Author/MLW)

  10. Cloud Computing and the Power to Choose

    ERIC Educational Resources Information Center

    Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo

    2010-01-01

    Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…

  11. Hardware Developments; Microcomputers and Processors; Grade School/High School Instructional; and Computer-Aided Design. Papers Presented at the Association for Educational Data Systems Annual Convention (Phoenix, Arizona, May 3-7, 1976).

    ERIC Educational Resources Information Center

    Association for Educational Data Systems, Washington, DC.

    Compiled are ten papers describing computer hardware and computer use in elementary and secondary school instruction presented at the Association for Educational Data Systems (AEDS) 1976 convention. An oral/aural terminal is described followed by two papers about the use of minicomputers and microprocessors. Seven papers discuss various uses of…

  12. An Experimental CMI System on the PDP 11/20.

    ERIC Educational Resources Information Center

    Espeland, L. Roger; Walker, Gerald S.

    A computer-managed instructional (CMI) system is being developed for use in investigating a CMI environment for Air Force technical training using the PDP 11/20 minicomputer. Software and hardware interfaces are now available for 24k core memory with an additional 128k random access disc storage. Hardware interfaces are complete for the student…

  13. A distributed data base management capability for the deep space network

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1976-01-01

    The Configuration Control and Audit Assembly (CCA), designed to provide a distributed data base management capability for the DSN, is reported. The CCA utilizes capabilities provided by the DSN standard minicomputer and the DSN standard nonreal-time, high-level, management-oriented programming language, MBASIC. The characteristics of the CCA for the first phase of implementation are described.

  14. Mass Storage Systems.

    ERIC Educational Resources Information Center

    Ranade, Sanjay; Schraeder, Jeff

    1991-01-01

    Presents an overview of the mass storage market and discusses mass storage systems as part of computer networks. Systems for personal computers, workstations, minicomputers, and mainframe computers are described; file servers are explained; system integration issues are raised; and future possibilities are suggested. (LRW)

  15. Implementing Computer-Assisted Instruction: The Garland Way.

    ERIC Educational Resources Information Center

    Douglas, Eli; Bryant, Deborah G.

    1985-01-01

    After much study, administrators at the Garland Independent School District (Texas) adopted a minicomputer-based system which provides curriculum development for kindergarten through grade 12. The process of implementing computer-assisted instruction in this district is described. Results after the first year and teacher training are examined. (JN)

  16. RAMAS: The RITL Automated Management System. Master Control and Periodicals Control Subsystems. Stockholm Papers in Library and Information Science.

    ERIC Educational Resources Information Center

    Ya-chun, Lian

    An automated minicomputer-based library management system is being developed at the Swedish Royal Institute of Technology Library (RITL). RAMAS (the RITL Automated Management System) currently deals with periodical check-in, claiming, index-handling, and binding control. A RAMAS bibliographic record can be accessed from eight different points…

  17. Freebies for Investors--Precise Incremental Yield Value

    ERIC Educational Resources Information Center

    Michelson, Irving

    1977-01-01

    Competition for savings dollars has led to free gift bonus offers as incentive for new deposits. A concise new formula presented here permits calculation of the total yield using an inexpensive minicomputer. Yield is expressed in terms of interest rate, effective discount value of gift bonus, and period of deposit. (Author/MA)

  18. IPCS user's manual

    SciTech Connect

    McGoldrick, P.R.

    1980-12-11

    The Interprocess Communications System (IPCS) was written to provide a virtual machine upon which the Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) could be built. The hardware upon which the IPCS runs consists of nine minicomputers sharing some common memory.

  19. The ILS--The Pentagon Library's Experience.

    ERIC Educational Resources Information Center

    Mullane, Ruth

    1984-01-01

    Describes implementation of five subsystems of Integrated Library System's (ILS) version 2.1 (minicomputer-based automated library system) at the Pentagon Library: online catalog (search strategies, user acceptance); bibliographic subsystems (cataloging, retrospective conversion); circulation; serials check-in; administrative subsystem (report…

  20. A comprehensive package for DNA sequence analysis in FORTRAN IV for the PDP-11.

    PubMed

    Arnold, J; Eckenrode, V K; Lemke, K; Phillips, G J; Schaeffer, S W

    1986-01-10

    A computer package written in Fortran-IV for the PDP-11 minicomputer is described. The package's novel features are: software for voice-entry of sequence data; a less memory intensive algorithm for optimal sequence alignment; and programs that fit statistical models to nucleic acid and protein sequences.

  1. An Off-Line Simulation System for Development of Real-Time FORTRAN Programs.

    ERIC Educational Resources Information Center

    White, James W.

    Implementation of an ISA FORTRAN standard for executive functions and process input-output within a simulation system called MINIFOR provides a useful real-time program development tool for small, single-function, dedicated minicomputers that have a FORTRAN compiler but limited program development aids. A FORTRAN-based pre-compiler is used off-line to…

  2. Networking Projects around the United States.

    ERIC Educational Resources Information Center

    Klinck, Nancy A., Ed.

    1990-01-01

    Reviews networking projects that have been developed by educational institutions and computer companies. Highlights include minicomputers that network personal computer workstations; voice mail messages; data processing systems; linking university databases; the National Science Foundation Network (NSFNET); a computer link between a high school…

  3. The Local Area Network (LAN) and Library Automation.

    ERIC Educational Resources Information Center

    Farr, Rick C.

    1983-01-01

    Discussion of the use of inexpensive microcomputers in local area information networks (LAN) notes such minicomputer problems as costs, capacity, downtime, and vendor dependence, and the advantages of using LAN in offices and libraries, less costly LAN upgrades, library vendors and LAN systems, and LAN problems. A 28-item bibliography is provided.…

  4. Automatic visual inspection of hybrid microcircuits

    SciTech Connect

    Hines, R.E.

    1980-05-01

    An automatic visual inspection system using a minicomputer and a video digitizer was developed for inspecting hybrid microcircuits (HMC) and thin-film networks (TFN). The system performed well in detecting missing components on HMCs and reduced the testing time for each HMC by 75%.

  5. Description and Initial Evaluation of a Computer-Based Individual Trainer for the Radar Intercept Observer.

    ERIC Educational Resources Information Center

    Rigney, Joseph W.; And Others

    An individual trainer for giving students in the radar intercept observer (RIO) schools concentrated practice in procedures for air-to-air intercepts was designed around a programmable graphics terminal with two integral minicomputers and 8k of core memory. The trainer automatically administers practice in computing values of variables in the…

  6. Using cooperative simultaneous parallelism in nonhomogeneous microcomputer clusters

    SciTech Connect

    Meng, J.

    1984-10-01

    Based on our experience controlling different types of microprocessors in clusters through the central processor-memory connection, and on our successful parallel operation of a cluster of minicomputer central processors, a MIDAS-like arrangement of different microcomputers in a cluster is useful in dealing with a variety of problems. The MIDAS-like arrangement refers to master-slave control of the cluster by a minicomputer and to data being passed through the cluster by connecting prefilled memories into and out of processor address space. We discuss how to connect the hardware into a cluster and conclude with descriptions of how to apply the hardware to selected diverse applications. 11 refs., 8 figs.

  7. Display-management system for MFTF

    SciTech Connect

    Nelson, D.O.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is controlled by 65 local control microcomputers which are supervised by a local network of nine 32-bit minicomputers. Associated with seven of the nine computers are state-of-the-art graphics devices, each with extensive local processing capability. These devices provide the means for an operator to interact with the control software running on the minicomputers. It is critical that the information the operator views accurately reflects the current state of the experiment. This information is integrated into dynamically changing pictures called displays. The primary organizational component of the display system is the software-addressable segment. The segments created by the display creation software are managed by display managers associated with each graphics device. Each display manager uses sophisticated storage management mechanisms to keep the proper segments resident in the local graphics device storage.

  8. Optical computer switching network

    NASA Technical Reports Server (NTRS)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.

  9. Continuous fission-product monitor system at Oyster Creek. Final report

    SciTech Connect

    Collins, L.L.; Chulick, E.T.

    1980-10-01

    A continuous on-line fission product monitor has been installed at the Oyster Creek Nuclear Generating Station, Forked River, New Jersey. The on-line monitor is a minicomputer-controlled high-resolution gamma-ray spectrometer system. An intrinsic Ge detector scans a collimated sample line of coolant from one of the plant's recirculation loops. The minicomputer is a Nuclear Data 6620 system. Data were accumulated for the period from April 1979 through January 1980, the end of cycle 8 for the Oyster Creek plant. Accumulated spectra, an average of three a day, were stored on magnetic disk and subsequently analyzed for fission products. Because of difficulties in measuring absolute detector efficiency, quantitative fission product concentrations in the coolant could not be determined. Data for iodine fission products are reported as a function of time. The data indicate the existence of fuel defects in the Oyster Creek core during cycle 8.

  10. Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1984-01-01

    The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Particularly convenient for evaluation are digital data stored on computer compatible tapes (CCTs). The major advantages of CCTs are the quality of the data and their accessibility to computer manipulation. Minicomputer systems are widely used for the required processing operations; however, microprocessor-related technological advances now make it possible to process CCT data with computing systems available at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.

  11. A computer-aided design system geared toward conceptual design in a research environment. [for hypersonic vehicles]

    NASA Technical Reports Server (NTRS)

    Stack, S. H.

    1981-01-01

    A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.

  12. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two-dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP1000 series E with an RTE 2 disc operating system and 32K words of memory. A Grinnel 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on an 800 BPI, 9-track tape. Four 512 x 512 images representing four spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.

  13. Cactus

    SciTech Connect

    Sexton, R.L.

    1983-03-01

    The CACTUS project (computer-aided control, tracking, and updating system) was initiated by the Bendix Kansas City Division to address specific work-in-process problems encountered in a cable department. Since then, the project has been expanded to additional electrical manufacturing departments because of potential productivity gains from the system. The philosophy of CACTUS is to add an element of distributed data processing to the centralized data processing system currently in use for control of work in process. Under this system, the existing chain of communications between the host computer and the CRT terminals in a department is severed. A mini-computer established in the department communicates directly with the central system, and departmental communication is then established with the mini-computer. The advantages, disadvantages, operation performance, and economics of the system are discussed.

  14. TDRSS data handling and management system study. Ground station systems for data handling and relay satellite control

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a two-phase study of the Data Handling and Management System (DHMS) are presented. An original baseline DHMS is described. Its estimated costs are presented in detail. The DHMS automates the Tracking and Data Relay Satellite System (TDRSS) ground station's functions and handles both the forward and return link user and relay satellite data passing through the station. Direction of the DHMS is effected via a TDRSS Operations Control Central (OCC) that is remotely located. A composite ground station system, a modified DHMS (MDHMS), was conceptually developed. The MDHMS performs both the DHMS and OCC functions. Configurations and costs are presented for systems using minicomputers and midicomputers. It is concluded that a MDHMS should be configured with a combination of the two computer types. The midicomputers provide the system's organizational direction and computational power, and the minicomputers (or interface processors) perform repetitive data handling functions that relieve the midicomputers of these burdensome tasks.

  15. Instructions for using the U.S. Geological Survey data base of wells on Long Island, New York

    USGS Publications Warehouse

    Hawkins, George W.; Terlecki, Gregory M.

    1983-01-01

    The population of central and eastern Long Island, New York depends on ground water for its supply of fresh water. Data on more than 7,500 wells on the island have been collected by various State and local agencies and compiled by the U.S. Geological Survey since 1906. During 1975-81, the Geological Survey developed a data base for its Data General Nova 1220 minicomputer to store and process the well information. The data base is composed of seven sections, each of which may be revised and updated. Three types of magnetic devices with limited capacity are used for data storage--disk, Linctape, and 9-track tape. This breakdown makes each section small enough to store and update on a small minicomputer while allowing simultaneous data retrieval from all sections. This manual gives complete instructions for revising, storing, and retrieving well data. Most programming is in FORTRAN, but some is in assembly language. (USGS)

  16. Escort: A data acquisition and display system to support research testing

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1978-01-01

    Primarily designed to acquire data at steady state test conditions, the system can also monitor slow transients such as those generated in moving to a new test condition. The system configuration makes use of a microcomputer at the test site which acts as a communications multiplexer between the measurement and display devices and a centrally located minicomputer. A variety of measurement and display devices are supported using a modular approach. This allows each system to be configured with the proper combination of devices to meet the specific test requirements, while still leaving the option to add special interfaces when needed. Centralization of the minicomputer improves utilization through sharing. The creation of a pool of minis to provide data acquisition and display services to a variable number of running tests also offers other important advantages.

  17. Radiology capital asset management.

    PubMed

    Wagener, G N; Pridlides, A J

    1993-01-01

    Radiology administrators are expected not only to take on ultimate accountability for meeting the needs and challenges of present day-to-day operations, but also to plan for the future. Computer Aided Facility Management (CAFM), as a tool, enables radiology managers to obtain up-to-date data to manage their services. Using AutoCAD on a UNIX-based minicomputer as the graphical base generator and integrating information from a MUMPS-based minicomputer, the CAFM process can define areas to be studied for productivity and life-cycle costs. From an analysis of radiology services, management was able to make solid judgment calls on equipment replacement and facility renovation projects to manage radiology resources effectively.

  18. Instrumentation, techniques and data reduction associated with airfoil testing programs at Wichita State University

    NASA Technical Reports Server (NTRS)

    Rodgers, E. J.; Wentz, W. H., Jr.; Seetharam, H. C.

    1978-01-01

    Two dimensional airfoil testing was conducted at the Wichita State University Beech Wind Tunnel for a number of years. The instrumentation developed and adapted during this period of testing for determination of flow fields along with traversing mechanisms for these probes are discussed. In addition, some of the techniques used to account for interference effects associated with the apparatus used for this two dimensional testing are presented. The application of a minicomputer to the data reduction and presentation is discussed.

  19. Dedicated multiprocessor system for calculating Josephson-junction noise thermometer frequency variances at high speed

    SciTech Connect

    Cutkosky, R.D.

    1983-07-01

    A Josephson-junction noise thermometer produces a sequence of frequency readings from whose variations the temperature of the thermometer may be calculated. A preprocessor system has been constructed to collect the frequency readings delivered to an IEEE 488 bus by an ordinary counter operating at up to 1000 readings per second, perform the required calculations, and send summary information to a desk calculator or minicomputer on another 488 bus at a more convenient rate.

  20. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  1. A Workstation-Based Inpatient Clinical System in the Johns Hopkins Hospital

    PubMed Central

    Schneider, Marvin; Tolchin, Stephen G.; Kahane, Stephen N.; Goldberg, Howard S.; Barta, Patrick

    1985-01-01

    The Johns Hopkins Hospital has initiated an ambitious program to apply modern technologies to the development of a new, comprehensive clinical information system. This system integrates many distinct functional subsystems using a local area network. One component of this system is a distributed inpatient clinical management system. This paper discusses a workstation-based design with minicomputer support. User interface requirements, system architecture, project plans and alternative approaches are discussed.

  2. [Quantified self movement--the new mantra of life insurance companies].

    PubMed

    Becher, St

    2016-06-01

    Wearables are small personal minicomputers that register biometric data. In such a way, the insurance industry hopes to create new sales opportunities and products, and simplify underwriting. Lower premiums will promote the use of wearables. The related possibilities and unanswered questions are discussed in this article. Utilisation of big data offers the insurance industry a range of new opportunities. The benefit must be proven in the future, however. PMID:27483687

  3. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. It provides two-dimensional graphic display of telemetric information and interaction with the computer in the analysis and processing of telemetric parameters displayed on the screen. The running-parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, as well as the user language, are discussed and illustrated.

  4. Contribution Of Infrared Strobophotogrammetry In Movements Analysis - Applications

    NASA Astrophysics Data System (ADS)

    Mallard, R.; Cololentz, A.; Fossier, E.

    1986-07-01

    A three-dimensional television system (VICON) is described. The device is connected to a minicomputer and is used for biostereometric study of human movements. Three synchronised video cameras capture up to forty trajectories of reflective markers fixed on anatomical landmarks and illuminated by infrared stroboscopes at fifty fields per second. 3-D trajectories are computed automatically. Displacement, velocity, acceleration, and angle data are used to model movements.

  5. Simple digital pulse-programing circuit

    NASA Technical Reports Server (NTRS)

    Langston, J. L.

    1979-01-01

    Pulse-sequencing circuit uses only a shift register and Exclusive-OR gates. The circuit also serves as a data-transition edge detector (for rising or falling edges). It is used in sample-and-hold and analog-to-digital conversion sequence control, multiphase clock logic, precise delay control, computer control logic, edge detection, and other timing applications, and it provides a simple means to generate timing and control signals for data transfer, addressing, or mode control in microprocessors and minicomputers.

  6. TMS communications hardware. Volume 2: Bus interface unit

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Hopkins, G. T.

    1979-01-01

    A prototype coaxial cable bus communication system used in the Trend Monitoring System to interconnect intelligent graphics terminals to a host minicomputer is described. The terminals and host are connected to the bus through a microprocessor-based RF modem termed a Bus Interface Unit (BIU). The BIU hardware and the Carrier Sense Multiple Access Listen-While-Talk protocol used on the network are described.

  7. TMS communications hardware. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Weinrich, S. S.

    1979-01-01

    A prototype coaxial cable bus communications system was designed to be used in the Trend Monitoring System (TMS) to connect intelligent graphics terminals (based around a Data General NOVA/3 computer) to a MODCOMP IV host minicomputer. The direct memory access (DMA) interfaces utilized for each of these computers are identified. It is shown that for the MODCOMP an off-the-shelf board was suitable, while for the NOVAs custom interface circuitry was designed and implemented.

  8. TMS communications software. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  9. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  10. X- And y-axis driver for rotating microspheres

    DOEpatents

    Weinstein, Berthold W.

    1979-01-01

    Apparatus for precise control of the motion and position of microspheres for examination of their interior and/or exterior. The apparatus includes an x- and y-axis driver mechanism controlled, for example, by a minicomputer for selectively rotating microspheres retained between a pair of manipulator arms having flat, smooth end surfaces. The driver mechanism includes an apertured plate and ball arrangement which provides coupled, equal and opposite movement of the manipulator arms in two perpendicular directions.

  11. Study of software application of airborne laser doppler system for severe storms measurement

    NASA Technical Reports Server (NTRS)

    Alley, P. L.

    1979-01-01

    Significant considerations are described for performing a Severe Storms Measurement program in real time. Particular emphasis is placed on the sizing and timing requirements for a minicomputer-based system. Analyses of several factors which could impact the effectiveness of the system are presented. The analyses encompass the problems of data acquisition, data storage, data registration, correlation, and flow field computation, and error induced by aircraft motion, moment estimation, and pulse integration.

  12. A computer system for biomedical equipment maintenance reporting.

    PubMed

    Rabbie, H R; Korte, R L

    1979-01-01

    Biomedical equipment maintenance and repair activities involve an increasing amount of paperwork and report generation. A computer system is described that collects the necessary data on-line for accuracy and efficiency. The system can generate various reports relating to repair history, spare parts usage and preventive maintenance scheduling. It is implemented on a time-sharing minicomputer, and can reduce substantially the time spent by biomedical engineers in documenting their activities. PMID:10243926

  13. High-intensity flux mapper for concentrating solar collectors

    SciTech Connect

    Cannon, T.W.; Gaul, H.W.

    1982-02-01

    The flux mapper consists of a ceramic scatter plate, video camera with silicon diode array image tube (vidicon), 75 mm focal-length lens with appropriate filters, video frame store, television monitors, disk drive, magnetic tape drive and minicomputer. The camera and scatter plate are installed on a parabolic solar collector at SERI's Advanced Component Research Facility. Calibration was made by focussing the sun directly onto the vidicon target. Light intensity calibration is estimated to be accurate to about 7%. (LEW)

  14. Signal processing at the Poker Flat MST radar

    NASA Technical Reports Server (NTRS)

    Carter, D. A.

    1983-01-01

    Signal processing for Mesosphere-Stratosphere-Troposphere (MST) radar is carried out by a combination of hardware in high-speed, special-purpose devices and software in a general-purpose, minicomputer/array processor. A block diagram of the signal processing system is presented, and the steps in the processing pathway are described. The current processing capabilities are given, and a system offering greater coherent integration speed is advanced which hinges upon a high speed preprocessor.

  15. Operating System for Automation of an Experiment in Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Bogdanov, V. V.

    The problem-oriented operating system ER (for the minicomputer "Electronica-100 I") is intended for the low level of the "acquisition" unit of the automated complex for radio observations at the RATAN-600 radio telescope. Its main functions are: conducting the user-computer dialogue, providing a multitasking regime with multiprogramming, and controlling data input/output.

  16. SHIVA - A multitask data acquisition system for the Oslo University cyclotron laboratory

    SciTech Connect

    Skaali, B.; Haugen, A.; Ingebretsen, F.; Midttun, G.

    1983-10-01

    The authors describe a general nuclear data acquisition system implemented on a minicomputer using the standard facilities of a real time operating system. The CAMAC data acquisition hardware is controlled by a high speed ADC scanner module. Sorting of multiparameter data is based on a flexible Transformation Of Nuclear Event (TONE) language. The data processing rate, including tape transfer, is several thousand events/s, depending on the complexity of the sorting program.

  17. Flight simulators. Part 1: Present situation and trends. Part 2: Implications for training

    NASA Technical Reports Server (NTRS)

    Hass, D.; Volk, W.

    1977-01-01

    The present situation and developments in the technology of flight simulators based on digital computers are evaluated from the standpoint of training airline flight crews. Areas covered are minicomputers and their advantages in terms of cost, space and time savings, software data packets, motion simulation, visual simulation and instructor aids. The division of training time between aircraft and simulator training and the possible advantages from increased use of simulators are evaluated.

  18. FTIR (Fourier transform infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    SciTech Connect

    Cox, J.N.; Sedayao, J.; Shergill, G.; Villasol, R. ); Haaland, D.M. )

    1990-01-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents of borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is among the most extensively computer-dependent equipment likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three cases of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the results to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally; the large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a "Partial Least Squares" analysis model, which is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring are also discussed. 10 refs., 4 figs.

  19. Data base design for a worldwide multicrop information system

    NASA Technical Reports Server (NTRS)

    Driggers, W. G.; Downs, J. M.; Hickman, J. R.; Packard, R. L. (Principal Investigator)

    1979-01-01

    A description of the USDA Application Test System data base design approach and resources is presented. The data are described in detail by category, with emphasis on the characteristics that most influenced the design. It was concluded that the use of a generalized data base in support of crop assessment is a sound concept. The IDMS11 minicomputer-based system is recommended for this purpose.

  20. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  1. Processing PCM Data in Real Time

    NASA Technical Reports Server (NTRS)

    Wissink, T. L.

    1982-01-01

    Novel hardware configuration makes it possible for Space Shuttle launch processing system to monitor pulse-code-modulated data in real time. Using two microprogrammable "option planes," incoming PCM data are monitored for changes at a rate of one frame of data (80 16-bit words) every 10 milliseconds. Real-time PCM processor utilizes CPU in minicomputer and CPUs in two option planes.

  2. Laser velocimeter (autocovariance) buffer interface

    NASA Technical Reports Server (NTRS)

    Clemmons, J. I., Jr.

    1981-01-01

    A laser velocimeter (autocovariance) buffer interface (LVABI) was developed to serve as the interface between three laser velocimeter high speed burst counters and a minicomputer. A functional description is presented of the instrument and its unique features which allow the studies of flow velocity vector analysis, turbulence power spectra, and conditional sampling of other phenomena. Typical applications of the laser velocimeter using the LVABI are presented to illustrate its various capabilities.

  3. Functional definition and design of a USDA system

    NASA Technical Reports Server (NTRS)

    Evans, S. M.; Dario, E. R.; Dickinson, G. L. (Principal Investigator)

    1979-01-01

    The fundamental definition and design of a USDA system utilizing the LACIE technology available as of June 1976 is discussed. The organization and methods described focus on LACIE technology in terms of its transfer for user applications. The simulation of a feasible system design provided timely answers to system design questions, such as the ability of a minicomputer to handle the proposed geometrical correction of MSS data.

  4. [Quantified self movement--the new mantra of life insurance companies].

    PubMed

    Becher, St

    2016-06-01

    Wearables are small personal minicomputers that register biometric data. With them, the insurance industry hopes to create new sales opportunities and products, and to simplify underwriting; lower premiums will promote the use of wearables. The related possibilities and unanswered questions are discussed in this article. Utilisation of big data offers the insurance industry a range of new opportunities. The benefit, however, remains to be proven.

  5. Large aperture ac interferometer for optical testing.

    PubMed

    Moore, D T; Murray, R; Neves, F B

    1978-12-15

    A 20-cm clear aperture modified Twyman-Green interferometer is described. The system measures phase with an AC technique called phase-lock interferometry while scanning the aperture with a dual galvanometer scanning system. Position information and phase are stored in a minicomputer with disk storage. This information is manipulated with associated software, and the wavefront deformation due to a test component is graphically displayed in perspective and contour on a CRT terminal. PMID:20208642

  6. Real-time flight test data distribution and display

    NASA Technical Reports Server (NTRS)

    Nesel, Michael C.; Hammons, Kevin R.

    1988-01-01

    Enhancements to the real-time processing and display systems of the NASA Western Aeronautical Test Range are described. Display processing has been moved out of the telemetry and radar acquisition processing systems super-minicomputers into user/client interactive graphic workstations. Real-time data is provided to the workstations by way of Ethernet. Future enhancement plans include use of fiber optic cable to replace the Ethernet.

  7. Automation of the process of speech signal segmentation in an analogic-numeric system

    NASA Astrophysics Data System (ADS)

    Domagala, P.

    Eighteen Polish words uttered by 12 voices (7 male and 5 female) were tape-recorded and analyzed by computer. Numeric analysis of the dynamic spectrum was implemented on the MERA 303 minicomputer using an algorithm composed of simple logical statements. Compared with the visual segmentation achieved in the spectrographic computer images, correctness of segmentation reached a level of about 94 percent. No differences in quality of segmentation were found between male and female utterances.

  8. The use of computers in dairy herd health program: A review

    PubMed Central

    Lissemore, Kerry D.

    1989-01-01

    This review of the literature covers the changes in the approach to veterinary health management that led to the introduction of computerized herd health programs and the various other applications of the computer in the practice of dairy herd medicine. The roles that production recording systems, mainframe computers, minicomputers, and microcomputers have played in the evolution of herd health programs are also reviewed. PMID:17423392

  9. Optimizing Xenix I/O

    SciTech Connect

    Bottorff, P.; Potts, B.

    1983-08-01

    High performance microprocessors, inexpensive Winchester disk drives and low cost high density dynamic random access memories are making it feasible to incorporate minicomputer operating systems such as Unix into multiuser/multitasking microcomputers. However, before Unix and its derivatives can be efficiently integrated into a microcomputer environment, certain I/O and memory management hardware design problems previously limited to larger computer systems must be solved. These are discussed.

  10. The Parkes radio telescope - 1986

    NASA Astrophysics Data System (ADS)

    Ables, J. G.; Jacka, C. E.; McConnell, D.; Schinckel, A. E.; Hunt, A. J.

    The Parkes radio telescope has been refurbished 25 years after its commissioning in 1961, with complete replacement of its drive and control systems. The new computer system distributes computing tasks among a loosely coupled network of minicomputers which communicate via full duplex serial lines. Central to the control system is the 'CLOCK' element, which relates all positioning of the telescope to absolute time and synchronizes the logging of astronomical data. Two completely independent servo loops furnish telescope positioning functions.

  11. A Fortran program for fast and compact processing of clinical radiotherapy data.

    PubMed

    Coles, I P; Dale, R G

    1984-01-01

    A set of Fortran IV programs has been developed to enable a patient registry to operate on a minicomputer of a type frequently used for treatment planning within radiotherapy departments. The system is both comprehensive and flexible, allowing efficient storage of clinical data in the form of coded units. The coding format used enables inexperienced operators to enter data into, or extract data from, the system with a minimum of keyboard operations.

  12. A new theory for rapid calculation of the ground pattern of the incident sound intensity produced by a maneuvering jet airplane

    NASA Technical Reports Server (NTRS)

    Barger, R. L.

    1980-01-01

    An approximate method for computing the jet noise pattern of a maneuvering airplane is described. The method permits one to relate the noise pattern individually to the influences of airplane speed and acceleration, jet velocity and acceleration, and the flight path curvature. The analytic formulation determines the ground pattern directly without interpolation and runs rapidly on a minicomputer. Calculated examples including a climbing turn and a simple climb pattern with a gradual throttling back are presented.

  13. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  14. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real time collection system was developed for the Urbana coherent scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described as well as the processing software used to analyze the data. In addition a magnetic tape format for coherent scatter data exchange is given.

  15. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.

  16. libvaxdata: VAX data format conversion routines

    USGS Publications Warehouse

    Baker, Lawrence M.

    2005-01-01

    libvaxdata provides a collection of routines for converting numeric data (integer and floating-point) to and from the formats used on a Digital Equipment Corporation (DEC) VAX 32-bit minicomputer (Brunner, 1991). Since the VAX numeric data formats are inherited from those used on a DEC PDP-11 16-bit minicomputer, these routines can be used to convert PDP-11 data as well. VAX numeric data formats are also the default data formats used on DEC Alpha 64-bit minicomputers running OpenVMS. The libvaxdata routines are callable from Fortran or C. They require that the caller use two's-complement format for integer data and IEEE 754 format (ANSI/IEEE, 1985) for floating-point data. They also require that the 'natural' size of a C int type (integer) is 32 bits. That is the case for most modern 32-bit and 64-bit computer systems; nevertheless, you may wish to consult the Fortran or C compiler documentation on your system to be sure. Some Fortran compilers support conversion of VAX numeric data on-the-fly when reading or writing unformatted files, either as a compiler option or a run-time I/O option. This feature may be easier to use than the libvaxdata routines. Consult the Fortran compiler documentation on your system to determine whether this alternative is available to you. (DEC later became Compaq Computer Corporation, now Hewlett-Packard Company.)
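
    As a rough illustration of what such a conversion routine does, the Python sketch below decodes a single VAX F_floating value into a native float. It is an assumption-laden sketch, not libvaxdata itself: the function name is invented, and reserved operands and values that would underflow IEEE 754 are not handled.

```python
import struct

def vax_f_to_ieee(raw: bytes) -> float:
    """Decode one VAX F_floating value (4 bytes as stored on disk).

    Illustrative sketch only: reserved operands (sign=1, exponent=0)
    and exponents that would underflow IEEE 754 are not handled.
    """
    # F_floating is stored as two little-endian 16-bit words with the
    # sign/exponent word first (PDP-11 heritage); swapping the halves
    # yields a plain sign(1) | exponent(8) | fraction(23) layout.
    w0, w1 = struct.unpack('<HH', raw)
    bits = (w0 << 16) | w1
    sign = (bits >> 31) & 0x1
    exp = (bits >> 23) & 0xFF
    frac = bits & 0x7FFFFF
    if exp == 0:
        return 0.0  # VAX true zero (dirty zeros also mapped to 0 here)
    # VAX: (-1)^s * 0.1f * 2^(exp-128)  ==  (-1)^s * 1.f * 2^(exp-129),
    # i.e. the VAX exponent bias is effectively 2 larger than IEEE's.
    value = (1.0 + frac / 2**23) * 2.0 ** (exp - 129)
    return -value if sign else value
```

    For example, the on-disk bytes 80 40 00 00 decode to 1.0. The two-word swap is the same reason these routines also serve for PDP-11 data.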

  17. The future of cost accounting systems in healthcare.

    PubMed

    Ladd, R D; Feverstein, T M

    1987-07-01

    The development of cost accounting/cost management programs provides one of the most exciting systems development opportunities for healthcare professionals. Despite countervailing factors, the requirement for cost management information is here to stay. The current status of systems development can be described as a positive step by a majority of institutions. To address system requirements, there are currently 16 mainframe, 20 minicomputer, and 29 microcomputer software programs available. The availability of these software resources identifies numerous alternatives for future cost accounting/cost management applications. The question has become not whether you require a cost management application, but rather what kind.

  18. Calculating Storage Requirements for Office Practice Systems

    PubMed Central

    Stead, William W.; Hammond, William E.

    1985-01-01

    The disk space requirements of small and medium sized group practices using a comprehensive medical information system supported by either a micro-computer or a mini-computer are analyzed. Efficient operation requires that 23%-54% of a typical system disk be used for files other than patient records. Data is presented to allow prediction of both the number of records that will need to be maintained for a practice and the average size of each record based upon the type of data required by the practice.
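
    The sizing argument above can be reduced to simple arithmetic. The helper below is purely illustrative (the function name and default are assumptions); it treats the abstract's 23%-54% figure as the fraction of the whole disk claimed by non-patient files, so total capacity is the patient-record volume divided by what remains.

```python
def disk_needed(n_patients: int, avg_record_bytes: float,
                overhead_frac: float = 0.54) -> float:
    """Estimate total disk bytes for a practice system.

    Hypothetical helper: the 23%-54% system-file overhead range is
    taken from the abstract above; the name and default are illustrative.
    System files occupy overhead_frac of the WHOLE disk, so:
        total * (1 - overhead_frac) = patient-record bytes
    """
    patient_bytes = n_patients * avg_record_bytes
    return patient_bytes / (1.0 - overhead_frac)

# e.g. 5,000 patients at 2 KB each, worst-case 54% overhead:
# disk_needed(5000, 2048) -> roughly 22 MB of total disk
```

    The key point of the division is that overhead stated as a fraction of the whole disk inflates the requirement more than the same fraction applied to the records alone.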

  19. Correlator computer interface and module implementation: Mark 3 processor

    NASA Technical Reports Server (NTRS)

    Nesman, E. F.

    1980-01-01

    Two hardware aspects of the Mark 3 correlator are briefly described. The first area concerns the choice of interface to the controlling minicomputer and the second area concerns the implementation of the correlator module. Multiple computer automated measurement and control (CAMAC) modules and a single large CAMAC module were considered as possible packaging forms for the correlator. The large CAMAC module approach was chosen because of the difficulty in partitioning the correlator with minimum interconnections, the fabrication economy of a single large planar assembly, and the desire to minimize the number of modules.

  20. User's manual for the Functional Relay Operation Monitor (FROM)

    SciTech Connect

    Gustke, F.R.

    1981-02-01

    Sandia's Digital Systems Development Division 1521 has developed a new functional relay tester. Capabilities of this tester include the measurement of coil and contact resistance, hipot, operate current, and contact operation and bounce times. The heart of the tester is a Hewlett-Packard 21MX minicomputer that uses BASIC or FORTRAN programming languages. All measurements are made by means of simple program calls, and all measurement standards are traceable to the National Bureau of Standards. Functional relay test data are stored on a disc drive and can be output as hard copy, manipulated in the computer, or sent over a distributed-system link to other Sandia computers. 17 figures, 4 tables.

  1. Use of functional mass in renal scintigraphy to detect segmental arterial lesions

    SciTech Connect

    Stibolt, T.B. Jr.; Bacher, J.D.; Dunnick, N.R.; Lock, A.; Jones, A.E.; Bailey, J.J.

    1982-04-01

    Renography using a gamma camera, a minicomputer, ¹²³I-orthoiodohippurate (¹²³I-OIH), and a canine model was employed to evaluate computer-generated maps of regional renal function. Renograms were obtained before and after ligations of the right renal arterial branch in four dogs, with subsequent angiographic and histologic confirmation of the lesions. Postoperative time-activity curves were normal. Washout and persistence index in three of four right kidneys showed regional abnormality. Functional renal mapping may provide a clinical technique for evaluating human renal vascular hypertension.

  2. Operations manual for the data acquisition and reduction system, Area III Actuator Test Facility

    SciTech Connect

    Smith, E.L.

    1984-03-01

    This manual describes the operation of the new minicomputer-based data acquisition and reduction system for the Area III Actuator Test Facility at Sandia National Laboratories. The stand-alone digital system will alleviate current problems with test control and analysis and with data presentation. In its initial configuration, it will digitize test results recorded on FM tape and present the data graphically for reports. The ultimate goal of this system is to upgrade the performance, safety, and turnaround time of actuator tests through improved quality, analysis, and presentation of data.

  3. Development and Implementation of Kumamoto Technopolis Regional Database T-KIND

    NASA Astrophysics Data System (ADS)

    Onoue, Noriaki

    T-KIND (Techno-Kumamoto Information Network for Data-Base) is a system for effectively searching information on the technology, human resources, and industries needed to realize Kumamoto Technopolis. It is composed of a coded database, an image database, and a LAN inside the techno-research park that is the center of R & D in the Technopolis. It forms an on-line system by networking general-purpose computers, minicomputers, optical disk file systems, and so on, and provides its service through the public telephone line. Two databases are now available, covering enterprise information and human resource information; the former covers about 4,000 enterprises and the latter about 2,000 persons.

  4. Spherical wave decomposition approach to ultrasonic field calculations

    SciTech Connect

    Griffice, C.P.; Seydel, J.A.

    1981-12-01

    A simple, flexible, accurate, and comprehensive numerical method is presented for theoretically analyzing the diffraction field of a continuous wave transducer of arbitrary size, shape, and frequency. Using the extensively studied circular transducer for comparison, numerical results are shown for an unfocused transducer with uniform velocity excitation as well as for a focused transducer with Gaussian velocity excitation. Data concerning the execution time, program size, and convergence of the method are also presented for its implementation as a design tool on a minicomputer system.

  5. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  6. A Reporting System for Non-Invasive Cardiovascular Investigations

    PubMed Central

    Covvey, H.D.; Van Horik, M.; Hum, J.; Sole, M.J.; Schwartz, L.; Rakowski, H.; Wigle, E.D.

    1978-01-01

    A computer-based system has been developed to support the collection, reporting and storage of data acquired during non-invasive cardiac investigations. Currently the system serves 1-D echocardiography and graded exercise testing. Optical mark forms are used to record information in computer-readable form. A terminal station consisting of a CRT terminal, an optical mark reader and a printer is used for input and output from a central minicomputer database management system. Even when the costs associated with database storage are included, the overall cost of the system compares favorably with the option of using typists to produce reports.

  7. Quantitative analysis of defects in silicon. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.

    1980-01-01

    One hundred seventy-four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP 11/03 minicomputer with dual floppy disc drive, a DECwriter high-speed printer, and a field-image feature interface module. Three versions of a computer program that controls the data acquisition and analysis on the QTM-720 were written. Procedures for the chemical polishing and etching were also developed.

  8. Development and Operation of a MUMPS Laboratory Information System: A Decade's Experience

    PubMed Central

    Miller, R. E.; Causey, J. P.; Moore, G. W.; Wilk, G. E.

    1988-01-01

    We describe more than a decade's experience with in-house development and operation of a clinical laboratory computer system written in the MUMPS programming language for a 1000-bed teaching hospital. The JHLIS is a networked minicomputer system that supports accessioning, instrument monitoring, and result reporting for over 3000 specimens and 30,000 test results daily. Development and operation of the system accounts for 6% of the budget of the laboratories, which have had a 70% increase in workload over the past decade. Our experience with purchased MUMPS software maintained and enhanced in-house suggests an attractive alternative to lengthy in-house development.

  9. Implementation of the Integrated Library System: University of Maryland Health Sciences Library.

    PubMed Central

    Feng, C C; Freiburger, G; Knudsen, P C

    1983-01-01

    The Health Sciences Library, University of Maryland, has implemented the Integrated Library System (ILS), a minicomputer-based library automation system developed by the Lister Hill National Center for Biomedical Communications, National Library of Medicine. The process of moving a library from a manual to a computerized system required comprehensive planning and strong commitment by the staff. Implementation activities included hardware and software modification, conversion of manual files, staff training, and system publicity. ILS implementation resulted in major changes in procedures in the circulation, reference, and cataloging departments. PMID:6688748

  10. Programming for energy monitoring/display system in multicolor lidar system research

    NASA Technical Reports Server (NTRS)

    Alvarado, R. C., Jr.; Allen, R. J.

    1982-01-01

    The Z80 microprocessor based computer program that directs and controls the operation of the six channel energy monitoring/display system that is a part of the NASA Multipurpose Airborne Differential Absorption Lidar (DIAL) system is described. The program is written in the Z80 assembly language and is located on EPROM memories. All source and assembled listings of the main program, five subroutines, and two service routines along with flow charts and memory maps are included. A combinational block diagram shows the interfacing (including port addresses) between the six power sensors, displays, front panel controls, the main general purpose minicomputer, and this dedicated microcomputer system.

  11. Networking of microcomputers in the radiology department.

    PubMed

    Markivee, C R

    1985-10-01

    A microcomputer may be installed in any of several areas in a radiology department or office to automate data processing. Such areas include the reception desk, the transcription office, the quality-control station, and remote or satellite radiography rooms. Independent microcomputers can be interconnected by networking, using small hardware and software packages and cables, to effect communication between them, afford access to a common data base, and share peripheral devices such as hard disks and printers. A network of microcomputers can perform many of the functions of a larger minicomputer system at lower cost and can be assembled in small modules as budgetary constraints allow. PMID:3876011

  12. Data base management study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  13. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.

  14. Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems

    NASA Astrophysics Data System (ADS)

    Dewey, D.; Ricker, G. R.

    The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

  15. GEM: Statistical weather forecasting procedure

    NASA Technical Reports Server (NTRS)

    Miller, R. G.

    1983-01-01

    The objective of the Generalized Exponential Markov (GEM) Program was to develop a weather forecast guidance system that would: predict all elements of the airways observations from 0 to 6 hours ahead; respond instantly to the latest observed surface weather conditions; process these observations at local sites on minicomputing equipment; exceed the accuracy of current persistence predictions at the shortest prediction interval of one hour and beyond; exceed the accuracy of current forecast model output statistics inside eight hours; and be capable of making predictions at one location for all locations where weather information is available.

  16. UNIX-based data management system for the Mobile Satellite Propagation Experiment (PiFEx)

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.

    1987-01-01

    A new method is presented for handling data resulting from Mobile Satellite propagation experiments such as the Pilot Field Experiment (PiFEx) conducted by JPL. This method uses the UNIX operating system and C programming language. The data management system is implemented on a VAX minicomputer. The system automatically divides the large data file housing data from various experiments under a predetermined format into various individual files containing data from each experiment. The system also has a number of programs written in C and FORTRAN languages to allow the researcher to obtain meaningful quantities from the data at hand.

  17. RIPS: a UNIX-based reference information program for scientists.

    PubMed

    Klyce, S D; Rózsa, A J

    1983-09-01

    A set of programs is described which implement a personal reference management and information retrieval system on a UNIX-based minicomputer. The system operates in a multiuser configuration with a host of user-friendly utilities that assist entry of reference material, its retrieval, and formatted printing for associated tasks. A search command language was developed without restriction in keyword vocabulary, number of keywords, or level of parenthetical expression nesting. The system is readily transported, and by design is applicable to any academic specialty.
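    The abstract does not give the grammar of the RIPS search command language; as an illustration of how unrestricted keyword vocabulary and arbitrary parenthetical nesting can be supported, here is a minimal recursive-descent evaluator. The AND/OR/NOT syntax and all names are my own assumptions, not taken from RIPS:

```python
import re

def search(query, keywords):
    """Evaluate a boolean keyword query (AND/OR/NOT, arbitrary nesting)
    against the set of lowercase keywords attached to one reference."""
    tokens = re.findall(r"\(|\)|\w+", query)
    pos = 0

    def parse_or():
        nonlocal pos
        val = parse_and()
        while pos < len(tokens) and tokens[pos].upper() == "OR":
            pos += 1
            rhs = parse_and()   # evaluate before combining so tokens are consumed
            val = val or rhs
        return val

    def parse_and():
        nonlocal pos
        val = parse_not()
        while pos < len(tokens) and tokens[pos].upper() == "AND":
            pos += 1
            rhs = parse_not()
            val = val and rhs
        return val

    def parse_not():
        nonlocal pos
        if pos < len(tokens) and tokens[pos].upper() == "NOT":
            pos += 1
            return not parse_not()
        return parse_atom()

    def parse_atom():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        if tok == "(":
            val = parse_or()
            pos += 1  # consume the matching ")"
            return val
        return tok.lower() in keywords

    return parse_or()
```

    For example, `search("(cornea AND wound) OR healing", {"cornea", "wound"})` evaluates to True. Because each parenthesis simply recurses into `parse_or`, nesting depth is unbounded, which is the property the abstract highlights.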

  18. An improved version of the table look-up algorithm for pattern recognition. [for MSS data processing]

    NASA Technical Reports Server (NTRS)

    Eppler, W. G.

    1974-01-01

    The table look-up approach to pattern recognition has been used for 3 years at several research centers in a variety of applications. A new version has been developed which is faster, requires significantly less core memory, and retains full precision of the input data. The new version can be used on low-cost minicomputers having 32K words (16 bits each) of core memory and fixed-point arithmetic; no special-purpose hardware is required. An initial FORTRAN version of this system can classify an ERTS computer-compatible tape into 24 classes in less than 15 minutes.
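    The abstract describes the approach only at a high level; the core idea, trading a one-time table construction for constant-time per-pixel classification, can be sketched as follows. The 4-bit quantization, the nearest-class-mean fill rule, and all names are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

BITS = 4  # quantization bits per band; trades table size against precision

def build_table(train_pixels, train_labels, n_bands):
    """One-time table construction: assign a class to every possible
    quantized feature vector by nearest class mean (an illustrative rule)."""
    classes = np.unique(train_labels)
    means = np.array([train_pixels[train_labels == c].mean(axis=0)
                      for c in classes])
    table = np.empty((2 ** BITS,) * n_bands, dtype=np.int16)
    for idx in np.ndindex(table.shape):
        center = (np.array(idx) + 0.5) * (256 / 2 ** BITS)  # cell center, 0..255
        table[idx] = np.argmin(((means - center) ** 2).sum(axis=1))
    return table

def classify(pixels, table):
    """Per-pixel classification is then just quantize-and-index."""
    q = (pixels >> (8 - BITS)).astype(np.intp)  # 8-bit counts -> BITS-bit codes
    return table[tuple(q.T)]
```

    With 4 bands at 4 bits each the table has 65,536 entries, which fits comfortably in the 32K-word memory budget the abstract cites; the per-pixel work is a shift and an array index, with no floating-point arithmetic.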

  19. Distributed processing techniques: interface design for interactive information sharing.

    PubMed

    Wagner, J R; Krumbholz, S D; Silber, L K; Aniello, A J

    1978-01-01

    The Information Systems Division of the University of Iowa Hospitals and Clinics has successfully designed and implemented a set of generalized interface data-handling routines that control message traffic between a satellite minicomputer in a clinical laboratory and a large main-frame computer. A special queue status inquiry transaction has also been developed that displays the current message-processing backlog and other system performance information. The design and operation of these programs are discussed in detail, with special emphasis on the message-queuing and verification techniques required in a distributed processing environment.

  20. APSAS; an Automated Particle Size Analysis System

    USGS Publications Warehouse

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX(HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
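    The method-of-moments statistics mentioned here are standard in grain-size work; a minimal sketch on a phi-scale distribution, taking class midpoints and weight percents as inputs (function and variable names are mine, not APSAS's):

```python
import math

def moment_statistics(midpoints_phi, weight_pct):
    """Method-of-moments grain-size statistics on the phi scale.
    midpoints_phi: size-class midpoints (phi units); weight_pct: percent per class."""
    total = sum(weight_pct)
    mean = sum(f * m for f, m in zip(weight_pct, midpoints_phi)) / total
    var = sum(f * (m - mean) ** 2 for f, m in zip(weight_pct, midpoints_phi)) / total
    sorting = math.sqrt(var)  # standard deviation, called "sorting" in sediment work
    skewness = (sum(f * (m - mean) ** 3 for f, m in zip(weight_pct, midpoints_phi))
                / (total * sorting ** 3))
    return mean, sorting, skewness
```

    A symmetric distribution returns zero skewness; the verbal sediment classification the abstract mentions would then be keyed off these numbers.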

  1. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine by using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated by using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.

  2. Evaluation of initial collector field performance at the Langley Solar Building Test Facility

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.; Jensen, R. N.; Knoll, R. H.

    1977-01-01

    The thermal performance of the solar collector field for the NASA Langley Solar Building Test Facility is given for October 1976 through January 1977. A 1,180 square meter solar collector field with seven collector designs helped to provide hot water for the building heating system and absorption air conditioner. The collectors were arranged in 12 rows with nominally 51 collectors per row. Heat transfer rates for each row were calculated and recorded along with sensor, insolation, and weather data every five minutes using a minicomputer. The agreement between the experimental and predicted collector efficiencies was generally within five percentage points.

  3. Implementation of the Integrated Library System: University of Maryland Health Sciences Library.

    PubMed

    Feng, C C; Freiburger, G; Knudsen, P C

    1983-07-01

    The Health Sciences Library, University of Maryland, has implemented the Integrated Library System (ILS), a minicomputer-based library automation system developed by the Lister Hill National Center for Biomedical Communications, National Library of Medicine. The process of moving a library from a manual to a computerized system required comprehensive planning and strong commitment by the staff. Implementation activities included hardware and software modification, conversion of manual files, staff training, and system publicity. ILS implementation resulted in major changes in procedures in the circulation, reference, and cataloging departments. PMID:6688748

  4. A computer system to analyze showers in nuclear emulsions: Center Director's discretionary fund report

    NASA Technical Reports Server (NTRS)

    Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.

    1987-01-01

    A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions through a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.

  5. Some snippets of history

    NASA Astrophysics Data System (ADS)

    West, Richard; Breysacher, Jacques; Laustsen, Svend; Hofstadt, Daniel; Swings, Jean-Pierre; Enard, Daniel; Moorwood, Alan; Nees, Walter; Wilson, Ray; Benvenuti, Piero; Cesarsky, Catherine; Glindemann, Andreas

    2002-09-01

    Memories of early times at ESO; Early days of the OPC; How ESO got its Optics Group; Renata Scotto at La Silla; La Silla vaut bien une Messe; First experience at La Silla, and some activities for the VLT; The early days of instrumentation at ESO; The early days of infrared instrumentation at ESO; ESO's first step into the world of minicomputers; First Astronomical Light at the NTT; Recovery of a historical document; First Light of UT4 (from The Messenger No. 101, Sept. 2000); First Fringes with ANTU and MELIPAL (from The Messenger No. 106, Dec. 2001)

  6. Applications of intelligent-measurement systems in controlled-fusion research

    SciTech Connect

    Owen, E.W.; Shimer, D.W.; Lindquist, W.B.; Peterson, R.L.; Wyman, R.H.

    1981-06-22

    The paper describes the control and instrumentation for the Mirror Fusion Test Facility at the Lawrence Livermore National Laboratory, California, USA. This large-scale scientific experiment in controlled thermonuclear fusion, which is currently being expanded, originally had 3000 devices to control and 7000 sensors to monitor. A hierarchical computer control system is used, with nine minicomputers forming the supervisory system. There are approximately 55 local control and instrumentation microcomputers. In addition, each device has its own monitoring equipment, which in some cases consists of a small computer. After describing the overall system, a more detailed account is given of the control and instrumentation for two large superconducting magnets.

  7. Electromagnetic-acoustic-transducer synthetic-aperture system for thick-weld inspection

    NASA Astrophysics Data System (ADS)

    Fortunko, C. M.; Schramm, R. E.; Moulder, J. C.; McColskey, J. D.

    1984-05-01

    A system is described based on electromagnetic acoustic transducers (EMATs) as an approach to automated nondestructive evaluation of thick weldments. Applications include a new type of ultrasonic inspection system for thick butt welds used in ship construction. A minicomputer controlled transducer positioning and acquired the digitized ultrasonic waveforms for synthetic-aperture processing. The synthetic-aperture technique further improved signal quality and yielded flaw localization through the weld thickness. Details include the design of the transducers and electronics, as well as the mechanical positioner, signal-processing algorithms, and complete computer program listings.

  8. The Earth Resources Laboratory Applications Software (ELAS) in university research and education: An operator oriented geobased information system

    NASA Technical Reports Server (NTRS)

    Coker, B. L.; Kind, T. C.; Smith, W. F., Jr.; Weber, N. V.

    1981-01-01

    Created for analyzing and processing digital data such as that collected by multispectral scanners or digitized from maps, ELAS is designed for ease of user operation and includes its own FORTRAN operating monitor and an expandable set of application modules which are FORTRAN overlays. On those machines that do not support FORTRAN overlaying, the modules exist as subprograms. The subsystem can be implemented on most 16-bit or 32-bit machines and is capable of, but not limited to, operating on low-cost minicomputer systems. The recommended hardware configuration for ELAS and a representative listing of some operating and application modules are presented.

  9. Optical instrumentation engineering in science, technology and society; Proceedings of the Sixteenth Annual Technical Meeting, San Mateo, Calif., October 16-18, 1972

    NASA Technical Reports Server (NTRS)

    Katz, Y. H.

    1973-01-01

    Visual tracking performance in instrumentation is discussed together with photographic pyrometry in an aeroballistic range, optical characteristics of spherical vapor bubbles in liquids, and the automatic detection and control of surface roughness by coherent diffraction patterns. Other subjects explored are related to instruments, sensors, systems, holography, and pattern recognition. Questions of data handling are also investigated, taking into account minicomputer image storage for holographic interferometry analysis, the design of a video amplifier for a 90 MHz bandwidth, and autostereoscopic screens. Individual items are announced in this issue.

  10. Close to real life. [solving for transonic flow about lifting airfoils using supercomputers]

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Bailey, F. Ron

    1988-01-01

    NASA's Numerical Aerodynamic Simulation (NAS) facility for CFD modeling of highly complex aerodynamic flows employs as its basic hardware two Cray-2s, an ETA-10 Model Q, an Amdahl 5880 mainframe computer that furnishes both support processing and access to 300 Gbytes of disk storage, several minicomputers and superminicomputers, and a Thinking Machines 16,000-device 'connection machine' processor. NAS, which was the first supercomputer facility to standardize operating-system and communication software on all processors, has performed important Space Shuttle aerodynamics simulations and will be critical to the configurational refinement of the National Aerospace Plane and its integrated powerplant, which will involve complex, high-temperature reactive gasdynamic computations.

  11. High resolution color raster computer animation of space filling molecular models

    SciTech Connect

    Max, N.L.

    1981-01-01

    The ATOMLLL system efficiently produces realistic photographs of ball-and-stick or space-filling molecular models, with color shading, highlights, shadows, and transparency. The hidden surface problem for a scene composed of intersecting spheres and cylinders is solved on a CDC-7600, which outputs onto magnetic tape the outlines of the visible parts of each object. The outlines are then rendered, at up to 4096 x 4096 resolution, by a Dicomed D-48 color film recorder, controlled by a Varian V-75 minicomputer. The Varian computes the shading and highlights for each pixel in a fast microcoded loop. Recent modifications to give shadows and transparency are described.

  12. ECG-gated emission computed tomography of the cardiac blood pool

    SciTech Connect

    Moore, M.L.; Murphy, P.H.; Burdine, J.A.

    1980-01-01

    ECG-gated cross-sectional images of the cardiac blood pool were produced using a specially constructed emission computed tomographic scanner. A pair of large-field-of-view cameras were mounted in opposition in a gantry that rotates 360° about the patient. The coordinates of each detected event, the output of a physiological synchronizer, and the position of the camera heads were input to a dedicated minicomputer which was used to produce the images. Display as a movie permitted evaluation of regional and global wall motion in cross section without the disadvantages of superimposed blood pools as obtained in nontomographic views.

  13. Use Of Infrared Imagery In Continuous Flow Wind Tunnels

    NASA Astrophysics Data System (ADS)

    Stallings, D. W.; Whetsel, R. G.

    1983-03-01

    Thermal mapping with infrared imagery is a very useful test technique in continuous flow wind tunnels. Convective-heating patterns over large areas of a model can be obtained through remote sensing of the surface temperature. A system has been developed at AEDC which uses a commercially available infrared scanning camera to produce these heat-transfer maps. In addition to the camera, the system includes video monitors, an analog tape recorder, an analog-to-digital converter, a digitizer control, and two minicomputers. This paper describes the individual components, data reduction techniques, and typical applications.

  14. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan for the integrated testing and benchmarking of the Phase 1 Ada-based ESBT Design Research Project is described. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX minicomputers and VAXstations running the VMS operating system.

  15. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transform analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.

  16. Guide to buying and using energy-efficient office equipment

    SciTech Connect

    1995-12-31

    The introduction to this booklet notes the rise in office electrical energy consumption due to the increasing use of electrical office equipment, and the impact of this consumption on the environment. The booklet then strives to introduce the criterion of energy efficiency as something to consider when purchasing office equipment. It covers the following types of equipment: Computers and displays (but not mainframes, minicomputers or high-end workstations), printers, photocopiers, facsimile machines, modern information and telecommunications technologies, and other office machines and appliances such as scanners, task lighting, and voice mail systems. Power management systems, conserving paper, and efficient use of office equipment are also discussed.

  17. Integrating the university medical center. Phase one: providing an information backbone.

    PubMed Central

    Berry, S. J.; Reber, E.; Offeman, W. E.

    1991-01-01

    UCLA School of Medicine represents a diverse computing community where the creation of each individual network has been driven by applications, price/performance, and functionality. Indeed, the ability to connect to other computers has had no bearing on selection. Yet there exists a need to seamlessly connect the individual networks to other minicomputers, mainframes, and remote computers. We have created a school-wide backbone network that enables an individual at a single workstation to access a wide variety of services residing on any number of machines. PMID:1807658

  18. I-BIEM, an iterative boundary integral equation method for computer solutions of current distribution problems with complex boundaries: A new algorithm. I - Theoretical

    NASA Technical Reports Server (NTRS)

    Cahan, B. D.; Scherson, Daniel; Reid, Margaret A.

    1988-01-01

    A new algorithm for an iterative computation of solutions of Laplace's or Poisson's equations in two dimensions, using Green's second identity, is presented. This algorithm converges strongly and geometrically and can be applied to curved, irregular, or moving boundaries with nonlinear and/or discontinuous boundary conditions. It has been implemented in Pascal on a number of micro- and minicomputers and applied to several geometries. Cases with known analytic solutions have been tested. Convergence to within 0.1 percent to 0.01 percent of the theoretical values are obtained in a few minutes on a microcomputer.
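    The boundary-integral algorithm itself is not reproduced here; as a much simpler illustration of the kind of check the abstract describes, iterating a solver for Laplace's equation to convergence and comparing against a known analytic solution, here is a Jacobi finite-difference sketch. This is a stand-in method, not I-BIEM, and all names are mine:

```python
import numpy as np

def solve_laplace(u, iters=5000):
    """Jacobi relaxation for Laplace's equation on a square grid.
    The boundary of u holds fixed values; the interior is relaxed.
    (A finite-difference stand-in, not the paper's boundary-integral scheme.)"""
    u = u.copy()
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:])
    return u

# Check against a known analytic harmonic function, u(x, y) = x,
# on the unit square -- the style of validation the abstract reports.
n = 21
x = np.linspace(0.0, 1.0, n)
u0 = np.zeros((n, n))
u0[0, :] = x       # top and bottom edges carry u = x
u0[-1, :] = x
u0[:, 0] = 0.0     # left edge: x = 0
u0[:, -1] = 1.0    # right edge: x = 1
u = solve_laplace(u0)  # interior converges to u[i, j] = x[j]
```

    Because u = x is harmonic and exactly representable by the 5-point stencil, the iterate converges to it to machine precision, which makes this a clean analytic test case.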

  19. Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data

    NASA Technical Reports Server (NTRS)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

    Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.

  20. Distributed data base systems with special emphasis toward POREL

    NASA Technical Reports Server (NTRS)

    Neuhold, E. J.

    1984-01-01

    In the last few years a number of research and advanced development projects have resulted in distributed data base management prototypes. POREL, developed at the University of Stuttgart, is a multiuser, distributed, relational system developed for wide and local area networks of minicomputers and advanced micros. The general objectives of such data base systems and the architecture of POREL are discussed. In addition, a comparison of some of the existing distributed DBMSs is included to provide the reader with information about the current state of the art.

  1. System of Programmed Modules for Measuring Photographs with a Gamma-Telescope

    NASA Technical Reports Server (NTRS)

    Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.

    1978-01-01

    Physical experiments using tracking cameras have produced hundreds of thousands of stereo photographs of events. Processing such a large volume of information requires automatic and semiautomatic measuring systems. At the Institute of Space Research of the Academy of Sciences of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector in line with the Elektronika 1001 minicomputer. The report describes this system and discusses the various computer programs available to the operators.

  2. User's operating procedures. Volume 3: Projects directorate information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is presented. SPADS is the result of the past seven years of software development on a Prime minicomputer. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, three of three, provides the instructions to operate the projects directorate information programs in data retrieval and file maintenance via the user-friendly menu drivers.

  3. User's operating procedures. Volume 1: Scout project information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is given. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. The instructions to operate the Scout Project Information programs in data retrieval and file maintenance via the user-friendly menu drivers are presented.

  4. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  5. Gait Analysis Laboratory

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  6. Registration of Heat Capacity Mapping Mission day and night images

    NASA Technical Reports Server (NTRS)

    Watson, K.; Hummer-Miller, S.; Sawatzky, D. L. (Principal Investigator)

    1982-01-01

    Neither iterative registration, using drainage intersection maps for control, nor cross correlation techniques were satisfactory in registering day and night HCMM imagery. A procedure was developed which registers the image pairs by selecting control points and mapping the night thermal image to the daytime thermal and reflectance images using an affine transformation on a 1300 by 1100 pixel image. The resulting image registration is accurate to better than two pixels (RMS) and does not exhibit the significant misregistration that was noted in the temperature-difference and thermal-inertia products supplied by NASA. The affine transformation was determined using simple matrix arithmetic, a step that can be performed rapidly on a minicomputer.
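    The affine mapping from selected control points reduces, as the abstract says, to simple matrix arithmetic; a sketch of the least-squares solve in NumPy (the original ran on an early-1980s minicomputer, so the NumPy formulation and all names here are mine):

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares 6-parameter affine transform mapping control points in the
    night image (src) onto their counterparts in the day image (dst)."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # design matrix with a homogeneous column for the translation terms
    G = np.hstack([src, np.ones((len(src), 1))])
    params, *_ = np.linalg.lstsq(G, dst, rcond=None)  # shape (3, 2)
    return params

def apply_affine(params, pts):
    """Map points through the fitted transform: dst ≈ [pts | 1] @ params."""
    pts = np.asarray(pts, dtype=float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params
```

    Three non-collinear control points determine the transform exactly; more points give a least-squares fit, which is what keeps the registration error below the two-pixel RMS figure when individual control points are noisy.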

  7. Evaluation of initial collector field performance at the Langley Solar Building Test Facility

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.; Knoll, R. H.; Jensen, R. N.

    1977-01-01

    The thermal performance of the solar collector field for the NASA Langley Solar Building Test Facility is given for October 1976 through January 1977. An 1180 square meter solar collector field with seven collector designs helped to provide hot water for the building heating system and absorption air conditioner. The collectors were arranged in 12 rows with nominally 51 collectors per row. Heat transfer rates for each row are calculated and recorded along with sensor, insolation, and weather data every 5 minutes using a mini-computer. The agreement between the experimental and predicted collector efficiencies was generally within five percentage points.

  8. User's operating procedures. Volume 2: Scout project financial analysis program

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  9. Application of image processing techniques to fluid flow data analysis

    NASA Technical Reports Server (NTRS)

    Giamati, C. C.

    1981-01-01

    The application of color coding techniques used in processing remote sensing imagery to analyze and display fluid flow data is discussed. A minicomputer-based color film recording and color CRT display system is described. High-quality, high-resolution images of two-dimensional data are produced on the film recorder. Three-dimensional data, in large volume, are used to generate color motion pictures in which time is used to represent the third dimension. Several applications and examples are presented. System hardware and software are described.

  10. Consolidation of data base for Army generalized missile model

    NASA Technical Reports Server (NTRS)

    Klenke, D. J.; Hemsch, M. J.

    1980-01-01

    Data from plume interaction tests, nose mounted canard configuration tests, and high angle of attack tests on the Army Generalized Missile model are consolidated in a computer program which makes them readily accessible for plotting, listing, and evaluation. The program is written in FORTRAN and will run on an ordinary minicomputer. It has the capability of retrieving any coefficient from the existing DATAMAN tapes and displaying it in tabular or plotted form. Comparisons of data taken in several wind tunnels and of data with the predictions of Program MISSILE2 are also presented.

  11. Alternatives in the complement and structure of NASA teleprocessing resources

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results are presented of a program to identify technical innovations which would have an impact on NASA data processing and describe as fully as possible the development work necessary to exploit them. Seven of these options for NASA development, as the opportunities to participate in and enhance the advancing information system technology were called, are reported. A detailed treatment is given of three of the options, involving minicomputers, mass storage devices and software development techniques. These areas were picked by NASA as having the most potential for improving their operations.

  12. MORPH-I (Ver 1.0) a software package for the analysis of scanning electron micrograph (binary formatted) images for the assessment of the fractal dimension of enclosed pore surfaces

    USGS Publications Warehouse

    Mossotti, Victor G.; Eldeeb, A. Raouf; Oscarson, Robert

    1998-01-01

    MORPH-I is a set of C-language computer programs for the IBM PC and compatible microcomputers. The programs in MORPH-I are used for the fractal analysis of scanning electron microscope and electron microprobe images of pore profiles exposed in cross-section. The program isolates and traces the cross-sectional profiles of exposed pores and computes the Richardson fractal dimension for each pore. Other programs in the set provide for image calibration, display, and statistical analysis of the computed dimensions for highly complex porous materials. Requirements: IBM PC or compatible; minimum 640 K RAM; math coprocessor; SVGA graphics board providing mode 103 display.
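    MORPH-I's source is not shown here; the Richardson (divider) method it applies to each traced pore profile can be sketched as follows. This is a textbook version with my own names, not the program's C code:

```python
import numpy as np

def divider_length(points, step):
    """Walk an ordered pore-profile polyline with a fixed 'divider' opening;
    the estimated length is (number of steps taken) * step."""
    current = points[0]
    n_steps = 0
    for p in points[1:]:
        if np.hypot(*(p - current)) >= step:
            current = p
            n_steps += 1
    return n_steps * step

def richardson_dimension(points, steps):
    """Richardson plot: the slope of log(length) vs log(step) is 1 - D,
    so a fractal boundary's measured length grows as the divider shrinks."""
    lengths = [divider_length(points, g) for g in steps]
    slope = np.polyfit(np.log(steps), np.log(lengths), 1)[0]
    return 1.0 - slope
```

    For a smooth (non-fractal) profile the estimate approaches D = 1; rougher pore outlines give dimensions between 1 and 2.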

  13. A Satellite Frost Forecasting System for Florida

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D.

    1981-01-01

    Since the first of two minicomputers that are the main components of the satellite frost forecast system was delivered in 1977, the system has evolved appreciably. A geostationary operational environmental satellite (GOES) system provides the satellite data. The freeze of January 12-14, 1981, was documented, generating increased interest in the potential of such systems. Satellite data are now acquired digitally rather than by redigitizing the GOES-Tap transmissions. Data acquisition is now automated, i.e., the computers are programmed to operate the system with little, if any, operator intervention.

  14. Computers, medical care and privacy.

    PubMed

    Fresse, J

    1985-01-01

    This paper describes Physician Actuated Computerized Treatment (PACT), which provides paperless Medical Office Management (MOM) (1). Software, hardware, and physician are fused to produce an on-line database medical management system containing medical records, clerical functions, and bookkeeping. PACT, developed in the 1980s, was financed entirely by private physicians in a working clinical environment. MOM operates on a minicomputer with a minimum of a 10 MB hard disk and 16K of memory. Maximum system design is a function of cost and total desired on-line storage. User-friendly screens can prompt the operator in English, Spanish, French, German, and Italian. Data entry is in the operator's native language.

  15. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, Berthold W. [Livermore, CA; Willenborg, David L. [Livermore, CA

    1980-02-12

    A manipulator which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, and allows inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This rolls the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a minicomputer which can be programmed to generate any desired scan pattern.

  16. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, B.W.; Willenborg, D.L.

    1980-02-12

    A manipulator is disclosed which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, and allows inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This rolls the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a minicomputer which can be programmed to generate any desired scan pattern. 8 figs.

  17. Computerized nuclear material system at Sandia National Laboratories

    SciTech Connect

    Tischhauser, J.L.

    1980-01-01

    SNLA developed and implemented a nuclear material control and accountability system on an HP 3000 minicomputer. The Sandia Nuclear Materials Computer System (SNMCS), which became operative in January 1980, provides: control of shipments and receipts of nuclear material, control of internal transfers of nuclear material, automated inventory with a bar code system, control of inventory adjustments, automated reporting/transmitting to other contractors and operations offices, automated ledgers and journals for material weights and costs, and an interface to the Albuquerque Operations Office (ALO) Automated 741 System.

  18. Operator Station Design System - A computer aided design approach to work station layout

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.

    1979-01-01

    The Operator Station Design System is resident in NASA's Johnson Space Center Spacecraft Design Division Performance Laboratory. It includes stand-alone minicomputer hardware and Panel Layout Automated Interactive Design and Crew Station Assessment of Reach software. The data base consists of the Shuttle Transportation System Orbiter Crew Compartment (in part), the Orbiter payload bay and remote manipulator (in part), and various anthropometric populations. The system is utilized to provide panel layouts, assess reach and vision, determine interference and fit problems early in the design phase, study design applications as a function of anthropometric and mission requirements, and to accomplish conceptual design to support advanced study efforts.

  19. Computers for artificial intelligence a technology assessment and forecast

    SciTech Connect

    Miller, R.K.

    1986-01-01

    This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

  20. An experimental study of a hybrid adaptive control system

    NASA Technical Reports Server (NTRS)

    Lizewski, E. F.; Monopoli, R. V.

    1974-01-01

    A Liapunov type model reference adaptive control system with five adjustable gains is implemented using a PDP-11 digital computer and an EAI 380 analog computer. The plant controlled is a laboratory type dc servo system. It is made to follow closely a second order linear model. The experimental results demonstrate the feasibility of implementing this rather complex design using only a minicomputer and a reasonable number of operational amplifiers. They also show that satisfactory performance can be achieved even when certain assumptions required by the theory are not satisfied.
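The abstract omits the controller details; a minimal first-order sketch of a Lyapunov-rule model reference adaptive controller (the plant, model, and gains below are invented for illustration, not the paper's five-gain dc-servo design) shows the mechanism:

```python
# Minimal Lyapunov-rule model-reference adaptive control sketch.
# Plant:  x'  = a*x  + b*u     (a, b assumed here, b > 0)
# Model:  xm' = am*xm + bm*r   (desired closed-loop behaviour)
# Law:    u = th_r*r + th_x*x, with th_r' = -g*e*r, th_x' = -g*e*x
a, b = -1.0, 0.5          # plant parameters, unknown to the controller
am, bm = -2.0, 2.0        # reference model
g = 5.0                   # adaptation gain
dt, steps = 0.001, 20000  # Euler integration over 20 s
x = xm = th_r = th_x = 0.0
r = 1.0                   # constant reference input
for _ in range(steps):
    u = th_r * r + th_x * x
    e = x - xm                    # tracking error
    x += dt * (a * x + b * u)
    xm += dt * (am * xm + bm * r)
    th_r += dt * (-g * e * r)     # Lyapunov adaptation (sign for b > 0)
    th_x += dt * (-g * e * x)
print(abs(x - xm))  # tracking error after adaptation: small
```

With a constant reference, the Lyapunov argument guarantees the tracking error decays to zero even though the adapted gains need not converge to their ideal values; this is the "close following of the model" the experiment verified in hardware.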

  1. A convenient and adaptable package of computer programs for DNA and protein sequence management, analysis and homology determination.

    PubMed Central

    Pustell, J; Kafatos, F C

    1984-01-01

    We describe the further development of a widely used package of DNA/protein sequence analysis programs (1). Important revisions have been made based on user experience, and new features, multi-user capability, and a set of large scale homology programs have been added. The programs are very user friendly, economical of time and memory, and extremely transportable. They are written in a version of FORTRAN which will compile, with a few defined changes, as FORTRAN 66, FORTRAN 77, FORTRAN IV, FORTRAN IV+, and others. They are running on a variety of microcomputers, minicomputers, and mainframes, in both single user and multi-user configurations. PMID:6320100

  2. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  3. U. S. GEOLOGICAL SURVEY'S NATIONAL REAL-TIME HYDROLOGIC INFORMATION SYSTEM USING GOES SATELLITE TECHNOLOGY.

    USGS Publications Warehouse

    Shope, William G.

    1987-01-01

    The U. S. Geological Survey maintains the basic hydrologic data collection system for the United States. The Survey is upgrading the collection system with electronic communications technologies that acquire, telemeter, process, and disseminate hydrologic data in near real-time. These technologies include satellite communications via the Geostationary Operational Environmental Satellite, Data Collection Platforms in operation at over 1400 Survey gaging stations, Direct-Readout Ground Stations at nine Survey District Offices, and a network of powerful minicomputers that allows data to be processed and disseminated quickly.

  4. Refractive index and absorption detector for liquid chromatography based on Fabry-Perot interferometry

    DOEpatents

    Yeung, E.S.; Woodruff, S.D.

    1984-06-19

    A refractive index and absorption detector is disclosed for liquid chromatography. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow-cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a minicomputer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, and then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded. 10 figs.

  5. Refractive index and absorption detector for liquid chromatography based on Fabry-Perot interferometry

    DOEpatents

    Yeung, Edward S.; Woodruff, Steven D.

    1984-06-19

    A refractive index and absorption detector for liquid chromatography is described. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow-cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a minicomputer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, and then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded.
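The patent abstract does not state the fringe-to-index arithmetic; a common approximation for a double-pass cell of length L is Δn = Nλ/(2L), where N is the counted fringe shift. A toy calculation under that assumption (the numbers are hypothetical, not from the patent):

```python
def delta_n(fringes, wavelength_m, cell_length_m):
    """Refractive-index change inferred from a counted fringe shift,
    assuming a double pass through the flow cell: dn = N * lambda / (2 L)."""
    return fringes * wavelength_m / (2.0 * cell_length_m)

# 100 fringes of a 632.8 nm He-Ne line across a 1 cm cell:
print(delta_n(100, 632.8e-9, 1e-2))  # ≈ 3.164e-3
```

The minicomputer's job in the patent is essentially this arithmetic performed continuously on the scanned fringe data, with the result streamed to the chart recorder.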

  6. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  7. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there has been a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the work load on the central resources has increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  8. Technology innovation and management in the US Bureau of the Census: Discussion and recommendations

    SciTech Connect

    Tonn, B.; Edwards, R.; Goeltz, R.; Hake, K.

    1990-09-01

    This report contains a set of recommendations prepared by Oak Ridge National Laboratory (ORNL) for the US Bureau of the Census pertaining to technology innovation and management. Technology has the potential to benefit the Bureau's data collection, capture, processing, and analysis activities. The entire Bureau was represented, from Decennial Census to Economic Programs, along with various levels of Bureau management and numerous experts in technology. Throughout the Bureau, workstations, minicomputers, and microcomputers have found their place alongside the Bureau's mainframes. The Bureau's new computer file structure, called the Topologically Integrated Geographic Encoding and Referencing data base (TIGER), represents a major innovation in geographic information systems, and impressive progress has been made with Computer Assisted Telephone Interviewing (CATI). Other innovations, such as SPRING, which aims to provide Bureau demographic analysts with the capability of interactive data analysis on minicomputers, are in the initial stages of development. The recommendations fall into five independent, but mutually beneficial, categories: (1) disband the ADP Steering Committee and replace it with The Technology Forum; (2) establish a Technology Review Committee (TRC), composed of technology experts from outside the Bureau; (3) designate technological gurus, who will be the Bureau's experts in new and innovative technologies; (4) adopt a technology innovation process; and (5) establish an Advanced Technology Studies Staff (ATSS) to promote technology transfer, obtain funding for technological innovation, manage innovation projects unable to find a home in other divisions, evaluate innovations that cut across Bureau organizational boundaries, and provide input into Bureau technology analyses. (JF)

  9. Computer network that assists in the planning, execution and evaluation of in-reactor experiments

    SciTech Connect

    Bauer, T.H.; Froehle, P.H.; August, C.; Baldwin, R.D.; Johanson, E.W.; Kraimer, M.R.; Simms, R.; Klickman, A.E.

    1985-01-01

    For over 20 years complex, in-reactor experiments have been performed at Argonne National Laboratory (ANL) to investigate the performance of nuclear reactor fuel and to support the development of large computer codes that address questions of reactor safety in full-scale plants. Not only are computer codes an important end-product of the research, but computer analysis is also involved intimately at most stages of experiment planning, data reduction, and evaluation. For instance, many experiments are of sufficiently long duration or, if they are of brief duration, occur in such a purposeful sequence that need for speedy availability of on-line data is paramount. This is made possible most efficiently by computer assisted displays and evaluation. A purposeful linking of main-frame, mini, and micro computers has been effected over the past eight years which greatly enhances the speed with which experimental data are reduced to useful forms and applied to the relevant technological issues. This greater efficiency in data management led also to improvements in the planning and execution of subsequent experiments. Raw data from experiments performed at INEL is stored directly on disk and tape with the aid of minicomputers. Either during or shortly after an experiment, data may be transferred, via a direct link, to the Illinois offices of ANL where the data base is stored on a minicomputer system. This Idaho-to-Illinois link has both enhanced experiment performance and allowed rapid dissemination of results.

  10. A multiprocessor airborne lidar data system

    NASA Technical Reports Server (NTRS)

    Wright, C. W.; Bailey, S. A.; Heath, G. E.; Piazza, C. R.

    1988-01-01

    A new multiprocessor data acquisition system was developed for the existing Airborne Oceanographic Lidar (AOL). This implementation simultaneously utilizes five single board 68010 microcomputers, the UNIX system V operating system, and the real time executive VRTX. The original data acquisition system was implemented on a Hewlett Packard HP 21-MX 16 bit minicomputer using a multi-tasking real time operating system and a mixture of assembly and FORTRAN languages. The present collection of data sources produce data at widely varied rates and require varied amounts of burdensome real time processing and formatting. It was decided to replace the aging HP 21-MX minicomputer with a multiprocessor system. A new and flexible recording format was devised and implemented to accommodate the constantly changing sensor configuration. A central feature of this data system is the minimization of non-remote sensing bus traffic. Therefore, it is highly desirable that each micro be capable of functioning as much as possible on-card or via private peripherals. The bus is used primarily for the transfer of remote sensing data to or from the buffer queue.

  11. Quality control in a deterministic manufacturing environment

    SciTech Connect

    Barkman, W.E.; Babelay, E.F.; De Mint, P.D.; Lewis, J.C.; Woodard, L.M.

    1985-01-24

    An approach for establishing quality control in processes which exhibit undesired continual or intermittent excursions in key process parameters is discussed. The method is called deterministic manufacturing, and it is designed to employ automatic monitoring of the key process variables for process certification, but utilizes only sample certification of the process output to verify the validity of the measurement process. The system utilizes a local minicomputer to sample the appropriate process parameters that describe the condition of the machine tool, the cutting process, and the computer numerical control system. Sampled data are pre-processed by the minicomputer and then sent to a host computer that maintains a permanent data base describing the manufacturing conditions for each work piece. Parts are accepted if the various parameters remain within the required limits during the machining cycle. The need for additional actions is flagged if limits are exceeded. With this system it is possible to retrospectively examine the process status just prior to the occurrence of a problem. (LEW)
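The certification step described above amounts to checking each sampled parameter against its limits over the machining cycle and flagging excursions for further action. A minimal sketch of that logic; the parameter names and limits are invented for illustration:

```python
# Hypothetical limits for a few monitored process parameters
# (machine tool, cutting process, CNC system).
LIMITS = {
    "spindle_temp_C": (18.0, 24.0),
    "tool_vibration_um": (0.0, 5.0),
    "coolant_flow_lpm": (2.0, 8.0),
}

def certify(samples):
    """Accept the part if every sampled value stayed within limits
    during the machining cycle; otherwise flag the offending
    parameters for further action."""
    flagged = sorted({
        name
        for name, values in samples.items()
        for v in values
        if not (LIMITS[name][0] <= v <= LIMITS[name][1])
    })
    return ("accept" if not flagged else "flag"), flagged

status, bad = certify({
    "spindle_temp_C": [20.1, 20.3, 25.0],  # one excursion here
    "tool_vibration_um": [1.2, 1.4],
    "coolant_flow_lpm": [4.0, 4.1],
})
print(status, bad)  # → flag ['spindle_temp_C']
```

Because every sample is retained with the work-piece record, a flagged part can be examined retrospectively to see the process status just before the excursion, as the report describes.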

  12. Distributed information system (water fact sheet)

    USGS Publications Warehouse

    Harbaugh, A.W.

    1986-01-01

    During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management; hydrologic analysis; and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis consists of 40% of the computational workload of WRD. Cost accounting, payroll, personnel records, and planning for WRD programs occupies an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)

  13. From the genetic to the computer program: the historicity of 'data' and 'computation' in the investigations on the nematode worm C. elegans (1963-1998).

    PubMed

    García-Sancho, Miguel

    2012-03-01

    This paper argues that the history of the computer, of the practice of computation, and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers, which transformed the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology, in-house and easily programmable minicomputers, which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution.

  14. Digital system for structural dynamics simulation

    SciTech Connect

    Krauter, A.I.; Lagace, L.J.; Wojnar, M.K.; Glor, C.

    1982-11-01

    State-of-the-art digital hardware and software for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems), were incorporated in a system designed to use an array of processors in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communications busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interface of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  15. Coordination and establishment of centralized facilities and services for the University of Alaska ERTS survey of the Alaskan environment

    NASA Technical Reports Server (NTRS)

    Belon, A. E. (Principal Investigator)

    1972-01-01

    The author has identified the following significant results. Specifications have been prepared for the engineering design and construction of a digital color display unit which will be used for automatic processing of ERTS data. The color display unit is a disk refresh memory with computer-interfaced input and a color cathode ray tube output display. The system features both analog and digital post-disk data manipulation and a versatile color coding device suitable for displaying not only images, but also computer generated graphics such as diagrams, maps, and overlays. Input is from IBM-compatible 9-track, 800-BPI tapes, as generated by an IBM 360 computer. ERTS digital tapes are read into the 360, where various analyses such as maximum likelihood classification are performed, and the results are written on a magnetic tape which is the input to the color display unit. The greatest versatility in the data manipulation area is provided by the minicomputer built into the color display unit, which is off-line from the main 360 computer. The minicomputer is able to read any line from the refresh disk and place it in its 4K, 16-bit memory. Considerable flexibility is available for post-processing enhancement of images by the investigator.

  16. Data Processing and Analysis Systems for JT-60U

    SciTech Connect

    Matsuda, T.; Totsuka, T.; Tsugita, T.; Oshima, T.; Sakata, S.; Sato, M.; Iwasaki, K.

    2002-09-15

    The JT-60U data processing system is a large computer complex gradually modernized by utilizing progressive computer and network technology. A main computer using state-of-the-art CMOS technology can handle approximately 550 MB of data per discharge. A gigabit ethernet switch with FDDI ports has been introduced to cope with the increase in data handled. Workstation systems with VMEbus serial highway drivers for CAMAC have been developed and used to replace many minicomputer systems. VMEbus-based fast data acquisition systems have also been developed to enlarge and replace a minicomputer system for mass data. The JT-60U data analysis system is composed of a JT-60U database server and a JT-60U analysis server, which are distributed UNIX servers. The experimental database is stored on the 1 TB RAID disk of the JT-60U database server and is composed of the ZENKEI and diagnostic databases. Various data analysis tools are available on the JT-60U analysis server. For remote collaboration, technical features of the data analysis system have been applied to the computer system to access JT-60U data via the Internet. Remote participation in JT-60U experiments has been successfully conducted since 1996.

  17. Digital system for structural dynamics simulation

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.

    1982-01-01

    State-of-the-art digital hardware and software for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems), were incorporated in a system designed to use an array of processors in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communications busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interface of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  18. Development of an integrated control and measurement system

    SciTech Connect

    Manges, W.W.

    1984-03-01

    This thesis presents a tutorial on the issues involved in the development of a minicomputer-based, distributed intelligence data acquisition and process control system to support complex experimental facilities. The particular system discussed in this thesis is under development for the Atomic Vapor Laser Isotope Separation (AVLIS) Program at the Oak Ridge Gaseous Diffusion Plant (ORGDP). In the AVLIS program, we were careful to integrate the computer sections of the implementation into the instrumentation system rather than adding them as an appendage. We then addressed the reliability and availability of the system as a separate concern. Thus, our concept of an integrated control and measurement (ICAM) system forms the basis for this thesis. This thesis details the logic and philosophy that went into the development of this system and explains why the commercially available turn-key systems generally are not suitable. Also, the issues involved in the specification of the components for such an integrated system are emphasized.

  19. High Resolution Thermography In Medicine

    NASA Astrophysics Data System (ADS)

    Clark, R. P.; Goff, M. R.; Culley, J. E.

    1988-10-01

    A high resolution medical thermal imaging system using an 8-element SPRITE detector is described. Image processing is by an Intellect 100 processor and is controlled by a DEC LSI 11/23 minicomputer. Image storage is on a 170 Mbyte Winchester disc together with archival storage on 12-inch-diameter optical discs having a capacity of 1 Gbyte per side. The system is currently being evaluated for use in physiology and medicine. Applications outlined include the potential of thermographic screening to identify genetic carriers in X-linked hypohidrotic ectodermal dysplasia (XED), detailed vascular perfusion studies in health and disease, and the relationship between cutaneous blood flow, peripheral neurological function, and skin surface temperature.

  20. Computer-assisted Gran titration procedure for strong acid determination

    SciTech Connect

    Phillips, M.F.; Gaffney, J.S.; Goodrich, R.W.; Tanner, R.L.

    1984-10-01

    An automated method for determining, by coulometric titration, small amounts of strong acid in the presence of weak acids is given. Essentially, a pH meter and a coulometer are coupled with a Tektronix 4052 minicomputer, and a two-step computer program then directs the titration and calculates the equivalence point by the method of Gran. A comparison of precision and accuracy of results for test solutions by manual and automated data reduction methods is presented. The method is being used successfully to analyze for the H+ content in ambient aerosol samples from aerometric field experiments, and can be used for cloud and rainwater samples as well. 3 references, 1 figure, 1 table.
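The Gran linearization the program applies can be illustrated with synthetic data: before the equivalence point of a strong acid/strong base titration, F(V) = (V0 + V)·10^(-pH) is linear in titrant volume V, and its x-intercept is the equivalence volume. The sketch below uses a volumetric titration with invented concentrations for simplicity; the coulometric case replaces volume with delivered charge:

```python
import math

V0, Ca, Cb = 50.0, 0.01, 0.1  # mL initial, M acid, M titrant (assumed)

def ph(V):
    """Ideal strong-acid pH during titration, before equivalence."""
    h = (V0 * Ca - V * Cb) / (V0 + V)
    return -math.log10(h)

# Gran function F(V) = (V0 + V) * 10**(-pH), linear before equivalence.
vols = [0.5 * i for i in range(1, 9)]  # 0.5 .. 4.0 mL
F = [(V0 + V) * 10 ** (-ph(V)) for V in vols]

# Least-squares line through (V, F); equivalence volume where F = 0.
n = len(vols)
mx, my = sum(vols) / n, sum(F) / n
slope = sum((v - mx) * (f - my) for v, f in zip(vols, F)) / sum(
    (v - mx) ** 2 for v in vols
)
Ve = mx - my / slope  # x-intercept of the fitted line
print(round(Ve, 3))   # → 5.0 (= V0 * Ca / Cb for these numbers)
```

The linear extrapolation is what makes the method robust to weak acids: points near the equivalence region, where weak-acid buffering distorts the pH, can simply be excluded from the fit.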

  1. The Evolution of a Computerized Medical Information System

    PubMed Central

    Hammond, W. Ed; Stead, W. W.

    1986-01-01

    This paper presents the eighteen year history leading to the development of a computerized medical information system and discusses the factors which influenced its philosophy, design and implementation. This system, now called TMR, began as a single-user, tape-oriented minicomputer package and now exists as a multi-user, multi-database, multi-computer system capable of supporting a full range of users in both the inpatient and outpatient settings. The paper discusses why we did what we did, what worked, and what didn't work. Current projects are emphasized including networking and the integration of inpatient and outpatient functions into a single system. A theme of the paper is how hardware and software technological advancements, increasing sophistication of our users, our increasing experience, and just plain luck contributed to the success of TMR.

  2. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY redirects the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program capable of building data files on either machine. Program LEDITV is a powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print textual files on the line printer without character loss from FORTRAN carriage control or wide-record transfer.

  3. Dynamic response of damaged angleplied fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Lark, R. F.

    1979-01-01

    The effects of low level damage induced by monotonic load, cyclic load and/or residual stresses on the vibration frequencies and damping factors of fiber composite angleplied laminates were investigated. Two different composite systems were studied - low modulus fiber and ultra high modulus fiber composites. The results obtained show that the frequencies and damping factors of angleplied laminates made from low modulus fiber composites are sensitive to low level damage while those made from ultra high modulus composites are not. Vibration tests may not be sufficiently sensitive to assess concentrated local damage in angleplied laminates. Dynamic response determined from low-velocity impact coupled with the Fast Fourier Transform and packaged in a minicomputer can be a convenient procedure for assessing low-level damage.

  4. Culvert analysis program for indirect measurement of discharge

    USGS Publications Warehouse

    Fulford, Janice M.; ,

    1993-01-01

A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ the input data formats used by the water surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from measurements of upstream stage. It solves the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on minicomputers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.

  5. Clinical Protocol Information System

    PubMed Central

    Wirtschafter, David D.; Gams, Richard; Ferguson, Carol; Blackwell, William; Boackle, Paul

    1980-01-01

    The Clinical Protocol Information System (CPIS) supports the clinical research and patient care objectives of the SouthEastern Cancer Study Group (SEG). The information system goals are to improve the evaluability of clinical trials, decrease the frequency of adverse patient events, implement drug toxicity surveillance, improve the availability of study data and demonstrate the criteria for computer networks that can impact on the general medical care of the community. Nodes in the network consist of Data General MicroNova MP-100 minicomputers that drive the interactive data dialogue and communicate with the network concentrator (another DG MicroNova) in Birmingham. Functions supported include: source data editing, care “advice,” care “audit,” care “explanation,” and treatment note printing. The complete database is updated nightly and resides on UAB's IBM 370/158-AP.

  6. Numerical methods: Analytical benchmarking in transport theory

    SciTech Connect

    Ganapol, B.D. )

    1988-01-01

Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computers and the most widely used programs adapted to workstation and minicomputer environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered.

  7. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were put to use, and programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as is the software available for LANDSAT processing.

  8. Speech as a pilot input medium

    NASA Technical Reports Server (NTRS)

    Plummer, R. P.; Coler, C. R.

    1977-01-01

The speech recognition system under development is a trainable pattern classifier based on a maximum-likelihood technique. An adjustable uncertainty threshold allows the rejection of borderline cases for which the probability of misclassification is high. The syntax of the command language spoken may be used as an aid to recognition, and the system adapts to changes in pronunciation if feedback from the user is available. Words must be separated by 0.25-second gaps. The system runs in real time on a minicomputer (PDP 11/10) and was tested on 120,000 speech samples from 10- and 100-word vocabularies. The results of these tests were 99.9% correct recognition for a vocabulary consisting of the ten digits, and 99.6% recognition for a 100-word vocabulary of flight commands, with a 5% rejection rate in each case. With no rejection, the recognition accuracies for the same vocabularies were 99.5% and 98.6%, respectively.
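The rejection mechanism described, a maximum-likelihood classifier that refuses borderline samples, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the shared covariance, equal priors, and toy feature vectors are all simplifying assumptions.

```python
import numpy as np

def classify_with_rejection(x, means, cov_inv, threshold=0.9):
    """Gaussian maximum-likelihood classification with rejection.

    Each word class is modeled as a Gaussian (shared covariance,
    equal priors). The sample is assigned to the most likely class
    unless its posterior probability falls below `threshold`, in
    which case it is rejected as a borderline case.
    """
    diffs = means - x                              # (classes, features)
    maha = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    ll = -0.5 * maha                               # log-likelihood up to a constant
    post = np.exp(ll - ll.max())
    post /= post.sum()                             # posterior under equal priors
    best = int(np.argmax(post))
    return best if post[best] >= threshold else None

# Two toy "word" classes in a 2-D feature space.
means = np.array([[0.0, 0.0], [3.0, 3.0]])
cov_inv = np.eye(2)
print(classify_with_rejection(np.array([0.1, 0.2]), means, cov_inv))  # → 0
print(classify_with_rejection(np.array([1.5, 1.5]), means, cov_inv))  # → None
```

Raising `threshold` trades a higher rejection rate for fewer misclassifications, which matches the 5%-rejection versus no-rejection accuracies reported in the abstract.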

  9. Land use survey using remote sensing and geographical information systems

    NASA Astrophysics Data System (ADS)

    Suga, Yuzo

    1992-07-01

A hybrid system that integrates Remote Sensing (RS) data and Geographical Information Systems (GIS) information has been developed for land use survey in Hiroshima city. The system consists of three interrelated subsystems, i.e., a personal computer, a minicomputer and an engineering workstation. The system can handle an image data base consisting of satellite digital images such as Landsat TM and SPOT HRV data, a line map data base consisting of topography and land use zoning, and an updatable land use information data base consisting of raster and vector data such as remote sensing data and digital mapping data. This paper describes the implementation of the integration of multiple-sensor, multi-temporal remote sensing images with digital mapping data. The application of the system to a land use survey is discussed with respect to a method of extracting land use information based on remote sensing and geographical information systems.

  10. Proposed MIDAS II processing array

    SciTech Connect

    Meng, J.

    1982-03-01

    MIDAS (Modular Interactive Data Analysis System) is a ganged processor scheme used to interactively process large data bases occurring as a finite sequence of similar events. The existing device uses a system of eight ganged minicomputer central processor boards servicing a rotating group of 16 memory blocks. A proposal for MIDAS II, the successor to MIDAS, is to use a much larger number of ganged processors, one per memory block, avoiding the necessity of switching memories from processor to processor. To be economic, MIDAS II must use a small, relatively fast and inexpensive microprocessor, such as the TMS 9995. This paper analyzes the use of the TMS 9995 applied to the MIDAS II processing array, emphasizing computational, architectural and physical characteristics which make the use of the TMS 9995 attractive for this application.

  11. Advances in automatic extraction of information from multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.

    1975-01-01

    The state-of-the-art of automatic multispectral scanner data analysis and interpretation is reviewed. Sources of system variability which tend to obscure the spectral characteristics of the classes under consideration are discussed, and examples of the application of spatial and temporal discrimination bases are given. Automatic processing functions, techniques and methods, and equipment are described with particular attention to those that are applicable to large land surveys using satellite data. The development and characteristics of the Multivariate Interactive Digital Analysis System (MIDAS) for processing aircraft or satellite multispectral scanning data are discussed in detail. The MIDAS system combines the parallel digital implementation capabilities of a low-cost processor with a general purpose PDP-11/45 minicomputer to provide near-real-time data processing. The preprocessing functions are user-selectable. The input subsystem accepts data stored on high density digital tape, computer compatible tape, and analog tape.

  12. Programs for generating data tables for the annual water-resources data report of the U.S. Geological Survey

    USGS Publications Warehouse

    Mason, R.R.; Hill, C.L.

    1988-01-01

    The U.S. Geological Survey has developed software that interfaces with the Automated Data Processing System to facilitate and expedite preparation of the annual water-resources data report. This software incorporates a feature that prepares daily values tables and appends them to previously edited files containing station manuscripts. Other features collate the merged files with miscellaneous sections of the report. The report is then printed as page-size, camera-ready copy. All system components reside on a minicomputer; this provides easy access and use by remote field offices. Automation of the annual report preparation process results in significant savings of labor and cost. Use of the system for producing the 1986 annual report in the North Carolina District realized a labor savings of over two man-months. A fully implemented system would produce a greater savings and speed release of the report to users.

  13. Automation in photogrammetry: Recent developments and applications (1972-1976)

    USGS Publications Warehouse

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among the fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery; and the role of photogrammetric phases in a completely automated cartographic system. © 1976.

  14. Mood-congruent memory in daily life: evidence from interactive ambulatory monitoring.

    PubMed

    Loeffler, Simone N; Myrtek, Michael; Peper, Martin

    2013-05-01

    Evidence from the psychological laboratory indicates that emotional states tend to facilitate the encoding and retrieval of stimuli of the same emotional valence. To explore mood-congruent memory and the role of arousal in daily life, we applied a new interactive ambulatory technique. Psychophysiological arousal as indexed by non-metabolic heart rate, self-reported emotions and situational information were assessed during 24-h recordings in 70 healthy participants. The emotional state was used to trigger word list presentations on a minicomputer. Our results show that psychophysiological arousal at the time of encoding enhanced the recall of negative words in negative emotional conditions, whereas low psychophysiological arousal facilitated recall of positive words. In positive contexts, mood congruency was more prominent when arousal was low. These results demonstrate how automated experimentation with an ambulatory technique may help to assess emotional memory in real-world contexts, thus providing new methods for diverse fields of application.

  15. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32-bit workstations running the same software as large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  16. Microcomputers and neurobiology: a short review.

    PubMed

    Fraser, P J

    1985-12-01

A brief history of the application of computing techniques emphasizes a two-part development: expensive minicomputers available in a few laboratories, later augmented by ubiquitously available inexpensive microcomputers. Computers are used for microscope control and plotting, serial section reconstruction, morphometric measurement, stereology, video image analysis, photometry and fluorescence microscopy. Basic principles are exemplified by considering nerve cell reconstruction. General principles of computerized electrical measurement, including filtering, averaging and stimulus generation, are discussed. Computerized waveform selection as used for spike discrimination, considered along with computer control of electrode position and the growing availability of multichannel recording arrays, suggests a possible advance in automatic analyses. With the ability to process more complex waveforms successfully, electrophysiological data such as compound extracellular potentials may usefully replace the cleaner, but more limited, intracellular data. Success with multichannel feedback-controlled stimulators making paraplegics stand and walk points to a developing application with much potential.

  17. Interactive simulation of digital communication systems

    NASA Astrophysics Data System (ADS)

    Modestino, J. W.; Matis, K. R.

    1984-01-01

In this paper, efforts to develop a comprehensive tool for the digital simulation of a wide variety of point-to-point digital communication systems are described. These efforts have resulted in the interactive communications simulator (ICS), a flexible, graphics-oriented, and highly interactive hardware/software system consisting of a typical minicomputer acting as host to a fast peripheral array processor. This system is presently being employed both to evaluate existing modem performance and to explore new modulation/coding concepts appropriate for military, commercial, and space applications. A detailed functional description of the ICS is provided together with pertinent software considerations. An outline of existing ICS capabilities is presented and illustrated through typical graphical output. A discussion of channel modeling considerations is provided. The use of the ICS in the overall design of receiver structures for impulsive noise channels is also illustrated.

  18. Data reduction programs for a laser radar system

    NASA Astrophysics Data System (ADS)

    Badavi, F. F.; Copeland, G. E.

    1984-01-01

The listing and description of software routines which were used to analyze the analog data obtained from the LIDAR system are given. All routines are written in FORTRAN IV on an HP-1000/F minicomputer which serves as the heart of the data acquisition system for the LIDAR program. This particular system has 128 kilobytes of high-speed memory and is equipped with a Vector Instruction Set (VIS) firmware package, which is used in all the routines to handle quick execution of long loops. The system handles floating point arithmetic in hardware in order to enhance the speed of execution. This computer is a 2177 C/F series version of the HP-1000 RTE-IVB data acquisition computer system, which is designed for real-time data capture/analysis in a disk/tape mass storage environment.

  19. Landsat electron beam recorder

    NASA Astrophysics Data System (ADS)

    Grosso, P. F.; Whitley, J. P.

A minicomputer-controlled electron beam recorder (EBR) presently in use at the Brazilian government's Instituto de Pesquisas Espaciais (INPE) satellite ground station is described. This 5-in.-film-size EBR is used to record both Landsat and SPOT satellite imagery in South America. A brief review of electron beam recorder technology is presented. The EBR is capable of recording both vector and text data from computer-aided design, publishing, and line art systems, and raster data from image scanners, raster image processors (RIPs), halftone/screen generators, and remote image sensors. A variety of image formats may be recorded on numerous film sizes (16 mm, 35 mm, 70 mm, 105 mm, 5 in., 5.5 in., and 9.5 in.). These recordings are used directly or optically enlarged depending on the final product.

  20. High-performance control system for a heavy-ion medical accelerator

    SciTech Connect

    Lancaster, H.D.; Magyary, S.B.; Sah, R.C.

    1983-03-01

    A high performance control system is being designed as part of a heavy ion medical accelerator. The accelerator will be a synchrotron dedicated to clinical and other biomedical uses of heavy ions, and it will deliver fully stripped ions at energies up to 800 MeV/nucleon. A key element in the design of an accelerator which will operate in a hospital environment is to provide a high performance control system. This control system will provide accelerator modeling to facilitate changes in operating mode, provide automatic beam tuning to simplify accelerator operations, and provide diagnostics to enhance reliability. The control system being designed utilizes many microcomputers operating in parallel to collect and transmit data; complex numerical computations are performed by a powerful minicomputer. In order to provide the maximum operational flexibility, the Medical Accelerator control system will be capable of dealing with pulse-to-pulse changes in beam energy and ion species.

  1. ATS-6 - Spacecraft Attitude Precision Pointing and Slewing Adaptive Control Experiment

    NASA Technical Reports Server (NTRS)

    Isley, W. C.; Endres, D. L.

    1975-01-01

    The primary objective of the Spacecraft Attitude Precision Pointing and Slewing Adaptive Control (SAPPSAC) experiment is to establish feasibility and evaluate capabilities of a ground-based spacecraft attitude control system, wherein RF command and telemetry links, together with a ground station on-line minicomputer, perform closed loop attitude control of the Applications Technology Satellite-6 (ATS-6). The ground processor is described, including operational characteristics and the controller software. Attitude maneuvers include precision pointing to fixed targets, slewing between targets, and generation of prescribed ground tracks. Test results show high performance and reliability for over 30 hours of on-line control with no serious anomalies. Attitude stabilization relative to a prescribed target has been achieved to better than 0.007 deg in pitch and roll and 0.02 deg in yaw for a period of 43 min. Ground tracks were generated which had maximum latitude/longitude deviations less than 0.15 deg from reference.

  2. Remote sensing information sciences research group: Browse in the EOS era

    NASA Technical Reports Server (NTRS)

    Estes, John E.; Star, Jeffrey L.

    1989-01-01

    The problem of science data browse was examined. Given the tremendous data volumes that are planned for future space missions, particularly the Earth Observing System in the late 1990's, the need for access to large spatial databases must be understood. Work was continued to refine the concept of data browse. Further, software was developed to provide a testbed of the concepts, both to locate possibly interesting data, as well as view a small portion of the data. Build II was placed on a minicomputer and a PC in the laboratory, and provided accounts for use in the testbed. Consideration of the testbed software as an element of in-house data management plans was begun.

  3. Vibration in Planetary Gear Systems with Unequal Planet Stiffnesses

    NASA Technical Reports Server (NTRS)

    Frater, J. L.; August, R.; Oswald, F. B.

    1982-01-01

    An algorithm suitable for a minicomputer was developed for finding the natural frequencies and mode shapes of a planetary gear system which has unequal stiffnesses between the Sun/planet and planet/ring gear meshes. Mode shapes are represented in the form of graphical computer output that illustrates the lateral and rotational motion of the three coaxial gears and the planet gears. This procedure permits the analysis of gear trains utilizing nonuniform mesh conditions and user specified masses, stiffnesses, and boundary conditions. Numerical integration of the equations of motion for planetary gear systems indicates that this algorithm offers an efficient means of predicting operating speeds which may result in high dynamic tooth loads.
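The core computation described, finding the natural frequencies and mode shapes of a lumped mass/stiffness model with unequal mesh stiffnesses, reduces to the generalized eigenvalue problem K x = ω²M x. A small sketch under that standard formulation (the matrices are toy values, not the paper's gear model):

```python
import numpy as np

def natural_frequencies(mass, stiffness):
    """Frequencies (Hz) and mode shapes of K x = w^2 M x,
    solved as a standard eigenproblem of M^-1 K."""
    evals, modes = np.linalg.eig(np.linalg.solve(mass, stiffness))
    order = np.argsort(evals.real)
    w = np.sqrt(np.clip(evals.real[order], 0.0, None))  # rad/s
    return w / (2.0 * np.pi), modes[:, order].real

# Toy two-degree-of-freedom model with unequal coupling stiffnesses
# (illustrative numbers only): masses in kg, stiffnesses in N/m.
M = np.diag([1.0, 2.0])
K = np.array([[3.0e4, -1.0e4],
              [-1.0e4, 2.0e4]])
freqs, modes = natural_frequencies(M, K)
```

For the full planetary model the same recipe applies with larger M and K assembled from the sun/planet and planet/ring mesh stiffnesses; only the matrix assembly, not the eigensolution, changes.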

  4. The GEMPAK Barnes interactive objective map analysis scheme. [General Meteorological Software Package

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Kocin, P. J.; Desjardins, M.

    1983-01-01

    The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
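The Barnes scheme mentioned above is a two-pass successive-correction analysis: a Gaussian-weighted average of station values, followed by the same weighting at a reduced length scale applied to the station residuals, which restores amplitude at small but resolvable wavelengths. A compact sketch with synthetic stations; the `kappa` and `gamma` values are illustrative, not GEMPAK's defaults.

```python
import numpy as np

def _weighted_pass(targets, obs_xy, values, scale):
    """Gaussian-weighted average of station values at target points."""
    d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / scale)
    return (w * values).sum(1) / w.sum(1)

def barnes_analysis(obs_xy, obs_val, grid_xy, kappa, gamma=0.3):
    """Two-pass Barnes objective analysis on scattered observations."""
    first = _weighted_pass(grid_xy, obs_xy, obs_val, kappa)
    resid = obs_val - _weighted_pass(obs_xy, obs_xy, obs_val, kappa)
    return first + _weighted_pass(grid_xy, obs_xy, resid, gamma * kappa)

# Demo on synthetic stations: the correction pass pulls the analysis
# back toward the observed amplitudes.
rng = np.random.default_rng(0)
obs_xy = rng.uniform(0.0, 10.0, (40, 2))
obs_val = np.sin(obs_xy[:, 0]) + np.cos(obs_xy[:, 1])
first_err = np.abs(_weighted_pass(obs_xy, obs_xy, obs_val, 2.0) - obs_val).mean()
final_err = np.abs(barnes_analysis(obs_xy, obs_val, obs_xy, 2.0) - obs_val).mean()
```

The single smoothing parameter `kappa` fixes the filter response in advance, which is what makes the Barnes scheme attractive for interactive work on a minicomputer: no iteration-dependent tuning is needed.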

  5. A distributed data base management system. [for Deep Space Network

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1975-01-01

    Major system design features of a distributed data management system for the NASA Deep Space Network (DSN) designed for continuous two-way deep space communications are described. The reasons for which the distributed data base utilizing third-generation minicomputers is selected as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.

  6. MIDAS, prototype Multivariate Interactive Digital Analysis System, Phase 1. Volume 2: Diagnostic system

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a minicomputer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10^5 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.

  7. SIFT - Design and analysis of a fault-tolerant computer for aircraft control. [Software Implemented Fault Tolerant systems

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Lamport, L.; Goldberg, J.; Green, M. W.; Levitt, K. N.; Melliar-Smith, P. M.; Shostak, R. E.; Weinstock, C. B.

    1978-01-01

    SIFT (Software Implemented Fault Tolerance) is an ultrareliable computer for critical aircraft control applications that achieves fault tolerance by the replication of tasks among processing units. The main processing units are off-the-shelf minicomputers, with standard microcomputers serving as the interface to the I/O system. Fault isolation is achieved by using a specially designed redundant bus system to interconnect the processing units. Error detection and analysis and system reconfiguration are performed by software. Iterative tasks are redundantly executed, and the results of each iteration are voted upon before being used. Thus, any single failure in a processing unit or bus can be tolerated with triplication of tasks, and subsequent failures can be tolerated after reconfiguration. Independent execution by separate processors means that the processors need only be loosely synchronized, and a novel fault-tolerant synchronization method is described.
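The voting step SIFT relies on, comparing replicated task results so that a single faulty unit is masked, can be illustrated in a few lines. This is a schematic sketch, not the SIFT implementation:

```python
from collections import Counter

def vote(results):
    """Majority vote over replicated task results.

    With triplicated execution a single faulty unit is outvoted;
    None signals that no value reached a strict majority and the
    iteration needs error handling or reconfiguration.
    """
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) // 2 else None

print(vote([42, 42, 7]))  # → 42 (faulty unit masked)
print(vote([1, 2, 3]))    # → None (no majority)
```

In SIFT this comparison runs in software on each iteration's outputs, which is why no special voting hardware is needed between the off-the-shelf processing units.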

  8. Computer-Assisted Photo Interpretation System

    NASA Astrophysics Data System (ADS)

    Niedzwiadek, Harry A.

    1981-11-01

    A computer-assisted photo interpretation research (CAPIR) system has been developed at the U.S. Army Engineer Topographic Laboratories (ETL), Fort Belvoir, Virginia. The system is based around the APPS-IV analytical plotter, a photogrammetric restitution device that was designed and developed by Autometric specifically for interactive, computerized data collection activities involving high-resolution, stereo aerial photographs. The APPS-IV is ideally suited for feature analysis and feature extraction, the primary functions of a photo interpreter. The APPS-IV is interfaced with a minicomputer and a geographic information system called AUTOGIS. The AUTOGIS software provides the tools required to collect or update digital data using an APPS-IV, construct and maintain a geographic data base, and analyze or display the contents of the data base. Although the CAPIR system is fully functional at this time, considerable enhancements are planned for the future.

  9. Computer code to interchange CDS and wave-drag geometry formats

    NASA Technical Reports Server (NTRS)

    Johnson, V. S.; Turnock, D. L.

    1986-01-01

    A computer program has been developed on the PRIME minicomputer to provide an interface for the passage of aircraft configuration geometry data between the Rockwell Configuration Development System (CDS) and a wireframe geometry format used by aerodynamic design and analysis codes. The interface program allows aircraft geometry which has been developed in CDS to be directly converted to the wireframe geometry format for analysis. Geometry which has been modified in the analysis codes can be transformed back to a CDS geometry file and examined for physical viability. Previously created wireframe geometry files may also be converted into CDS geometry files. The program provides a useful link between a geometry creation and manipulation code and analysis codes by providing rapid and accurate geometry conversion.

  10. University of Missouri-Rolla cloud simulation facility - Proto II chamber

    NASA Technical Reports Server (NTRS)

    White, Daniel R.; Carstens, John C.; Hagen, Donald E.; Schmitt, John L.; Kassner, James L.

    1987-01-01

    The design and supporting systems for the cooled-wall expansion cloud chamber, designated Proto II, are described. The chamber is a 10-sided vertical cylinder designed to be operated with interior wall temperatures between +40 and -40 C, and is to be utilized to study microphysical processes active in atmospheric clouds and fogs. Temperatures are measured using transistor thermometers which have a range of ±50 C and a resolution of about ±0.001 C, and pressures are measured in the chamber by a differential strain-gauge pressure transducer. The methods used for temperature and pressure control are discussed. Consideration is given to the chamber windows, the optical table, and the photographic/video, optical-attenuation, Mie-scattering, and scanning systems for the chamber. The system's minicomputer and its humidifier, sample preparation, and chamber flushing are examined.

  11. Acoustic monitoring of power-plant valves

    NASA Astrophysics Data System (ADS)

    Allen, J. W.; Hartman, W. F.; Robinson, J. C.

    1982-06-01

    Advanced surveillance diagnostics were applied to key nuclear power plant valves to improve the availability of the power plant. Two types of valves were monitored: BWR three-stage, pilot-operated safety/relief valves and PWR feedwater control valves. Excessive leakage across the pilot-disc seat in BWR safety/relief valves can cause the second-stage pressure to reach the critical value that activates the valve, even though the set pressure was not exceeded. Acoustic emissions created by the leak noise were monitored and calibrated to indicate incipient activation of the safety/relief valve. Hydrodynamic, vibration, control and process signals from PWR feedwater control valves were monitored by a minicomputer-based surveillance system. On-line analysis of these signals coupled with earlier analytic modelling identified: (1) cavitation, (2) changes in stem packing tightness, (3) valve stem torquing, (4) transducer oscillations, and (5) peak vibration levels during power transients.

  13. VME and network applications for the JT-60U control system

    NASA Astrophysics Data System (ADS)

    Kimura, T.

    1994-12-01

    The control system for the large tokamak JT-60 at JAERI-Naka was completed in 1985. It was originally composed of 16-bit industrial minicomputers and CAMAC systems with 16-bit microcomputers. Rejuvenation of the control system has become necessary to improve control performance. The renewal of the control system is also motivated by requirements for upgrading its software development and hardware maintenance environment. This paper describes how the control system has been and will be upgraded, utilizing the advanced technologies of VME and networks. Two years ago, a VMEbus-based 32-bit multiprocessor system for fast plasma control and a new operator's console using UNIX workstations connected to an Ethernet LAN were developed to cope with the upgrade of the JT-60 tokamak (JT-60U). VME and network applications are now being extended to the level of subsystem controllers.

  14. Semiautomated inspection of superfinished spherical surfaces

    SciTech Connect

    Klingsporn, P.E.

    1980-01-01

    Lapping and polishing techniques are used at Bendix Kansas City to fabricate superfinished spherical metal surfaces. A laser-light reflection method has been developed for semiautomated inspection of the surfaces. The reflected and diffracted light intensity distributions from the spherical surface are measured with an array of photodetectors interfaced with a data sampler and a minicomputer programmed to distinguish between pits and scratches. For automated measurement, standard deviations for scratch width and depth are 3 and 0.3 μm (120 and 12 μin.), respectively, and for pit diameter and depth are 5.8 and 0.9 μm (230 and 36 μin.), respectively. A laser interferometric displacement measuring system interfaced with the computer is used for automated measurement of surface waviness.

  15. Surface temperatures and temperature gradient features of the US Gulf Coast waters

    NASA Technical Reports Server (NTRS)

    Huh, O. K.; Rouse, L. J., Jr.; Smith, G. W.

    1977-01-01

    Satellite thermal infrared data on the Gulf of Mexico show that a seasonal cycle exists in the horizontal surface temperature structure. In the fall, the surface temperatures of both coastal and deep waters are nearly uniform. With the onset of winter, atmospheric cold fronts, which are accompanied by dry, low-temperature air and strong winds, draw heat from the sea. A band of cooler water forming on the inner shelf expands until a thermal front develops seaward along the shelf break, between the cold shelf waters and the warmer deep waters of the Gulf. Digital analysis of the satellite data was carried out in an interactive mode using a minicomputer and software. A time series of temperature profiles illustrates the temporal and spatial changes in the sea-surface temperature field.

  16. State-of-the-art Monte Carlo 1988

    SciTech Connect

    Soran, P.D.

    1988-06-28

    Particle transport calculations in highly dimensional and physically complex geometries, such as detector calibration, radiation shielding, space reactors, and oil-well logging, generally require Monte Carlo transport techniques. Monte Carlo particle transport can be performed on a variety of computers ranging from APOLLOs to VAXs. Some of the hardware and software developments, which now permit Monte Carlo methods to be routinely used, are reviewed in this paper. The development of inexpensive, large, fast computer memory, coupled with fast central processing units, permits Monte Carlo calculations to be performed on workstations, minicomputers, and supercomputers. The Monte Carlo renaissance is further aided by innovations in computer architecture and software development. Advances in vectorization and parallelization architecture have resulted in the development of new algorithms which have greatly reduced processing times. Finally, the renewed interest in Monte Carlo has spawned new variance reduction techniques which are being implemented in large computer codes. 45 refs.
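One of the variance reduction techniques alluded to above is implicit capture with Russian roulette, a standard Monte Carlo transport device. The sketch below is a generic slab-transmission toy problem illustrating that idea, not code from the paper; all parameters are illustrative.

```python
import math
import random

def slab_transmission(sigma_t, sigma_s, thickness, histories=20000, seed=1):
    """Estimate particle transmission through a 1-D slab by Monte Carlo.

    Demonstrates implicit capture: a collision never kills the particle
    outright; its statistical weight is multiplied by the scattering
    probability sigma_s/sigma_t, and Russian roulette later removes
    low-weight particles without biasing the answer.
    """
    rng = random.Random(seed)
    p_scat = sigma_s / sigma_t
    score = 0.0
    for _ in range(histories):
        x, mu, w = 0.0, 1.0, 1.0           # depth, direction cosine, weight
        while True:
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)  # free flight
            if x >= thickness:              # escaped through the far face
                score += w
                break
            if x <= 0.0:                    # escaped back out the near face
                break
            w *= p_scat                     # implicit capture at a collision
            if w <= 0.0:
                break                       # pure absorber: nothing survives
            if w < 0.01:                    # Russian roulette on low weights
                if rng.random() < 0.5:
                    break
                w *= 2.0
            mu = 2.0 * rng.random() - 1.0   # isotropic scattering angle
    return score / histories
```

With zero scattering the estimate reduces to the analytic uncollided transmission exp(-sigma_t * thickness), a useful sanity check on any such code.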

  17. Evolution and Integration of Medical Laboratory Information System in an Asia National Medical Center

    NASA Astrophysics Data System (ADS)

    Cheng, Po-Hsun; Chen, Sao-Jie; Lai, Jin-Shin

    This work elucidates the evolution of three generations of the laboratory information system in the National Taiwan University Hospital, implemented respectively in an IBM Series/1 minicomputer environment, a client/server environment, and a plug-and-play HL7 interface engine environment. The experience of using HL7 healthcare information exchange in the hospital information system, laboratory information system, and automatic medical instruments over the past two decades is illustrated and discussed. The latest design challenge in developing intelligent laboratory information services is to effectively organize distributed and heterogeneous medical instruments through the message gateways. Such experience has spread to some governmental information systems for different purposes in Taiwan; in addition, the healthcare information exchange standard, software reuse mechanism, and application service provider approach adopted in developing the plug-and-play laboratory information system are also illustrated.

  18. Daylight spectra of individual lightning flashes in the 370-690 nm region

    NASA Technical Reports Server (NTRS)

    Orville, R. E.

    1980-01-01

    An optical multichannel analyzer slit spectrometer coupled to a minicomputer was used to record lightning spectra. This is the first successful application of a slit spectrometer to the study of individual lightning flashes and it was accomplished in the daytime. Over 300 spectra were obtained in 1978 and 1979 and are correlated with other experiments in the Thunderstorm Research International Program (TRIP). The spectra duplicate previously published nighttime data but reveal for the first time the relative intensity of H-alpha (656.3 nm) and H-beta (486.1 nm) emissions above their daytime absorption features. These are the characteristic Fraunhofer C and F lines in the solar spectrum. This result suggests that the observation of lightning from space may be accomplished by monitoring the hydrogen emissions from lightning which occur on earth, or on other planets with hydrogen in their atmospheres, such as Jupiter and Venus where lightning recently has been reported.

  19. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.
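The abstract does not spell out the detection algorithm itself; the classic trigger used in seismic networks of this era is a short-term-average / long-term-average (STA/LTA) ratio test, sketched here with hypothetical window lengths and threshold rather than the USGS system's actual settings.

```python
def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
    """Flag samples where the STA/LTA ratio exceeds `threshold`.

    STA is the mean absolute amplitude over a short trailing window (the
    candidate event); LTA is the same over a long window just before it
    (the background noise level). Window lengths and threshold here are
    illustrative only.
    """
    triggers = []
    for i in range(sta_len + lta_len, len(samples) + 1):
        lta_win = samples[i - sta_len - lta_len : i - sta_len]
        sta_win = samples[i - sta_len : i]
        lta = sum(abs(s) for s in lta_win) / lta_len
        sta = sum(abs(s) for s in sta_win) / sta_len
        if lta > 0.0 and sta / lta >= threshold:
            triggers.append(i - sta_len)   # index where the STA window starts
    return triggers
```

Because the LTA adapts to the background level, the same threshold works across stations with different noise floors, which is what makes continuous unattended monitoring of a 100-station network practical.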

  20. Correction factors for on-line microprobe analysis of multielement alloy systems

    NASA Technical Reports Server (NTRS)

    Unnam, J.; Tenney, D. R.; Brewer, W. D.

    1977-01-01

    An on-line correction technique was developed for the conversion of electron probe X-ray intensities into concentrations of emitting elements. This technique consisted of off-line calculation and representation of binary interaction data which were read into an on-line minicomputer to calculate variable correction coefficients. These coefficients were used to correct the X-ray data without significantly increasing computer core requirements. The binary interaction data were obtained by running Colby's MAGIC 4 program in the reverse mode. The data for each binary interaction were represented by polynomial coefficients obtained by least-squares fitting a third-order polynomial. Polynomial coefficients were generated for most of the common binary interactions at different accelerating potentials and are included. Results are presented for the analyses of several alloy standards to demonstrate the applicability of this correction procedure.
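The polynomial step described above can be sketched with a stdlib-only least-squares cubic fit (normal equations solved by Gaussian elimination). The fitted data here are hypothetical, not Colby's MAGIC 4 output; the point is only the mechanics of representing a binary-interaction curve by third-order polynomial coefficients.

```python
def fit_cubic(xs, ys):
    """Least-squares fit y ≈ c0 + c1*x + c2*x^2 + c3*x^3 via normal equations."""
    n = 4
    # Normal-equation system A c = b for a cubic.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = (b[r] - s) / A[r][r]
    return coeffs

def eval_poly(coeffs, x):
    """Evaluate the fitted polynomial by Horner's rule."""
    v = 0.0
    for c in reversed(coeffs):
        v = v * x + c
    return v
```

Storing only four coefficients per binary interaction, and evaluating them on line by Horner's rule, is exactly the kind of compact representation that keeps minicomputer core requirements small.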

  1. The Lockheed alternate partial polarizer universal filter

    NASA Technical Reports Server (NTRS)

    Title, A. M.

    1976-01-01

    A tunable birefringent filter using an alternate partial polarizer design has been built. The filter has a transmission of 38% in polarized light. Its full width at half maximum is 0.09 Å at 5500 Å. It is tunable from 4500 to 8500 Å by means of stepping-motor-actuated rotating half-wave plates and polarizers. Wavelength commands and thermal compensation commands are generated by a PDP 11/10 minicomputer. The alternate partial polarizer universal filter is compared with the universal birefringent filter, and the design techniques, construction methods, and filter performance are discussed in some detail. Based on the experience with this filter, some conclusions regarding the future of birefringent filters are elaborated.

  2. [Computerized monitoring system in the operating center with UNIX and X-window].

    PubMed

    Tanaka, Y; Hashimoto, S; Chihara, E; Kinoshita, T; Hirose, M; Nakagawa, M; Murakami, T

    1992-01-01

    We previously reported the fully automated data logging system in the operating center. We have now revised the system using a highly integrated operating system, UNIX, instead of OS/9. With this multi-task and multi-window (X-window) system, we could monitor all 12 rooms in the operating center at a time. The system in the operating center consists of 2 computers, a SONY NEWS1450 (UNIX workstation) and a Sord M223 (CP/M, data logger). On the bitmapped display of the workstation, using X-window, the data of all the operating rooms can be visualized. Furthermore, 2 other minicomputers (a Fujitsu A50 in the conference room and an A60 in the ICU) and a workstation (Sun3-80 in the ICU) were connected with Ethernet. With the remote login function (NFS), we could easily obtain the data during the operation from outside the operating center. This system works automatically and needs no routine maintenance.

  3. Time-resolved EPR spectroscopy in a Unix environment.

    PubMed

    Lacoff, N M; Franke, J E; Warden, J T

    1990-02-01

    A computer-aided time-resolved electron paramagnetic resonance (EPR) spectrometer implemented under version 2.9 BSD Unix was developed by interfacing a Varian E-9 EPR spectrometer and a Biomation 805 waveform recorder to a PDP-11/23A minicomputer having MINC A/D and D/A capabilities. Special problems with real-time data acquisition in a multiuser, multitasking Unix environment, addressing of computer main memory for the control of hardware devices, and limitation of computer main memory were resolved, and their solutions are presented. The time-resolved EPR system and the data acquisition and analysis programs, written entirely in C, are described. Furthermore, the benefits of utilizing the Unix operating system and the C language are discussed, and system performance is illustrated with time-resolved EPR spectra of the reaction center cation in photosystem 1 of green plant photosynthesis.

  4. Tritium Migration Analysis Program Version 4

    1991-06-12

    TMAP4 was developed as a safety analysis code, mainly to analyze tritium retention and loss in fusion reactor structures and systems during normal operational and accident conditions. It incorporates one-dimensional thermal and mass-diffusive transport and trapping calculations through structures and zero-dimensional fluid transport between enclosures and across the interface between enclosures and structures. Diffusion structures may be linked together with other structures, and multiple structures may interact with an enclosure. A key feature is the ability to input problem definition parameters as constants, interpolation tables, or FORTRAN equations. The code is specifically intended for use under a DOS operating system on PC-type minicomputers, but it has also been run successfully on workstations and mainframe computer systems. Use of the equation-input feature requires access to a FORTRAN-77 compiler, and a linker program is required.
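TMAP4's actual numerics are not given in the abstract, but the one-dimensional mass-diffusive transport it performs through a structure can be illustrated by a generic explicit finite-difference sketch. Grid, coefficients, and boundary treatment below are hypothetical, chosen only to show the scheme.

```python
def diffuse_1d(conc, d_coeff, dx, dt, steps):
    """Explicit finite-difference steps for 1-D diffusion, c_t = D c_xx.

    `conc` is the concentration on a uniform grid; boundary values are held
    fixed (Dirichlet). A generic sketch of one-dimensional mass-diffusive
    transport, not TMAP4's actual solver.
    """
    r = d_coeff * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability requires D*dt/dx^2 <= 1/2"
    c = list(conc)
    for _ in range(steps):
        c = [c[0]] + [
            c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
            for i in range(1, len(c) - 1)
        ] + [c[-1]]
    return c
```

Run long enough with fixed end concentrations, the profile relaxes to the expected linear steady state, which is a convenient check on the discretization.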

  5. A Computer System for Processing Tumor Registry Data

    PubMed Central

    Leahey, Charles F.

    1981-01-01

    An interactive computer system for processing tumor registry data has been developed by the Washington, D.C. VA Medical Center Systems Development Group. The automated registry system replaces a manual registry, which had been implemented according to the guidelines established for Cancer Programs by the American College of Surgeons. A permanent on-line data base of patient data is maintained by a minicomputer at the medical center. A user-oriented application program provides entry, edit, and retrieval of patient data in the following formats: Suspense, Master, Accession, and Follow-up registers, and Abstract form. Data entered in any of the formats are stored in a common file and are available as needed in any other format. The programs were written in the standard MUMPS language. Construction of the Tumor Registry application was greatly assisted by use of the File Manager, a data base file management package written in the standard MUMPS language.

  6. Software for Digital Acquisition System and Application to Environmental Monitoring

    NASA Technical Reports Server (NTRS)

    Copeland, G. E.

    1975-01-01

    Criteria for selection of a minicomputer for use as a core-resident acquisition system were developed for the ODU Mobile Air Pollution Laboratory. A comprehensive data acquisition program named MONARCH was implemented in a DEC-8/E-8K 12-bit computer. Up to 32 analog voltage inputs are scanned sequentially, converted to BCD, and then to actual numbers. As many as 16 external devices (valves or any other two-state device) are controlled independently. MONARCH is written as a foreground-background program, controlled by an external clock which interrupts once per minute. Transducer voltages are averaged over user-specified time intervals and, upon completion of any desired time sequence, the program outputs the day, hour, minute, and second; the state of the external valves; the average value of each analog voltage (E format); and the standard deviations of these values. Output is compatible with any serially addressed media.
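The per-interval reduction MONARCH performs can be sketched as follows. The scan layout is hypothetical, and the original was of course a hand-tuned PDP-8 program, not Python; this only shows the mean/standard-deviation bookkeeping per channel.

```python
import statistics

def average_interval(readings):
    """Reduce one averaging interval of scanned channel readings.

    `readings` is a list of scans, each a list of per-channel voltages (a
    hypothetical stand-in for MONARCH's 32 sequentially scanned analog
    inputs). Returns per-channel means and sample standard deviations,
    mirroring the per-interval values the program outputs.
    """
    channels = list(zip(*readings))   # regroup scan-major data channel-major
    means = [statistics.fmean(ch) for ch in channels]
    stdevs = [statistics.stdev(ch) if len(ch) > 1 else 0.0 for ch in channels]
    return means, stdevs
```

The abstract's E-format output corresponds to exponent-style formatting such as `f"{value:.4E}"` in modern terms.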

  7. Implementation of the DYMAC system at the new Los Alamos Plutonium Processing Facility. Phase II report

    SciTech Connect

    Malanify, J.J.; Amsden, D.C.

    1982-08-01

    The DYnamic Materials ACcountability System - called DYMAC - performs accountability functions at the new Los Alamos Plutonium Processing Facility where it began operation when the facility opened in January 1978. A demonstration program, DYMAC was designed to collect and assess inventory information for safeguards purposes. It accomplishes 75% of its design goals. DYMAC collects information about the physical inventory through deployment of nondestructive assay instrumentation and video terminals throughout the facility. The information resides in a minicomputer where it can be immediately sorted and displayed on the video terminals or produced in printed form. Although the capability now exists to assess the collected data, this portion of the program is not yet implemented. DYMAC in its present form is an excellent tool for process and quality control. The facility operator relies on it exclusively for keeping track of the inventory and for complying with accountability requirements of the US Department of Energy.

  8. The Israeli National Medical Library's new minicomputerized on-line integrated system (MAIMON).

    PubMed

    Avriel, D; Miller, R; Fuchs, C

    1981-04-01

    An in-house library system based on a dedicated mini-computer has been in operation in the Israel National Medical Library since the summer of 1979. The integrated system, called MAIMON, features on-line access to bibliographic and circulation records. It replaces manual procedures in cataloging, searching, lending, and reservations. The system provides previously unavailable statistics on items in heavy use and demand, items to be removed from the active collection, and who uses what in the library. It is designed to be user cordial and to save users' time. The system has been very favorably accepted by patrons, and frees professional librarians from time-consuming clerical routine tasks. The system is evaluated in terms of performance, convenience, and cost. PMID:6784799

  9. Telemetry Computer System at Wallops Flight Center

    NASA Technical Reports Server (NTRS)

    Bell, H.; Strock, J.

    1980-01-01

    This paper describes the Telemetry Computer System in operation at NASA's Wallops Flight Center for real-time or off-line processing, storage, and display of telemetry data from rockets and aircraft. The system accepts one or two PCM data streams and one FM multiplex, converting each type of data into computer format and merging time-of-day information. A data compressor merges the active streams, and removes redundant data if desired. Dual minicomputers process data for display, while storing information on computer tape for further processing. Real-time displays are located at the station, at the rocket launch control center, and in the aircraft control tower. The system is set up and run by standard telemetry software under control of engineers and technicians. Expansion capability is built into the system to take care of possible future requirements.

  10. Digital resolver for helicopter model blade motion analysis

    NASA Technical Reports Server (NTRS)

    Daniels, T. S.; Berry, J. D.; Park, S.

    1992-01-01

    The paper reports the development and initial testing of a digital resolver to replace existing analog signal processing instrumentation. Radiometers, mounted directly on one of the fully articulated blades, are electrically connected through a slip ring to analog signal processing circuitry. The measured signals are periodic with azimuth angle and are resolved into harmonic components, with 0 deg over the tail. The periodic nature of the helicopter blade motion restricts the frequency content of each flapping and yaw signal to the fundamental and harmonics of the rotor rotational frequency. A minicomputer is employed to collect these data and then plot them graphically in real time. With this and other information generated by the instrumentation, a helicopter test pilot can then adjust the helicopter model's controls to achieve the desired aerodynamic test conditions.
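Resolving an azimuth-periodic signal into its fundamental and harmonics of the rotor frequency amounts to discrete Fourier sums over one revolution. A minimal sketch follows; the sample count and harmonic depth are illustrative, not taken from the instrumentation described.

```python
import math

def harmonic_components(samples, n_harmonics=3):
    """Resolve one revolution of an azimuth-periodic signal into harmonics.

    Returns (a0, [(a_k, b_k), ...]) for
        y(psi) ≈ a0 + sum_k [a_k*cos(k*psi) + b_k*sin(k*psi)],
    with `samples` assumed evenly spaced in azimuth over 0..2*pi.
    """
    n = len(samples)
    a0 = sum(samples) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        ak = 2.0 / n * sum(
            y * math.cos(2 * math.pi * k * i / n) for i, y in enumerate(samples)
        )
        bk = 2.0 / n * sum(
            y * math.sin(2 * math.pi * k * i / n) for i, y in enumerate(samples)
        )
        coeffs.append((ak, bk))
    return a0, coeffs
```

Because blade flapping is band-limited to a few multiples of the rotor frequency, a handful of coefficients per revolution captures the motion, which is what makes real-time display on a minicomputer feasible.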

  11. In-circuit fault injector user's guide

    NASA Technical Reports Server (NTRS)

    Padilla, Peter A.

    1987-01-01

    A fault injector system, called an in-circuit injector, was designed and developed to facilitate fault injection experiments performed at NASA-Langley's Avionics Integration Research Lab (AIRLAB). The in-circuit fault injector (ICFI) allows fault injections to be performed on electronic systems without special test features, e.g., sockets. The system supports stuck-at-zero, stuck-at-one, and transient fault models. The ICFI system is interfaced to a VAX-11/750 minicomputer. An interface program has been developed in the VAX. The computer code required to access the interface program is presented. Also presented is the connection procedure to be followed to connect the ICFI system to a circuit under test and the ICFI front panel controls which allow manual control of fault injections.

  12. Investigation of creep by use of closed loop servo-hydraulic test system

    NASA Technical Reports Server (NTRS)

    Wu, H. C.; Yao, J. C.

    1981-01-01

    Creep tests were conducted by means of a closed-loop servo-controlled materials test system. These tests differ from conventional creep tests in that the strain history prior to creep may be carefully monitored. Tests were performed on aluminum alloy 6061-O at 150 C, with a preset constant plastic-strain-rate prehistory, and were monitored by a PDP 11/04 minicomputer. The results show that the plastic-strain rate prior to creep plays a significant role in creep behavior. The endochronic theory of viscoplasticity was applied to describe the observed creep curves. The concepts of intrinsic time and the strain-rate sensitivity function are employed and modified according to the present observations.

  13. TMAP4 User's Manual

    SciTech Connect

    Longhurst, G.R.; Holland, D.F.; Jones, J.L.; Merrill, B.J.

    1992-06-12

    The Tritium Migration Analysis Program, Version 4 (TMAP4) has been developed by the Fusion Safety Program at the Idaho National Engineering Laboratory (INEL) as a safety analysis code, mainly to analyze tritium retention and loss in fusion reactor structures and systems during normal operation and accident conditions. TMAP4 incorporates one-dimensional thermal- and mass-diffusive transport and trapping calculations through structures and zero dimensional fluid transport between enclosures and across the interface between enclosures and structures. A key feature is the ability to input problem definition parameters as constants, interpolation tables, or FORTRAN equations. The code is specifically intended for use under a DOS operating system on PC-type mini-computers, but it has also been run successfully on workstations and mainframe computer systems. Use of the equation-input feature requires access to a FORTRAN-77 compiler and a linker program.

  15. Supervisory control and diagnostics system for the mirror fusion test facility: overview and status 1980

    SciTech Connect

    McGoldrick, P.R.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is a complex facility requiring a highly computerized Supervisory Control and Diagnostics System (SCDS) to monitor and provide control over ten subsystems, three of which require true process control. SCDS will provide physicists with a method of studying machine and plasma behavior by acquiring and processing up to four megabytes of plasma diagnostic information every five minutes. A high degree of availability and throughput is provided by a distributed computer system (nine 32-bit minicomputers on shared memory). Data, distributed across SCDS, are managed by a high-bandwidth Distributed Database Management System. The MFTF operators' control-room consoles use color television monitors with touch-sensitive screens, a totally new approach. The method of handling deviations from normal machine operation, and how the operator should be notified and assisted in the resolution of problems, has been studied and a system designed.

  16. Contribution of the Spacelab data management system to lower cost space research

    NASA Technical Reports Server (NTRS)

    Burger, J. J.; Tanner, E. R.

    1976-01-01

    This paper reviews the design and operation of the Spacelab data management system as it has evolved. Significant improvements and extensions of the original baseline system have been incorporated and will be discussed. They include the capability for the remote control of Spacelab subsystems, improved remote data acquisition units, a high-rate digital data multiplexer, and an improved high-rate digital recorder. Emphasis will be placed on the overall system aspects, including considerations on the use of minicomputers as an adjunct to the basic Spacelab data system. The approach for experiment related software production and integration will be addressed as well. The paper focuses on the contributions of the data management system in reducing the cost of research in Spacelab.

  17. The experimental computer control of a two-dimensional hyperbolic system

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Lang, J. H.; Staelin, D. H.; Johnson, T. L.

    1985-01-01

    The experimental computer control of a two-dimensional hyperbolic system is described. The system consists of a 5-foot gold-coated rubber membrane mounted on a circular cylindrical drum. Seven electrodes reside on a command surface located behind the membrane inside the drum. These electrodes serve as capacitive sensors and electrostatic force actuators of transverse membrane deflection. The membrane is modelled as flat, isotropic, and uniformly tensioned, and transverse membrane deflections are expanded in normal modes. Controllers regulating membrane deflection are designed using aggregation and design procedures based upon sensor and actuator influence functions. The resulting control laws are implemented on a minicomputer in two sets of experiments. The experimental study confirms the theoretically predicted behavior of the system, the usefulness of the aggregation and design procedures, and the expectation that spillover can be made a beneficial source of damping in residual systems.

  18. Advanced application flight experiment breadboard pulse compression radar altimeter program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, development, and performance of the pulse compression radar altimeter are described. The high-resolution breadboard system is designed to operate from an aircraft at 10 kft above the ocean and to accurately measure altitude, sea wave height, and sea reflectivity. The minicomputer-controlled Ku-band system provides six basic variables and an extensive digital recording capability for experimentation purposes. Signal bandwidths of 360 MHz are obtained using a reflective array compression line. Stretch processing is used to achieve 1000:1 pulse compression. The system range command LSB is 0.62 ns, or 9.25 cm. A second-order altitude tracker, aided by accelerometer inputs, is implemented in the system software. During flight tests the system demonstrated an altitude resolution capability of 2.1 cm and a sea wave height estimation accuracy of 10%. The altitude measurement performance exceeds that of the Skylab and GEOS-C predecessors by approximately an order of magnitude.
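The quoted range LSB follows from the two-way delay relation r = c·t/2. A quick check using only physical constants (nothing from the flight software):

```python
C = 299_792_458.0                 # speed of light in vacuum, m/s

def delay_to_range(delay_s):
    """One-way range corresponding to a two-way echo delay: r = c * t / 2."""
    return C * delay_s / 2.0

# One 0.62 ns range-command LSB is about 9.3 cm of altitude; the small
# difference from the quoted 9.25 cm suggests the 0.62 ns figure is rounded.
lsb_range_cm = delay_to_range(0.62e-9) * 100.0
```

The factor of two matters because the radar pulse travels to the surface and back, so each nanosecond of delay corresponds to only about 15 cm of one-way range.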

  19. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer is described which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a minicomputer to quantitate oxygen in the decomposition products and control oven heating.

  20. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer is described which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a minicomputer to quantitate oxygen in the decomposition products and control oven heating.

  1. Systems used to automate medical libraries--analysis by type of library.

    PubMed

    Miido, H

    1995-06-01

    Analysis of data recorded in 626 questionnaires on systems used to automate medical libraries, by type of library, showed that academic and industrial libraries automated all serial and book functions to a greater degree than the other types of medical libraries (hospital, governmental, institutional and medical centre). Almost 75% of the academic libraries had automated some or all serial functions; 80% had automated some or all book functions. Half of the hospital, institutional and medical centre libraries (50-51%) performed all serial functions manually; 29-40% performed all book functions manually. No software was used consistently by all types of libraries to process serials or books. Academic libraries used mainframes or minicomputers more than personal computers to process serials and books.

  2. Development of an occupational health data base system.

    PubMed

    Dye, B J; Lombard, R A; Worthy, C D

    1983-06-01

    Operational concerns, coupled with rising workers' compensation costs and the proliferation of regulatory requirements, call for a new approach to occupational health data base management. To meet this challenge, an automated system to store and manage worker and workplace exposure data is being developed. The system will include individual minicomputers at local Air Force bases and a central host computer for long-term storage and retrieval. The first step in establishing this data base is the standardization of data entry and storage at base level. This manual system, known as the Standardized Occupational Health Program (SOHP), serves as the basic building block for the Computerized Occupational Health Program (COHP). Standardization and automation of all relevant industrial hygiene, occupational medicine, and environmental data will significantly enhance the flow of information needed by those charged with providing a healthful work environment for Air Force personnel.

  3. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer-controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three-night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.

  4. The spatial and logical organization of devices in an advanced industrial robot system

    NASA Technical Reports Server (NTRS)

    Ruoff, C. F.

    1980-01-01

    This paper describes the geometrical and device organization of a robot system which is based in part upon transformations of Cartesian frames and exchangeable device tree structures. It discusses coordinate frame transformations, geometrical device representation, and solution degeneracy, along with the data structures which support the exchangeable logical-physical device assignments. The system, which has been implemented on a minicomputer, supports vision, force, and other sensors. It allows tasks to be instantiated with logically equivalent devices, and it allows tasks to be defined relative to appropriate frames. Since these frames are, in turn, defined relative to other frames, this organization provides a significant simplification in task specification and a high degree of system modularity.

  5. The experimental results of a self tuning adaptive controller using online frequency identification. [for Galileo spacecraft

    NASA Technical Reports Server (NTRS)

    Chiang, W.-W.; Cannon, R. H., Jr.

    1985-01-01

    A fourth-order laboratory dynamic system featuring very low structural damping and a noncolocated actuator-sensor pair has been used to test a novel real-time adaptive controller, implemented in a minicomputer, which consists of a state estimator, a set of state feedback gains, and a frequency-locked loop for real-time parameter identification. The adaptation algorithm employed can correct controller error and stabilize the system for more than 50 percent variation in the plant's natural frequency, compared with a 10 percent stability margin in frequency variation for a fixed-gain controller having the same performance at the nominal plant condition. The very rapid convergence achievable by this adaptive system is demonstrated experimentally and proven with simple root-locus methods.

  6. The control system of the photon factory storage ring

    NASA Astrophysics Data System (ADS)

    Pak, Cheol On

    1989-05-01

    The Photon Factory 2.5 GeV electron storage ring at KEK, a machine dedicated to synchrotron radiation, stored its first beam in March 1982. The first control system of the storage ring comprised seven distributed minicomputers connected through a star-type network. However, from 1985 they were gradually replaced in order to meet increasing system requirements. At present, the control system uses four "supermini" computers as device controllers and a general-purpose computer as a library computer. These computers are connected to each other through a token-ring-type network. Each control computer independently performs several processes; however, console functions serving as man-machine interfaces to all processes can be treated in a unified way using the network. A prototype database for operation logging has been completed and tested.

  7. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1974-01-01

    The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
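    The decision rule named here, a multivariate-Gaussian maximum-likelihood classifier, can be sketched in a few lines of software. The sketch below assumes diagonal covariances and equal class priors, and the two-band class signatures are invented for illustration, not MIDAS data.

```python
import math

def log_likelihood(x, mean, var):
    """Log-density of a multivariate Gaussian with diagonal covariance.
    Constant terms are kept so classes with different variances compare
    correctly."""
    ll = 0.0
    for xi, mi, vi in zip(x, mean, var):
        ll += -0.5 * math.log(2 * math.pi * vi) - (xi - mi) ** 2 / (2 * vi)
    return ll

def classify(pixel, signatures):
    """Assign a pixel to the class whose Gaussian signature gives the
    highest likelihood (equal priors assumed)."""
    return max(signatures, key=lambda cls: log_likelihood(pixel, *signatures[cls]))

# Hypothetical two-band signatures: class -> (mean per band, variance per band).
signatures = {
    "water":      ([20.0, 10.0], [9.0, 4.0]),
    "vegetation": ([60.0, 90.0], [25.0, 36.0]),
}

print(classify([22.0, 11.0], signatures))  # prints "water"
```

    MIDAS pipelines this arithmetic in digital hardware to reach its quoted pixel rate; the sketch shows only the decision rule itself.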

  8. Evaluation of a computer aided X-ray fluorographic system. Part 2: Image processing

    NASA Astrophysics Data System (ADS)

    Burch, S. F.; Cocking, S. J.

    1981-12-01

    The TV imagery from a computer aided X-ray fluorographic system has been digitally processed with an I2S model 70E image processor controlled by a PDP 11/60 minicomputer. The image processor allowed processing valuable for the detection of defects in cast components to be carried out at television frame rates. Summation of TV frames was used to reduce noise and hence improve the thickness sensitivity of the system. A displaced differencing technique and interactive contrast enhancement were then used to improve the reliability of inspection by removing spurious blemishes and interference lines, while simultaneously enhancing the visibility of real defects. The times required for these operations are given, and the benefits provided for X-ray fluorography are illustrated by results from the inspection of aero engine castings.

  9. A method for diagnosing surface parameters using geostationary satellite imagery and a boundary-layer model. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Polansky, A. C.

    1982-01-01

    A method for diagnosing surface parameters on a regional scale via geosynchronous satellite imagery is presented. Moisture availability, thermal inertia, atmospheric heat flux, and total evaporation are determined from three infrared images obtained from the Geostationary Operational Environmental Satellite (GOES). Three GOES images (early morning, midafternoon, and night) are obtained from computer tape. Two temperature-difference images are then created. The boundary-layer model is run, and its output is inverted via cubic regression equations. The satellite imagery is efficiently converted into output-variable fields. All computations are executed on a PDP 11/34 minicomputer. Output fields can be produced within one hour of the availability of aligned satellite subimages of a target area.

  10. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 3: Wiring diagrams

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.

  11. Alsep data processing: How we processed Apollo Lunar Seismic Data

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Nakamura, Y.; Dorman, H. J.

    1979-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 x 10^8 bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.
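    At the quoted rate, the archive the PDP-15 had to handle amounts to roughly a hundred gigabytes over the network's life. A back-of-the-envelope check, treating "nearly eight years" as exactly eight 365-day years:

```python
bits_per_day = 3e8        # quoted acquisition rate
days = 365 * 8            # "nearly eight years", rounded for the estimate
total_bits = bits_per_day * days
total_gigabytes = total_bits / 8 / 1e9
print(f"{total_bits:.2e} bits, about {total_gigabytes:.0f} GB")
```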

  12. Flexible missile autopilot design studies with PC-MATLAB/386

    NASA Technical Reports Server (NTRS)

    Ruth, Michael J.

    1989-01-01

    Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (MATLAB's version for 80386-based PCs offers benchmarks approaching minicomputer and mainframe performance); (2) the ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB capabilities, a simplified design example is presented, involving interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB capabilities and limitations in the context of this design task are then summarized.

  13. Acquisition of quantitative physiological data and computerized image reconstruction using a single scan TV system

    NASA Technical Reports Server (NTRS)

    Baily, N. A.

    1976-01-01

    A single-scan radiography system has been interfaced to a minicomputer, and the combined system has been used with a variety of fluoroscopic systems and image intensifiers available in clinical facilities. The system's response range is analyzed, and several applications are described. These include determination of the gray scale for typical X-ray-fluoroscopic-television chains, measurement of gallstone volume in patients, localization of markers or other small anatomical features, determinations of organ areas and volumes, computer reconstruction of tomographic sections of organs in motion, and computer reconstruction of transverse axial body sections from fluoroscopic images. It is concluded that this type of system combined with a minimum of statistical processing shows excellent capabilities for delineating small changes in differential X-ray attenuation.

  14. Automated search for supernovae

    SciTech Connect

    Kare, J.T.

    1984-11-15

    This thesis describes the design, development, and testing of a search system for supernovae, based on the use of current computer and detector technology. This search uses a computer-controlled telescope and charge coupled device (CCD) detector to collect images of hundreds of galaxies per night of observation, and a dedicated minicomputer to process these images in real time. The system is now collecting test images of up to several hundred fields per night, with a sensitivity corresponding to a limiting magnitude (visual) of 17. At full speed and sensitivity, the search will examine some 6000 galaxies every three nights, with a limiting magnitude of 18 or fainter, yielding roughly two supernovae per week (assuming one supernova per galaxy per 50 years) at 5 to 50 percent of maximum light. An additional 500 nearby galaxies will be searched every night, to locate about 10 supernovae per year at one or two percent of maximum light, within hours of the initial explosion.
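    The quoted yields follow directly from the assumed rate of one supernova per galaxy per 50 years; the arithmetic can be checked in a couple of lines:

```python
rate_per_galaxy = 1 / 50.0                # supernovae per galaxy per year (assumed in the text)
deep_per_year = 6000 * rate_per_galaxy    # galaxies covered on the three-night cycle
nearby_per_year = 500 * rate_per_galaxy   # nearby galaxies searched every night
print(deep_per_year / 52, nearby_per_year)  # about 2.3 per week, and 10 per year
```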

  15. Using CLIPS in a distributed system: The Network Control Center (NCC) expert system

    NASA Technical Reports Server (NTRS)

    Wannemacher, Tom

    1990-01-01

    This paper describes an intelligent troubleshooting system for the Help Desk domain. It was developed on an IBM-compatible 80286 PC, using Microsoft C and CLIPS, and an AT&T 3B2 minicomputer, using the UNIFY database and a combination of shell scripts, C programs, and SQL queries. The two computers are linked by a LAN. The functions of this system are to help non-technical NCC personnel handle trouble calls, to keep a log of problem calls with complete, concise information, and to keep a historical database of problems. The database helps identify hardware and software problem areas and provides a source of new rules for the troubleshooting knowledge base.

  16. Reconstruction Of Anatomical Shapes From Moire Contourographs

    NASA Astrophysics Data System (ADS)

    Saunders, Carl G.

    1983-07-01

    A Moire system which rotates an object in front of a slit camera has been used to obtain continuous photographic maps around amputee socket and shoe last shapes. Previous analysis methods required the use of IBM 370 hardware and extensive software overhead. Using a systematic manual digitizing technique and user-interactive FORTRAN software, the shape reconstruction has been easily performed on a PDP-11 minicomputer system. Both the digitizing technique and the software are oriented towards the shape reproduction process. Numerically controlled machining parameters are used to identify a "skewed" grid of required points along the cutter path. Linear interpolation and anti-interference techniques resulted in reproduction of shoe lasts to within 0.05 inches (1.2 millimeters) from the sensing axis. Difficulties were experienced in obtaining information to resolve the ends of the shapes. Current efforts focus on circumferential shape sensing of live subjects and automatic digitization of sensed data.

  17. Costing clinical biochemistry services as part of an operational management budgeting system.

    PubMed

    Tarbit, I F

    1986-08-01

    The process of costing clinical biochemistry tests as a component of the commissioning of a unit management budgeting system based on an International Computers Limited (ICL) minicomputer system was examined. Methods of apportioning consumable and labour costs under direct and indirect cost headings and as test and request charges were investigated; in this currently operational system it was found that 38% of consumable costs and 57% of labour costs were not a direct component of the routine analysis function. Means of assigning test costs to a given request source and the incorporation of such charges into clinical budget statements were examined. A reduction in laboratory workload did not produce a comparable reduction in laboratory costs: for a theoretical reduction in workload of 20%, only a 3.8% saving in recoverable laboratory costs could be expected.
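    The disproportion between a 20% workload cut and a 3.8% cost saving arises because only the workload-proportional component of cost scales with test volume. The sketch below is illustrative only: the 19% variable-cost share is a hypothetical figure chosen to give a saving of the reported order, not a number taken from the paper.

```python
annual_cost = 100_000.0   # hypothetical total laboratory cost
variable_share = 0.19     # hypothetical fraction of cost that scales with workload
workload_cut = 0.20       # 20% fewer tests

# Only the variable portion shrinks with workload; the rest is fixed.
saving = annual_cost * variable_share * workload_cut
print(f"saving: {saving:.0f}, i.e. {saving / annual_cost:.1%} of total cost")
```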

  18. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses of the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real-time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss of sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  19. Auto covariance computer

    NASA Technical Reports Server (NTRS)

    Hepner, T. E.; Meyers, J. F. (Inventor)

    1985-01-01

    A laser velocimeter covariance processor which calculates the auto covariance and cross covariance functions for a turbulent flow field, based on Poisson-sampled measurements in time from a laser velocimeter, is described. The device will process a block of data up to 4096 data points in length and return a 512-point covariance function with 48-bit resolution, along with a 512-point histogram of the interarrival times which is used to normalize the covariance function. The device is designed to interface with and be controlled by a minicomputer, from which the data are received and to which the results are returned. A typical 4096-point computation takes approximately 1.5 seconds to receive the data, compute the covariance function, and return the results to the computer.
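    The normalization step described, dividing accumulated lag products by a histogram of interarrival counts, is essentially the slotting estimate used for randomly sampled laser-velocimeter data. A minimal software sketch of that estimate, with the hardware's 512 slots and 4096-point blocks reduced to toy sizes:

```python
def slotted_autocovariance(times, values, n_slots, dt):
    """Estimate the autocovariance of an irregularly (Poisson) sampled
    signal by sorting sample pairs into lag slots of width dt, and
    return the pair-count histogram used to normalize each slot."""
    mean = sum(values) / len(values)
    cov = [0.0] * n_slots
    counts = [0] * n_slots
    for i in range(len(times)):
        for j in range(i, len(times)):
            slot = int((times[j] - times[i]) / dt)
            if slot < n_slots:
                cov[slot] += (values[i] - mean) * (values[j] - mean)
                counts[slot] += 1
    return [c / n if n else 0.0 for c, n in zip(cov, counts)], counts

# Toy alternating signal at integer tick times (hypothetical, not LV data).
cov, counts = slotted_autocovariance([0, 1, 3, 4], [1.0, -1.0, 1.0, -1.0], 5, 1)
print(cov[0], counts[0])  # prints: 1.0 4
```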

  20. Design and implementation of a medium speed communications interface and protocol for a low cost, refreshed display computer

    NASA Technical Reports Server (NTRS)

    Phyne, J. R.; Nelson, M. D.

    1975-01-01

    The design and implementation of hardware and software systems involved in using a 40,000 bit/second communication line as the connecting link between an IMLAC PDS 1-D display computer and a Univac 1108 computer system are described. The IMLAC consists of two independent processors sharing a common memory. The display processor generates the deflection and beam control currents as it interprets a program contained in the memory; the minicomputer has a general instruction set and is responsible for starting and stopping the display processor and for communicating with the outside world through the keyboard, teletype, light pen, and communication line. The processing time associated with each data byte was minimized by designing the input and output processes as finite state machines which automatically sequence from each state to the next. Several tests of the communication link and the IMLAC software were made using a special low-capacity computer-grade cable between the IMLAC and the Univac.

  1. A multifunction network of computers in a large pathology department. Integration of word processing, data base management, and general purpose computing using network principles.

    PubMed

    Cechner, R L

    1983-04-01

    We report the strategies for design and implementation of a system of compatible mini- and microcomputers intended to improve efficiency of data handling and research in several clinical and academic divisions of a large university hospital Department of Pathology. A dedicated, preprogrammed, multiuser word processing computer and small standalone word processing stations connected to a general-purpose minicomputer, using standard network protocols for communication, permit virtually error-free movement of data between computers doing the functionally distinct tasks of word processing, data base management, and general scientific computing. User programming is minimized and is done in a simple, well-known, high-level language. We show that cost, throughput, speed, document volume, document size, revision cycle frequency, data base size, and frequency of use are important design criteria, and that existing staff can be trained easily to operate the systems.

  2. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion (two Pascal programs run on a personal computer) and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  3. Fourier emission infrared microspectrophotometer for surface analysis. I - Application to lubrication problems

    NASA Technical Reports Server (NTRS)

    Lauer, J. L.; King, V. W.

    1979-01-01

    A far-infrared interferometer was converted into an emission microspectrophotometer for surface analysis. To cover the mid-infrared as well as the far-infrared, the Mylar beamsplitter was made replaceable by a germanium-coated salt plate, and the Moire fringe counting system used to locate the movable Michelson mirror was improved to read 0.5 micron of mirror displacement. Digital electronics and a dedicated minicomputer were installed for data collection and processing. The most critical elements for the recording of weak emission spectra from small areas were, however, a reflecting microscope objective and phase-locked signal detection with simultaneous referencing to a blackbody source. An application of the technique to lubrication problems is shown.

  4. Study of cryogenic propellant systems for loading the space shuttle

    NASA Technical Reports Server (NTRS)

    Voth, R. O.; Steward, W. G.; Hall, W. J.

    1974-01-01

    Computer programs were written to model the liquid oxygen loading system for the space shuttle. The programs allow selection of input data through graphic displays which schematically depict the part of the system being modeled. The computed output is also displayed in the form of graphs and printed messages. Any one of six computation options may be selected. The first four of these pertain to thermal stresses, pressure surges, cooldown times, flow rates and pressures during cooldown. Options five and six deal with possible water hammer effects due to closing of valves, steady flow and transient response to changes in operating conditions after cooldown. Procedures are given for operation of the graphic display unit and minicomputer.

  5. Geodetic Accuracy of LANDSAT-4 Multispectral Scanner and Thematic Mapper Data

    NASA Technical Reports Server (NTRS)

    Thormodsgard, J. M.; Devries, D. J.

    1984-01-01

    The geodetic accuracy of an MSS or TM scene is assessed using a minicomputer and appropriate software, a digitizer, and an image display device. The calculated image location of a selected feature is compared with the actual image location obtained through visual inspection of the image on the display. Measurements of 15 to 20 features evenly distributed throughout the image provide an estimate of the geodetic accuracy of the scene. Tests of two system-corrected MSS scenes measured geodetic registration root-mean-square (RMS) errors of approximately 3,200 m or 57 pixels. Tests of two TM system-corrected scenes measured RMS errors of approximately 1,250 and 1,000 m, or 44 and 35 pixels, respectively. All errors were primarily translational, implying good internal scene registration of both MSS and TM data. The one MSS GCP-corrected scene which was evaluated had an RMS error of approximately 325 m or 6 pixels.
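    The RMS figures quoted are straightforward to reproduce from per-feature offsets between calculated and observed image locations. A sketch with invented offsets (not the study's measurements), assuming a nominal 57 m MSS pixel:

```python
import math

def rms_error(offsets):
    """Root-mean-square of (dx, dy) registration offsets, in metres."""
    return math.sqrt(sum(dx * dx + dy * dy for dx, dy in offsets) / len(offsets))

# Hypothetical control-point offsets in metres.
offsets = [(300.0, 400.0), (0.0, 500.0), (500.0, 0.0)]
pixel_size = 57.0  # nominal MSS pixel, metres
err = rms_error(offsets)
print(f"{err:.0f} m, about {err / pixel_size:.1f} pixels")
```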

  6. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.

  7. Composite structural materials. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1980-01-01

    The use of filamentary composite materials in the design and construction of primary aircraft structures is considered, with emphasis on efforts to develop advanced technology in the areas of physical properties, structural concepts and analysis, manufacturing, and reliability and life prediction. The redesign of a main spar/rib region on the Boeing 727 elevator near its actuator attachment point is discussed. A composite fabrication and test facility is described, as well as the use of minicomputers for computer-aided design. Other topics covered include: (1) advanced structural analysis methods for composites; (2) ultrasonic nondestructive testing of composite structures; (3) optimum combination of hardeners in the cure of epoxy; (4) fatigue in composite materials; (5) resin matrix characterization and properties; (6) postbuckling analysis of curved laminate composite panels; and (7) acoustic emission testing of composite tensile specimens.

  8. Diagnosis of alcoholic cirrhosis with the right-to-left hepatic lobe ratio: concise communication

    SciTech Connect

    Shreiner, D.P.; Barlai-Kovach, M.

    1981-02-01

    Since scans of cirrhotic livers commonly show a reduction in size and colloid uptake of the right lobe, a quantitative measure of uptake was made using a minicomputer to determine total counts in regions of interest defined over each lobe. Right-to-left ratios were then compared in 103 patients. For normal patients the mean ratio ± 1 s.d. was 2.85 ± 0.65, and the mean for patients with known cirrhosis was 1.08 ± 0.33. Patients with other liver diseases had ratios similar to the normal group. The normal range of the right-to-left lobe ratio was 1.55 to 4.15. The sensitivity of the ratio for alcoholic cirrhosis was 85.7% and the specificity was 100% in this patient population. The right-to-left lobe ratio was more sensitive and specific for alcoholic cirrhosis than any other criterion tested. A hypothesis is described to explain these results.
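    With the lower limit of the normal range used as a decision threshold, sensitivity and specificity reduce to simple counting. A sketch with invented patient ratios, not the study's data:

```python
def is_cirrhotic(ratio, threshold=1.55):
    """Flag a scan as cirrhotic when the right-to-left lobe ratio falls
    below the lower limit of the normal range."""
    return ratio < threshold

def sensitivity_specificity(cirrhotic_ratios, other_ratios, threshold=1.55):
    """Sensitivity: fraction of cirrhotic patients flagged.
    Specificity: fraction of non-cirrhotic patients not flagged."""
    tp = sum(is_cirrhotic(r, threshold) for r in cirrhotic_ratios)
    tn = sum(not is_cirrhotic(r, threshold) for r in other_ratios)
    return tp / len(cirrhotic_ratios), tn / len(other_ratios)

# Hypothetical ratios for illustration.
sens, spec = sensitivity_specificity([0.8, 1.1, 1.4, 1.7], [2.9, 2.2, 3.4])
print(sens, spec)  # prints: 0.75 1.0
```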

  9. An on-line method for the acquisition of medical information for computer processing as applied to radiotherapy.

    PubMed

    Möller, T R; Gustafsson, T

    1977-06-01

    Based on a structured medical record, specially designed for patients with malignant disease, an on-line data capture system has been developed. This enables the collection of virtually any type of information contained in the patient's case notes. The structure of the record is described, with actual examples. The record is typed on a typewriter terminal linked to a mini-computer. Data is recorded as code + heading + value string. The headings are identified automatically, and an internal code generated, describing the type of information. Record keeping according to the principles described was introduced in clinical routine at the department in 1971. Data collection was implemented later that year, using an off-line magnetic tape encoder (IBM MT72). The system has been developed further and converted to a versatile on-line system. The data base, collected with these systems, now contains data on about 20,000 patients. PMID:862391

  10. A pulse code modulation decommutator enhanced 'quick look' data reduction system

    NASA Astrophysics Data System (ADS)

    Black, D. G.; Woodworth, L. A.

    Modern Pulse Code Modulated instrumentation systems allow for the acquisition of large quantities of measurements at high data rates. The capability of signal monitoring to certify system operation before and during a test is essential. It is necessary to accomplish data reduction of these signals using computer-based systems that convert the raw binary data to an 'engineering unit' equivalent form. Decommutation and storage of the data samples must be accomplished prior to this conversion process. Retrieving and monitoring data from an existing system presented significant problems with available hardware, necessitating the development and construction of a versatile computer interface and data reduction system. This minicomputer-based system is capable of satisfying virtually all on-site 'quick-look' requirements.

  11. Wind tunnel evaluation of air-foil performance using simulated ice shapes

    NASA Technical Reports Server (NTRS)

    Bragg, M. B.; Zaguli, R. J.; Gregorek, G. M.

    1982-01-01

    A two-phase wind tunnel test was conducted in the 6 by 9 foot Icing Research Tunnel (IRT) at NASA Lewis Research Center to evaluate the effect of ice on the performance of a full scale general aviation wing. In the first IRT tests, rime and glaze ice shapes were carefully documented as functions of angle of attack and free stream conditions. Next, simulated ice shapes were constructed for two rime and two glaze shapes and used in the second IRT tunnel entry. The ice shapes and the clean airfoil were tapped to obtain surface pressures, and a probe was used to measure the wake characteristics. These data were recorded and processed, on-line, with a minicomputer/digital data acquisition system. The effects of both rime and glaze ice on the pressure distribution, Cl, Cd, and Cm are presented.
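As an illustration of the kind of on-line processing involved, the section lift coefficient can be estimated from tap pressures by trapezoidal integration of the pressure-coefficient difference over the chord. This is a small-angle sketch of the standard textbook relation, not the report's actual code:

```python
def lift_coefficient(x_c, cp_lower, cp_upper):
    """Approximate Cl by trapezoidal integration of the pressure
    difference over the chord: Cl ~ integral of (Cp_lower - Cp_upper)
    with respect to x/c (small angle-of-attack approximation)."""
    cl = 0.0
    for i in range(len(x_c) - 1):
        dcp0 = cp_lower[i] - cp_upper[i]
        dcp1 = cp_lower[i + 1] - cp_upper[i + 1]
        cl += 0.5 * (dcp0 + dcp1) * (x_c[i + 1] - x_c[i])
    return cl
```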

  12. Tracing technology in the Association of Academic Health Sciences Libraries

    PubMed Central

    Guard, J. Roger; Peay, Wayne J.

    2003-01-01

    From the beginning of the association, technology and the Association of Academic Health Sciences Libraries (AAHSL) have been intertwined. Technology was the focus of one of the first committees. Innovative applications of technology have been employed in the operations of the association. Early applications of mini-computers were used in preparing the Annual Statistics. The association's use of network communications was among the first in the country and later applications of the Web have enhanced association services. For its members, technology has transformed libraries. The association's support of the early development of Integrated Advanced Information Management Systems (IAIMS) and of its recent reconceptualization has contributed to the intellectual foundation for this revolution. PMID:12883580

  13. ANNIE - INTERACTIVE PROCESSING OF DATA BASES FOR HYDROLOGIC MODELS.

    USGS Publications Warehouse

    Lumb, Alan M.; Kittle, John L.

    1985-01-01

    ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.

  14. Microcomputer computation of water quality discharges

    USGS Publications Warehouse

    Helsel, Dennis R.

    1983-01-01

    A fully prompted program (SEDQ) has been developed to calculate daily and instantaneous water quality (QW) discharges. It is written in a version of BASIC, and requires inputs of gage heights, discharge rating curve, shifts, and water quality concentration information. Concentration plots may be modified interactively using the display screen. Semi-logarithmic plots of concentration and water quality discharge are output to the display screen, and optionally to plotters. A summary table of data is also output. SEDQ could be a model program for micro and minicomputer systems likely to be in use within the Water Resources Division, USGS, in the near future. The daily discharge-weighted mean concentration is one output from SEDQ. It is defined in this report, differentiated from the currently used mean concentration, and designated the 'equivalent concentration.' (USGS)
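The discharge-weighted mean ('equivalent') concentration is, in essence, the flow-weighted average sum(Q_i*C_i)/sum(Q_i), as opposed to a simple time average of the concentrations. A minimal sketch of the distinction:

```python
def discharge_weighted_mean(discharges, concentrations):
    """Discharge-weighted ('equivalent') concentration:
    sum(Q_i * C_i) / sum(Q_i)."""
    total_load = sum(q * c for q, c in zip(discharges, concentrations))
    return total_load / sum(discharges)

def simple_mean(concentrations):
    """Plain time average of the concentrations, for contrast."""
    return sum(concentrations) / len(concentrations)
```

With discharges [10, 30] and concentrations [2, 4], the weighted mean is 3.5 while the simple mean is 3.0; high-flow samples dominate the weighted value.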

  15. Personal computer applications in DIII-D neutral beam operation

    SciTech Connect

    Glad, A.S.

    1986-08-01

    An IBM PC AT has been implemented to improve operation of the DIII-D neutral beams. The PC system provides centralization of all beam data with reasonable access for on-line shot-to-shot control and analysis. The PC hardware was configured to interface all four neutral beam host minicomputers, support multitasking, and provide storage for approximately one month's accumulation of beam data. The PC software is composed of commercial packages used for performance and statistical analysis (i.e., LOTUS 123, PC PLOT, etc.), host communications software (i.e., PCLink, KERMIT, etc.), and applications developed software utilizing f-smcapso-smcapsr-smcapst-smcapsr-smcapsa-smcapsn-smcaps and b-smcapsa-smcapss-smcapsIc-smcaps. The objectives of this paper are to describe the implementation of the PC system, the methods of integrating the various software packages, and the scenario for on-line control and analysis.

  16. KEK NODAL system

    SciTech Connect

    Kurokawa, S.; Abe, K.; Akiyama, A.; Katoh, T.; Kikutani, E.; Koiso, H.; Kurihara, N.; Oide, K.; Shinomoto, M.

    1985-10-01

    The KEK NODAL system, which is based on the NODAL devised at the CERN SPS, works on an optical-fiber token ring network of twenty-four minicomputers (Hitachi HIDIC 80's) to control the TRISTAN accelerator complex, now being constructed at KEK. KEK NODAL retains main features of the original NODAL: the interpreting scheme, the multi-computer programming facility, and the data-module concept. In addition, it has the following characteristics: fast execution due to the compiler-interpreter method, a multicomputer file system, a full-screen editing facility, and a dynamic linkage scheme of data modules and NODAL functions. The structure of the KEK NODAL system under PMS, a real-time multitasking operating system of HIDIC 80, is described; the NODAL file system is also explained.

  17. Development of a multiplane multispeed balancing system for turbine systems

    NASA Technical Reports Server (NTRS)

    Martin, M. R.

    1984-01-01

    A prototype high speed balancing system was developed for assembled gas turbine engine modules. The system permits fully assembled gas turbine modules to be operated and balanced at selected speeds up to full turbine speed. The balancing system is a complete stand-alone system providing all necessary lubrication and support hardware for full speed operation. A variable speed motor provides the drive power. A drive belt and gearbox provide rotational speeds up to 21,000 rpm inside a vacuum chamber. The heart of the system is a dedicated minicomputer with attendant data acquisition, storage and I/O devices. The computer is programmed to be completely interactive with the operator. The system was installed at CCAD and evaluated by testing 20 T55 power turbines and 20 T53 power turbines. Engine test results verified the performance of the high speed balanced turbines.
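The abstract does not state the balancing algorithm; the standard technique for multiplane, multispeed balancing is the influence-coefficient method, in which measured vibration phasors v and trial-weight responses A give correction weights w = -A⁻¹v. A two-plane sketch under that assumption (vibration readings are complex numbers encoding amplitude and phase):

```python
# Influence-coefficient balancing, two planes / two sensors.
# The 2x2 complex matrix A holds the measured response per unit trial
# weight; baseline v is the as-found vibration. All values illustrative.

def solve2x2(A, b):
    """Cramer's rule for a 2x2 complex system A w = b."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    w0 = (b[0] * A[1][1] - b[1] * A[0][1]) / det
    w1 = (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return [w0, w1]

def correction_weights(influence, baseline):
    """Weights w such that baseline + influence @ w = 0,
    i.e. w = -influence^-1 @ baseline."""
    return [-w for w in solve2x2(influence, baseline)]
```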

  18. Laboratory procedures used in the hot corrosion project

    SciTech Connect

    Jeys, T.R.

    1980-04-08

    The objective of the Hot Corrosion Project in the LLNL Metals and Ceramics Division is to study the physical and chemical mechanisms of corrosion of nickel, iron, and some of their alloys when these metals are subjected to oxidizing or sulfidizing environments at temperatures between 850 and 950°C. To obtain meaningful data in this study, we must rigidly control many parameters. These parameters, and the methods chosen to control them in this laboratory, are discussed. Some of the mechanics and manipulative procedures that are specifically related to data access and repeatability are covered. The method of recording and processing the data from each experiment using an LSI-11 minicomputer is described. The analytical procedures used to evaluate the specimens after the corrosion tests are enumerated and discussed.

  19. Foqus: a FORTRAN program for the quantitative analysis of x-ray spectra from thin biological specimens.

    PubMed

    Fuchs, H; Fuchs, W

    1981-01-01

    An online FORTRAN program for the quantitative analysis of energy dispersive X-ray spectra from thin biological specimens is presented. The methods of background suppression by digital filtering and peak deconvolution by linear least-squares fitting with measured peak profiles are used. The continuum quantitation method for spectra from thin biological sections as proposed by Hall is applied. The performance of the computer program, utilizing the facilities of a disk operating system, is demonstrated. The routines were optimized for speed, resulting in a run-time of less than 5 seconds on a 16 bit minicomputer for a full quantitation for 7 elements of an energy dispersive thin section X-ray spectrum, including an optional absorption correction. Since no assembly language subroutines are implemented, the restrictions for the use of the program with different computer systems are minimized.
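Background suppression by digital filtering is commonly done with a zero-area "top-hat" filter: a positive central lobe flanked by negative side lobes whose coefficients sum to zero, so that convolution annihilates a slowly varying (locally linear) continuum while passing narrow peaks. The sketch below illustrates the idea with invented parameters, not Foqus's actual coefficients:

```python
# Zero-area top-hat filter for continuum suppression in an
# energy-dispersive spectrum. Widths here are illustrative only.

def top_hat_filter(center_width, side_width):
    """Positive central lobe, negative side lobes; coefficients sum
    to zero so a linear background filters to ~0."""
    side = [-0.5 * center_width / side_width] * side_width
    return side + [1.0] * center_width + side

def convolve(spectrum, kernel):
    """Valid-region convolution; edges are left at zero."""
    half = len(kernel) // 2
    out = [0.0] * len(spectrum)
    for i in range(half, len(spectrum) - half):
        out[i] = sum(k * spectrum[i - half + j] for j, k in enumerate(kernel))
    return out
```

Applied to a linear ramp the filtered output is zero in the interior, while an isolated peak channel passes through at full height.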

  20. FINDS: A fault inferring nonlinear detection system. User's guide

    NASA Technical Reports Server (NTRS)

    Lancraft, R. E.; Caglayan, A. K.

    1983-01-01

    The computer program FINDS is written in FORTRAN-77, and is intended for operation on a VAX 11-780 or 11-750 super minicomputer, using the VMS operating system. The program detects, isolates, and compensates for failures in navigation aid instruments and onboard flight control and navigation sensors of a Terminal Configured Vehicle aircraft in a Microwave Landing System environment. In addition, FINDS provides sensor fault tolerant estimates for the aircraft states which are then used by an automatic guidance and control system to land the aircraft along a prescribed path. FINDS monitors for failures by evaluating all sensor outputs simultaneously using the nonlinear analytic relationships between the various sensor outputs arising from the aircraft point mass equations of motion. Hence, FINDS is an integrated sensor failure detection and isolation system.

  1. Selecting a labor information system. What to ask, what to avoid.

    PubMed

    Garcia, L

    1990-12-01

    Payroll expenses may account for over half of all of a hospital's expenses. Manual time card processing requires an abundance of staff time and can often result in costly errors. To alleviate this problem, many healthcare facilities are implementing computerized labor information systems. To minimize the risk of selecting the wrong system, hospital administrators should ask the following questions before committing to any computerized labor information system: Is the software designed for hospital use and easily adaptable to each hospital's unique policies? How flexible is the software's reporting system? Does it include automatic scheduling that creates generic schedules? Does the system have the capability of securing time and attendance records and documenting the audit trail? Does the system include an accurate and reliable badge reader? What type of hardware is best for the particular hospital--microcomputer, minicomputer, or mainframe? Finally, to guarantee successful software installation, the vendor should have extensive experience and documentation in the system's implementation. PMID:10108009

  2. Instrumentation for controlling and monitoring environmental control and life support systems

    NASA Technical Reports Server (NTRS)

    Yang, P. Y.; Gyorki, J. R.; Wynveen, R. A.

    1978-01-01

    Advanced instrumentation concepts for improving the performance of manned spacecraft Environmental Control and Life Support Systems (EC/LSS) have been developed at Life Systems, Inc. The differences in specific EC/LSS instrumentation requirements and hardware during the transition from exploratory development to flight production stages are discussed. Details of prior control and monitor instrumentation designs are reviewed and an advanced design is presented. The latter features a minicomputer-based approach having the flexibility to meet process hardware test programs and the capability to be refined to include the control dynamics and fault diagnostics needed in future flight systems, where long duration, reliable operation requires in-flight hardware maintenance. The emphasis is on lowering EC/LSS hardware life cycle costs through simplicity in instrumentation and by using it to save crew time during flight operation.

  3. DCN/SEEDIS: the Distributed Computer Network (DCN) and Socio-Economic-Environmental Demographic Information System (SEEDIS). An introduction to the Distributed Computer Network

    SciTech Connect

    Sventek, V.A.

    1982-09-01

    This introduction was designed to serve as support documentation for a five-day course presented to DOL/ETA at Regional Offices by LBL staff. At these presentations, new users of the DCN receive instruction on the basic components of the VAX 11/780 minicomputers of which the DCN is comprised, including VMS (the VAX 11/780 Operating System), use of interactive terminals with VMS, an overview of the VMS directory structure, introduction to a text editor, and an introduction to Datatrieve (a data entry and retrieval system developed by Digital Equipment Corporation). Specific topics presented include: use of the terminal keyboard; logging on to VMS (the VAX 11/780 operating system); VMS directory structure; files manipulation; and introduction to Datatrieve.

  4. New developments in personal computer software for accelerator simulation and analysis

    NASA Astrophysics Data System (ADS)

    Gillespie, George H.; Orthel, John L.

    1993-09-01

    The increasing power of personal computers is offering accelerator designers new options for meeting their computational requirements. Standalone and highly portable machines provide accelerator scientists with different approaches to solving problems traditionally relegated to centralized mainframe, mini-computer or networked workstation environments. Advances in user interfaces, which have provided enhanced productivity for many business and technical applications, are now being implemented for accelerator design and analysis codes. We have developed new software packages for the Macintosh personal computer platform in this vein and discuss two of them here. For use with existing FORTRAN design and analysis codes, a unique graphical user interface (GUI) has been developed. The second package is the Numerical Electrodynamics Laboratory (NEDlab), a new two-dimensional (cylindrical or Cartesian) particle and field simulation program.

  5. Recognition of movement object collision

    NASA Astrophysics Data System (ADS)

    Chang, Hsiao Tsu; Sun, Geng-tian; Zhang, Yan

    1991-03-01

    The paper explores the collision recognition of two objects in both crisscross and revolution motions. A mathematical model has been established based on continuation theory. Objects of any shape may be regarded as being built of many 3-simplexes or their convex hulls. Therefore the collision problem of two objects in motion can be reduced to the collision of two corresponding 3-simplexes on the two respective objects. Thus an optimized algorithm is developed for collision avoidance which is suitable for computer control and eliminates the need for vision aid. With this algorithm, computation time has been reduced significantly. The algorithm is applicable to the path planning of mobile robots, and also to collision avoidance for anthropomorphic arms grasping two complicated shaped objects. The algorithm is realized in LISP on a VAX 8350 minicomputer.
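The paper's simplex-based algorithm is not reproduced here; as an illustrative stand-in for a convex collision test, a separating-axis check between two 2-D convex polygons captures the basic idea of deciding overlap between convex pieces:

```python
# Separating-axis test (SAT) for two convex polygons, given as lists
# of (x, y) vertices in order. Two convex shapes are disjoint iff some
# edge normal of one of them separates their projections.

def project(poly, axis):
    dots = [x * axis[0] + y * axis[1] for x, y in poly]
    return min(dots), max(dots)

def edge_normals(poly):
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        yield (y1 - y0, -(x1 - x0))   # normal perpendicular to the edge

def convex_collide(a, b):
    """True iff convex polygons a and b overlap (no separating axis)."""
    for axis in list(edge_normals(a)) + list(edge_normals(b)):
        amin, amax = project(a, axis)
        bmin, bmax = project(b, axis)
        if amax < bmin or bmax < amin:
            return False              # found a separating axis
    return True
```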

  6. Advances in systems for interactive processing and display of meteorological data

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.

    1983-01-01

    Advances in systems for interactive processing and display of meteorological data are reviewed, with particular attention given to developments in hardware and software, meteorological data base, analysis and display, and systems availability. These developments include inexpensive minicomputers which give the user almost instantaneous results for many types of jobs; image terminals with the capability to enhance, quantify, animate, and compare image and graphical data; accessibility of a large meteorological data base and the capability of merging different types of data; and sophisticated analysis and multidimensional display techniques. Critical problems still to be solved include getting quick access to historical and real time data bases from any system and making it easy to transport software from one system to another.

  7. An application of the Multi-Purpose System Simulation /MPSS/ model to the Monitor and Control Display System /MACDS/ at the National Aeronautics and Space Administration /NASA/ Goddard Space Flight Center /GSFC/

    NASA Technical Reports Server (NTRS)

    Mill, F. W.; Krebs, G. N.; Strauss, E. S.

    1976-01-01

    The Multi-Purpose System Simulator (MPSS) model was used to investigate the current and projected performance of the Monitor and Control Display System (MACDS) at the Goddard Space Flight Center in processing and displaying launch data adequately. MACDS consists of two interconnected mini-computers with associated terminal input and display output equipment and a disk-stored data base. Three configurations of MACDS were evaluated via MPSS and their performances ascertained. First, the current version of MACDS was found inadequate to handle projected launch data loads because of unacceptable data backlogging. Second, the current MACDS hardware with enhanced software was capable of handling two times the anticipated data loads. Third, an up-graded hardware ensemble combined with the enhanced software was capable of handling four times the anticipated data loads.

  8. Computer graphics application in the engineering design integration system

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.

    1975-01-01

    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicles preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct coupled low cost storage tube terminals with limited interactive capabilities, and a minicomputer based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer aided design.

  9. AOIPS data base management systems support for GARP data sets

    NASA Technical Reports Server (NTRS)

    Gary, J. P.

    1977-01-01

    A data base management system is identified, developed to provide flexible access to data sets produced by GARP during its data systems tests. The content and coverage of the data base are defined and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user-specified data subsets, is described. The computer programs developed to provide the capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user-interactive but menu-guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user selected window defined by time and latitude/longitude boundaries. The DRS permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.
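The window-based subsetting DRS performs can be sketched as a simple filter over time and latitude/longitude bounds. The record fields below are assumptions for illustration, not the actual AOIPS tape format:

```python
# Keep only records inside a user-selected window defined by time and
# latitude/longitude boundaries (field names are hypothetical).

def in_window(rec, t0, t1, lat0, lat1, lon0, lon1):
    return (t0 <= rec["time"] <= t1
            and lat0 <= rec["lat"] <= lat1
            and lon0 <= rec["lon"] <= lon1)

def subset(records, t0, t1, lat0, lat1, lon0, lon1):
    """Return the records falling inside the selected window."""
    return [r for r in records if in_window(r, t0, t1, lat0, lat1, lon0, lon1)]
```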

  10. Automatic volume calibration system

    SciTech Connect

    Gates, A.J.; Aaron, C.C.

    1985-05-06

    The Automatic Volume Calibration System presently consists of three independent volume-measurement subsystems and can possibly be expanded to five subsystems. When completed, the system will manually or automatically perform the sequence of valve-control and data-acquisition operations required to measure given volumes. An LSI-11 minicomputer controls the vacuum and pressure sources and controls solenoid control valves to open and close various volumes. The input data are obtained from numerous displacement, temperature, and pressure sensors read by the LSI-11. The LSI-11 calculates the unknown volume from the data acquired during the sequence of valve operations. The results, based on the Ideal Gas Law, also provide information for feedback and control. This paper describes the volume calibration system, its subsystems, and the integration of the various instrumentation used in the system's design and development. 11 refs., 13 figs., 4 tabs.
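The abstract does not give the exact computation; a common Ideal-Gas-Law scheme for such calibration systems is isothermal expansion from a known reference volume, where P1·Vref = P2·(Vref + Vunk) yields the unknown volume. A sketch under that assumption:

```python
# Gas-expansion volume measurement (isothermal assumption):
# gas at p_initial in a known reference volume v_ref is expanded into
# the unknown volume; the settled pressure p_final gives
#   v_unknown = v_ref * (p_initial / p_final - 1).

def unknown_volume(v_ref, p_initial, p_final):
    """Unknown volume from an isothermal expansion out of v_ref."""
    return v_ref * (p_initial / p_final - 1.0)
```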

  11. Use of analytical mechanics in defining acoustic-test methodology

    NASA Astrophysics Data System (ADS)

    Emergy, A. F.; Thomas, G. H.

    1982-06-01

    One of the more pressing needs is the ability to measure the level of stress in materials, due either to active loading or to the residual strains caused by plastic deformation. Acoustic techniques have not been exploited on a routine testing basis for stress determination, not only because their use requires very sophisticated instrumentation, but also because they are difficult to automate and require laborious and delicate personal operation. However, the development of small, inexpensive micro- and minicomputers which can be dedicated to single tasks offers the possibility of designing acoustic nondestructive evaluation (ANDE) procedures which will be accurate and reasonably priced, and for which the computer may provide automated testing and data processing. A number of different ANDE measurements can be made, but the problem is to define their sensitivity and accuracy in order to choose one or more which are useful in a production sense.

  12. Spinal cord stimulators and radiotherapy: first case report and practice guidelines.

    PubMed

    Walsh, Lorraine; Guha, Daipayan; Purdie, Thomas G; Bedard, Philippe; Easson, Alexandra; Liu, Fei-Fei; Hodaie, Mojgan

    2011-01-01

    Spinal cord stimulators (SCS) are a well-recognised treatment modality in the management of a number of chronic neuropathic pain conditions, particularly failed back syndrome and radiculopathies. The implantable pulse generator (IPG) component of the SCS is designed and operates in a similar fashion to that of a cardiac pacemaker. The IPG consists of an electrical generator, lithium battery, transmitter/receiver and a minicomputer. When stimulated, it generates pulsed electrical signals which stimulate the dorsal columns of the spinal cord, thus alleviating pain. Analogous to a cardiac pacemaker, it can be potentially damaged by ionising radiation from a linear accelerator, in patients undergoing radiotherapy. Herein we report our clinical management of the first reported case of a patient requiring adjuvant breast radiotherapy who had a SCS in situ. We also provide useful practical recommendations on the management of this scenario within a radiation oncology department. PMID:22024340

  13. Trends in on-line data processing

    NASA Astrophysics Data System (ADS)

    Masetti, Massimo

    1981-04-01

    The development of integrated circuits has been characterized by an exponential growth in the number of gates on a single chip that will continue in the coming years. In parallel, the price per bit is dropping according to more or less the same law. As a consequence, a few statements can be made: the present 16-bit minicomputer in a small configuration is going to be replaced by a 16-bit microcomputer, and the 16-bit minicomputer in a powerful configuration by a 32-bit midicomputer that also has a virtual-memory facility; fully programmable or microcoded powerful devices, like the LASS hardware processor or MICE, will allow an efficient on-line filter; and higher computing speed can be achieved by a multiprocessor configuration which can be insensitive to hardware failures. Therefore we are moving towards an integrated on-line computing system with much higher computing power than now, and the present distinction between on-line and off-line will no longer be so sharp. As more processing can be performed on-line, fast, high-quality feedback can be provided for the experiment. The trend towards more processing power, at a lower price, assembled in the same hardware volume, will continue for at least five years; at the same time, the future large high-energy physics experiments at LEP will be carried out within wide international collaborations. In this environment, methods must be found for distributing a large fraction of the work amongst the collaborators. To accomplish this aim it is necessary to introduce common standard practices concerning both hardware and software, in such a way that the separate parts, developed by the collaborators, will be plug-compatible.

  14. The History of the Data Systems AutoChemist® (ACH) and AutoChemist-PRISMA (PRISMA®): from 1964 to 1986

    PubMed Central

    2014-01-01

    Summary Objectives This paper presents the history of the data system development steps (1964-1986) for the clinical analyzers AutoChemist®, and its successor AutoChemist PRISMA® (PRogrammable Individually Selective Modular Analyzer). The paper also partly recounts the history of the development of the minicomputer PDP 8 from Digital Equipment. The first PDP 8 had 4 core memory boards of 1 K each and was as large as a typical oven baking sheet; about 10 years later, the PDP 8 was a "one chip microcomputer" with a 32 K memory chip. The fast development of the PDP 8 came to have a strong influence on the development of the data system for AutoChemist. Five major releases of the software were made during this period (1-5 MIACH). Results The most important aims were not only to calculate the results, but also to be able to monitor their quality, automatically manage the orders, store the results in digital form for later statistical analysis, and distribute the results to the physician in charge of the patient using the same computer as the analyzer. Another result of the data system was the ability to customize AutoChemist to handle sample identification using bar codes and to tailor the presentation of results to different types of laboratories. Conclusions Digital Equipment launched the PDP 8 just as a new minicomputer was desperately needed. No other alternatives were known to be available at the time. This was to become a key success factor for AutoChemist. That AutoChemist, with its high capacity, required a computer for data collection was obvious already in the early 1960s. That computer development would be so rapid, and that one would be able to accomplish so much with a data system, was hardly suspected at the time. In total, 75 systems were delivered worldwide: 31 AutoChemist and 44 PRISMA. The last PRISMA was delivered in 1987 to the Veterans Hospital, Houston, TX, USA. PMID:24853032

  15. Evaluation of three electronic report processing systems for preparing hydrologic reports of the U.S Geological Survey, Water Resources Division

    USGS Publications Warehouse

    Stiltner, G.J.

    1990-01-01

    In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study included the use of the following configuration of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were to be addressed in the pilot studies: The combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication. Visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transformed to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphic Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report, substantial time is saved in producing subsequent reports because the format can be stored and re-used by electronic means as a template. Because of the more compact page layouts, costs of printing the reports were 15% to 25

  16. Evolution of the Mobile Information SysTem (MIST)

    NASA Technical Reports Server (NTRS)

    Litaker, Harry L., Jr.; Thompson, Shelby; Archer, Ronald D.

    2008-01-01

    The Mobile Information SysTem (MIST) had its origins in the need to determine whether commercial off the shelf (COTS) technologies could improve intravehicular activity (IVA) crew maintenance productivity on the International Space Station (ISS). It began with an exploration of head mounted displays (HMDs), but quickly evolved to include voice recognition, mobile personal computing, and data collection. The unique characteristic of the MIST lies within its mobility, in which a vest is worn that contains a mini-computer and supporting equipment, and a headband with attachments for a HMD, lipstick camera, and microphone. Data is then captured directly by the computer running Morae(TM) or similar software for analysis. To date, the MIST system has been tested in numerous environments such as two parabolic flights on NASA's C-9 microgravity aircraft and several mockup facilities ranging from ISS to the Altair Lunar Sortie Lander. Functional capabilities have included its lightweight and compact design, commonality across systems and environments, and usefulness in remote collaboration. Human Factors evaluations of the system have proven the MIST's ability to be worn for long durations of time (approximately four continuous hours) with no adverse physical deficits, moderate operator compensation, and low workload being reported as measured by the Corlett Bishop Discomfort Scale, Cooper-Harper Ratings, and the NASA Total Workload Index (TLX), respectively. Additionally, through development of the system, it has spawned several new applications useful in research. For example, by employing only the lipstick camera, microphone, and a compact digital video recorder (DVR), we created a portable, lightweight data collection device. Video is recorded from the participant's point of view (POV) through the use of the camera mounted on the side of the head. Both the video and the audio are recorded directly into the DVR located on a belt around the waist. This data is then transferred to

  17. The graphics and data acquisition software package

    NASA Technical Reports Server (NTRS)

    Crosier, W. G.

    1981-01-01

    A software package was developed for use with micro and minicomputers, particularly the LSI-11/PDP-11 series. The package has a number of Fortran-callable subroutines which perform a variety of frequently needed tasks for biomedical applications. All routines are well documented, flexible, easy to use and modify, and require minimal programmer knowledge of peripheral hardware. The package is also economical of memory and CPU time. A single subroutine call can perform any one of the following functions: (1) plot an array of integer values from sampled A/D data; (2) plot an array of Y values versus an array of X values; (3) draw horizontal and/or vertical grid lines of selectable type; (4) annotate grid lines with user units; (5) get coordinates of user controlled crosshairs from the terminal for interactive graphics; (6) sample any analog channel with program selectable gain; (7) wait a specified time interval; and (8) perform random access I/O of one or more blocks of a sequential disk file. Several miscellaneous functions are also provided.

  18. Rectification of terrain induced distortions in radar imagery

    NASA Technical Reports Server (NTRS)

    Kwok, Ronald; Curlander, John C.; Pang, Shirley S.

    1987-01-01

    This paper describes a technique to generate geocoded synthetic aperture radar (SAR) imagery corrected for terrain induced geometric distortions. This algorithm transforms the raw slant range image, generated by the signal processor, into a map registered product, resampled to either Universal Transverse Mercator (UTM) or Polar Stereographic projections, and corrected for foreshortening. The technique utilizes the space platform trajectory information in conjunction with a digital elevation map (DEM) of the target area to generate an ortho-radar map with near-autonomous operation. The current procedure requires only two to three tie-points to compensate for the platform position uncertainty that results in translational error between the image and the DEM. This approach is unique in that it does not require generation of a simulated radar image from the DEM or a grid of tie-points to characterize the image-to-map distortions. Rather, it models the inherent distortions based on knowledge of the radar data collection characteristics, the signal Doppler parameters, and the local terrain height to automatically predict the registration transformation. This algorithm has been implemented on a minicomputer system equipped with an array processor and a large random-access memory to optimize the throughput.
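
    The core of the foreshortening correction is the relationship between slant range and ground range at a given terrain height. A deliberately simplified, flat-earth sketch of that geometry (the actual algorithm also uses the platform trajectory and Doppler parameters; all numbers below are illustrative):

```python
import math

def ground_range(slant_range_m, platform_alt_m, terrain_height_m):
    # flat-earth right-triangle approximation: the vertical leg is the
    # platform height above the local terrain, the hypotenuse is the
    # measured slant range, and the horizontal leg is the ground range
    dz = platform_alt_m - terrain_height_m
    return math.sqrt(slant_range_m ** 2 - dz ** 2)

# the same slant-range echo maps to a different ground position over a hill,
# which is exactly the distortion the DEM is used to remove
flat = ground_range(850_000.0, 800_000.0, 0.0)
hill = ground_range(850_000.0, 800_000.0, 1_000.0)
```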

  19. The South African National Digital Seismological System (SANDSS), a dial-up telephone-linked network

    NASA Astrophysics Data System (ADS)

    Fernandez, L. M.; Otto, M. A.; Steyn, J.

    1992-08-01

    The use of automatic telephone lines on a dial-up basis, to connect modified personal computers (PCs) to a control centre's minicomputer, has been shown to be a reliable, inexpensive method of operating a large seismological network. Accurate time control is obtained by automatic telephone synchronisation of the real-time clocks of the stations. The network does not operate exactly in real time, but only in "quasi" real time. On a routine basis the data stored by the PCs are transmitted every night (when telephone rates are low) to the processing centre. In emergency cases, the data can be requested at any time by telephone. A set of parameters, such as gain and triggering algorithm constants, can be remotely controlled. Data are collected at a rate of 50 samples/s in 16-bit records and transmitted on an error-free basis at 2400 baud. During a 2-month test period, one station digitally recorded 85% of all events detected on a conventional analog seismogram, and 91% of regional and local events.
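
    The stated rates make it clear why the network transmits only stored event segments rather than continuous waveforms. A back-of-envelope budget, assuming 10 line bits per byte (8 data bits plus start and stop bits, an assumption about the serial framing):

```python
# telemetry budget for one continuously recorded channel of SANDSS-type data
SAMPLES_PER_S = 50
BYTES_PER_SAMPLE = 2           # 16-bit records
LINE_BYTES_PER_S = 2400 // 10  # 2400 baud, assumed 10 line bits per byte

bytes_per_day = SAMPLES_PER_S * BYTES_PER_SAMPLE * 86_400
hours_to_send_full_day = bytes_per_day / LINE_BYTES_PER_S / 3600
# a full day of continuous data would monopolize the phone line for ~10 h,
# so transmitting only triggered events is essential
```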

  20. In-phase and out-of-phase axial-torsional fatigue behavior of Haynes 188 at 760 C

    NASA Technical Reports Server (NTRS)

    Kalluri, Sreeramesh; Bonacuse, Peter J.

    1991-01-01

    Isothermal, in-phase and out-of-phase axial-torsional fatigue experiments have been conducted at 760 C on uniform gage section, thin-walled tubular specimens of a wrought cobalt-base superalloy, Haynes 188. Test control and data acquisition were accomplished with a minicomputer. Fatigue lives of the in- and out-of-phase axial-torsional fatigue tests have been estimated with four different multiaxial fatigue life prediction models that were developed primarily for predicting axial-torsional fatigue lives at room temperature. The models investigated were: (1) the von Mises equivalent strain range; (2) the Modified Multiaxiality Factor Approach; (3) the Modified Smith-Watson-Topper Parameter; and (4) the critical shear plane method of Fatemi, Socie, and Kurath. In general, life predictions by the von Mises equivalent strain range model and the Modified Multiaxiality Factor Approach were within a factor of 2 for a majority of the tests, while predictions of the Modified Smith-Watson-Topper Parameter and of the critical shear plane method were unconservative and conservative, respectively, by up to a factor of 4. In some of the specimens tested under combined axial-torsional loading, fatigue cracks initiated near extensometer indentations. Two design modifications to the thin-walled tubular specimen have been proposed to overcome this problem.
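
    Model (1) has a compact closed form. For combined axial-torsional loading with axial strain range Δε and engineering shear strain range Δγ, the textbook von Mises equivalent strain range (assuming an effective Poisson's ratio of 0.5) is Δε_eq = sqrt(Δε² + Δγ²/3). A sketch of that formula, not the authors' code:

```python
import math

def von_mises_eq_strain_range(d_eps_axial, d_gamma):
    # textbook von Mises equivalent strain range for axial-torsional loading,
    # with the fully plastic (nu = 0.5) assumption folded into the 1/3 factor
    return math.sqrt(d_eps_axial ** 2 + d_gamma ** 2 / 3.0)

# equal 1% axial and 1% (engineering) shear strain ranges, for illustration
deq = von_mises_eq_strain_range(0.01, 0.01)
```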

  1. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    SciTech Connect

    Vega, J.; Mollinedo, A.; Lopez, A.; Pacios, L.

    1997-01-01

    The data acquisition system for TJ-II will consist of a central computer, containing the data base of the device, and a set of independent systems (personal computers, embedded systems, workstations, minicomputers, PLCs, and microprocessor systems, among others) controlling data collection and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (μs–ms) real-time requirements, soft (ms–s) real-time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller (PLC) devices used for diagnostic low-level control will be linked among themselves through a fast serial link, the RS485 Profibus standard. One VME crate, running the OS-9 real-time operating system, will be assigned as a gateway to connect the PLC-based systems with an Ethernet segment. © 1997 American Institute of Physics.

  2. Upgrading NASA/DOSE laser ranging system control computers

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.

    1993-01-01

    Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) program and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970s-vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of controller computers at relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are being designed around IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-windows/Motif; serial interfaces have been chosen. This design minimizes short- and long-term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage, with the first systems targeted for production in mid-1993.

  3. Computer measurement and representation of the heart in two and three dimensions

    NASA Technical Reports Server (NTRS)

    Rasmussen, D.

    1976-01-01

    Methods for the measurement and display by minicomputer of cardiac images obtained from fluoroscopy to permit an accurate assessment of functional changes are discussed. Heart contours and discrete points can be digitized automatically or manually, with the recorded image in a video, cine, or print format. As each frame is digitized it is assigned a code name identifying the data source, experiment, run, view, and frame, and the images are filed for future reference in any sequence. Two views taken at the same point in the heart cycle are used to compute the spatial position of the ventricle apex and the midpoint of the aortic valve. The remainder of the points on the chamber border are corrected for the linear distortion of the X-rays by projection to a plane containing the chord between the apex and the aortic valve center and oriented so that lines perpendicular to the chord are parallel to the image intensifier face. The image of the chamber surface is obtained by generating circular cross sections with diameters perpendicular to the major chord. The transformed two- and three-dimensional imagery can be displayed in either static or animated form using a graphics terminal.
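
    The final step, generating circular cross sections with diameters perpendicular to the major chord, effectively models the chamber as a solid of revolution, so its volume can be summed from disks. A minimal sketch of that summation (disk integration is an assumption for illustration; the paper itself only describes generating the surface image):

```python
import math

def chamber_volume(diameters, dx):
    # sum circular cross-sectional disks of the given diameters, spaced dx
    # apart along the major chord (apex to aortic-valve midpoint); units
    # are arbitrary but must be consistent
    return sum(math.pi * (d / 2.0) ** 2 * dx for d in diameters)

# sanity check: three unit-spaced disks of diameter 2 approximate a
# cylinder of radius 1 and length 3, i.e. volume 3 * pi
v = chamber_volume([2.0, 2.0, 2.0], 1.0)
```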

  4. Software used with the flux mapper at the solar parabolic dish test site

    NASA Technical Reports Server (NTRS)

    Miyazono, C.

    1984-01-01

    Software for data archiving and data display was developed on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high-radiant-energy scanning device designed to measure the radiant flux expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site, with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.

  5. Mobile gamma-ray scanning system for detecting radiation anomalies associated with ²²⁶Ra-bearing materials

    SciTech Connect

    Myrick, T.E.; Blair, M.S.; Doane, R.W.; Goldsmith, W.A.

    1982-11-01

    A mobile gamma-ray scanning system has been developed by Oak Ridge National Laboratory for use in the Department of Energy's remedial action survey programs. The unit consists of a NaI(Tl) detection system housed in a specially equipped van. The system is operator controlled through an on-board mini-computer, with data output provided on the computer video screen, strip chart recorders, and an on-line printer. Data storage is provided by a floppy disk system. Multichannel analysis capabilities are included for qualitative radionuclide identification. A ²²⁶Ra-specific algorithm is employed to identify locations containing residual radium-bearing materials. This report presents the details of the system description, software development, and scanning methods utilized with the ORNL system. Laboratory calibration and field testing have established the system sensitivity, field of view, and other performance characteristics, the results of which are also presented. Documentation of the instrumentation and computer programs is included.

  6. A C Language Implementation of the SRO (Murdock) Detector/Analyzer

    USGS Publications Warehouse

    Murdock, James N.; Halbert, Scott E.

    1991-01-01

    A signal detector and analyzer algorithm was described by Murdock and Hutt in 1983. The algorithm emulates the performance of a human interpreter of seismograms. It estimates the signal onset, the direction of onset (positive or negative), the quality of these determinations, the period and amplitude of the signal, and the background noise at the time of the signal. The algorithm has been coded in C language for implementation as a 'blackbox' for data similar to that of the China Digital Seismic Network. A driver for the algorithm is included, as are suggestions for other drivers. In all of these routines, plus several FIR filters that are included as well, floating point operations are not required. Multichannel operation is supported. Although the primary use of the code has been for in-house processing of broadband and short period data of the China Digital Seismic Network, provisions have been made to process the long period and very long period data of that system as well. The code for the in-house detector, which runs on a mini-computer, is very similar to that of the field system, which runs on a microprocessor. The code is documented.
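
    The abstract does not give the detector's internals, but the class of algorithm it belongs to, an onset detector that avoids floating-point arithmetic, can be illustrated with a generic short-term/long-term average (STA/LTA) trigger in pure integer arithmetic. This is not the Murdock-Hutt algorithm itself; the window lengths and threshold ratio below are assumptions:

```python
def sta_lta_trigger(samples, sta_n=5, lta_n=25, ratio=4):
    # return the index of the first sample where the short-term average of
    # |x| exceeds ratio times the long-term average, else -1; the division-
    # free comparison keeps everything in integers, as in the SRO code style
    for i in range(lta_n, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_n:i])
        lta = sum(abs(s) for s in samples[i - lta_n:i])
        # integer form of: sta/sta_n > ratio * (lta/lta_n)
        if sta * lta_n > ratio * lta * sta_n:
            return i
    return -1

quiet = [1, -1] * 20                       # low-amplitude background
onset = quiet + [100, -90, 80, -70, 60]    # impulsive arrival
```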

  7. Design and implementation of the Interactive Communications Simulator (ICS)

    NASA Astrophysics Data System (ADS)

    Modestino, J. W.; Matis, K. R.; Jung, K. Y.; Vickers, A. L.

    1981-04-01

    This report describes the development and capabilities of a fairly comprehensive system for the digital simulation of a wide variety of point-to-point digital communication systems, called the Interactive Communications Simulator (ICS). The ICS is a flexible, graphics-oriented, high-speed, and highly interactive hardware/software system consisting of a PDP-11/40 minicomputer acting as host to a fast peripheral array processor, the Floating Point Systems AP-120B. Its modeling structure is the classical breakout of the various generic signal processing functions in any communication link, from source to sink. The signal processing functions are predominantly modeled in terms of their complex envelope representations for uniformity, ease, and accuracy of analysis. It is fully supported by an extensive graphics support package and many powerful analysis subroutines, to facilitate user interaction, analyses, and output displays. The modeling and simulation tasks are optimally partitioned between the PDP-11/40 host and the AP-120B peripheral array processor to ensure ease of use and highly efficient manipulation. The ICS also features realistic channel models, in addition to the analytically expedient Additive White Gaussian Noise (AWGN) channel, so that the performance and behavior of all modeled transceiver functions can be more exactly assessed and specified. All simulation modules are written in the AP assembly language, and the system software, graphics support package, and analysis subroutines are written in DEC FORTRAN IV.

  8. Design outline for a new multiman ATC simulation facility at NASA-Ames Research Center

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Gallagher, O.

    1977-01-01

    A new and unique facility for studying human factors aspects of aeronautics is being planned for use in the Man-Vehicle Systems Research Division at the NASA-Ames Research Center. This facility will replace the existing three-cockpit, single-ground-controller station and be expandable to approximately seven cockpits and two ground controller stations. Unlike the previous system, each cockpit will be minicomputer-centered and linked to a main CPU to form a distributed computation facility. Each simulator will compute its own flight dynamics and flight-path predictor. Mechanical flight instruments in each cockpit will be locally supported, while CRT cockpit displays of, for example, traffic and/or RNAV information will be centrally computed and distributed as a means of extending the existing computational and graphical resources. An outline of the total design is presented which addresses the technical design options and research possibilities of this unique man-machine facility and which may also serve as a model for other real-time distributed simulation facilities.

  9. High Frequency Sampling of TTL Pulses on a Raspberry Pi for Diffuse Correlation Spectroscopy Applications.

    PubMed

    Tivnan, Matthew; Gurjar, Rajan; Wolf, David E; Vishwanath, Karthik

    2015-08-12

    Diffuse Correlation Spectroscopy (DCS) is a well-established optical technique that has been used for non-invasive measurement of blood flow in tissues. Instrumentation for DCS includes a correlation device that computes the temporal intensity autocorrelation of a coherent laser source after it has undergone diffuse scattering through a turbid medium. Typically, the signal acquisition and its autocorrelation are performed by a correlation board. These boards have dedicated hardware to acquire and compute intensity autocorrelations of rapidly varying input signal and usually are quite expensive. Here we show that a Raspberry Pi minicomputer can acquire and store a rapidly varying time-signal with high fidelity. We show that this signal collected by a Raspberry Pi device can be processed numerically to yield intensity autocorrelations well suited for DCS applications. DCS measurements made using the Raspberry Pi device were compared to those acquired using a commercial hardware autocorrelation board to investigate the stability, performance, and accuracy of the data acquired in controlled experiments. This paper represents a first step toward lowering the instrumentation cost of a DCS system and may offer the potential to make DCS become more widely used in biomedical applications.
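
    The quantity the correlator board produces, and which the Raspberry Pi reproduces numerically, is the normalized intensity autocorrelation g2(τ) = ⟨I(t)I(t+τ)⟩ / ⟨I⟩². A minimal sketch using linear (rather than the usual multi-tau) binning, with simulated uncorrelated counts standing in for real photon data:

```python
import random, statistics

def g2(counts, max_lag):
    # normalized intensity autocorrelation of a binned photon-count series:
    # <I(t) * I(t + lag)> over all overlapping pairs, divided by <I>^2
    mean_sq = statistics.fmean(counts) ** 2
    out = []
    for lag in range(max_lag):
        pairs = zip(counts, counts[lag:])
        out.append(statistics.fmean(a * b for a, b in pairs) / mean_sq)
    return out

random.seed(1)
counts = [random.randint(90, 110) for _ in range(10_000)]  # uncorrelated "light"
flat = g2(counts, 5)  # for uncorrelated intensity, g2(lag > 0) stays near 1
```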

  10. A five-collector system for the simultaneous measurement of argon isotope ratios in a static mass spectrometer

    USGS Publications Warehouse

    Stacey, J.S.; Sherrill, N.D.; Dalrymple, G.B.; Lanphere, M.A.; Carpenter, N.V.

    1981-01-01

    A system is described that utilizes five separate Faraday-cup collector assemblies, aligned along the focal plane of a mass spectrometer, to collect simultaneous argon ion beams at masses 36-40. Each collector has its own electrometer amplifier and analog-to-digital measuring channel, the outputs of which are processed by a minicomputer that also controls the mass spectrometer. The mass spectrometer utilizes a 90° sector magnetic analyzer with a radius of 23 cm, in which some degree of z-direction focussing is provided for all the ion beams by the fringe field of the magnet. Simultaneous measurement of the ion beams helps to eliminate mass-spectrometer memory as a significant source of measurement error during an analysis. Isotope ratios stabilize between 7 and 9 s after sample admission into the spectrometer, and thereafter changes in the measured ratios are linear, typically to within ±0.02%. Thus the multi-collector arrangement permits very short extrapolation times for computation of initial ratios, and also provides the advantages of simultaneous measurement of the ion currents in that errors due to variations in ion beam intensity are minimized. A complete analysis takes less than 10 min, so that sample throughput can be greatly enhanced. In this instrument, the factor limiting analytical precision now lies in short-term apparent variations in the interchannel calibration factors. © 1981.
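
    The "very short extrapolation times" refer to projecting the measured ratios back to the moment of sample admission: once the ratios evolve linearly (7-9 s after admission), a least-squares line through the measured points gives the initial ratio at t = 0. A sketch with invented data values for illustration:

```python
def linear_extrapolate_to_zero(ts, ys):
    # ordinary least-squares line through (ts, ys); the intercept is the
    # extrapolated ratio at the sample-admission time t = 0
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return my - slope * mt

ts = [9, 12, 15, 18, 21]                   # seconds after sample admission
ys = [295.8, 295.9, 296.0, 296.1, 296.2]   # slowly drifting measured ratio
r0 = linear_extrapolate_to_zero(ts, ys)    # initial ratio at t = 0
```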

  11. Development and evaluation of an automated reflectance microscope system for the petrographic characterization of bituminous coals

    SciTech Connect

    Hoover, D. S.; Davis, A.

    1980-10-01

    The development of automated coal petrographic techniques will lessen the demands on skilled personnel to do routine work. This project is concerned with the development and successful testing of an instrument which will meet these needs. The fundamental differences in reflectance of the three primary maceral groups should enable their differentiation in an automated-reflectance frequency histogram (reflectogram). Consequently, reflected-light photometry was chosen as the method for automating coal petrographic analysis. Three generations of an automated system (called Rapid Scan Versions I, II, and III) were developed and evaluated for petrographic analysis. Their basic design was that of a reflected-light microscope photometer with an automatic stage, interfaced with a minicomputer. The hardware elements used in Rapid Scan Version I limited the system's flexibility and presented problems with signal digitization and measurement precision. Rapid Scan Version II was designed to incorporate a new microscope photometer and computer system. A digital stepping stage was incorporated into the Rapid Scan Version III system. The precision of reflectance determination of this system was found to be ±0.02 percent reflectance. The limiting factor in quantitative interpretation of Rapid Scan reflectograms is the resolution of reflectance populations of the individual maceral groups. Statistical testing indicated that reflectograms were highly reproducible, and a new computer program, PETAN, was written to interpret the curves for vitrinite reflectance parameters and petrographic composition.

  12. Sarcomere length dispersion in single skeletal muscle fibers and fiber bundles.

    PubMed

    Paolini, P J; Sabbadini, R; Roos, K P; Baskin, R J

    1976-08-01

    Light diffraction patterns produced by single skeletal muscle fibers and small fiber bundles of Rana pipiens semitendinosus have been examined at rest and during tetanic contraction. The muscle diffraction patterns were recorded with a vidicon camera interfaced to a minicomputer. Digitized video output was analyzed on-line to determine mean sarcomere length, line intensity, and the distribution of sarcomere lengths. The occurrence of first-order line intensity and peak amplitude maxima at approximately 3.0 μm is interpreted in terms of simple scattering theory. Measurements made along the length of a single fiber reveal small variations in calculated mean sarcomere length (SD about 1.2%) and its percent dispersion (2.1% +/- 0.8%). Dispersion in small multifiber preparations increases approximately linearly with fiber number (about 0.2% per fiber) to a maximum of 8-10% in large bundles. Dispersion measurements based upon diffraction line analysis are comparable to SDs calculated from length distribution histograms obtained by light micrography of the fiber. First-order line intensity decreases by about 40% during tetanus; larger multifibered bundles exhibit substantial increases in sarcomere dispersion during contraction, but single fibers show no appreciable dispersion change. These results suggest the occurrence of asynchronous static or dynamic axial disordering of thick filaments, with a persistence in long-range order of sarcomere spacing during contraction in single fibers. PMID:1084766
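
    The link between diffraction line position and mean sarcomere length is the grating equation: the periodic striation pattern acts as a diffraction grating with spacing d = mλ / sin θ for diffraction order m. A sketch with illustrative values (the paper does not state the wavelength or angles used here):

```python
import math

def sarcomere_length(wavelength_nm, theta_deg, order=1):
    # grating equation d = m * lambda / sin(theta): the striated fiber acts
    # as a diffraction grating, so the measured line angle gives the mean
    # sarcomere spacing d
    return order * wavelength_nm / math.sin(math.radians(theta_deg))

# illustrative: a He-Ne laser (632.8 nm) first-order line near 17.5 degrees
# corresponds to a spacing of roughly 2.1 um
d_nm = sarcomere_length(632.8, 17.5)
```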

  13. A real-time electronic imaging system for solar X-ray observations from sounding rockets

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Ting, J. W.; Gerassimenko, M.

    1979-01-01

    A real-time imaging system for displaying the solar coronal soft X-ray emission, focussed by a grazing incidence telescope, is described. The design parameters of the system, which is to be used primarily as part of a real-time control system for a sounding rocket experiment, are identified. Their achievement with a system consisting of a microchannel plate, for the conversion of X-rays into visible light, and a slow-scan vidicon, for recording and transmission of the integrated images, is described in detail. The system has a quantum efficiency better than 8% above 8 Å, a dynamic range of 1000 coupled with sensitivity to single photoelectrons, and provides a spatial resolution of 15 arc seconds over a field of view of 40 × 40 arc minutes. The incident radiation is filtered to eliminate wavelengths longer than 100 Å. Each image contains 3.93 × 10⁵ bits of information and is transmitted to the ground, where it is processed by a mini-computer and displayed in real time on a standard TV monitor.

  14. Spent Fuel Test - Climax data acquisition system operations manual

    SciTech Connect

    Nyholm, R.A.

    1983-01-01

    The Spent Fuel Test-Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granite rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the US Department of Energy Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. The multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system (DAS) collects and processes data from more than 900 analog instruments. This report documents the software element of the LLNL developed SFT-C Data Acquisition System. It defines the operating system and hardware interface configurations, the special applications software and data structures, and support software.

  15. Spent fuel test. Climax data acquisition system integration report

    SciTech Connect

    Nyholm, R.A.; Brough, W.G.; Rector, N.L.

    1982-06-01

    The Spent Fuel Test - Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granitic rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. This multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system collects and processes data from more than 900 analog instruments. This report documents the design and functions of the hardware and software elements of the Data Acquisition System and describes the supporting facilities which include environmental enclosures, heating/air-conditioning/humidity systems, power distribution systems, fire suppression systems, remote terminal stations, telephone/modem communications, and workshop areas. 9 figures.

  16. The New CERN PS control system overview and status

    NASA Astrophysics Data System (ADS)

    Perriollat, F.; Serre, C.

    1994-12-01

    The CERN PS control system is being completely rejuvenated. The existing system, whose design options were frozen in 1978, uses 16-bit minicomputers as process computers and for conventional consoles and services. These are being replaced by the agreed CERN standard architecture, using UNIX workstations as the operator interface and VME-based processors or PC front ends running LynxOS. All CAMAC is essentially preserved. The project covers about five years and proceeds in steps of one year. Switchover takes place in the annual shutdown, early in each year. No interference with the machine operation schedule is tolerated, which implies that no extra machine stops are planned for controls. The first two steps have been completed and operate to the complete satisfaction of the users; the LPI (LEP Preinjector) machines have run since March 1992 and the Proton Linac was started in March 1993. The control system of the Lead Linac was commissioned during 1993 and 1994. The third step of the rejuvenation concerns the Booster machine and is under implementation now. The paper describes the architecture, the techniques used, the major components, and the experience gained up to the time of the conference.

  17. Determination of physical and chemical states of lubricants in concentrated contacts, part 1

    NASA Technical Reports Server (NTRS)

    Lauer, J. L.

    1979-01-01

    A Fourier emission infrared microspectrometer, set up on a vibration-proof optical table and interfaced to a dedicated minicomputer, was used to record infrared emission spectra from elastohydrodynamic (EHD) bearing contacts. Its range was extended to cover the entire mid-infrared from 2 to 15 μm. A series of experiments with 5P4E polyphenyl ether showed the existence of a temperature gradient through the lubricant in an EHD contact, perpendicular to the flow direction. The experiments also show marked polarization of some of the spectral bands, indicating molecular alignment. Alignment is less evident at high pressure than at low pressure. To account for this behavior, a model is suggested along the lines developed for the conformational changes observed in long-chain polymers subjected to increased pressure: to accommodate closer packing, molecules become kinked and curl up. Experiments with a traction fluid showed periodic changes of flow pattern associated with certain spectral changes. These observations will be studied further. A study by infrared attenuated total reflection spectrophotometry was undertaken to determine whether gamma irradiation would change polyethylene wear specimens. The results were negative.

  18. Electroencephalographic cartography. II. By means of statistical group studies-activation by visual attention.

    PubMed

    Etevenon, P; Tortrat, D; Benkelfat, C

    1985-01-01

    Ten male volunteers, right-handed, mean age 30.4 years, were recorded in four successive sequences: under 'eyes closed' conditions, right and then left hemisphere, followed by an 'eyes open' situation with visual attention fixed on a cartoon, right and then left hemisphere recordings. Each EEG recording was made simultaneously over 16 EEG channels for each hemisphere, according to a protocol previously described, with Fourier analysis and EEG mapping on a minicomputer (HP 5451 C, HP 1000). Each EEG recording was stored in a cartography data base, and 90 maps could be drawn from 10 spectral parameters applied to the raw EEG and 5 frequency bands. Permutation paired Fisher tests were applied to three main EEG parameters: mean centroid frequencies, RMS amplitudes in microvolts, and relative (%) amplitudes. Activation of the EEG in the 'eyes open' situation during visual fixation was found compared to the 'eyes closed' situation: decreasing dominant EEG frequency and low delta and theta mean frequencies, no change in mean alpha frequency, and increasing fast mean beta frequencies, together with a major decrease in theta, alpha, and beta 1 amplitudes and a concomitant increase in raw EEG, delta, and beta 2 amplitudes. Finally, the percent alpha amplitude was decreased while the other percent amplitudes were increased in the delta, theta, beta 1, and beta 2 frequency bands. Symmetry between hemispheres was observed in the 'eyes closed' situation. Averaged EEG maps between subjects illustrate these findings, especially relative (%) alpha amplitude maps and maps of coefficients of resonance of the alpha rhythm.
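
    The per-band spectral parameters (RMS amplitude in a delta, theta, alpha, or beta band) come from partitioning the Fourier spectrum into frequency bands. A minimal sketch using a plain O(N²) DFT for clarity (a real analysis would use an FFT; the sampling rate and test signal are illustrative, not taken from the paper):

```python
import math

def band_rms(signal, fs, lo, hi):
    # RMS amplitude contributed by DFT bins whose frequency lies in [lo, hi)
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f < hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            amp = 2 * math.sqrt(re * re + im * im) / n  # one-sided peak amplitude
            total += amp * amp / 2                      # RMS^2 of one sinusoid
    return math.sqrt(total)

fs = 128  # Hz (assumed)
alpha = [10 * math.sin(2 * math.pi * 10 * i / fs) for i in range(256)]  # 10 Hz
```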

  19. Biomedical applications of MWPCs for digital imaging of soft β⁻ emitters

    NASA Astrophysics Data System (ADS)

    Bellazzini, R.; Del Guerra, A.; Massai, M. M.; Spandre, G.

    1983-11-01

    We have built an experimental facility equipped with multiwire proportional chambers and a PDP 11/23 mini-computer for the digital imaging of two-dimensional ³H distributions in biological and medical applications. A spatial resolution of ~1.5 mm (FWHM), a sensitivity of 10⁻¹ Bq/cm², and an efficiency of ~10% with a uniformity of 4% have been measured with a MWPC working at atmospheric pressure with 2 mm anode pitch and cathode-coupled delay-line read-out. A second chamber with 1 mm anode pitch at 45° with respect to the cathode wires has been operated at 2 atm. In this case a spatial resolution of ~800 μm (FWHM) for ³H sources has been measured along both directions. The images obtained in biological and medical applications are presented, namely: (1) identification of human clones with defective repair of UV-induced damage, and (2) study of regional carbohydrate consumption in myocardial tissue.

  20. A UNIX interface to supercomputers

    SciTech Connect

    McBryan, O.A.

    1985-01-01

    We describe a convenient interface between UNIX-based workstations or minicomputers and supercomputers such as the CRAY series machines. Using this interface, the user can issue commands entirely on the UNIX system, with remote compilation, loading, and execution performed on the supercomputer. The interface is not a remote login interface; rather, the domains of various UNIX utilities such as compilers, archivers, and loaders are extended to include the CRAY. The user need know essentially nothing about the CRAY operating system, commands, or filename restrictions. Standard UNIX utilities will perform CRAY operations transparently. UNIX command names and arguments are mapped to corresponding CRAY equivalents, suitable options are selected as needed, UNIX directory-tree filenames are coerced to allowable CRAY names, and all source and output files are automatically transferred between the machines. The primary purpose of the software is to allow the programmer to benefit from the interactive features of UNIX systems, including screen editors, software maintenance utilities such as make and SCCS, and, in general, the large set of UNIX text manipulation features. The interface was designed particularly to support development of very large multi-file programs, possibly consisting of hundreds of files and hundreds of thousands of lines of code. All CRAY source is kept on the workstation. We have found that, using the software, the complete program development phase for a large CRAY application may be performed entirely on a workstation.
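
    One of the interface's jobs, coercing UNIX directory-tree filenames to allowable CRAY names, can be sketched as a simple mapping. The rules below (flatten path separators, uppercase, keep the last 15 characters) are hypothetical illustrations of the idea, not the tool's documented behavior:

```python
def coerce_to_cray(path, max_len=15):
    # hypothetical coercion: flatten the directory tree into one flat name,
    # uppercase it, and truncate from the left so the distinctive tail of
    # the path survives; max_len is an assumed CRAY filename limit
    name = path.strip("/").replace("/", "_").replace(".", "_").upper()
    return name[-max_len:] if len(name) > max_len else name

short = coerce_to_cray("src/solver/main.f")  # "C_SOLVER_MAIN_F"
```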

  1. Puff-Plume Atmospheric Deposition Model.

    1992-06-24

    Version: 00 PFPL is an interactive transport and diffusion program developed for real-time calculation of the location and concentration of toxic or radioactive materials during an accidental release. Deposition calculations are included. The potential exists at the Savannah River Plant for releases of either toxic gases or radionuclides. The automated system developed to provide real-time information on the trajectory and concentration of an accidental release consists of meteorological towers, a minicomputer, and a network of terminals called the Weather Information and Display (WIND) System. PFPL, which simulates either instantaneous (puff) or continuous (plume) releases, is the primary code used at Savannah River for emergency response. Data files are provided for demonstration. The software for archiving the required on-line meteorological data is not included. Subroutines used for graphic display of results and operational control of the DEC VT100 and Tektronix terminals in the terminal network are included. Anyone wishing to use these routines must make appropriate modifications to the file TERMINALS.DAT. The DAT files provided were copied during the afternoon of December 28, 1983. Test runs attempting to use these files should specify release times on or before that date. Any user wishing to obtain numerical output only from the model, based on conditions in his locality, must supply appropriate wind data for the program.
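
    The kind of calculation a puff model performs can be illustrated with the textbook Gaussian puff equation. This is a minimal generic sketch, not PFPL's formulation: PFPL's dispersion coefficients, deposition terms, and meteorological handling are not reproduced here, and all numbers are illustrative.

```python
import math

# Textbook instantaneous (puff) Gaussian concentration, offered only to
# illustrate the class of calculation; not the PFPL code or its parameters.

def gaussian_puff(q, x, y, z, sx, sy, sz):
    """Concentration of an instantaneous release of mass q.

    (x, y, z): receptor offsets from the puff centre;
    sx, sy, sz: dispersion sigmas in each direction.
    """
    norm = q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    return norm * math.exp(-0.5 * ((x / sx) ** 2
                                   + (y / sy) ** 2
                                   + (z / sz) ** 2))

peak = gaussian_puff(1.0, 0.0, 0.0, 0.0, 50.0, 50.0, 20.0)     # puff centre
offaxis = gaussian_puff(1.0, 100.0, 0.0, 0.0, 50.0, 50.0, 20.0)
```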

  2. Peripheral processors for high-speed simulation. [helicopter cockpit simulator

    NASA Technical Reports Server (NTRS)

    Karplus, W. J.

    1977-01-01

    This paper describes some of the results of a study directed to the specification and procurement of a new cockpit simulator for an advanced class of helicopters. A part of the study was the definition of a challenging benchmark problem, and detailed analyses of it were made to assess the suitability of a variety of simulation techniques. The analyses showed that a particularly cost-effective approach to the attainment of adequate speed for this extremely demanding application is to employ a large minicomputer acting as host and controller for a special-purpose digital peripheral processor. Various realizations of such peripheral processors, all employing state-of-the-art electronic circuitry and a high degree of parallelism and pipelining, are available or under development. The types of peripheral processors - array processors, simulation-oriented processors, and arrays of processing elements - are analyzed and compared. These are particularly promising approaches that should be suitable for high-speed simulations of all kinds, the cockpit simulator being a case in point.

  3. An interferometric strain-displacement measurement system

    NASA Technical Reports Server (NTRS)

    Sharpe, William N., Jr.

    1989-01-01

    A system for measuring the relative in-plane displacement over a gage length as short as 100 micrometers is described. Two closely spaced indentations are placed in a reflective specimen surface with a Vickers microhardness tester. Interference fringes are generated when they are illuminated with a He-Ne laser. As the distance between the indentations expands or contracts with applied load, the fringes move. This motion is monitored with a minicomputer-controlled system using linear diode arrays as sensors. Characteristics of the system are: (1) gage length ranging from 50 to 500 micrometers, but 100 micrometers is typical; (2) least-count resolution of approximately 0.0025 micrometer; and (3) sampling rate of 13 points per second. In addition, the measurement technique is non-contacting and non-reinforcing. It is useful for strain measurements over small gage lengths and for crack opening displacement measurements near crack tips. This report is a detailed description of a new system recently installed in the Mechanisms of Materials Branch at the NASA Langley Research Center. The intent is to enable a prospective user to evaluate the applicability of the system to a particular problem and assemble one if needed.
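
    The fringe-to-displacement relation underlying such a system can be shown in a simplified form: a relative displacement d between the indentations shifts a fringe pattern by m fringes with d = m * lambda / sin(alpha), where alpha is the fringe observation angle. The sketch below uses this single-pattern form with illustrative values; the actual technique averages the motions of two fringe patterns to cancel rigid-body motion, and the angle and gage length here are assumptions, not the system's settings.

```python
import math

# Simplified interferometric strain relation for indentation-based laser
# interferometry. The observation angle and gage length are illustrative.

HE_NE_WAVELENGTH_UM = 0.6328  # He-Ne laser wavelength in micrometers

def displacement_um(fringe_shift, alpha_deg=42.0):
    """Relative in-plane displacement implied by a fringe shift of m fringes."""
    return fringe_shift * HE_NE_WAVELENGTH_UM / math.sin(math.radians(alpha_deg))

def strain(fringe_shift, gage_length_um=100.0, alpha_deg=42.0):
    """Engineering strain over the gage length between the indentations."""
    return displacement_um(fringe_shift, alpha_deg) / gage_length_um

d1 = displacement_um(1.0)  # displacement for one full fringe of motion
```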

  4. The first "space" vegetables have been grown in the "SVET" greenhouse using controlled environmental conditions

    NASA Astrophysics Data System (ADS)

    Ivanova, T. N.; Bercovich, Yu. A.; Mashinskiy, A. L.; Meleshko, G. I.

    The paper describes the "SVET" project - a new generation of space greenhouse with small dimensions. Through the use of a minicomputer, "SVET" is fully capable of automatically operating and controlling environmental systems for higher plant growth. A number of preliminary studies have shown the radish and cabbage to be potentially important crops for CELSS (Closed Environmental Life Support System). The "SVET" space greenhouse was mounted on the "CRYSTAL" technological module docked to the Mir orbital space station on 10 June 1990. Soviet cosmonauts Balandin and Solovyov started the first experiments with the greenhouse on 15 June 1990. Preliminary results of seed cultivation over an initial 54-day period in "SVET" are presented. Morphometrical characteristics of plants brought back to Earth are given. Alterations in plant characteristics, such as growth, development, and morphology, were noted. A crop of radish plants was harvested under microgravity conditions. Characteristics of the plant environmental control parameters and an estimation of the functional properties of the control and regulation systems of the "SVET" greenhouse in space flight, as received via telemetry, are reported.

  5. A system for processing Landsat and other georeferenced data for resource management applications

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.

    1979-01-01

    The NASA Earth Resources Laboratory has developed a transferable system for processing Landsat and other georeferenced data, with capabilities for digital data classification, georeferencing, overlaying, and data base management. This system is known as the Earth Resources Data Analysis System. The versatility of the system has been demonstrated with applications in several disciplines. A description is given of a low-cost data system concept that is suitable for transfer to one's available in-house minicomputer or to a low-cost computer purchased for this purpose. Software packages are described that process Landsat data to produce surface cover classifications and that geographically reference the data to the UTM projection. Programs are also described that incorporate several sets of Landsat-derived information, topographic information, soils information, rainfall information, etc., into a data base. Selected application algorithms are discussed and sample products are presented. The types of computers on which the low-cost data system concept has been implemented are identified, typical implementation costs are given, and the source from which the software may be obtained is identified.

  6. A geographic information system for resource managers based on multi-level remote sensing data

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.

    1985-01-01

    Procedures followed in developing a test case geographic information system derived primarily from remotely sensed data for the North Cache Soil Conservation District (SCD) in northern Utah are outlined. The North Cache SCD faces serious problems regarding water allocation, flood and geologic hazards, urban encroachment into prime farmland, soil erosion, and wildlife habitat. Four fundamental data planes were initially entered into the geo-referenced data base: (1) land use/land cover information for the agricultural and built-up areas of the valley obtained from various forms of aerial photography; (2) vegetation/land cover in mountains classified digitally from Landsat; (3) geomorphic terrain units derived from aerial photography and soil maps; and (4) digital terrain maps obtained from DMA digital data. The land use/vegetation/land cover information from manual photographic and Landsat interpretation were joined digitally into a single data plane with an integrated legend, and segmented into quadrangle units. These were merged with the digitized geomorphic units and the digital terrain data using a Prime 400 minicomputer. All data planes were geo-referenced to a UTM coordinate grid.

  8. ERS-1 SAR data processing

    NASA Technical Reports Server (NTRS)

    Leung, K.; Bicknell, T.; Vines, K.

    1986-01-01

    To take full advantage of the synthetic aperture radar (SAR) to be flown on board the European Space Agency's Remote Sensing Satellite (ERS-1) (1989) and the Canadian Radarsat (1990), the implementation of a receiving station in Alaska is being studied to gather and process SAR data pertaining in particular to regions within the station's range of reception. The current SAR data processing requirement is estimated to be on the order of 5 minutes per day. The Interim Digital SAR Processor (IDP), which was under continual development through Seasat (1978) and SIR-B (1984), can process slightly more than 2 minutes of ERS-1 data per day. On the other hand, the Advanced Digital SAR Processor (ADSP), currently under development for the Shuttle Imaging Radar C (SIR-C, 1988) and the Venus Radar Mapper (VRM, 1988), is capable of processing ERS-1 SAR data at a real-time rate. To better suit the anticipated ERS-1 SAR data processing requirement, both a modified IDP and an ADSP derivative are being examined. For the modified IDP, a pipelined architecture is proposed for the minicomputer plus array processor arrangement to improve throughput. For the ADSP derivative, a simplified version is proposed to enhance ease of implementation and maintainability while maintaining real-time throughput rates. These processing systems are discussed and evaluated.

  9. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    SciTech Connect

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy-ion capability to our facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  10. Modernization of the NASA IRTF Telescope Control System

    NASA Astrophysics Data System (ADS)

    Pilger, Eric J.; Harwood, James V.; Onaka, Peter M.

    1994-06-01

    We describe the ongoing modernization of the NASA IR Telescope Facility Telescope Control System. A major mandate of this project is to keep the telescope available for observations throughout. Therefore, we have developed an incremental plan that will allow us to replace components of the software and hardware without shutting down the system. The current system, running under FORTH on a DEC LSI 11/23 minicomputer interfaced to a bus and boards developed in-house, will be replaced with a combination of a Sun SPARCstation running SunOS, a MicroSPARC based Single Board Computer running LynxOS, and various intelligent VME based peripheral cards. The software is based on a design philosophy originally developed by Pat Wallace for use on the Anglo Australian Telescope. This philosophy has gained wide acceptance, and is currently used in a number of observatories around the world. A key element of this philosophy is the division of the TCS into `Virtual' and `Real' parts. This will allow us to replace the higher level functions of the TCS with software running on the Sun, while still relying on the LSI 11/23 for performance of the lower level functions. Eventual transfer of lower level functions to the MicroSPARC system will then proceed incrementally through use of a Q-Bus to VME-Bus converter.

  11. High strain rate properties of unidirectional composites, part 1

    NASA Technical Reports Server (NTRS)

    Daniel, I. M.

    1991-01-01

    Experimental methods were developed for testing and characterization of composite materials at strain rates ranging from quasi-static to over 500 s⁻¹. Three materials were characterized, two graphite/epoxies and a graphite/S-glass/epoxy. Properties were obtained by testing thin rings 10.16 cm (4 in.) in diameter, 2.54 cm (1 in.) wide, and six to eight plies thick under internal pressure. Unidirectional 0 degree, 90 degree, and 10 degree off-axis rings were tested to obtain longitudinal, transverse, and in-plane shear properties. In the dynamic tests internal pressure was applied explosively through a liquid and the pressure was measured with a calibrated steel ring. Strains in the calibration and specimen rings were recorded with a digital processing oscilloscope. The data were processed and the equation of motion solved numerically by the mini-computer attached to the oscilloscope. Results were obtained and plotted in the form of dynamic stress-strain curves. Longitudinal properties which are governed by the fibers do not vary much with strain rate with only a moderate (up to 20 percent) increase in modulus. Transverse modulus and strength increase sharply with strain rate reaching values up to three times the static values. The in-plane shear modulus and shear strength increase noticeably with strain rate by up to approximately 65 percent. In all cases ultimate strains do not vary significantly with strain rates.

  12. Fluctuations in tension during contraction of single muscle fibers.

    PubMed Central

    Borejdo, J; Morales, M F

    1977-01-01

    We have searched for fluctuations in the steady-state tension developed by stimulated single muscle fibers. Such tension "noise" is expected to be present as a result of the statistical fluctuations in the number and/or state of myosin cross-bridges interacting with thin filament sites at any time. A sensitive electro-optical tension transducer capable of resolving the expected fluctuations in magnitude and frequency was constructed to search for the fluctuations. The noise was analyzed by computing the power spectra and amplitude of stochastic fluctuations in the photomultiplier counting rate, which was made proportional to muscle force. The optical system and electronic instrumentation together with the minicomputer software are described. Tensions were measured in single skinned glycerinated rabbit psoas muscle fibers in rigor and during contraction and relaxation. The results indicate the presence of fluctuations in contracting muscles and a complete absence of tension noise in either rigor or relaxation. Also, a numerical method was developed to simulate the power spectra and amplitude of fluctuations, given the rate constants for association and dissociation of the cross-bridges and actin. The simulated power spectra and the frequency distributions observed experimentally are similar. PMID:922123
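
    The power-spectrum analysis of a counting-rate record can be sketched with a naive periodogram. This is generic signal-processing illustration on simulated data, not the authors' software; the record length, sampling interval, and noise level are arbitrary.

```python
# Naive DFT periodogram of a simulated photomultiplier counting-rate record,
# illustrating the type of noise analysis described above. The simulated
# rates and all parameters are illustrative, not experimental data.
import cmath, math, random

def periodogram(samples, dt):
    """Return (frequency, power) pairs via a direct DFT of the mean-removed
    record. O(n^2), fine for short illustrative records."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]
    power = []
    for k in range(1, n // 2):
        acc = sum(c * cmath.exp(-2j * math.pi * k * i / n)
                  for i, c in enumerate(centered))
        power.append((k / (n * dt), abs(acc) ** 2 / n))
    return power

random.seed(0)
rates = [1000.0 + random.gauss(0.0, 30.0) for _ in range(256)]  # counts/s
spec = periodogram(rates, dt=0.01)
```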

  13. Test plan for 32-bit microcomputers for the Water Resources Division; Chapter A, Test plan for acquisition of prototype 32-bit microcomputers

    USGS Publications Warehouse

    Hutchison, N.E.; Harbaugh, A.W.; Holloway, R.A.; Merk, C.F.

    1987-01-01

    The Water Resources Division (WRD) of the U.S. Geological Survey is evaluating 32-bit microcomputers to determine how they can complement, and perhaps later replace, the existing network of minicomputers. The WRD is also designing a National Water Information System (NWIS) that will combine and integrate the existing National Water Data Storage and Retrieval System (WATSTORE), National Water Data Exchange (NAWDEX), and components of several other existing systems. The procedures and testing done in a market evaluation of 32-bit microcomputers are documented. The results of the testing are documented in the NWIS Project Office. The market evaluation was done to identify commercially available hardware and software that could be used for implementing early NWIS prototypes to determine the applicability of 32-bit microcomputers for data base and general computing applications. Three microcomputers will be used for these prototype studies. The results of the prototype studies will be used to compile requirements for a Request for Procurement (RFP) for hardware and software to meet the WRD's needs in the early 1990's. The identification of qualified vendors to provide the prototype hardware and software included reviewing industry literature, and making telephone calls and personal visits to prospective vendors. Those vendors that appeared to meet general requirements were required to run benchmark tests. (Author's abstract)

  14. Modular on-board adaptive imaging

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Williams, D. S.

    1978-01-01

    Feature extraction involves the transformation of a raw video image to a more compact representation of the scene in which relevant information about objects of interest is retained. The task of the low-level processor is to extract object outlines and pass the data to the high-level process in a format that facilitates pattern recognition tasks. Due to the immense computational load caused by processing a 256x256 image, even a fast minicomputer requires a few seconds to complete this low-level processing. It is, therefore, necessary to consider hardware implementation of these low-level functions to achieve real-time processing speeds. The considered project had the objective to implement a system in which the continuous feature extraction process is not affected by the dynamic changes in the scene, varying lighting conditions, or object motion relative to the cameras. Due to the high bandwidth (3.5 MHz) and serial nature of the TV data, a pipeline processing scheme was adopted as the overall architecture of this system. Modularity in the system is achieved by designing circuits that are generic within the overall system.
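
    The low-level operation such a pipeline implements can be illustrated with a simple gradient-magnitude edge operator. The sketch below is a plain-software stand-in for what the described system does in pipelined hardware; the operator and image are illustrative, not the project's actual circuit design.

```python
# Simple gradient-magnitude edge detector over a grayscale image stored as a
# 2-D list, illustrating the kind of low-level feature extraction a pipelined
# hardware stage performs. Pure-Python stand-in; operator choice is assumed.

def edge_magnitude(img):
    """Return |dI/dx| + |dI/dy| (central differences) at interior pixels."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = img[r][c + 1] - img[r][c - 1]   # horizontal gradient
            gy = img[r + 1][c] - img[r - 1][c]   # vertical gradient
            out[r][c] = abs(gx) + abs(gy)
    return out

# A vertical step edge: strong response along the boundary columns.
step = [[0, 0, 9, 9] for _ in range(4)]
edges = edge_magnitude(step)
```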

  15. Integration and software for thermal test of heat rate sensors. [space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Wojciechowski, C. J.; Shrider, K. R.

    1982-01-01

    A minicomputer controlled radiant test facility is described which was developed and calibrated in an effort to verify analytical thermal models of instrumentation islands installed aboard the space shuttle external tank to measure thermal flight parameters during ascent. Software was provided for the facility as well as for development tests on the SRB actuator tail stock. Additional testing was conducted with the test facility to determine the temperature, heat flux rate, and loads required to effect a change of color in the ET external paint. This requirement resulted from the review of photographs taken of the ET at separation from the orbiter which showed that 75% of the external tank paint coating had not changed from its original white color. The paint on the remaining 25% of the tank was either brown or black, indicating that it had degraded due to heating or that the spray-on foam insulation had receded in these areas. The operational capability of the facility as well as the various tests which were conducted and their results are discussed.

  16. High Frequency Sampling of TTL Pulses on a Raspberry Pi for Diffuse Correlation Spectroscopy Applications.

    PubMed

    Tivnan, Matthew; Gurjar, Rajan; Wolf, David E; Vishwanath, Karthik

    2015-01-01

    Diffuse Correlation Spectroscopy (DCS) is a well-established optical technique that has been used for non-invasive measurement of blood flow in tissues. Instrumentation for DCS includes a correlation device that computes the temporal intensity autocorrelation of a coherent laser source after it has undergone diffuse scattering through a turbid medium. Typically, the signal acquisition and its autocorrelation are performed by a correlation board. These boards have dedicated hardware to acquire and compute intensity autocorrelations of rapidly varying input signal and usually are quite expensive. Here we show that a Raspberry Pi minicomputer can acquire and store a rapidly varying time-signal with high fidelity. We show that this signal collected by a Raspberry Pi device can be processed numerically to yield intensity autocorrelations well suited for DCS applications. DCS measurements made using the Raspberry Pi device were compared to those acquired using a commercial hardware autocorrelation board to investigate the stability, performance, and accuracy of the data acquired in controlled experiments. This paper represents a first step toward lowering the instrumentation cost of a DCS system and may offer the potential to make DCS become more widely used in biomedical applications. PMID:26274961
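
    The numerical autocorrelation such a system computes from the acquired count trace can be sketched with a minimal normalized intensity-autocorrelation estimator, g2(tau). This is a generic textbook estimator, not the paper's processing code; the trace and lag range are illustrative.

```python
# Minimal normalized intensity autocorrelation g2(tau) over integer lags,
# of the kind computed in software from a photon-count trace. Generic
# estimator for illustration; not the paper's acquisition or analysis code.

def g2(counts, max_lag):
    """g2(tau) = <I(t) I(t+tau)> / <I>^2, estimated for lags 1..max_lag."""
    n = len(counts)
    mean_sq = (sum(counts) / n) ** 2
    out = []
    for lag in range(1, max_lag + 1):
        prod = sum(counts[i] * counts[i + lag] for i in range(n - lag))
        out.append(prod / ((n - lag) * mean_sq))
    return out

# A constant (fluctuation-free) intensity gives g2 = 1 at every lag:
flat = g2([5] * 100, max_lag=4)
```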

  17. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectoring and 'in-line' capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.
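
    The kind of restructuring that helps a vectorizing compiler can be shown with a small example: a loop-invariant branch inside an inner loop blocks vectorization, and hoisting it out leaves straight-line loop bodies the compiler can vectorize. This is a generic illustration in Python (the actual model was Fortran on the Cray), and the variable names are invented for the example.

```python
# Illustrative loop restructuring for vectorization. The branch in
# update_branchy is loop-invariant; hoisting it out, as in update_hoisted,
# yields branch-free inner loops that a Cray-style compiler can vectorize.
# Names and the toy update rule are assumptions for illustration.

def update_branchy(biomass, growth, frozen):
    out = []
    for b, g in zip(biomass, growth):
        if frozen:                 # loop-invariant branch inside the loop
            out.append(b)
        else:
            out.append(b + g * b)  # simple growth step
    return out

def update_hoisted(biomass, growth, frozen):
    if frozen:                     # branch hoisted out of the loop
        return list(biomass)
    return [b + g * b for b, g in zip(biomass, growth)]

b = [1.0, 2.0, 4.0]
g = [0.1, 0.1, 0.1]
```

Both forms compute the same result; only the second exposes a vectorizable inner loop.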

  18. Study of a hybrid multispectral processor

    NASA Technical Reports Server (NTRS)

    Marshall, R. E.; Kriegler, F. J.

    1973-01-01

    A hybrid processor is described offering enough handling capacity and speed to process efficiently the large quantities of multispectral data that can be gathered by scanner systems such as MSDS, SKYLAB, ERTS, and ERIM M-7. Combinations of general-purpose and special-purpose hybrid computers were examined to include both analog and digital types as well as all-digital configurations. The current trend toward lower costs for medium-scale digital circuitry suggests that the all-digital approach may offer the better solution within the time frame of the next few years. The study recommends and defines such a hybrid digital computing system in which both special-purpose and general-purpose digital computers would be employed. The tasks of recognizing surface objects would be performed in a parallel, pipeline digital system while the tasks of control and monitoring would be handled by a medium-scale minicomputer system. A program to design and construct a small, prototype, all-digital system has been started.

  19. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  20. A new approach for data acquisition at the JPL space simulators

    NASA Technical Reports Server (NTRS)

    Fisher, Terry C.

    1992-01-01

    In 1990, a personal computer based data acquisition system was put into service for the Space Simulators and Environmental Test Laboratory at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The new system replaced an outdated minicomputer system which had been in use since 1980. This new data acquisition system was designed and built by JPL for the specific task of acquiring thermal test data in support of space simulation and thermal vacuum testing at JPL. The data acquisition system was designed using powerful personal computers and local-area-network (LAN) technology. Reliability, expandability, and maintainability were some of the most important criteria in the design of the data system and in the selection of hardware and software components. The data acquisition system is used to record both test chamber operational data and thermal data from the unit under test. Tests are conducted in numerous small thermal vacuum chambers and in the large solar simulator and range in size from individual components using only 2 or 3 thermocouples to entire planetary spacecraft requiring in excess of 1200 channels of test data. The system supports several of these tests running concurrently. The previous data system is described along with reasons for its replacement, the types of data acquired, the new data system, and the benefits obtained from the new system including information on tests performed to date.

  1. H-coal fluid dynamics. Final report, August 1, 1977-December 31, 1979

    SciTech Connect

    Not Available

    1980-04-16

    This report presents the results of work aimed at understanding the hydrodynamic behavior of the H-Coal reactor. A summary of the literature search related to the fluid dynamic behavior of gas/liquid/solid systems has been presented. Design details of a cold flow unit were discussed. The process design of this cold flow model followed practices established by HRI in their process development unit. The cold flow unit has been used to conduct experiments with nitrogen, kerosene, or kerosene/coal char slurries, and HDS catalyst, which at room temperature have properties similar to those existing in the H-Coal reactor. Mineral oil, a high-viscosity liquid, was also used. The volume fractions occupied by gas/liquid slurries and catalyst particles were determined by several experimental techniques. The use of a mini-computer for data collection and calculation has greatly accelerated the analysis and reporting of data. Data on nitrogen/kerosene/HDS catalyst and coal char fines are presented in this paper. Correlations identified in the literature search were utilized to analyze the data. From this analysis it became evident that the Richardson-Zaki correlation describes the effect of slurry flow rate on catalyst expansion. Three-phase fluidization data were analyzed with two models.
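
    The Richardson-Zaki correlation mentioned above relates superficial velocity U to bed voidage eps through U / U_t = eps**n, where U_t is the particle terminal settling velocity and the exponent n depends on the particle Reynolds number. The sketch below applies the correlation with illustrative values; the velocities, exponent, and settled-bed properties are assumptions, not H-Coal data.

```python
# Richardson-Zaki bed-expansion sketch: invert U/U_t = eps**n for voidage,
# then get expanded bed height from solids conservation, H*(1 - eps) = const.
# All numerical values are illustrative assumptions, not H-Coal data.

def bed_voidage(u_superficial, u_terminal, n=4.65):
    """Voidage eps from U/U_t = eps**n (n = 4.65: low-Reynolds regime)."""
    return (u_superficial / u_terminal) ** (1.0 / n)

def expanded_height(h_settled, eps_settled, u_superficial, u_terminal, n=4.65):
    """Expanded bed height from conservation of solids volume."""
    eps = bed_voidage(u_superficial, u_terminal, n)
    return h_settled * (1.0 - eps_settled) / (1.0 - eps)

eps = bed_voidage(0.01, 0.05)  # superficial velocity at 20% of terminal
```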

  2. 1985 ACSM-ASPRS Fall Convention, Indianapolis, IN, September 8-13, 1985, Technical Papers

    SciTech Connect

    Not Available

    1985-01-01

    Papers are presented on Landsat image data quality analysis, primary data acquisition, cartography, geodesy, land surveying, and the applications of satellite remote sensing data. Topics discussed include optical scanning and interactive color graphics; the determination of astrolatitudes and astrolongitudes using x, y, z-coordinates on the celestial sphere; raster-based contour plotting from digital elevation models using minicomputers or microcomputers; the operational techniques of the GPS when utilized as a survey instrument; public land surveying and high technology; the use of multitemporal Landsat MSS data for studying forest cover types; interpretation of satellite and aircraft L-band synthetic aperture radar imagery; geological analysis of Landsat MSS data; and an interactive real time digital image processing system. Consideration is given to a large format reconnaissance camera; creating an optimized color balance for TM and MSS imagery; band combination selection for visual interpretation of thematic mapper data for resource management; the effect of spatial filtering on scene noise and boundary detail in thematic mapper imagery; the evaluation of the geometric quality of thematic mapper photographic data; and the analysis and correction of Landsat 4 and 5 thematic mapper sensor data.

  3. Voice Controlled Wheelchair

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Michael Condon, a quadriplegic from Pasadena, California, demonstrates the NASA-developed voice-controlled wheelchair and its manipulator, which can pick up packages, open doors, turn a TV knob, and perform a variety of other functions. A possible boon to paralyzed and other severely handicapped persons, the chair-manipulator system responds to 35 one-word voice commands, such as "go," "stop," "up," "down," "right," "left," "forward," "backward." The heart of the system is a voice-command analyzer which utilizes a minicomputer. Commands are taught to the computer by the patient repeating them a number of times; thereafter the analyzer recognizes commands only in the patient's particular speech pattern. The computer translates commands into electrical signals which activate appropriate motors and cause the desired motion of chair or manipulator. Based on teleoperator and robot technology for space-related programs, the voice-controlled system was developed by Jet Propulsion Laboratory under the joint sponsorship of NASA and the Veterans Administration. The wheelchair-manipulator has been tested at Rancho Los Amigos Hospital, Downey, California, and is being evaluated at the VA Prosthetics Center in New York City.

  4. History of Robotic and Remotely Operated Telescopes

    NASA Astrophysics Data System (ADS)

    Genet, Russell M.

    2011-03-01

    While automated instrument sequencers were employed on solar eclipse expeditions in the late 1800s, it wasn't until the 1960s that Art Code and associates at Wisconsin used a PDP minicomputer to automate an 8-inch photometric telescope. Although this pioneering project experienced frequent equipment failures and was shut down after a couple of years, it paved the way for the first space telescopes. Reliable microcomputers initiated the modern era of robotic telescopes. Louis Boyd and I applied single board microcomputers with 64K of RAM and floppy disk drives to telescope automation at the Fairborn Observatory, achieving reliable, fully robotic operation in 1983 that has continued uninterrupted for 28 years. In 1985 the Smithsonian Institution provided us with a superb operating location on Mt. Hopkins in southern Arizona, while the National Science Foundation funded additional telescopes. Remote access to our multiple robotic telescopes at the Fairborn Observatory began in the late 1980s. The Fairborn Observatory, with its 14 fully robotic telescopes and staff of two (one full-time and one part-time), illustrates the potential for low operating and maintenance costs. As the information capacity of the Internet has expanded, observational modes beyond simple differential photometry have opened up, bringing us to the current era of real-time remote access to remote observatories and global observatory networks. Although initially confined to smaller telescopes, robotic operation and remote access are spreading to larger telescopes as observing from afar becomes the normal mode of operation.

  5. Fluid dynamics of double diffusive systems

    SciTech Connect

    Koseff, J.R.

    1989-04-07

    A study of mixing processes in doubly diffusive systems is being conducted. Continuous gradients of two diffusing components (heat and salinity in our case) are being used as initial conditions, and forcing is introduced by lateral heating and surface shear. The goals of the proposed work include: (1) quantification of the effects of finite amplitude disturbances on stable, double diffusive systems, particularly with respect to lateral heating, (2) development of an improved understanding of the physical phenomena present in wind-driven shear flows in double diffusive stratified environments, (3) increasing our knowledge-base on turbulent flow in stratified environments and how to represent it, and (4) formulation of a numerical code for such flows. The work is being carried out in an experimental facility which is located in the Stanford Environmental Fluid Mechanics Laboratory, and on laboratory minicomputers and CRAY computers. In particular we are focusing on the following key issues: (1) the formation and propagation of double diffusive intrusions away from a heated wall and the effects of lateral heating on the double diffusive system; (2) the interaction between the double diffusively influenced fluxes and the turbulence induced fluxes; (3) the measurement of heat and mass fluxes; and (4) the influence of double diffusive gradients on mixed layer deepening. 1 fig.

  6. Definition study for photovoltaic residential prototype system

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Hulstrom, R. L.; Cookson, C.; Waldman, B. H.; Lane, R. A.

    1976-01-01

    A parametric sensitivity study and definition of the conceptual design is presented. A computer program containing the solar irradiance, solar array, and energy balance models was developed to determine the sensitivities of solar insolation and the corresponding solar array output at five sites selected for this study as well as the performance of several solar array/battery systems. A baseline electrical configuration was chosen, and three design options were recommended. The study indicates that the most sensitive parameters are the solar insolation and the inverter efficiency. The baseline PST selected is comprised of a 133 sq m solar array, 250 ampere hour battery, one to three inverters, and a full shunt regulator to limit the upper solar array voltage. A minicomputer controlled system is recommended to provide the overall control, display, and data acquisition requirements. Architectural renderings of two photovoltaic residential concepts, one above ground and the other underground, are presented. The institutional problems were defined in the areas of legal liabilities during and after installation of the PST, labor practices, building restrictions and architectural guides, and land use.

  7. Development and applications of an interactive digital filter design program.

    PubMed

    Woo, H W; Kim, Y M; Tompkins, W J

    1985-10-01

    We have implemented an interactive digital filter design program in the HP 1000 computer at the Department of Electrical Engineering of the University of Washington. This program allows users to design different types of filters interactively with both amplitude and phase responses displayed on graphic devices. The performance of each designed filter can be evaluated conveniently before the best one is chosen and implemented for any particular application. This program can design recursive filters, e.g. Butterworth, Chebyshev and elliptic, or nonrecursive filters with one out of six different windows, i.e. rectangular, triangular, Hann, Hamming, Blackman and Kaiser. The main outputs from this program are coefficients of a transfer function of an analog filter, a digital filter, or both. Therefore, the design of both analog and digital filters is facilitated by using this program. The program is very simple to use and does not require background in analog or digital filter principles in order to run it. The program is written in standard FORTRAN and is about 30 kbytes in size excluding the graphics display routines. Since it uses standard FORTRAN, it can be easily transported to minicomputer and microcomputer systems that have a FORTRAN compiler and minimal graphics capabilities. This program is available for distribution to interested institutions and laboratories.
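    The nonrecursive (FIR) designs the abstract lists are produced by the classic windowed-sinc method: truncate an ideal low-pass impulse response and taper it with one of the named windows. The Python sketch below illustrates that method with a Hamming window; it is a modern stand-in for the original FORTRAN, and the tap count, cutoff, and sample rate are assumed values, not taken from the paper.

    ```python
    # Windowed-sinc FIR low-pass design (illustrative, not the HP 1000 code).
    import numpy as np

    def fir_lowpass(num_taps, cutoff, fs):
        """Low-pass FIR coefficients via a Hamming-windowed ideal impulse response."""
        n = np.arange(num_taps) - (num_taps - 1) / 2   # symmetric tap indices
        fc = cutoff / fs                               # normalized cutoff, cycles/sample
        h = 2 * fc * np.sinc(2 * fc * n)               # ideal (infinite) low-pass, truncated
        h *= np.hamming(num_taps)                      # window controls stopband ripple
        return h / h.sum()                             # normalize for unity gain at DC

    taps = fir_lowpass(31, 40.0, 250.0)                # 31-tap filter, 40 Hz cutoff at 250 Hz
    print(len(taps), round(taps.sum(), 6))
    ```

    Swapping `np.hamming` for `np.hanning`, `np.blackman`, or `np.kaiser` reproduces the trade-off between main-lobe width and sidelobe level that the program let users evaluate interactively.
    
    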

  8. Energy management system survey of architectures

    SciTech Connect

    Evans, J.W.

    1989-01-01

    Since the earliest days of computers, one trend has been continuous: the only thing that has grown faster than computer power is the demand for computer power. The challenge for system designers is to accommodate this type of growth in a manner that avoids replacing the whole system often. Three very different approaches are being used by the major suppliers today, but with a common theme: functions are distributed to various computers and various types of computers to meet the diverse requirements of an EMS. Early control centers were built around a single computer or a redundant pair of computers. Most of the systems delivered before 1975 were based on the Xerox Sigma 5 and Sigma 9 computers, the premier real-time processors of their day. Xerox left the computer business, and suppliers adopted various minicomputers as the heart of their systems. The popular choices were the SEL (later Gould) 32 series, the Harris H series, the Modular Computer Systems MODCOMP IV and CLASSIC series, and the CDC 16-bit machines like the CYBER 18. Like the Sigma 5, these machines are all excellent real-time processors and could easily handle the requirements of early control centers.

  9. Mark 3 correlator hardware and software

    NASA Technical Reports Server (NTRS)

    Whitney, A. R.

    1980-01-01

    The Mark 3 correlator system is described in some detail. The correlator system is based on a modular philosophy. Each correlator module independently processes the data from one track pair. Therefore, 28 modules are necessary to complete a full one baseline processor and 84 modules for a full 3 baseline processor. Each correlator module has two interfaces: (1) data and clock from each of the two tracks to be correlated and (2) Computer Automated Measurement and Control (CAMAC) dataway interface to the computer. The processor is organized around the IEEE CAMAC standard architecture, housing 15 correlator modules in each of 6 crates. This allows one pass processing of a full 3 baseline 28 track observation or a 6 baseline (4 station) 14 track observation. The correlator architecture allows easy expansion for up to 8 stations. The computer system is an HP 1000 system utilizing a 16 bit minicomputer with disc and tape peripherals. The processing software is also organized in a modular fashion with many independent but cooperative programs controlling the operation of the Mark 3 processor. Processing time through the correlator is normally real time or faster, with graphics displays providing real time monitor and control of the processing operation.
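    The core task of each correlator module, finding the relative delay between two recorded track streams, can be illustrated with a toy lag search. The sketch below is a simplified software analogue of that hardware function; the bit streams and the 7-sample delay are synthetic, not Mark 3 data.

    ```python
    # Toy cross-correlation lag search, analogous to one correlator module's job.
    import numpy as np

    rng = np.random.default_rng(0)
    track_a = rng.integers(0, 2, 4096) * 2 - 1     # +/-1 pseudo-random bit stream
    track_b = np.roll(track_a, 7)                  # same signal, delayed by 7 samples

    # Correlate over a window of trial lags; the true delay gives the peak.
    lags = list(range(-16, 17))
    corr = [np.dot(track_a, np.roll(track_b, -k)) for k in lags]
    best = lags[int(np.argmax(corr))]
    print(best)  # recovered delay
    ```

    In the real processor this multiply-and-accumulate over lags runs in dedicated hardware per track pair, which is why 28 modules are needed for a full single-baseline observation.
    
    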

  10. Cellstat--A continuous culture system of a bacteriophage for the study of the mutation rate and the selection process at the DNA level

    NASA Astrophysics Data System (ADS)

    Husimi, Yuzuru; Nishigaki, Koichi; Kinoshita, Yasunori; Tanaka, Toyosuke

    1982-04-01

    A bacteriophage is continuously cultured in the flow of the host bacterial cell under the control of a minicomputer. In the culture, the population of the noninfected cell is kept constant by the endogenous regulation mechanism, so it is called the "cellstat" culture. Due to the high dilution rate of the host cell, the mutant cell cannot be selected in the cellstat. Therefore, the cellstat is suitable for the study of the mutation rate and the selection process of a bacteriophage under well-defined environmental conditions (including physiological condition of the host cell) without interference from host-cell mutations. Applications to coliphage fd, a secretion type phage, are shown as a measurement example. A chimera between fd and a plasmid pBR322 is cultured more than 100 h. The process of population changeovers by deletion mutants indicates that the deletion hot spots exist in this cloning vector and that this apparatus can be used also for testing instability of a recombinant DNA.

  11. Pacific Missile Test Center Information Resources Management Organization (code 0300): The ORACLE client-server and distributed processing architecture

    SciTech Connect

    Beckwith, A. L.; Phillips, J. T.

    1990-06-10

    Computing architectures using distributed processing and distributed databases are increasingly being considered acceptable solutions for advanced data processing systems. This is occurring even though there is still considerable professional debate as to what "truly" distributed computing actually is, and despite the relative lack of advanced relational database management software (RDBMS) capable of meeting database and system integrity requirements for developing reliable integrated systems. This study investigates the functionality of ORACLE database management software that is performing distributed processing between a MicroVAX/VMS minicomputer and three MS-DOS-based microcomputers. The ORACLE database resides on the MicroVAX and is accessed from the microcomputers with ORACLE SQL*NET, DECnet, and ORACLE PC TOOL PACKS. Data gathered during the study reveals that there is a demonstrable decrease in CPU demand on the MicroVAX, due to "distributed processing", when the ORACLE PC Tools are used to access the database as opposed to database access from "dumb" terminals. Also discovered were several hardware/software constraints that must be considered in implementing various software modules. The results of the study indicate that this distributed data processing architecture is becoming sufficiently mature and reliable, and should be considered for developing applications that reduce processing on central hosts. 33 refs., 2 figs.

  12. Text processing for technical reports (direct computer-assisted origination, editing, and output of text)

    SciTech Connect

    De Volpi, A.; Fenrick, M. R.; Stanford, G. S.; Fink, C. L.; Rhodes, E. A.

    1980-10-01

    Documentation often is a primary residual of research and development. Because of this important role and because of the large amount of time consumed in generating technical reports, particularly those containing formulas and graphics, an existing data-processing computer system has been adapted so as to provide text-processing of technical documents. Emphasis has been on accuracy, turnaround time, and time savings for staff and secretaries, for the types of reports normally produced in the reactor development program. The computer-assisted text-processing system, called TXT, has been implemented to benefit primarily the originator of technical reports. The system is of particular value to professional staff, such as scientists and engineers, who have responsibility for generating much correspondence or lengthy, complex reports or manuscripts - especially if prompt turnaround and high accuracy are required. It can produce text that contains special Greek or mathematical symbols. Written in FORTRAN and MACRO, the program TXT operates on a PDP-11 minicomputer under the RSX-11M multitask multiuser monitor. Peripheral hardware includes videoterminals, electrostatic printers, and magnetic disks. Either data- or word-processing tasks may be performed at the terminals. The repertoire of operations has been restricted so as to minimize user training and memory burden. Secretarial staff may be readily trained to make corrections from annotated copy. Some examples of camera-ready copy are provided.

  13. From Past Issues: The More Things Change...

    NASA Astrophysics Data System (ADS)

    1998-07-01

    Though computers were still housed in large, air-conditioned rooms and were often programmed via decks of punched cards, a number of chemists were making effective use of them in teaching as well as research. Eight papers in this issue reported on computer programs. Castleberry, Culp, and Lagowski described an educational experiment in which the effectiveness of computer-based instruction was evaluated in a general chemistry course. Breneman reported on minicomputer-aided instruction, and others described programs that normalized grades, calculated heats of combustion, analyzed results of physical chemistry experiments, solved secular equations, calculated mass spectra, and calculated rate constants. Output devices were usually character based and graphics were rudimentary, as exemplified by the teletype plots of hydrogenic orbitals shown above. The editorial, "On Abandoning Grading and Reconsidering Standards" advocated neither and presented four arguments for maintaining traditional standards and realistic grades. This immediately followed half a decade when poor grades might result in being drafted and serving in Vietnam and student protests were based on government policy rather than whether or not to enforce rules against student drinking. Editor Lippincott pointed out that after several years few students return to thank a professor for making things easy, but many express appreciation for challenges that proved they could do more than they thought they could.

  14. Confocal Laser Microscope Scanning Applied To Three-Dimensional Studies Of Biological Specimens.

    NASA Astrophysics Data System (ADS)

    Franksson, Olof; Liljeborg, Anders; Carlsson, Kjell; Forsgren, Per-Ola

    1987-08-01

    The depth-discriminating property of confocal laser microscope scanners can be used to record the three-dimensional structure of specimens. A number of thin sections (approx. 1 μm thick) can be recorded by a repeated process of image scanning and refocusing of the microscope. We have used a confocal microscope scanner in a number of feasibility studies to investigate its possibilities and limitations. It has proved to be well suited for examining fluorescent specimens with a complicated three-dimensional structure, such as nerve cells. It has also been used to study orchid seeds, as well as cell colonies, greatly facilitating evaluation of such specimens. Scanning of the specimens is performed by a focused laser beam that is deflected by rotating mirrors, and the reflected or fluorescent light from the specimen is detected. The specimen thus remains stationary during image scanning, and is only moved stepwise in the vertical direction for refocusing between successive sections. The scanned images consist of 256*256 or 512*512 pixels, each pixel containing 8 bits of data. After a scanning session a large number of digital images, representing consecutive sections of the specimen, are stored on a disk memory. In a typical case 200 such 256*256 images are stored. To display and process this information in a meaningful way requires both appropriate software and a powerful computer. The computer used is a 32-bit minicomputer equipped with an array processor (FPS 100). The necessary software was developed at our department.

  15. GEEF: a geothermal engineering and economic feasibility model. Description and user's manual

    SciTech Connect

    Not Available

    1982-09-01

    The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV language and can be run on a mainframe or a minicomputer system. An abbreviated version of the model is being developed for use with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, requirement of supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single stage systems as well as two stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.

  16. ANL statement of site strategy for computing workstations

    SciTech Connect

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O'Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  17. David Florida Laboratory Thermal Vacuum Data Processing System

    NASA Technical Reports Server (NTRS)

    Choueiry, Elie

    1994-01-01

    During 1991, the Space Simulation Facility conducted a survey to assess the requirements and analyze the merits for purchasing a new thermal vacuum data processing system for its facilities. A new, integrated, cost effective PC-based system was purchased which uses commercial off-the-shelf software for operation and control. This system can be easily reconfigured and allows its users to access a local area network. In addition, it provides superior performance compared to that of the former system which used an outdated mini-computer and peripheral hardware. This paper provides essential background on the old data processing system's features, capabilities, and the performance criteria that drove the genesis of its successor. This paper concludes with a detailed discussion of the thermal vacuum data processing system's components, features, and its important role in supporting our space-simulation environment and our capabilities for spacecraft testing. The new system was tested during the ANIK E spacecraft test, and was fully operational in November 1991.

  18. Pneumatic sample-transfer system for use with the Lawrence Livermore National Laboratory rotating target neutron source (RTNS-I)

    SciTech Connect

    Williams, R.E.

    1981-07-01

    A pneumatic sample-transfer system is needed to rapidly retrieve samples irradiated with 14-MeV neutrons at the Rotating Target Neutron Source (RTNS-I). The rabbit system, already in place for many years, has been refurbished with modern system components controlled by an LSI-11 minicomputer. Samples can now be counted three seconds after an irradiation. There are many uses for this expanded 14-MeV neutron activation capability. Several fission products difficult to isolate from mixed fission fragments can be produced instead through (n,p) or (n,α) reactions with stable isotopes. Mass-separated samples of Nd, Mo, and Se, for example, can be irradiated to produce Pr, Nb, and As radionuclides sufficient for decay scheme studies. The system may also be used for multielement fast-neutron activation analysis because the neutron flux is greater than 2 × 10^11 n/cm²·s. Single element analyses of Si and O are also possible. Finally, measurements of fast-neutron cross sections producing short-lived activation products can be performed with this system. A description of the rabbit system and instructions for its use are presented in this report.

  19. Multi-site magnetotelluric measurement system with real-time data analysis. Final technical report No. 210

    SciTech Connect

    Becker, J.D.; Bostick, F.X. Jr.; Smith, H.W.

    1981-09-01

    A magnetotelluric measurement system has been designed to provide a more cost effective electrical method for geothermal and mineral exploration. The theoretical requirements and sensitivities of the magnetotelluric inversion process were specifically addressed in determining system performance requirements. Significantly reduced instrument noise levels provide improved data quality, and simultaneous measurement at up to six locations provides reduced cost per site. Remotely located, battery powered, instrumentation packages return data to a central controlling site through a 2560 baud wire-line or radio link. Each remote package contains preamplifiers, data conditioning filters, and a 12-bit gain ranging A-D converter for frequencies from 0.001 Hz to 8 Hz. Data frequencies above 8 Hz are processed sequentially by a heterodyne receiver to reduce bandwidth to within the limits of the 2560 baud data link. The central data collection site provides overall control for the entire system. The system operator interacts with the system through a CRT terminal, and he receives hard copy from a matrix graphics printer. Data from the remote packages may be recorded in time sequence on a magnetic tape cartridge system, or an optional Hewlett-Packard 21MX minicomputer can be used to perform real-time frequency analysis. The results of this analysis provide feedback to the operator for improved evaluation of system performance and for selection of future measurement sites.
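    The real-time frequency analysis performed on the optional HP 21MX reduces each time-series record to its spectral content. The sketch below illustrates that step with an FFT-based spectral estimate of a single channel; the sample rate and the synthetic 2 Hz signal are assumed for illustration and are not taken from the report.

    ```python
    # FFT spectral estimate of one magnetotelluric channel (illustrative).
    import numpy as np

    fs = 16.0                          # samples/s, within the 0.001-8 Hz band
    t = np.arange(0, 64, 1 / fs)       # 64-second record
    x = np.sin(2 * np.pi * 2.0 * t)    # synthetic 2 Hz signal

    # Window the record to reduce leakage, then estimate the power spectrum.
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    peak = freqs[np.argmax(spectrum)]
    print(peak)  # dominant frequency of the record
    ```

    In the full system, cross-spectra between the electric and magnetic channels computed this way feed the magnetotelluric impedance estimates that guide site evaluation.
    
    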

  20. Multiple access mass storage network

    SciTech Connect

    Wentz, D.L. Jr.

    1980-01-01

    The Multi-Access Storage Subnetwork (MASS) is the latest addition to the Octopus computer network at Lawrence Livermore Laboratory. The subnetwork provides shared mass storage for the Laboratory's multiple-host computer configuration. A Control Data Corp. 38500 Mass Storage facility is interfaced by MASS to the large, scientific worker computers to provide an on-line capacity of 1 trillion bits of user-accessible data. The MASS architecture offers a very high performance approach to the management of large data storage, as well as a high degree of reliability needed for operation in the Laboratory's timesharing environment. MASS combines state-of-the-art digital hardware with an innovative system philosophy. The key LLL design features of the subnetwork that contribute to the high performance include the following: a data transmission scheme that provides a 40-Mbit/s channel over distances of up to 1000 ft, a large metal-oxide-semiconductor (MOS) memory buffer controlled by a 24-port memory multiplexer with an aggregate data rate of 280 Mbit/s, and a set of high-speed microprocessor-based controllers driving the commercial mass storage units. Reliability of the system is provided by a completely redundant network, including two control minicomputer systems. Also enhancing reliability is error detection and correction in the MOS memory. A hardware-generated checksum is carried with each file throughout the entire network to ensure integrity of user files. 6 figures, 1 table.
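    The idea of carrying a checksum with each file can be sketched in a few lines. MASS generated its checksum in hardware; the 32-bit additive sum below is only an illustrative software analogue of tagging a file on entry and verifying it at each hop.

    ```python
    # Illustrative additive file checksum (not the MASS hardware algorithm).
    def checksum32(data: bytes) -> int:
        """Sum of all bytes, folded into 32 bits."""
        return sum(data) & 0xFFFFFFFF

    payload = b"user file block"
    tag = checksum32(payload)              # computed when the file enters the network
    assert checksum32(payload) == tag      # intact copy verifies at the next node
    assert checksum32(payload + b"!") != tag  # corruption changes the checksum
    ```

    A simple byte sum misses some error patterns (e.g. reordered bytes), which is one reason production systems favor CRC-style polynomial checksums.
    
    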

  1. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and insure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  2. SCAILET: An intelligent assistant for satellite ground terminal operations

    NASA Technical Reports Server (NTRS)

    Shahidi, A. K.; Crapo, J. A.; Schlegelmilch, R. F.; Reinhart, R. C.; Petrik, E. J.; Walters, J. L.; Jones, R. E.

    1993-01-01

    NASA Lewis Research Center has applied artificial intelligence to an advanced ground terminal. This software application is being deployed as an experimenter interface to the link evaluation terminal (LET) and was named Space Communication Artificial Intelligence for the Link Evaluation Terminal (SCAILET). The high-burst-rate (HBR) LET provides 30-GHz transmit and 20-GHz receive capability at 220 Mbps for wide band communications technology experiments with the Advanced Communication Technology Satellite (ACTS). The HBR-LET terminal consists of seven major subsystems. A minicomputer controls and monitors these subsystems through an IEEE-488 or RS-232 protocol interface. Programming scripts (test procedures defined by design engineers) configure the HBR-LET and permit data acquisition. However, the scripts are difficult to use, require a steep learning curve, are cryptic, and are hard to maintain. This discourages experimenters from utilizing the full capabilities of the HBR-LET system. An intelligent assistant module was developed as part of the SCAILET software. The intelligent assistant addresses critical experimenter needs by solving and resolving problems that are encountered during the configuring of the HBR-LET system. The intelligent assistant is a graphical user interface with an expert system running in the background. In order to further assist and familiarize an experimenter, an on-line hypertext documentation module was developed and included in the SCAILET software.

  3. Augmented burst-error correction for UNICON laser memory. [digital memory

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1974-01-01

    A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long fire code with code length n greater than 16,768 bits was used as an outer code to augment an existing inner shorter fire code for burst error corrections. The inner fire code is a (80,64) code shortened from the (630,614) code, and it is used to correct a single-burst-error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single-burst-error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented by hardware. A minicomputer, currently used as a UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to obtain a very low error rate in spite of flaws affecting the recorded data.
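    Fire codes, like other cyclic codes, detect and correct bursts through polynomial division over GF(2): the receiver divides the received word by the generator polynomial, and a nonzero remainder (syndrome) localizes the burst. The toy divider below illustrates only the syndrome computation; the 3-bit generator is illustrative and is not the (80,64) inner code or the long outer code described above.

    ```python
    # Toy GF(2) polynomial division, the syndrome step of cyclic burst-error codes.
    def gf2_remainder(bits, gen):
        """Remainder of the polynomial `bits` (MSB first) modulo `gen` over GF(2)."""
        bits = bits[:]                       # work on a copy
        for i in range(len(bits) - len(gen) + 1):
            if bits[i]:                      # leading term present: subtract (XOR) generator
                for j, g in enumerate(gen):
                    bits[i + j] ^= g
        return bits[-(len(gen) - 1):]        # degree < deg(gen) remainder

    word = [1, 0, 1, 1, 0, 0, 1, 0]          # received word (illustrative)
    gen = [1, 0, 1, 1]                       # generator x^3 + x + 1 (illustrative)
    syndrome = gf2_remainder(word, gen)
    print(syndrome)  # nonzero syndrome flags an error pattern
    ```

    A valid codeword (a multiple of the generator) yields an all-zero syndrome; in a real Fire code the nonzero syndrome additionally determines the burst's position and pattern, which is the correction step the UNICON system would run on the minicomputer on demand.
    
    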

  4. Interactive geologic modeling

    SciTech Connect

    Glaeser, J.D.; Krajewski, S.A.

    1984-04-01

    Improved success in finding hydrocarbons and minerals depends on developing geologic models from seismic, gravity, and magnetic data that most closely approximate real-world settings. Although data processing remains the chore of mainframes and minicomputers, interpretations and modeling of geologic and geophysical information now are best accomplished on personal computers because these computers afford the explorationist maximum freedom to shape and fine tune geophysical evaluations. Three case histories use the GEOSIM geophysical modeling systems to delineate exploration targets. The first example is Silurian Niagaran reef trends in the Michigan basin. Here, differences in seismic reef anomalies result from variations in carbonate-evaporite stratigraphy encasing the reefs, reef geometry, and reef reservoir parameters. These variations which influence real seismic-response differences can be successfully matched using appropriate geologic models in generating synthetic seismic reef anomalies. The second example applies gravity and magnetic data to seismic modeling of a Wyoming coal field. Detailed seismic stratigraphy helps locate those portions of the field having multiple seams, although it does not resolve individual economic zones. Gravity data do identify pinchout margins of multiseam zones and pinchouts between principal coals. Magnetic data are then used to delineate the burn (clinker) margin. Seismic modeling of subtle stratigraphic traps is the broader area of exploration interest contained in the first two examples. In the third, successfully modeled and tested examples of lateral changes in deltaic facies and of faulted, unconformity-bounded continent-margin sequences are shown to be successful guides to reinterpretation of seismic data.

  5. The 1983-84 Connecticut 45-Hz-band field-strength measurements

    NASA Astrophysics Data System (ADS)

    Bannister, P. R.

    1986-03-01

    Extremely low frequency (ELF) measurements are made of the transverse horizontal magnetic field strength received in Connecticut. The AN/BSR-1 receiver consists of an AN/UYK-20 minicomputer, a signal timing and interface unit (STIU), a rubidium frequency time standard, two magnetic tape recorders, and a preamplifier. The transmission source of these farfield (1.6-Mm range) measurements is the U.S. Navy's ELF Wisconsin Test Facility (WTF), located in the Chequamegon National Forest in north central Wisconsin, about 8 km south of the village of Clam Lake. The WTF consists of two 22.5-km antennas, one of which is situated approximately in the north-south (NS) direction and the other approximately in the east-west (EW) direction. Each antenna is grounded at both ends. The electrical axis of the WTF EW antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The electrical axis of the WTF NS antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The WTF array can be steered electrically. Its radiated power is approximately 0.5 W at 45 Hz and 1 W at 75 Hz. This report compares 45-Hz-band data taken during 1983-84 with previous 45-Hz-band measurements.

  6. WATEQ4F - a personal computer Fortran translation of the geochemical model WATEQ2 with revised data base

    USGS Publications Warehouse

    Ball, J.W.; Nordstrom, D.K.; Zachmann, D.W.

    1987-01-01

    A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available: one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. The U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE have also been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)
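
Speciation codes in the WATEQ family report mineral saturation as a saturation index, SI = log10(IAP/Ksp), where IAP is the ion activity product. A minimal sketch of that calculation follows; the activity values are hypothetical, and the gypsum log K shown is only a commonly tabulated approximation, not necessarily the value in the WATEQ4F database.

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP/Ksp): negative = undersaturated, zero = equilibrium,
    positive = supersaturated with respect to the mineral."""
    return math.log10(iap / ksp)

# Gypsum (CaSO4.2H2O): IAP = a(Ca++) * a(SO4--) * a(H2O)**2.
# Activities below are hypothetical; log K = -4.58 is an approximate
# literature value used here for illustration only.
a_ca, a_so4, a_h2o = 1.2e-3, 9.0e-4, 1.0
si = saturation_index(a_ca * a_so4 * a_h2o**2, 10.0 ** -4.58)
```

A negative SI, as here, indicates the water could dissolve more gypsum.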

  7. Managing for the next big thing. Interview by Paul Hemp.

    PubMed

    Ruettgers, M

    2001-01-01

    In this HBR interview, CEO Michael Ruettgers speaks in detail about the managerial practices that have allowed EMC to anticipate and exploit disruptive technologies, market opportunities, and business models ahead of its competitors. He recounts how the company repeatedly ventured into untested markets, ultimately transforming itself from a struggling maker of minicomputer memory boards into a data storage powerhouse and one of the most successful companies of the past decade. The company has achieved sustained and nearly unrivaled revenue, profit, and share-price growth through a number of means. Emphasizing timing and speed, Ruettgers says, is critical. That's meant staggering products rather than developing them sequentially and avoiding the excessive refinements that slow time to market. Indeed, a sense of urgency, Ruettgers explains, has been critical to EMC's success. Processes such as quarterly goal setting and monthly forecasting meetings help maintain a sense of urgency and allow managers to get early glimpses of changes in the market. So does an environment in which personal accountability is stressed and the corporate focus is single-minded. Perhaps most important, the company has procedures to glean insights from customers. Intensive forums involving EMC engineers and leading-edge customers, who typically push for unconventional solutions to their problems, often yield new product features. Similarly, a customer service system that includes real-time monitoring of product use enables EMC to understand customer needs firsthand.

  8. Distribution and clearance of radioactive aerosol on the nasal mucosa.

    PubMed

    McLean, J A; Bacon, J R; Mathews, K P; Thrall, J H; Banas, J M; Hedden, J; Bayne, N K

    1984-03-01

    The distribution and clearance of aerosolized radioactive technetium-99m pertechnetate in physiologic buffered saline were analyzed in four adult asymptomatic human volunteers following delivery into one nostril in the same manner as for nasal challenge testing (i.e., 0.1 ml via a 251 DeVilbiss atomizer powered by a compressor delivering 0.10 +/- 0.01 gm/spray). For comparison, squeeze bottles and spray bottles from commercial sources, a 114 and a 127 DeVilbiss atomizer, and a pipette were employed. Lateral imagery via minicomputer processing was used to determine both distribution and clearance of the radiotracer. The counts after 1 minute were lower following pipette delivery than with the other devices. None yielded discernible, widespread distribution of aerosol throughout the nasal cavity. Following delivery from the 251 atomizer, mean clearance at 17 minutes was 60.0%. Similar clearance rates were obtained with the other spraying methods except for lower values with the squeeze bottle. Analysis of six-hour clearance studies by linear regression showed a relatively rapid initial phase, which is probably due largely to mucociliary clearance, and a prolonged late phase related to the very slow disappearance of residual material located far anteriorly in the nose. Achieving good initial retention and rapid clearance of material deposited anteriorly in the nose are desirable attributes of devices employed for administering materials intranasally.

  9. An imaging system for PLIF/Mie measurements for a combusting flow

    NASA Technical Reports Server (NTRS)

    Wey, C. C.; Ghorashi, B.; Marek, C. J.; Wey, C.

    1990-01-01

    The equipment required to establish an imaging system can be divided into four parts: (1) the light source and beam-shaping optics; (2) camera and recording; (3) image acquisition and processing; and (4) computer and output systems. A pulsed, Nd:YAG-pumped, frequency-doubled dye laser, which can freeze motion in the flowfield, is used as the illumination source. A set of lenses forms the laser beam into a sheet. The induced fluorescence is collected by a UV-enhanced lens and passes through a UV-enhanced microchannel-plate intensifier which is optically coupled to a gated solid-state CCD camera. The output of the camera is simultaneously displayed on a monitor and recorded on either a laser videodisc set or a Super VHS VCR. The videodisc set is controlled by a minicomputer via a connection to the RS-232C interface terminals. The imaging system is connected to the host computer by a bus repeater and can be multiplexed between four video input sources. Sample images from a planar shear layer experiment are presented to show the processing capability of the imaging system with the host computer.

  10. Pharmacist monitoring of parenteral nutrition: clinical and cost effectiveness.

    PubMed

    Mutchie, K D; Smith, K A; MacKay, M W; Marsh, C; Juluson, D

    1979-06-01

    The effect of pharmacist involvement in total parenteral nutrition (TPN) therapy on patient outcome and cost of therapy was studied. Data from 26 patients who received standard TPN solutions without pharmacist monitoring (Group 1) were compared with those from 26 patients whose TPN therapy was individualized (by use of a minicomputer) and monitored by a pharmacist (Group 2). Six patients from each group who were 35 days of age or younger and who received TPN as the only caloric source for 8 to 20 days were compared for clinical response. Mean duration of TPN therapy increased from 12.3 +/- 9 days for Group 1 to 14.8 +/- 12 days for Group 2, and the TPN use rate for Group 2 was 31% above that for Group 1. The mean daily charge for TPN was greater for Group 1 ($72.00) than for Group 2 ($50.18). The pharmacy's mean cost per course of TPN for Group 2 was $44.10 less than that for Group 1. The mean weight gain in Group 1 was significantly less (4 g/day) than that in Group 2 (17 g/day) (p less than 0.05) for the six patients per group compared. Pharmacist monitoring of TPN reduced the pharmacy's costs and patient charges for TPN and improved the patients' clinical responses to TPN.

  11. Mathematical models for space shuttle ground systems

    NASA Technical Reports Server (NTRS)

    Tory, E. G.

    1985-01-01

    Math models are a series of algorithms, comprised of algebraic equations and Boolean Logic. At Kennedy Space Center, math models for the Space Shuttle Systems are performed utilizing the Honeywell 66/80 digital computers, Modcomp II/45 Minicomputers and special purpose hardware simulators (MicroComputers). The Shuttle Ground Operations Simulator operating system provides the language formats, subroutines, queueing schemes, execution modes and support software to write, maintain and execute the models. The ground systems presented consist primarily of the Liquid Oxygen and Liquid Hydrogen Cryogenic Propellant Systems, as well as liquid oxygen External Tank Gaseous Oxygen Vent Hood/Arm and the Vehicle Assembly Building (VAB) High Bay Cells. The purpose of math modeling is to simulate the ground hardware systems and to provide an environment for testing in a benign mode. This capability allows the engineers to check out application software for loading and launching the vehicle, and to verify the Checkout, Control, & Monitor Subsystem within the Launch Processing System. It is also used to train operators and to predict system response and status in various configurations (normal operations, emergency and contingent operations), including untried configurations or those too dangerous to try under real conditions, i.e., failure modes.

  12. The 50-MHz meteor radar observation at Syowa Station, Antarctica

    NASA Technical Reports Server (NTRS)

    Tanaka, T.; Ogawa, T.; Igarashi, K.; Fujii, R.

    1985-01-01

    The 50-MHz Doppler radar installed at Syowa Station (69 deg 00'S, 39 deg 35'E), Antarctica, in 1982 can detect a meteor echo continuously if an operator assigns the meteor-mode operation to the radar. The radar has two narrow antenna beams (4 deg in the horizontal plane), one toward geomagnetic south and the other toward approximately geographic south, with a crossing angle of about 33 deg. The minicomputer annexed to the radar controls the transmission and reception of a 50-MHz wave. If the receiver detects a meteor echo, a flag signal is sent to the computer. The computer then determines the echo range with a time resolution of 1 microsecond and samples the Doppler signal and echo intensity at the particular range (R) every 200 microseconds for 1 s. The line-of-sight velocity (V sub D) of the echo trail is calculated from the output of the Doppler signal detection circuit, which has an offset frequency, by using the so-called zero-crossing method. The echo amplitude decay time, calculated by a least-mean-square method, is used to obtain the ambipolar diffusion coefficient (D) and then to calculate the echo height (H). The data are recorded on magnetic tapes together with V sub D, D, H, and R for later analysis in Japan. About 120 days of observations were made during 1982-1983. Some early results are presented.
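
The zero-crossing method mentioned in this record estimates the Doppler frequency by counting sign changes of the offset Doppler signal. A rough sketch under assumed parameters follows: a hypothetical 500-Hz offset, a synthetic 10-Hz Doppler shift, and the 200-microsecond sampling interval quoted in the abstract; the actual sign convention and circuit details are not specified in the record.

```python
import numpy as np

C = 3.0e8
F_RADAR = 50.0e6                   # 50-MHz meteor radar
WAVELENGTH = C / F_RADAR           # 6 m

def zero_crossing_frequency(signal, dt):
    """Estimate frequency as (number of sign changes) / (2 * duration)."""
    s = np.asarray(signal)
    crossings = np.count_nonzero(np.signbit(s[:-1]) != np.signbit(s[1:]))
    duration = (len(s) - 1) * dt
    return crossings / (2.0 * duration)

def line_of_sight_velocity(f_measured, f_offset):
    """Two-way Doppler: v = (f - f_offset) * wavelength / 2.
    (The sign convention relative to the offset is assumed here.)"""
    return (f_measured - f_offset) * WAVELENGTH / 2.0

# Synthetic offset Doppler signal: 500-Hz offset plus a 10-Hz Doppler shift,
# sampled every 200 microseconds for 1 s as in the abstract.
dt = 2.0e-4
t = np.arange(0.0, 1.0, dt)
sig = np.sin(2.0 * np.pi * 510.0 * t)
f_est = zero_crossing_frequency(sig, dt)
v = line_of_sight_velocity(f_est, 500.0)
```

With these assumed numbers the 10-Hz shift maps to a line-of-sight velocity of roughly 30 m/s.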

  13. High Frequency Sampling of TTL Pulses on a Raspberry Pi for Diffuse Correlation Spectroscopy Applications

    PubMed Central

    Tivnan, Matthew; Gurjar, Rajan; Wolf, David E.; Vishwanath, Karthik

    2015-01-01

    Diffuse Correlation Spectroscopy (DCS) is a well-established optical technique that has been used for non-invasive measurement of blood flow in tissues. Instrumentation for DCS includes a correlation device that computes the temporal intensity autocorrelation of a coherent laser source after it has undergone diffuse scattering through a turbid medium. Typically, the signal acquisition and its autocorrelation are performed by a correlation board. These boards have dedicated hardware to acquire and compute intensity autocorrelations of rapidly varying input signal and usually are quite expensive. Here we show that a Raspberry Pi minicomputer can acquire and store a rapidly varying time-signal with high fidelity. We show that this signal collected by a Raspberry Pi device can be processed numerically to yield intensity autocorrelations well suited for DCS applications. DCS measurements made using the Raspberry Pi device were compared to those acquired using a commercial hardware autocorrelation board to investigate the stability, performance, and accuracy of the data acquired in controlled experiments. This paper represents a first step toward lowering the instrumentation cost of a DCS system and may offer the potential to make DCS become more widely used in biomedical applications. PMID:26274961
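
The numerical processing this record describes amounts to computing the normalized intensity autocorrelation, g2(tau) = <I(t)I(t+tau)> / <I>^2, from the sampled signal. A minimal sketch follows, using synthetic data with an assumed correlation time; it is not the authors' code.

```python
import numpy as np

def g2(counts, max_lag):
    """Normalized intensity autocorrelation g2(tau) = <I(t)I(t+tau)> / <I>^2."""
    i = np.asarray(counts, dtype=float)
    mean_sq = i.mean() ** 2
    return np.array([np.mean(i[:-k] * i[k:]) / mean_sq
                     for k in range(1, max_lag + 1)])

# Synthetic intensity trace with a ~10-sample correlation time, standing in
# for photon counts from the diffusely scattered laser light.
rng = np.random.default_rng(0)
kernel = np.exp(-np.arange(50) / 10.0)
intensity = 10.0 + np.convolve(rng.normal(size=20000), kernel, mode="same")
curve = g2(intensity, 40)      # decays toward 1 as correlations die out
```

Dedicated correlator boards compute this quantity in hardware at much higher rates; the point of the record is that a minicomputer-class device can do it in software.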

  14. Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1994-01-01

    Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
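
The core of such a port is re-expressing per-element loops as whole-array operations that vector hardware can pipeline. As a loose analogy in NumPy (with a purely hypothetical per-cell logistic growth rule, not the actual grassland model), the same arithmetic can be written in scalar and vector form:

```python
import numpy as np

# Hypothetical per-cell logistic growth rule (not the actual grassland model).
def step_scalar(biomass, rate, capacity):
    out = np.empty_like(biomass)
    for i in range(len(biomass)):          # one cell at a time: scalar code
        out[i] = biomass[i] + rate * biomass[i] * (1.0 - biomass[i] / capacity)
    return out

def step_vector(biomass, rate, capacity):
    # Same arithmetic as whole-array operations: the form a vectorizing
    # compiler (or vector hardware) can exploit.
    return biomass + rate * biomass * (1.0 - biomass / capacity)

b = np.linspace(0.1, 0.9, 100_000)
same = np.allclose(step_scalar(b, 0.05, 1.0), step_vector(b, 0.05, 1.0))
```

The two functions produce identical results; only the vector form exposes the data parallelism.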

  15. Parallel processing research in the former Soviet Union

    SciTech Connect

    Dongarra, J.J.; Snyder, L.; Wolcott, P.

    1992-03-01

    This technical assessment report examines strengths and weaknesses of parallel processing research and development in the Soviet Union from the 1980s to June 1991. The assessment was carried out by a panel of US scientists who are experts on parallel processing hardware, software, algorithms, and applications, and on Soviet computing. Soviet computer research and development organizations have pursued many of the major avenues of inquiry related to parallel processing that the West has chosen to explore. But the limited size and substantial breadth of their effort have limited the collective depth of Soviet activity. Even more serious limitations (and delays) of Soviet achievement in parallel processing research can be traced to shortcomings of the Soviet computer industry, which was unable to supply adequate, reliable computer components. Without the ability to build, demonstrate, and test embodiments of their ideas in actual high-performance parallel hardware, both the scope of activity and the success of Soviet parallel processing researchers were severely limited. The quality of the Soviet parallel processing research assessed varied from very sound and interesting to pedestrian, with most of the groups at the major hardware and software centers to which the work is largely confined doing good (or at least serious) research. In a few instances, interesting and competent parallel language development work was found at institutions not associated with hardware development efforts. Unlike Soviet mainframe and minicomputer developers, Soviet parallel processing researchers have not concentrated their efforts on reverse-engineering specific Western systems. No evidence was found of successful Soviet attempts to use breakthroughs in parallel processing technology to "leapfrog" impediments and limitations that Soviet industrial weakness in microelectronics and other computer manufacturing areas impose on the performance of high-end Soviet computers.

  17. Graphics processing, video digitizing, and presentation of geologic information

    SciTech Connect

    Sanchez, J.D. )

    1990-02-01

    Computer users have unparalleled opportunities to use powerful desktop computers to generate, manipulate, analyze and use graphic information for better communication. Processing graphic geologic information on a personal computer like the Amiga used for the projects discussed here enables geoscientists to create and manipulate ideas in ways once available only to those with access to large budgets and large mainframe computers. Desktop video applications such as video digitizing and powerful graphic processing application programs add a new dimension to the creation and manipulation of geologic information. Videotape slide shows and animated geology give geoscientists new tools to examine and present information. Telecommunication programs such as ATalk III, which can be used as an all-purpose telecommunications program or can emulate a Tektronix 4014 terminal, allow the user to access Sun and Prime minicomputers and manipulate graphic geologic information stored there. Graphics information displayed on the monitor screen can be captured and saved in the standard Amiga IFF graphic format. These IFF files can be processed using image processing programs such as Butcher. Butcher offers edge mapping, resolution conversion, color separation, false colors, toning, positive-negative reversals, etc. Multitasking and easy expansion that includes IBM-XT and AT co-processing offer unique capabilities for graphic processing and file transfer between Amiga-DOS and MS-DOS. Digital images produced by satellites and airborne scanners can be analyzed on the Amiga using the A-Image processing system developed by the CSIRO Division of Mathematics and Statistics and the School of Mathematics and Computing at Curtin University, Australia.

  18. Research, development and demonstration of nickel-zinc batteries for electric vehicle propulsion. Annual report, 1979. [70 W/lb

    SciTech Connect

    Not Available

    1980-06-01

    This second annual report under Contract No. 31-109-39-4200 covers the period July 1, 1978 through August 31, 1979. The program demonstrates the feasibility of the nickel-zinc battery for electric vehicle propulsion. The program is divided into seven distinct but highly interactive tasks collectively aimed at the development and commercialization of nickel-zinc technology. These basic technical tasks are separator development, electrode development, product design and analysis, cell/module battery testing, process development, pilot manufacturing, and thermal management. A Quality Assurance Program has also been established. Significant progress has been made in the understanding of separator failure mechanisms, and a generic category of materials has been specified for the 300+ deep discharge (100% DOD) applications. Shape change has been reduced significantly. A methodology has been generated with the resulting hierarchy: cycle life cost, volumetric energy density, peak power at 80% DOD, gravimetric energy density, and sustained power. Generation I design full-sized 400-Ah cells have yielded in excess of 70 W/lb at 80% DOD. Extensive testing of cells, modules, and batteries is done in a minicomputer-based testing facility. The best life attained with electric vehicle-size cell components is 315 cycles at 100% DOD (1.0V cutoff voltage), while four-cell (approx. 6V) module performance has been limited to about 145 deep discharge cycles. The scale-up of processes for production of components and cells has progressed to facilitate component production rates of thousands per month. Progress in the area of thermal management has been significant, with the development of a model that accurately represents heat generation and rejection rates during battery operation. For the balance of the program, cycle life of > 500 has to be demonstrated in modules and full-sized batteries. 40 figures, 19 tables. (RWR)

  19. The digital geologic map of Colorado in ARC/INFO format, Part B. Common files

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely for the purpose of adding the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs, along with a lineset and shadeset for the HP650c DesignJet printer, have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050

  20. The digital geologic map of Colorado in ARC/INFO format, Part A. Documentation

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely for the purpose of adding the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs, along with a lineset and shadeset for the HP650c DesignJet printer, have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050

  1. Acoustic systems for the measurement of streamflow

    USGS Publications Warehouse

    Laenen, Antonius; Smith, Winchell

    1983-01-01

    The acoustic velocity meter (AVM), also referred to as an ultrasonic flowmeter, has been an operational tool for the measurement of streamflow since 1965. Very little information is available concerning AVM operation, performance, and limitations. The purpose of this report is to consolidate information in such a manner as to provide a better understanding of the application of this instrumentation to streamflow measurement. AVM instrumentation is highly accurate and nonmechanical. Most commercial AVM systems that measure streamflow use the time-of-travel method to determine a velocity between two points. The systems operate on the principle that the point-to-point upstream travel time of sound is longer than the downstream travel time, and this difference can be monitored and measured accurately by electronics. AVM equipment has no practical upper limit of measurable velocity if sonic transducers are securely placed and adequately protected. AVM systems used in streamflow measurement generally operate with a resolution of ±0.01 meter per second, but this depends on system frequency, path length, and signal attenuation. In some applications the performance of AVM equipment may be degraded by multipath interference, signal bending, signal attenuation, and variable streamline orientation. Presently used minicomputer systems, although expensive to purchase and maintain, perform well. Increased use of AVM systems probably will be realized as smaller, less expensive, and more conveniently operable microprocessor-based systems become readily available. Available AVM equipment should be capable of flow measurement in a wide variety of situations heretofore untried. New signal-detection techniques and communication linkages can provide additional flexibility to the systems so that operation is possible in more river and estuary situations.
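
The time-of-travel method noted above exploits the fact that sound travels faster downstream than upstream along a diagonal acoustic path, giving v = (L / (2 cos θ))(1/t_down − 1/t_up). A small sketch with assumed values (a 200-m path at a 45-degree crossing angle; not from this report):

```python
import math

def flow_velocity(path_length, theta_deg, t_down, t_up):
    """Time-of-travel AVM: v = (L / (2 cos theta)) * (1/t_down - 1/t_up),
    where theta is the angle between the acoustic path and the flow."""
    theta = math.radians(theta_deg)
    return (path_length / (2.0 * math.cos(theta))) * (1.0 / t_down - 1.0 / t_up)

# Assumed values: 200-m diagonal path at 45 deg, sound speed 1480 m/s in water,
# true streamflow velocity 1.0 m/s.
L, THETA, C_WATER, V_TRUE = 200.0, 45.0, 1480.0, 1.0
along = V_TRUE * math.cos(math.radians(THETA))
t_down = L / (C_WATER + along)     # sound rides with the current
t_up = L / (C_WATER - along)       # and fights it going back
v = flow_velocity(L, THETA, t_down, t_up)
```

Note that the speed of sound cancels out of the velocity estimate, which is why the method tolerates changes in water temperature and salinity.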

  2. COMPUTER MODEL OF TEMPERATURE DISTRIBUTION IN OPTICALLY PUMPED LASER RODS

    NASA Technical Reports Server (NTRS)

    Farrukh, U. O.

    1994-01-01

    Managing the thermal energy that accumulates within a solid-state laser material under active pumping is of critical importance in the design of laser systems. Earlier models that calculated the temperature distribution in laser rods were single dimensional and assumed laser rods of infinite length. This program presents a new model which solves the temperature distribution problem for finite dimensional laser rods and calculates both the radial and axial components of temperature distribution in these rods. The modeled rod is either side-pumped or end-pumped by a continuous or a single pulse pump beam. (At the present time, the model cannot handle a multiple pulsed pump source.) The optical axis is assumed to be along the axis of the rod. The program also assumes that it is possible to cool different surfaces of the rod at different rates. The user defines the laser rod material characteristics, determines the types of cooling and pumping to be modeled, and selects the time frame desired via the input file. The program contains several self checking schemes to prevent overwriting memory blocks and to provide simple tracing of information in case of trouble. Output for the program consists of 1) an echo of the input file, 2) diffusion properties, radius and length, and time for each data block, 3) the radial increments from the center of the laser rod to the outer edge of the laser rod, and 4) the axial increments from the front of the laser rod to the other end of the rod. This program was written in Microsoft FORTRAN77 and implemented on a Tandon AT with a 287 math coprocessor. The program can also run on a VAX 750 mini-computer. It has a memory requirement of about 147 KB and was developed in 1989.
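
A drastically simplified version of such a thermal model can be sketched with an explicit finite-difference scheme: one spatial dimension, fixed-temperature cooled ends, and made-up material constants (the actual program solves both the radial and axial components for finite rods).

```python
import numpy as np

def diffuse_1d(u0, alpha, dx, dt, steps):
    """Explicit finite differences for u_t = alpha * u_xx with fixed
    (perfectly cooled) ends; stable only when r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(steps):
        u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

# Hypothetical initial state: hot interior (pump energy deposition), ends at 0.
u0 = np.zeros(51)
u0[1:-1] = 100.0
profile = diffuse_1d(u0, alpha=1.0e-5, dx=1.0e-3, dt=0.04, steps=200)
```

The cooled boundaries pull heat out of the rod, so the profile sags near the ends while the center stays hot, the qualitative behavior the model quantifies.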

  3. a Portable Apparatus for Absolute Measurements of the Earth's Gravity.

    NASA Astrophysics Data System (ADS)

    Zumberge, Mark Andrew

    We have developed a new, portable apparatus for making absolute measurements of the acceleration due to the earth's gravity. We use the method of interferometrically determining the acceleration of a freely falling corner-cube prism. The falling object is surrounded by a chamber which is driven vertically inside a fixed vacuum chamber. This falling chamber is servoed to track the falling corner-cube to shield it from drag due to background gas. In addition, the drag-free falling chamber removes the need for a magnetic release, shields the falling object from electrostatic forces, and provides a means of both gently arresting the falling object and quickly returning it to its start position, to allow rapid acquisition of data. A synthesized long-period isolation device reduces the noise due to seismic oscillations. A new type of Zeeman laser is used as the light source in the interferometer, and its wavelength is compared with that of an iodine-stabilized laser. The times of occurrence of 45 interference fringes are measured to within 0.2 nsec over a 20-cm drop and are fit to a quadratic by an on-line minicomputer. 150 drops can be made in ten minutes, resulting in a value of g having a precision of 3 to 6 parts in 10^9. Systematic errors have been determined to be less than 5 parts in 10^9 through extensive tests. Three months of gravity data have been obtained with a reproducibility ranging from 5 to 10 parts in 10^9. The apparatus has been designed to be easily portable. Field measurements are planned for the immediate future. An accuracy of 6 parts in 10^9 corresponds to a height sensitivity of 2 cm. Vertical motions in the earth's crust and tectonic density changes that may precede earthquakes are to be investigated using this apparatus.
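
Recovering g from the timed fringes is a least-squares quadratic fit of position against time, z(t) = z0 + v0·t + (g/2)·t². The sketch below synthesizes 45 noiseless fringe times (positions assumed equally spaced over the 20-cm drop, standing in for the electronically scaled interference fringes) and fits them; the release velocity and g value are assumptions used only to generate the data.

```python
import numpy as np

G_TRUE = 9.80          # assumed value, used only to synthesize the data
V0 = 0.1               # assumed release velocity (m/s)

# 45 fringe positions, taken here as equally spaced over the 20-cm drop.
z = np.arange(1, 46) * (0.20 / 45.0)

# Invert z = V0*t + (G/2)*t**2 for the fringe times (positive root).
t = (-V0 + np.sqrt(V0**2 + 2.0 * G_TRUE * z)) / G_TRUE

# Least-squares quadratic fit, as done by the on-line minicomputer;
# g is twice the leading coefficient.
coeffs = np.polyfit(t, z, 2)
g_fit = 2.0 * coeffs[0]
```

With noiseless input the fit recovers g exactly; in the instrument, averaging over many drops beats down timing noise to the parts-in-10^9 level.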

  4. Research Directed at Developing a Classical Theory to Describe Isotope Separation of Polyatomic Molecules Illuminated by Intense Infrared Radiation. Final Report for period May 7, 1979 to September 30, 1979; Extension December 31, 1997

    DOE R&D Accomplishments Database

    Lamb, W. E. Jr.

    1981-12-01

    This final report describes research on the theory of isotope separation produced by the illumination of polyatomic molecules by intense infrared laser radiation. This process is investigated by treating the molecule, sulfur hexafluoride, as a system of seven classical particles that obey the Newtonian equations of motion. A minicomputer is used to integrate these differential equations. The particles are acted on by interatomic forces, and by the time-dependent electric field of the laser. We have a very satisfactory expression for the interaction of the laser and the molecule which is compatible with infrared absorption and spectroscopic data. The interatomic potential is capable of improvement, and progress on this problem is still being made. We have made several computer runs of the dynamical behavior of the molecule using a reasonably good model for the interatomic force law. For the laser parameters chosen, we find that typically the molecule passes quickly through the resonance region into the quasi-continuum and even well into the real continuum before dissociation actually occurs. When viewed on a display terminal, the motions are exceedingly complex. As an aid to the visualization of the process, we have made a number of 16 mm movies depicting a three-dimensional representation of the motion of the seven particles. These show even more clearly the enormous complexity of the motions, and make clear the desirability of finding ways of characterizing the motion in simple ways without giving all of the numerical detail. One of the ways to do this is to introduce statistical parameters such as a temperature associated with the distribution of kinetic energies of the single particle. We have made such an analysis of our data runs, and have found favorable indications that such methods will prove useful in keeping track of the dynamical histories.
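The Newtonian integration described above can be illustrated in miniature. The report does not specify the integrator; the sketch below assumes velocity-Verlet and replaces the seven-particle SF6 model with a single harmonic coordinate driven by an oscillating field, which already shows the resonant energy pumping that starts the molecule up the ladder. All parameters are illustrative.

```python
import numpy as np

def integrate(omega0, e_drive, omega_drive, dt, nsteps):
    """Velocity-Verlet integration of x'' = -omega0^2 x + E cos(omega_drive t)."""
    x, v = 0.0, 0.0
    xs = []
    for n in range(nsteps):
        t = n * dt
        a = -omega0**2 * x + e_drive * np.cos(omega_drive * t)
        x_new = x + v*dt + 0.5*a*dt**2                          # position update
        a_new = -omega0**2 * x_new + e_drive * np.cos(omega_drive * (t + dt))
        v = v + 0.5 * (a + a_new) * dt                          # velocity update
        x = x_new
        xs.append(x)
    return np.array(xs)
```

Driving at resonance (omega_drive = omega0) makes the amplitude grow steadily, while an off-resonance drive stays bounded; with anharmonic interatomic potentials the resonance detunes as the amplitude grows, which is why the full model's trajectories are so complex.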

  5. [Ventricular activation sequence estimated by body surface isochrone map].

    PubMed

    Hayashi, H; Ishikawa, T; Takami, K; Kojima, H; Yabe, S; Ohsugi, S; Miyachi, K; Sotobata, I

    1985-06-01

    This study was performed to evaluate the usefulness of the body surface isochrone map (VAT map) for identifying the ventricular activation sequence, and it was correlated with the isopotential map. Subjects consisted of 42 normal healthy adults, 18 patients with artificial ventricular pacemakers, and 100 patients with ventricular premature beats (VPB). The sites of pacemaker implantations were the right ventricular endocardial apex (nine cases), right ventricular epicardial apex (five cases), right ventricular inflow tract (one case), left ventricular epicardial apex (one case), and posterior base of the left ventricle via the coronary sinus (two cases). An isopotential map was recorded by the mapper HPM-6500 (Chunichi-Denshi Co.) on the basis of an 87 unipolar lead ECG, and a VAT isochrone map was drawn by a minicomputer. The normal VAT map was classified by type according to alignment of isochrone lines, and their frequency was 57.1% for type A, 16.7% for type B, and 26.2% for type C. In the VAT map of ventricular pacing, the body surface area of initial isochrone lines represented well the sites of pacemaker stimuli. In the VAT map of VPB, the sites of origin of VPB agreed well with those as determined by the previous study using an isopotential map. The density of the isochrone lines suggested the mode of conduction via the specialized conduction system or ventricular muscle. The VAT map is a very useful diagnostic method to predict the ventricular activation sequence more directly in a single sheet of the map. PMID:2419457

  6. Automated noninvasive determination of mixed venous pCO2.

    PubMed

    Leavell, K; Finkelstein, S M; Warwick, W J; Budd, J R

    1986-01-01

    The determination of mixed venous pCO2 is desirable for assessing the metabolic and respiratory status of a patient. A totally automated, laboratory computer-controlled noninvasive system has been developed to determine mixed venous pCO2 by an equilibrium rebreathing method or by an exponential compartmental analysis for cases in which equilibrium is not achieved. A gas mixture is charged to a 2-liter anesthesia bag contained in a thermostatically controlled chamber used to maintain the temperature at 37 degrees C. This feature improves upon past rebreathing methods and eliminates water vapor as a variable in gas composition measurement. This bag is connected to a rebreathing circuit controlled by a minicomputer. The subject breathes from a mouthpiece attached to a two-way valve and rebreathes the gas mixture for a period of 30 seconds. Inspirate and expirate hoses are placed in the rebreathing bag to ensure a more uniform gas distribution than is generally found in rebreathing systems. Exchange of CO2 takes place between lungs and rebreathing bag, and the concentration of CO2 is continuously monitored by a mass spectrometer. After a period of time, the concentration of CO2 in the rebreathing bag, the alveoli, and the mixed venous blood come into equilibrium, demonstrated by a plateau on the record of CO2 concentration vs. time. Compartmental analysis predicts the mixed venous pCO2 even if an equilibrium is not established. This feature is a significant benefit of this new method, eliminating problems associated with establishing an equilibrium, such as gas mixture volume adjustment, recirculation, and poor ventilation. The predicted value agrees with the equilibrium value for cases in which equilibrium is reached.(ABSTRACT TRUNCATED AT 250 WORDS)
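The compartmental idea above can be sketched simply. Assuming a single-compartment approach C(t) = Ceq - (Ceq - C0)·exp(-t/tau) (the paper's actual model details are not given in the abstract), uniformly sampled increments shrink geometrically, so the equilibrium value can be extrapolated before any plateau appears:

```python
import numpy as np

def extrapolate_ceq(c):
    """Extrapolate the equilibrium value of a uniformly sampled exponential approach.

    For C_k = Ceq - B*r^k, successive increments d_k = C_{k+1} - C_k = B*r^k*(1-r)
    share a common ratio r, and Ceq = C_last + d_last * r / (1 - r).
    """
    d = np.diff(c)          # successive increments
    r = d[-1] / d[-2]       # common ratio, exp(-dt/tau)
    return c[-1] + d[-1] * r / (1.0 - r)
```

With real, noisy CO2 traces a least-squares fit over all samples would be preferred to this three-point extrapolation, but the principle of predicting the plateau without reaching it is the same.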

  7. Energy from true in situ processing of Antrim shale: sampling and analytical systems

    SciTech Connect

    Pihlaja, R.K.

    1980-08-01

    Reliable on-line analysis of production gas composition is fundamental to the success of an in situ extraction experiment in Antrim shale. An automated sampling and analysis system designed to meet this need has provided high-quality analytical data for three extraction trials without a single day of lost data. The production gas samples were routinely analyzed by both gas chromatography (GC) and a bank of continuous on-line process gas analyzers. The GC's analyzed for H₂, O₂ + Ar, N₂, CO, CO₂, SO₂, H₂S, individual C₁-C₅ hydrocarbon species, and lumped C₆+ hydrocarbon species, each analysis requiring up to an hour to run. The process gas analyzers measured CO, CO₂, total hydrocarbons (% vol CH₄ equivalent), and O₂ continuously. The process gas analyzers were shown to be especially well suited for this application because of their fast response. The GC data provided itemized composition details as well as an independent check of process analyzer data. Sample selection, data collection, and processing from both the GC's and process gas analyzers were handled by a Perkin Elmer Sigma-10 minicomputer. The combination of the two analytical techniques and automated data handling yielded a versatile and powerful system. The production gas sampling system demonstrated the feasibility of transmitting a properly treated gas sample through a long (1000 ft), 1/8-in. diameter sample line. The small-bore tubing allowed the analytical instruments to be located a safe distance away from the well heads and yet maintain a reasonably short sample transport lag time without handling large volumes of gas.

  8. EEG cartography of a night of sleep and dreams. A longitudinal study with provoked awakenings.

    PubMed

    Etevenon, P; Guillou, S

    1986-01-01

    A night of sleep has been recorded under the conditions of a sleep laboratory. The subject was a woman of 55 years, well-trained in dream recall. The subject was awakened three times at the end of sleep cycles. EEG was monitored for 7 h with a 16-channel polygraph (REEGA 16, Alvar) connected to two systems of EEG cartography: minicomputers (HP Fourier Analyser 5451 C and HP 1000) and a microinformatic system (Cartovar, Alvar). A second 8-channel polygraph (Mini-huit, Alvar) was used in parallel for polygraphy (EOG, EMG, respiration, actogram, EKG). Based on immediate visual inspection of EEG and polygraphic tracings, 500 EEG recordings of selected epochs (of 6, 30 or 60 s length) have been quantified, submitted on-line to spectral analysis (on Cartovar) and stored on floppy disks for further printing of EEG maps. The 16 EEG channels were placed over the scalp according to the 10/20 system and following Giannitrapani's placement. We have chosen a common average electrode. For each of the 500 EEG epochs, four EEG maps were edited (raw EEG between 0 and 30 Hz, 0 and 7 Hz, 8 and 12 Hz, 13 and 30 Hz). Each of these 2,000 maps has been checked visually in comparison with the polygraphic recordings for visual rejection of artifacts or transitory states. The remaining EEG epochs and EEG maps, scored by 2 independent trained sleep scorers, were classified into stages I, II, III-IV, and REM, apart from control runs of active wakefulness with eyes open (EO) and quiet wakefulness with eyes closed (EC), which were undertaken on mini- and microsystems of EEG analysis.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. The digital geologic map of Colorado in ARC/INFO format

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently, the digital version is at 1:500,000 scale, using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely for the purpose of adding the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs, along with a lineset and shadeset for the HP650c DesignJet printer, have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format, Open-File Report 92-050

  10. An Experimental Digital Image Processor

    NASA Astrophysics Data System (ADS)

    Cok, Ronald S.

    1986-12-01

    A prototype digital image processor for enhancing photographic images has been built in the Research Laboratories at Kodak. This image processor implements a particular version of each of the following algorithms: photographic grain and noise removal, edge sharpening, multidimensional image segmentation, image-tone reproduction adjustment, and image-color saturation adjustment. All processing, except for segmentation and analysis, is performed by massively parallel and pipelined special-purpose hardware. This hardware runs at 10 MHz and can be adjusted to handle any size digital image. The segmentation circuits run at 30 MHz. The segmentation data are used by three single-board computers for calculating the tonescale adjustment curves. The system, as a whole, has the capability of completely processing 10 million three-color pixels per second. The grain removal and edge enhancement algorithms represent the largest part of the pipelined hardware, operating at over 8 billion integer operations per second. The edge enhancement is performed by unsharp masking, and the grain removal is done using a collapsed Walsh-Hadamard transform filtering technique (U.S. Patent No. 4549212). These two algorithms can be realized using four basic processing elements, some of which have been implemented as VLSI semicustom integrated circuits. These circuits implement the algorithms with a high degree of efficiency, modularity, and testability. The digital processor is controlled by a Digital Equipment Corporation (DEC) PDP-11 minicomputer and can be interfaced to electronic printing and/or electronic scanning devices. The processor has been used to process over a thousand diagnostic images.
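Unsharp masking, named above as the edge-enhancement method, amounts to sharpened = image + k·(image - blurred). The blur kernel and gain below are illustrative stand-ins (a 3x3 box filter, k = 1), not Kodak's hardware parameters:

```python
import numpy as np

def unsharp_mask(img, k=1.0):
    """Sharpen by adding back the difference between the image and a 3x3 box blur."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    # Average the nine shifted copies of the padded image = 3x3 box blur.
    blurred = sum(padded[i:i+h, j:j+w]
                  for i in range(3) for j in range(3)) / 9.0
    return img + k * (img - blurred)
```

Across a step edge the result undershoots on the dark side and overshoots on the bright side, the classic "ringing" that makes edges look crisper; flat regions are unchanged.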

  11. Cyclic axial-torsional deformation behavior of a cobalt-base superalloy

    SciTech Connect

    Bonacuse, P.J.; Kalluri, S.

    1992-11-01

    Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic high-temperature, deformation behavior of a wrought cobalt-based superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic, in-phase and out-of-phase, axial torsional tests. For in-phase tests three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73.
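The loading parameters defined above (lambda and phi) can be made concrete. The sketch below generates the commanded strain waveforms and a von Mises equivalent strain (assuming incompressible straining, so eps_eq = sqrt(eps² + gam²/3)); note that lambda = sqrt(3) ≈ 1.73 weights the two channels equally in that measure, which is presumably why it anchors the tested lambda values.

```python
import numpy as np

def strain_paths(ea, lam, phi_deg, n=360):
    """Axial strain eps = ea*sin(wt); engineering shear gam = lam*ea*sin(wt - phi)."""
    wt = np.linspace(0.0, 2*np.pi, n)
    eps = ea * np.sin(wt)
    gam = lam * ea * np.sin(wt - np.radians(phi_deg))
    # von Mises equivalent strain under the incompressibility assumption
    eq = np.sqrt(eps**2 + (gam / np.sqrt(3.0))**2)
    return eps, gam, eq
```

For lambda = sqrt(3) and phi = 90 deg the equivalent strain is constant over the cycle (a circular path in the eps vs. gam/sqrt(3) plane), whereas the in-phase case sweeps the equivalent strain from zero to its peak, one reason in-phase and out-of-phase cycling harden the material differently.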

  12. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers during the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without using extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  13. Design of a real-time wind turbine simulator using a custom parallel architecture

    NASA Technical Reports Server (NTRS)

    Hoffman, John A.; Gluck, R.; Sridhar, S.

    1995-01-01

    The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for analysis of wind energy systems in real time. The new processor has been named: the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel. The modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is very much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CU's) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CU's are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU causes several tasks to be done in each cycle, including an I/O operation and the combination of a multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CU's are interfaced to each other and to other portions of the simulation using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CU's can be added in any number to share a given computational load. This flexible bus feature is very different from many other parallel processors which usually have a throughput limit because of rigid bus architecture.

  14. Immunoglobulin surface-binding kinetics studied by total internal reflection with fluorescence correlation spectroscopy.

    PubMed Central

    Thompson, N L; Axelrod, D

    1983-01-01

    An experimental application of total internal reflection with fluorescence correlation spectroscopy (TIR/FCS) is presented. TIR/FCS is a new technique for measuring the binding and unbinding rates and surface diffusion coefficient of fluorescent-labeled solute molecules in equilibrium at a surface. A laser beam totally internally reflects at the solid-liquid interface, selectively exciting surface-adsorbed molecules. Fluorescence collected by a microscope from a small, well-defined surface area of approximately 5 μm² spontaneously fluctuates as solute molecules randomly bind to, unbind from, and/or diffuse along the surface in chemical equilibrium. The fluorescence is detected by a photomultiplier and autocorrelated on-line by a minicomputer. The shape of the autocorrelation function depends on the bulk and surface diffusion coefficients, the binding rate constants, and the shape of the illuminated and observed region. The normalized amplitude of the autocorrelation function depends on the average number of molecules bound within the observed area. TIR/FCS requires no spectroscopic or thermodynamic change between dissociated and complexed states and no extrinsic perturbation from equilibrium. Using TIR/FCS, we determine that rhodamine-labeled immunoglobulin and insulin each nonspecifically adsorb to serum albumin-coated fused silica with both reversible and irreversible components. The characteristic time of the most rapidly reversible component measured is approximately 5 ms and is limited by the rate of bulk diffusion. Rhodamine-labeled bivalent antibodies to dinitrophenyl (DNP) bind to DNP-coated fused silica virtually irreversibly. Univalent Fab fragments of these same antibodies appear to specifically bind to DNP-coated fused silica, accompanied by a large amount of nonspecific binding. TIR/FCS is shown to be a feasible technique for measuring adsorption/desorption kinetic rates at equilibrium. In suitable systems where nonspecific binding is low, TIR
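The on-line autocorrelation described above can be sketched as the fluctuation autocorrelation G(tau) = <dF(t)·dF(t+tau)> / <F>², computed from a sampled intensity trace. This is a generic normalization convention; the paper's exact definition may differ.

```python
import numpy as np

def autocorr(f, max_lag):
    """Normalized fluctuation autocorrelation of an intensity trace f."""
    f = np.asarray(f, dtype=float)
    df = f - f.mean()                      # fluctuation about the mean
    n = len(f)
    return np.array([np.mean(df[:n - k] * df[k:])
                     for k in range(max_lag + 1)]) / f.mean()**2
```

For a trace with an exponentially decaying correlation (the signature of reversible binding with a single characteristic time), G decays monotonically from its zero-lag amplitude, and the decay time estimates the surface residence/exchange kinetics.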

  15. CONFIT: a computer code for thermal conductivity probe data reduction with the use of parameter estimation techniques

    SciTech Connect

    Koski, J A

    1982-05-01

    The basis and operation of the computer code CONFIT are described, and a sample case provided. The code uses parameter estimation techniques to obtain thermal conductivity and other parameters of interest from temperature versus time data acquired with the use of line-source type thermal conductivity probes. The basic estimation approach consists of fitting (in the least-squares sense) analytical problem solutions to the experimental data. Problem parameters (e.g., thermal conductivity) are used as curve fit variables, and are thus determined when the least-squares fit is achieved. Some advantages of the method include the following: requirements for development of the straight line region of the log-time versus probe temperature curve are minimized. (This permits shorter runs with low conductivity materials and more rapid return to equilibrium after the run is completed when compared to standard data reduction techniques); deviations between the experimental data and the analytical model are easily observed and analyzed. (Statistical tests on the residuals, the differences between the experimental data and the analytical solution, can be used to confirm the validity of the results); and contact resistance between the probe and the test material can be estimated simultaneously with the conductivity, simplifying data reduction. The code is written in Fortran IV (based on ANSI 1966 Fortran) and has been implemented on a Control Data Corporation 6600 computer and on a Hewlett-Packard 1000 minicomputer system in an interactive mode. With minor modifications, the program can be used with more recent Fortran compilers, e.g., Fortran V, based on ANSI 1977 Fortran.
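The line-source model behind the probe method gives, at late times, T(t) = (q / 4πk)·ln(t) + C, so conductivity follows from the slope of T versus ln(t). CONFIT fits the full analytical solution with contact resistance; the minimal version below uses only the asymptotic straight-line region that the code is designed to avoid depending on, purely to show the relationship.

```python
import numpy as np

def k_from_slope(t, T, q):
    """Thermal conductivity from the late-time slope of T vs ln(t).

    q is the heater power per unit probe length; slope = q / (4*pi*k).
    """
    slope, _ = np.polyfit(np.log(t), T, 1)
    return q / (4.0 * np.pi * slope)
```

The parameter-estimation approach in CONFIT generalizes this one-parameter fit: k (and, e.g., contact resistance) become curve-fit variables determined by least-squares against the full solution, so early-time data need not be discarded.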

  16. Computer-generated speech

    SciTech Connect

    Aimthikul, Y.

    1981-12-01

    This thesis reviews the essential aspects of speech synthesis and distinguishes between the two prevailing techniques: compressed digital speech and phonemic synthesis. It then presents the hardware details of the five speech modules evaluated. FORTRAN programs were written to facilitate message creation and retrieval with four of the modules driven by a PDP-11 minicomputer. The fifth module was driven directly by a computer terminal. The compressed digital speech modules (T.I. 990/306, T.S.I. Series 3D and N.S. Digitalker) each contain a limited vocabulary produced by the manufacturers while both the phonemic synthesizers made by Votrax permit an almost unlimited set of sounds and words. A text-to-phoneme rules program was adapted for the PDP-11 (running under the RSX-11M operating system) to drive the Votrax Speech Pac module. However, the Votrax Type'N Talk unit has its own built-in translator. Comparison of these modules revealed that the compressed digital speech modules were superior in pronouncing words on an individual basis but lacked the inflection capability that permitted the phonemic synthesizers to generate more coherent phrases. These findings were necessarily highly subjective and dependent on the specific words and phrases studied. In addition, the rapid introduction of new modules by manufacturers will necessitate new comparisons. However, the results of this research verified that all of the modules studied do possess reasonable quality of speech that is suitable for man-machine applications. Furthermore, the development tools are now in place to permit the addition of computer speech output in such applications.

  17. [The effect of the motion of forefoot joint at the force exerted upon the floor during walking exercise].

    PubMed

    Maeda, A; Nishizono, H; Ebashi, H; Shibayama, H

    1993-11-01

    In walking exercise the human body is exposed to external forces. Some of them are produced by constraints such as the surface, shoes, or an opponent. In the kick action of walking, the ground reaction force (GRF) is the most important external force. The magnitude of the GRF, its direction, and its point of application have an influence on the load on the human body. The purpose of this study is to clarify the role of the forefoot joint (artt. metatarsophalangeae) in the force exerted upon the floor during the kick action of walking. The device used in this study to analyze the GRF and its three components consists of Kistler's force platform. Output from the force transducer was collected online with a TEAC data recorder and an MEM-4101 minicomputer. The impact force measurements were taken from the anterior-posterior force-time curves at take-off for 1 subject walking 10 trials at 2 m/sec with 2 different pairs of shoes (Shoes 1: thin sole of 4 mm; Shoes 2: thick sole of 40 mm) and without shoes. High-speed (200 f/sec) cinematography was also used to analyze the angular displacement of the forefoot joint at take-off during walking exercise. The force acting at the forefoot joint may produce the anterior-posterior component of the GRF, which is defined as the propelling power acting on the human body during walking exercise. The result showed that the impact force peak occurred 40-60 msec before take-off and that the propelling part of the kick action accounted for only about 6% of the external force.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8123187

  18. Recent advances in EEG data processing.

    PubMed

    Zetterberg, L H

    1978-01-01

    It is argued that the most interesting advances in EEG signal processing are with methods based on descriptive mathematical models of the process. Formulation of auto-regressive (AR) and mixed autoregressive and moving average (ARMA) models is reviewed for the scalar and the multidimensional cases and extensions to allow time-varying coefficients are pointed out. Data processing with parametric models, DPPM, involves parameter estimation and a large number of algorithms are available. Emphasis is put on those that are simple to apply and require a modest amount of computation. A recursive algorithm by Levinson, Robinson and Durbin is well suited for estimation of the coefficients in the AR model and for tests of model order. It is applicable to both the scalar and multidimensional cases. The ARMA model can be handled by approximation of an AR model or by nonlinear optimization. Recursive estimation with AR and ARMA models is reviewed and the connection with the Kalman filter pointed out. In this way processes with time-varying properties may be handled and a stationarity index is defined. The recursive algorithms can deal with AR or ARMA models in the same way. A reformulation of the algorithm to include sparsely updated parameter estimates significantly speeds up the calculations. It will allow several EEG channels to be handled simultaneously in real time on a modern minicomputer installation. DPPM has been particularly successful in the areas of spectral analysis and detection of short transients such as spikes and sharp waves. Recently some interesting attempts have been made to apply classification algorithms to estimated parameters. A brief review is made of the main results in these areas.
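The Levinson (Robinson/Durbin) recursion mentioned above solves the Yule-Walker equations for the AR coefficients from the autocorrelation sequence r[0..p] in O(p²) operations, which is what made real-time multichannel EEG fitting feasible on a minicomputer. A scalar-case sketch (convention: x[t] + a[1]x[t-1] + ... + a[p]x[t-p] = e[t]):

```python
import numpy as np

def levinson_durbin(r, order):
    """AR coefficients a[0..order] (a[0]=1) and prediction error from autocorrelation r."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / err                       # reflection coefficient
        a[1:i] = a[1:i] + k * a[i-1:0:-1]    # update lower-order coefficients
        a[i] = k
        err *= (1.0 - k * k)                 # prediction-error variance shrinks
    return a, err
```

The sequence of prediction-error variances also supports the model-order tests the review mentions: once err stops dropping appreciably, increasing the order buys nothing.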

  19. Side-scan sonar mapping: Pseudo-real-time processing and mosaicking techniques

    SciTech Connect

    Danforth, W.W.; Schwab, W.C.; O'Brien, T.F.; Karl, H.

    1990-05-01

    The US Geological Survey (USGS) surveyed 1,000 km² of the continental shelf off San Francisco during a 17-day cruise, using a 120-kHz side-scan sonar system, and produced a digitally processed sonar mosaic of the survey area. The data were processed and mosaicked in real time using software developed at the Lamont-Doherty Geological Observatory and modified by the USGS, a substantial task due to the enormous amount of data produced by high-resolution side-scan systems. Approximately 33 megabytes of data were acquired every 1.5 hr. The real-time sonar images were displayed on a PC-based workstation and the data were transferred to a UNIX minicomputer where the sonar images were slant-range corrected, enhanced using an averaging method of desampling and a linear-contrast stretch, merged with navigation, geographically oriented at a user-selected scale, and finally output to a thermal printer. The hard-copy output was then used to construct a mosaic of the survey area. The final product of this technique is a UTM-projected map-mosaic of sea-floor backscatter variations, which could be used, for example, to locate appropriate sites for sediment sampling to ground truth the sonar imagery while still at sea. More importantly, reconnaissance surveys of this type allow for the analysis and interpretation of the mosaic during a cruise, thus greatly reducing the preparation time needed for planning follow-up studies of a particular area.
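The linear-contrast stretch named in the processing chain maps the observed backscatter range onto the full 8-bit display range. The stretch limits below default to the image min/max; the survey's actual clip limits are not given, so this is only a sketch of the operation:

```python
import numpy as np

def contrast_stretch(img, lo=None, hi=None):
    """Linearly map [lo, hi] onto [0, 255] and clip, returning 8-bit pixels."""
    img = np.asarray(img, dtype=float)
    lo = img.min() if lo is None else lo
    hi = img.max() if hi is None else hi
    out = (img - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```

Choosing lo/hi from low and high percentiles rather than the absolute extremes is the usual refinement, since a few saturated pixels would otherwise compress the useful backscatter contrast.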

  20. Galatea - An Interactive Computer Graphics System For Movie And Video Analysis

    NASA Astrophysics Data System (ADS)

    Potel, Michael J.; MacKay, Steven A.; Sayre, Richard E.

    1983-03-01

    Extracting quantitative information from movie film and video recordings has always been a difficult process. The Galatea motion analysis system represents an application of some powerful interactive computer graphics capabilities to this problem. A minicomputer is interfaced to a stop-motion projector, a data tablet, and real-time display equipment. An analyst views a film and uses the data tablet to track a moving position of interest. Simultaneously, a moving point is displayed in an animated computer graphics image that is synchronized with the film as it runs. Using a projection CRT and a series of mirrors, this image is superimposed on the film image on a large front screen. Thus, the graphics point lies on top of the point of interest in the film and moves with it at cine rates. All previously entered points can be displayed simultaneously in this way, which is extremely useful in checking the accuracy of the entries and in avoiding omission and duplication of points. Furthermore, the moving points can be connected into moving stick figures, so that such representations can be transcribed directly from film. There are many other tools in the system for entering outlines, measuring time intervals, and the like. The system is equivalent to "dynamic tracing paper" because it is used as though it were tracing paper that can keep up with running movie film. We have applied this system to a variety of problems in cell biology, cardiology, biomechanics, and anatomy. We have also extended the system using photogrammetric techniques to support entry of three-dimensional moving points from two (or more) films taken simultaneously from different perspective views. We are also presently constructing a second, lower-cost, microcomputer-based system for motion analysis in video, using digital graphics and video mixing to achieve the graphics overlay for any composite video source image.

  1. Two dimensional NMR of liquids and oriented molecules

    SciTech Connect

    Gochin, M.

    1987-02-01

    Chapter 1 discusses the quantum mechanical formalism used for describing the interaction between magnetic dipoles that dictates the appearance of a spectrum. The NMR characteristics of liquids and liquid crystals are stressed. Chapter 2 reviews the theory of multiple quantum and two dimensional NMR. Properties of typical spectra and phase cycling procedures are discussed. Chapter 3 describes a specific application of heteronuclear double quantum coherence to the removal of inhomogeneous broadening in liquids. Pulse sequences have been devised which cancel out any contribution from this inhomogeneity to the final spectrum. An interpretation of various pulse sequences for the case of ¹³C and ¹H is given, together with methods of spectral editing by removal or retention of the homo- or heteronuclear J coupling. The technique is applied to a demonstration of high resolution in both frequency and spatial dimensions with a surface coil. In Chapter 4, multiple quantum filtered 2-D spectroscopy is demonstrated as an effective means of studying randomly deuterated molecules dissolved in a nematic liquid crystal. Magnitudes of dipole coupling constants have been determined for benzene and hexane, and their signs and assignments found from high order multiple quantum spectra. For the first time, a realistic impression of the conformation of hexane can be estimated from these results. Chapter 5 is a technical description of the MDB DCHIB-DR11W parallel interface which has been set up to transfer data between the Data General Nova 820 minicomputer, interfaced to the 360 MHz spectrometer, and the Vax 11/730. It covers operation of the boards, physical specifications and installation, and programs for testing and running the interface.

  2. Rapid calculation of functional maps of glucose metabolic rate and individual model rate parameters from serial 2-FDG images

    SciTech Connect

    Koeppe, R.A.; Holden, J.E.; Hutchins, G.D.

    1985-05-01

    The authors have developed a method for the rapid pixel-by-pixel estimation of glucose metabolic rate from a dynamic sequence of PCT images acquired over 40 minutes following venous bolus injection of 2-deoxy-2-fluoro-D-glucose (2-FDG). The calculations are based on the conventional four parameter model. The dephosphorylation rate (k₄) cannot be reliably estimated from only 40 minutes of data; however, neglecting dephosphorylation can nonetheless introduce significant biases into the parameter estimation processes. In the authors' method, the rate is constrained to fall within a small range about a presumed value. Computer simulation studies show that this constraint greatly reduces the systematic biases in the other three fitted parameters and in the metabolic rate that arise from the assumption of no dephosphorylation. The parameter estimation scheme used is formally identical to one originally developed for dynamic methods of cerebral blood flow estimation. Estimation of metabolic rate and the individual model rate parameters k₁, k₂, and k₃ can be carried out for each pixel sequence of a 100 x 100 pixel image in less than two minutes on our PDP 11/60 minicomputer with floating point processor. While the maps of k₂ and k₃ are quite noisy, accurate estimates of average values can be attained for regions of a few cm². The maps of metabolic rate offer many advantages in addition to that of direct visualization. These include improved statistical precision and the avoidance of averaging failure in the fitting of heterogeneous regions.
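
    The compartment model behind this estimation can be sketched in a few lines. The sketch below (illustrative rate constants and plasma input, not the authors' data) integrates the two-compartment 2-FDG model by Euler stepping and computes the net uptake constant Ki = K1·k3/(k2 + k3) on which the metabolic-rate map is based; k₄ is simply held at a small presumed value, as the constraint in the abstract suggests.

```python
import math

def simulate_tissue(cp, dt, k1, k2, k3, k4):
    """Euler integration of the 2-FDG model:
    dCe/dt = k1*Cp - (k2 + k3)*Ce,  dCm/dt = k3*Ce - k4*Cm.
    Returns the total tissue activity Ce + Cm at each time step."""
    ce = cm = 0.0
    tissue = []
    for c in cp:
        ce += dt * (k1 * c - (k2 + k3) * ce)
        cm += dt * (k3 * ce - k4 * cm)
        tissue.append(ce + cm)
    return tissue

def net_uptake(k1, k2, k3):
    # Net uptake rate constant Ki, proportional to the metabolic rate
    return k1 * k3 / (k2 + k3)

# Hypothetical plasma input: a bolus decaying exponentially over 40 minutes.
dt = 0.1  # minutes
cp = [math.exp(-0.3 * i * dt) for i in range(400)]

curve = simulate_tissue(cp, dt, k1=0.1, k2=0.15, k3=0.07, k4=0.0068)
ki = net_uptake(0.1, 0.15, 0.07)
```

    In the pixel-by-pixel method, a fit like this runs once per pixel sequence; holding k₄ near a presumed value removes one free parameter from each fit.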

  3. Cyclic axial-torsional deformation behavior of a cobalt-base superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1992-01-01

    Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic, high-temperature deformation behavior of a wrought cobalt-base superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress
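
    For combined axial-torsional loading, the von Mises equivalence mentioned above reduces to simple closed forms: equivalent stress = sqrt(sigma² + 3·tau²) and, with engineering shear strain gamma, equivalent strain = sqrt(eps² + gamma²/3). The sketch below (strain amplitudes are illustrative, not test data) also shows why lambda = sqrt(3) ≈ 1.73 is a natural choice: it weights the axial and shear contributions to the equivalent strain equally.

```python
import math

def von_mises_stress(sigma, tau):
    # Equivalent (von Mises) stress for combined axial-torsional loading
    return math.sqrt(sigma**2 + 3.0 * tau**2)

def von_mises_strain(eps, gamma):
    # Equivalent strain; gamma is the engineering shear strain
    return math.sqrt(eps**2 + gamma**2 / 3.0)

# Equivalent strain amplitudes for the three in-phase lambda ratios
# used in the test program (axial strain amplitude is illustrative).
eps_a = 0.004
eq_strains = {lam: von_mises_strain(eps_a, lam * eps_a)
              for lam in (0.86, 1.73, 3.46)}
```

    For lambda = sqrt(3), gamma²/3 = eps², so the axial and torsional terms contribute identically to the equivalent strain amplitude.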

  4. Two-Dimensional Analysis of Narrow Gate Effects in Micron and Submicron Mosfets

    NASA Astrophysics Data System (ADS)

    Chung, Shao-Shiun

    Variations of the device characteristics due to the geometry effects in narrow gate MOSFETs, such as threshold voltage shift and subthreshold characteristics, are important factors in designing next generation MOS-VLSI circuits. It is well known that numerical methods, using the exact 2-D solutions of the transport equation and Poisson's equation for studying the geometry effect of small MOSFETs, are more accurate than simple charge-control analysis. The 2-D numerical model of Ji and Sah demonstrated important design features of the threshold voltage of narrow gate MOSFETs. However, studies of MOSFET characteristics using 2-D numerical analysis, which take into account the effects of all the device parameters, such as gate oxide thickness, backgate bias, and substrate doping, are lacking. In particular, the analysis of the subthreshold characteristics of narrow gate MOSFETs had not been reported previously. The ideas in Ji-Sah's depletion approximation model, as well as their analysis method, have been extended to take into account the electrons and holes in the numerical solution of Poisson's equation. Using a super-minicomputer (VAX-11/750), a new 2-D program (NAROMOS-II) using the finite difference method has been developed in this thesis. Based on the 2-D results and device physics, a threshold voltage model and a subthreshold characteristics model for CAD of MOS-VLSI are proposed to describe the geometry effect of narrow gate MOSFETs. These models are based on the extraction of four model parameters: two for the threshold voltage model, and two for the subthreshold characteristics model. All of these model parameters can be verified numerically or experimentally. Results for the threshold voltage model compare favorably with numerical and reported experimental data. Dependences of the device performance on the device parameters are then investigated, using the above analysis techniques. Simple forms of the models of the threshold voltage shift and subthreshold
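
    The finite-difference machinery behind a program like NAROMOS-II can be illustrated with the simplest possible case: Jacobi relaxation of Laplace's equation on a small 2-D grid with fixed (Dirichlet) boundary potentials. This is only a toy analog, and the grid size and boundary values are arbitrary; the real solver handles Poisson's equation with mobile carrier terms on far larger meshes.

```python
def jacobi_laplace(n, top, iters):
    """Relax an n x n grid toward the solution of Laplace's equation.
    The upper boundary is held at potential 'top'; all other
    boundaries are held at 0. Each sweep replaces every interior
    node with the average of its four neighbors."""
    u = [[0.0] * n for _ in range(n)]
    for j in range(n):
        u[0][j] = top
    for _ in range(iters):
        new = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new[i][j] = 0.25 * (u[i-1][j] + u[i+1][j]
                                    + u[i][j-1] + u[i][j+1])
        u = new
    return u

u = jacobi_laplace(20, top=1.0, iters=500)
```

    The potential falls off smoothly from the driven boundary into the grid interior, the same qualitative behavior a 2-D device simulator resolves under the gate edge.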

  5. LOOK- A TEXT FILE DISPLAY PROGRAM

    NASA Technical Reports Server (NTRS)

    Vavrus, J. L.

    1994-01-01

    The LOOK program was developed to permit a user to examine a text file in a pseudo-random access manner. Many engineering and scientific programs generate large amounts of printed output. Often this output needs to be examined in only a few places. On mini-computers (like the DEC VAX) high-speed printers are usually at a premium. One alternative is to save the output in a text file and examine it with a text editor. The slowness of a text editor, the possibility of inadvertently changing the output, and other factors make this an unsatisfactory solution. The LOOK program provides the user with a means of rapidly examining the contents of an ASCII text file. LOOK's basis of operation is to open the text file for input only and then access it in a block-wise fashion. LOOK handles the text formatting and displays the text lines on the screen. The user can move forward or backward in the file by a given number of lines or blocks. LOOK also provides the ability to "scroll" the text at various speeds in the forward or backward directions. The user can perform a search for a string (or a combination of up to 10 strings) in a forward or backward direction. Also, user selected portions of text may be extracted and submitted to print or placed in a file. Additional features available to the LOOK user include: cancellation of an operation with a keystroke, user definable keys, switching mode of operation (e.g. 80/132 column), on-line help facility, trapping broadcast messages, and the ability to spawn a sub-process to carry out DCL functions without leaving LOOK. The LOOK program is written in FORTRAN 77 and MACRO ASSEMBLER for interactive execution and has been implemented on a DEC VAX computer using VAX/VMS with a central memory requirement of approximately 430K of 8-bit bytes. LOOK was developed in 1983. LOOK operation is terminal independent but will take advantage of the features of the DEC VT100 terminal if available.
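
    LOOK's strategy of opening the file for input only and accessing it block-wise is easy to sketch: seek to a byte offset, read one fixed-size block, and split it into lines, so the file is never loaded whole or modified. The block size and file contents below are illustrative (VMS disk blocks were 512 bytes).

```python
import os
import tempfile

BLOCK = 512  # bytes per block (illustrative; matches a VMS disk block)

def read_block(path, block_no):
    """Read one block of a text file by seeking, without loading the
    whole file; the file is opened read-only, so it cannot be changed."""
    with open(path, "rb") as f:
        f.seek(block_no * BLOCK)
        return f.read(BLOCK).decode("ascii", errors="replace").splitlines()

# Usage: write a throwaway output file, then fetch its second block.
path = os.path.join(tempfile.gettempdir(), "look_demo.txt")
with open(path, "w") as f:
    for i in range(100):
        f.write(f"line {i:04d} of the program output\n")  # 32 bytes per line
lines = read_block(path, 1)
```

    Because each line here is exactly 32 bytes, block 1 begins precisely at line 16; in general a block boundary can split a line, which is the kind of text reassembly LOOK handles for the user.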

  6. Cyclic Axial-Torsional Deformation Behavior of a Cobalt-Base Superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1995-01-01

    The cyclic, high-temperature deformation behavior of a wrought cobalt-base superalloy, Haynes 188, is investigated under combined axial and torsional loads. This is accomplished through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue database has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gage section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. The fatigue behavior of Haynes 188 at 760 C under axial, torsional, and combined axial-torsional loads and the monotonic and cyclic deformation behaviors under axial and torsional loads have been previously reported. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant lambda (the ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 degrees with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase (lambda = 1.73 and phi = 0 deg) and out-of-phase (lambda = 1.73 and phi = 90 deg) axial-torsional fatigue tests.
These comparisons

  7. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

    future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. 
Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight

  8. Computer networks: making the decision to join one.

    PubMed

    Massy, W F

    1974-11-01

    I began this article with the thesis that the director of a university computer center is in a double bind. He is under increasing pressure because of competition with networks and minicomputers at the same time that his funding base is weakening. The breadth of demand for computer services, and the cost of developing new services, are increasing dramatically. The director is pressed by budget officers and internal economics to run more efficiently, but if in so doing he fails to meet new needs or downgrades effectiveness for some existing users he runs the risks of losing demand to the competition and hence worsening his immediate financial problems. The impact of networks on this state of affairs might be, briefly, as follows: 1) The centrally planned computer utility would take these pressures off the individual campus computer center and lodge them in a state, regional, or perhaps even a national network organization. While this might be desirable in some cases (depending on the scale of operations), I believe that economies of scale would tend to be more than offset by diseconomies in planning, management, and control; by a reduction of responsiveness to users' needs; and by a slowing of the rate of innovation in computing. 2) The distributive network substitutes a "market economy" for a centrally planned one. Subject to a certain amount of planning and regulation, which might be undertaken by colleges and universities themselves, individual researchers can tap larger markets for services, and participating institutions can obtain at least part of their computing needs on a variable cost basis at prices determined by competition. 
3) Membership in a distributive network with sufficient breadth and depth of resources can emancipate the director of the computer center by widening options and allowing him to serve more effectively the steadily broadening range of legitimate academic and research computing needs without his having to stretch his internal resources

  9. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  10. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  11. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh
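
    The forward-chaining control cycle that CLIPS implements (with Rete providing efficient pattern matching) can be illustrated with a toy rule engine: rules fire when all of their condition facts are present, asserting new facts until a fixed point is reached. This sketch uses plain set lookups rather than a Rete network, and the facts and rules are invented for illustration; CLIPS's own syntax and matching are far richer.

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly fire any rule whose
    conditions are all satisfied, until no new fact is asserted."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if set(conditions) <= facts and conclusion not in facts:
                facts.add(conclusion)  # assert the rule's conclusion
                changed = True
    return facts

# Hypothetical rule base: (conditions, conclusion) pairs.
rules = [
    (("duck", "quacks"), "is-noisy"),
    (("is-noisy", "small"), "is-annoying"),
]
result = forward_chain(["duck", "quacks", "small"], rules)
```

    The second rule fires only after the first has asserted "is-noisy", which is the chaining behavior an inference engine automates; Rete avoids re-testing every rule against every fact on each cycle.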

  12. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.

    1993-01-01

    engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a

  13. NASA/FLAGRO - FATIGUE CRACK GROWTH COMPUTER PROGRAM

    NASA Technical Reports Server (NTRS)

    Forman, R. G.

    1994-01-01

    -intensity factor numerical values can be computed for making comparisons or checks of solutions. NASA/FLAGRO can check for failure of a part-through crack in the mode of a through crack when net ligament yielding occurs. NASA/FLAGRO has a number of special subroutines and files which provide enhanced capabilities and easy entry of data. These include crack case solutions, cyclic load spectrums, nondestructive examination initial flaw sizes, table interpolation, and material properties. The materials properties files are divided into two types, a user defined file and a fixed file. Data is entered and stored in the user defined file during program execution, while the fixed file contains already coded-in property value data for many different materials. Prompted input from CRT terminals consists of initial crack definition (which can be defined automatically), rate solution type, flaw type and geometry, material properties (if they are not in the built-in tables of material data), load spectrum data (if not included in the loads spectrum file), and design limit stress levels. NASA/FLAGRO output includes an echo of the input with any error or warning messages, the final crack size, whether or not critical crack size has been reached for the specified stress level, and a life history profile of the crack propagation. NASA/FLAGRO is modularly designed to facilitate revisions and operation on minicomputers. The program was implemented on a DEC VAX 11/780 with the VMS operating system. NASA/FLAGRO is written in FORTRAN77 and has a memory requirement of 1.4 MB. The program was developed in 1986.
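
    The core computation of a fatigue crack growth program is cycle-by-cycle integration of a crack growth rate law. The sketch below uses the simple Paris law, da/dN = C·(ΔK)^m with ΔK = Y·ΔS·sqrt(π·a), as a generic stand-in; NASA/FLAGRO itself uses more elaborate (Forman-type) rate equations with its materials property files, and the constants C, m, Y, stresses, and crack sizes here are purely illustrative, not material data.

```python
import math

def cycles_to_size(a0, a_crit, ds, C=1e-10, m=3.0, Y=1.12):
    """Integrate Paris-law crack growth cycle by cycle.
    a0, a_crit: initial and critical crack sizes (m);
    ds: applied stress range; C, m: illustrative Paris constants;
    Y: geometry factor. Returns (cycles, final crack size)."""
    a, n = a0, 0
    while a < a_crit:
        dk = Y * ds * math.sqrt(math.pi * a)  # stress-intensity range
        a += C * dk**m                        # growth this cycle
        n += 1
        if n > 10_000_000:                    # safety cap for the sketch
            break
    return n, a

n, a_final = cycles_to_size(a0=0.001, a_crit=0.01, ds=100.0)
```

    Because ΔK grows with sqrt(a), the growth rate accelerates as the crack lengthens, which is why most of the life is spent while the crack is still small; a real program adds load spectra, retardation, and a critical-size failure check.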

  14. Image analysis techniques. The problem of the quantitative evaluation of the chromatin ultrastructure.

    PubMed

    Maraldi, N M; Marinelli, F; Squarzoni, S; Santi, S; Barbieri, M

    1991-02-01

    The application of image analysis methods to conventional thin sections for electron microscopy to analyze the chromatin arrangement is quite limited. We developed a method which utilizes freeze-fractured samples; the results indicate that the method is suitable for identifying the changes in the chromatin arrangement which occur in physiological, experimental and pathological conditions. The modern era of image analysis began in 1964, when pictures of the moon transmitted by Ranger 7 were processed by a computer. This processing improved the original picture by enhancing and restoring the image affected by various types of distortion. These capabilities were made possible by third-generation computers having the speed and the storage required for practical use of image processing algorithms. Each image can be converted into a two-dimensional light intensity function: f (x, y), where x and y are the spatial coordinates and the f value is proportional to the gray level of the image at that point. The digital image is therefore a matrix whose elements are the pixels (picture elements). A typical digital image can be obtained with a quality comparable to monochrome TV, with a 512×512 pixel array with 64 gray levels. The magnetic disks of commercial minicomputers are thus capable of storing some tens of images, which can be processed by the image processor, converting the signal into digital form. In biological images, obtained by light microscopy, the digitization converts the chromatic differences into gray level intensities, making it possible to define the contours of the cytoplasm, of the nucleus and of the nucleoli. The use of a quantitative staining method for the DNA, the Feulgen reaction, permits evaluation of the ratio between condensed chromatin (stained) and euchromatin (unstained). The digitized images obtained by transmission electron microscopy are rich in details at high resolution. However, the application of image analysis techniques to
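
    The f(x, y) formalism above translates directly into code: a digital image is a matrix of pixel gray levels (0-63 here, matching the 64-level example), and a simple threshold can separate dark, Feulgen-stained condensed chromatin from lighter euchromatin. The threshold and pixel values below are invented for illustration, not measured data.

```python
def condensed_fraction(image, threshold):
    """Fraction of pixels darker than the threshold, i.e. the
    condensed-chromatin share of the nuclear area."""
    dark = total = 0
    for row in image:
        for g in row:          # g = f(x, y), the gray level at that pixel
            total += 1
            if g < threshold:  # dark pixels -> stained condensed chromatin
                dark += 1
    return dark / total

# Hypothetical 4x4 patch of a digitized nucleus (gray levels 0-63).
nucleus = [
    [50, 12, 55, 60],
    [10,  8, 58, 61],
    [52, 57,  9, 54],
    [49, 53, 56, 11],
]
ratio = condensed_fraction(nucleus, threshold=32)
```

    On a real 512×512 image the same loop runs over the full matrix, and the threshold would be chosen from the gray-level histogram rather than fixed by hand.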

  15. User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes

    USGS Publications Warehouse

    Klein, Fred W.

    2002-01-01

    Hypoinverse is a computer program that processes files of seismic station data for an earthquake (like p wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and it supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of
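
    The essence of what Hypoinverse computes, an event location that best fits the observed arrival times, can be shown with a deliberately simplified toy: a flat earth, a uniform P velocity, and a coarse grid search over trial epicenters, with the origin time solved as the mean residual at each trial point. The station geometry, velocity, and event below are synthetic; the real program uses layered crustal models and iterative least squares in three dimensions.

```python
import math

V = 6.0  # km/s, assumed uniform P velocity

def predict(t0, ex, ey, sx, sy):
    """Predicted P arrival time at station (sx, sy) for an event
    at epicenter (ex, ey) with origin time t0."""
    return t0 + math.hypot(ex - sx, ey - sy) / V

def locate(stations, arrivals):
    """Grid search for the epicenter minimizing RMS arrival-time misfit."""
    best = None
    for ex in range(0, 101, 2):
        for ey in range(0, 101, 2):
            # Best origin time for this trial point is the mean residual.
            t0 = sum(t - math.hypot(ex - sx, ey - sy) / V
                     for (sx, sy), t in zip(stations, arrivals)) / len(arrivals)
            rms = math.sqrt(sum(
                (t - predict(t0, ex, ey, sx, sy)) ** 2
                for (sx, sy), t in zip(stations, arrivals)) / len(arrivals))
            if best is None or rms < best[0]:
                best = (rms, ex, ey, t0)
    return best

# Synthetic event at (40, 60) km with origin time 5 s; five stations.
stations = [(0, 0), (100, 0), (0, 100), (100, 100), (50, 10)]
arrivals = [predict(5.0, 40, 60, sx, sy) for sx, sy in stations]
rms, ex, ey, t0 = locate(stations, arrivals)
```

    With noise-free synthetic arrivals the search recovers the true epicenter exactly; adding timing errors and a layered velocity model is where programs like Hypoinverse earn their keep.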

  17. WE-G-16A-01: Evolution of Radiation Treatment Planning

    SciTech Connect

    Rothenberg, L; Mohan, R; Van Dyk, J; Fraass, B; Bortfeld, T

    2014-06-15

    delineation, assignment of dose requirements, consideration of uncertainties, selection of beam configurations and shaping of beams, and calculation, optimization and evaluation of dose distributions. This will be followed by three presentations covering the evolution of treatment planning, which parallels the evolution of computers, the availability of advanced volumetric imaging, and the development of novel technologies such as dynamic multi-leaf collimators and online image guidance. This evolution will be divided into three distinct periods: prior to the 1970s, the 2D era; from 1980 to the mid-1990s, the 3D era; and from the mid-1990s to today, the IMRT era. “When the World was Flat: The Two-Dimensional Radiation Therapy Era” - Jacob Van Dyk In the 2D era, anatomy was defined with the aid of solder wires, special contouring devices and projection x-rays. Dose distributions were calculated manually from single-field, flat-surface isodoses on transparencies. Precalculated atlases of generic dose distributions were produced by the International Atomic Energy Agency. Massive time-shared mainframes and mini-computers were used to compute doses at individual points or dose distributions in a single plane. Beam shapes were generally rectangular, with wedges, missing-tissue compensators and occasional blocks to shield critical structures. Dose calculations were measurement-based or used primary and scatter calculations based on scatter-air ratio methodologies. Dose distributions were displayed on line printers as alphanumeric character maps or as isodose patterns made with pen plotters. “More than Pretty Pictures: 3D Treatment Planning and Conformal Therapy” - Benedick A. Fraass The introduction of computed tomography allowed the delineation of anatomy three-dimensionally and, supported partly by contracts from the National Cancer Institute, made possible the introduction and clinical use of 3D treatment planning, leading to development and use of 3D conformal therapy in the 1980

  18. The ASC Sequoia Programming Model

    SciTech Connect

    Seager, M

    2008-08-06

    In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operations to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era because it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600), with vector unit utilization increasing incrementally. The other attributes of the Cray vector era at LLNL were that we developed, supported and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., the batch system, job control scripting, loaders, debuggers, editors, and graphics utilities) and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and their attendant departmental and mini-computers, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. 
The other interesting advance in this period was that systems were being developed with multiple

  19. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

    future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. 
Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight