Science.gov

Sample records for minicomputers

  1. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases; interchangeability of minicomputers, software, storage, and peripherals; and uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  2. Networking: A Solution to the Campus Minicomputer Problem.

    ERIC Educational Resources Information Center

    Fritz, Joseph

    Minicomputer networking can be an alternative solution to the problem of implementing various computer systems in universities. In its simplest case, networking takes the form of multiple small computers communicating over telephone lines to a larger host minicomputer which in turn communicates with the central mainframe. Using computers in this…

  3. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capabilities, CAD/CAM provides options to produce an automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  4. The Modern Mini-Computer in Laboratory Automation

    ERIC Educational Resources Information Center

    Castellan, N. John, Jr.

    1975-01-01

    A report of the growth and present status of the mini-computer based, time sharing laboratory at Indiana University, which describes the system hardware, software, and applications in psychological experimentation. (EH)

  5. Recent Trends in Minicomputer-Based Integrated Learning Systems for Reading and Language Arts Instruction.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    This paper discusses minicomputer-based ILSs (integrated learning systems), i.e., computer-based systems of hardware and software. An example of a minicomputer-based system in a school district (a composite of several actual districts) considers hardware, staffing, scheduling, reactions, problems, and training for a subskill-oriented reading…

  6. Distributing structural optimization software between a mainframe and a minicomputer

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Dovi, A. R.; Riley, K. M.

    1981-01-01

    This paper describes a distributed software system for solving large-scale structural optimization problems. Distributing the software between a mainframe computer and a minicomputer takes advantage of some of the best features available on each computer. The described software system consists of a finite element structural analysis computer program, a general purpose optimizer program, and several small user-supplied problem-dependent programs. Comparison with a similar system executing entirely on the mainframe computer reveals that the distributed system costs less, uses computer resources more efficiently, and improves productivity through faster turnaround and improved user control. The system interfaces with interactive graphics software for generating models and displaying intermediate and final results.

  7. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    SciTech Connect

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

    Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  8. Smithsonian Astrophysical Observatory's minicomputer vs. the laser [computer predictions for laser tracking stations]

    NASA Technical Reports Server (NTRS)

    Cherniack, J. R.

    1973-01-01

    Review of some of the problems encountered in replacing a CDC 6400, which was used to supply a network of laser tracking stations with predictions, with an 8K Data General 1200 minicomputer with a teletype for I/O. Before the replacement, the predictions were expensive to compute and transmit, and were logistically clumsy. The improvements achieved are described, along with the steps taken to accomplish them and the costs incurred.

  9. A brief description of the Medical Information Computer System (MEDICS) [real time minicomputer system]

    NASA Technical Reports Server (NTRS)

    Moseley, E. C.

    1974-01-01

    The Medical Information Computer System (MEDICS) is a time shared, disk oriented minicomputer system capable of meeting storage and retrieval needs for the space- or non-space-related applications of at least 16 simultaneous users. At the various commercially available low cost terminals, the simple command and control mechanism and the generalized communication activity of the system permit multiple form inputs, real-time updating, and instantaneous retrieval capability with a full range of options.

  10. Using joined minicomputer-microcomputer systems for intricate sample and data manipulations

    SciTech Connect

    Meng, J.D.

    1980-09-01

    We have produced, over the past three years, three automated x-ray fluorescence based elemental analysis systems that combine a minicomputer and a microcomputer to perform intricate sample and data manipulations. The mini-micro combination facilitates the reuse of sizable sections of hardware and programs for different x-ray analysis projects. Each of our systems has been a step closer to an optimum general solution. The combination reaps economic benefits throughout development, fabrication, and maintenance, an important consideration for designers of custom-built, one-of-a-kind data analysis systems such as these.

  11. Prickett and Lonnquist aquifer simulation program for the Apple II minicomputer

    SciTech Connect

    Hull, L.C.

    1983-02-01

    The Prickett and Lonnquist two-dimensional groundwater model has been programmed for the Apple II minicomputer. Both leaky and nonleaky confined aquifers can be simulated. The model was adapted from the FORTRAN version of Prickett and Lonnquist. In the configuration presented here, the program requires 64 K bytes of memory. Because of the large number of arrays used in the program, and the memory limitations of the Apple II, the maximum grid size that can be used is 20 rows by 20 columns. Input to the program is interactive, with prompting by the computer. Output consists of predicted head values at the row-column intersections (nodes).
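
    The finite-difference update at the heart of such a groundwater model can be conveyed in a few lines. Below is a minimal Python sketch of an explicit step for a nonleaky confined aquifer on the 20 x 20 grid the abstract cites; all aquifer parameters, the well location, and the pumping rate are invented for the example and are not from the program described.

```python
# Explicit finite-difference step for a nonleaky confined aquifer:
#   S * dh/dt = T * (d2h/dx2 + d2h/dy2) - q   (q > 0 means pumping out).
# Grid size matches the 20 x 20 maximum cited; all values are illustrative.
N = 20                  # 20 x 20 grid of nodes
T = 0.01                # transmissivity, m^2/s (hypothetical)
S = 1e-4                # storativity (hypothetical)
dx = 100.0              # node spacing, m
dt = 20.0               # time step, s; dt*T/(S*dx^2) = 0.2 keeps it stable
h = [[100.0] * N for _ in range(N)]   # initial head, m
q = [[0.0] * N for _ in range(N)]
q[10][10] = 1e-6        # pumping rate per unit area at one node (hypothetical)

def step(h, q):
    """Advance head one explicit time step; boundary nodes stay fixed."""
    new = [row[:] for row in h]
    c = dt * T / (S * dx * dx)
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            lap = h[i+1][j] + h[i-1][j] + h[i][j+1] + h[i][j-1] - 4.0 * h[i][j]
            new[i][j] = h[i][j] + c * lap - dt * q[i][j] / S
    return new

for _ in range(100):
    h = step(h, q)
print(round(h[10][10], 2))   # drawdown develops at the pumped node
```

    A production code such as Prickett and Lonnquist's uses an iterative implicit scheme instead; the explicit step above is only stable when dt*T/(S*dx^2) stays small, which is why the time step here is tiny.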

  12. Potential of minicomputer/array-processor system for nonlinear finite-element analysis

    NASA Technical Reports Server (NTRS)

    Strohkorb, G. A.; Noor, A. K.

    1983-01-01

    The potential of using a minicomputer/array-processor system for the efficient solution of large-scale, nonlinear, finite-element problems is studied. A Prime 750 is used as the host computer, and a software simulator residing on the Prime is employed to assess the performance of the Floating Point Systems AP-120B array processor. Major hardware characteristics of the system such as virtual memory and parallel and pipeline processing are reviewed, and the interplay between various hardware components is examined. Effective use of the minicomputer/array-processor system for nonlinear analysis requires the following: (1) proper selection of the computational procedure and the capability to vectorize the numerical algorithms; (2) reduction of input-output operations; and (3) overlapping host and array-processor operations. A detailed discussion is given of techniques to accomplish each of these tasks. Two benchmark problems with 1715 and 3230 degrees of freedom, respectively, are selected to measure the anticipated gain in speed obtained by using the proposed algorithms on the array processor.

  13. Ruggedized minicomputer hardware and software topics, 1981: Proceedings of the 4th ROLM MIL-SPEC Computer User's Group Conference

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Presentations of a conference on the use of ruggedized minicomputers are summarized. The following topics are discussed: (1) the role of minicomputers in the development and/or certification of commercial or military airplanes in both the United States and Europe; (2) generalized software error detection techniques; (3) real time software development tools; (4) a redundancy management research tool for aircraft navigation/flight control sensors; (5) extended memory management techniques using a high order language; and (6) some comments on establishing a system maintenance scheme. Copies of presentation slides are also included.

  14. An interactive two-dimensional finite element process modelling package for a single user mini-computer

    NASA Astrophysics Data System (ADS)

    Ferguson, R. S.; Doherty, J. G.

    1984-12-01

    The algorithms and models of an accurate finite element based simulation of the processing steps of semiconductor wafer fabrication are described. Properties of the latest generation of single user mini-computers allow the process engineer to use the computer package in an interactive mode. The process steps modelled are implantation, oxidation/diffusion, and annealing. Implantation models are based on the well-tested one-dimensional statistical distributions. Interaction between impurity atoms is assumed to be mainly through the built-in field. To obtain an accurate estimate of the built-in field, the non-linear Poisson equation is solved at the same nodes and in the same elements used for the simulation of the diffusion process. On the assumption that small time steps are taken in the numerical formulation of the diffusion problem, the finite element equation system becomes linear and can be rapidly solved. Each impurity is assumed to diffuse independently in a non-uniform electric field, enhanced by a component due to the other impurities. Coupling between oxidation and diffusion is accounted for by a simple algorithm that deforms the solution mesh after the oxidising agent reacts with silicon to create a larger volume of SiO2.
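
    The linearization the abstract relies on, freezing the concentration-dependent coefficients at the previous time level so each small time step reduces to a linear solve, can be sketched in one dimension. The diffusivity model, initial profile, and step sizes below are hypothetical, and a tridiagonal (Thomas) solve stands in for the paper's finite element system.

```python
# One linearized implicit step of 1D impurity diffusion: the
# concentration-dependent diffusivity D(c) is evaluated at the previous
# time level, so each step is a linear tridiagonal solve.
def thomas(a, b, c, d):
    """Solve a tridiagonal system; a=sub-, b=main, c=super-diagonal, d=rhs."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i-1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i-1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i+1]
    return x

def step(conc, dt, dx, diffusivity):
    """Backward-Euler step with coefficients lagged one time level."""
    n = len(conc)
    a, b, c, d = [0.0]*n, [0.0]*n, [0.0]*n, list(conc)
    for i in range(n):
        D = diffusivity(conc[i])          # frozen -> system stays linear
        r = D * dt / (dx * dx)
        if 0 < i < n - 1:
            a[i], b[i], c[i] = -r, 1 + 2*r, -r
        else:
            b[i] = 1.0                    # fixed-value boundaries
    return thomas(a, b, c, d)

conc = [1.0] * 5 + [0.0] * 15             # step profile, arbitrary units
for _ in range(10):
    conc = step(conc, dt=0.1, dx=1.0, diffusivity=lambda c: 1.0 + 0.5 * c)
```

    With the coefficients lagged this way, no Newton iteration is needed per step; the price, as the abstract notes, is that the time steps must be kept small for the frozen-coefficient approximation to hold.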

  15. MICTPT - A minicomputer general-purpose microwave two-port analysis program

    NASA Technical Reports Server (NTRS)

    Olson, D. H.; Rosenbaum, F. J.

    1974-01-01

    The implementation of a microwave network-analysis program for computers with 4K words of memory is described. The program is capable of the frequency analysis of networks which include interconnections of lumped elements, transmission lines, waveguides, and any two-port which is described by the elements of a scattering matrix. The network can be described mnemonically rather than by numerical codes. For each frequency in the range, the entire network is collapsed into a single equivalent A matrix, and the input impedance and other characteristics are calculated.
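
    The cascading step the abstract mentions, collapsing the whole network into a single equivalent A (ABCD) matrix and reading the input impedance off its entries, can be sketched as follows; the element values and frequency are illustrative, not taken from MICTPT.

```python
import cmath

# Cascade analysis with ABCD (A) matrices: multiply the 2x2 matrices of
# the elements in order, then get Zin = (A*ZL + B) / (C*ZL + D).
def series(Z):
    """A matrix of a series impedance Z."""
    return [[1, Z], [0, 1]]

def shunt(Y):
    """A matrix of a shunt admittance Y."""
    return [[1, 0], [Y, 1]]

def cascade(*mats):
    """Collapse a chain of two-ports into one equivalent A matrix."""
    out = [[1, 0], [0, 1]]
    for m in mats:
        out = [[out[0][0]*m[0][0] + out[0][1]*m[1][0],
                out[0][0]*m[0][1] + out[0][1]*m[1][1]],
               [out[1][0]*m[0][0] + out[1][1]*m[1][0],
                out[1][0]*m[0][1] + out[1][1]*m[1][1]]]
    return out

f = 1e9                              # 1 GHz (hypothetical analysis frequency)
w = 2 * cmath.pi * f
L, C = 7.96e-9, 3.18e-12             # series L, shunt C (hypothetical values)
net = cascade(series(1j * w * L), shunt(1j * w * C))

ZL = 50.0                            # load impedance, ohms
A, B = net[0]
Cm, D = net[1]
Zin = (A * ZL + B) / (Cm * ZL + D)   # input impedance from the A matrix
print(abs(Zin))
```

    Repeating this per frequency point reproduces the frequency sweep the abstract describes; for a reciprocal network the determinant of the equivalent matrix stays 1, a handy sanity check.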

  16. Mini-Computers and the Building Trades: A Guide for Teachers of Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    Asplen, Donald; And Others

    These training materials are designed to help vocational education teachers introduce students to the utilization and installation of mini- and microcomputers in residential and small business buildings. The guide consists of two chapters. Chapter 1 contains general materials, designed to promote awareness, and chapter 2 contains materials which are…

  17. The Georgetown University Library Information System (LIS): a minicomputer-based integrated library system.

    PubMed Central

    Broering, N C

    1983-01-01

    Georgetown University's Library Information System (LIS), an integrated library system designed and implemented at the Dahlgren Memorial Library, is broadly described from an administrative point of view. LIS' functional components consist of eight "user-friendly" modules: catalog, circulation, serials, bibliographic management (including Mini-MEDLINE), acquisitions, accounting, networking, and computer-assisted instruction. This article touches on emerging library services, user education, and computer information services, which are also changing the role of staff librarians. The computer's networking capability brings the library directly to users through personal or institutional computers at remote sites. The proposed Integrated Medical Center Information System at Georgetown University will include interface with LIS through a network mechanism. LIS is being replicated at other libraries, and a microcomputer version is being tested for use in a hospital setting. PMID:6688749

  18. The Georgetown University Library Information System (LIS): a minicomputer-based integrated library system.

    PubMed

    Broering, N C

    1983-07-01

    Georgetown University's Library Information System (LIS), an integrated library system designed and implemented at the Dahlgren Memorial Library, is broadly described from an administrative point of view. LIS' functional components consist of eight "user-friendly" modules: catalog, circulation, serials, bibliographic management (including Mini-MEDLINE), acquisitions, accounting, networking, and computer-assisted instruction. This article touches on emerging library services, user education, and computer information services, which are also changing the role of staff librarians. The computer's networking capability brings the library directly to users through personal or institutional computers at remote sites. The proposed Integrated Medical Center Information System at Georgetown University will include interface with LIS through a network mechanism. LIS is being replicated at other libraries, and a microcomputer version is being tested for use in a hospital setting. PMID:6688749

  19. The chemical abundances of the Cassiopeia A fast-moving knots - Explosive nucleosynthesis on a minicomputer

    NASA Technical Reports Server (NTRS)

    Johnston, M. D.; Joss, P. C.

    1980-01-01

    A simplified nuclear reaction network for explosive nucleosynthesis calculations is described in which only the most abundant nuclear species and the most important reactions linking these species are considered. This scheme permits the exploration of many cases without excessive computational effort. Good agreement with previous calculations employing more complex reaction networks is obtained. This scheme is applied to the observed chemical abundances of the fast-moving knots in the supernova remnant Cassiopeia A and it is found that a wide range of initial conditions could yield the observed abundances. The abundances of four of the knots with significant and different amounts of elements heavier than oxygen are consistent with an origin in material of the same initial composition but processed at different peak temperatures and densities. Despite the observed high oxygen abundances and low abundances of light elements in the knots, they did not necessarily undergo incomplete oxygen burning; in fact, it is not even necessary that oxygen have been present in the initial composition. The agreement between the calculated and observed chemical abundances in Cas A and similar supernova remnants depends primarily upon the relevant nuclear physics and does not provide strong evidence in favor of any particular model of the supernova event.
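
    The flavor of such a reduced network, only the most abundant species and the dominant reactions linking them, can be conveyed with a toy integrator. The species chain, rate laws, and temperature scaling below are schematic placeholders, not the network of the paper.

```python
import math

# Schematic reduced reaction network: three species linked by two
# dominant one-way "burning" steps, advanced with explicit Euler steps.
species = ["O16", "Si28", "Fe56"]                 # illustrative chain only
Y = {"O16": 1.0, "Si28": 0.0, "Fe56": 0.0}        # abundances (arbitrary units)

def rates(Y, temperature):
    """Two toy temperature-sensitive rate laws (not real nuclear rates)."""
    k1 = math.exp(-5.0 / temperature)             # O16 -> Si28
    k2 = math.exp(-8.0 / temperature)             # Si28 -> Fe56
    r1 = k1 * Y["O16"]
    r2 = k2 * Y["Si28"]
    return {"O16": -r1, "Si28": r1 - r2, "Fe56": r2}

def integrate(Y, temperature, dt, steps):
    """Explicit Euler integration; each reaction conserves total abundance."""
    for _ in range(steps):
        dY = rates(Y, temperature)
        Y = {s: Y[s] + dt * dY[s] for s in species}
    return Y

final = integrate(Y, temperature=3.0, dt=0.01, steps=1000)
```

    Varying the peak temperature in a scheme like this shifts the final mix between intermediate and heavy species, which is the kind of parameter exploration a small network makes cheap enough for a minicomputer.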

  20. A Devoted Mini-Computer System for the Management of Clinical and Laboratory Data in an Intensive Care Unit

    PubMed Central

    Shinozaki, Tamotsu; Deane, Robert S.; Mazuzan, John E.

    1982-01-01

    In order to handle a large amount of clinical, laboratory, and physiological information in intensive care units, a prototype distributed computer system is used at the Medical Center Hospital of Vermont. The system enables us to do extra tasks without increasing clerical help, e.g., a progress note for respiratory care, statistical data for unit management, computation of cardiac and pulmonary parameters, an IV schedule for vasoactive drugs, daily compilation of TISS and APACHE scores, and data collection for audits and special products. Special attention is paid to computer/user interaction.

  1. Mini-computer programs for bioequivalence testing of pharmaceutical drug formulations in two-way cross-over studies. Including a survey of current parametric evaluation techniques.

    PubMed

    Wijnand, H P; Timmer, C J

    1983-01-01

    For bioequivalence testing of pharmaceutical formulations of the same drug entity, it is not sufficient to carry out an analysis of variance on the characteristic to be evaluated (e.g., area under the plasma level vs time curve, half-life of elimination, time to plasma-peak level, plasma peak level) and to establish 'classical' 95% confidence intervals for the difference or the ratio of the characteristic concerned. In the past 10 years, several approaches have been proposed as an aid in decision-making: Westlake's 95% intervals, Rodda and Davis' probabilities, Fluehler's posterior probability histograms and the evaluation of the residual variation coefficient. A survey of these approaches is given, together with a discussion of their merits, their differences and their similarities. It is recommended that the final evaluation should be supported by probability density plots, which facilitate easy understanding of the differences and similarities between the various approaches. A bioequivalence study with two types of oral tablets containing bepridil, a new anti-anginal drug, is used as an example. Computer programs are presented, which enable the user to easily apply the various approaches in order to meet requirements of regulatory agencies. PMID:6607152
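
    As a simplified stand-in for the 'classical' interval the abstract mentions, the sketch below computes a 95% confidence interval for the geometric-mean ratio of AUC from log-transformed paired data, ignoring the period and sequence effects a full crossover analysis of variance would include; the data are invented.

```python
import math
import statistics

# Invented AUC values for 8 subjects who received both formulations.
auc_test = [88.0, 102.0, 95.0, 110.0, 92.0, 105.0, 99.0, 97.0]
auc_ref  = [90.0, 100.0, 98.0, 104.0, 95.0, 101.0, 96.0, 99.0]

# Work on log scale so the interval back-transforms to a ratio.
logdiff = [math.log(t) - math.log(r) for t, r in zip(auc_test, auc_ref)]
n = len(logdiff)
mean = statistics.fmean(logdiff)
se = statistics.stdev(logdiff) / math.sqrt(n)
t_crit = 2.365                      # t(0.975, df = 7), from tables

lo, hi = mean - t_crit * se, mean + t_crit * se
ratio_lo, ratio_hi = math.exp(lo), math.exp(hi)   # CI for test/reference ratio
print(round(ratio_lo, 3), round(ratio_hi, 3))
```

    The decision aids the abstract surveys (Westlake's symmetric intervals, posterior probabilities) start from the same ANOVA quantities but summarize them differently; this paired version only illustrates the interval construction itself.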

  2. CS258 S99 1 NOW Handout Page 1

    E-print Network

    California at Berkeley, University of

    networked systems, system architecture currently lags technology Mainframe Minicomputer Personal Computer … that things could ever be different: mainframe -> mini, mini -> workstation -> PC, PC -> ??? It is always…

  3. Computer program and user documentation medical data tape retrieval system

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    This volume provides several levels of documentation for the program module of the NASA medical directorate mini-computer storage and retrieval system. A biomedical information system overview describes some of the reasons for the development of the mini-computer storage and retrieval system. It briefly outlines all of the program modules which constitute the system.

  4. Downsizing a database platform for increased performance and decreased costs

    SciTech Connect

    Miller, M.M.; Tolendino, L.F.

    1993-06-01

    Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.

  5. The Perils of Personals: Microcomputers in Libraries.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1985-01-01

    Explores microcomputer revolution and assesses role of microcomputers in libraries. Highlights include characteristics of three types of computers (mainframes, minicomputers, microcomputers); hardware limitations of microcomputers (storage capacity, processing speed); advancing technology; the local area network; software problems; and…

  6. The Microcomputer Revolution.

    ERIC Educational Resources Information Center

    Fosdick, Howard

    1980-01-01

    Examines the development of the microcomputer and focuses on its potential for library automation. The characteristics of microcomputers and minicomputers are contrasted and a selected annotated bibliography includes a list of specialty magazines on microcomputers. (RAA)

  7. Design and performance of a large vocabulary discrete word recognition system. Volume 2: Appendixes [flow charts and users manual]

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The users manual for the word recognition computer program contains flow charts of the logical diagram, the memory map for templates, the speech analyzer card arrangement, minicomputer input/output routines, and assembly language program listings.

  8. The Electronic Hermit: Trends in Library Automation.

    ERIC Educational Resources Information Center

    LaRue, James

    1988-01-01

    Reviews trends in library software development including: (1) microcomputer applications; (2) CD-ROM; (3) desktop publishing; (4) public access microcomputers; (5) artificial intelligence; (6) mainframes and minicomputers; and (7) automated catalogs. (MES)

  9. Fourier Transform Methods of Deconvolving Scintigrams Using a General Purpose Digital Computer 

    E-print Network

    Boardman, A. Keith

    1978-01-01

    The adaptation of a general purpose laboratory minicomputer for nuclear medicine imaging is described. Electronic interfaces have been designed and constructed to link nucleonic equipment to a PDP 12 computer. A computer ...

  10. ORCA: A Visualization Toolkit for High-Dimensional Data

    E-print Network

    Washington at Seattle, University of

    on an IDIIOM vector scope driven by a Varian 620 minicomputer connected to an IBM 360/91 mainframe. It so monopolized the computing power of the mainframe that all other computing jobs came to a standstill while…

  11. Software Evaluation by the Manual.

    ERIC Educational Resources Information Center

    Mullins, Carolyn

    1987-01-01

    A systematic assessment procedure designed to increase success at buying supportable software is presented. Examples used are from programs for the IBM PC, but most comments apply to other software and to software for minicomputers and mainframes. (MLW)

  12. Turnkey CAD/CAM selection and evaluation

    NASA Technical Reports Server (NTRS)

    Moody, T.

    1980-01-01

    The methodology to be followed in evaluating and selecting a computer system for manufacturing applications is discussed. Mainframes and minicomputers are considered. Benchmark evaluations, demonstrations, and contract negotiations are discussed.

  13. Transcription of the Workshop on General Aviation Advanced Avionics Systems

    NASA Technical Reports Server (NTRS)

    Tashker, M. (editor)

    1975-01-01

    Papers are presented dealing with the design of reliable, low cost, advanced avionics systems applicable to general aviation in the 1980's and beyond. Sensors, displays, integrated circuits, microprocessors, and minicomputers are among the topics discussed.

  14. Learning by Interactive Programming: Microcomputer Applications.

    ERIC Educational Resources Information Center

    De Laurentiis, Emiliano

    1980-01-01

    Clarifies often misconstrued distinctions with regard to microcomputers, minicomputers, and maxicomputers. Criteria for educational use of microcomputers are examined, including its potential for language and peripheral expansion, and its communication capabilities. (MER)

  15. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology, which produced first the central computer system and finally the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro and minicomputers.

  16. Vault Safety and Inventory System users manual, PRIME 2350. Revision 1

    SciTech Connect

    Downey, N.J.

    1994-12-14

    This revision is issued to request review of the attached document: VSIS User Manual, PRIME 2350, which provides user information for the operation of the VSIS (Vault Safety and Inventory System). It describes operational aspects of the Prime 2350 minicomputer and vault data acquisition equipment. It also describes the User's Main Menu and menu functions, including REPORTS. System procedures for the Prime 2350 minicomputer are also covered.

  17. User microprogrammable processors for high data rate telemetry preprocessing

    NASA Technical Reports Server (NTRS)

    Pugsley, J. H.; Ogrady, E. P.

    1973-01-01

    The use of microprogrammable processors for the preprocessing of high data rate satellite telemetry is investigated. The following topics are discussed along with supporting studies: (1) evaluation of commercial microprogrammable minicomputers for telemetry preprocessing tasks; (2) microinstruction sets for telemetry preprocessing; and (3) the use of multiple minicomputers to achieve high data processing rates. The simulation of small microprogrammed processors is discussed along with examples of microprogrammed processors.

  18. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.

  19. Computer Program and User Documentation Medical Data Input System

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    Several levels of documentation are presented for the program module of the NASA medical directorate minicomputer storage and retrieval system. The biomedical information system overview gives reasons for the development of the minicomputer storage and retrieval system. It briefly describes all of the program modules which constitute the system. A technical discussion oriented to the programmer is given. Each subroutine is described in enough detail to permit in-depth understanding of the routines and to facilitate program modifications. The program utilization section may be used as a users guide.

  20. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low cost computer systems for information retrieval and analysis, the testing of minicomputer based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  1. A program for mass spectrometer control and data processing analyses in isotope geology; written in BASIC for an 8K Nova 1210 computer

    USGS Publications Warehouse

    Stacey, J.S.; Hope, J.

    1975-01-01

    A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. Assembly language driver programs and interface hardware descriptions for the Nova 1210 are included.
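
    The two correction steps the abstract names, removing the rubidium contribution at mass 87 and normalizing for mass fractionation, can be sketched as follows. The peak intensities are invented, and the linear fractionation law shown is one common choice rather than necessarily the one the program used; the 86Sr/88Sr and 87Rb/85Rb constants are standard accepted values.

```python
# Strontium ratio reduction of the kind the abstract describes:
# correct the mass-87 peak for the Rb interference, then normalize
# 87Sr/86Sr for fractionation against 86Sr/88Sr = 0.1194 (linear law).
I85, I86, I87, I88 = 0.002, 1.000, 0.7105, 8.340   # beam intensities (invented)

RB87_85 = 0.38571       # natural 87Rb/85Rb
SR86_88_TRUE = 0.1194   # accepted normalization value for 86Sr/88Sr

# Remove the 87Rb contribution to the mass-87 peak via the mass-85 peak:
I87_sr = I87 - RB87_85 * I85

# Per-amu linear fractionation factor from the 86/88 pair (2 amu apart):
f = (SR86_88_TRUE / (I86 / I88) - 1.0) / 2.0

# Masses 87 and 86 differ by 1 amu, so apply one unit of the factor:
r87_86 = (I87_sr / I86) * (1.0 + f)
print(round(r87_86, 5))
```

    In the program described, this arithmetic would run after each block of peak-switched measurements, with the corrected ratios then averaged across blocks.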

  2. Word Processing at Carnegie-Mellon University.

    ERIC Educational Resources Information Center

    Wineland, Joyce A.

    1982-01-01

    Carnegie-Mellon University is using word processing in several modes to enhance communication: dedicated word processing system, word processing package on a general-purpose minicomputer, computer mail, text editor, text formatter, spelling checker, and programmable printer. Each mode of word processing is discussed. (Author/MLW)

  3. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  4. Rotating turbine blade pyrometer

    NASA Technical Reports Server (NTRS)

    Buchele, D. R.; Lesco, D. J.

    1974-01-01

    Non-contacting pyrometer system optically measures surface temperature distribution on rotating turbine blade, comprising line-by-line scan via fiber optic probe. Each scan line output is converted to digital signals, temporarily stored in buffer memory, and then processed in minicomputer for display as temperature.

  5. Commonalities in Pedagogy Situating Cell Phone Use in the Classroom

    ERIC Educational Resources Information Center

    Abend, Laurie Lafer

    2013-01-01

    Technology has become embedded in all aspects of students' lives as they increasingly rely on mobile technology devices such as cell phones to access and share information. Cell phones function as portable, affordable, and ubiquitous mini-computers, yet few teachers have leveraged the benefits of cell phone technology for teaching and learning…

  6. Fossil-fuel power plants: Computer systems for power plant control, maintenance, and operation. October 1976-December 1989 (A Bibliography from the COMPENDEX data base). Report for October 1976-December 1989

    SciTech Connect

    Not Available

    1990-02-01

    This bibliography contains citations concerning fossil-fuel power plant computer systems. Minicomputer and microcomputer systems used for monitoring, process control, performance calculations, alarming, and administrative applications are discussed. Topics emphasize power plant control, maintenance and operation. (Contains 240 citations fully indexed and including a title list.)

  7. Fossil fuel power plants: Computer systems for power plant control, maintenance, and operation. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-07-01

    The bibliography contains citations concerning fossil fuel power plant computer systems. Minicomputer and microcomputer systems used for monitoring, process control, performance calculations, alarming, and administrative applications are discussed. Topics emphasize power plant control, maintenance and operation. (Contains 250 citations and includes a subject term index and title list.)

  8. The Use of a Microcomputer Based Array Processor for Real Time Laser Velocimeter Data Processing

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1990-01-01

    The application of an array processor to laser velocimeter data processing is presented. The hardware is described along with the method of parallel programming required by the array processor. A portion of the data processing program is described in detail. The increase in computational speed of a microcomputer equipped with an array processor is illustrated by comparative testing with a minicomputer.

  9. The prediction of acoustical particle motion using an efficient polynomial curve fit procedure

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A procedure is examined whereby the acoustic modal parameters, natural frequencies and mode shapes, in the cavities of transportation vehicles are determined experimentally. The acoustic mode shapes are described in terms of the particle motion. The acoustic modal analysis procedure is tailored to existing minicomputer based spectral analysis systems.

  10. Development of a Plasma Panel Hard Copy Unit. Final Report for Period February 1975-November 1975.

    ERIC Educational Resources Information Center

    Gardner, Edward M.; McKnight, Lyle R.

    This report describes an investigation of a technique for producing paper copies of instructional computer terminal displays. Such a device appears to be a useful adjunct for the development of computer-assisted instructional programs by authors. A digital device was simulated with a minicomputer; the techniques used to construct this device are…

  11. Intel Pentium Processor Author: Saraju P. Mohanty

    E-print Network

    Mohanty, Saraju P.

    Processing Unit". The "microprocessor" means the CPU on a single chip. Topics covered include the second-level (L2) cache, SISD, SIMD, processor serial ID (chip ID), VLIW, EPIC, superscalar factor, and pipeline depth. The present age is the age of microcomputers, leaving behind the mainframes and the minicomputers.

  12. Voice Interactive Analysis System Study. Final Report, August 28, 1978 through March 23, 1979.

    ERIC Educational Resources Information Center

    Harry, D. P.; And Others

    The Voice Interactive Analysis System study continued research and development of the LISTEN real-time, minicomputer-based connected speech recognition system, within NAVTRAEQUIPCEN's program of developing automatic speech technology in support of training. An attempt was made to identify the most effective features detected by the TTI-500 model…

  13. Automatic visual inspection of hybrid microcircuits

    SciTech Connect

    Hines, R.E.

    1980-05-01

    An automatic visual inspection system using a minicomputer and a video digitizer was developed for inspecting hybrid microcircuits (HMC) and thin-film networks (TFN). The system performed well in detecting missing components on HMCs and reduced the testing time for each HMC by 75%.

  14. CURRICULUM VITAE ping.hsu@sjsu.edu

    E-print Network

    Su, Xiao

    with an EAI 640 minicomputer. INDUSTRY PROJECTS: Kenetech Windpower: Developed a real-time rotor resistance. Studied and proposed a way to detect speed measurement error (tachometer failure) by power measurement electronics and electrical equipment. Trace Technologies: Studied a highly efficient grid-tied photovoltaic

  15. Method of smoothing laser range observations by corrections of orbital parameters and station coordinates

    NASA Astrophysics Data System (ADS)

    Lala, P.; Thao, Bui Van

    1986-11-01

    The first step in the treatment of satellite laser ranging data is its smoothing and rejection of incorrect points. The proposed method uses the comparison of observations with ephemerides and iterative matching of corresponding parameters. The method of solution and a program for a minicomputer are described. Examples of results for satellite Starlette are given.
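
    The iterative smoothing and rejection scheme the abstract describes can be sketched as a sigma-rejection loop. This is an assumption-laden illustration, not the authors' program: residuals of the range observations against the ephemeris are repeatedly screened at k standard deviations until no further points are rejected.

```python
# Illustrative sketch (not the authors' code) of iterative rejection of
# incorrect points: residuals against an ephemeris are formed, outliers
# beyond k standard deviations of the current set are discarded, and the
# statistics are recomputed until the retained set is stable.
import statistics

def reject_outliers(residuals, k=3.0, max_iter=10):
    kept = list(residuals)
    for _ in range(max_iter):
        mean = statistics.fmean(kept)
        sigma = statistics.pstdev(kept)
        if sigma == 0.0:
            break
        trimmed = [r for r in kept if abs(r - mean) <= k * sigma]
        if len(trimmed) == len(kept):
            break   # stable: no further rejections
        kept = trimmed
    return kept
```

Note that with very few observations a single blunder cannot exceed 3 sigma of a set that still contains it, so small samples need a tighter threshold (or more points) for the first pass to bite.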

  16. NEUTRON ACTIVATION ANALYSIS FOR SIMULTANEOUS DETERMINATION OF TRACE ELEMENTS IN AMBIENT AIR COLLECTED ON GLASS-FIBER FILTERS

    EPA Science Inventory

    Arsenic and 25 other elements are simultaneously determined in ambient air samples collected on glass-fiber filter composites at 250 United States sites. The instrumental neutron activation analysis (NAA) technique combined with the power of a dedicated mini-computer resulted in...

  17. DATA ACQUISITION SYSTEM FOR RAPID KINETIC EXPERIMENTS

    EPA Science Inventory

    A data acquisition system has been developed to collect, analyze and store large volumes of rapid kinetic data measured from a stopped-flow spectrophotometer. A digital minicomputer, with an A/D converter, tape drive unit and formatter, analog recorder, oscilloscope, and input/ou...

  18. Program for Development of Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Culbert, Chris; Lopez, Frank

    1987-01-01

    C Language Integrated Production System (CLIPS) computer program is shell for developing expert systems. Designed to enable research, development, and delivery of artificial intelligence on conventional computers. Primary design goals for CLIPS are portability, efficiency, and functionality. Meets or out-performs most microcomputer- and minicomputer-based artificial-intelligence tools. Written in C.

  19. A High Resolution Graphic Input System for Interactive Graphic Display Terminals. Appendix B.

    ERIC Educational Resources Information Center

    Van Arsdall, Paul Jon

    The search for a satisfactory computer graphics input system led to this version of an analog sheet encoder which is transparent and requires no special probes. The goal of the research was to provide high resolution touch input capabilities for an experimental minicomputer based intelligent terminal system. The technique explored is compatible…

  20. Operating manual for the RRL 8 channel data logger

    NASA Technical Reports Server (NTRS)

    Paluch, E. J.; Shelton, J. D.; Gardner, C. S.

    1979-01-01

    A data collection device which takes measurements from external sensors at user specified time intervals is described. Three sensor ports are dedicated to temperature, air pressure, and dew point. Five general purpose sensor ports are provided. The user specifies when the measurements are recorded as well as when the information is read or stored in a minicomputer or a paper tape.

  1. Experiential Based Transcripts and a Test Item Data Bank: Distributed Computing Applications.

    ERIC Educational Resources Information Center

    Edwards, Jesse C.; And Others

    1981-01-01

    The design and development of a minicomputer-based information system for physician assistant students at the University of Nebraska Medical Center is reported. A system built on modern concepts of experiential based learning and a computer-assisted instruction curriculum is described. (Author/MLW)

  2. A practical Hadamard transform spectrometer for astronomical application

    NASA Technical Reports Server (NTRS)

    Tai, M. H.

    1977-01-01

    The mathematical properties of Hadamard matrices and their application to spectroscopy are discussed. A comparison is made between Fourier and Hadamard transform encoding in spectrometry. The spectrometer is described and its laboratory performance evaluated. The algorithm and programming of the inverse transform are given. A minicomputer is used to recover the spectrum.
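
    The inverse-transform step mentioned above can be illustrated with the Sylvester construction of Hadamard matrices; the instrument's actual encoding mask is not reproduced here, and the matrix order and spectrum values are arbitrary.

```python
# A minimal sketch of Hadamard-transform encoding and decoding using the
# Sylvester construction, illustrating the inverse-transform step the
# abstract mentions. The matrix order must be a power of two.

def hadamard(n):
    # Sylvester construction: double the matrix until it reaches order n.
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-v for v in row] for row in H]
    return H

def encode(H, x):
    # Encoded measurements: y = H . x
    return [sum(h * xi for h, xi in zip(row, x)) for row in H]

def decode(H, y):
    n = len(H)
    # Sylvester matrices are symmetric with H . H = n . I,
    # so the spectrum is recovered as x = H . y / n.
    return [sum(h * yi for h, yi in zip(row, y)) / n for row in H]
```

The round trip decode(H, encode(H, x)) returns the original spectrum exactly, which is the property the multiplexing scheme relies on.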

  3. Mission of the Future. Proceedings of the Annual Convention of the Association for the Development of Computer-Based Instructional Systems. Volume III: Users Interest Groups (San Diego, California, February 27 to March 1, 1979).

    ERIC Educational Resources Information Center

    Association for the Development of Computer-based Instructional Systems.

    The third of three volumes of papers presented at the 1979 ADCIS convention, this collection includes 30 papers presented to special interest groups--implementation, minicomputer users, National Consortium for Computer Based Music Instruction, and PLATO users. Papers presented to the implementation interest group were concerned with faculty…

  4. Computer Managed Instruction in Navy Training.

    ERIC Educational Resources Information Center

    Middleton, Morris G.; And Others

    An investigation was made of the feasibility of computer-managed instruction (CMI) for the Navy. Possibilities were examined regarding a centralized computer system for all Navy training, minicomputers for remote classes, and shipboard computers for on-board training. The general state of the art and feasibility of CMI were reviewed, alternative…

  5. Use of Medical Library Systems--Geographic Analysis.

    ERIC Educational Resources Information Center

    Miido, Helis

    1992-01-01

    Presents a geographic analysis of results of a survey of medical libraries in the United States, Canada, and Europe that examined the automation of book and serials processing. Levels of automation are investigated, geographic differences between the use of personal computers versus mainframe/minicomputers are discussed, and software systems are…

  6. Mass Storage Systems.

    ERIC Educational Resources Information Center

    Ranade, Sanjay; Schraeder, Jeff

    1991-01-01

    Presents an overview of the mass storage market and discusses mass storage systems as part of computer networks. Systems for personal computers, workstations, minicomputers, and mainframe computers are described; file servers are explained; system integration issues are raised; and future possibilities are suggested. (LRW)

  7. Organizational Strategies for End-User Computing Support.

    ERIC Educational Resources Information Center

    Blackmun, Robert R.; And Others

    1988-01-01

    Effective support for end users of computers has been an important issue in higher education from the first applications of general purpose mainframe computers through minicomputers, microcomputers, and supercomputers. The development of end user support is reviewed and organizational models are examined. (Author/MLW)

  8. Local Databases and Training.

    ERIC Educational Resources Information Center

    Stern, David

    1991-01-01

    Discusses local database development and training experiences at the University of Illinois at Urbana-Champaign libraries. Topics discussed include when to automate; available resources; microcomputers versus mainframes or minicomputers; software; database design factors; test versions; and training. Examples of sample records, help screens,…

  9. What's Where In Software: An Update.

    ERIC Educational Resources Information Center

    Currents, 1995

    1995-01-01

    A directory lists computer software vendors offering software useful in administering college alumni and development programs. Listings include client/server system vendors and minicomputer and mainframe system vendors. Each listing contains the vendor name and address, contact person, software title(s), cost, hardware requirements, and client…

  10. Sunrise to Sunset Lifelong Learning Via Microwave Networks: From a National Heritage.

    ERIC Educational Resources Information Center

    Hart, Russ A.

    Of necessity, adult educators will be turning to technological delivery forms to meet the insistent call for increasing numbers of programs. As teleconferencing, television, microwave, minicomputer, satellite, fiberoptic, and laser technologies continue to expand, they hold promise of educating millions of adult students on and off campus. A…

  11. New starts in research and development, 1982

    NASA Technical Reports Server (NTRS)

    Grosson, J.

    1981-01-01

    An outline in slide form, of some areas of U.S. Navy research and development utilizing airborne minicomputers is presented. The following program considerations are addressed: (1) research and engineering management; (2) budgeting; (3) equipment specifications and construction materials; (4) computer applications; (5) technological capabilities, utilization, and transfer; and (6) military applications.

  12. Technological Discontinuities and Organizational Environments.

    ERIC Educational Resources Information Center

    Tushman, Michael L.; Anderson, Philip

    1986-01-01

    Technological effects on environmental conditions are analyzed using longitudinal data from the minicomputer, cement, and airline industries. Technology evolves through periods of incremental change punctuated by breakthroughs that enhance or destroy the competence of firms. Competence-destroying discontinuities increase environmental turbulence;…

  13. Technological Discontinuities and Dominant Designs: A Cyclical Model of Technological Change.

    ERIC Educational Resources Information Center

    Anderson, Philip; Tushman, Michael L.

    1990-01-01

    Based on longitudinal studies of the cement, glass, and minicomputer industries, this article proposes a technological change model in which a technological breakthrough, or discontinuity, initiates an era of intense technical variation and selection, culminating in a single dominant design and followed by a period of incremental technical…

  14. [title illegible in source]

    E-print Network

    taken throughout the United States by two teams of anthropometrists using an automated anthropometric data acquisition system. Standard anthropometers, calipers, and tape devices were modified to read electronically and input dimensional data directly to a mini-computer for data processing and storage. Summary

  15. UM-HSRI-BI-75-5 FINAL REPORT MAY 31, 1975

    E-print Network

    of a three-year study designed to collect, analyze, and reduce selected anthropometric data on 4027 infants. Data was recorded automatically by a portable NOVA 1220 mini-computer system. Center of gravity, diameters, hand clearance, and grip size dimensions are covered. Data are presented in both tabular and graphical

  16. An Examination of the Potential Relationship between Technology and Persistence among At-Risk College Students

    ERIC Educational Resources Information Center

    Hughey, Aaron W.; Manco, Charlene M.

    2012-01-01

    Academically underprepared college students, i.e., those identified as needing developmental (remedial) English, mathematics and reading courses in order to maximize their potential for academic success at college-level studies, were provided with the opportunity to rent, for a minimal, subsidized fee, mini-computers bundled with digital course…

  17. 48 CFR 1523.7001 - Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... other portable computers. (4) PC printers - laser, inkjet or dot matrix (stand-alone or networked). (5) High-speed printers used on a PC network (less than approximately 20 pages per minute). (6) Monitors...) Workstations. (2) File servers. (3) Mainframe equipment. (4) Minicomputers. (5) High-speed printers used...

  18. ULTRASONIC BIOLOGICAL EFFECT EXPOSURE SYSTEM W. D. O'Brien, Jr., C. L. Christman and S. Yarrow

    E-print Network

    Illinois at Urbana-Champaign, University of

    The system can reproduce the ultrasonic signals of medical diagnostic and therapeutic equipment. A minicomputer controls the exposure time and the net electrical power to the ultrasonic transducer assembly

  19. An Off-Line Simulation System for Development of Real-Time FORTRAN Programs.

    ERIC Educational Resources Information Center

    White, James W.

    Implementation of an ISA FORTRAN standard for executive functions and process input-output within a simulation system called MINIFOR provides a useful real-time program development tool for small single function, dedicated minicomputers having a FORTRAN compiler but limited program development aids. A FORTRAN-based pre-compiler is used off-line to…

  20. Radioactivities in returned lunar materials and in meteorites

    NASA Technical Reports Server (NTRS)

    Fireman, E. L.

    1984-01-01

    Carbon-14 terrestrial ages were determined with low-level minicomputers and accelerator mass spectrometry on 1 Yamato and 18 Allan Hills and nearby-sited meteorites. Techniques for an accelerator mass spectrometer which make C(14) measurements on small samples were developed. Also, Be(10) concentrations were measured in Byrd core and Allan Hills ice samples.

  1. The ILS--The Pentagon Library's Experience.

    ERIC Educational Resources Information Center

    Mullane, Ruth

    1984-01-01

    Describes implementation of five subsystems of Integrated Library System's (ILS) version 2.1 (minicomputer-based automated library system) at the Pentagon Library: online catalog (search strategies, user acceptance); bibliographic subsystems (cataloging, retrospective conversion); circulation; serials check-in; administrative subsystem (report…

  2. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews, and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  3. Cloud Computing and the Power to Choose

    ERIC Educational Resources Information Center

    Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo

    2010-01-01

    Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…

  4. A system for the management of requests at an image data bank. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Debarrosaguirre, J. L. (principal investigator)

    1984-01-01

    An automated system was implemented to supersede existing manual procedures in fulfilling user requests made to a remote sensing data bank, concerning specifically LANDSAT imagery. The system controls the several production steps from request entry to the shipment of each final product. Special solutions and techniques were employed due to the severe limitations, in both hardware and software, of the host minicomputer system.

  5. A Guide to Using the Bibliographic Features of the Integrated Library System (ILS).

    ERIC Educational Resources Information Center

    King, Susan G.

    This manual provides guidance in the use of the Integrated Library System (ILS), a library minicomputer system in which all automated library functions are processed against a single database. It is oriented toward ILS users with no ADP training or experience. Written in MUMPS, a higher-level language, the system includes the following…

  6. Electronic engineer's design station user's guide

    SciTech Connect

    Magnuson, W.G. Jr.; Shectman, R.M.; Hatfield, L.; Willett, G.W.; Loomis, H.H. Jr.

    1981-06-01

    This guide is a description of how the Design Station is used to enter a designer's sketch on the minicomputer-based interactive graphics system. Schematic construction, component placement, output control, and save/restore of designs are all described in detail. The interactive graphics menu options are described and an explanation of their actions is given.

  7. Equipping a robot with omnidirectional depth

    E-print Network

    Conradt, Jörg

    The first minicomputer I tried was the Raspberry Pi, with Raspbian as its operating system. It turned out

  8. Integrated library systems.

    PubMed Central

    Goldstein, C M

    1983-01-01

    The development of integrated library systems is discussed. The four major discussion points are (1) initial efforts; (2) network resources; (3) minicomputer-based systems; and (4) beyond library automation. Four existing systems are cited as examples of current systems. PMID:6354321

  9. LS/2000--The Integrated Library System from OCLC.

    ERIC Educational Resources Information Center

    Olson, Susan

    1984-01-01

    Discusses design features of the Online Catalog of LS/2000, OCLC's enhanced version of Integrated Library System. This minicomputer-based system provides bibliographic file maintenance, circulation control, and online catalog searching. Examples of available displays--holdings, full MARC, work forms, keyword entry, index selection, brief citation,…

  10. The Rise of K-12 Blended Learning: Profiles of Emerging Models

    ERIC Educational Resources Information Center

    Staker, Heather

    2011-01-01

    Some innovations change everything. The rise of personal computers in the 1970s decimated the mini-computer industry. TurboTax forever changed tax accounting, and MP3s made libraries of compact discs obsolete. These innovations bear the traits of what Harvard Business School Professor Clayton M. Christensen terms a "disruptive innovation."…

  11. 12 October 1990 EUROPHYSICS CONFERENCE

    E-print Network

    Kozak, Victor R.

    under construction: VEPP-2H, NAP-M of the "Elektronika" series. In the first systems only the most vital parts were automated, i.e., the power supply, by minicomputers which have been in common use in the 70s. The application of a sufficiently powerful computer to

  12. Industrial robots and robotics

    SciTech Connect

    Kafrissen, S.; Stephens, M.

    1984-01-01

    This book discusses the study of robotics. It provides information on hardware, software, applications, and economics. Eleven chapters examine the following: Minicomputers, Microcomputers, and Microprocessors; The Servo-Control System; The Activators; Robot Vision Systems; and Robot Workcell Environments. Twelve appendices supplement the data.

  13. Cactus

    SciTech Connect

    Sexton, R.L.

    1983-03-01

    The CACTUS project (computer-aided control, tracking, and updating system) was initiated by the Bendix Kansas City Division to address specific work-in-process problems encountered in a cable department. Since then, the project has been expanded to additional electrical manufacturing departments because of potential productivity gains from the system. The philosophy of CACTUS is to add an element of distributed data processing to the centralized data processing system currently in use for control of work in process. Under this system, the existing chain of communications between the host computer and the CRT terminals in a department is severed. A mini-computer established in the department communicates directly with the central system, and departmental communication is then established with the mini-computer. The advantages, disadvantages, operating performance, and economics of the system are discussed.

  14. Impact of some architectural features of the implementation of a concurrent pascal machine

    SciTech Connect

    Kim, K.H.; Abou-el-naga, A.

    1982-01-01

    The Concurrent Pascal machine (CPM), a virtual machine designed to support concurrent processes, was implemented on an 8-bit microcomputer as the first step toward constructing a fault-tolerant microcomputer network. The CPM architecture, whose first implementation was based on a PDP 11/45 minicomputer, reflects considerable influence of the PDP 11/45 architecture. The architectural differences between the 16-bit minicomputer and the 8-bit microcomputer which have significant impact on the difficulty of implementing CPMs are analyzed. Some details on the implementation of the 16-bit virtual machine (CPM) on the 8-bit microcomputer are then presented, along with some approaches for tuning the CPM architecture to yield more efficient implementations on microcomputers. 7 references.

  15. TDRSS data handling and management system study. Ground station systems for data handling and relay satellite control

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a two-phase study of the Data Handling and Management System (DHMS) are presented. An original baseline DHMS is described. Its estimated costs are presented in detail. The DHMS automates the Tracking and Data Relay Satellite System (TDRSS) ground station's functions and handles both the forward and return link user and relay satellite data passing through the station. Direction of the DHMS is effected via a TDRSS Operations Control Central (OCC) that is remotely located. A composite ground station system, a modified DHMS (MDHMS), was conceptually developed. The MDHMS performs both the DHMS and OCC functions. Configurations and costs are presented for systems using minicomputers and midicomputers. It is concluded that a MDHMS should be configured with a combination of the two computer types. The midicomputers provide the system's organizational direction and computational power, and the minicomputers (or interface processors) perform repetitive data handling functions that relieve the midicomputers of these burdensome tasks.

  16. Display-management system for MFTF

    SciTech Connect

    Nelson, D.O.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is controlled by 65 local control microcomputers which are supervised by a local network of nine 32-bit minicomputers. Associated with seven of the nine computers are state-of-the-art graphics devices, each with extensive local processing capability. These devices provide the means for an operator to interact with the control software running on the minicomputers. It is critical that the information the operator views accurately reflects the current state of the experiment. This information is integrated into dynamically changing pictures called displays. The primary organizational component of the display system is the software-addressable segment. The segments created by the display creation software are managed by display managers associated with each graphics device. Each display manager uses sophisticated storage management mechanisms to keep the proper segments resident in the local graphics device storage.

  17. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two-dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP 1000 series E with an RTE-2 disc operating system and 32K words of memory. A Grinnel 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on an 800 BPI, 9-track tape. Four 512 x 512 images representing four spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.
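
    A two-dimensional recursive filter can be sketched minimally as follows, under assumptions of mine rather than the report's actual filter design: a first-order recursion is applied along each row and then each column, and this per-pixel recursion is what makes such filters cheap enough for near-real-time use on a minicomputer.

```python
# Toy sketch of a separable two-dimensional recursive (IIR) low-pass
# filter: the first-order recursion y[n] = (1 - a)*x[n] + a*y[n-1] is
# run along each row, then along each column of the intermediate result.
# The coefficient a and the row/column ordering are assumptions.

def smooth_1d(line, a=0.5):
    out = []
    prev = line[0]          # initialize the recursion at the first sample
    for x in line:
        prev = (1.0 - a) * x + a * prev
        out.append(prev)
    return out

def smooth_2d(image, a=0.5):
    rows = [smooth_1d(r, a) for r in image]
    cols = list(zip(*rows))                      # transpose
    filtered_cols = [smooth_1d(list(c), a) for c in cols]
    return [list(r) for r in zip(*filtered_cols)]  # transpose back
```

A constant image passes through unchanged, while an isolated bright pixel is attenuated and smeared toward increasing row and column indices, as expected of a causal recursive smoother.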

  18. A computer-aided design system geared toward conceptual design in a research environment. [for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Stack, S. H.

    1981-01-01

    A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.

  19. Optical computer switching network

    NASA Technical Reports Server (NTRS)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.
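
    The connection control described above can be modeled in software as an N x N crossbar. This toy model is an analogy of mine, not the paper's design: the controller simply refuses a connection when either port is already in use.

```python
# Hypothetical software analogue of the control system: the optical
# matrix is modeled as a crossbar in which each source and each
# destination port can carry at most one connection at a time.

class Crossbar:
    def __init__(self, n=80):
        self.n = n
        self.links = {}            # src -> dst, both single-use

    def connect(self, src, dst):
        if src in self.links or dst in self.links.values():
            return False           # port busy: the controller rejects it
        self.links[src] = dst
        return True

    def disconnect(self, src):
        # Returns True if a link from src existed and was removed.
        return self.links.pop(src, None) is not None
```

The single-use constraint mirrors the physical fact that one matrix row (source fiber) and one column (destination fiber) can each carry only one light path at a time.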

  20. Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1984-01-01

    The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Digital data stored on computer-compatible tapes (CCT's) are particularly convenient for evaluation. The major advantages of CCT's are the quality of the data and their accessibility to computer manipulation. Minicomputer systems are widely used for the required processing operations, but microprocessor-related technological advances now make it possible to process CCT data with computing systems obtainable at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
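
    The edge-enhancement and smoothing operations named in the abstract are classic 3x3 neighborhood filters. The kernels below are standard textbook choices, not necessarily those implemented in DIAS.

```python
# Standard 3x3 neighborhood operations of the kind the abstract names:
# a box-average smoothing kernel and a Laplacian-style sharpening kernel.
# These specific kernels are illustrative assumptions.

SMOOTH = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]          # divide by 9
ENHANCE = [[0, -1, 0], [-1, 5, -1], [0, -1, 0]]     # sharpen

def convolve3x3(image, kernel, divisor=1):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]      # border pixels left unchanged
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            acc = sum(kernel[di][dj] * image[i - 1 + di][j - 1 + dj]
                      for di in range(3) for dj in range(3))
            out[i][j] = acc / divisor
    return out
```

Both kernels leave a uniform region unchanged (the box average of a constant is the constant; the sharpening weights sum to one), so they only act where the image actually varies.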

  1. Continuous fission-product monitor system at Oyster Creek. Final report

    SciTech Connect

    Collins, L.L.; Chulick, E.T.

    1980-10-01

    A continuous on-line fission product monitor has been installed at the Oyster Creek Nuclear Generating Station, Forked River, New Jersey. The on-line monitor is a minicomputer-controlled high-resolution gamma-ray spectrometer system. An intrinsic Ge detector scans a collimated sample line of coolant from one of the plant's recirculation loops. The minicomputer is a Nuclear Data 6620 system. Data were accumulated for the period from April 1979 through January 1980, the end of cycle 8 for the Oyster Creek plant. Accumulated spectra, an average of three a day, were stored on magnetic disk and subsequently analyzed for fission products. Because of difficulties in measuring absolute detector efficiency, quantitative fission product concentrations in the coolant could not be determined. Data for iodine fission products are reported as a function of time. The data indicate the existence of fuel defects in the Oyster Creek core during cycle 8.

  2. Electromechanical three-axis development for remote handling in the Hot Experimental Facility

    SciTech Connect

    Garin, J.; Bolfing, B.J.; Satterlee, P.E.; Babcock, S.M.

    1981-01-01

    A three-axis closed-loop position control system has been designed and installed on an overhead bridge, carriage, and tube hoist for automatic positioning of a manipulator at a remotely maintained work site. The system provides accurate (within 3 min) and repeatable three-axis positioning of the manipulator. The position control system has been interfaced to a supervisory minicomputer system that provides teach-playback capability of manipulator positioning and color graphic display of the three-axis system position.

  3. Smithsonian Astrophysical Observatory laser tracking systems

    NASA Technical Reports Server (NTRS)

    Pearlman, M. R.; Lanham, N. W.; Lehr, C. G.; Wohn, J.

    1977-01-01

    The four SAO laser satellite-ranging systems, located in Brazil, Peru, Australia, and Arizona, have been in operation for more than five years and have provided ranging data at accuracy levels of a meter or better. The paper examines system hardware (laser transmitter, the electronics, mount, photoreceiver, minicomputer, and station timing) and software (prediction program, calibration programs, and data handling and quick-look programs) and also considers calibration, station operation, and system performance.

  4. TMS communications software. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  5. Large aperture ac interferometer for optical testing.

    PubMed

    Moore, D T; Murray, R; Neves, F B

    1978-12-15

    A 20-cm clear aperture modified Twyman-Green interferometer is described. The system measures phase with an AC technique called phase-lock interferometry while scanning the aperture with a dual galvanometer scanning system. Position information and phase are stored in a minicomputer with disk storage. This information is manipulated with associated software, and the wavefront deformation due to a test component is graphically displayed in perspective and contour on a CRT terminal. PMID:20208642

  6. Justifying and Planning an Energy Monitoring System in an Existing Plant 

    E-print Network

    Stublen, A. P.; Wellman, C. M.; Kell, S. A.

    1998-01-01

    electronic meters, trip units, or relays that are installed in switchgear to monitor system operations and energy use. These units are then linked to multiplexers, which can report to a central PC or minicomputer. They can provide information on voltage... Lines as status inputs for a software-based mimic panel. Transformer and switchgear heater monitoring are included to allow reliability monitoring. Disturbance capture and harmonic snapshots are listed, although they may be extra cost features...

  7. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  8. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  9. Study of software application of airborne laser doppler system for severe storms measurement

    NASA Technical Reports Server (NTRS)

    Alley, P. L.

    1979-01-01

    Significant considerations are described for performing a Severe Storms Measurement program in real time. Particular emphasis is placed on the sizing and timing requirements for a minicomputer-based system. Analyses of several factors which could impact the effectiveness of the system are presented. The analyses encompass the problems of data acquisition, data storage, data registration, correlation, and flow field computation, as well as errors induced by aircraft motion, moment estimation, and pulse integration.

  10. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. The system provides two-dimensional graphic display of telemetric information and interaction with the computer for analysis and processing of the telemetric parameters displayed on the screen. The running parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen and the user language are discussed and illustrated.

  11. Data base design for a worldwide multicrop information system

    NASA Technical Reports Server (NTRS)

    Driggers, W. G.; Downs, J. M.; Hickman, J. R.; Packard, R. L. (principal investigators)

    1979-01-01

    A description of the USDA Application Test System data base design approach and resources is presented. The data are described in detail by category, with emphasis on those characteristics which influenced the design most. It was concluded that the use of a generalized data base in support of crop assessment is a sound concept. The IDMS11 minicomputer-based system is recommended for this purpose.

  12. Automation of the process of speech signal segmentation in an analogic-numeric system

    NASA Astrophysics Data System (ADS)

    Domagala, P.

    Eighteen Polish words uttered by 12 voices (7 male and 5 female) were tape-recorded and analyzed by computer. Numeric analysis of the dynamic spectrum was implemented using an algorithm composed of simple logical sentences on the MERA 303 minicomputer. Compared with the visual segmentation achieved in the spectrographic computer images, correctness of segmentation reached a level of about 94 percent. No differences were found in quality of segmentation between male and female utterances.

  13. The Kwasan Image Processing System.

    NASA Astrophysics Data System (ADS)

    Nakai, Y.; Kitai, R.; Asada, T.; Iwasaki, K.

    The Kwasan Image Processing System is a general purpose interactive image processing and analyzing system designed to process a large amount of photographic and photoelectric data. The hardware of the system mainly consists of a PDS MICRO-10 microdensitometer, a VAX-11/750 minicomputer, a 456-Mbyte Winchester disk, and a VS11 color-graphic terminal. The application programs "PDS, KIPS, STII" enable users to analyze spectrographic plates and two-dimensional images without specialized knowledge of programming.

  14. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  15. Future freeze forecasting

    NASA Technical Reports Server (NTRS)

    Bartholic, J. F.; Sutherland, R. A.

    1979-01-01

    Real time GOES thermal data acquisition, an energy balance minimum temperature prediction model and a statistical model are incorporated into a minicomputer system. These components make up the operational "Satellite Freeze Forecast System" being used to aid NOAA, NWS forecasters in developing their freeze forecasts. The general concept of the system is presented in this paper. Specific detailed aspects of the system can be found in the reference cited.

  16. Geometric assessment of image quality using digital image registration techniques

    NASA Technical Reports Server (NTRS)

    Tisdale, G. E.

    1976-01-01

    Image registration techniques were developed to perform a geometric quality assessment of multispectral and multitemporal image pairs. Based upon LANDSAT tapes, accuracies to a small fraction of a pixel were demonstrated. Because it is insensitive to the choice of registration areas, the technique is well suited to performance in an automatic system. It may be implemented at megapixel-per-second rates using a commercial minicomputer in combination with a special purpose digital preprocessor.

  17. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  18. Design of a microprocessor based packet switch

    NASA Technical Reports Server (NTRS)

    Carville, D. E.; Arozullah, M.

    1978-01-01

    The design of a packet switch incorporating a minicomputer to control and process data packets in a communications system has been reported. This paper proposes a design which uses a 6500 series microprocessor to realize the functions of a message switch in order to evaluate the viability of microprocessors in this application. The proposed design realizes the functions of header analysis, packet security, error control and line prioritization. The system hardware and software are presented together with a brief description of operation.

  19. Optimizing Xenix I/O

    SciTech Connect

    Bottorff, P.; Potts, B.

    1983-08-01

    High performance microprocessors, inexpensive Winchester disk drives and low cost high density dynamic random access memories are making it feasible to incorporate minicomputer operating systems such as Unix into multiuser/multitasking microcomputers. However, before Unix and its derivatives can be efficiently integrated into a microcomputer environment, certain I/O and memory management hardware design problems previously limited to larger computer systems must be solved. These are discussed.

  20. Dedicated multiprocessor system for calculating Josephson-junction noise thermometer frequency variances at high speed

    SciTech Connect

    Cutkosky, R.D.

    1983-07-01

    A Josephson-junction noise thermometer produces a sequence of frequency readings from whose variations the temperature of the thermometer may be calculated. A preprocessor system has been constructed to collect the frequency readings delivered to an IEEE 488 bus by an ordinary counter operating at up to 1000 readings per second, perform the required calculations, and send summary information to a desk calculator or minicomputer on another 488 bus at a more convenient rate.
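    The reduction step this preprocessor performs can be illustrated with a minimal sketch. The abstract does not specify which variance statistic is computed, so ordinary per-block sample statistics stand in here; the function name and block size are assumptions:

    ```python
    import statistics

    def summarize(readings, block=100):
        """Reduce a stream of frequency readings to per-block (mean, variance)
        pairs, so only compact summaries travel over the slower output bus.
        Hypothetical helper; the real preprocessor's statistic may differ."""
        out = []
        for i in range(0, len(readings) - block + 1, block):
            chunk = readings[i:i + block]
            out.append((statistics.fmean(chunk), statistics.variance(chunk)))
        return out
    ```

    At 1000 counter readings per second and a block of 100, a scheme like this would hand only ten summary records per second to the calculator or minicomputer on the second 488 bus.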

  1. TMS communications hardware. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Weinrich, S. S.

    1979-01-01

    A prototype coaxial cable bus communications system was designed to be used in the Trend Monitoring System (TMS) to connect intelligent graphics terminals (based around a Data General NOVA/3 computer) to a MODCOMP IV host minicomputer. The direct memory access (DMA) interfaces which were utilized for each of these computers are identified. It is shown that for the MODCOMP, an off-the-shelf board was suitable, while for the NOVAs, custom interface circuitry was designed and implemented.

  2. FTIR (Fourier transform infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    SciTech Connect

    Cox, J.N.; Sedayao, J.; Shergill, G.; Villasol, R. ); Haaland, D.M. )

    1990-01-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three such enhancements. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the result to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a "Partial Least Squares" analysis model. This model is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring will be discussed. 10 refs., 4 figs.

  3. The application of charge-coupled device processors in automatic-control systems

    NASA Technical Reports Server (NTRS)

    Mcvey, E. S.; Parrish, E. A., Jr.

    1977-01-01

    The application of charge-coupled device (CCD) processors to automatic-control systems is suggested. CCD processors are a new form of semiconductor component with the unique ability to process sampled signals on an analog basis. Specific implementations of controllers are suggested for linear time-invariant, time-varying, and nonlinear systems. Typical processing time should be only a few microseconds. This form of technology may become competitive with microprocessors and minicomputers in addition to supplementing them.

  4. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.

  5. More than mainframes

    SciTech Connect

    Daniels, H.A.; Mayor, N.

    1985-08-01

    This article describes the mix and match trend that is taking place in computer systems design today. Rather than relying on one mainframe or minicomputer to do all computing tasks, industries are blending microcomputers, minicomputers, and mainframes. Such a system can excel in manipulating large and lengthy calculations, a job usually done on mainframes, as well as simultaneously and repeatedly processing simple data in real time, work best handled by mini- and micro-computers. The use of such distributed architecture is discussed as it relates to the power utility industry. In the past few years, several developments have forced utilities to demand more from an energy-management system (EMS) than a lone mainframe or minicomputer can handle. Functions of power utility plants require computing power for both massive number crunching and real-time processing of data from many sources. The type of computer combinations in the EMS that best suit each utility are determined by how large or small the utility is. What architecture is best for which application is discussed.

  6. Close to real life. [solving for transonic flow about lifting airfoils using supercomputers

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Bailey, F. Ron

    1988-01-01

    NASA's Numerical Aerodynamic Simulation (NAS) facility for CFD modeling of highly complex aerodynamic flows employs as its basic hardware two Cray-2s, an ETA-10 Model Q, an Amdahl 5880 mainframe computer that furnishes both support processing and access to 300 Gbytes of disk storage, several minicomputers and superminicomputers, and a Thinking Machines 16,000-device 'connection machine' processor. NAS, which was the first supercomputer facility to standardize operating-system and communication software on all processors, has done important Space Shuttle aerodynamics simulations and will be critical to the configurational refinement of the National Aerospace Plane and its integrated powerplant, which will involve complex, high-temperature reactive gasdynamic computations.

  7. ECG-gated emission computed tomography of the cardiac blood pool

    SciTech Connect

    Moore, M.L.; Murphy, P.H.; Burdine, J.A.

    1980-01-01

    ECG-gated cross-sectional images of the cardiac blood pool were produced using a specially constructed emission computed tomographic scanner. A pair of large-field-of-view cameras were mounted in opposition in a gantry that rotates 360° about the patient. The coordinates of each detected event, the output of a physiological synchronizer, and the position of the camera heads were input to a dedicated minicomputer which was used to produce the images. Display as a movie permitted evaluation of regional and global wall motion in cross section without the disadvantages of superimposed blood pools as obtained in nontomographic views.

  8. Evaluation of initial collector field performance at the Langley Solar Building Test Facility

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.; Jensen, R. N.; Knoll, R. H.

    1977-01-01

    The thermal performance of the solar collector field for the NASA Langley Solar Building Test Facility is given for October 1976 through January 1977. A 1,180 square meter solar collector field with seven collector designs helped to provide hot water for the building heating system and absorption air conditioner. The collectors were arranged in 12 rows with nominally 51 collectors per row. Heat transfer rates for each row were calculated and recorded along with sensor, insolation, and weather data every five minutes using a minicomputer. The agreement between the experimental and predicted collector efficiencies was generally within five percentage points.
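    The row-by-row heat transfer reduction the minicomputer performed every five minutes is not spelled out in the abstract, but a standard instantaneous-efficiency calculation captures the idea. This is a sketch with hypothetical names; the units and the water heat capacity value are assumptions:

    ```python
    def collector_efficiency(m_dot, t_in, t_out, insolation, area, cp=4186.0):
        """Useful heat gain of one collector row divided by the solar power
        incident on it. m_dot in kg/s, temperatures in deg C, insolation in
        W/m^2, row aperture area in m^2, cp for water in J/(kg K)."""
        q_useful = m_dot * cp * (t_out - t_in)   # W extracted by the fluid
        return q_useful / (insolation * area)    # dimensionless efficiency
    ```

    Logging a figure like this alongside the sensor, insolation, and weather data is what allows the experimental efficiencies to be compared against predictions to within a few percentage points.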

  9. New kind of user interface for controlling MFTF diagnostics

    SciTech Connect

    Preckshot, G.G.; Saroyan, R.A.; Mead, J.E.

    1983-11-29

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation gives each physicist entry into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook.

  10. A convenient and adaptable package of computer programs for DNA and protein sequence management, analysis and homology determination.

    PubMed Central

    Pustell, J; Kafatos, F C

    1984-01-01

    We describe the further development of a widely used package of DNA/protein sequence analysis programs (1). Important revisions have been made based on user experience, and new features, multi-user capability, and a set of large scale homology programs have been added. The programs are very user friendly, economical of time and memory, and extremely transportable. They are written in a version of FORTRAN which will compile, with a few defined changes, as FORTRAN 66, FORTRAN 77, FORTRAN IV, FORTRAN IV+, and others. They are running on a variety of microcomputers, minicomputers, and mainframes, in both single user and multi-user configurations. PMID:6320100

  11. Gamma-resonance system on line with microcomputer

    SciTech Connect

    Bil'dyukevich, E.V.; Gurachevskii, V.L; Litvinovich, Y.A.; Mashlan, M.; Misevich, O.V.

    1986-05-01

    This paper describes a system that consists of a modernized YaGRS-4M spectrometer interfaced with an Elektronika-60 microcomputer, which operates as a terminal for SM-4 and Elektronika-100I minicomputers. It is shown that organization of storage and real-time display of Mossbauer spectra by direct memory access, which completely eliminates losses of physical data, frees the processor of the microcomputer for solution of problems that put an experiment on-line. The advantages of a multi-level system for automation of gamma-resonance experiments are discussed.

  12. Control system theory of operation

    SciTech Connect

    Not Available

    1982-01-01

    Control of the field of heliostats is accomplished by means of a distributed computer control system consisting of a minicomputer located in the plant control room and a network of data buses and microcomputer-based controllers located at the heliostats. The reflective surface on each heliostat is rotated about azimuth and elevation axes by means of a gear-drive unit and electric motors. The actual azimuth and elevation angles are determined by means of incremental optical encoders and a microcomputer, and the microcomputer provides the logic to turn the drive motors on and off as required.

  13. Alternatives in the complement and structure of NASA teleprocessing resources

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results are presented of a program to identify technical innovations which would have an impact on NASA data processing and describe as fully as possible the development work necessary to exploit them. Seven of these options for NASA development, as the opportunities to participate in and enhance the advancing information system technology were called, are reported. A detailed treatment is given of three of the options, involving minicomputers, mass storage devices and software development techniques. These areas were picked by NASA as having the most potential for improving their operations.

  14. Optical instrumentation engineering in science, technology and society; Proceedings of the Sixteenth Annual Technical Meeting, San Mateo, Calif., October 16-18, 1972

    NASA Technical Reports Server (NTRS)

    Katz, Y. H.

    1973-01-01

    Visual tracking performance in instrumentation is discussed together with photographic pyrometry in an aeroballistic range, optical characteristics of spherical vapor bubbles in liquids, and the automatic detection and control of surface roughness by coherent diffraction patterns. Other subjects explored are related to instruments, sensors, systems, holography, and pattern recognition. Questions of data handling are also investigated, taking into account minicomputer image storage for holographic interferometry analysis, the design of a video amplifier for a 90 MHz bandwidth, and autostereoscopic screens. Individual items are announced in this issue.

  15. Microcontroller interface for diode array spectrometry

    NASA Astrophysics Data System (ADS)

    Aguo, L.; Williams, R. R.

    An alternative to bus-based computer interfacing is presented using diode array spectrometry as a typical application. The new interface consists of an embedded single-chip microcomputer, known as a microcontroller, which provides all necessary digital I/O and analog-to-digital conversion (ADC) along with an unprecedented amount of intelligence. Communication with a host computer system is accomplished by a standard serial interface, so this type of interfacing is applicable to a wide range of personal computers and minicomputers and can be easily networked. Data are acquired asynchronously and sent to the host on command. New operating modes which have no traditional counterparts are presented.

  16. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  17. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan is described for the integrated testing and benchmarking of Phase 1 of the Ada based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX mini-computers and VAXstations running the VMS operating system.

  18. Impact for the 80's, proceedings of a conference on selected technology for business and industry, 1980

    SciTech Connect

    Not Available

    1980-01-01

    This conference proceedings contains 26 papers on selected technologies derived from activities at the NASA Lewis Research Center. Subjects covered were ground-based energy (an overview), aircraft propulsion (an overview), wind power commercialization, materials and structures, lubrication and bearings, Stirling and gas turbine engines, electric and hybrid vehicles, coal gasification and cogeneration, solar photovoltaics, materials processing in space, technology transfer, ion beam applications, magnetic heat pump, long-life cathode and traveling wave tube, multiroller traction drive, general-aviation aircraft engines, redox, laser applications, advanced battery systems, and minicomputers and microprocessors. Fifteen papers are indexed and abstracted separately.

  19. User's operating procedures. Volume 3: Projects directorate information programs

    NASA Technical Reports Server (NTRS)

    Haris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is presented. SPADS is the result of the past seven years of software development on a Prime mini-computer. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, three of three, provides the instructions to operate the projects directorate information programs in data retrieval and file maintenance via the user friendly menu drivers.

  20. System of Programmed Modules for Measuring Photographs with a Gamma-Telescope

    NASA Technical Reports Server (NTRS)

    Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.

    1978-01-01

    Physical experiments using tracking cameras have produced hundreds of thousands of stereo photographs of events. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Science of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector on line with the Elektronika 1001 minicomputer. The report describes this system. The various computer programs available to the operators are discussed.

  1. Registration of Heat Capacity Mapping Mission day and night images

    NASA Technical Reports Server (NTRS)

    Watson, K.; Hummer-Miller, S.; Sawatzky, D. L. (principal investigators)

    1982-01-01

    Neither iterative registration, using drainage intersection maps for control, nor cross correlation techniques were satisfactory in registering day and night HCMM imagery. A procedure was developed which registers the image pairs by selecting control points and mapping the night thermal image to the daytime thermal and reflectance images using an affine transformation on a 1300 by 1100 pixel image. The resulting image registration is accurate to better than two pixels (RMS) and does not exhibit the significant misregistration that was noted in the temperature-difference and thermal-inertia products supplied by NASA. The affine transformation was determined using simple matrix arithmetic, a step that can be performed rapidly on a minicomputer.
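
    The affine control-point mapping described above reduces to a small least-squares problem: each control point contributes one row to a linear system for the six transform coefficients. The sketch below (a NumPy illustration under assumed function names, not the authors' original implementation) shows the "simple matrix arithmetic" involved.

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Fit a 2x3 affine transform mapping src -> dst control points.

        src, dst: (N, 2) arrays of matched control points, N >= 3.
        Solves x' = a*x + b*y + c and y' = d*x + e*y + f by least squares.
        """
        src = np.asarray(src, float)
        dst = np.asarray(dst, float)
        A = np.column_stack([src, np.ones(len(src))])  # (N, 3) design matrix
        M, *_ = np.linalg.lstsq(A, dst, rcond=None)    # (3, 2) coefficients
        return M.T                                      # rows: [a b c], [d e f]

    def apply_affine(M, pts):
        """Map (N, 2) pixel coordinates through the 2x3 affine matrix M."""
        pts = np.asarray(pts, float)
        return pts @ M[:, :2].T + M[:, 2]
    ```

    With more than three control points the least-squares fit averages out individual pointing errors, which is how a sub-two-pixel RMS registration becomes achievable on a 1300 by 1100 pixel image.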

  2. User's manual for the Functional Relay Operation Monitor (FROM)

    SciTech Connect

    Gustke, F.R.

    1981-02-01

    Sandia's Digital Systems Development Division 1521 has developed a new functional relay tester. Capabilities of this tester include the measurement of coil and contact resistance, hipot, operate current, and contact operation and bounce times. The heart of the tester is a Hewlett-Packard 21MX minicomputer that uses BASIC or FORTRAN programming languages. All measurements are made by means of simple program calls, and all measurement standards are traceable to the National Bureau of Standards. Functional relay test data are stored on a disc drive and can be output as hard copy, manipulated in the computer, or sent over a distributed-system link to other Sandia computers. 17 figures, 4 tables.

  3. Tubular heat exchanger design. Complement to the report MT 131

    NASA Astrophysics Data System (ADS)

    Vandeberghe, F.

    1980-11-01

    An interactive program for a minicomputer which calculates the thermal performance of shell and tube heat exchangers was written. The algorithms used and the program data flow are described. Heat transfer and pressure drop correlations were assembled from the literature to aid in limiting the overdimensioning of heat exchangers. The user can solve design problems by updating geometrical input parameters until the desired performance criteria are reached. The behavior of a given heat exchanger under partial-load or overload conditions, or with different fluids, can be checked by changing the general performance criteria or fluid numbers. An example of a marine oil cooler illustrates use of the program.

  4. User's operating procedures. Volume 2: Scout project financial analysis program

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two of three, provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  5. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.

  6. User's operating procedures. Volume 1: Scout project information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS is given. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office. SPADS was developed as a single entry, multiple cross reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. The instructions to operate the Scout Project Information programs in data retrieval and file maintenance via the user friendly menu drivers is presented.

  7. Missouri Automated Radiology System: a dynamic, interactive diagnostic and management system for radiant images.

    PubMed

    Lodwick, G S; Tully, R J; Markivee, C R; Hakimi, B R; Dittrich, F J

    1977-01-01

    Missouri Automated Radiology System has functioned in full support of the Department of Radiology for more than 7 years. For the past 5 years, MARS has functioned as a minicomputer system on a DEC (Digital Equipment Corporation) PDP-15 computer. While continuing to effectively support the department, in daily use by 20 staff and 15 resident physicians, MARS has continued to function in a research and development mode. With the continuous development of new applications, MARS is now essential to the function of the department and has again proven the point that physicians and computers can function symbiotically in the medical environment. PMID:10297278

  8. High-speed simulation of transients in nuclear power plants

    SciTech Connect

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.

    1984-01-01

    A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transients and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results for a BWR plant simulation are shown to demonstrate computing capacity, accuracy, and speed. Simulation speeds have been achieved which are 110 times greater than those of a CDC-7600 mainframe computer, or ten times greater than real-time speed.

  9. Application of image processing techniques to fluid flow data analysis

    NASA Technical Reports Server (NTRS)

    Giamati, C. C.

    1981-01-01

    The application of color coding techniques used in processing remote sensing imagery to analyze and display fluid flow data is discussed. A minicomputer-based color film recording and color CRT display system is described. High-quality, high-resolution images of two-dimensional data are produced on the film recorder. Three-dimensional data, in large volume, are used to generate color motion pictures in which time represents the third dimension. Several applications and examples are presented. System hardware and software are described.

  10. A new system for observing solar oscillations at the Mount Wilson Observatory. I - System design and installation

    NASA Technical Reports Server (NTRS)

    Rhodes, E. J., Jr.; Howard, R. F.; Ulrich, R. K.; Smith, E. J.

    1983-01-01

    An observation system designed to obtain daily measurements of solar photospheric and subphotospheric rotational velocities, from the frequency splitting of nonradial solar p-mode oscillations of degree greater than 150, is nearing completion at the Mount Wilson Observatory. The system will combine a 244 x 248 pixel CID camera with a high-speed floating-point array processor, a 32-bit minicomputer, and a large-capacity disk storage system. These components will be integrated into the spectrograph of the 60-foot solar tower telescope at Mount Wilson.

  11. Computerized nuclear material system at Sandia National Laboratories

    SciTech Connect

    Tischhauser, J.L.

    1980-01-01

    SNLA developed and implemented a nuclear material control and accountability system on an HP 3000 minicomputer. The Sandia Nuclear Materials Computer System (SNMCS), which became operative in January 1980, provides: control of shipments and receipts of nuclear material, control of internal transfers of nuclear material, automated inventory with a bar code system, control of inventory adjustments, automated reporting/transmitting to other contractors and operations offices, automated ledgers and journals for material weights and costs, and an interface to the Albuquerque Operations Office (ALO) Automated 741 System.

  12. An experimental study of a hybrid adaptive control system

    NASA Technical Reports Server (NTRS)

    Lizewski, E. F.; Monopoli, R. V.

    1974-01-01

    A Liapunov type model reference adaptive control system with five adjustable gains is implemented using a PDP-11 digital computer and an EAI 380 analog computer. The plant controlled is a laboratory-type dc servo system. It is made to follow closely a second-order linear model. The experimental results demonstrate the feasibility of implementing this rather complex design using only a minicomputer and a reasonable number of operational amplifiers. They also show that satisfactory performance can be achieved even when certain assumptions necessary for the theory are not satisfied.
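
    The Liapunov-rule adaptation described above can be illustrated on a simpler case than the actual experiment. The sketch below (an assumed first-order plant with two adjustable gains and illustrative constants, rather than the paper's five-gain, second-order dc servo) adapts the gains so that V' = -am*e**2 <= 0 for the usual quadratic Liapunov function, driving the model-following error toward zero.

    ```python
    import numpy as np

    def simulate_mrac(T=200.0, dt=0.01, gamma=2.0):
        """Liapunov-rule model reference adaptive control of a first-order
        plant  y' = -a*y + b*u  tracking the model  ym' = -am*ym + bm*r."""
        a, b = 1.0, 0.5          # "unknown" plant (used only to simulate it)
        am, bm = 2.0, 2.0        # reference model the plant should follow
        y = ym = th1 = th2 = 0.0
        t = np.arange(0.0, T, dt)
        e = np.empty(len(t))
        for i, ti in enumerate(t):
            r = np.sin(ti)                 # persistently exciting reference
            u = th1 * r - th2 * y          # adjustable-gain control law
            e[i] = y - ym                  # model-following error
            # Liapunov adaptation law: guarantees V' = -am*e**2 <= 0
            th1 -= gamma * e[i] * r * dt
            th2 += gamma * e[i] * y * dt
            y += (-a * y + b * u) * dt     # Euler integration of plant
            ym += (-am * ym + bm * r) * dt  # and of reference model
        return e
    ```

    Running the loop shows the error amplitude decaying as the gains approach their ideal values th1* = bm/b and th2* = (am - a)/b, mirroring the convergence behavior observed in the hybrid experiment.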

  13. The library and its home computer: automation as if people mattered.

    PubMed Central

    Avriel, D

    1983-01-01

    To provide its users with quick and easy access to the library resources, the Muriel and Philip Berman National Medical Library, Jerusalem, between 1978 and 1982 developed an integrated library system (MAIMON) on a minicomputer. Because humans are the most important element of the library system, MAIMON's performance was evaluated in terms of benefits provided to patrons, library management, and library staff. After successfully adopting the system, users' needs and expectations have grown. How the existing system will be used and expanded to meet the new information demands at the library is discussed. Images PMID:6626802

  14. The library and its home computer: automation as if people mattered.

    PubMed

    Avriel, D

    1983-07-01

    To provide its users with quick and easy access to the library resources, the Muriel and Philip Berman National Medical Library, Jerusalem, between 1978 and 1982 developed an integrated library system (MAIMON) on a minicomputer. Because humans are the most important element of the library system, MAIMON's performance was evaluated in terms of benefits provided to patrons, library management, and library staff. After successfully adopting the system, users' needs and expectations have grown. How the existing system will be used and expanded to meet the new information demands at the library is discussed. PMID:6626802

  15. Automation of internal library operations in academic health sciences libraries: a state of the art report.

    PubMed

    Grefsheim, S F; Larson, R H; Bader, S A; Matheson, N W

    1982-04-01

    A survey of automated records management in the United States and Canada was developed to identify existing on-line library systems and technical expertise. Follow-up interviews were conducted with ten libraries. Tables compare the features and availability of four mainframe and four minicomputer systems. Results showed: a trend toward vendor-supplied systems; little coordination of efforts among schools; current system developments generally on a universitywide basis; and the importance of having the cooperation of campus computer facilities to the success of automation efforts. PMID:7066571

  16. Automation of internal library operations in academic health sciences libraries: a state of the art report.

    PubMed Central

    Grefsheim, S F; Larson, R H; Bader, S A; Matheson, N W

    1982-01-01

    A survey of automated records management in the United States and Canada was developed to identify existing on-line library systems and technical expertise. Follow-up interviews were conducted with ten libraries. Tables compare the features and availability of four mainframe and four minicomputer systems. Results showed: a trend toward vendor-supplied systems; little coordination of efforts among schools; current system developments generally on a universitywide basis; and the importance of having the cooperation of campus computer facilities to the success of automation efforts. PMID:7066571

  17. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, fast Fourier transform analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
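
    One common leakage-correction strategy (illustrative of the problem the abstract mentions, not necessarily Rocketdyne's method) is to window the record and recover a sinusoid's amplitude from the spectral energy summed around the peak, since leakage spreads that energy over neighboring bins:

    ```python
    import numpy as np

    def sine_amplitude(x, bins=3):
        """Estimate a sinusoid's amplitude from a Hann-windowed FFT,
        correcting for leakage by summing energy in bins around the peak.

        Uses Parseval's relation: the windowed tone's energy near the
        positive-frequency peak equals N * A**2 / 4 * sum(w**2)."""
        n = len(x)
        w = np.hanning(n)
        X = np.fft.rfft(x * w)
        k = int(np.argmax(np.abs(X)))
        lo, hi = max(k - bins, 0), min(k + bins + 1, len(X))
        e = np.sum(np.abs(X[lo:hi]) ** 2)          # energy around the peak
        return np.sqrt(4.0 * e / (n * np.sum(w ** 2)))
    ```

    Because the energy sum spans the window's main lobe, the estimate is nearly insensitive to where the tone falls between FFT bins, unlike a naive single-bin peak readout.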

  18. U. S. GEOLOGICAL SURVEY'S NATIONAL REAL-TIME HYDROLOGIC INFORMATION SYSTEM USING GOES SATELLITE TECHNOLOGY.

    USGS Publications Warehouse

    Shope, William G., Jr.

    1987-01-01

    The U. S. Geological Survey maintains the basic hydrologic data collection system for the United States. The Survey is upgrading the collection system with electronic communications technologies that acquire, telemeter, process, and disseminate hydrologic data in near real-time. These technologies include satellite communications via the Geostationary Operational Environmental Satellite, Data Collection Platforms in operation at over 1400 Survey gaging stations, Direct-Readout Ground Stations at nine Survey District Offices, and a network of powerful minicomputers that allows data to be processed and disseminated quickly.

  19. MORPH-I (Ver 1.0) a software package for the analysis of scanning electron micrograph (binary formatted) images for the assessment of the fractal dimension of enclosed pore surfaces

    USGS Publications Warehouse

    Mossotti, Victor G.; Eldeeb, A. Raouf; Oscarson, Robert

    1998-01-01

    MORPH-I is a set of C-language computer programs for the IBM PC and compatible minicomputers. The programs in MORPH-I are used for the fractal analysis of scanning electron microscope and electron microprobe images of pore profiles exposed in cross-section. The program isolates and traces the cross-sectional profiles of exposed pores and computes the Richardson fractal dimension for each pore. Other programs in the set provide for image calibration, display, and statistical analysis of the computed dimensions for highly complex porous materials. Requirements: IBM PC or compatible; minimum 640 K RAM; mathcoprocessor; SVGA graphics board providing mode 103 display.
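
    The Richardson dimension computed by MORPH-I measures how a traced pore perimeter grows as the measuring "ruler" shrinks: perimeter scales as P(s) ~ s**(1-D). The following Python sketch (a hypothetical re-expression of the divider method, not MORPH-I's C code) estimates D from a closed profile by coarsening the trace and fitting the log-log slope:

    ```python
    import numpy as np

    def richardson_dimension(profile, steps=(1, 2, 4, 8, 16)):
        """Estimate the Richardson (divider) fractal dimension of a closed
        pore profile given as an (N, 2) array of boundary coordinates."""
        profile = np.asarray(profile, float)
        rulers, lengths = [], []
        for k in steps:
            pts = profile[::k]                 # coarsen the trace
            seg = np.diff(np.vstack([pts, pts[:1]]), axis=0)  # close loop
            d = np.hypot(seg[:, 0], seg[:, 1])
            rulers.append(d.mean())            # effective ruler length
            lengths.append(d.sum())            # measured perimeter
        # Perimeter scales as P(s) ~ s**(1 - D); fit the log-log slope.
        slope = np.polyfit(np.log(rulers), np.log(lengths), 1)[0]
        return 1.0 - slope
    ```

    A smooth outline (e.g., a circle) yields D near 1, while a rough pore profile yields D between 1 and 2, which is what makes the dimension useful for characterizing complex porous materials.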

  20. Operator Station Design System - A computer aided design approach to work station layout

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.

    1979-01-01

    The Operator Station Design System is resident in NASA's Johnson Space Center Spacecraft Design Division Performance Laboratory. It includes stand-alone minicomputer hardware and Panel Layout Automated Interactive Design and Crew Station Assessment of Reach software. The data base consists of the Shuttle Transportation System Orbiter Crew Compartment (in part), the Orbiter payload bay and remote manipulator (in part), and various anthropometric populations. The system is utilized to provide panel layouts, assess reach and vision, determine interference and fit problems early in the design phase, study design applications as a function of anthropometric and mission requirements, and to accomplish conceptual design to support advanced study efforts.

  1. Programming for energy monitoring/display system in multicolor lidar system research

    NASA Technical Reports Server (NTRS)

    Alvarado, R. C., Jr.; Allen, R. J.

    1982-01-01

    The Z80 microprocessor based computer program that directs and controls the operation of the six channel energy monitoring/display system that is a part of the NASA Multipurpose Airborne Differential Absorption Lidar (DIAL) system is described. The program is written in the Z80 assembly language and is located on EPROM memories. All source and assembled listings of the main program, five subroutines, and two service routines along with flow charts and memory maps are included. A combinational block diagram shows the interfacing (including port addresses) between the six power sensors, displays, front panel controls, the main general purpose minicomputer, and this dedicated microcomputer system.

  2. Data acquisition and command system for use with a microprocessor-based control chassis. [PIGMI-Pion Generation for Medical Irradiations

    SciTech Connect

    Halbig, J.K.; Klosterbuer, S.F.; Martinez, V.A. Jr.

    1980-01-01

    The Pion Generation for Medical Irradiations (PIGMI) program at the Los Alamos Scientific Laboratory is developing the technology to build smaller, less expensive, and more reliable proton linear accelerators for medical applications, and has designed a powerful, simple, inexpensive, and reliable control and data acquisition system that is central to the program development. The system is a NOVA-3D minicomputer interfaced to several outlying microprocessor-based controllers, which accomplish control and data acquisition through data I/O chassis. The equipment interface chassis, which can issue binary commands, read binary data, issue analog commands, and read timed and untimed analog data, is described.

  3. APSAS; an Automated Particle Size Analysis System

    USGS Publications Warehouse

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett-Packard 2100 MX (HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
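
    The method-of-moments statistics mentioned above are weighted moments of the grain-size distribution in phi units. A minimal sketch (function name and input layout are assumptions; APSAS itself ran in HP minicomputer software) computes the standard four:

    ```python
    def moment_statistics(phi_midpoints, weight_pct):
        """Method-of-moments grain-size statistics in phi units.

        phi_midpoints: midpoint of each size class (phi).
        weight_pct:    weight percent retained in each class.
        Returns (mean, sorting, skewness, kurtosis)."""
        total = sum(weight_pct)
        f = [w / total for w in weight_pct]            # normalize weights
        mean = sum(p * w for p, w in zip(phi_midpoints, f))
        var = sum(w * (p - mean) ** 2 for p, w in zip(phi_midpoints, f))
        std = var ** 0.5                               # "sorting"
        skew = sum(w * (p - mean) ** 3
                   for p, w in zip(phi_midpoints, f)) / std ** 3
        kurt = sum(w * (p - mean) ** 4
                   for p, w in zip(phi_midpoints, f)) / std ** 4
        return mean, std, skew, kurt
    ```

    A perfectly symmetric distribution gives zero skewness, which is a convenient sanity check when combining the coarse- and fine-fraction data into one distribution.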

  4. UNIX-based data management system for the Mobile Satellite Propagation Experiment (PiFEx)

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.

    1987-01-01

    A new method is presented for handling data resulting from Mobile Satellite propagation experiments such as the Pilot Field Experiment (PiFEx) conducted by JPL. This method uses the UNIX operating system and C programming language. The data management system is implemented on a VAX minicomputer. The system automatically divides the large data file housing data from various experiments under a predetermined format into various individual files containing data from each experiment. The system also has a number of programs written in C and FORTRAN languages to allow the researcher to obtain meaningful quantities from the data at hand.
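
    The automatic division of a master data file into per-experiment files can be sketched in a few lines. The record format below is purely hypothetical (the real PiFEx format is "predetermined" but not specified in the abstract), and the sketch uses Python rather than the system's C/FORTRAN mix:

    ```python
    import os

    def split_by_experiment(master_path, out_dir, id_field=0):
        """Split a master data file into one file per experiment.

        Assumes (hypothetically) whitespace-delimited records whose
        first field is an experiment identifier."""
        os.makedirs(out_dir, exist_ok=True)
        handles = {}
        try:
            with open(master_path) as f:
                for line in f:
                    if not line.strip():
                        continue                       # skip blank lines
                    exp_id = line.split()[id_field]
                    if exp_id not in handles:          # open output lazily
                        path = os.path.join(out_dir, exp_id + ".dat")
                        handles[exp_id] = open(path, "w")
                    handles[exp_id].write(line)
        finally:
            for h in handles.values():
                h.close()
        return sorted(handles)                         # experiment IDs seen
    ```

    Keeping one open handle per experiment means the master file is read only once, which matters when the input is large, as in the propagation experiments described.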

  5. A real time programmable data compression system for video data.

    NASA Technical Reports Server (NTRS)

    Kutz, R. L.; Davisson, L. D.

    1971-01-01

    Description of the implementation of a data compression system for the real-time operational transmission (through microwave links) of ATS satellite pictures between the command and data acquisition station and a central location for computer processing. The system features the use of general-purpose minicomputers for encoding and decoding; this makes it possible to vary the employed data compression technique and to make simultaneous statistical calculations on the data. Data compression and expansion is accomplished in a manner that does not lower data quality.

  6. Effects of misalignment on mechanical behavior of metals in creep

    NASA Technical Reports Server (NTRS)

    Wu, H. C.

    1981-01-01

    Creep tests were conducted by means of a closed loop servocontrolled materials test system. The strain history prior to creep is carefully monitored. Tests were performed for aluminum alloy 6061-O at 150 C and were monitored by a PDP 11/04 minicomputer at a preset constant plastic strain rate prehistory. The results show that the plastic strain rate prior to creep plays a significant role in creep behavior. The endochronic theory of viscoplasticity was applied to describe the observed creep curves. Intrinsic time and strain rate sensitivity function concepts are employed and modified according to the present observation.

  7. Implementation of the Integrated Library System: University of Maryland Health Sciences Library.

    PubMed

    Feng, C C; Freiburger, G; Knudsen, P C

    1983-07-01

    The Health Sciences Library, University of Maryland, has implemented the Integrated Library System (ILS), a minicomputer-based library automation system developed by the Lister Hill National Center for Biomedical Communications, National Library of Medicine. The process of moving a library from a manual to a computerized system required comprehensive planning and strong commitment by the staff. Implementation activities included hardware and software modification, conversion of manual files, staff training, and system publicity. ILS implementation resulted in major changes in procedures in the circulation, reference, and cataloging departments. PMID:6688748

  8. Implementation of the Integrated Library System: University of Maryland Health Sciences Library.

    PubMed Central

    Feng, C C; Freiburger, G; Knudsen, P C

    1983-01-01

    The Health Sciences Library, University of Maryland, has implemented the Integrated Library System (ILS), a minicomputer-based library automation system developed by the Lister Hill National Center for Biomedical Communications, National Library of Medicine. The process of moving a library from a manual to a computerized system required comprehensive planning and strong commitment by the staff. Implementation activities included hardware and software modification, conversion of manual files, staff training, and system publicity. ILS implementation resulted in major changes in procedures in the circulation, reference, and cataloging departments. PMID:6688748

  9. A computer system to analyze showers in nuclear emulsions: Center Director's discretionary fund report

    NASA Technical Reports Server (NTRS)

    Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.

    1987-01-01

    A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions through a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.

  10. GEM: Statistical weather forecasting procedure

    NASA Technical Reports Server (NTRS)

    Miller, R. G.

    1983-01-01

    The objective of the Generalized Exponential Markov (GEM) Program was to develop a weather forecast guidance system that would: predict, from 0 to 6 hours ahead, all elements in the airways observations; respond instantly to the latest observed conditions of the surface weather; process these observations at local sites on minicomputing equipment; exceed the accuracy of current persistence predictions at the shortest prediction of one hour and beyond; exceed the accuracy of current forecast model output statistics inside eight hours; and be capable of making predictions at one location for all locations where weather information is available.

  11. Computer control and data management in an LSI fabrication facility

    SciTech Connect

    Doyal, L. A.; Weaver, D. L.; Gwyn, C. W.

    1980-06-01

    A minicomputer system is used to control diffusion furnaces, monitor temperatures, provide operator instructions for each processing step, and record detailed process histories for wafer lots fabricated in the Sandia Semiconductor Development Laboratory. The system provides a complete data base for laboratory operations, a variety of displays describing equipment status, scheduling and utilization summaries for equipment, wafer and mask inventories, and laboratory management information. The wafer lot history includes a record indicating the operator, time, date, and specification recipe for each process step, special notes summarizing process deviations, results of inspection steps, and in-line capacitance, oxide thickness, or resistivity measurements.

  12. Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems

    NASA Technical Reports Server (NTRS)

    Dewey, D.; Ricker, G. R.

    1980-01-01

    The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

  13. Networking of microcomputers in the radiology department.

    PubMed

    Markivee, C R

    1985-10-01

    A microcomputer may be installed in any of several areas in a radiology department or office to automate data processing. Such areas include the reception desk, the transcription office, the quality-control station, and remote or satellite radiography rooms. Independent microcomputers can be interconnected by networking, using small hardware and software packages and cables, to effect communication between them, afford access to a common data base, and share peripheral devices such as hard disks and printers. A network of microcomputers can perform many of the functions of a larger minicomputer system at lower cost and can be assembled in small modules as budgetary constraints allow. PMID:3876011

  14. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, B.W.; Willenborg, D.L.

    1980-02-12

    A manipulator is disclosed which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, which allows inspecting of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive directed by a minicomputer which can be programmed to generate any desired scan pattern. 8 figs.

  15. Manipulator for rotating and examining small spheres

    DOEpatents

    Weinstein, Berthold W. [Livermore, CA; Willenborg, David L. [Livermore, CA

    1980-02-12

    A manipulator which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, which allows inspecting of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive directed by a minicomputer which can be programmed to generate any desired scan pattern.

  16. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  17. Quantitative analysis of defects in silicon. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.

    1980-01-01

    One hundred seventy-four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP 11/03 minicomputer with dual floppy disc drive, a DECwriter high-speed printer, and a field-image feature interface module. Three versions of a computer program that controls the data acquisition and analysis on the QTM-720 were written. Procedures for chemical polishing and etching were also developed.

  18. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine by using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated by using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.
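
    The linear quadratic regulator synthesis mentioned above picks a state-feedback gain K minimizing a quadratic cost on state and control effort. A minimal discrete-time sketch (generic Riccati iteration in NumPy; the matrices below are illustrative, not the F100 engine model) is:

    ```python
    import numpy as np

    def dlqr(A, B, Q, R, iters=500):
        """Discrete-time LQR gain via fixed-point iteration of the
        Riccati equation:  P <- Q + A'P(A - BK),  K = (R + B'PB)^-1 B'PA."""
        P = Q.copy()
        for _ in range(iters):
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
        return K
    ```

    Applying u = -Kx places the closed-loop eigenvalues of A - BK inside the unit circle, which is the property the synthesized control relies on; the actual engine control adds the failure checks and executive sequencing described above around such a regulator.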

  19. Prescriptive concepts for advanced nuclear materials control and accountability systems

    SciTech Connect

    Whitty, W.J.; Strittmatter, R.B.; Ford, W.; Tisinger, R.M.; Meyer, T.H.

    1987-06-01

    Networking- and distributed-processing hardware and software have the potential of greatly enhancing nuclear materials control and accountability (MC and A) systems, from both safeguards and process operations perspectives, while allowing timely integrated safeguards activities and enhanced computer security at reasonable cost. A hierarchical distributed system is proposed consisting of groups of terminals and instruments in plant production and support areas connected to microprocessors that are connected to either larger microprocessors or minicomputers. These micros and/or minis are connected to a main machine, which might be either a mainframe or a super minicomputer. Data acquisition, preliminary input data validation, and transaction processing occur at the lowest level. Transaction buffering, resource sharing, and selected data processing occur at the intermediate level. The host computer maintains overall control of the data base and provides routine safeguards and security reporting and special safeguards analyses. The research described outlines the distribution of MC and A system requirements across the hierarchical system and the application of distributed processing to MC and A. Implications of integrated safeguards and computer security concepts for the distributed system design are discussed. 10 refs., 4 figs.

  20. Digital system for structural dynamics simulation

    SciTech Connect

    Krauter, A.I.; Lagace, L.J.; Wojnar, M.K.; Glor, C.

    1982-11-01

    State-of-the-art digital hardware and software for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems), were incorporated in a system designed to use an array of processors, in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communications busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interface of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  1. Digital system for structural dynamics simulation

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.

    1982-01-01

    State-of-the-art digital hardware and software for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems), were incorporated in a system designed to use an array of processors, in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communications busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interface of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  2. VLSI research

    NASA Astrophysics Data System (ADS)

    Brodersen, R. W.

    1984-04-01

    A scaled version of the RISC II chip has been fabricated and tested and these new chips have a cycle time that would outperform a VAX 11/780 by about a factor of two on compiled integer C programs. The architectural work on a RISC chip designed for a Smalltalk implementation has been completed. This chip, called SOAR (Smalltalk On a RISC), should run programs 4-15 times faster than the Xerox 1100 (Dolphin), a TTL minicomputer, and about as fast as the Xerox 1132 (Dorado), a $100,000 ECL minicomputer. The 1983 VLSI tools tape has been converted for use under the latest UNIX release (4.2). The Magic (formerly called Caddy) layout system will be a unified set of highly automated tools that cover all aspects of the layout process, including stretching, compaction, tiling and routing. A multiple window package and design rule checker for this system have just been completed and compaction and stretching are partially implemented. New slope-based timing models for the Crystal timing analyzer are now fully implemented and in regular use. In an accuracy test using a dozen critical paths from the RISC II processor and cache chips it was found that Crystal's estimates were within 5-10% of SPICE's estimates, while being about 10,000 times faster.

  3. Implementation of a personal-computer-based real-time hardware-in-the-loop U.S. Army aviation and missile command simulator

    NASA Astrophysics Data System (ADS)

    Beck, David L.; Bennett, Robert G.

    2002-07-01

    With the rapid increase in computational power of the standard personal computer, many tasks that could only be performed by a mini-computer or mainframe can now be performed by the common personal computer. Ten years ago, computational and data transfer requirements for a real-time hardware-in-the-loop simulator could only be met by specialized high performance mini-computers. Today, personal computers shoulder the bulk of the computational load in the U.S. Army Aviation and Missile Command's Radio Frequency Simulation System, and one of the U.S. Army Aviation and Missile Command's millimeter wave simulation systems is currently undergoing a transition to personal computers. This paper discusses how personal computers have been used as the computational backbone for a real-time hardware-in-the-loop simulator, and some of the advantages and disadvantages of a PC based simulation. This paper also provides some general background on what the Radio Frequency Simulation System (RFSS) is and how it works, since the RFSS has successfully implemented a PC based real-time hardware-in-the-loop simulator.

  4. JANE, A new information retrieval system for the Radiation Shielding Information Center

    SciTech Connect

    Trubey, D.K.

    1991-05-01

    A new information storage and retrieval system has been developed for the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory to replace mainframe systems that have become obsolete. The database contains citations and abstracts of literature which were selected by RSIC analysts and indexed with terms from a controlled vocabulary. The database, begun in 1963, has been maintained continuously since that time. The new system, called JANE, incorporates automatic indexing techniques and on-line retrieval using the RSIC Data General Eclipse MV/4000 minicomputer. Automatic indexing and retrieval techniques based on fuzzy-set theory allow the presentation of results in order of Retrieval Status Value. The fuzzy-set membership function depends on term frequency in the titles and abstracts and on Term Discrimination Values, which indicate the resolving power of the individual terms. These values are determined by the Cover Coefficient method. The use of a commercial database to store and retrieve the indexing information permits rapid retrieval of the stored documents. Comparisons of the new and presently used systems for actual searches of the literature indicate that it is practical to replace the mainframe systems with a minicomputer system similar to the present version of JANE. 18 refs., 10 figs.
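
    The ranking scheme this abstract describes can be sketched as follows. This is a hypothetical simplification, assuming a membership function built from normalized term frequency weighted by a Term Discrimination Value and clipped to [0, 1]; JANE's actual fuzzy-set formula and the Cover Coefficient computation are not reproduced here.

```python
# Hypothetical sketch of fuzzy-set retrieval ranking: each document's
# Retrieval Status Value (RSV) is taken as the maximum, over the query
# terms, of a membership score built from term frequency and a Term
# Discrimination Value. JANE's exact formula is not reproduced here.

def membership(tf, discrimination, max_tf):
    # Normalized term frequency scaled by the term's resolving power,
    # clipped to [0, 1] as required of a fuzzy membership value.
    return min(1.0, (tf / max_tf) * discrimination)

def rank(documents, query_terms, discrimination_values):
    # documents: {doc_id: {term: frequency in title/abstract}}
    results = []
    for doc_id, term_freqs in documents.items():
        max_tf = max(term_freqs.values())
        rsv = max(
            membership(term_freqs.get(t, 0),
                       discrimination_values.get(t, 0.0), max_tf)
            for t in query_terms
        )
        results.append((doc_id, rsv))
    # Present results in decreasing order of Retrieval Status Value.
    return sorted(results, key=lambda pair: -pair[1])

docs = {"d1": {"shielding": 4, "neutron": 1},
        "d2": {"neutron": 3, "dose": 2}}
disc = {"shielding": 0.9, "neutron": 0.4}
ranking = rank(docs, ["shielding", "neutron"], disc)
```

    In this toy example, "d1" outranks "d2" because its dominant term carries a higher discrimination value, mirroring how terms with greater resolving power raise a document's Retrieval Status Value.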

  5. From the genetic to the computer program: the historicity of 'data' and 'computation' in the investigations on the nematode worm C. elegans (1963-1998).

    PubMed

    García-Sancho, Miguel

    2012-03-01

    This paper argues that the history of the computer, of the practice of computation, and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers, which transformed the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology, in-house and easily programmable minicomputers, which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution. PMID:22326069

  6. Out-of-core nuclear fuel cycle optimization utilizing an engineering workstation

    SciTech Connect

    Turinsky, P.J.; Comes, S.A.

    1986-01-01

    Within the past several years, rapid advances in computer technology have resulted in substantial increases in computer performance. The net effect is that problems that could previously only be executed on mainframe computers can now be executed on micro- and minicomputers. The authors are interested in developing an engineering workstation for nuclear fuel management applications. An engineering workstation is defined as a microcomputer with enhanced graphics and communication capabilities. Current fuel management applications range from using workstations as front-end/back-end processors for mainframe computers to completing fuel management scoping calculations. More recently, interest in using workstations for final in-core design calculations has appeared. The authors have used the VAX 11/750 minicomputer, which is not truly an engineering workstation but has comparable performance, to complete both in-core and out-of-core fuel management scoping studies. In this paper, the authors concentrate on their out-of-core research. While much previous work in this area has dealt with decisions concerned with equilibrium cycles, the current project addresses the more realistic situation of nonequilibrium cycles.

  7. Technology innovation and management in the US Bureau of the Census: Discussion and recommendations

    SciTech Connect

    Tonn, B.; Edwards, R.; Goeltz, R.; Hake, K.

    1990-09-01

    This report contains a set of recommendations prepared by Oak Ridge National Laboratory (ORNL) for the US Bureau of the Census pertaining to technology innovation and management. Technology has the potential to benefit the Bureau's data collection, capture, processing, and analysis activities. The entire Bureau was represented, from Decennial Census to Economic Programs, along with various levels of Bureau management and numerous experts in technology. Throughout the Bureau, workstations, minicomputers, and microcomputers have found their place alongside the Bureau's mainframes. The Bureau's new computer file structure, called the Topologically Integrated Geographic Encoding and Referencing data base (TIGER), represents a major innovation in geographic information systems, and impressive progress has been made with Computer Assisted Telephone Interviewing (CATI). Other innovations, such as SPRING, which aims to provide Bureau demographic analysts with the capability of interactive data analysis on minicomputers, are in the initial stages of development. Recommendations fall into five independent, but mutually beneficial, categories: (1) disband the ADP Steering Committee and replace it with the Technology Forum; (2) establish a Technology Review Committee (TRC), composed of technology experts from outside the Bureau; (3) designate technological gurus, who will be the Bureau's experts in new and innovative technologies; (4) adopt a technology innovation process; and (5) establish an Advanced Technology Studies Staff (ATSS) to promote technology transfer, obtain funding for technological innovation, manage innovation projects unable to find a home in other divisions, evaluate innovations that cut across Bureau organizational boundaries, and provide input into Bureau technology analyses. (JF)

  8. Quality control in a deterministic manufacturing environment

    SciTech Connect

    Barkman, W.E.; Babelay, E.F.; De Mint, P.D.; Lewis, J.C.; Woodard, L.M.

    1985-01-24

    An approach for establishing quality control in processes which exhibit undesired continual or intermittent excursions in key process parameters is discussed. The method is called deterministic manufacturing, and it is designed to employ automatic monitoring of the key process variables for process certification, but utilizes only sample certification of the process output to verify the validity of the measurement process. The system utilizes a local minicomputer to sample the appropriate process parameters that describe the condition of the machine tool, the cutting process, and the computer numerical control system. Sampled data are pre-processed by the minicomputer and then sent to a host computer that maintains a permanent data base describing the manufacturing conditions for each work piece. Parts are accepted if the various parameters remain within the required limits during the machining cycle. The need for additional actions is flagged if limits are exceeded. With this system it is possible to retrospectively examine the process status just prior to the occurrence of a problem. (LEW)
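
    The accept/flag logic described above can be sketched as a simple limit check; the parameter names and limit values below are illustrative assumptions, not the actual monitored variables.

```python
# Hypothetical sketch of the certification check described: a part is
# accepted if every monitored process parameter stayed within its
# limits for the whole machining cycle; otherwise additional action is
# flagged. The parameter names and limits are illustrative only.

LIMITS = {"spindle_temp_C": (15.0, 40.0),
          "tool_force_N": (0.0, 250.0)}

def certify(samples):
    # samples: list of {parameter: value} readings taken by the local
    # minicomputer during the machining cycle.
    violations = [(i, name, value)
                  for i, reading in enumerate(samples)
                  for name, value in reading.items()
                  if not (LIMITS[name][0] <= value <= LIMITS[name][1])]
    # Accept only if no limit was exceeded; return the violations so
    # the process status just before a problem can be examined.
    return len(violations) == 0, violations

ok, _ = certify([{"spindle_temp_C": 22.0, "tool_force_N": 180.0}])
flagged_ok, v = certify([{"spindle_temp_C": 45.0, "tool_force_N": 180.0}])
```

    Keeping the violation records, rather than just the pass/fail flag, is what allows the retrospective examination of process status that the abstract mentions.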

  9. Refractive index and absorption detector for liquid chromatography based on Fabry-Perot interferometry

    DOEpatents

    Yeung, Edward S. (Ames, IA); Woodruff, Steven D. (Ames, IA)

    1984-06-19

    A refractive index and absorption detector for liquid chromatography. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow-cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a mini-computer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, and then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded.

  10. Refractive index and absorption detector for liquid chromatography based on Fabry-Perot interferometry

    DOEpatents

    Yeung, E.S.; Woodruff, S.D.

    1984-06-19

    A refractive index and absorption detector are disclosed for liquid chromatography. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow-cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a mini-computer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, and then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded. 10 figs.

  11. Coordination and establishment of centralized facilities and services for the University of Alaska ERTS survey of the Alaskan environment

    NASA Technical Reports Server (NTRS)

    Belon, A. E. (principal investigator)

    1972-01-01

    The author has identified the following significant results. Specifications have been prepared for the engineering design and construction of a digital color display unit which will be used for automatic processing of ERTS data. The color display unit is a disk refresh memory with computer interfaced input and a color cathode ray tube output display. The system features both analog and digital post disk data manipulation and a versatile color coding device suitable for displaying not only images, but also computer generated graphics such as diagrams, maps, and overlays. Input is from IBM compatible 9 track, 800 BPI tapes, as generated by an IBM 360 computer. ERTS digital tapes are read into the 360, where various analyses such as maximum likelihood classification are performed and the results are written on a magnetic tape which is the input to the color display unit. The greatest versatility in the data manipulation area is provided by the minicomputer built into the color display unit, which is off-line from the main 360 computer. The minicomputer is able to read any line from the refresh disk and place it in its 4K, 16-bit memory. Considerable flexibility is available for post-processing enhancement of images by the investigator.

  12. Distributed information system (water fact sheet)

    USGS Publications Warehouse

    Harbaugh, A.W.

    1986-01-01

    During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management; hydrologic analysis; and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis consists of 40% of the computational workload of WRD. Cost accounting, payroll, personnel records, and planning for WRD programs occupies an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)

  13. Speech as a pilot input medium

    NASA Technical Reports Server (NTRS)

    Plummer, R. P.; Coler, C. R.

    1977-01-01

    The speech recognition system under development is a trainable pattern classifier based on a maximum-likelihood technique. An adjustable uncertainty threshold allows the rejection of borderline cases for which the probability of misclassification is high. The syntax of the command language spoken may be used as an aid to recognition, and the system adapts to changes in pronunciation if feedback from the user is available. Words must be separated by 0.25-second gaps. The system runs in real time on a minicomputer (PDP 11/10) and was tested on 120,000 speech samples from 10- and 100-word vocabularies. The results of these tests were 99.9% correct recognition for a vocabulary consisting of the ten digits and 99.6% recognition for a 100-word vocabulary of flight commands, with a 5% rejection rate in each case. With no rejection, the recognition accuracies for the same vocabularies were 99.5% and 98.6%, respectively.
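
    The rejection mechanism described here (accept a classification only when the winner's posterior probability clears an adjustable uncertainty threshold) can be sketched as below. The one-dimensional Gaussian class models and the threshold value are illustrative assumptions, not the system's actual acoustic models.

```python
import math

# Sketch of maximum-likelihood classification with an adjustable
# uncertainty threshold: borderline inputs whose winning posterior
# falls below the threshold are rejected rather than risked as
# misclassifications. The 1-D Gaussian class models are assumptions
# for demonstration only.

def gaussian_likelihood(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(x, classes, threshold):
    # classes: {label: (mean, variance)}; equal priors assumed.
    likelihoods = {label: gaussian_likelihood(x, m, v)
                   for label, (m, v) in classes.items()}
    label, best = max(likelihoods.items(), key=lambda kv: kv[1])
    posterior = best / sum(likelihoods.values())
    # Reject borderline cases where the misclassification risk is high.
    return label if posterior >= threshold else None

models = {"one": (1.0, 0.2), "nine": (3.0, 0.2)}
accepted = classify(1.1, models, 0.9)   # clear case
rejected = classify(2.0, models, 0.9)   # equidistant from both means
```

    Raising the threshold trades a higher rejection rate for fewer misclassifications, which is the trade-off behind the 5% rejection rate quoted in the abstract.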

  14. A high pressure, high temperature combustor and turbine-cooling test facility

    NASA Technical Reports Server (NTRS)

    Cochran, R. P.; Norris, J. W.

    1976-01-01

    A new test facility is being constructed for developing turbine-cooling and combustor technology for future generation aircraft gas turbine engines. Prototype engine hardware will be investigated in this new facility at gas stream conditions up to 2480 K average turbine inlet temperature and 4.14 x 10 to the 6th power N per sq m turbine inlet pressure. The facility will have the unique feature of fully automated control and data acquisition through the use of an integrated system of minicomputers and programmable controllers, which will result in more effective use of operating time, will limit the number of operators required, and will provide built-in self-protection safety systems. The facility and the planning and design considerations are described.

  15. A high-pressure, high-temperature combustor and turbine-cooling test facility

    NASA Technical Reports Server (NTRS)

    Cochran, R. P.; Norris, J. W.; Jones, R. E.

    1976-01-01

    NASA-Lewis Research Center is presently constructing a new test facility for developing turbine-cooling and combustor technology for future generation aircraft gas turbine engines. Prototype engine hardware will be investigated in this new facility at gas stream conditions up to 2480 K average turbine inlet temperature and 4,140,000 N per sq m turbine inlet pressure. The facility will have the unique feature of fully-automated control and data acquisition through the use of an integrated system of minicomputers and programmable controllers, which will result in more effective use of operating time, will limit the number of operators required, and will provide a built-in self-protection safety system. The paper describes the facility and the planning and design considerations involved.

  16. Remote sensing information sciences research group: Browse in the EOS era

    NASA Technical Reports Server (NTRS)

    Estes, John E.; Star, Jeffrey L.

    1989-01-01

    The problem of science data browse was examined. Given the tremendous data volumes that are planned for future space missions, particularly the Earth Observing System in the late 1990's, the need for access to large spatial databases must be understood. Work was continued to refine the concept of data browse. Further, software was developed to provide a testbed of the concepts, both to locate possibly interesting data and to view a small portion of the data. Build II was placed on a minicomputer and a PC in the laboratory, and accounts were provided for use in the testbed. Consideration of the testbed software as an element of in-house data management plans was begun.

  17. A method for diagnosing surface parameters using geostationary satellite imagery and a boundary-layer model. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Polansky, A. C.

    1982-01-01

    A method for diagnosing surface parameters on a regional scale via geosynchronous satellite imagery is presented. Moisture availability, thermal inertia, atmospheric heat flux, and total evaporation are determined from three infrared images obtained from the Geostationary Operational Environmental Satellite (GOES). Three GOES images (early morning, midafternoon, and night) are obtained from computer tape. Two temperature-difference images are then created. The boundary-layer model is run, and its output is inverted via cubic regression equations. The satellite imagery is efficiently converted into output-variable fields. All computations are executed on a PDP 11/34 minicomputer. Output fields can be produced within one hour of the availability of aligned satellite subimages of a target area.

  18. Automation in photogrammetry: Recent developments and applications (1972-1976)

    USGS Publications Warehouse

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

    An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system.

  19. Contextual classification on the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    1987-01-01

    Classifiers are often used to produce land cover maps from multispectral Earth observation imagery. Conventionally, these classifiers have been designed to exploit the spectral information contained in the imagery. Very few classifiers exploit the spatial information content of the imagery, and the few that do rarely exploit spatial information content in conjunction with spectral and/or temporal information. A contextual classifier that exploits spatial and spectral information in combination through a general statistical approach was studied. Early test results obtained from an implementation of the classifier on a VAX-11/780 minicomputer were encouraging, but they are of limited meaning because they were produced from small data sets. An implementation of the contextual classifier is presented on the Massively Parallel Processor (MPP) at Goddard that for the first time makes feasible the testing of the classifier on large data sets.

  20. Brookhaven fastbus/unibus interface

    SciTech Connect

    Benenson, G.; Bauernfeind, J.; Larsen, R.C.; Leipuner, L.B.; Morse, W.M.; Adair, R.K.; Black, J.K.; Campbell, S.R.; Kasha, H.; Schmidt, M.P.

    1983-01-01

    A typical high energy physics experiment requires both a high speed data acquisition and processing system, for data collection and reduction, and a general purpose computer to handle further reduction, bookkeeping, and mass storage. Broad differences in architecture, format, or technology will often exist between these two systems, and interface design can become a formidable task. The PDP-11 series minicomputer is widely used in physics research, and the Brookhaven FASTBUS is the only standard high speed data acquisition system which is fully implemented in a current high energy physics experiment. This paper will describe the design and operation of an interface between these two systems. The major issues are elucidated by a preliminary discussion of the basic principles of bus systems and their application to Brookhaven FASTBUS and UNIBUS.

  1. Vibration in Planetary Gear Systems with Unequal Planet Stiffnesses

    NASA Technical Reports Server (NTRS)

    Frater, J. L.; August, R.; Oswald, F. B.

    1982-01-01

    An algorithm suitable for a minicomputer was developed for finding the natural frequencies and mode shapes of a planetary gear system which has unequal stiffnesses between the Sun/planet and planet/ring gear meshes. Mode shapes are represented in the form of graphical computer output that illustrates the lateral and rotational motion of the three coaxial gears and the planet gears. This procedure permits the analysis of gear trains utilizing nonuniform mesh conditions and user specified masses, stiffnesses, and boundary conditions. Numerical integration of the equations of motion for planetary gear systems indicates that this algorithm offers an efficient means of predicting operating speeds which may result in high dynamic tooth loads.
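
    The eigenvalue computation the abstract describes can be sketched on a toy two-degree-of-freedom system; the mass and stiffness matrices below are illustrative placeholders, not the paper's sun/planet and planet/ring mesh model.

```python
import numpy as np

# Toy sketch of finding natural frequencies and mode shapes from the
# undamped equations of motion M x'' + K x = 0. The 2-DOF mass and
# stiffness matrices below are illustrative placeholders for unequal
# sun/planet and planet/ring mesh stiffnesses, not the paper's model.

M = np.diag([2.0, 1.0])            # lumped masses
K = np.array([[5.0, -2.0],
              [-2.0, 2.0]])        # unequal mesh stiffnesses

# Generalized eigenproblem K v = w^2 M v, solved via M^{-1} K.
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(M, K))
order = np.argsort(eigvals)
natural_freqs = np.sqrt(eigvals[order])   # natural frequencies, rad/s
mode_shapes = eigvecs[:, order]           # columns are mode shapes
```

    Speeds at which a mesh excitation frequency approaches one of these natural frequencies are the candidates for the high dynamic tooth loads the abstract predicts.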

  2. Digital resolver for helicopter model blade motion analysis

    NASA Technical Reports Server (NTRS)

    Daniels, T. S.; Berry, J. D.; Park, S.

    1992-01-01

    The paper reports the development and initial testing of a digital resolver to replace existing analog signal processing instrumentation. Radiometers, mounted directly on one of the fully articulated blades, are electrically connected through a slip ring to analog signal processing circuitry. The measured signals are periodic with azimuth angle and are resolved into harmonic components, with 0 deg over the tail. The periodic nature of the helicopter blade motion restricts the frequency content of each flapping and yaw signal to the fundamental and harmonics of the rotor rotational frequency. A minicomputer is employed to collect these data and then plot them graphically in real time. With this and other information generated by the instrumentation, a helicopter test pilot can then adjust the helicopter model's controls to achieve the desired aerodynamic test conditions.

  3. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were brought into use. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to perform NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  4. Cryogenic system for a superconducting spectrometer

    NASA Astrophysics Data System (ADS)

    Porter, J.

    1983-03-01

    The Heavy Ion Spectrometer System (HISS) relies upon superconducting coils of cryostable, pool-boiling design to provide a maximum particle bending field of 3 tesla. The cryogenic facility, including helium refrigeration, gas management, the liquid nitrogen system, and the overall control strategy, is described. The system normally operates with a 4 K heat load of 150 watts; the LN2 circuits absorb an additional 4000 watts. The 80 K intercept is controlled by an LSI-11 computer. Total available refrigeration at 4 K is 400 watts, using reciprocating expanders at the 20 K and 4 K levels. The minicomputer has the capability of optimizing overall utility input cost by varying operating points. A hybrid of pneumatic, analog, and digital control has been successful in providing full-time unattended operation. The 7 m diameter magnet/cryostat assembly is rotatable through 180 degrees to provide a variety of spectrometer orientations.

  5. The microcomputer workstation - An alternate hardware architecture for remotely sensed image analysis

    NASA Technical Reports Server (NTRS)

    Erickson, W. K.; Hofman, L. B.; Donovan, W. E.

    1984-01-01

    Difficulties regarding the digital image analysis of remotely sensed imagery can arise in connection with the extensive calculations required. In the past, an expensive large to medium mainframe computer system was needed for performing these calculations. For image-processing applications smaller minicomputer-based systems are now used by many organizations. The costs for such systems are still in the range from $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appeared to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and the design and implementation of a workstation network.

  6. Vortex research facility improvements and preliminary density stratification effects on vortex wakes

    NASA Technical Reports Server (NTRS)

    Satran, D. R.; Holbrook, G. T.; Greene, G. C.; Neuhart, D.

    1985-01-01

    Recent modernization of NASA's Vortex Research Facility is described. The facility has a 300-ft test section, scheduled for a 300-ft extension, with constant test speeds of the model up to 100 ft/sec. The data acquisition hardware and software improvements included the installation of a 24-channel PCM system onboard the research vehicle, and a large dedicated 16-bit minicomputer. Flow visualization of the vortex wake in the test section is by particle seeding, and a thin sheet of argon laser light perpendicular to the line of flight; detailed flow field measurements are made with a laser velocimeter optics system. The improved experimental capabilities of the facility were used in a study of atmospheric stratification effects on wake vortex decay, showing that the effects of temperature gradient must be taken into account to avoid misleading conclusions in wake vortex research.

  7. The Lockheed alternate partial polarizer universal filter

    NASA Technical Reports Server (NTRS)

    Title, A. M.

    1976-01-01

    A tunable birefringent filter using an alternate-partial-polarizer design has been built. The filter has a transmission of 38% in polarized light. Its full width at half maximum is 0.09 A at 5500 A. It is tunable from 4500 to 8500 A by means of stepping-motor-actuated rotating half-wave plates and polarizers. Wavelength commands and thermal compensation commands are generated by a PDP-11/10 minicomputer. The alternate partial polarizer universal filter is compared with the universal birefringent filter, and the design techniques, construction methods, and filter performance are discussed in some detail. Based on experience with this filter, some conclusions regarding the future of birefringent filters are drawn.

  8. Implementation of the DYMAC system at the new Los Alamos Plutonium Processing Facility. Phase II report

    SciTech Connect

    Malanify, J.J.; Amsden, D.C.

    1982-08-01

    The DYnamic Materials ACcountability System (DYMAC) performs accountability functions at the new Los Alamos Plutonium Processing Facility, where it began operation when the facility opened in January 1978. A demonstration program, DYMAC was designed to collect and assess inventory information for safeguards purposes; it has accomplished 75% of its design goals. DYMAC collects information about the physical inventory through nondestructive assay instrumentation and video terminals deployed throughout the facility. The information resides in a minicomputer, where it can be immediately sorted and displayed on the video terminals or produced in printed form. Although the capability now exists to assess the collected data, this portion of the program has not yet been implemented. DYMAC in its present form is an excellent tool for process and quality control. The facility operator relies on it exclusively for keeping track of the inventory and for complying with the accountability requirements of the US Department of Energy.

  9. Interactive graphical data analysis. Progress report, March 25, 1981-March 24, 1982

    SciTech Connect

    Bloomfield, P.; Tukey, J.W.

    1982-03-01

    Efforts during the first year of this project have emphasized developing a new data-analysis system, developing new algorithms for analyzing multidimensional data, and installing a new minicomputer to support both of these methodological developments. The grammar of the user language for the data-analysis system has been defined, and the specification of the data structures that it will manipulate has been partly completed. Work on data-analysis algorithms has focused on two areas: an algorithm to assist a data analyst in finding interesting projections of multidimensional data, and an application of canonical correlations to investigating the structure of a time series. A philosophy for data-modification display systems, focused on PRIM-81, has been developed. A class of techniques for curve isolation in the presence of a noise background has been considered. The use of simple functions in fitting nonlinear behavior is to be expanded and improved.

  10. A computer system for geosynchronous satellite navigation

    NASA Technical Reports Server (NTRS)

    Koch, D. W.

    1980-01-01

    A computer system specifically designed to estimate and predict Geostationary Operational Environmental Satellite (GOES-4) navigation parameters using Earth imagery is described. The estimates are needed for spacecraft maneuvers, while predictions provide the capability for near-real-time image registration. System software is composed of four functional subsystems: (1) data base management; (2) image processing; (3) navigation; and (4) output. Hardware consists of a host minicomputer, a cathode ray tube terminal, a graphics/video display unit, and associated input/output peripherals. System validity is established through the processing of actual imagery obtained by sensors on board the Synchronous Meteorological Satellite (SMS-2). Results indicate the system is capable of operationally providing both accurate GOES-4 navigation estimates and images with a potential registration accuracy of several picture elements (pixels).

  11. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m/sub v/ = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
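The real-time comparison of each galaxy image against stored reference data amounts to image differencing with a detection threshold. The sketch below illustrates that idea only; the function name, the NumPy-based formulation, and the flat sky-noise model are assumptions, and a production search would also register the frames and match point-spread functions before subtracting.

```python
import numpy as np

def detect_candidates(new_image, reference, sky_sigma, nsigma=5.0):
    """Flag transient candidates: subtract a stored reference frame from a new
    CCD frame and return (row, col) pixels whose residual exceeds nsigma times
    the sky noise. Assumes the two frames are already registered."""
    diff = np.asarray(new_image, dtype=float) - np.asarray(reference, dtype=float)
    ys, xs = np.nonzero(diff > nsigma * sky_sigma)
    return list(zip(ys.tolist(), xs.tolist()))
```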

  12. High-performance control system for a heavy-ion medical accelerator

    SciTech Connect

    Lancaster, H.D.; Magyary, S.B.; Sah, R.C.

    1983-03-01

    A high performance control system is being designed as part of a heavy ion medical accelerator. The accelerator will be a synchrotron dedicated to clinical and other biomedical uses of heavy ions, and it will deliver fully stripped ions at energies up to 800 MeV/nucleon. A key element in the design of an accelerator which will operate in a hospital environment is to provide a high performance control system. This control system will provide accelerator modeling to facilitate changes in operating mode, provide automatic beam tuning to simplify accelerator operations, and provide diagnostics to enhance reliability. The control system being designed utilizes many microcomputers operating in parallel to collect and transmit data; complex numerical computations are performed by a powerful minicomputer. In order to provide the maximum operational flexibility, the Medical Accelerator control system will be capable of dealing with pulse-to-pulse changes in beam energy and ion species.

  13. Modern control techniques for accelerators

    SciTech Connect

    Goodwin, R.W.; Shea, M.F.

    1984-05-01

    Beginning in the mid to late sixties, most new accelerators were designed to include computer based control systems. Although each installation differed in detail, the technology of the sixties and early to mid seventies dictated an architecture that was essentially the same for the control systems of that era. A mini-computer was connected to the hardware and to a console. Two developments have changed the architecture of modern systems: (a) the microprocessor and (b) local area networks. This paper discusses these two developments and demonstrates their impact on control system design and implementation by way of describing a possible architecture for any size of accelerator. Both hardware and software aspects are included.

  14. Tritium Migration Analysis Program Version 4

    Energy Science and Technology Software Center (ESTSC)

    1991-06-12

    TMAP4 was developed as a safety analysis code, mainly to analyze tritium retention and loss in fusion reactor structures and systems during normal operational and accident conditions. It incorporates one-dimensional thermal and mass-diffusive transport and trapping calculations through structures and zero-dimensional fluid transport between enclosures and across the interface between enclosures and structures. Diffusion structures may be linked together with other structures, and multiple structures may interact with an enclosure. A key feature is the ability to input problem definition parameters as constants, interpolation tables, or FORTRAN equations. The code is specifically intended for use under a DOS operating system on PC-type minicomputers, but it has also been run successfully on workstations and mainframe computer systems. Use of the equation-input feature requires access to a FORTRAN-77 compiler, and a linker program is required.

  15. A speech-to-noise ratio measurement algorithm

    NASA Astrophysics Data System (ADS)

    Sims, J. T.

    1985-11-01

    An algorithm to measure speech-to-noise ratios has been implemented on a minicomputer. The algorithm attributes the energy within each consecutive 20-ms frame of a speech-plus-noise waveform to either a speech or noise source. This discrimination process is based upon the known characteristics of frame energy histograms of such waveforms. In response to observed inaccuracies of this discrimination process in cases of low speech versus noise separation, a method of estimating the speech V(rms) of the signal is incorporated, which attempts to recover speech energy, 'masked' by noise. The algorithm's ability to track known speech-to-noise ratios on a decibel-for-decibel basis down to a ratio of approximately 5 dB has been demonstrated by experimentation.
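The frame-energy discrimination described above can be sketched in a few lines. Note that the threshold rule used here (the dB midpoint between the quietest and loudest frames) is a simple stand-in for the paper's histogram-based discrimination and its masked-energy recovery; the function name and NumPy formulation are likewise illustrative assumptions.

```python
import numpy as np

def speech_to_noise_db(samples, rate, frame_ms=20.0):
    """Split a speech-plus-noise waveform into 20-ms frames, label each frame
    speech or noise by comparing its energy (in dB) with the midpoint of the
    lowest- and highest-energy frames, then return the ratio of mean speech
    energy to mean noise energy in dB."""
    samples = np.asarray(samples, dtype=float)
    n = int(rate * frame_ms / 1000.0)                 # samples per frame
    frames = np.reshape(samples[: len(samples) // n * n], (-1, n))
    energy = np.mean(frames ** 2, axis=1) + 1e-12     # avoid log(0)
    e_db = 10.0 * np.log10(energy)
    threshold = 0.5 * (e_db.min() + e_db.max())       # midpoint split (stand-in)
    speech = energy[e_db > threshold]
    noise = energy[e_db <= threshold]
    return 10.0 * np.log10(np.mean(speech) / np.mean(noise))
```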

  16. The Evolution of a Computerized Medical Information System

    PubMed Central

    Hammond, W. Ed; Stead, W. W.

    1986-01-01

    This paper presents the eighteen year history leading to the development of a computerized medical information system and discusses the factors which influenced its philosophy, design and implementation. This system, now called TMR, began as a single-user, tape-oriented minicomputer package and now exists as a multi-user, multi-database, multi-computer system capable of supporting a full range of users in both the inpatient and outpatient settings. The paper discusses why we did what we did, what worked, and what didn't work. Current projects are emphasized including networking and the integration of inpatient and outpatient functions into a single system. A theme of the paper is how hardware and software technological advancements, increasing sophistication of our users, our increasing experience, and just plain luck contributed to the success of TMR.

  17. Composite structural materials. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1980-01-01

    The use of filamentary composite materials in the design and construction of primary aircraft structures is considered, with emphasis on efforts to develop advanced technology in the areas of physical properties, structural concepts and analysis, manufacturing, and reliability and life prediction. The redesign of a main spar/rib region on the Boeing 727 elevator near its actuator attachment point is discussed. A composite fabrication and test facility is described, as well as the use of minicomputers for computer-aided design. Other topics covered include (1) advanced structural analysis methods for composites; (2) ultrasonic nondestructive testing of composite structures; (3) the optimum combination of hardeners in the cure of epoxy; (4) fatigue in composite materials; (5) resin matrix characterization and properties; (6) postbuckling analysis of curved laminate composite panels; and (7) acoustic emission testing of composite tensile specimens.

  18. TV spectrum scanner of the 6-meter telescope

    NASA Astrophysics Data System (ADS)

    Somova, T. A.; Somov, N. N.; Markelov, S. V.; Nebelitskii, V. B.; Spiridonova, O. I.; Fomenko, A. F.

    A television multichannel spectrophotometer has been developed at the Special Astrophysical Observatory of the Academy of Sciences, USSR, for use in the BTA as a means for accumulating and recording the spectra of faint astronomical objects. The image detection system is comprised of a three-stage image tube (UM-92) that is optically coupled to a high sensitivity SIT TV tube. The system operates in a photon counting mode with digital definition of the centers of the photoelectron events, while the spectrophotometer itself operates on-line with a minicomputer which exercises control of the system and performs the primary reductions of the data collected. Block diagrams of the scanner and the television camera control system, as well as illustrations of several scans obtained during observations, are included.

  19. Study of cryogenic propellant systems for loading the space shuttle

    NASA Technical Reports Server (NTRS)

    Voth, R. O.; Steward, W. G.; Hall, W. J.

    1974-01-01

    Computer programs were written to model the liquid oxygen loading system for the space shuttle. The programs allow selection of input data through graphic displays which schematically depict the part of the system being modeled. The computed output is also displayed in the form of graphs and printed messages. Any one of six computation options may be selected. The first four of these pertain to thermal stresses, pressure surges, cooldown times, flow rates and pressures during cooldown. Options five and six deal with possible water hammer effects due to closing of valves, steady flow and transient response to changes in operating conditions after cooldown. Procedures are given for operation of the graphic display unit and minicomputer.

  20. CCD image data acquisition system for optical astronomy.

    NASA Astrophysics Data System (ADS)

    Bhat, P. N.; Patnaik, K.; Kembhavi, A. K.; Patnaik, A. R.; Prabhu, T. P.

    1990-11-01

    A complete image processing system based on a charge coupled device (CCD) has been developed at TIFR, Bombay, for use in optical astronomy. The system consists of a P-8600/B GEC CCD chip, a CCD controller, and a VAX-11/725 minicomputer that carries out image acquisition and display on a VS-11 monitor. All the necessary software and part of the hardware were developed locally, integrated, and installed at the Vainu Bappu Observatory at Kavalur. The CCD as an imaging device, and its advantages over the conventional photographic plate, are briefly reviewed. The acquisition system is described in detail, preliminary results are presented, and the future research programme is outlined.

  1. Flexible missile autopilot design studies with PC-MATLAB/386

    NASA Technical Reports Server (NTRS)

    Ruth, Michael J.

    1989-01-01

    Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (the MATLAB version for 80386-based PCs offers benchmarks approaching minicomputer and mainframe performance); (2) the ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB's capabilities, a simplified design example is presented, involving interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB's capabilities and limitations, in the context of this design task, are then summarized.

  2. TMAP4 User's Manual

    SciTech Connect

    Longhurst, G.R.; Holland, D.F.; Jones, J.L.; Merrill, B.J.

    1992-06-12

    The Tritium Migration Analysis Program, Version 4 (TMAP4) has been developed by the Fusion Safety Program at the Idaho National Engineering Laboratory (INEL) as a safety analysis code, mainly to analyze tritium retention and loss in fusion reactor structures and systems during normal operation and accident conditions. TMAP4 incorporates one-dimensional thermal- and mass-diffusive transport and trapping calculations through structures and zero dimensional fluid transport between enclosures and across the interface between enclosures and structures. A key feature is the ability to input problem definition parameters as constants, interpolation tables, or FORTRAN equations. The code is specifically intended for use under a DOS operating system on PC-type mini-computers, but it has also been run successfully on workstations and mainframe computer systems. Use of the equation-input feature requires access to a FORTRAN-77 compiler and a linker program.

  4. Alsep data processing: How we processed Apollo Lunar Seismic Data

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Nakamura, Y.; Dorman, H. J.

    1979-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.

  5. How we processed Apollo lunar seismic data

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Dorman, H. J.

    1980-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures which were finally chosen consist of plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially in the automatic processing of extraterrestrial seismic signals.

  6. The experimental computer control of a two-dimensional hyperbolic system

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Lang, J. H.; Staelin, D. H.; Johnson, T. L.

    1985-01-01

    The experimental computer control of a two-dimensional hyperbolic system is described. The system consisted of a 5-foot gold-coated rubber membrane mounted on a circular cylindrical drum. Seven electrodes resided on a command surface located behind the membrane inside the drum; these electrodes served as capacitive sensors and electrostatic force actuators of transverse membrane deflection. The membrane was modelled as flat, isotropic, and uniformly tensioned, and transverse membrane deflections were expanded in normal modes. Controllers regulating membrane deflection were designed using aggregation and design procedures based upon sensor and actuator influence functions. The resulting control laws were implemented on a minicomputer in two sets of experiments. The experimental study confirms the theoretically predicted behavior of the system, the usefulness of the aggregation and design procedures, and the expectation that spillover can be made a beneficial source of damping in residual systems.

  7. Wind tunnel evaluation of air-foil performance using simulated ice shapes

    NASA Technical Reports Server (NTRS)

    Bragg, M. B.; Zaguli, R. J.; Gregorek, G. M.

    1982-01-01

    A two-phase wind tunnel test was conducted in the 6- by 9-foot Icing Research Tunnel (IRT) at NASA Lewis Research Center to evaluate the effect of ice on the performance of a full-scale general aviation wing. In the first IRT tests, rime and glaze ice shapes were carefully documented as functions of angle of attack and free-stream conditions. Next, simulated ice shapes were constructed for two rime and two glaze shapes and used in the second IRT tunnel entry. The ice shapes and the clean airfoil were tapped to obtain surface pressures, and a probe was used to measure the wake characteristics. These data were recorded and processed, on line, with a minicomputer/digital data acquisition system. The effects of both rime and glaze ice on the pressure distribution, Cl, Cd, and Cm are presented.

  8. Oxygen analyzer

    DOEpatents

    Benner, William H. (Danville, CA)

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a minicomputer to quantitate oxygen in the decomposition products and control oven heating.

  9. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a minicomputer to quantitate oxygen in the decomposition products and control oven heating.

  10. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion (two Pascal programs run on a personal computer) and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  11. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1974-01-01

    The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
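The multivariate-Gaussian maximum-likelihood decision rule that MIDAS implemented in all-digital hardware can be sketched in software. The sketch below is illustrative only: the function name and NumPy formulation are assumptions, and the class means and covariances, which a real system would estimate from training signatures, are supplied directly.

```python
import numpy as np

def gaussian_ml_classify(pixels, means, covs):
    """Assign each pixel (a vector of band values) to the class whose
    multivariate-Gaussian log-likelihood is highest. `means` is a list of
    class mean vectors; `covs` the matching covariance matrices."""
    pixels = np.asarray(pixels, dtype=float)          # shape (N, bands)
    scores = []
    for mu, cov in zip(means, covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # Mahalanobis distance of every pixel to this class mean.
        maha = np.einsum("ij,jk,ik->i", d, inv, d)
        scores.append(-0.5 * (logdet + maha))         # log-likelihood up to a constant
    return np.argmax(np.stack(scores, axis=1), axis=1)
```

Precomputing the inverse covariance and log-determinant per class, as above, is what made a fixed-coefficient hardware pipeline at 200,000 pixels/sec feasible.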

  12. A noninterference blade vibration measurement system for gas turbine engines

    NASA Astrophysics Data System (ADS)

    Watkins, William B.; Chi, Ray M.

    1987-06-01

    A noninterfering blade vibration measurement system has been demonstrated in tests of a gas turbine first-stage fan. Conceptual design of the system, including its theory, the design of case-mounted probes, and the data acquisition and signal processing hardware, was done in a previous effort. The current effort involved instrumentation of an engine fan stage with strain gages; data acquisition using shaft-mounted reference and case-mounted optical probes; recording of data on a wideband tape recorder; and post-test processing using off-line analysis in a facility computer and a minicomputer-based readout system designed for near-real-time readout. Results are presented in terms of true blade vibration frequencies, time- and frequency-dependent vibration amplitudes, and a comparison of the optical noninterference results with strain gage readings.

  13. Numerical methods: Analytical benchmarking in transport theory

    SciTech Connect

    Ganapol, B.D. )

    1988-01-01

    Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computers and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered.

  14. Personal computer applications in DIII-D neutral beam operation

    SciTech Connect

    Glad, A.S.

    1986-08-01

    An IBM PC AT has been implemented to improve operation of the DIII-D neutral beams. The PC system provides centralization of all beam data with reasonable access for on-line shot-to-shot control and analysis. The PC hardware was configured to interface all four neutral beam host minicomputers, support multitasking, and provide storage for approximately one month's accumulation of beam data. The PC software is composed of commercial packages used for performance and statistical analysis (i.e., LOTUS 123, PC PLOT, etc.), host communications software (i.e., PCLink, KERMIT, etc.), and application software developed in FORTRAN and BASIC. The objectives of this paper are to describe the implementation of the PC system, the methods of integrating the various software packages, and the scenario for on-line control and analysis.

  15. KEK NODAL system

    SciTech Connect

    Kurokawa, S.; Abe, K.; Akiyama, A.; Katoh, T.; Kikutani, E.; Koiso, H.; Kurihara, N.; Oide, K.; Shinomoto, M.

    1985-10-01

    The KEK NODAL system, which is based on the NODAL devised at the CERN SPS, works on an optical-fiber token ring network of twenty-four minicomputers (Hitachi HIDIC 80's) to control the TRISTAN accelerator complex, now being constructed at KEK. KEK NODAL retains main features of the original NODAL: the interpreting scheme, the multi-computer programming facility, and the data-module concept. In addition, it has the following characteristics: fast execution due to the compiler-interpreter method, a multicomputer file system, a full-screen editing facility, and a dynamic linkage scheme of data modules and NODAL functions. The structure of the KEK NODAL system under PMS, a real-time multitasking operating system of HIDIC 80, is described; the NODAL file system is also explained.

  16. An application of the Multi-Purpose System Simulation /MPSS/ model to the Monitor and Control Display System /MACDS/ at the National Aeronautics and Space Administration /NASA/ Goddard Space Flight Center /GSFC/

    NASA Technical Reports Server (NTRS)

    Mill, F. W.; Krebs, G. N.; Strauss, E. S.

    1976-01-01

The Multi-Purpose System Simulator (MPSS) model was used to investigate whether the current and projected configurations of the Monitor and Control Display System (MACDS) at the Goddard Space Flight Center could adequately process and display launch data. MACDS consists of two interconnected minicomputers with associated terminal input and display output equipment and a disk-stored data base. Three configurations of MACDS were evaluated via MPSS and their performances ascertained. First, the current version of MACDS was found inadequate to handle projected launch data loads because of unacceptable data backlogging. Second, the current MACDS hardware with enhanced software was capable of handling two times the anticipated data loads. Third, an upgraded hardware ensemble combined with the enhanced software was capable of handling four times the anticipated data loads.

  17. Using CLIPS in a distributed system: The Network Control Center (NCC) expert system

    NASA Technical Reports Server (NTRS)

    Wannemacher, Tom

    1990-01-01

This paper describes an intelligent troubleshooting system for the Help Desk domain. It was developed on an IBM-compatible 80286 PC, using Microsoft C and CLIPS, and an AT&T 3B2 minicomputer, using the UNIFY database and a combination of shell scripts, C programs, and SQL queries. The two computers are linked by a LAN. The functions of this system are to help non-technical NCC personnel handle trouble calls, to keep a log of problem calls with complete, concise information, and to keep a historical database of problems. The database helps identify hardware and software problem areas and provides a source of new rules for the troubleshooting knowledge base.

  18. Touch-sensitive colour graphics enhance monitoring of loss-of-coolant accident tests

    SciTech Connect

    Snedden, M.D.; Mead, G.L.

    1982-02-01

A stand-alone computer-based system with an intelligent colour terminal is described for monitoring parameters during loss-of-coolant accident tests. Colour graphic displays and touch-sensitive control have been combined for effective operator interaction. Data collected by the host MODCOMP II minicomputer are dynamically updated on colour pictures generated by the terminal. Experimenters select system functions by touching simulated switches on a transparent touch-sensitive overlay, mounted directly over the face of the colour screen, eliminating the need for a keyboard. Switch labels and colours are changed on the screen by the terminal software as different functions are selected. Interaction is self-prompting and can be learned quickly. System operation for a complete set of 20 tests has demonstrated the convenience of interactive touch-sensitive colour graphics.

  19. Diagnosis of alcoholic cirrhosis with the right-to-left hepatic lobe ratio: concise communication

    SciTech Connect

    Shreiner, D.P.; Barlai-Kovach, M.

    1981-02-01

Since scans of cirrhotic livers commonly show a reduction in size and colloid uptake of the right lobe, a quantitative measure of uptake was made using a minicomputer to determine total counts in regions of interest defined over each lobe. Right-to-left ratios were then compared in 103 patients. For normal patients the mean ratio ± 1 s.d. was 2.85 ± 0.65, and the mean for patients with known cirrhosis was 1.08 ± 0.33. Patients with other liver diseases had ratios similar to the normal group. The normal range of the right-to-left lobe ratio was 1.55 to 4.15. The sensitivity of the ratio for alcoholic cirrhosis was 85.7% and the specificity was 100% in this patient population. The right-to-left lobe ratio was more sensitive and specific for alcoholic cirrhosis than any other criterion tested. A hypothesis is described to explain these results.

  20. A Scanning laser-velocimeter technique for measuring two-dimensional wake-vortex velocity distributions. [Langley Vortex Research Facility

    NASA Technical Reports Server (NTRS)

    Gartrell, L. R.; Rhodes, D. B.

    1980-01-01

A rapid-scanning two-dimensional laser velocimeter (LV) has been used to measure simultaneously the vortex vertical and axial velocity distributions in the Langley Vortex Research Facility. This system utilized a two-dimensional Bragg cell for removing flow direction ambiguity by translating the optical frequency for each velocity component, which was separated by band-pass filters. A rotational scan mechanism provided an incremental rapid scan to compensate for the large displacement of the vortex with time. The data were processed with a digital counter and an on-line minicomputer. Vaporized kerosene (0.5 micron to 5 micron particle sizes) was used for flow visualization and LV scattering centers. The overall measured mean-velocity uncertainty is less than 2 percent. These measurements were obtained from ensemble averaging of individual realizations.

  1. Plant analyzer development for high-speed interactive simulation of BWR plant transients

    SciTech Connect

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1986-01-01

Advanced modeling techniques have been combined with modern, special-purpose peripheral minicomputer technology to develop a plant analyzer which provides realistic and accurate predictions of plant transients and severe off-normal events in nuclear power plants through on-line simulations at speeds approximately 10 times faster than actual process speeds. The new simulation technology serves not only to carry out safety analyses, optimizations of emergency procedures and design changes, parametric studies for obtaining safety margins, and generic training routinely and efficiently, but also to assist plant operations. Five modeling principles are presented which serve to achieve high-speed simulation of neutron kinetics, thermal conduction, nonhomogeneous and nonequilibrium two-phase flow coolant dynamics, steam line acoustical effects, and the dynamics of the balance-of-plant and containment systems, control systems, and plant protection systems. 21 refs.

  2. Cardio-respiratory control in an infant with Ondine's curse: a multivariate autoregressive modelling approach.

    PubMed

    Ogawa, T; Kojo, M; Fukushima, N; Sonoda, H; Goto, K; Ishiwa, S; Ishiguro, M

    1993-01-01

We applied spectral analysis through multivariate autoregressive model fitting [1] to RR interval (RRI) and respiratory (RES) oscillations obtained during quiet sleep in an infant with congenital central hypoventilation syndrome (Ondine's curse), a child with obstructive sleep apnea, and two healthy children. Power spectra, impulse response, and the noise contribution ratio between RRI and RES oscillations were calculated using a PFU-1200 (FACOM) minicomputer to determine the structure of the feedback system between RRI and RES within the central nervous system. We found that the respiratory noise contribution ratio to RRI was significantly smaller in Ondine's curse (37 +/- 7.7%, at 0.23 Hz) than in obstructive sleep apnea (90 +/- 6.7%, at 0.39 Hz) and healthy subjects. We postulate that this result shows disturbance of the central autonomic control of breathing and heart rate in Ondine's curse. PMID:8436805

  3. Safeguards instrumentation: past, present, future

    SciTech Connect

    Higinbotham, W.A.

    1982-01-01

Instruments are essential for accounting, for surveillance, and for protection of nuclear materials. The development and application of such instrumentation is reviewed, with special attention to international safeguards applications. Active and passive nondestructive assay techniques are some 25 years old. The important advances have been in learning how to use them effectively for specific applications, accompanied by major advances in radiation detectors, electronics, and, more recently, minicomputers. The progress in seals has been disappointingly slow. Surveillance cameras have been widely used for many applications other than safeguards. The revolution in TV technology will have important implications. More sophisticated containment/surveillance equipment is being developed but has yet to be exploited. On the basis of this history, some expectations for instrumentation in the near future are presented.

  4. The Mount Wilson solar magnetograph - Scanning and data system

    NASA Technical Reports Server (NTRS)

    Howard, R.

    1976-01-01

    The paper describes a computer-operated image-scanning and data-collection system for the magnetograph at the Mt. Wilson 150-foot Tower telescope. The system is based on a minicomputer with a 32K word core memory and a generalized interface unit for controlling image motion, a keyboard, and an associated television screen. Operation of the solar image guider and the data-collection assembly is outlined along with the observation and data-reduction procedures. Advantages of the system include the ability to move the image in almost any conceivable fashion, a wide choice of integration times, and increased accuracy in magnetic and Doppler calibrations as well as in setting of the magnetic zero level.

  5. Automated search for supernovae

    SciTech Connect

    Kare, J.T.

    1984-11-15

    This thesis describes the design, development, and testing of a search system for supernovae, based on the use of current computer and detector technology. This search uses a computer-controlled telescope and charge coupled device (CCD) detector to collect images of hundreds of galaxies per night of observation, and a dedicated minicomputer to process these images in real time. The system is now collecting test images of up to several hundred fields per night, with a sensitivity corresponding to a limiting magnitude (visual) of 17. At full speed and sensitivity, the search will examine some 6000 galaxies every three nights, with a limiting magnitude of 18 or fainter, yielding roughly two supernovae per week (assuming one supernova per galaxy per 50 years) at 5 to 50 percent of maximum light. An additional 500 nearby galaxies will be searched every night, to locate about 10 supernovae per year at one or two percent of maximum light, within hours of the initial explosion.

  6. Software for Digital Acquisition System and Application to Environmental Monitoring

    NASA Technical Reports Server (NTRS)

    Copeland, G. E.

    1975-01-01

Criteria for selection of a minicomputer for use as a core-resident acquisition system were developed for the ODU Mobile Air Pollution Laboratory. A comprehensive data acquisition program named MONARCH was implemented on a DEC-8/E 8K 12-bit computer. Up to 32 analog voltage inputs are scanned sequentially, converted to BCD, and then to actual numbers. As many as 16 external devices (valves or any other two-state device) are controlled independently. MONARCH is written as a foreground-background program, controlled by an external clock which interrupts once per minute. Transducer voltages are averaged over user-specified time intervals and, upon completion of any desired time sequence, the output includes the day, hour, minute, and second; the state of the external valves; the average value of each analog voltage (E format); and the standard deviations of these values. Output is compatible with any serially addressed media.
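The end-of-interval averaging step MONARCH performs can be sketched as follows; the function name and data layout are assumptions for illustration, not the original PDP-8 code:

```python
import statistics

def interval_summary(samples_per_channel):
    """Summarize one user-specified averaging interval.

    samples_per_channel maps a channel number to the list of transducer
    voltages scanned during the interval; returns a (mean, standard
    deviation) pair per channel, like MONARCH's end-of-interval record.
    """
    return {ch: (statistics.mean(v), statistics.pstdev(v))
            for ch, v in samples_per_channel.items()}
```

In the real foreground-background arrangement, the once-per-minute clock interrupt would append new readings to each channel's list, and the background task would call something like this when the interval completes.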

  7. ANNIE - INTERACTIVE PROCESSING OF DATA BASES FOR HYDROLOGIC MODELS.

    USGS Publications Warehouse

    Lumb, Alan M.; Kittle, John L.

    1985-01-01

    ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.

  8. Laboratory procedures used in the hot corrosion project

    SciTech Connect

    Jeys, T.R.

    1980-04-08

The objective of the Hot Corrosion Project in the LLNL Metals and Ceramics Division is to study the physical and chemical mechanisms of corrosion of nickel, iron, and some of their alloys when these metals are subjected to oxidizing or sulfidizing environments at temperatures between 850 and 950 °C. To obtain meaningful data in this study, we must rigidly control many parameters. These parameters, and the methods chosen to control them in this laboratory, are discussed. Some of the mechanics and manipulative procedures that are specifically related to data access and repeatability are covered. The method of recording and processing the data from each experiment using an LSI-11 minicomputer is described. The analytical procedures used to evaluate the specimens after the corrosion tests are enumerated and discussed.

  9. Automatic continuum analysis of reflectance spectra

    NASA Technical Reports Server (NTRS)

    Clark, Roger N.; King, Trude V. V.

    1987-01-01

A continuum algorithm based on a Segmented Upper Hull (SUH) method is described. An upper hull is computed over segments of a spectrum defined by local minima and maxima, and the segments are then combined into a continuum for the complete spectrum. This definition of the upper hull allows the continuum to be both concave and/or convex, adapting to the shape of the spectrum. The method makes multiple passes over a spectrum, segmenting each local maximum-to-minimum interval and computing an upper hull for it. The algorithm naturally adapts to the widths of absorption features, so that all features are found, including doublets, triplets, etc. The algorithm is also reasonably fast on common minicomputers, so it can be applied to the large data sets from imaging spectrometers.
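As a sketch of the underlying idea, the following computes a single upper convex hull over a whole spectrum and divides it out. Note this is a simplification: the paper's SUH hulls each local maximum-to-minimum segment separately, which is what lets its continuum bend both concave and convex, whereas one global convex hull cannot.

```python
import numpy as np

def upper_hull_continuum(wl, refl):
    """Continuum-remove a spectrum using one upper convex hull.

    wl, refl -- wavelengths (increasing) and reflectances.
    Returns (continuum, refl / continuum).
    """
    refl = np.asarray(refl, dtype=float)
    hull = []  # monotone-chain upper hull over (wl, refl) points
    for p in zip(wl, refl):
        while len(hull) >= 2:
            (ox, oy), (ax, ay) = hull[-2], hull[-1]
            # pop the middle point while the turn is not convex-upward
            if (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wl, hx, hy)  # hull vertices -> piecewise line
    return continuum, refl / continuum
```

The continuum lies on or above the spectrum everywhere, so the continuum-removed values are at most 1, with absorption features appearing as dips below 1.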

  10. Networking and AI systems: Requirements and benefits

    NASA Technical Reports Server (NTRS)

    1988-01-01

The price/performance benefits of networked systems are well documented. The ability to share expensive resources drove the adoption of timesharing on mainframes, departmental clusters of minicomputers, and now local area networks of workstations and servers. In the process, other fundamental system requirements emerged. These have now been generalized into open-system requirements for hardware, software, applications, and tools. The ability to interconnect a variety of vendor products has led to the specification of interfaces that allow new techniques to extend existing systems for new and exciting applications. As message-passing systems, local area networks provide a testbed for many of the issues addressed by future concurrent architectures: synchronization, load balancing, fault tolerance, and scalability. Gold Hill has been working with a number of vendors on distributed architectures that range from a network of workstations to a hypercube of microprocessors with distributed memory. Results from early applications are promising for both performance and scalability.

  11. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.

  12. Thermal systems analysis for the Space Infrared Telescope Facility dewar

    NASA Technical Reports Server (NTRS)

    Bhandari, Pradeep; Petrick, S. W.; Schember, Helene

    1991-01-01

Thermal systems analysis models were used to design the superfluid helium (SFHe) cooled dewar for the Space Infrared Telescope Facility (SIRTF), a 1 m class cryogenically cooled observatory for IR astronomy. The models are capable of computing both the heat leaks into the dewar and the operating temperature of the SFHe tank. The models are aimed at predicting the ability of the SIRTF cryogenic system to satisfy a five-year mission lifetime requirement and to maintain the SFHe tank operating temperature of 1.25 K, providing sufficient cooling for the science instruments and the optical system. The thermal models are very detailed yet very fast, with a typical steady-state run taking about 20 seconds on a VAX minicomputer.

  13. Acquisition of quantitative physiological data and computerized image reconstruction using a single scan TV system

    NASA Technical Reports Server (NTRS)

    Baily, N. A.

    1976-01-01

    A single-scan radiography system has been interfaced to a minicomputer, and the combined system has been used with a variety of fluoroscopic systems and image intensifiers available in clinical facilities. The system's response range is analyzed, and several applications are described. These include determination of the gray scale for typical X-ray-fluoroscopic-television chains, measurement of gallstone volume in patients, localization of markers or other small anatomical features, determinations of organ areas and volumes, computer reconstruction of tomographic sections of organs in motion, and computer reconstruction of transverse axial body sections from fluoroscopic images. It is concluded that this type of system combined with a minimum of statistical processing shows excellent capabilities for delineating small changes in differential X-ray attenuation.

  14. Proposed technology and procurement policy for SNAP III. Final report, April-September 1986

    SciTech Connect

    Schneidewind, N.F.

    1986-10-01

The purpose of this report is to suggest ideas for the technology and procurement policy that would be appropriate for SNAP III in the next decade. Both technology and procurement policy are considered because it would be difficult to implement some of the technology proposed in this report without a change in procurement policy. The report describes the recommended architecture of SNAP III and the software acquisition and procurement policies to support the architecture. Major recommendations are: transition from a minicomputer to a microcomputer system; transition to a proven commercial office system; use local area network technology; acquire mass storage capability; acquire improved graphics capability; consider automating ship-to-shore communications; and start to develop a procurement policy to support the acquisition of the above technology.

  15. Interfacing a torsion pendulum with a microcomputer

    SciTech Connect

    Bush, J.A.; Newby, J.W.

    1983-02-24

Shear modulus testing is performed on the torsion pendulum at the General Electric Neutron Devices Department (GEND) as a means of gauging the state of cure for a polymer system. However, collection and reduction of the data to obtain the elastic modulus necessitated extensive operator-performed measurements and calculations, which were subject to errors. To improve the reliability of the test, an analog-to-digital interface was designed and built to connect the torsion pendulum with a minicomputer. After the necessary programming was prepared, the system was tested and found to be an improvement over the old procedure in both quality and time of operation. An analysis of the data indicated that the computer-generated modulus data were equivalent to the hand-calculated data, but potential operator errors in frequency measurements and calculations were eliminated. The interfacing of the pendulum with the computer resulted in an overall time savings of 52 percent.

  16. Computer code to interchange CDS and wave-drag geometry formats

    NASA Technical Reports Server (NTRS)

    Johnson, V. S.; Turnock, D. L.

    1986-01-01

    A computer program has been developed on the PRIME minicomputer to provide an interface for the passage of aircraft configuration geometry data between the Rockwell Configuration Development System (CDS) and a wireframe geometry format used by aerodynamic design and analysis codes. The interface program allows aircraft geometry which has been developed in CDS to be directly converted to the wireframe geometry format for analysis. Geometry which has been modified in the analysis codes can be transformed back to a CDS geometry file and examined for physical viability. Previously created wireframe geometry files may also be converted into CDS geometry files. The program provides a useful link between a geometry creation and manipulation code and analysis codes by providing rapid and accurate geometry conversion.

  17. An operational video data compression system for ATS and ITOS

    NASA Technical Reports Server (NTRS)

    Kutz, R. L.; Davisson, L. D.

    1975-01-01

An operational data compression system has been developed and implemented for transmission of digitized ATS and ITOS-VHRR satellite video data over the wideband communication link between the Wallops Island, Va., Command and Data Acquisition Station and the National Environmental Satellite Service at Suitland, Md. This system uses minicomputers for the coding and decoding of the data to achieve maximum flexibility, together with specially designed interface equipment for greater efficiency. No loss in data quality occurs due to the compression, and, in certain cases, data are transmitted that would otherwise be unavailable due to the limited channel capacity. This paper describes the method of compression, the equipment used, and the compression results attained.

  18. CAMAPPLE: CAMAC interface to the Apple computer

    SciTech Connect

    Oxoby, G.J.; Trang, Q.H.; Williams, S.H.

    1981-04-01

The advent of the personal microcomputer provides a new tool for the debugging, calibration, and monitoring of small-scale physics apparatus, e.g., a single detector being developed for a larger physics apparatus. With an appropriate interface, these microcomputer systems provide a low-cost (1/3 the cost of a comparable minicomputer system), convenient, dedicated, portable system which can be used in a fashion similar to that of portable oscilloscopes. Here, an interface between the Apple computer and CAMAC, now being used to study the detector for a Cerenkov ring-imaging device, is described. The Apple is particularly well suited to this application because of its ease of use, high-resolution graphics, peripheral bus, and documentation support.

  19. Selecting a labor information system. What to ask, what to avoid.

    PubMed

    Garcia, L

    1990-12-01

Payroll expenses may account for over half of a hospital's expenses. Manual time card processing requires an abundance of staff time and can often result in costly errors. To alleviate this problem, many healthcare facilities are implementing computerized labor information systems. To minimize the risk of selecting the wrong system, hospital administrators should ask the following questions before committing to any computerized labor information system: Is the software designed for hospital use and easily adaptable to each hospital's unique policies? How flexible is the software's reporting system? Does it include automatic scheduling that creates generic schedules? Does the system have the capability of securing time and attendance records and documenting the audit trail? Does the system include an accurate and reliable badge reader? What type of hardware is best for the particular hospital--microcomputer, minicomputer, or mainframe? Finally, to guarantee successful software installation, the vendor should have extensive experience and documentation in the system's implementation. PMID:10108009

  20. Total ozone determination by spectroradiometry in the middle ultraviolet

    NASA Technical Reports Server (NTRS)

    Garrison, L. M.; Doda, D. D.; Green, A. E. S.

    1979-01-01

    A method has been developed to determine total ozone from multispectral measurements of the direct solar irradiance. The total ozone is determined by a least squares fit to the spectrum between 290 nm and 380 nm. The aerosol extinction is accounted for by expanding it in a power series in wavelength; use of the linear term proved adequate. A mobile laboratory incorporating a sky scanner has been developed and used to obtain data to verify the method. Sun tracking, wavelength setting of the double monochromator, and data acquisition are under control of a minicomputer. Results obtained at Wallops Island, Virginia, and Palestine, Texas, agree well with simultaneous Dobson and Canterbury spectrometer and balloon ECC ozonesonde values. The wavelength calibration of the monochromator and the values for the normalized ozone absorption are the most important factors in an accurate determination of total ozone.

  1. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett-Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett-Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering-unit plots on a user-selected plotting device. The programs are written in HP FORTRAN IV and HP Assembly Language, with the graphics software using HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four-color pen plotter, and the HP 2608A matrix line printer.

  2. A velocity vector measuring system with 13 asymmetric wedge type yawmeters

    NASA Astrophysics Data System (ADS)

    Nakaya, T.; Hoshio, H.; Noguchi, M.

    1981-06-01

    In order to survey the flow field around the empennage of the NAL STOL research aircraft model in the 6m low speed wind tunnel, a velocity vector measuring system with 13 asymmetric wedge type yawmeters was developed. The rotational angle of the 13 probes and the setting angle of this system are automatically controlled following the sequence previously programmed into a minicomputer system. The hardware, control modes, data reduction, and data processing are described. The accuracy of the flow angle measurement turned out to be satisfactory, but measurements of dynamic pressure and static pressure were less accurate. An example of measurements taken of the flow field around the empennage of the STOL research aircraft model is included.

  3. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

A user guide and programmer documentation is provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses on the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real-time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss-of-sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  4. Correction factors for on-line microprobe analysis of multielement alloy systems

    NASA Technical Reports Server (NTRS)

    Unnam, J.; Tenney, D. R.; Brewer, W. D.

    1977-01-01

    An on-line correction technique was developed for the conversion of electron probe X-ray intensities into concentrations of emitting elements. This technique consisted of off-line calculation and representation of binary interaction data which were read into an on-line minicomputer to calculate variable correction coefficients. These coefficients were used to correct the X-ray data without significantly increasing computer core requirements. The binary interaction data were obtained by running Colby's MAGIC 4 program in the reverse mode. The data for each binary interaction were represented by polynomial coefficients obtained by least-squares fitting a third-order polynomial. Polynomial coefficients were generated for most of the common binary interactions at different accelerating potentials and are included. Results are presented for the analyses of several alloy standards to demonstrate the applicability of this correction procedure.
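The polynomial representation described above can be reproduced with an ordinary least-squares fit. A hedged sketch; the function names and any sample data are illustrative, not the paper's tabulations or MAGIC 4 output:

```python
import numpy as np

def fit_binary_interaction(intensity_ratios, corrections):
    """Least-squares fit of a third-order polynomial to the off-line
    data for one binary interaction; returns 4 polynomial coefficients,
    highest power first."""
    return np.polyfit(intensity_ratios, corrections, deg=3)

def evaluate_correction(coeffs, intensity_ratio):
    """Evaluate the stored polynomial on line for a measured X-ray
    intensity ratio, yielding the variable correction coefficient."""
    return np.polyval(coeffs, intensity_ratio)
```

Storing four coefficients per binary interaction (per accelerating potential) is what keeps the on-line minicomputer's core requirements small: the expensive matrix-correction calculation runs off-line once, and only the fitted curve travels to the probe's computer.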

  5. Development of a multiplane multispeed balancing system for turbine systems

    NASA Technical Reports Server (NTRS)

    Martin, M. R.

    1984-01-01

A prototype high-speed balancing system was developed for assembled gas turbine engine modules. The system permits fully assembled gas turbine modules to be operated and balanced at selected speeds up to full turbine speed. The balancing system is a complete stand-alone system providing all necessary lubrication and support hardware for full-speed operation. A variable-speed motor provides the drive power. A drive belt and gearbox provide rotational speeds up to 21,000 rpm inside a vacuum chamber. The heart of the system is a dedicated minicomputer with attendant data acquisition, storage, and I/O devices. The computer is programmed to be completely interactive with the operator. The system was installed at CCAD and evaluated by testing 20 T55 power turbines and 20 T53 power turbines. Engine test results verified the performance of the high-speed balanced turbines.

  6. Real time quality control of meteorological data used in SRP's emergency response system

    SciTech Connect

    Pendergast, M.M.

    1980-05-01

    The Savannah River Laboratory's WIND minicomputer system allows quick and accurate assessment of an accidental release at the Savannah River Plant using data from eight meteorological towers. The accuracy of the assessment is largely determined by the accuracy of the meteorological data; therefore quality control is important in an emergency response system. Real-time quality control of this data will be added to the WIND system to automatically identify inaccurate data. Currently, the system averages the measurements from the towers to minimize the influence of inaccurate data being used in calculations. The computer code used in the real-time quality control has been previously used to identify inaccurate measurements from the archived tower data.
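    A minimal sketch of such an automatic check, assuming a simple median-deviation rule (the threshold and tower readings below are invented; this is not the WIND system's actual algorithm):

```python
def flag_outliers(readings, max_dev=3.0):
    """Return indices of tower readings that deviate from the
    network median by more than max_dev (same units as the data)."""
    ordered = sorted(readings)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        median = ordered[mid]
    else:
        median = 0.5 * (ordered[mid - 1] + ordered[mid])
    return [i for i, v in enumerate(readings) if abs(v - median) > max_dev]

# Hypothetical wind speeds (m/s) from eight towers; tower 4 is suspect.
speeds = [4.1, 3.9, 4.3, 4.0, 12.7, 4.2, 3.8, 4.1]
bad = flag_outliers(speeds)
```

A flagged tower could then be excluded before the network average is formed, rather than diluting the bad value by averaging.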

  7. Galileo Institute for Astronomy (IFA) charge-coupled device (CCD) system

    NASA Technical Reports Server (NTRS)

    Hlivak, R. J.; Pilcher, C. B.; Howell, R. R.; Colucci, A. J.; Henry, J. P.

    1982-01-01

    A fully portable self-contained charge-coupled device system has been constructed for shared use with the Galileo Project Imaging Team. The detector currently incorporated in the system is a Texas Instruments 500 x 500 three-phase CCD that has been thinned to operate in the backside illuminated mode. The detector and camera mainframe electronics were provided by the Jet Propulsion Laboratory. The support electronics and control interfaces necessary to operate the mainframe were constructed. The data system, which is built around a DeAnza Visacom VC-5000 Image Processor with an embedded LSI-11 minicomputer, was also integrated. The capability to do image processing in real-time at the telescope has proved to be extremely valuable. The overall system read noise has been measured at 25 electrons. Full-well capacity is 40,000 electrons. Some results from laboratory tests and initial observing runs at the Mauna Kea 2.2-m telescope are presented.

  8. FINDS: A fault inferring nonlinear detection system. User's guide

    NASA Technical Reports Server (NTRS)

    Lancraft, R. E.; Caglayan, A. K.

    1983-01-01

    The computer program FINDS is written in FORTRAN-77, and is intended for operation on a VAX 11-780 or 11-750 super minicomputer, using the VMS operating system. The program detects, isolates, and compensates for failures in navigation aid instruments and onboard flight control and navigation sensors of a Terminal Configured Vehicle aircraft in a Microwave Landing System environment. In addition, FINDS provides sensor fault tolerant estimates for the aircraft states which are then used by an automatic guidance and control system to land the aircraft along a prescribed path. FINDS monitors for failures by evaluating all sensor outputs simultaneously using the nonlinear analytic relationships between the various sensor outputs arising from the aircraft point mass equations of motion. Hence, FINDS is an integrated sensor failure detection and isolation system.

  9. The History of the Data Systems AutoChemist® (ACH) and AutoChemist-PRISMA (PRISMA®): from 1964 to 1986

    PubMed Central

    2014-01-01

    Summary Objectives This paper presents the history of data system development steps (1964 – 1986) for the clinical analyzer AutoChemist® and its successor AutoChemist PRISMA® (PRogrammable Individually Selective Modular Analyzer). The paper also partly recounts the development history of the minicomputer PDP 8 from Digital Equipment. The first PDP 8 had 4 core memory boards of 1 K each and was as large as a typical oven baking sheet; about 10 years later, the PDP 8 was a "one chip microcomputer" with a 32 K memory chip. The rapid development of the PDP 8 came to have a strong influence on the development of the data system for AutoChemist. Five major releases of the software were made during this period (1-5 MIACH). Results The most important aims were not only to calculate the results, but also to monitor their quality, automatically manage the orders, store the results in digital form for later statistical analysis, and distribute the results to the physician in charge of the patient using the same computer as the analyzer. Another result of the data system was the ability to customize AutoChemist to handle sample identification by using bar codes and to tailor the presentation of results to different types of laboratories. Conclusions Digital Equipment launched the PDP 8 just as a new minicomputer was desperately needed. No other known alternatives were available at the time. This was to become a key success factor for AutoChemist. That the AutoChemist, with its high capacity, would require a computer for data collection was obvious already in the early 1960s. That computer development would be so rapid, and that one would be able to accomplish so much with a data system, was not even suspected at the time. In total, 75 systems were delivered worldwide: 31 AutoChemist and 44 PRISMA. The last PRISMA was delivered in 1987 to the Veterans Hospital, Houston, TX, USA. PMID:24853032

  10. Evaluation of three electronic report processing systems for preparing hydrologic reports of the U.S. Geological Survey, Water Resources Division

    USGS Publications Warehouse

    Stiltner, G.J.

    1990-01-01

    In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study used the following configurations of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were addressed in the pilot studies: the combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication; visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transferred to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphics Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of the time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report, substantial time is saved in producing subsequent reports because the format can be stored and reused electronically as a template. Because of the more compact page layouts, costs of printing the reports were 15% to 25% less than the costs of printing reports prepared by conventional methods. Because the largest report workload in the offices conducting water resources investigations is the preparation of Water-Resources Investigations Reports, Open-File Reports, and annual State Data Reports, the pilot studies involved only these projects. (USGS)

  11. Evolution of the Mobile Information SysTem (MIST)

    NASA Technical Reports Server (NTRS)

    Litaker, Harry L., Jr.; Thompson, Shelby; Archer, Ronald D.

    2008-01-01

    The Mobile Information SysTem (MIST) had its origins in the need to determine whether commercial off-the-shelf (COTS) technologies could improve intravehicular activity (IVA) crew maintenance productivity on the International Space Station (ISS). It began with an exploration of head-mounted displays (HMDs), but quickly evolved to include voice recognition, mobile personal computing, and data collection. The unique characteristic of the MIST lies in its mobility: the user wears a vest containing a mini-computer and supporting equipment, and a headband with attachments for an HMD, lipstick camera, and microphone. Data are then captured directly by the computer running Morae(TM) or similar software for analysis. To date, the MIST system has been tested in numerous environments, including two parabolic flights on NASA's C-9 microgravity aircraft and several mockup facilities ranging from the ISS to the Altair Lunar Sortie Lander. Its functional strengths include a lightweight and compact design, commonality across systems and environments, and usefulness in remote collaboration. Human factors evaluations have shown that the MIST can be worn for long durations (approximately four continuous hours) with no adverse physical deficits, moderate operator compensation, and low workload, as measured by the Corlett-Bishop Discomfort Scale, Cooper-Harper Ratings, and the NASA Task Load Index (TLX), respectively. Additionally, development of the system has spawned several new applications useful in research. For example, by employing only the lipstick camera, microphone, and a compact digital video recorder (DVR), we created a portable, lightweight data collection device. Video is recorded from the participant's point of view (POV) through the camera mounted on the side of the head. Both the video and audio are recorded directly into the DVR located on a belt around the waist. These data are then transferred to another computer for video editing and analysis. Another application has emerged in simulated flight, in which a kneeboard is replaced with the mini-computer and the HMD to project flight paths and glide slopes for lunar ascent. As technologies evolve, so will the system and its applications for research and space system operations.

  12. High Frequency Sampling of TTL Pulses on a Raspberry Pi for Diffuse Correlation Spectroscopy Applications

    PubMed Central

    Tivnan, Matthew; Gurjar, Rajan; Wolf, David E.; Vishwanath, Karthik

    2015-01-01

    Diffuse Correlation Spectroscopy (DCS) is a well-established optical technique that has been used for non-invasive measurement of blood flow in tissues. Instrumentation for DCS includes a correlation device that computes the temporal intensity autocorrelation of a coherent laser source after it has undergone diffuse scattering through a turbid medium. Typically, the signal acquisition and its autocorrelation are performed by a correlation board. These boards have dedicated hardware to acquire and compute intensity autocorrelations of rapidly varying input signal and usually are quite expensive. Here we show that a Raspberry Pi minicomputer can acquire and store a rapidly varying time-signal with high fidelity. We show that this signal collected by a Raspberry Pi device can be processed numerically to yield intensity autocorrelations well suited for DCS applications. DCS measurements made using the Raspberry Pi device were compared to those acquired using a commercial hardware autocorrelation board to investigate the stability, performance, and accuracy of the data acquired in controlled experiments. This paper represents a first step toward lowering the instrumentation cost of a DCS system and may offer the potential to make DCS become more widely used in biomedical applications. PMID:26274961
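    The numerical processing described above amounts to computing a normalized intensity autocorrelation from the sampled pulse counts. A minimal sketch of that step (the binning scheme and photon-count trace are invented; this is not the published code):

```python
def g2(counts, max_lag):
    """Normalized intensity autocorrelation
    g2(tau) = <I(t) * I(t + tau)> / <I>^2,
    computed from a list of photon counts per time bin."""
    n = len(counts)
    mean = sum(counts) / n
    result = []
    for lag in range(1, max_lag + 1):
        pairs = n - lag
        acc = sum(counts[t] * counts[t + lag] for t in range(pairs))
        result.append((acc / pairs) / (mean * mean))
    return result
```

For a fluctuating DCS signal, g2 decays from its zero-lag value toward 1; a perfectly steady signal gives g2 = 1 at every lag, which makes a convenient sanity check.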

  13. Development of a remote control console for the HHIRF 25-MV tandem accelerator

    SciTech Connect

    Hasanul Basher, A.M.

    1991-09-01

    The CAMAC-based control system for the 25-MV Tandem Accelerator at HHIRF uses two Perkin-Elmer 32-bit minicomputers: a message-switching computer and a supervisory computer. Two operator consoles are located on one of the six serial highways. Operator control is provided by means of a console CRT, trackball, assignable shaft encoders, and meters. The message-switching computer transmits and receives control information on the serial highways. At present, the CRT pages with updated parameters can be displayed, and parameters can be controlled, only from the two existing consoles, one in the Tandem control room and the other in the ORIC control room. It has become necessary to expand the control capability to several other locations in the building. With the expansion of control and monitoring capability to other locations, operators will be able to control accelerator parameters and observe the results of the control action at the same time. Since the new control console will be PC-based, the existing page format will be changed. The PC will communicate with the Perkin-Elmer through RS-232 and a communication software package. The hardware configuration has been established, and a communication software program that reads the pages from shared memory has been developed. In this paper, we present the implementation strategy, work completed, the existing and new page formats, future action plans, an explanation of the pages and the use of related global variables, a sample session, and flowcharts.

  14. WATEQ4F - a personal computer Fortran translation of the geochemical model WATEQ2 with revised data base

    USGS Publications Warehouse

    Ball, J.W.; Nordstrom, D.K.; Zachmann, D.W.

    1987-01-01

    A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available: one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. The U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE have also been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)

  15. A system for processing Landsat and other georeferenced data for resource management applications

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.

    1979-01-01

    The NASA Earth Resources Laboratory has developed a transferrable system for processing Landsat and disparate data with capabilities for digital data classification, georeferencing, overlaying, and data base management. This system is known as the Earth Resources Data Analysis System. The versatility of the system has been demonstrated with applications in several disciplines. A description is given of a low-cost data system concept that is suitable for transfer to one's available in-house minicomputer or to a low-cost computer purchased for this purpose. Software packages are described that process Landsat data to produce surface cover classifications and that geographically reference the data to the UTM projection. Programs are also described that incorporate several sets of Landsat derived information, topographic information, soils information, rainfall information, etc., into a data base. Selected application algorithms are discussed and sample products are presented. The types of computers on which the low-cost data system concept has been implemented are identified, typical implementation costs are given, and the source where the software may be obtained is identified.

  16. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    SciTech Connect

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy ion capability to our facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain data base and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Data base and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  17. ANL statement of site strategy for computing workstations

    SciTech Connect

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O'Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  18. An imaging system for PLIF/Mie measurements for a combusting flow

    NASA Technical Reports Server (NTRS)

    Wey, C. C.; Ghorashi, B.; Marek, C. J.; Wey, C.

    1990-01-01

    The equipment required to establish an imaging system can be divided into four parts: (1) the light source and beam shaping optics; (2) camera and recording; (3) image acquisition and processing; and (4) computer and output systems. A pulsed, Nd:YAG-pumped, frequency-doubled dye laser, which can freeze motion in the flowfield, is used as the illumination source. A set of lenses forms the laser beam into a sheet. The induced fluorescence is collected by a UV-enhanced lens and passes through a UV-enhanced microchannel plate intensifier which is optically coupled to a gated solid state CCD camera. The output of the camera is simultaneously displayed on a monitor and recorded on either a laser videodisc set or a Super VHS VCR. The videodisc set is controlled by a minicomputer via a connection to the RS-232C interface terminals. The imaging system is connected to the host computer by a bus repeater and can be multiplexed between four video input sources. Sample images from a planar shear layer experiment are presented to show the processing capability of the imaging system with the host computer.

  19. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently on microcomputers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorization and in-lining capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.

  20. SCAILET - An intelligent assistant for satellite ground terminal operations

    NASA Technical Reports Server (NTRS)

    Shahidi, A. K.; Crapo, J. A.; Schlegelmilch, R. F.; Reinhart, R. C.; Petrik, E. J.; Walters, J. L.; Jones, R. E.

    1992-01-01

    Space communication artificial intelligence for the link evaluation terminal (SCAILET) is an experimenter interface to the link evaluation terminal (LET), developed by NASA through the application of artificial intelligence to an advanced ground terminal. The high-burst-rate (HBR) LET provides the capabilities required for wideband communications experiments with the Advanced Communications Technology Satellite (ACTS). The HBR-LET terminal consists of seven major subsystems and is controlled and monitored by a minicomputer through an IEEE-488 or RS-232 interface. Programming scripts configure the HBR-LET and allow data acquisition, but they are difficult to use, so the full capabilities of the system are not utilized. An intelligent assistant module was developed as part of the SCAILET module to solve problems encountered during configuration of the HBR-LET system. The assistant is a graphical interface, with an expert system running in the background, that gives users access to instrument configuration, programming sequences, and reference documentation. Its simplicity of use makes SCAILET a superior interface to the ASCII terminal, and continuous monitoring allows nearly flawless configuration and execution of HBR-LET experiments.

  1. Development and evaluation of an automated reflectance microscope system for the petrographic characterization of bituminous coals

    SciTech Connect

    Hoover, D. S.; Davis, A.

    1980-10-01

    The development of automated coal petrographic techniques will lessen the demands on skilled personnel to do routine work. This project is concerned with the development and successful testing of an instrument which will meet these needs. The fundamental differences in reflectance of the three primary maceral groups should enable their differentiation in an automated reflectance frequency histogram (reflectogram). Consequently, reflected light photometry was chosen as the method for automating coal petrographic analysis. Three generations of an automated system (called Rapid Scan Versions I, II, and III) were developed and evaluated for petrographic analysis. Their basic design was that of a reflected-light microscope photometer with an automatic stage, interfaced with a minicomputer. The hardware elements used in the Rapid Scan Version I limited the system's flexibility and presented problems with signal digitization and measurement precision. Rapid Scan Version II was designed to incorporate a new microscope photometer and computer system. A digital stepping stage was incorporated into the Rapid Scan Version III system. The precision of reflectance determination of this system was found to be ±0.02 percent reflectance. The limiting factor in quantitative interpretation of Rapid Scan reflectograms is the resolution of the reflectance populations of the individual maceral groups. Statistical testing indicated that reflectograms were highly reproducible, and a new computer program, PETAN, was written to interpret the curves for vitrinite reflectance parameters and petrographic composition.
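    Accumulating a reflectogram from point reflectance readings is essentially frequency binning. A minimal sketch under assumed bin parameters (the function, bin width, and sample readings are illustrative, not the Rapid Scan software):

```python
def reflectogram(measurements, lo=0.0, hi=2.5, bin_width=0.05):
    """Bin percent-reflectance readings into a frequency histogram
    (a reflectogram); readings outside [lo, hi) are ignored."""
    nbins = int(round((hi - lo) / bin_width))
    counts = [0] * nbins
    for r in measurements:
        if lo <= r < hi:
            counts[int((r - lo) / bin_width)] += 1
    return counts
```

Peaks in the resulting histogram would then be attributed to the reflectance populations of the individual maceral groups, which is the resolution problem the abstract identifies.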

  2. [Automated system for collecting, processing and storing individual dosimetric control data].

    PubMed

    Sobolev, I A; Khomchik, L M; Zarkh, V G; Khoziainov, V A; Shurkus, A E

    1983-10-01

    The organization of an automated system for individual dosimetric control on the basis of the mini-computer M-6000 is described. The hardware system is considered, and a coding principle for the initial information is proposed. A block diagram of the software, consisting of a set of six interrelated programs, is presented, and each program is considered in detail. As a result, a data bank is being set up for 10,000 persons under central individual dosimetric control. The introduction of the automated system made it possible to do away with manual processing, to improve the reliability of processing, to classify registration forms, to track the time course of individual exposures, to identify the departments, workplaces, and occupations that are most hazardous from the radiation point of view, and to issue recommendations for improving technological processes to make the radiation situation better. The automated system of individual dosimetric control can be recommended for factories and institutions where centralized individual dosimetric control is needed for a large staff. PMID:6633201

  3. Determination of physical and chemical states of lubricants in concentrated contacts, part 1

    NASA Technical Reports Server (NTRS)

    Lauer, J. L.

    1979-01-01

    A Fourier emission infrared microspectrometer, set up on a vibration-proof optical table and interfaced to a dedicated minicomputer, was used to record infrared emission spectra from elastohydrodynamic bearing contacts. Its range was extended to cover the entire mid-infrared from 2 to 15 microns. A series of experiments with 5P4E polyphenyl ether showed the existence of a temperature gradient through the lubricant in an elastohydrodynamic contact, perpendicular to the flow direction. The experiments also showed marked polarization of some of the spectral bands, indicating a molecular alignment. Alignment is less evident at high pressure than at low pressure. To account for this behavior, a model is suggested along the lines developed for the conformational changes observed in long-chain polymers when subjected to increased pressure: to accommodate closer packing, molecules become kinked and curl up. Experiments with a traction fluid showed periodic changes of flow pattern associated with certain spectral changes. These observations will be studied further. A study by infrared attenuated total reflection spectrophotometry was undertaken to determine whether gamma irradiation would change polyethylene wear specimens. The results were negative.

  4. Development of a microcomputer data base of manufacturing, installation, and operating experience for the NSSS designer

    SciTech Connect

    Borchers, W.A.; Markowski, E.S.

    1986-01-01

    Future nuclear steam supply systems (NSSSs) will be designed in an environment of powerful micro hardware and software and these systems will be linked by local area networks (LAN). With such systems, individual NSSS designers and design groups will establish and maintain local data bases to replace existing manual files and data sources. One such effort of this type in Combustion Engineering's (C-E's) NSSS engineering organization is the establishment of a data base of historical manufacturing, installation, and operating experience to provide designers with information to improve on current designs and practices. In contrast to large mainframe or minicomputer data bases, which compile industry-wide data, the data base described here is implemented on a microcomputer, is design specific, and contains a level of detail that is of interest to system and component designers. DBASE III, a popular microcomputer data base management software package, is used. In addition to the immediate benefits provided by the data base, the development itself provided a vehicle for identifying procedural and control aspects that need to be addressed in the environment of local microcomputer data bases. This paper describes the data base and provides some observations on the development, use, and control of local microcomputer data bases in a design organization.

  5. History of Robotic and Remotely Operated Telescopes

    NASA Astrophysics Data System (ADS)

    Genet, Russell M.

    2011-03-01

    While automated instrument sequencers were employed on solar eclipse expeditions in the late 1800s, it wasn't until the 1960s that Art Code and associates at Wisconsin used a PDP minicomputer to automate an 8-inch photometric telescope. Although this pioneering project experienced frequent equipment failures and was shut down after a couple of years, it paved the way for the first space telescopes. Reliable microcomputers initiated the modern era of robotic telescopes. Louis Boyd and I applied single-board microcomputers with 64K of RAM and floppy disk drives to telescope automation at the Fairborn Observatory, achieving reliable, fully robotic operation in 1983 that has continued uninterrupted for 28 years. In 1985 the Smithsonian Institution provided us with a superb operating location on Mt. Hopkins in southern Arizona, while the National Science Foundation funded additional telescopes. Remote access to our multiple robotic telescopes at the Fairborn Observatory began in the late 1980s. The Fairborn Observatory, with its 14 fully robotic telescopes and staff of two (one full-time and one part-time), illustrates the potential for low operating and maintenance costs. As the information capacity of the Internet has expanded, observational modes beyond simple differential photometry have opened up, bringing us to the current era of real-time remote access to remote observatories and global observatory networks. Although initially confined to smaller telescopes, robotic operation and remote access are spreading to larger telescopes as telescopes from afar becomes the normal mode of operation.

  6. A geographic information system for resource managers based on multi-level remote sensing data

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.

    1985-01-01

    Procedures followed in developing a test case geographic information system derived primarily from remotely sensed data for the North Cache Soil Conservation District (SCD) in northern Utah are outlined. The North Cache SCD faces serious problems regarding water allocation, flood and geologic hazards, urban encroachment into prime farmland, soil erosion, and wildlife habitat. Four fundamental data planes were initially entered into the geo-referenced data base: (1) land use/land cover information for the agricultural and built-up areas of the valley obtained from various forms of aerial photography; (2) vegetation/land cover in mountains classified digitally from Landsat; (3) geomorphic terrain units derived from aerial photography and soil maps; and (4) digital terrain maps obtained from DMA digital data. The land use/vegetation/land cover information from manual photographic and Landsat interpretation were joined digitally into a single data plane with an integrated legend, and segmented into quadrangle units. These were merged with the digitized geomorphic units and the digital terrain data using a Prime 400 minicomputer. All data planes were geo-referenced to a UTM coordinate grid.

  7. The graphics and data acquisition software package

    NASA Technical Reports Server (NTRS)

    Crosier, W. G.

    1981-01-01

    A software package was developed for use with micro- and minicomputers, particularly the LSI-11/PDP-11 series. The package has a number of Fortran-callable subroutines which perform a variety of frequently needed tasks for biomedical applications. All routines are well documented, flexible, easy to use and modify, and require minimal programmer knowledge of peripheral hardware. The package is also economical of memory and CPU time. A single subroutine call can perform any one of the following functions: (1) plot an array of integer values from sampled A/D data; (2) plot an array of Y values versus an array of X values; (3) draw horizontal and/or vertical grid lines of selectable type; (4) annotate grid lines with user units; (5) get coordinates of user-controlled crosshairs from the terminal for interactive graphics; (6) sample any analog channel with program-selectable gain; (7) wait a specified time interval; and (8) perform random-access I/O of one or more blocks of a sequential disk file. Several miscellaneous functions are also provided.

  8. Matched filters for bin picking.

    PubMed

    Dessimoz, J D; Birk, J R; Kelley, R B; Martins, H A; Lin, C

    1984-06-01

    Currently, a major difficulty for the widespread use of robots in assembly and material handling comes from the necessity of feeding accurately positioned workpieces to robots. "Bin picking" techniques help reduce this constraint. This paper presents the application of matched filters for enabling robots with vision to acquire workpieces randomly stored in bins. This approach complements heuristic methods already reported. The concept of the matched filter is an old one. Here, however, it is redefined to take into account robot end-effector features, in terms of geometry and mechanics. In particular, the proposed filters match local workpiece structures where the robot end-effector is likely to grasp successfully and hold workpieces. The local nature of the holdsites is very important, as computation costs are shown to vary with the fifth power of structure size. In addition, the proposed filters tend to have a narrow angular bandwidth. An example, which features a parallel-jaw hand, is developed in detail, using both statistical and Fourier models. Both approaches concur in requiring a very small number of filters (typically four), even if good orientation accuracy is expected (two degrees). Success rates of about 90 percent in three or fewer attempts have been experimentally obtained on a system which includes a small minicomputer, a 128 × 128 pixel solid-state camera, a prototype Cartesian robot, and a "universal" parallel-jaw hand. PMID:22499650
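The core matched-filter idea described above can be sketched as plain cross-correlation: slide a small template over an image and keep the offset with the highest response. This is only an illustrative sketch; the paper's actual filters additionally encode end-effector geometry and mechanics, and the image, template, and sizes below are invented.

```python
# Minimal matched-filter sketch: score every placement of a small
# template against an image by cross-correlation and return the
# offset with the highest response.

def matched_filter_peak(image, template):
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            # correlation score of the template placed at offset (r, c)
            score = sum(image[r + i][c + j] * template[i][j]
                        for i in range(th) for j in range(tw))
            if best_score is None or score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy 6x6 "image" with one bright 2x2 region standing in for a holdsite
img = [[0] * 6 for _ in range(6)]
for i in range(2):
    for j in range(2):
        img[3 + i][1 + j] = 9
tmpl = [[1, 1], [1, 1]]
pos, score = matched_filter_peak(img, tmpl)
```

Note that the per-placement cost grows with the template area, and a search over candidate orientations multiplies it further, which is consistent with the steep growth in cost with structure size that the abstract emphasizes.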

  9. Voice Controlled Wheelchair

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Michael Condon, a quadriplegic from Pasadena, California, demonstrates the NASA-developed voice-controlled wheelchair and its manipulator, which can pick up packages, open doors, turn a TV knob, and perform a variety of other functions. A possible boon to paralyzed and other severely handicapped persons, the chair-manipulator system responds to 35 one-word voice commands, such as "go," "stop," "up," "down," "right," "left," "forward," "backward." The heart of the system is a voice-command analyzer which utilizes a minicomputer. Commands are taught to the computer by the patient's repeating them a number of times; thereafter the analyzer recognizes commands only in the patient's particular speech pattern. The computer translates commands into electrical signals which activate appropriate motors and cause the desired motion of chair or manipulator. Based on teleoperator and robot technology for space-related programs, the voice-controlled system was developed by the Jet Propulsion Laboratory under the joint sponsorship of NASA and the Veterans Administration. The wheelchair-manipulator has been tested at Rancho Los Amigos Hospital, Downey, California, and is being evaluated at the VA Prosthetics Center in New York City.

  10. Software for device-independent graphical input

    SciTech Connect

    Hamlin, G.

    1982-01-01

    A three-level model, and a graphics software structure based on the model, developed with the goal of making graphical applications independent of input devices, are described. The software structure makes graphical applications independent of the input devices in a manner similar to the way the SIGGRAPH CORE proposal makes them independent of the output devices. A second goal was to provide a convenient means for application programmers to specify the user-input language for their applications. The software consists of an input handler and a table-driven parser. The input handler manages a CORE-like event queue, changing input events into terminal symbols and making these terminal symbols available to the parser in a uniform manner. It also removes most device dependencies. The parser is table driven from a Backus-Naur form (BNF) grammar that specifies the user-input language. The lower-level grammar rules remove the remaining device dependencies from the input, and the higher-level grammar rules specify legal sentences in the user-input language. Implementation of this software is on a table-top minicomputer. Experience with retrofitting existing applications indicates that one can find a grammar that removes essentially all the device dependencies from the application proper.
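The two layers described above, a device-dependent mapping from raw events to terminal symbols and a table-driven recognizer for the user-input language, can be sketched as follows. All event names, terminal symbols, and the single grammar rule here are invented for illustration; they are not taken from the paper.

```python
# Hypothetical two-layer sketch of device-independent input:
# (1) the input handler maps raw device events to terminal symbols,
# (2) a table-driven parser checks symbol sequences against a tiny
# "grammar" (here a transition table) for one command sentence.

# Layer 1: device-dependent event -> device-independent terminal
EVENT_TO_TERMINAL = {
    "mouse_button_1": "PICK",
    "tablet_stylus_down": "PICK",   # a different device, same terminal
    "mouse_xy": "LOCATOR",
    "keyboard_return": "ACCEPT",
}

# Layer 2: transition table for the sentence  PICK LOCATOR+ ACCEPT
# state -> {terminal: next_state}; reaching "DONE" means acceptance.
TABLE = {
    "START":    {"PICK": "PICKED"},
    "PICKED":   {"LOCATOR": "LOCATING"},
    "LOCATING": {"LOCATOR": "LOCATING", "ACCEPT": "DONE"},
}

def parse(events):
    state = "START"
    for ev in events:
        terminal = EVENT_TO_TERMINAL.get(ev)
        if terminal is None or terminal not in TABLE.get(state, {}):
            return False            # illegal sentence for this grammar
        state = TABLE[state][terminal]
    return state == "DONE"
```

Because only Layer 1 mentions device names, swapping a mouse for a tablet changes one dictionary, not the parser, which is the device-independence property the abstract describes.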

  11. A C Language Implementation of the SRO (Murdock) Detector/Analyzer

    USGS Publications Warehouse

    Murdock, James N.; Halbert, Scott E.

    1991-01-01

    A signal detector and analyzer algorithm was described by Murdock and Hutt in 1983. The algorithm emulates the performance of a human interpreter of seismograms. It estimates the signal onset, the direction of onset (positive or negative), the quality of these determinations, the period and amplitude of the signal, and the background noise at the time of the signal. The algorithm has been coded in the C language for implementation as a 'black box' for data similar to that of the China Digital Seismic Network. A driver for the algorithm is included, as are suggestions for other drivers. Floating-point operations are not required in any of these routines, nor in the several FIR filters that are included as well. Multichannel operation is supported. Although the primary use of the code has been for in-house processing of broadband and short-period data of the China Digital Seismic Network, provisions have been made to process the long-period and very-long-period data of that system as well. The code for the in-house detector, which runs on a minicomputer, is very similar to that of the field system, which runs on a microprocessor. The code is documented.
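The integer-only FIR filtering mentioned above is typically done in fixed point: coefficients are pre-scaled to integers and the accumulator is shifted back down. The sketch below uses an arbitrary 4-tap averaging filter, not the actual coefficients from the SRO detector code.

```python
# Fixed-point FIR sketch: coefficients are stored as integers scaled
# by 2**15, and the accumulator is right-shifted back, so no floating
# point is ever needed. The 4-tap average here is illustrative only.

SCALE_BITS = 15
COEFFS = [8192, 8192, 8192, 8192]   # 4 * 8192 = 2**15, i.e. a mean

def fir_integer(samples):
    out = []
    for n in range(len(COEFFS) - 1, len(samples)):
        acc = 0
        for k, b in enumerate(COEFFS):
            acc += b * samples[n - k]       # integer multiply-accumulate
        out.append(acc >> SCALE_BITS)       # scale back, still an integer
    return out

filtered = fir_integer([100, 100, 100, 100, 100, 228])
```

On a constant input the filter reproduces the input exactly, and a step at the end is smoothed, all in integer arithmetic suitable for a floating-point-free microprocessor.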

  12. A geographic information system for resource managers based on multi-level remote sensing data

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.

    1984-01-01

    Procedures followed in developing a test case geographic information system derived primarily from remotely sensed data for the North Cache Soil Conservation District (SCD) in northern Utah are outlined. The North Cache SCD faces serious problems regarding water allocation, flood and geologic hazards, urban encroachment into prime farmland, soil erosion, and wildlife habitat. Four fundamental data planes were initially entered into the geo-referenced data base: (1) land use/land cover information for the agricultural and built-up areas of the valley obtained from various forms of aerial photography; (2) vegetation/land cover in mountains classified digitally from LANDSAT; (3) geomorphic terrain units derived from aerial photography and soil maps; and (4) digital terrain maps obtained from DMA digital data. The land use/vegetation/land cover information from manual photographic and LANDSAT interpretation were joined digitally into a single data plane with an integrated legend, and segmented into quadrangle units. These were merged with the digitized geomorphic units and the digital terrain data using a Prime 400 minicomputer. All data planes were geo-referenced to a UTM coordinate grid.

  13. Updated overview of the Tevatron control system

    SciTech Connect

    Lucas, P.

    1987-10-01

    A single unified control system is used for all of the Fermilab accelerators and storage rings, from the LINAC to the Tevatron and antiproton source. A review of the general features is given. These include a 'host' system consisting of a number of minicomputers integrated with many distributed microprocessors in a variety of subsystems; usage of an in-house developed protocol, GAS, for communication between the two classes of machines; and a Parameter Page program, designed in conjunction with the system database, which allows a wide variety of quantities to be read and set in a coherent fashion. Recent developments include the implementation of a block transfer and 'fast time plot' facility through CAMAC, the inclusion of several new computers in the host, a better understanding of system throughput, greatly improved reliability, the advent of programs which sequence a large number of independent operations, and the construction of new hardware subsystems. Possible future system upgrades are briefly presented, and the utilization of a quite large software staff, at a time when the system is no longer under construction, is discussed.

  14. High performance/low cost accelerator control system

    SciTech Connect

    Magyary, S.; Glatz, J.; Lancaster, H.; Selph, F.; Fahmie, M.; Ritchie, A.; Timossi, C.; Hinkson, C.; Benjegerdes, R.

    1980-10-01

    Implementation of a high performance computer control system tailored to the requirements of the SuperHILAC accelerator is described. This system uses a distributed (star-type) structure with fiber optic data links; multiple CPU's operate in parallel at each node. A large number (20) of the latest 16-bit microcomputer boards are used to get a significant processor bandwidth (exceeding that of many mini-computers) at a reasonable price. Because of the large CPU bandwidth, software costs and complexity are significantly reduced and programming can be less real-time critical. In addition all programming can be in a high level language. Dynamically assigned and labeled knobs together with touch-screens allow a flexible operator interface. An X-Y vector graphics system allows display and labeling of real-time signals as well as general plotting functions. Both the accelerator parameters and the graphics system can be driven from BASIC interactive programs in addition to the pre-canned user routines. This allows new applications to be developed quickly and efficiently by physicists, operators, etc. The system, by its very nature and design, is easily upgraded (via next generation of boards) and repaired (by swapping of boards) without a large hardware support group. This control system is now being tested on an existing beamline and is performing well. The techniques used in this system can be readily applied to industrial control systems.

  15. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  16. In-phase and out-of-phase axial-torsional fatigue behavior of Haynes 188 at 760 C

    NASA Technical Reports Server (NTRS)

    Kalluri, Sreeramesh; Bonacuse, Peter J.

    1991-01-01

    Isothermal, in-phase and out-of-phase axial-torsional fatigue experiments have been conducted at 760 C on uniform gage section, thin-walled tubular specimens of a wrought cobalt-base superalloy, Haynes 188. Test control and data acquisition were accomplished with a minicomputer. Fatigue lives of the in-phase and out-of-phase axial-torsional fatigue tests have been estimated with four different multiaxial fatigue life prediction models that were developed primarily for predicting axial-torsional fatigue lives at room temperature. The models investigated were: (1) the von Mises equivalent strain range; (2) the Modified Multiaxiality Factor Approach; (3) the Modified Smith-Watson-Topper Parameter; and (4) the critical shear plane method of Fatemi, Socie, and Kurath. In general, life predictions by the von Mises equivalent strain range model and by the Modified Multiaxiality Factor Approach were within a factor of 2 for a majority of the tests, while predictions of the Modified Smith-Watson-Topper Parameter and of the critical shear plane method of Fatemi, Socie, and Kurath were unconservative and conservative, respectively, by up to factors of 4. In some of the specimens tested under combined axial-torsional loading conditions, fatigue cracks initiated near extensometer indentations. Two design modifications to the thin-walled tubular specimen have been proposed to overcome this problem.
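For the first model above, the standard von Mises equivalent strain range for combined axial-torsional loading (with engineering shear strain and an effective Poisson's ratio of 0.5) is sqrt(de^2 + dg^2/3). This is the textbook form; the paper's exact implementation may differ in detail, and the strain values below are illustrative.

```python
import math

def von_mises_equivalent_strain_range(d_eps_axial, d_gamma_shear):
    """Equivalent strain range for combined axial-torsional loading:
    sqrt(d_eps**2 + d_gamma**2 / 3), with d_gamma the engineering
    shear strain range and an effective Poisson's ratio of 0.5."""
    return math.sqrt(d_eps_axial**2 + d_gamma_shear**2 / 3.0)

# Sanity checks on limiting cases: pure axial loading reduces to the
# axial strain range itself, and pure torsion to d_gamma / sqrt(3).
axial_only = von_mises_equivalent_strain_range(0.010, 0.0)
torsion_only = von_mises_equivalent_strain_range(0.0, 0.012)
```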

  17. Augmented burst-error correction for UNICON laser memory. [digital memory

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1974-01-01

    A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long Fire code with code length n greater than 16,768 bits is used as an outer code to augment an existing, shorter inner Fire code for burst-error correction. The inner Fire code is an (80,64) code shortened from the (630,614) code, and it is used to correct a single burst error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single burst error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented in hardware. A minicomputer, currently used as the UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to achieve a very low error rate in spite of flaws affecting the recorded data.
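A full Fire-code decoder is beyond a short sketch, but the detection half of the scheme (the part the abstract says is done in hardware) rests on GF(2) polynomial division: check bits are the division remainder, and a nonzero recomputed remainder (syndrome) flags a burst. The generator polynomial below is an arbitrary degree-8 illustration, not the (80,64) code's actual generator.

```python
# Simplified sketch of cyclic-code burst DETECTION, not the paper's
# actual Fire code: encode by appending the GF(2) polynomial-division
# remainder, then recompute it as a syndrome. Zero syndrome means no
# error detected; any burst shorter than the generator degree leaves
# a nonzero syndrome. Generator chosen arbitrarily here.

GEN = 0x107          # x^8 + x^2 + x + 1, degree 8 (illustrative only)
R = 8                # number of check bits

def gf2_mod(value, poly=GEN, deg=R):
    # Remainder of 'value' divided by 'poly' over GF(2), bits as ints
    for shift in range(value.bit_length() - 1, deg - 1, -1):
        if (value >> shift) & 1:
            value ^= poly << (shift - deg)
    return value

def encode(message):
    shifted = message << R
    return shifted ^ gf2_mod(shifted)   # append remainder as check bits

def syndrome(word):
    return gf2_mod(word)

codeword = encode(0b1011001110101)
burst = 0b111011 << 5                   # a 6-bit burst inside the word
```

In the system described above, the minicomputer would take over only when this hardware syndrome is nonzero, locating and correcting the burst in software.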

  18. Guide to sharing personal computer resources via local area networks

    SciTech Connect

    Winkler, L.

    1986-03-01

    This Guide is for professional staff who commonly need computing tools on personal computers, minicomputers, mainframe computers, and supercomputers. It provides information and recommendations about personal computer local area networks in the context of the larger scheme of computing tools and services at the Laboratory. The material presented here is for the person considering installation of a personal computer local area network. Chapter 1 introduces the reader to the concept of personal computer local area networks and provides background material on networking. Chapter 2 summarizes Computing Services' evaluation of personal computer local area networking in general terms. Chapter 3 describes the technical and functional details of Computing Services' Personal Computer Local Area Network Evaluation and Demonstration Project. Chapters 4 and 5 are for individuals who are familiar with personal computing and who will be responsible for establishing a local area network. Chapter 4 covers technical issues relating to the prototype network installation in Building 221. Chapter 5 warns potential users what to expect when establishing a local area network. 7 figs., 9 tabs.

  19. Developing pharmacy applications using a microcomputer relational database in a long-term care psychiatric hospital.

    PubMed

    Salek, W

    1989-03-01

    The database applications developed with a microcomputer for a 1000-bed long-term care forensic psychiatric hospital are described. The implementation of a microcomputer system was instituted as an interim measure prior to the development of a hospital-wide minicomputer system. Primary emphasis was placed on increasing the efficiency of professional staff while enhancing clinical therapeutic monitoring. The system operates on an IBM AT with a 30-megabyte hard disk drive and an Epson FX-100 dot matrix printer. A relational database manager, Team-Up, was utilized in the development of applications that included census maintenance, scheduled drug inventory, drug regimen review, drug utilization protocols, and a skilled nursing unit dose patient profile. Other ancillary functions included generation of stock labels, a literature abstract database, and an on-line policy and procedure manual. Advantages of the system include an increase in staff productivity through the use of information that is readily attainable from the patient database. Possible disadvantages are the programming and hardware limitations imposed by a microcomputer system. Long-term care psychiatric facilities may be able to enhance staff efficiency by computerizing existing manual systems. Because of the diverse and specialized requirements of long-term care facilities, a microcomputer used in conjunction with a programmable relational database can be easily customized to fulfill this need. PMID:10292384

  20. Definition study for photovoltaic residential prototype system

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Hulstrom, R. L.; Cookson, C.; Waldman, B. H.; Lane, R. A.

    1976-01-01

    A parametric sensitivity study and a definition of the conceptual design are presented. A computer program containing the solar irradiance, solar array, and energy balance models was developed to determine the sensitivities of solar insolation and the corresponding solar array output at five sites selected for this study, as well as the performance of several solar array/battery systems. A baseline electrical configuration was chosen, and three design options were recommended. The study indicates that the most sensitive parameters are the solar insolation and the inverter efficiency. The baseline PST selected comprises a 133 sq m solar array, a 250 ampere-hour battery, one to three inverters, and a full shunt regulator to limit the upper solar array voltage. A minicomputer-controlled system is recommended to provide the overall control, display, and data acquisition requirements. Architectural renderings of two photovoltaic residential concepts, one above ground and the other underground, are presented. The institutional problems were defined in the areas of legal liabilities during and after installation of the PST, labor practices, building restrictions and architectural guides, and land use.

  1. David Florida Laboratory Thermal Vacuum Data Processing System

    NASA Technical Reports Server (NTRS)

    Choueiry, Elie

    1994-01-01

    During 1991, the Space Simulation Facility conducted a survey to assess the requirements and analyze the merits for purchasing a new thermal vacuum data processing system for its facilities. A new, integrated, cost effective PC-based system was purchased which uses commercial off-the-shelf software for operation and control. This system can be easily reconfigured and allows its users to access a local area network. In addition, it provides superior performance compared to that of the former system which used an outdated mini-computer and peripheral hardware. This paper provides essential background on the old data processing system's features, capabilities, and the performance criteria that drove the genesis of its successor. This paper concludes with a detailed discussion of the thermal vacuum data processing system's components, features, and its important role in supporting our space-simulation environment and our capabilities for spacecraft testing. The new system was tested during the ANIK E spacecraft test, and was fully operational in November 1991.

  2. Compact, high-speed algorithm for laying out printed circuit board runs

    NASA Astrophysics Data System (ADS)

    Zapolotskiy, D. Y.

    1985-09-01

    A high-speed printed circuit connection layout algorithm is described which was developed within the framework of an interactive system for designing two-sided printed circuit boards. For this reason, algorithm speed was considered, a priori, a requirement equally as important as the inherent demand for minimizing circuit run lengths and the number of junction openings. This resulted from the fact that, in order to provide psychological man/machine compatibility in the design process, real-time dialog during the layout phase is possible only within limited time frames (on the order of several seconds) for each circuit run. The work was carried out for use on an ARM-R automated work site complex based on an SM-4 minicomputer with a 32K-word memory. This limited memory capacity heightened the demand for algorithm speed and also tightened data file structure and size requirements. The layout algorithm's design logic is analyzed, and the structure and organization of the data files are described.
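The abstract does not name the routing algorithm used, but the classic grid router of that era is the Lee-style wavefront search: breadth-first expansion from source to target over the routing grid, which guarantees a shortest run when one exists. The board layout below is invented for illustration.

```python
from collections import deque

# Lee-style wavefront sketch: BFS over a routing grid from source to
# target, avoiding blocked cells. Representative of the era's grid
# routers, not necessarily the article's actual algorithm.

def route(grid, src, dst):
    rows, cols = len(grid), len(grid[0])
    prev = {src: None}
    frontier = deque([src])
    while frontier:
        cell = frontier.popleft()
        if cell == dst:
            path = []
            while cell is not None:       # walk back along the wavefront
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None                           # no route on this layer

# 0 = free cell, 1 = blocked (existing run or pad)
board = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
path = route(board, (0, 0), (2, 0))
```

BFS visits each free cell at most once, so runtime is linear in grid size per run, which is the kind of bounded per-run cost the interactive-dialog requirement above demands.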

  3. Spent fuel test. Climax data acquisition system integration report

    SciTech Connect

    Nyholm, R.A.; Brough, W.G.; Rector, N.L.

    1982-06-01

    The Spent Fuel Test - Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granitic rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. This multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system collects and processes data from more than 900 analog instruments. This report documents the design and functions of the hardware and software elements of the Data Acquisition System and describes the supporting facilities which include environmental enclosures, heating/air-conditioning/humidity systems, power distribution systems, fire suppression systems, remote terminal stations, telephone/modem communications, and workshop areas. 9 figures.

  4. Spent Fuel Test - Climax data acquisition system operations manual

    SciTech Connect

    Nyholm, R.A.

    1983-01-01

    The Spent Fuel Test-Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granite rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the US Department of Energy Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. The multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system (DAS) collects and processes data from more than 900 analog instruments. This report documents the software element of the LLNL developed SFT-C Data Acquisition System. It defines the operating system and hardware interface configurations, the special applications software and data structures, and support software.

  5. AESOP XX: summary of proceedings. [Gatlinburg, Tennessee, April 24 to 26, 1979

    SciTech Connect

    1980-03-01

    The 20th meeting of the Association for Energy Systems, Operations, and Programming (AESOP) was held in Gatlinburg, Tennessee, on April 24 to 26, 1979. Representatives of DOE Headquarters discussed the effects that new security and privacy regulations will have on automatic data processing operations. The status and future possibilities of the Business Management Information System (BMIS) were also discussed. Then representatives of various DOE offices and contractors presented reports on various topics. This report contains two-page summaries of the papers presented at the meeting. Session topics and titles of papers were as follows: Washington report (New ADP issues; BMIS: the Business Management Information System; Nuclear weapons and the computer); Improving the productivity of the computing analyst/programer (What productivity improvement tools are available; Rocky Flats experience with SDM/70; Albuquerque Operations Office experience with SDM/70; Planning and project management; Minicomputer standards and programer productivity; MRC productivity gains through applications development tools); User viewpoints and expectations of data processing (User perspectives on computer applications; User viewpoints on environmental studies; Planning and implementing a procurement system; Two sides of the DP coin); Data base management (Use of data base systems within DOE; Future trends in data base hardware; Future trends in data base software; Toward automating the data base design process); and Management discussions. Complete versions of three of the papers have already been cited in ERA. These can be located by reference to the entry CONF-790431-- in the Report Number Index. (RWR)

  6. Pneumatic sample-transfer system for use with the Lawrence Livermore National Laboratory rotating target neutron source (RTNS-I)

    SciTech Connect

    Williams, R.E.

    1981-07-01

    A pneumatic sample-transfer system is needed to rapidly retrieve samples irradiated with 14-MeV neutrons at the Rotating Target Neutron Source (RTNS-I). The rabbit system, already in place for many years, has been refurbished with modern system components controlled by an LSI-11 minicomputer. Samples can now be counted three seconds after an irradiation. There are many uses for this expanded 14-MeV neutron activation capability. Several fission products difficult to isolate from mixed fission fragments can be produced instead through (n,p) or (n,α) reactions with stable isotopes. Mass-separated samples of Nd, Mo, and Se, for example, can be irradiated to produce Pr, Nb, and As radionuclides sufficient for decay scheme studies. The system may also be used for multielement fast-neutron activation analysis because the neutron flux is greater than 2 × 10^11 n/cm^2·s. Single-element analyses of Si and O are also possible. Finally, measurements of fast-neutron cross sections producing short-lived activation products can be performed with this system. A description of the rabbit system and instructions for its use are presented in this report.
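The activation analysis described above is governed by the standard buildup law A(t) = N·sigma·phi·(1 - exp(-lambda·t)): activity approaches saturation after a few half-lives, which is why a fast rabbit matters for short-lived products. The flux is the value quoted in the abstract; every other number below is hypothetical.

```python
import math

def activation_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Activity (decays/s) after irradiating for t_irr_s seconds:
    A = N * sigma * phi * (1 - exp(-lambda * t)).
    Sample parameters used below are hypothetical, not RTNS-I data."""
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

FLUX = 2e11        # n/cm^2/s, the flux quoted in the abstract
N = 1e20           # target atoms (hypothetical sample)
SIGMA = 1e-25      # 100 mb cross section (hypothetical)
T_HALF = 10.0      # a short-lived product, 10 s half-life (hypothetical)

saturated = activation_activity(N, SIGMA, FLUX, T_HALF, 20 * T_HALF)
brief = activation_activity(N, SIGMA, FLUX, T_HALF, T_HALF)
```

After one half-life the sample is already at half its saturation activity, so irradiating much longer than a few half-lives gains little, while a slow transfer to the counter loses a factor of two every 10 s.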

  7. Software used with the flux mapper at the solar parabolic dish test site

    NASA Technical Reports Server (NTRS)

    Miyazono, C.

    1984-01-01

    Software for data archiving and data display was developed on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high-radiant-energy scanning device designed to measure the radiant flux energies expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site, with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.

  8. Computerized rapid analysis of complex mixtures by gas chromatography

    SciTech Connect

    Demirgian, J.C.

    1984-04-01

    A computerized rapid analysis system and its application to two research problems are described. The system uses one central minicomputer to gather analog and digital data from several instruments. Various programs calibrate and convert the data into formatted reports. If desired, this computer sends the report to a more powerful mainframe computer for comparison of different reports and establishment of a data base. Data interpretation begins when a calibration program accesses an external standard file, identifies peaks, calculates response factors for each peak, and stores the results on disk. A master program produces a quantitative report based on retention times and two relative retention time index systems. This report is transferred by modem to a mainframe computer. Statistical programs on the mainframe computer compare runs and/or reprint the data in another format. Additional programming allows for the addition and correction of peak names. The entire procedure produces reports rapidly, allows for variable formats, and compares data from multiple runs. The applicability of the system in studying fluid degradation and solvent extraction efficiency is demonstrated. 13 references, 10 figures, 1 table.
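The external-standard calibration step described above can be sketched simply: each calibrated compound gets a response factor (peak area per unit amount) from the standard run, and sample peaks are identified by a retention-time window and converted to amounts. All compound names, retention times, areas, and windows below are invented for illustration.

```python
# Sketch of external-standard quantitation: compute a response factor
# per calibrated compound, match sample peaks by retention-time
# window, and report amounts. All values here are invented.

STANDARD = {                  # name: (retention_time_min, area, amount)
    "benzene": (3.10, 50_000, 10.0),
    "toluene": (4.55, 42_000, 10.0),
}
RT_WINDOW = 0.05              # +/- minutes for a retention-time match

def response_factors(standard):
    return {name: area / amount
            for name, (_, area, amount) in standard.items()}

def quantitate(sample_peaks, standard):
    rf = response_factors(standard)
    report = {}
    for rt, area in sample_peaks:
        for name, (std_rt, _, _) in standard.items():
            if abs(rt - std_rt) <= RT_WINDOW:
                report[name] = area / rf[name]   # amount = area / RF
    return report

# One unmatched peak (rt 7.0) is simply dropped from the report
report = quantitate([(3.12, 25_000), (4.54, 84_000), (7.0, 9_000)],
                    STANDARD)
```

A relative-retention-time index, as the abstract mentions, would replace the absolute window with a ratio to a reference peak, making identification robust to run-to-run drift.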

  9. Migrating an energy management system to a networked architecture

    SciTech Connect

    Radtke, M.A.

    1996-02-01

    This paper will chronicle the activity at Wisconsin Public Service Corporation (WPSC) that resulted in the complete migration of a traditional, late-1970s vintage Energy Management System (EMS). The new environment includes networked microcomputers, minicomputers, and the corporate mainframe, and provides on-line access to employees outside the energy control center and to some WPSC customers. In the late 1980s, WPSC was forecasting an EMS computer upgrade or replacement to address both capacity and technology needs. Reasoning that access to diverse computing resources would best position the company to accommodate the uncertain needs of the energy industry in the 1990s, WPSC chose to investigate an in-place migration to a network of computers able to support heterogeneous hardware and operating systems. The system was developed in a modular fashion, with individual modules being deployed as soon as they were completed. The functional and technical specification was continuously enhanced as operating experience was gained from each operational module. With the migration of the original EMS computers complete, the networked system, called DEMAXX (Distributed Energy Management Architecture with eXtensive eXpandability), has exceeded expectations in the areas of cost, performance, flexibility, and reliability.

  10. Improving computer simulation through use of a graphical user interface

    SciTech Connect

    Fife, L.D.; Henry, S.R.

    1995-02-01

Historically, simulation has been performed on large mainframe and minicomputers. These computers were limited in number, expensive, and controlled by a small group of professionals highly trained in the use of these resources. These resources were often difficult to access and use; they were not user friendly. The appearance of high-speed, high-capacity desktop computers gave an increasing number of engineers and scientists access to computers and simulators. However, because these new resources were still often difficult to use and their users were not highly trained, many of the problems associated with difficulty of use remained. Graphical User Interfaces (GUIs) have helped solve many of these usage problems. GUIs provide additional capabilities that increase the ability of scientists and engineers to use their computer resources. These capabilities increase the value of simulators by extending usage through increased functionality. The consistency of a well-designed family of simulator GUIs can enhance training effectiveness by increasing familiarity with basic operations, assisting in the validation of data, and improving the help facilities available to the user.

  11. Commercial space development needs cheap launchers

    NASA Astrophysics Data System (ADS)

    Benson, James William

    1998-01-01

SpaceDev is in the market for a deep space launch, and we are not going to pay $50 million for it. There is an ongoing debate about the elasticity of demand related to launch costs. On the one hand there are the "big iron" NASA and DoD contractors, who say that there is no market for small or inexpensive launchers, that lowering launch costs will not result in significantly more launches, and that the current uncompetitive pricing scheme is appropriate. On the other hand are commercial companies that compete in the real world, and that say there would be innumerable new launches if prices were to drop dramatically. I participated directly in the microcomputer revolution, and saw firsthand what happened to the big iron computer companies who failed to see or heed the handwriting on the wall. We are at the same stage in the space access revolution that personal computers were in the late '70s and early '80s. The global economy is about to be changed in ways that are just as unpredictable as the changes wrought by the introduction of the personal computer. Companies that fail to innovate and keep producing only big iron will suffer the same fate as IBM and all the now-extinct mainframe and minicomputer companies. A few will remain, but with a small share of the market, never again in a position to dominate.

  12. Pacific Missile Test Center Information Resources Management Organization (code 0300): The ORACLE client-server and distributed processing architecture

    SciTech Connect

    Beckwith, A. L.; Phillips, J. T.

    1990-06-10

Computing architectures using distributed processing and distributed databases are increasingly considered acceptable solutions for advanced data processing systems. This is occurring even though there is still considerable professional debate as to what "truly" distributed computing actually is, and despite the relative lack of advanced relational database management system (RDBMS) software capable of meeting database and system integrity requirements for developing reliable integrated systems. This study investigates the functionality of ORACLE database management software performing distributed processing between a MicroVAX/VMS minicomputer and three MS-DOS-based microcomputers. The ORACLE database resides on the MicroVAX and is accessed from the microcomputers with ORACLE SQL*NET, DECnet, and ORACLE PC TOOL PACKS. Data gathered during the study reveal a demonstrable decrease in CPU demand on the MicroVAX, due to "distributed processing," when the ORACLE PC Tools are used to access the database as opposed to database access from "dumb" terminals. Also discovered were several hardware/software constraints that must be considered in implementing the various software modules. The results of the study indicate that this distributed data processing architecture is becoming sufficiently mature and reliable, and should be considered for developing applications that reduce processing loads on central hosts. 33 refs., 2 figs.

  13. H-coal fluid dynamics. Final report, August 1, 1977-December 31, 1979

    SciTech Connect

    Not Available

    1980-04-16

This report presents the results of work aimed at understanding the hydrodynamic behavior of the H-Coal reactor. A summary of the literature search related to the fluid dynamic behavior of gas/liquid/solid systems has been presented. Design details of a cold flow unit were discussed. The process design of this cold flow model followed practices established by HRI in their process development unit. The cold flow unit has been used to conduct experiments with nitrogen, kerosene, or kerosene/coal-char slurries, and HDS catalyst, which at room temperature have properties similar to those existing in the H-Coal reactor. Mineral oil, a high-viscosity liquid, was also used. The volume fractions occupied by gas/liquid slurries and catalyst particles were determined by several experimental techniques. The use of a minicomputer for data collection and calculation has greatly accelerated the analysis and reporting of data. Data on nitrogen/kerosene/HDS catalyst and coal char fines are presented in this paper. Correlations identified in the literature search were utilized to analyze the data. From this analysis it became evident that the Richardson-Zaki correlation describes the effect of slurry flow rate on catalyst bed expansion. Three-phase fluidization data were analyzed with two models.
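The Richardson-Zaki correlation mentioned above relates superficial velocity to bed voidage as u / u_t = ε^n, where u_t is the particle terminal velocity and the index n depends on the particle Reynolds number (roughly 4.65 in the Stokes regime down to about 2.4 in the inertial regime). A minimal sketch of the data reduction, with illustrative parameter values:

```python
def bed_voidage(u, u_t, n):
    """Richardson-Zaki: u / u_t = eps**n, solved for the bed voidage eps."""
    return (u / u_t) ** (1.0 / n)

def expanded_height(h0, eps0, eps):
    """Expanded bed height from solids conservation: H * (1 - eps) = H0 * (1 - eps0)."""
    return h0 * (1.0 - eps0) / (1.0 - eps)

# Illustrative numbers only (not from the report): settled bed of 1.0 m at
# voidage 0.4, expanded to the voidage implied by the operating velocity.
eps = bed_voidage(u=0.02, u_t=0.10, n=2.4)
h = expanded_height(h0=1.0, eps0=0.4, eps=eps)
```

The report's analysis would fit n (and effective u_t in the slurry) to the measured expansion data rather than assume handbook values.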

  14. Pressure Measurement Systems

    NASA Technical Reports Server (NTRS)

    1990-01-01

System 8400 is an advanced system for measurement of gas and liquid pressure, along with a variety of other parameters, including voltage, frequency, and digital inputs. System 8400 offers exceptionally high-speed data acquisition through parallel processing, and its modular design allows expansion from a relatively inexpensive entry-level system by the addition of modular Input Units that can be installed or removed in minutes. Douglas Juanarena was on the team of engineers that developed a new technology known as ESP (electronically scanned pressure). The Langley ESP measurement system was based on miniature integrated-circuit pressure-sensing transducers that communicated pressure information to a minicomputer. In 1977, Juanarena formed PSI to exploit the NASA technology. In 1978 he left Langley, obtained a NASA license for the technology, and introduced the first commercial product, the 780B pressure measurement system. PSI developed a pressure scanner for automation of industrial processes. Now in its second design generation, the DPT-6400 is capable of making 2,000 measurements a second and can grow to 64 channels by the addition of slave units. The new System 8400 represents PSI's bid to further exploit the $600 million U.S. industrial pressure measurement market. It is geared to provide a turnkey solution to physical measurement.

  15. High Frequency Sampling of TTL Pulses on a Raspberry Pi for Diffuse Correlation Spectroscopy Applications.

    PubMed

    Tivnan, Matthew; Gurjar, Rajan; Wolf, David E; Vishwanath, Karthik

    2015-01-01

    Diffuse Correlation Spectroscopy (DCS) is a well-established optical technique that has been used for non-invasive measurement of blood flow in tissues. Instrumentation for DCS includes a correlation device that computes the temporal intensity autocorrelation of a coherent laser source after it has undergone diffuse scattering through a turbid medium. Typically, the signal acquisition and its autocorrelation are performed by a correlation board. These boards have dedicated hardware to acquire and compute intensity autocorrelations of rapidly varying input signal and usually are quite expensive. Here we show that a Raspberry Pi minicomputer can acquire and store a rapidly varying time-signal with high fidelity. We show that this signal collected by a Raspberry Pi device can be processed numerically to yield intensity autocorrelations well suited for DCS applications. DCS measurements made using the Raspberry Pi device were compared to those acquired using a commercial hardware autocorrelation board to investigate the stability, performance, and accuracy of the data acquired in controlled experiments. This paper represents a first step toward lowering the instrumentation cost of a DCS system and may offer the potential to make DCS become more widely used in biomedical applications. PMID:26274961
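The quantity the hardware correlator (or the Raspberry Pi, numerically) computes is the normalized intensity autocorrelation g2(τ) = ⟨I(t)·I(t+τ)⟩ / ⟨I⟩². A minimal sketch of that post-processing step on a recorded photon-count trace (a generic illustration, not the authors' code):

```python
import numpy as np

def g2(counts, max_lag):
    """Normalized intensity autocorrelation of a photon-count trace:
    g2(k) = <I(t) I(t+k)> / <I>^2, evaluated at lags k = 1..max_lag (in bins)."""
    counts = np.asarray(counts, dtype=float)
    mean_sq = counts.mean() ** 2
    n = len(counts)
    return np.array([np.mean(counts[: n - k] * counts[k:]) / mean_sq
                     for k in range(1, max_lag + 1)])
```

For DCS, g2(τ) of the speckle intensity decays from ~2 toward 1 as the scatterers (red blood cells) decorrelate the field; the decay rate is the flow index. Practical correlators use logarithmically spaced, multi-tau lag bins rather than this linear scheme.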

  16. Planning for library automation using MINISIS.

    PubMed

    Sly, M

    1983-04-01

Discussed in this article are the questions that should be addressed by a library which is contemplating automation. The 1st part of the paper deals with the systems study, the results of which form the basis of a library's decision to automate. Specific points to be considered in planning for an automation project are dealt with in the 2nd part. It is assumed that the software to be adopted is MINISIS, a minicomputer database management system developed by the International Development Research Centre. However, the discussion in the 2nd part is relevant for any library embarking on an automation project. Among the advice given in the conclusion is: 1) the success of the library's automation project depends a great deal on the interaction between the library and the computer staff; 2) the involvement of all the library staff should be maximized from the early planning stages; 3) at every stage of the planning process, decisions must be documented along with the reasons why they were made; 4) training and procedure manuals must be written before implementation; 5) the need for thorough planning cannot be overemphasized. PMID:12279642

  17. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and ensure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  18. PSA: A program to streamline orbit determination for launch support operations

    NASA Technical Reports Server (NTRS)

    Legerton, V. N.; Mottinger, N. A.

    1988-01-01

An interactive, menu-driven computer program was written to streamline the orbit determination process during the critical launch support phase of a mission. Residing on a virtual-memory minicomputer, this program retains in core the quantities needed to obtain a least-squares estimate of the spacecraft trajectory, with interactive displays to assist in rapid radio metric data evaluation. Menu-driven displays allow real-time filter and data strategy development. Graphical and tabular displays can be sent to a laser printer for analysis without exiting the program. Products generated by this program feed back to the main orbit determination program in order to further refine the estimate of the trajectory. The final estimate provides a spacecraft ephemeris which is transmitted to the mission control center and used for antenna pointing and frequency predict generation by the Deep Space Network. The development and implementation process of this program differs from that used for most other navigation software by allowing the users to check important operating features during development and have changes made as needed.
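The in-core least-squares estimate at the heart of such a program is, at each iteration, a weighted normal-equations solve on the current residuals and partials. A generic sketch of that one step (an illustration of batch weighted least squares, not the program's actual algorithm; names are hypothetical):

```python
import numpy as np

def weighted_lsq_correction(A, residuals, weights):
    """One batch least-squares correction to the state estimate.

    A         : (m, n) partials of the observations w.r.t. the state
    residuals : (m,) observed-minus-computed radio metric residuals
    weights   : (m,) per-observation weights (1 / sigma^2)

    Solves (A^T W A) dx = A^T W r for the state correction dx.
    """
    W = np.diag(weights)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ residuals)
```

In practice the correction is applied to the nominal trajectory and the residuals and partials are recomputed until convergence; the interactive displays described above let the analyst re-weight or delete data between iterations.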

  19. Upgrading NASA/DOSE laser ranging system control computers

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.

    1993-01-01

Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) program and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970's-vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of controller computers at relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are being designed around IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-Windows/Motif; serial interfaces have been chosen as well. This design minimizes short- and long-term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage, with the first systems targeted for production in mid-1993.

  20. 1985 ACSM-ASPRS Fall Convention, Indianapolis, IN, September 8-13, 1985, Technical Papers

    SciTech Connect

    Not Available

    1985-01-01

    Papers are presented on Landsat image data quality analysis, primary data acquisition, cartography, geodesy, land surveying, and the applications of satellite remote sensing data. Topics discussed include optical scanning and interactive color graphics; the determination of astrolatitudes and astrolongitudes using x, y, z-coordinates on the celestial sphere; raster-based contour plotting from digital elevation models using minicomputers or microcomputers; the operational techniques of the GPS when utilized as a survey instrument; public land surveying and high technology; the use of multitemporal Landsat MSS data for studying forest cover types; interpretation of satellite and aircraft L-band synthetic aperture radar imagery; geological analysis of Landsat MSS data; and an interactive real time digital image processing system. Consideration is given to a large format reconnaissance camera; creating an optimized color balance for TM and MSS imagery; band combination selection for visual interpretation of thematic mapper data for resource management; the effect of spatial filtering on scene noise and boundary detail in thematic mapper imagery; the evaluation of the geometric quality of thematic mapper photographic data; and the analysis and correction of Landsat 4 and 5 thematic mapper sensor data.

  1. Confocal Laser Microscope Scanning Applied To Three-Dimensional Studies Of Biological Specimens.

    NASA Astrophysics Data System (ADS)

    Franksson, Olof; Liljeborg, Anders; Carlsson, Kjell; Forsgren, Per-Ola

    1987-08-01

The depth-discriminating property of confocal laser microscope scanners can be used to record the three-dimensional structure of specimens. A number of thin sections (approx. 1 µm thick) can be recorded by a repeated process of image scanning and refocusing of the microscope. We have used a confocal microscope scanner in a number of feasibility studies to investigate its possibilities and limitations. It has proved to be well suited for examining fluorescent specimens with a complicated three-dimensional structure, such as nerve cells. It has also been used to study orchid seeds, as well as cell colonies, greatly facilitating evaluation of such specimens. Scanning of the specimens is performed by a focused laser beam that is deflected by rotating mirrors, and the reflected or fluorescent light from the specimen is detected. The specimen thus remains stationary during image scanning, and is only moved stepwise in the vertical direction for refocusing between successive sections. The scanned images consist of 256*256 or 512*512 pixels, each pixel containing 8 bits of data. After a scanning session a large number of digital images, representing consecutive sections of the specimen, are stored on a disk memory. In a typical case 200 such 256*256 images are stored. To display and process this information in a meaningful way requires both appropriate software and a powerful computer. The computer used is a 32-bit minicomputer equipped with an array processor (FPS 100). The necessary software was developed at our department.

  2. Three-axis electron-beam test facility

    NASA Astrophysics Data System (ADS)

    Dayton, J. A., Jr.; Ebihara, B. T.

    1981-03-01

An electron beam test facility, which consists of a precision multidimensional manipulator built into an ultra-high-vacuum bell jar, was designed, fabricated, and operated at Lewis Research Center. The position within the bell jar of a Faraday cup, which samples current in the electron beam under test, is controlled by the manipulator. Three orthogonal axes of motion are controlled by stepping motors driven by digital indexers, and the positions are displayed on electronic totalizers. In the transverse directions, the limits of travel are approximately ±2.5 cm from the center with a precision of 2.54 micron (0.0001 in.); in the axial direction, approximately 15.0 cm of travel are permitted with an accuracy of 12.7 micron (0.0005 in.). In addition, two manually operated motions are provided: the pitch and yaw of the Faraday cup with respect to the electron beam can be adjusted to within a few degrees. The current is sensed by pulse transformers and the data are processed by a dual-channel boxcar averager with a digital output. The beam tester can be operated manually or it can be programmed for automated operation. In the automated mode, the beam tester is controlled by a microcomputer (installed at the test site) which communicates with a minicomputer at the central computing facility. The data are recorded and later processed by computer to obtain the desired graphical presentations.

  3. Integration and software for thermal test of heat rate sensors. [space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Wojciechowski, C. J.; Shrider, K. R.

    1982-01-01

A minicomputer-controlled radiant test facility is described which was developed and calibrated in an effort to verify analytical thermal models of instrumentation islands installed aboard the space shuttle external tank to measure thermal flight parameters during ascent. Software was provided for the facility as well as for development tests on the SRB actuator tail stock. Additional testing was conducted with the test facility to determine the temperature, heat flux rate, and loads required to effect a change of color in the ET's external paint. This requirement resulted from the review of photographs taken of the ET at separation from the orbiter, which showed that 75% of the external tank paint coating had not changed from its original white color. The paint on the remaining 25% of the tank was either brown or black, indicating that it had degraded due to heating or that the spray-on foam insulation had receded in these areas. The operational capability of the facility as well as the various tests which were conducted and their results are discussed.

  4. Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function

    SciTech Connect

    Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

    1987-06-01

A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also has the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
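The deconvolution step above recovers the liver's impulse response h(t) from the measured input (blood) and output (liver) time-activity curves, after which the MTT is the first moment of h. A minimal numerical sketch, assuming discrete, noise-free curves with unit time steps (real scintigraphic data would need smoothing or regularization, which this illustration omits):

```python
import numpy as np

def deconvolve(input_curve, output_curve):
    """Recover the impulse response h from output = input (*) h by solving the
    lower-triangular (Toeplitz) discrete-convolution system. Noise-free sketch only."""
    n = len(input_curve)
    A = np.array([[input_curve[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)])
    return np.linalg.solve(A, np.asarray(output_curve, dtype=float))

def mean_transit_time(h, dt=1.0):
    """MTT as the first moment of the impulse response (area-normalized)."""
    t = np.arange(len(h)) * dt
    return float(np.sum(t * h) / np.sum(h))
```

Simulating the bolus injection, as the abstract describes, is exactly what h represents: the liver's response had the tracer been delivered instantaneously to its afferent blood supply.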

  5. An interferometric strain-displacement measurement system

    NASA Technical Reports Server (NTRS)

    Sharpe, William N., Jr.

    1989-01-01

    A system for measuring the relative in-plane displacement over a gage length as short as 100 micrometers is described. Two closely spaced indentations are placed in a reflective specimen surface with a Vickers microhardness tester. Interference fringes are generated when they are illuminated with a He-Ne laser. As the distance between the indentations expands or contracts with applied load, the fringes move. This motion is monitored with a minicomputer-controlled system using linear diode arrays as sensors. Characteristics of the system are: (1) gage length ranging from 50 to 500 micrometers, but 100 micrometers is typical; (2) least-count resolution of approximately 0.0025 micrometer; and (3) sampling rate of 13 points per second. In addition, the measurement technique is non-contacting and non-reinforcing. It is useful for strain measurements over small gage lengths and for crack opening displacement measurements near crack tips. This report is a detailed description of a new system recently installed in the Mechanisms of Materials Branch at the NASA Langley Research Center. The intent is to enable a prospective user to evaluate the applicability of the system to a particular problem and assemble one if needed.
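For this class of interferometric strain/displacement gage, the standard data reduction relates the fringe-order shifts of the two patterns to the relative displacement of the indentations; averaging the two patterns cancels rigid-body motion. A hedged sketch (the relation is the textbook ISDG equation; the observation angle here is an assumed illustrative value, and the 0.6328 µm wavelength is that of the He-Ne laser mentioned above):

```python
import math

def displacement_um(dm1, dm2, wavelength_um=0.6328, alpha_deg=42.0):
    """Relative in-plane displacement (micrometers) from the fringe-order changes
    dm1, dm2 of the two fringe patterns observed at +/- alpha from the normal:
        delta_d = (lambda / (2 sin alpha)) * (dm1 + dm2)
    Averaging the two patterns removes rigid-body motion of the specimen."""
    return (wavelength_um / (2.0 * math.sin(math.radians(alpha_deg)))) * (dm1 + dm2)

# Strain over a 100-micrometer gage length for a given fringe shift (illustrative):
strain = displacement_um(0.5, 0.5) / 100.0
```

The linear diode arrays in the system track the fringe positions, so dm1 and dm2 are measured as continuous (fractional) fringe shifts rather than integer counts.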

  6. High strain rate properties of unidirectional composites, part 1

    NASA Technical Reports Server (NTRS)

    Daniel, I. M.

    1991-01-01

Experimental methods were developed for testing and characterization of composite materials at strain rates ranging from quasi-static to over 500 s^-1. Three materials were characterized, two graphite/epoxies and a graphite/S-glass/epoxy. Properties were obtained by testing thin rings 10.16 cm (4 in.) in diameter, 2.54 cm (1 in.) wide, and six to eight plies thick under internal pressure. Unidirectional 0 degree, 90 degree, and 10 degree off-axis rings were tested to obtain longitudinal, transverse, and in-plane shear properties. In the dynamic tests internal pressure was applied explosively through a liquid and the pressure was measured with a calibrated steel ring. Strains in the calibration and specimen rings were recorded with a digital processing oscilloscope. The data were processed and the equation of motion solved numerically by the minicomputer attached to the oscilloscope. Results were obtained and plotted in the form of dynamic stress-strain curves. Longitudinal properties, which are governed by the fibers, do not vary much with strain rate, with only a moderate (up to 20 percent) increase in modulus. Transverse modulus and strength increase sharply with strain rate, reaching values up to three times the static values. The in-plane shear modulus and shear strength increase noticeably with strain rate, by up to approximately 65 percent. In all cases ultimate strains do not vary significantly with strain rate.
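The "equation of motion" reduction for such ring tests can be sketched with the textbook thin-ring approximation (an illustration of the standard approach, not necessarily the models used in the report): the dynamic hoop stress is the static term p·r/h minus an inertia correction, since part of the applied pressure accelerates the ring radially.

```python
def hoop_stress(p, r, h, rho, strain_accel):
    """Dynamic hoop stress in a thin ring of radius r, wall thickness h, density rho,
    under internal pressure p. From the radial equation of motion
        rho * h * u_ddot = p - sigma * h / r,  with u = r * strain:
        sigma = p * r / h - rho * r**2 * strain_accel
    The inertia term subtracts because outward acceleration consumes pressure."""
    return p * r / h - rho * r ** 2 * strain_accel

# Static check with illustrative values: 1 MPa on a 50.8 mm radius, 1 mm wall ring.
sigma_static = hoop_stress(1.0e6, 0.0508, 0.001, 1600.0, 0.0)  # Pa
```

In the actual tests, the strain history from the specimen gages supplies strain_accel by differentiation, and the calibrated steel ring supplies p.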

  7. Text processing for technical reports (direct computer-assisted origination, editing, and output of text)

    SciTech Connect

    De Volpi, A.; Fenrick, M. R.; Stanford, G. S.; Fink, C. L.; Rhodes, E. A.

    1980-10-01

Documentation often is a primary residual of research and development. Because of this important role, and because of the large amount of time consumed in generating technical reports, particularly those containing formulas and graphics, an existing data-processing computer system has been adapted so as to provide text-processing of technical documents. Emphasis has been on accuracy, turnaround time, and time savings for staff and secretaries, for the types of reports normally produced in the reactor development program. The computer-assisted text-processing system, called TXT, has been implemented to benefit primarily the originator of technical reports. The system is of particular value to professional staff, such as scientists and engineers, who have responsibility for generating much correspondence or lengthy, complex reports or manuscripts, especially if prompt turnaround and high accuracy are required. It can produce text that contains special Greek or mathematical symbols. Written in FORTRAN and MACRO, the program TXT operates on a PDP-11 minicomputer under the RSX-11M multitask multiuser monitor. Peripheral hardware includes video terminals, electrostatic printers, and magnetic disks. Either data- or word-processing tasks may be performed at the terminals. The repertoire of operations has been restricted so as to minimize user training and memory burden. Secretarial staff may be readily trained to make corrections from annotated copy. Some examples of camera-ready copy are provided.

  8. A five-collector system for the simultaneous measurement of argon isotope ratios in a static mass spectrometer

    USGS Publications Warehouse

    Stacey, J.S.; Sherrill, N.D.; Dalrymple, G.B.; Lanphere, M.A.; Carpenter, N.V.

    1981-01-01

A system is described that utilizes five separate Faraday-cup collector assemblies, aligned along the focal plane of a mass spectrometer, to collect simultaneous argon ion beams at masses 36-40. Each collector has its own electrometer amplifier and analog-to-digital measuring channel, the outputs of which are processed by a minicomputer that also controls the mass spectrometer. The mass spectrometer utilizes a 90° sector magnetic analyzer with a radius of 23 cm, in which some degree of z-direction focusing is provided for all the ion beams by the fringe field of the magnet. Simultaneous measurement of the ion beams helps to eliminate mass-spectrometer memory as a significant source of measurement error during an analysis. Isotope ratios stabilize between 7 and 9 s after sample admission into the spectrometer, and thereafter changes in the measured ratios are linear, typically to within ±0.02%. Thus the multi-collector arrangement permits very short extrapolation times for computation of initial ratios, and also provides the advantages of simultaneous measurement of the ion currents, in that errors due to variations in ion beam intensity are minimized. A complete analysis takes less than 10 min, so that sample throughput can be greatly enhanced. In this instrument, the factor limiting analytical precision now lies in short-term apparent variations in the interchannel calibration factors. © 1981.
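Since the measured ratios evolve linearly after stabilization, the "initial ratio" computation described above amounts to a least-squares linear fit of ratio versus time, extrapolated back to the moment of sample admission (t = 0). A minimal sketch with illustrative numbers:

```python
import numpy as np

def initial_ratio(times_s, ratios):
    """Extrapolate a measured isotope ratio back to sample admission (t = 0)
    via a degree-1 least-squares fit; returns the intercept."""
    slope, intercept = np.polyfit(times_s, ratios, 1)
    return intercept

# Illustrative: a 40Ar/36Ar ratio drifting linearly after stabilization.
r0 = initial_ratio([10.0, 20.0, 30.0], [5.1, 5.2, 5.3])
```

With simultaneous collection, all five masses share the same fit window, so beam-intensity fluctuations cancel in the ratios rather than aliasing into the extrapolation, which is the advantage the abstract highlights.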

  9. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    NASA Astrophysics Data System (ADS)

    Vega, J.; Mollinedo, A.; López, A.; Pacios, L.; Dormido, S.

    1997-01-01

The data acquisition system for TJ-II will consist of a central computer, containing the data base of the device, and a set of independent systems (personal computers, embedded ones, workstations, minicomputers, PLCs, and microprocessor systems among others), controlling data collection, and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (µs-ms) real time requirements, soft (ms-s) real time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller devices (PLCs) used for diagnostic low level control will be linked among them through a fast serial link, the RS485 Profibus standard. One VME crate, running on the OS-9 real time operating system, will be assigned as a gateway, so as to connect the PLCs based systems with an Ethernet segment.

  10. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    SciTech Connect

    Vega, J.; Mollinedo, A.; Lopez, A.; Pacios, L.

    1997-01-01

The data acquisition system for TJ-II will consist of a central computer, containing the data base of the device, and a set of independent systems (personal computers, embedded ones, workstations, minicomputers, PLCs, and microprocessor systems among others), controlling data collection, and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (µs-ms) real time requirements, soft (ms-s) real time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller devices (PLCs) used for diagnostic low level control will be linked among them through a fast serial link, the RS485 Profibus standard. One VME crate, running on the OS-9 real time operating system, will be assigned as a gateway, so as to connect the PLCs based systems with an Ethernet segment. © 1997 American Institute of Physics.

  11. Expansion and redundancy of the Doublet III data-acquisition computer system

    SciTech Connect

    McHarg, B.B. Jr.

    1982-02-01

The Doublet III data acquisition computer system acquires, archives, and processes approximately two and one-half megabytes of data in a five-minute shot cycle of the Doublet III tokamak fusion experiment. The quantity of data acquired and the processing performed, in future fusion experiments and in Doublet III in particular, are certain to increase and will require greater reliability and efficiency in data handling. The current system consists of one minicomputer, four disk drives, two tape drives, one bulk memory unit, plus terminals, printers, and hard copiers. Recent experience indicates the need for improvement in two areas: user access to data and redundancy. The most heavily accessed data is from the most recent shot, and this data is accessed simultaneously for archiving, data summarization, and user purposes. To speed up acquisition, improve access to data, and reduce contention for the same data on disk, it is planned to retain the most recent shot in a greatly expanded bulk memory unit. Also, a second computer will be added, devoted solely to data acquisition. The software design of the system will, however, be such that the acquisition system could be run with only one computer, consistent with the philosophy of redundancy. Redundant software design also includes the capability to switch data files between disk drives, to switch archiving of data between tape drives, and to write data directly to disk should the bulk memory unit fail.

  12. SCAILET: An intelligent assistant for satellite ground terminal operations

    NASA Technical Reports Server (NTRS)

    Shahidi, A. K.; Crapo, J. A.; Schlegelmilch, R. F.; Reinhart, R. C.; Petrik, E. J.; Walters, J. L.; Jones, R. E.

    1993-01-01

    NASA Lewis Research Center has applied artificial intelligence to an advanced ground terminal. This software application is being deployed as an experimenter interface to the link evaluation terminal (LET) and was named Space Communication Artificial Intelligence for the Link Evaluation Terminal (SCAILET). The high-burst-rate (HBR) LET provides 30-GHz transmit and 20-GHz receive capability at 220 Mbps for wideband communications technology experiments with the Advanced Communication Technology Satellite (ACTS). The HBR-LET terminal consists of seven major subsystems. A minicomputer controls and monitors these subsystems through an IEEE-488 or RS-232 protocol interface. Programming scripts (test procedures defined by design engineers) configure the HBR-LET and permit data acquisition. However, the scripts are difficult to use, require a steep learning curve, are cryptic, and are hard to maintain. This discourages experimenters from utilizing the full capabilities of the HBR-LET system. An intelligent assistant module was developed as part of the SCAILET software. The intelligent assistant addresses critical experimenter needs by solving and resolving problems encountered when configuring the HBR-LET system. The intelligent assistant is a graphical user interface with an expert system running in the background. To further assist and familiarize an experimenter, an on-line hypertext documentation module was developed and included in the SCAILET software.

  13. Development of a toxin knowledge system. Annual summary report, 6 April 1988-1 December 1989

    SciTech Connect

    Trammel, H.L.

    1990-10-20

    To provide rapidly accessible, up-to-date knowledge about low-molecular-weight toxins, the development of a Toxin Knowledge System (TKS) was begun. This system was designed to extract facts from published literature using structured abstracting techniques, store these facts in a standard knowledge structure, control the terminology entered through the use of a standard nomenclature system, and subsequently generate standard monographs on individual toxins. The system was developed using a relational database management system and associated programming language on a minicomputer. The TKS application facilitates the extraction of needed information through a sophisticated user interface. Data from both journals and books can be entered into the system. The user enters citation data and is then prompted for information about each study design, each subject group and exposure regimen within each design, and how these factors interact to produce clinical effects. A clinical findings vocabulary was developed so that clinical effects could be easily recorded. Rudimentary monograph generation capabilities were included but need further refinement. To facilitate USAMRIID usage and further development, the TKS application, the clinical findings vocabulary, and the journal listing vocabulary have been ported to the MS-DOS platform.

  14. Solar system of the Doho Park Gymnasium

    SciTech Connect

    Takama, S.

    1982-01-01

    Conventional solar systems have operated on heating demand, irrespective of the amount of heat gained from the collector. Consequently, there have been cases when large amounts of auxiliary heat were necessary and the solar fraction was greatly reduced. In particular, there were many cases where a large system, intended for multipurpose use, was controlled simply by the fluid temperature alone. It often happened that the system could not be run by solar heat because of low temperatures, even though effective amounts of solar radiation were available. Therefore, a key point for the effective use of solar systems is to adjust the imbalance between heat gained from the collector and the heating load. The Doho Park Gymnasium project, which has Japan's largest collector area (1912 m², for multipurpose use), took this point into full consideration. In the system, anticipated heat collection, Qu, heat storage, Qs, and heating loads, Ql, are calculated every hour by a minicomputer. The heating load is then controlled in accordance with a prearranged priority order for effective utilization of solar energy and, to a lesser extent, of auxiliary heat.
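    The hourly dispatch decision the abstract describes can be sketched as follows. This is a minimal illustration, not the plant's actual control code; the function name, load names, and heat quantities are hypothetical.

```python
def dispatch_solar(qu, qs, loads):
    """Allocate anticipated solar collection (qu) plus usable storage (qs)
    to heating loads in a fixed priority order; any shortfall for a load
    is met by auxiliary heat.  Quantities are per-hour heat amounts
    (e.g. MJ); `loads` is a list of (name, demand) pairs, highest
    priority first."""
    available = qu + qs
    plan = []
    for name, demand in loads:
        solar = min(demand, available)   # serve as much as solar heat allows
        available -= solar
        plan.append((name, solar, demand - solar))  # (load, solar share, auxiliary share)
    return plan

# Hypothetical hour: 800 MJ anticipated collection, 200 MJ usable storage.
plan = dispatch_solar(800, 200, [("pool", 600), ("space heating", 500), ("hot water", 300)])
```

    Loads low in the priority list draw on auxiliary heat only after the solar budget is exhausted, which is the imbalance-adjustment idea the project emphasized.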

  15. Galatea: An Interactive Computer Graphics System for Movie and Video Analysis

    NASA Astrophysics Data System (ADS)

    Potel, Michael J.; MacKay, Steven A.; Sayre, Richard E.

    1983-03-01

    Extracting quantitative information from movie film and video recordings has always been a difficult process. The Galatea motion analysis system represents an application of some powerful interactive computer graphics capabilities to this problem. A minicomputer is interfaced to a stop-motion projector, a data tablet, and real-time display equipment. An analyst views a film and uses the data tablet to track a moving position of interest. Simultaneously, a moving point is displayed in an animated computer graphics image that is synchronized with the film as it runs. Using a projection CRT and a series of mirrors, this image is superimposed on the film image on a large front screen. Thus, the graphics point lies on top of the point of interest in the film and moves with it at cine rates. All previously entered points can be displayed simultaneously in this way, which is extremely useful in checking the accuracy of the entries and in avoiding omission and duplication of points. Furthermore, the moving points can be connected into moving stick figures, so that such representations can be transcribed directly from film. There are many other tools in the system for entering outlines, measuring time intervals, and the like. The system is equivalent to "dynamic tracing paper" because it is used as though it were tracing paper that can keep up with running movie film. We have applied this system to a variety of problems in cell biology, cardiology, biomechanics, and anatomy. We have also extended the system using photogrammetric techniques to support entry of three-dimensional moving points from two (or more) films taken simultaneously from different perspective views. We are also presently constructing a second, lower-cost, microcomputer-based system for motion analysis in video, using digital graphics and video mixing to achieve the graphics overlay for any composite video source image.

  16. Acoustic systems for the measurement of streamflow

    USGS Publications Warehouse

    Laenen, Antonius; Smith, Winchell

    1983-01-01

    The acoustic velocity meter (AVM), also referred to as an ultrasonic flowmeter, has been an operational tool for the measurement of streamflow since 1965. Very little information is available concerning AVM operation, performance, and limitations. The purpose of this report is to consolidate information in such a manner as to provide a better understanding about the application of this instrumentation to streamflow measurement. AVM instrumentation is highly accurate and nonmechanical. Most commercial AVM systems that measure streamflow use the time-of-travel method to determine a velocity between two points. The systems operate on the principle that the point-to-point upstream travel time of sound is longer than the downstream travel time, and this difference can be monitored and measured accurately by electronics. AVM equipment has no practical upper limit of measurable velocity if sonic transducers are securely placed and adequately protected. AVM systems used in streamflow measurement generally operate with a resolution of ±0.01 meter per second, but this is dependent on system frequency, path length, and signal attenuation. In some applications the performance of AVM equipment may be degraded by multipath interference, signal bending, signal attenuation, and variable streamline orientation. Presently used minicomputer systems, although expensive to purchase and maintain, perform well. Increased use of AVM systems probably will be realized as smaller, less expensive, and more conveniently operable microprocessor-based systems become readily available. Available AVM equipment should be capable of flow measurement in a wide variety of situations heretofore untried. New signal-detection techniques and communication linkages can provide additional flexibility to the systems so that operation is possible in more river and estuary situations.
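    The time-of-travel principle reduces to a simple reciprocal-difference formula: for an acoustic path of length L set at angle θ to the flow, v = L/(2 cos θ) · (1/t_down − 1/t_up), which conveniently cancels the speed of sound. A sketch (function name and example numbers are illustrative only):

```python
import math

def path_velocity(length_m, angle_deg, t_down_s, t_up_s):
    """Mean water velocity along the flow direction from reciprocal
    travel times of acoustic pulses on a path of `length_m` metres set
    at `angle_deg` to the streamflow (time-of-travel method).
    The speed of sound cancels out of the reciprocal difference."""
    return (length_m / (2.0 * math.cos(math.radians(angle_deg)))) * (
        1.0 / t_down_s - 1.0 / t_up_s)

# Hypothetical 100 m path at 45 degrees; with sound speed ~1480 m/s and a
# true velocity of 1 m/s, t_down = 100/(1480 + cos 45) and t_up = 100/(1480 - cos 45).
```

    The dependence on path length and timing resolution in this formula is exactly why the abstract's ±0.01 m/s figure varies with system frequency and path length.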

  17. Research, development and demonstration of nickel-zinc batteries for electric vehicle propulsion. Annual report, 1979. [70 W/lb]

    SciTech Connect

    Not Available

    1980-06-01

    This second annual report under Contract No. 31-109-39-4200 covers the period July 1, 1978 through August 31, 1979. The program demonstrates the feasibility of the nickel-zinc battery for electric vehicle propulsion. The program is divided into seven distinct but highly interactive tasks collectively aimed at the development and commercialization of nickel-zinc technology. These basic technical tasks are separator development, electrode development, product design and analysis, cell/module battery testing, process development, pilot manufacturing, and thermal management. A Quality Assurance Program has also been established. Significant progress has been made in the understanding of separator failure mechanisms, and a generic category of materials has been specified for applications requiring 300+ deep-discharge (100% DOD) cycles. Shape change has been reduced significantly. A design-ranking methodology has been generated, with the resulting hierarchy: cycle life cost, volumetric energy density, peak power at 80% DOD, gravimetric energy density, and sustained power. Generation I design full-sized 400-Ah cells have yielded in excess of 70 W/lb at 80% DOD. Extensive testing of cells, modules, and batteries is done in a minicomputer-based testing facility. The best life attained with electric-vehicle-size cell components is 315 cycles at 100% DOD (1.0-V cutoff voltage), while four-cell (approx. 6-V) module performance has been limited to about 145 deep-discharge cycles. The scale-up of processes for production of components and cells has progressed to facilitate component production rates of thousands per month. Progress in the area of thermal management has been significant, with the development of a model that accurately represents heat generation and rejection rates during battery operation. For the balance of the program, a cycle life of >500 has to be demonstrated in modules and full-sized batteries. 40 figures, 19 tables. (RWR)

  18. Eddy-current inspection for steam generator tubing program. Annual progress report for period ending December 31, 1979

    SciTech Connect

    Dodd, C.V.; Deeds, W.E.; McClung, R.W.

    1980-07-01

    Eddy-current methods provide the best in-service inspection of steam generator tubing, but present techniques can produce ambiguity because of the many independent variables that affect the signals. The current development program has used mathematical models and developed or modified computer programs to design optimum probes, instrumentation, and techniques for multifrequency, multiproperty examinations. Interactive calculations and experimental measurements have been made with the use of modular eddy-current instrumentation and a minicomputer. These establish the coefficients for the complex equations that define the values of the desired properties (and the attainable accuracy) despite changes in other significant variables. The computer programs for calculating the accuracy with which various properties can be measured indicate that the tubing wall thickness and the defect size can be measured much more accurately than is currently required, even when other properties are varying. Our experimental measurements have confirmed these results, although more testing is needed for all the different combinations of cases and different types of defects. To facilitate the extensive laboratory scanning of the matrix of specimens necessary to develop algorithms for detection and analysis for all the possible combinations of positions of flaws, tube supports, and probe coils, we have designed, constructed, and begun operation of a computer-controlled automatic positioner. We have demonstrated the ability to overcome the large signals produced by the edge of the tube supports. An advanced microcomputer has been designed, constructed, and installed in the instrumentation to control the examination and provide real-time calculations of the desired properties for display and recording during the scanning of the tube.

  19. COMPUTER MODEL OF TEMPERATURE DISTRIBUTION IN OPTICALLY PUMPED LASER RODS

    NASA Technical Reports Server (NTRS)

    Farrukh, U. O.

    1994-01-01

    Managing the thermal energy that accumulates within a solid-state laser material under active pumping is of critical importance in the design of laser systems. Earlier models that calculated the temperature distribution in laser rods were single dimensional and assumed laser rods of infinite length. This program presents a new model which solves the temperature distribution problem for finite dimensional laser rods and calculates both the radial and axial components of temperature distribution in these rods. The modeled rod is either side-pumped or end-pumped by a continuous or a single pulse pump beam. (At the present time, the model cannot handle a multiple pulsed pump source.) The optical axis is assumed to be along the axis of the rod. The program also assumes that it is possible to cool different surfaces of the rod at different rates. The user defines the laser rod material characteristics, determines the types of cooling and pumping to be modeled, and selects the time frame desired via the input file. The program contains several self checking schemes to prevent overwriting memory blocks and to provide simple tracing of information in case of trouble. Output for the program consists of 1) an echo of the input file, 2) diffusion properties, radius and length, and time for each data block, 3) the radial increments from the center of the laser rod to the outer edge of the laser rod, and 4) the axial increments from the front of the laser rod to the other end of the rod. This program was written in Microsoft FORTRAN77 and implemented on a Tandon AT with a 287 math coprocessor. The program can also run on a VAX 750 mini-computer. It has a memory requirement of about 147 KB and was developed in 1989.
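    The model's essential computation is the axisymmetric heat equation in (r, z), dT/dt = α(T_rr + T_r/r + T_zz) + q/(ρc). The abstract does not give the program's actual numerical scheme, so the sketch below shows a generic explicit finite-difference step for the interior of the rod; the function name and grid handling are illustrative assumptions.

```python
import numpy as np

def step_axisymmetric(T, alpha, dr, dz, dt, source=0.0):
    """One explicit finite-difference step of the axisymmetric heat
    equation dT/dt = alpha*(T_rr + T_r/r + T_zz) + source on a grid
    T[i, j], with i indexing radius and j indexing the rod axis.
    Boundary rows/columns (the cooled surfaces and pumped faces) are
    left for the caller to impose, since each surface may be cooled
    at a different rate."""
    Tn = T.copy()
    nr, nz = T.shape
    r = np.arange(nr) * dr
    for i in range(1, nr - 1):       # interior radial nodes (r > 0)
        for j in range(1, nz - 1):   # interior axial nodes
            d2r = (T[i + 1, j] - 2 * T[i, j] + T[i - 1, j]) / dr**2
            d1r = (T[i + 1, j] - T[i - 1, j]) / (2 * dr * r[i])
            d2z = (T[i, j + 1] - 2 * T[i, j] + T[i, j - 1]) / dz**2
            Tn[i, j] = T[i, j] + dt * (alpha * (d2r + d1r + d2z) + source)
    return Tn
```

    Solving both radial and axial components this way is what distinguishes the finite-rod model from the earlier one-dimensional, infinite-length models.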

  20. Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1994-01-01

    Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
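    The core porting step, rewriting sequential loops so a vectorizing compiler can exploit them, can be illustrated outside Fortran. The toy update below is hypothetical (the paper's grassland model is not specified at this level); NumPy's whole-array form stands in for the vector form the Cray compiler generates.

```python
import numpy as np

def grow_loop(biomass, rate, dt):
    """Element-by-element loop over grid cells, the style a sequential
    scalar code (e.g. on a VAX) might use.  Hypothetical growth update."""
    out = biomass.copy()
    for i in range(len(out)):
        out[i] += rate[i] * out[i] * dt
    return out

def grow_vectorized(biomass, rate, dt):
    """The same update expressed over whole arrays: the form a
    vectorizing compiler (or NumPy) can map onto vector hardware."""
    return biomass + rate * biomass * dt
```

    Both functions compute identical results; the difference is purely in how the work is expressed, which is what determines whether the vector units are used effectively.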

  1. FACSIM/MRS-1: Cask receiving and consolidation model documentation and user's guide

    SciTech Connect

    Lotz, T.L.; Shay, M.R.

    1987-06-01

    The Pacific Northwest Laboratory (PNL) has developed a stochastic computer model, FACSIM/MRS, to assist in assessing the operational performance of the Monitored Retrievable Storage (MRS) waste-handling facility. This report provides the documentation and user's guide for the component FACSIM/MRS-1, which is also referred to as the front-end model. The FACSIM/MRS-1 model simulates the MRS cask-receiving and spent-fuel consolidation activities. The results of the assessment of the operational performance of these activities are contained in a second report, FACSIM/MRS-1: Cask Receiving and Consolidation Performance Assessment (Lotz and Shay 1987). The model of MRS canister storage and shipping operations is presented in FACSIM/MRS-2: Storage and Shipping Model Documentation and User's Guide (Huber et al. 1987). The FACSIM/MRS model uses the commercially available FORTRAN-based SIMAN (SIMulation ANalysis language) simulation package (Pegden 1982). SIMAN provides a set of FORTRAN-coded commands, called block operations, which are used to build detailed models of continuous or discrete events that make up the operations of any process, such as the operation of an MRS facility. The FACSIM models were designed to run on either an IBM-PC or a VAX minicomputer. The FACSIM/MRS-1 model is flexible enough to collect statistics concerning almost any aspect of the cask receiving and consolidation operations of an MRS facility. The MRS model presently collects statistics on 51 quantities of interest during the simulation. SIMAN reports the statistics with two forms of output: a SIMAN simulation summary and an optional set of SIMAN output files containing data for use by more detailed post processors and report generators.

  2. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers during the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  3. Fact retrieval for the 1980s

    SciTech Connect

    Hampel, V.E.

    1981-07-01

    This report reviews prevailing methodologies of fact retrieval in science and technology and makes surprise-free projections for the decade to come: numeric databases are shown to overtake in size and number the large bibliographic collections. This is expected to lead toward more sophisticated, interactive data analysis techniques with graphical display options. The availability of low-cost intelligent computer terminals, micro- and minicomputers, is shown to make aggregation and post-processing of retrieved information from different sources readily possible. This capability may come into conflict with legal constraints and is bound to affect the traditional marketing of information. It will lead to the extraction of higher forms of intelligence from text and data. The user community is seen to shift from expert information specialists, who now act as middlemen, to the end-users of information. This less experienced user community will challenge the ingenuity of system designers to produce self-guiding, adaptive, and yet more sophisticated man-machine interfaces. The merging of wide-band digital communication networks with computer technologies will make it possible to interconnect computers, information centers, word processors, and other peripherals worldwide. Techniques of tabular and graphical fact retrieval are examined. The prospects of fact retrieval by voice, touch screens, and videotext are discussed. The potential of two unusual three-dimensional display techniques, the computer-generated time-resolved integral hologram and the projection of virtual data images into space, is discussed. Resulting problems are examined and some solutions given by example of experience with the integrated Technology Information System at the Lawrence Livermore National Laboratory.

  4. Side-scan sonar mapping: Pseudo-real-time processing and mosaicking techniques

    SciTech Connect

    Danforth, W.W.; Schwab, W.C.; O'Brien, T.F.; Karl, H.

    1990-05-01

    The US Geological Survey (USGS) surveyed 1,000 km{sup 2} of the continental shelf off San Francisco during a 17-day cruise, using a 120-kHz side-scan sonar system, and produced a digitally processed sonar mosaic of the survey area. The data were processed and mosaicked in real time using software developed at the Lamont-Doherty Geological Observatory and modified by the USGS, a substantial task due to the enormous amount of data produced by high-resolution side-scan systems. Approximately 33 megabytes of data were acquired every 1.5 hr. The real-time sonar images were displayed on a PC-based workstation and the data were transferred to a UNIX minicomputer where the sonar images were slant-range corrected, enhanced using an averaging method of desampling and a linear-contrast stretch, merged with navigation, geographically oriented at a user-selected scale, and finally output to a thermal printer. The hard-copy output was then used to construct a mosaic of the survey area. The final product of this technique is a UTM-projected map-mosaic of sea-floor backscatter variations, which could be used, for example, to locate appropriate sites for sediment sampling to ground truth the sonar imagery while still at sea. More importantly, reconnaissance surveys of this type allow for the analysis and interpretation of the mosaic during a cruise, thus greatly reducing the preparation time needed for planning follow-up studies of a particular area.
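    The slant-range correction mentioned above is simple geometry: assuming a flat sea floor, the horizontal ground range is the leg of a right triangle whose hypotenuse is the measured slant range and whose other leg is the towfish altitude. A sketch (function name and flat-bottom assumption are illustrative, not the USGS code):

```python
import math

def slant_to_ground_range(slant_m, altitude_m):
    """Convert a side-scan sonar slant range to horizontal ground range,
    assuming a flat sea floor and a sensor `altitude_m` above it."""
    if slant_m < altitude_m:
        return 0.0  # return is still within the water column; no sea-floor range yet
    return math.sqrt(slant_m**2 - altitude_m**2)
```

    Applying this per-sample correction across each ping removes the near-range compression that makes raw side-scan imagery geometrically distorted.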

  5. Automated performance monitoring and assessment for DCS digital systems

    NASA Astrophysics Data System (ADS)

    Jankauskas, L. E.; Mizesko, M.; Falzone, W. J.; Chace, B. D.; Wilson, G. G.

    1980-07-01

    As an aid in evaluating technical control techniques, an emulation facility that automatically performs the status monitoring, performance assessment, and fault isolation transmission control functions as they apply to the digital Defense Communications System (DCS) has been developed. This emulation facility is a multicomputer system which automatically monitors and isolates faults for digital transmission equipments. The status monitoring and performance assessment functions are performed by two processors, the Adaptive Channel Estimator (ACE) and an LSI 11/03, the composite being referred to as the CPMAS-D unit. When the software residing in the CPMAS-D unit detects a monitor point transition, it transmits the monitor point information to the CPMAS Emulator, a PDP 11/60 minicomputer. These messages, called exception reports, enable the CPMAS Emulator to perform its prime mission: fault isolation. A unique fault isolation algorithm has been developed for test with this emulation facility. The algorithm consists of three discrete steps. First, the equipment alarms are mapped into their effect upon each transmission path (link, supergroup, group, or channel). Second, the stations with the faulty equipment are located by deleting the impact of sympathetic alarms. Third, the faulty equipment is identified using the equipment alarm status. Testing of the fault isolation algorithm is enhanced by an emulated network consisting of up to 16 stations, 2048 equipments, and two nodal control areas. Monitor point simulators and T1-4000 multiplexers, which provide simulated and real time inputs to two CPMAS-D units, are also part of the emulation facility. Technical control terminals are provided to evaluate man/machine operation in an automated technical control environment.
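    The three-step fault isolation algorithm can be sketched for a single transmission path. This toy version assumes (as a simplification not stated in the abstract) that a fault at one station raises sympathetic alarms at every downstream station on the path; names and data shapes are hypothetical.

```python
def isolate_fault(stations, alarms):
    """Toy version of the three-step CPMAS fault isolation for one
    transmission path.  `stations` is the ordered station list on the
    path; `alarms` maps station -> set of alarmed equipment names."""
    # Step 1: map alarms onto the path -- the affected stations are
    # those reporting any equipment alarm.
    alarming = [s for s in stations if alarms.get(s)]
    if not alarming:
        return None
    # Step 2: locate the faulty station by deleting the impact of
    # sympathetic alarms, i.e. keep only the most upstream one.
    faulty_station = alarming[0]
    # Step 3: identify the faulty equipment from that station's alarms.
    return faulty_station, sorted(alarms[faulty_station])
```

    The real algorithm handles multiple hierarchy levels (link, supergroup, group, channel) and two nodal control areas, but the upstream-first elimination of sympathetic alarms is the same idea.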

  6. An Experimental Digital Image Processor

    NASA Astrophysics Data System (ADS)

    Cok, Ronald S.

    1986-12-01

    A prototype digital image processor for enhancing photographic images has been built in the Research Laboratories at Kodak. This image processor implements a particular version of each of the following algorithms: photographic grain and noise removal, edge sharpening, multidimensional image segmentation, image-tone reproduction adjustment, and image-color saturation adjustment. All processing, except for segmentation and analysis, is performed by massively parallel and pipelined special-purpose hardware. This hardware runs at 10 MHz and can be adjusted to handle any size digital image. The segmentation circuits run at 30 MHz. The segmentation data are used by three single-board computers for calculating the tone-scale adjustment curves. The system, as a whole, has the capability of completely processing 10 million three-color pixels per second. The grain removal and edge enhancement algorithms represent the largest part of the pipelined hardware, operating at over 8 billion integer operations per second. The edge enhancement is performed by unsharp masking, and the grain removal is done using a collapsed Walsh-Hadamard transform filtering technique (U.S. Patent No. 4549212). These two algorithms can be realized using four basic processing elements, some of which have been implemented as VLSI semicustom integrated circuits. These circuits implement the algorithms with a high degree of efficiency, modularity, and testability. The digital processor is controlled by a Digital Equipment Corporation (DEC) PDP-11 minicomputer and can be interfaced to electronic printing and/or electronic scanning devices. The processor has been used to process over a thousand diagnostic images.

  7. Graphics processing, video digitizing, and presentation of geologic information

    SciTech Connect

    Sanchez, J.D.

    1990-02-01

    Computer users have unparalleled opportunities to use powerful desktop computers to generate, manipulate, analyze and use graphic information for better communication. Processing graphic geologic information on a personal computer like the Amiga used for the projects discussed here enables geoscientists to create and manipulate ideas in ways once available only to those with access to large budgets and large mainframe computers. Desktop video applications such as video digitizing and powerful graphic processing application programs add a new dimension to the creation and manipulation of geologic information. Videotape slide shows and animated geology give geoscientists new tools to examine and present information. Telecommunication programs such as ATalk III, which can be used as an all-purpose telecommunications program or can emulate a Tektronix 4014 terminal, allow the user to access Sun and Prime minicomputers and manipulate graphic geologic information stored there. Graphics information displayed on the monitor screen can be captured and saved in the standard Amiga IFF graphic format. These IFF files can be processed using image processing programs such as Butcher. Butcher offers edge mapping, resolution conversion, color separation, false colors, toning, positive-negative reversals, etc. Multitasking and easy expansion that includes IBM-XT and AT co-processing offer unique capabilities for graphic processing and file transfer between Amiga-DOS and MS-DOS. Digital images produced by satellites and airborne scanners can be analyzed on the Amiga using the A-Image processing system developed by the CSIRO Division of Mathematics and Statistics and the School of Mathematics and Computing at Curtin University, Australia.

  8. The Automatic Remote Geomagnetic Observatory System (ARGOS) operated in the U.K. by the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Riddick, J. C.; Greenwood, A. C.; Stuart, W. F.

    ARGOS, an instrument and recording system for performing standard geomagnetic observatory functions at the three U.K. observatories, is described. Operations are controlled by a minicomputer at each observatory communicating by modem through the public telephone system to a central computer in Edinburgh. A fluxgate magnetometer provides 10-s samples of the variation field at ±1 nT resolution. These values are filtered to produce 1-min values (centred on the minute), which in turn are used to compute hourly mean values. These and other derivatives of the raw data are stored in the observatory computer and transmitted to Edinburgh daily by operator command, where they are transferred to a data file that can be accessed by users via the Joint Academic Network (JANET). Observatory data are available to users by this means within 24 h and can be made available in near real time by special arrangement. ARGOS performs standardization measurements of the values of the field components remotely using a proton magnetometer employing standard techniques for absolute observations. Comparison of these Baseline Reference Measurements with manual absolute observations shows them to be acceptable for baseline adoption. In the first year of operation it has been established that ARGOS produces data of comparable quality to the classical standards expected from the U.K. observatories. Data loss has been less than 1%. Further automation of routine procedures (e.g. magnetogram plotting, editing, baseline adoption, and the adjustment of minute values day by day) will be the focus of attention in the next 2 years.
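    The two-stage data reduction (10-s samples to 1-min values to hourly means) can be sketched as follows. The actual ARGOS filter is a centred digital filter; plain block averaging is used here purely for illustration, and the function names are hypothetical.

```python
def minute_values(samples_10s):
    """Reduce 10-s field samples (nT) to 1-min values by block
    averaging -- six 10-s samples per minute.  (ARGOS uses a centred
    digital filter; a plain mean stands in for it here.)"""
    assert len(samples_10s) % 6 == 0
    return [sum(samples_10s[i:i + 6]) / 6 for i in range(0, len(samples_10s), 6)]

def hourly_mean(minute_vals):
    """Hourly mean value computed from sixty 1-min values."""
    assert len(minute_vals) == 60
    return sum(minute_vals) / 60
```

    Archiving only the derived series keeps the daily modem transfer to Edinburgh small while preserving the standard observatory products.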

  9. Design of a real-time wind turbine simulator using a custom parallel architecture

    NASA Technical Reports Server (NTRS)

    Hoffman, John A.; Gluck, R.; Sridhar, S.

    1995-01-01

The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for analysis of wind energy systems in real time. The new processor has been named: the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel. The modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CU's) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CU's are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU allows several tasks to be performed in each cycle, including an I/O operation and a combined multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CU's are interfaced to each other and to other portions of the simulation using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CU's can be added in any number to share a given computational load. This flexible bus feature is very different from many other parallel processors, which usually have a throughput limit because of rigid bus architecture.
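The 'patchable bus' idea above can be made concrete with a small sketch: model each serial bus as a FIFO, and connect computational units by naming which buses they read and write, much as analog-computer patch cords route signals. All class and variable names here are invented for illustration; the sketch shows only the topology idea, not WEST-3's actual hardware behavior.

```python
from collections import deque

class Bus:
    """A serial bus modeled as a simple FIFO."""
    def __init__(self):
        self.fifo = deque()
    def write(self, v):
        self.fifo.append(v)
    def read(self):
        return self.fifo.popleft()

class CU:
    """A computational unit patched between an input and an output bus."""
    def __init__(self, fn, in_bus, out_bus):
        self.fn, self.in_bus, self.out_bus = fn, in_bus, out_bus
    def cycle(self):
        # one machine cycle: an I/O read, a compute step, a store
        self.out_bus.write(self.fn(self.in_bus.read()))

# Patch two CUs in series: a scale stage feeding an offset stage.
b0, b1, b2 = Bus(), Bus(), Bus()
cu1 = CU(lambda x: 2.0 * x, b0, b1)
cu2 = CU(lambda x: x + 1.0, b1, b2)
b0.write(3.0)
cu1.cycle(); cu2.cycle()
result = b2.read()   # 2*3 + 1 = 7.0
```

Because the connections are just bus references, CUs can be re-patched into any configuration or added in any number, which is the flexibility the abstract attributes to WEST-3.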

  10. Computer-generated speech

    SciTech Connect

    Aimthikul, Y.

    1981-12-01

    This thesis reviews the essential aspects of speech synthesis and distinguishes between the two prevailing techniques: compressed digital speech and phonemic synthesis. It then presents the hardware details of the five speech modules evaluated. FORTRAN programs were written to facilitate message creation and retrieval with four of the modules driven by a PDP-11 minicomputer. The fifth module was driven directly by a computer terminal. The compressed digital speech modules (T.I. 990/306, T.S.I. Series 3D and N.S. Digitalker) each contain a limited vocabulary produced by the manufacturers while both the phonemic synthesizers made by Votrax permit an almost unlimited set of sounds and words. A text-to-phoneme rules program was adapted for the PDP-11 (running under the RSX-11M operating system) to drive the Votrax Speech Pac module. However, the Votrax Type'N Talk unit has its own built-in translator. Comparison of these modules revealed that the compressed digital speech modules were superior in pronouncing words on an individual basis but lacked the inflection capability that permitted the phonemic synthesizers to generate more coherent phrases. These findings were necessarily highly subjective and dependent on the specific words and phrases studied. In addition, the rapid introduction of new modules by manufacturers will necessitate new comparisons. However, the results of this research verified that all of the modules studied do possess reasonable quality of speech that is suitable for man-machine applications. Furthermore, the development tools are now in place to permit the addition of computer speech output in such applications.
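A rules-based text-to-phoneme translator of the kind adapted for the PDP-11 above can be illustrated with a toy longest-match scheme. The rules below are a small invented subset; real rule sets (such as the NRL text-to-phoneme rules that many such programs derived from) are far larger and context-sensitive.

```python
# Toy text-to-phoneme translator: ordered letter patterns, longest first,
# mapped to phoneme codes. Illustrative only; not the thesis's rule set.

RULES = [
    ("th", "TH"), ("sh", "SH"), ("ee", "IY"),
    ("a", "AE"), ("e", "EH"), ("i", "IH"), ("o", "AA"), ("u", "AH"),
    ("b", "B"), ("d", "D"), ("f", "F"), ("g", "G"), ("h", "HH"),
    ("k", "K"), ("l", "L"), ("m", "M"), ("n", "N"), ("p", "P"),
    ("r", "R"), ("s", "S"), ("t", "T"), ("v", "V"),
]

def to_phonemes(word):
    word = word.lower()
    out, i = [], 0
    while i < len(word):
        for pat, ph in RULES:        # first (longest) matching pattern wins
            if word.startswith(pat, i):
                out.append(ph)
                i += len(pat)
                break
        else:
            i += 1                   # skip letters with no rule
    return out

# to_phonemes("sheet") -> ['SH', 'IY', 'T']
```

The phoneme string would then be sent to a phonemic synthesizer such as the Votrax module, which is what gives such systems their unlimited vocabulary at the cost of less natural pronunciation.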

  11. Cyclic axial-torsional deformation behavior of a cobalt-base superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1992-01-01

Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic, high-temperature deformation behavior of a wrought cobalt-based superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73.
The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase and out-of-phase axial-torsional fatigue tests. These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for cyclic, axial stress-strain and shear stress-engineering shear strain curves.
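The loading parameters above (lambda as the ratio of engineering shear strain amplitude to axial strain amplitude, and phi as the phase angle between the two waveforms) can be sketched together with the standard von Mises equivalent strain relation, eps_eq = sqrt(eps^2 + gamma^2/3). The function names, amplitudes, and the neglect of Poisson effects are illustrative assumptions, not the paper's actual analysis.

```python
import math

def strains(t, eps_amp, lam, phi_deg, period=1.0):
    """Axial strain eps and engineering shear strain gamma at time t,
    with shear amplitude lam*eps_amp lagging by phase angle phi_deg."""
    w = 2.0 * math.pi / period
    eps = eps_amp * math.sin(w * t)
    gamma = lam * eps_amp * math.sin(w * t - math.radians(phi_deg))
    return eps, gamma

def von_mises_eq_strain(eps, gamma):
    # standard relation for combined axial/shear strain (Poisson neglected)
    return math.sqrt(eps ** 2 + gamma ** 2 / 3.0)

# In-phase test (phi = 0) with lambda = sqrt(3) ~ 1.73, one of the
# values studied: axial and shear contributions are then equal.
eps, gamma = strains(0.25, 0.005, math.sqrt(3.0), 0.0)
eq = von_mises_eq_strain(eps, gamma)   # = sqrt(2) * 0.005 at peak
```

Note that lambda = 1.73 in the test matrix is sqrt(3), the value at which axial and shear terms contribute equally to the von Mises equivalent, which is presumably why it appears in both the in-phase and out-of-phase series.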

  12. Cyclic Axial-Torsional Deformation Behavior of a Cobalt-Base Superalloy

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.; Kalluri, Sreeramesh

    1995-01-01

The cyclic, high-temperature deformation behavior of a wrought cobalt-base superalloy, Haynes 188, is investigated under combined axial and torsional loads. This is accomplished through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue database has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gage section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. The fatigue behavior of Haynes 188 at 760 C under axial, torsional, and combined axial-torsional loads and the monotonic and cyclic deformation behaviors under axial and torsional loads have been previously reported. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (the ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase (lambda = 1.73 and phi = 0) and out-of-phase (lambda = 1.73 and phi = 90 deg) axial-torsional fatigue tests. 
These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for cyclic, axial stress-strain and shear stress-engineering shear strain curves.

  13. LOOK- A TEXT FILE DISPLAY PROGRAM

    NASA Technical Reports Server (NTRS)

    Vavrus, J. L.

    1994-01-01

The LOOK program was developed to permit a user to examine a text file in a pseudo-random access manner. Many engineering and scientific programs generate large amounts of printed output. Often this output needs to be examined in only a few places. On mini-computers (like the DEC VAX) high-speed printers are usually at a premium. One alternative is to save the output in a text file and examine it with a text editor. The slowness of a text editor, the possibility of inadvertently changing the output, and other factors make this an unsatisfactory solution. The LOOK program provides the user with a means of rapidly examining the contents of an ASCII text file. LOOK's basis of operation is to open the text file for input only and then access it in a block-wise fashion. LOOK handles the text formatting and displays the text lines on the screen. The user can move forward or backward in the file by a given number of lines or blocks. LOOK also provides the ability to "scroll" the text at various speeds in the forward or backward directions. The user can perform a search for a string (or a combination of up to 10 strings) in a forward or backward direction. Also, user selected portions of text may be extracted and submitted to print or placed in a file. Additional features available to the LOOK user include: cancellation of an operation with a keystroke, user definable keys, switching mode of operation (e.g. 80/132 column), on-line help facility, trapping broadcast messages, and the ability to spawn a sub-process to carry out DCL functions without leaving LOOK. The LOOK program is written in FORTRAN 77 and MACRO ASSEMBLER for interactive execution and has been implemented on a DEC VAX computer using VAX/VMS with a central memory requirement of approximately 430K of 8 bit bytes. LOOK operation is terminal independent but will take advantage of the features of the DEC VT100 terminal if available. LOOK was developed in 1983.

  14. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. He was hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly became one of five principal investigators of the original NASA Space Science Working Group. 
In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). 
This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight, thereby supporting its possible composition as graphite. Code was the recipie

  15. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L., (Edited By)

    1993-01-01

INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. 
PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. 
The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a
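The idea of linked data sets organized along relational principles, as in the urban hazards data base above, can be sketched with two tables sharing a key. All table and column names here are invented for illustration (the abstract does not describe the actual 16 file structures), and SQLite stands in for the dBASE III Plus software actually used.

```python
import sqlite3

# Hypothetical sketch: a site table and a dependent measurement table
# linked by site_id, so per-site attributes can be joined on demand.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE site (
        site_id INTEGER PRIMARY KEY, name TEXT, lat REAL, lon REAL);
    CREATE TABLE shear_wave (
        site_id INTEGER REFERENCES site(site_id),
        depth_m REAL, vs_mps REAL);
""")
db.execute("INSERT INTO site VALUES (1, 'SLC-01', 40.76, -111.89)")
db.executemany("INSERT INTO shear_wave VALUES (?, ?, ?)",
               [(1, 5.0, 180.0), (1, 15.0, 320.0)])

# Join the linked data sets: mean shear-wave velocity per site.
row = db.execute("""
    SELECT s.name, AVG(w.vs_mps)
    FROM site s JOIN shear_wave w ON w.site_id = s.site_id
    GROUP BY s.site_id
""").fetchone()
# row == ('SLC-01', 250.0)
```

The benefit the abstract points to is exactly this: because the data sets are linked by keys rather than duplicated, the same site records can feed many different ground-response analyses.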

  16. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    NASA Technical Reports Server (NTRS)

Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
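The forward-chaining cycle described above, facts asserted, rule conditions matched, conclusions asserted as new facts, can be illustrated with a toy engine. Real CLIPS compiles the rules into a Rete network so each new fact is matched incrementally; this sketch simply re-checks every rule until none fires, which gives the same result far less efficiently.

```python
# Toy forward-chaining engine (illustrative; not CLIPS syntax or Rete).

def forward_chain(facts, rules):
    """facts: set of fact strings; rules: list of (conditions, conclusion).
    Fires rules whose conditions all hold until no new fact is asserted."""
    facts = set(facts)
    fired = True
    while fired:
        fired = False
        for conds, concl in rules:
            if concl not in facts and all(c in facts for c in conds):
                facts.add(concl)   # assert the conclusion as a new fact
                fired = True
    return facts

rules = [
    (("walks-like-duck", "quacks-like-duck"), "duck"),
    (("duck",), "bird"),
]
result = forward_chain({"walks-like-duck", "quacks-like-duck"}, rules)
# 'duck' fires first, which then lets the second rule derive 'bird'
```

The chaining in the last line is the key behavior: asserting one conclusion can satisfy the conditions of other rules, so inference propagates without any procedural control flow.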

  17. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.

  18. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.

  19. WE-G-16A-01: Evolution of Radiation Treatment Planning

    SciTech Connect

    Rothenberg, L; Mohan, R; Van Dyk, J; Fraass, B; Bortfeld, T

    2014-06-15

Welcome and Introduction - Lawrence N. Rothenberg This symposium is one of a continuing series of presentations at AAPM Annual Meetings on the historical aspects of medical physics, radiology, and radiation oncology that have been organized by the AAPM History Committee. Information on previous presentations including “Early Developments in Teletherapy” (Indianapolis 2013), “Historical Aspects of Cross-Sectional Imaging” (Charlotte 2012), “Historical Aspects of Brachytherapy” (Vancouver 2011), “50 Years of Women in Medical Physics” (Houston 2008), and “Roentgen's Early Investigations” (Minneapolis 2007) can be found in the Education Section of the AAPM Website. The Austin 2014 History Symposium will be on “Evolution of Radiation Treatment Planning.” Overview - Radhe Mohan Treatment planning is one of the most critical components in the chain of radiation therapy of cancers. Treatment plans of today contain a wide variety of sophisticated information conveying the potential clinical effectiveness of the designed treatment to practitioners. Examples of such information include dose distributions superimposed on three- or even four-dimensional anatomic images; dose volume histograms, dose, dose-volume and dose-response indices for anatomic structures of interest; etc. These data are used for evaluating treatment plans and for making treatment decisions. The current state-of-the-art has evolved from the 1940s era when the dose to the tumor and normal tissues was estimated approximately by manual means. However, the symposium will cover the history of the field from the late-1950's, when computers were first introduced for treatment planning, to the present state involving the use of high performance computing and advanced multi-dimensional anatomic, functional and biological imaging, focusing only on external beam treatment planning. 
The symposium will start with a general overview of the treatment planning process, including imaging, structure delineation, assignment of dose requirements, consideration of uncertainties, selection of beam configurations and shaping of beams, and calculation, optimization, and evaluation of dose distributions. This will be followed by three presentations covering the evolution of treatment planning, which parallels the evolution of computers, the availability of advanced volumetric imaging, and the development of novel technologies such as dynamic multi-leaf collimators and online image guidance. This evolution will be divided into three distinct periods: through the 1970s, the 2D era; from 1980 to the mid-1990s, the 3D era; and from the mid-1990s to today, the IMRT era. “When the World was Flat: The Two-Dimensional Radiation Therapy Era” - Jacob Van Dyk In the 2D era, anatomy was defined with the aid of solder wires, special contouring devices, and projection x-rays. Dose distributions were calculated manually from single-field, flat-surface isodoses on transparencies. Precalculated atlases of generic dose distributions were produced by the International Atomic Energy Agency. Massive time-shared mainframes and minicomputers were used to compute doses at individual points or dose distributions in a single plane. Beam shapes were generally rectangular, with wedges, missing-tissue compensators, and occasional blocks to shield critical structures. Dose calculations were measurement-based, or they used primary and scatter calculations based on scatter-air-ratio methodologies. Dose distributions were displayed on line printers as alphanumeric character maps or as isodose patterns made with pen plotters. More than Pretty Pictures: 3D Treatment Planning and Conformal Therapy - Benedick A. Fraass The introduction of computed tomography allowed the delineation of anatomy three-dimensionally and, supported partly by contracts from the National Cancer Institute, made possible the introduction and clinical use of 3D treatment planning, leading to the development and use of 3D conformal therapy in the 1980s. 3D computer graphics and 3D anatomical structure definitions made possible Beam's Eye View
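The scatter-air-ratio methodology mentioned above splits the dose at a point into a primary component (the zero-area tissue-air ratio, TAR) and a scatter component (the scatter-air ratio, SAR). A minimal sketch of that decomposition follows; the table values are hypothetical illustration numbers, not clinical data, and the function name is an assumption for this example:

```python
# Illustrative primary/scatter separation (scatter-air-ratio methodology).
# TAR(d, 0) is the zero-area tissue-air ratio (primary radiation only);
# SAR(d, r) = TAR(d, r) - TAR(d, 0) is the scatter contribution for a
# field of radius r at depth d. All numbers below are hypothetical.
TAR = {  # keyed by (depth_cm, field_radius_cm)
    (10, 0): 0.536,   # zero-area TAR: primary component
    (10, 5): 0.704,   # finite field: primary + scatter
}

def dose_at_point(dose_in_air, depth_cm, radius_cm):
    """Dose = dose in free space x (primary TAR + scatter-air ratio)."""
    primary = TAR[(depth_cm, 0)]
    sar = TAR[(depth_cm, radius_cm)] - TAR[(depth_cm, 0)]  # scatter component
    return dose_in_air * (primary + sar)

# 100 cGy in-air dose, 10 cm depth, 5 cm field radius:
print(round(dose_at_point(100.0, 10, 5), 1))  # 70.4
```

In a 2D-era planning program, the TAR/SAR tables were measured commissioning data, and the scatter term was typically built up by summing SAR contributions over sectors of an irregular field (Clarkson integration) rather than read from a single radius as in this sketch.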

  20. The ASC Sequoia Programming Model

    SciTech Connect

    Seager, M

    2008-08-06

In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operations to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era, as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector-unit utilization increasing incrementally. The other attribute of the Cray vector era at LLNL was that we developed, supported, and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating-system tools (e.g., batch system, job-control scripting, loaders, debuggers, editors, graphics utilities, you name it), and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s, two trends appeared that made the Cray vector programming model (described above, including both the hardware and system-software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and the attendant departmental minicomputers, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments.
The other interesting advance in the period was that systems were being developed with multiple 'cores' and were called Symmetric Multi-Processor or Shared-Memory Processor (SMP) systems. The parallel revolution had begun. The Laboratory started a small 'parallel processing project' in 1983 to study the new technology and its application to scientific computing, with four people: Tim Axelrod, Pete Eltgroth, Paul Dubois, and Mark Seager. Two years later, Eugene Brooks joined the team. This team focused on Unix and 'killer micro' SMPs. Indeed, Eugene Brooks was credited with coining the 'Killer Micro' term. After several generations of SMP platforms (e.g., the Sequent Balance 8000 with 8 33 MHz NS32032s, the Alliant FX/8 with 8 MC68020s and FPGA-based vector units, and finally the BBN Butterfly with 128 cores), it became apparent to us that the killer-micro revolution would indeed overtake Crays and that we definitely needed a new programming and systems model. The model developed by Mark Seager and Dale Nielsen focused on both the system aspects (Slide 3) and the code-development aspects (Slide 4). Although now succinctly captured in the two attached slides, at the time there was tremendous ferment in the research community as to which parallel programming model would emerge, dominate, and survive. In addition, we wanted a model that would provide portability between platforms of a single generation and longevity over multiple, and hopefully many, generations. Only after we developed the 'Livermore Model' and worked it out in considerable detail did it become obvious that what we came up with was the right approach. In a nutshell, the applications programming model of the Livermore Model posited that SMP parallelism would ultimately not scale indefinitely and that one would have to bite the bullet and implement MPI parallelism within the Integrated Design Codes.
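The layering the abstract describes, message passing between address spaces with shared-memory parallelism inside each one, can be sketched in miniature. This is an illustrative stand-in, not LLNL's actual code: Python queues play the role of MPI point-to-point messages, and threads play the role of SMP cores within a rank; all names are invented for the example.

```python
# Sketch of the hybrid "ranks outside, threads inside" decomposition.
# Queues stand in for MPI messages; threads stand in for SMP cores.
import threading
import queue

def rank_worker(rank, chunk, n_threads, outbox):
    """One 'MPI rank': sum its chunk with n_threads shared-memory threads,
    then send the partial result to 'rank 0' via a message queue."""
    partials = [0] * n_threads

    def smp_thread(tid):
        # Each thread sums a strided slice of the rank-local chunk
        # (shared memory within the rank).
        partials[tid] = sum(chunk[tid::n_threads])

    threads = [threading.Thread(target=smp_thread, args=(t,))
               for t in range(n_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    outbox.put((rank, sum(partials)))   # the "MPI_Send" step

def hybrid_sum(data, n_ranks=4, n_threads=2):
    """Domain-decompose data across ranks; reduce partial sums at 'rank 0'."""
    outbox = queue.Queue()
    workers = []
    for r in range(n_ranks):
        chunk = data[r::n_ranks]          # cyclic domain decomposition
        w = threading.Thread(target=rank_worker,
                             args=(r, chunk, n_threads, outbox))
        w.start()
        workers.append(w)
    for w in workers: w.join()
    # 'Rank 0' receives one message per rank and reduces them.
    return sum(outbox.get()[1] for _ in range(n_ranks))

print(hybrid_sum(list(range(1000))))  # 499500
```

The design point the Livermore Model settled, which this toy mirrors, is that the outer decomposition must be explicit message passing so the code scales past a single shared-memory box, while the inner level is free to exploit whatever SMP parallelism each generation of hardware offers.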
We also had a major emphasis on doing everything in a completely standards-based, portable methodology, with POSIX/Unix as the target environment. We decided against specialized libraries like STACKLIB for performance, but kept as many general-purpose, portable math libraries as were needed by the co