Deregt, M. P.; Dulfer, J. E.
This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases, interchangeability of minicomputers, software, storage and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.
In spite of the limitations of minicomputers, Western Washington State College (WWSC) has developed a useful interactive computing system based on a Model 7/32 Interdata minicomputer and the computer program PILOT. The major disadvantages of minicomputers are the difficulty of securing maintenance and the frequent reliance on a single language,…
CEMREL, Inc., St. Louis, MO.
This material describes two games, Minicomputer Tug-of-War and Minicomputer Golf. The Papy Minicomputer derives its name from Georges Papy, who invented and introduced it in the 1950's. The Minicomputer is seen as an abacus with the flavor of a computer in its schematic representation of numbers. Its manner of representation combines decimal…
Storaasli, O. O.; Foster, E. P.
Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to, and opportunity for, structural calculations have been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputer with that of a large mainframe computer for the solution of a wide range of finite element structural analysis problems.
Factors to be weighed when selecting a minicomputer system as the basis for an image analysis computer facility vary depending on whether the user organization procures a new computer or selects an existing facility to serve as an image analysis host. Some conditions not directly related to hardware or software should be considered, such as the flexibility of the computer center staff, their encouragement of innovation, and the availability of the host processor to a broad spectrum of potential user organizations. Particular attention must be given to: image analysis software capability; the facilities of a potential host installation; the central processing unit; the operating system and languages; main memory; disk storage; tape drives; hardcopy output; and other peripherals. The operational environment, accessibility, resource limitations, and operational support are also important. Charges made for program execution and data storage must likewise be examined.
Morris, C. F.
A small but very effective minicomputer-based speech processing system costing just over 30,000 dollars is described here. The hardware and software comprising the system are discussed as well as immediate and future research applications.
Storaasli, O. O.
Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides its design/drafting and finite element analysis capability, CAD/CAM provides options to produce an automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.
Hicks, R. M.; Szelazek, C. A.
A computer program developed for the automated design of low speed airfoils utilizes a generalized Joukowski method for aerodynamic analysis coupled with a conjugate gradient, penalty function, numerical optimization algorithm to give an efficient calculation technique for use with minicomputers. The program designs airfoils with a prescribed pressure distribution as well as those which minimize or maximize some aerodynamic force coefficient. At present the method is restricted to inviscid, incompressible flow. A typical design problem will execute in 4.5 hr on an HP 9830 minicomputer.
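The penalty-function approach this abstract describes can be sketched generically: fold the constraint into the objective, then re-minimize with an increasing penalty weight. Everything below (the toy objective, the constraint, the step sizes) is illustrative, and plain gradient descent stands in for the paper's conjugate-gradient search:

```python
def penalized(f, g, mu):
    """Exterior quadratic penalty: objective plus mu * max(0, g(x))**2."""
    return lambda x: f(x) + mu * max(0.0, g(x)) ** 2

def minimize_1d(obj, x0, lr, steps=2000, h=1e-6):
    """Crude 1-D gradient descent with a central-difference derivative,
    standing in for the paper's conjugate-gradient search."""
    x = x0
    for _ in range(steps):
        grad = (obj(x + h) - obj(x - h)) / (2.0 * h)
        x -= lr * grad
    return x

# Toy problem: minimize (x-3)^2 subject to x <= 2; constrained optimum is x = 2.
f = lambda x: (x - 3.0) ** 2
g = lambda x: x - 2.0                      # feasible when g(x) <= 0
x = 1.0
for mu in (1.0, 10.0, 100.0, 1000.0):      # tighten the penalty each pass
    x = minimize_1d(penalized(f, g, mu), x, lr=0.4 / (2.0 + 2.0 * mu))
```

Each pass warm-starts from the previous minimizer; as `mu` grows, the iterate approaches the constrained optimum from the infeasible side, which is characteristic of exterior penalty methods.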
This paper discusses minicomputer-based ILSs (integrated learning systems), i.e., computer-based systems of hardware and software. An example of a minicomputer-based system in a school district (a composite of several actual districts) considers hardware, staffing, scheduling, reactions, problems, and training for a subskill-oriented reading…
Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.
Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.
R. J. O'Connell; WILLIAM A. KOCSIS; ROBERT L. SCHOENFELD
The use of a minicomputer on-line to identify and measure the time series of different nerve impulses mixed in a single recording channel is described. The study involves electrical discharges recorded simultaneously from two of the olfactory receptors found on the antenna of the male red-banded leaf roller moth. These receptor cells were stimulated by odors normally secreted by the
M. Thompson; J. B. W. Padley
Details are given of the derivation of formulae for the solution of the wave equations on a mini-computer, using a linear segment approximation to the sound-speed profile. A method is described for calculating the total propagation loss, and a brief description of the computer program used is given.
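The linear-segment approximation of a sound-speed profile amounts to piecewise-linear interpolation between measured depth/speed points, and the spherical-spreading term of propagation loss is the standard 20 log10(r). The profile values below are illustrative, not from the paper:

```python
import math

def sound_speed(profile, z):
    """Piecewise-linear sound-speed profile: `profile` is a list of
    (depth_m, speed_m_per_s) points sorted by depth."""
    for (z0, c0), (z1, c1) in zip(profile, profile[1:]):
        if z0 <= z <= z1:
            return c0 + (z - z0) / (z1 - z0) * (c1 - c0)
    raise ValueError("depth outside profile")

def spreading_loss_db(r_m):
    """Spherical-spreading component of transmission loss, 20*log10(r);
    only one term of a full propagation-loss budget."""
    return 20.0 * math.log10(r_m)

# Illustrative sound-speed profile (depth in m, speed in m/s).
profile = [(0.0, 1500.0), (100.0, 1480.0), (1000.0, 1520.0)]
```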
Moseley, E. C.
The Medical Information Computer System (MEDICS) is a time shared, disk oriented minicomputer system capable of meeting storage and retrieval needs for the space- or non-space-related applications of at least 16 simultaneous users. At the various commercially available low cost terminals, the simple command and control mechanism and the generalized communication activity of the system permit multiple form inputs, real-time updating, and instantaneous retrieval capability with a full range of options.
The Control Data Corporation Type 200 User Terminal utilizes a unique communications protocol to provide users with batch mode remote terminal access to Control Data computers. CDC/1000 is a software subsystem that implements this protocol on Hewlett-Packard minicomputers running the Real Time Executive III, IV, or IVB operating systems. This report provides brief descriptions of the various software modules comprising CDC/1000, and contains detailed instructions for integrating CDC/1000 into the Hewlett Packard operating system and for operating UTERM, the user interface program for CDC/1000. 6 figures.
The Prickett and Lonnquist two-dimensional groundwater model has been programmed for the Apple II minicomputer. Both leaky and nonleaky confined aquifers can be simulated. The model was adapted from the FORTRAN version of Prickett and Lonnquist. In the configuration presented here, the program requires 64 K of memory. Because of the large number of arrays used in the program, and memory limitations of the Apple II, the maximum grid size that can be used is 20 rows by 20 columns. Input to the program is interactive, with prompting by the computer. Output consists of predicted head values at the row-column intersections (nodes).
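A drastically simplified stand-in for this kind of two-dimensional finite-difference groundwater model is a Jacobi iteration for steady-state heads on a small grid with constant-head boundary nodes (the actual Prickett-Lonnquist scheme is transient and handles leakage, storage, and pumping; this sketch is not it):

```python
def solve_heads(grid, fixed, iters=500):
    """Jacobi iteration for steady-state heads on a uniform grid with uniform
    properties; `fixed` marks constant-head nodes that are never updated."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(iters):
        new = [row[:] for row in grid]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                if not fixed[i][j]:
                    new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                        + grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid

# 5x5 grid: heads fixed on the boundary, falling linearly from 10 to 0 left to
# right; the interior starts perturbed and relaxes back to the same plane.
grid = [[10.0 - 2.5 * j for j in range(5)] for i in range(5)]
for i in range(1, 4):
    for j in range(1, 4):
        grid[i][j] = 0.0
fixed = [[i in (0, 4) or j in (0, 4) for j in range(5)] for i in range(5)]
heads = solve_heads(grid, fixed)
```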
Raehtz, K G; Walker, P C
A pediatric TPN computer program, written in the COBOL 74 language, was developed for use on a minicomputer system. The program calculates the volume of each ingredient needed to prepare a pediatric TPN solution, generates a recipe work card and labels, calculates clinical monitoring information for each patient and develops a clinical monitoring profile for the pharmacist to use in monitoring parenteral nutrition therapy. Use of the program resulted in a significant reduction (71%) in the time needed to complete TPN calculations. Significant decreases in calculation and labeling errors were also realized. PMID:10312684
Presentations of a conference on the use of ruggedized minicomputers are summarized. The following topics are discussed: (1) the role of minicomputers in the development and/or certification of commercial or military airplanes in both the United States and Europe; (2) generalized software error detection techniques; (3) real time software development tools; (4) a redundancy management research tool for aircraft navigation/flight control sensors; (5) extended memory management techniques using a high order language; and (6) some comments on establishing a system maintenance scheme. Copies of presentation slides are also included.
Schell, James Leo
Degree of MASTER OF SCIENCE, May 1977. Major Subject: Electrical Engineering. THE DEVELOPMENT OF A PROGRAMMABLE 4-CHANNEL A/D CONVERSION SYSTEM FOR THE TI 980A MINICOMPUTER. A Thesis by JAMES LEO SCHELL. Approved as to style and content by: (Chairman of Committee) (Head of Department) (Member) (Member). May 1977. ABSTRACT: The Development of a Programmable 4-Channel A/D Conversion System for the TI 980A Minicomputer, (May 1977) James Leo Schell, B.S., Texas A&M University. Chairman of Advisory...
Binder, R.; Kuo, F. F.
An experimental facility is described which allows new computer communications techniques to be tested under conditions closely approximating those of real systems. A three-processor minicomputer configuration is used to achieve real-time operation at channel transmission rates of up to 50 Kbits per second. One processor runs a channel controller-concentrator program, a second is dedicated to simulation of the communication channel characteristics, and the third to the simulation of up to 1000 user terminals. The latter are divided into classes consisting of interactive time-sharing users of differing characteristics and file nodes, mixed in different proportions. Real user nodes are connected to the channel simulator processor, providing experience with actual operating characteristics under different channel loadings.
I. R. Perry; A. Gamble
This review discusses the role of high-level languages and real-time operating systems when minicomputers are used in the field of scientific instrumentation. Comparative information is given on the advantages and disadvantages of the main high-level languages currently in use in the scientific environment. Information on high-level languages and real-time operating system facilities available on current minicomputers was obtained by sending
Ferguson, R. S.; Doherty, J. G.
The algorithms and models of an accurate finite element based simulation of the processing steps of semiconductor wafer fabrication are described. Properties of the latest generation of single user mini-computers allow the process engineer to use the computer package in an interactive mode. The process steps modelled are implantation, oxidation/diffusion and annealing. Implantation models are based on the well-tested one-dimensional statistical distributions. Interaction between impurity atoms is assumed to be mainly through the built-in field. To obtain an accurate estimate of the built-in field, the non-linear Poisson equation is solved at the same nodes and in the same elements used for the simulation of the diffusion process. On making the assumption that small time steps are taken in the numerical formulation of the diffusion problem, the finite element equation system becomes linear and can be rapidly solved. Each impurity is assumed to diffuse independently in a non-uniform electric field, enhanced by a component due to the other impurities. Coupling between oxidation and diffusion is accounted for by a simple algorithm that deforms the solution mesh after the oxidising agent reacts with silicon to create a larger volume of SiO2.
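The small-time-step linearization this abstract mentions is easiest to see in an explicit one-dimensional analogue: each step becomes a linear update of the concentration field. The paper's scheme is finite-element, two-dimensional, and field-coupled; the sketch below is only the simplest finite-difference cousin:

```python
def diffuse_step(conc, D, dx, dt):
    """One explicit finite-difference step of 1-D diffusion with fixed ends.
    Stable only while r = D*dt/dx**2 <= 0.5; a small time step also keeps the
    update linear, as in the abstract's argument."""
    r = D * dt / dx ** 2
    new = conc[:]
    for i in range(1, len(conc) - 1):
        new[i] = conc[i] + r * (conc[i - 1] - 2 * conc[i] + conc[i + 1])
    return new

# An impurity spike spreading symmetrically from the centre node.
c = [0.0] * 21
c[10] = 1.0
for _ in range(10):
    c = diffuse_step(c, D=1.0, dx=1.0, dt=0.25)   # r = 0.25, within the limit
```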
Montoya, R. J.; Jai, A. R.
The paper describes a minicomputer-based, real-time closed loop remote control system at NASA Langley outdoor facility which is used to determine the stall/departure/spin characteristics of high-performance aircraft. The experiments are conducted with 15% dynamically scaled, unpowered models that are dropped from 3000 m and ground controlled. The effects of time delays and sampling rates on the stability of the control system and the selection of digital algorithms to meet frequency response and real time constraints are examined. Also described is the implementation of the modular software for the flexible programming of multi-axis control laws.
Hill, J. W.
Real-time performance data was collected during a pick-up task carried out with a Rancho master-slave manipulator using a minicomputer-based data taker. In addition to the usual task-time measurements, computer algorithms to integrate the energy consumed and to count and time the number of moves were implemented. Beyond these measures, several derived measures such as the fraction of time moving (MRATIO) and the mean time per move (MBAR) were obtained in an off-line analysis. Preliminary results of the time-delay experiment indicate that the two new measures, MRATIO and MBAR, are almost an order of magnitude more sensitive than task time, the conventional measure, in determining performance changes with transmission delays in the range from 0.0 to 1.0 s.
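The two derived measures are straightforward to compute from logged move intervals; the definitions below follow the abstract's wording, with made-up numbers:

```python
def move_measures(moves, task_time):
    """Derive MRATIO (fraction of task time spent moving) and MBAR (mean time
    per move) from a list of (start_s, end_s) move intervals."""
    durations = [end - start for start, end in moves]
    moving = sum(durations)
    return {"MRATIO": moving / task_time, "MBAR": moving / len(durations)}

# Made-up log: three moves totalling 3 s within a 6 s task.
m = move_measures([(0.0, 1.0), (2.0, 2.5), (4.0, 5.5)], task_time=6.0)
```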
J. M. Williams; J. R. Machulda
Chanslor-Western Oil and Development Co. started a program of cyclic steam recovery in the Midway-Sunset field in 1964. After the basic technique of cyclic steam recovery was proven in the field, efficiency was improved by a well-by-well approach to thermal-recovery project analysis. The result has been one of the most successful cyclic steam projects in the San Joaquin Valley. Project
Daniel L. Pierson
This paper discussed the problems encountered and techniques used in conducting the performance evaluation of a multi-processor on-line manpower data collection system. The two main problems were: (1) a total lack of available software tools, and (2) many commonly used hardware monitor measures (e.g., CPU busy, disk seek in progress) were either meaningless or not available. The main technique used
Artificial intelligence is becoming increasingly attractive to commercial users thanks to computer architectures designed to support the LISP language. As an example of the novel features of the new architectures, LISP Machine Inc.'s lambda machine is described.
S. Labík; A. Malijevský
An algorithm that utilizes the advantage of a machine language and integer arithmetic is proposed for the Monte Carlo simulations of the radial distribution function of fluid hard spheres. The influence of integer arithmetic upon the accuracy of results is discussed. The method can be easily adapted for more complicated potentials.
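The integer-arithmetic idea can be illustrated with the innermost operation of such a simulation: the hard-sphere overlap test under periodic boundaries, done entirely in scaled integers. The scaling scheme here is an assumption for illustration, not the paper's:

```python
SCALE = 2 ** 15   # assumed scaling: 2**15 integer units per unit length

def overlaps(p, q, sigma_int, box_int):
    """Hard-sphere overlap test in pure integer arithmetic with periodic
    (minimum-image) boundaries.  p and q are integer coordinate triples,
    sigma_int the pre-scaled sphere diameter, box_int the box edge."""
    d2 = 0
    for a, b in zip(p, q):
        d = abs(a - b)
        d = min(d, box_int - d)            # minimum-image convention
        d2 += d * d
    return d2 < sigma_int * sigma_int      # overlap iff distance < diameter
```

Because every quantity stays integral, the test is exact (no rounding error in the distance comparison), which is one reason integer arithmetic suits this calculation on small machines.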
[Extraction residue: the Boolean select equations for the multiplexer control inputs, and the Fig. 12 wiring of the ADC digital output (MSB, bit 10, routed to bits 15-9 of the microinterface card and on to the 7-segment decoder driver), are garbled beyond recovery.]
Burrage, George Richard
…ters each and the teleprocessor memory is an 8-bit format of only one ASCII character. Hence, two load operations must be used to transfer one 21148 word to the teleprocessor. [Flowchart residue: INITIALIZE TO READY STATE OF SELECT MODE; LOAD MAR WITH FWAM; SET...] ...address for locating information within the ROM. A 6-bit register is used to store these bits. They select a 35-bit area containing the print information for each character. Since the ROM is organized as five parallel 448 × 1-bit ROMs, only five bits...
J. G. Gilbert; R. J. Shovlin
An algorithm for calculating apparent transmission-line impedance to the point of a fault is presented as an approach to distance-type protection via a dedicated digital computer. Phase voltages and currents are sampled asynchronously approximately 24 times per cycle and operated on to yield apparent resistances and reactances. Discrimination between phase-to-phase, phase-to-ground and two-phase…
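Once phasors have been estimated from the sampled voltages and currents, the apparent impedance to the fault is simply their complex ratio. A minimal sketch, with a hypothetical line impedance and fault location:

```python
def apparent_impedance(v_phasor, i_phasor):
    """Apparent R and X seen by a distance relay: Z = V / I for the faulted
    loop (phasors assumed already estimated from the ~24 samples per cycle)."""
    z = v_phasor / i_phasor
    return z.real, z.imag

# Hypothetical example: bolted fault 80% down a line of Z = 2 + 20j ohms,
# with 100 A of fault current taken as the phase reference.
i_fault = 100 + 0j
v_relay = (2 + 20j) * 0.8 * i_fault
r, x = apparent_impedance(v_relay, i_fault)
```

A relay would then compare (r, x) against its reach characteristic; the phasor estimation itself (from asynchronous samples) is the harder part the paper addresses.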
Roehrkasse, Robert C.
Discussed are the requirements of a software-oriented engineering curriculum that also includes use of computer hardware. Three areas are identified as necessary in such a curriculum: functional area users, systems programming, and mini-micro technology. Each of these areas is discussed in terms of instructional methods and suggested topics.…
Schell, James Leo
flow from the CPU to peripheral devices along the output bus. Except for the direction of data flow and the contents of bit 10 in the first word of an I/O instruction, the WDS and BDS instruction format is identical. Bit 10 is a logic 1 for the WDS... of four possible groups of 64 devices. Together, the group and external register numbers can specify any of 256 possible external devices. The second word of the I/O instruction contains the internal register number (R) and several select bits for...
Miller, M.M.; Tolendino, L.F.
Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.
Garg, Devendra P.
In order to obtain student feedback in computer programing courses at Duke University, a computer-based anonymous audience response system was used. This system consisted of a minicomputer, voting consoles, and a large electronic display. Students set their voting consoles in response to the question and the minicomputer interrogated the consoles.…
Attala, Emile E.; Howard, James A.
Very little work has been done in the broad field of computer-assisted instruction (CAI) to explore the use of a minicomputer as another learning resource in the instructional process. Accordingly, a cost-effective Learning Resource Aided Instruction (LRAI) System centered around a Data General NOVA minicomputer augmented with slide…
Cabrol, Daniel; Cachet, Claude
Describes the technical and pedagogical characteristics of the ESSOR system simulation on minicomputers, which allow the simulation of science experiments in the laboratory. Reflects on several years of use and development in the field of chemistry. (Author/DS)
The software used to operate and maintain the remote hard copy is described. All operating software that runs in the NOVA minicomputers is covered as are various utility and diagnostic programs used for creating and checking this software. 2 figures.
De Laurentiis, Emiliano
Clarifies often misconstrued distinctions with regard to microcomputers, minicomputers, and maxicomputers. Criteria for educational use of microcomputers are examined, including its potential for language and peripheral expansion, and its communication capabilities. (MER)
The users manual for the word recognition computer program contains flow charts of the logical diagram, the memory map for templates, the speech analyzer card arrangement, minicomputer input/output routines, and assembly language program listings.
van der Merwe, J. P.
Describes how certain concepts basic to electron optics may be introduced to undergraduate physics students by calculating trajectories of charged particles through electrostatic fields which can be evaluated on minicomputers with a minimum of programing effort. (Author/SA)
Castleman, K. R.; Frieden, H. J.; Johnson, E. T.; Rennie, P. A.; Wall, R. J.
Minicomputer-controlled system automatically prepares and analyses blood samples and displays karyotype in pictorial form as primary output. System accuracy is assured by operator interaction at key points during process. System can process up to 576 specimens per day.
Tashker, M. (editor)
Papers are presented dealing with the design of reliable, low cost, advanced avionics systems applicable to general aviation in the 1980's and beyond. Sensors, displays, integrated circuits, microprocessors, and minicomputers are among the topics discussed.
Richard E. Brown
Dartmouth College's Kiewit Network connects nearly all of the computing resources on the campus: mainframes, minicomputers, personal computers, terminals, printers, and file servers. It is a large internetwork, based on the AppleTalk protocols. There are currently over 2900 AppleTalk outlets in 44 zones on campus. Over 90 minicomputers act as bridges between 177 AppleTalk twisted pair busses. This paper describes
L. L. Collins; E. T. Chulick
A continuous on-line fission product monitor has been installed at the Oyster Creek Nuclear Generating Station, Forked River, New Jersey. The on-line monitor is a minicomputer-controlled high-resolution gamma-ray spectrometer system. An intrinsic Ge detector scans a collimated sample line of coolant from one of the plant's recirculation loops. The minicomputer is a Nuclear Data 6620 system. Data were accumulated for
Cambra, J. M.; Trover, W. F.
Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology, which first transformed the central computer system and ultimately produced the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro- and minicomputers.
The Multipurpose Interactive NASA Information Systems (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.
This paper reports on a recent effort at the Lister Hill National Center for Biomedical Communications showing that a medical bibliographic database, when linked to a minicomputer's file-handling software, can be used to build a demonstration database of document images. The images of the tables of contents from approximately fifty medical books were stored on magnetic disks. The medical bibliographic database, hosted on a remote mainframe computer, was linked to the minicomputer controlling the magnetic disks. As a result, users who do bibliographic searches on a particular medical subject may view, at their workstation, the tables of contents and title page from selected citations; furthermore, the document image database is accessed and managed using pre-existing features of the bibliographic database and the minicomputer's operating system.
Several levels of documentation are presented for the program module of the NASA medical directorate minicomputer storage and retrieval system. The biomedical information system overview gives reasons for the development of the minicomputer storage and retrieval system. It briefly describes all of the program modules which constitute the system. A technical discussion oriented to the programmer is given. Each subroutine is described in enough detail to permit in-depth understanding of the routines and to facilitate program modifications. The program utilization section may be used as a users guide.
Maddox, R.N.; Erbar, J.H.
By handling certain process-engineering calculations, a minicomputer with only 64,000 bytes of memory can greatly reduce the time that engineers spend searching for reasonable process conditions and evaluating the processes. Designed to work in a completely interactive mode, three FORTRAN programs have been developed on the Altair Attache for process-engineering applications (all are easily converted to similar minicomputer systems). Although not intended to supersede large-scale process simulation, these programs optimize the engineer's time by combining the individual's judgment and intuition with the calculational capabilities of a small, relatively cheap computer.
Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identifications of potential NASA data users other than those normally discussed, consideration affecting the clustering of minicomputers, low cost computer system for information retrieval and analysis, the testing of minicomputer based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.
Delaat, J. C.; Soeder, J. F.
High speed minicomputers were used in the past to implement advanced digital control algorithms for turbine engines. These minicomputers are typically large and expensive. It is desirable for a number of reasons to use microprocessor-based systems for future controls research. They are relatively compact, inexpensive, and are representative of the hardware that would be used for actual engine-mounted controls. The Control, Interface, and Monitoring Unit (CIM) contains a microprocessor-based controls computer, necessary interface hardware and a system to monitor while it is running an engine. It is presently being used to evaluate an advanced turbofan engine control algorithm.
Stacey, J.S.; Hope, J.
A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. Assembly language driver program and interface hardware-descriptions for the Nova 1210 are included.
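The normalization step for strontium can be sketched with a first-order (linear-law) fractionation correction keyed to the accepted 86Sr/88Sr ratio of 0.1194. The paper's exact correction law is not stated in the abstract, so treat this as a generic illustration:

```python
REF_86_88 = 0.1194  # accepted 86Sr/88Sr value used for normalization

def normalize_sr(r87_86_meas, r86_88_meas):
    """First-order (linear-law) mass-fractionation correction of a measured
    87Sr/86Sr ratio, keyed to the departure of the measured 86Sr/88Sr from
    its accepted value.  A generic illustration, not the paper's exact law."""
    f_per_amu = (1.0 - r86_88_meas / REF_86_88) / 2.0  # 2 amu between 86 and 88
    return r87_86_meas * (1.0 - f_per_amu)             # 1 amu between 87 and 86
```

If the measured 86Sr/88Sr sits above the accepted value (light isotopes enriched), the corrected 87Sr/86Sr is adjusted upward, as expected for mass-dependent fractionation.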
William D. Stakem; Alan Schneider
Long-life battery systems are discussed in this paper as a cost effective standby power source for data retention in volatile memory systems. Continuing development of these power sources has resulted in batteries to satisfy many of the memory backup needs in the rapidly expanding microprocessor and minicomputer applications field, especially with random access memories based on CMOS and other low
The Interprocess Communications System (IPCS) was written to provide a virtual machine upon which the Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) could be built. The hardware upon which the IPCS runs consists of nine minicomputers sharing some common memory.
Doring, Richard; Hicks, Bruce
A brief review is presented of the characteristics of four maxicalculators (HP 9830, Wang 2200, IBM 5100, MCM/700) and two minicomputers (Classic, Altair 8800). The HP 9830 and the Wang 2200 are thought to be the best adapted to serve entire schools and their unique properties are discussed. Some criteria of what should be taken into account in…
The report reviews the activities of the department over three years in the fields: instrumentation for Risø experiments; nuclear geophysical methods; an EDP system for a power station; optimization of system models (control, man-machine relations, system structures); reliability and safety; and minicomputers.
Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.
The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.
Nee, John G.; Kare, Audhut P.
Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)
Lala, P.; Thao, Bui Van
The first step in the treatment of satellite laser ranging data is its smoothing and rejection of incorrect points. The proposed method uses the comparison of observations with ephemerides and iterative matching of corresponding parameters. The method of solution and a program for a minicomputer are described. Examples of results for satellite Starlette are given.
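The iterative rejection of incorrect points can be sketched as repeated sigma-screening of the observation-minus-ephemeris residuals, recomputing the statistics after each pass. The threshold and the residual values below are illustrative:

```python
def reject_outliers(residuals, k=2.0, max_iter=10):
    """Iteratively discard residuals (observed range minus ephemeris-predicted
    range) lying more than k standard deviations from the mean, recomputing
    mean and sigma after each pass, until no further points are rejected."""
    keep = list(residuals)
    for _ in range(max_iter):
        n = len(keep)
        mean = sum(keep) / n
        sigma = (sum((r - mean) ** 2 for r in keep) / n) ** 0.5
        filtered = [r for r in keep if abs(r - mean) <= k * sigma]
        if len(filtered) == len(keep):
            break
        keep = filtered
    return keep

# Illustrative residuals in metres: one gross outlier among small noise.
kept = reject_outliers([0.01, -0.02, 0.00, 0.03, -0.01, 5.0])
```

Recomputing the statistics after each rejection matters: a gross outlier inflates sigma, so a single pass can fail to flag points that a second, tighter pass removes.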
Eva Kocsis; James F. Conway; Alasdair C. Steven
The PIC system is an integrated package of image processing software written in Fortran and C. Throughout its 16 years of continuous development, PIC has been designed for the processing of electron micrographs with emphasis on the particular requirements for structural analysis of biological macromolecules. PIC has been implemented on successive generations of Digital Equipment Corporation dedicated minicomputers and workstations.
A. Peter Slootweg
The multistreamer Side-Looking Seismic system presented in this paper makes a sonograph of uncovered or buried crustal topography, thus revealing the structural fabric of the oceanic basement, even when this is covered with a sedimentary layer. Major elements of the system are an airgun as a sound source, five single-channel parallel streamers and two minicomputers for signal capture and processing.
Meyer, R. A.; Halfen, F. J.; Alley, A. D.
SP-100 Control Systems modeling was done using a thermal hydraulic transient analysis model called ARIES-S. The ARIES-S Computer Simulation provides a basis for design, integration and analysis of the reactor including the control and protection systems. It is a modular digital computer simulation written in FORTRAN that operates interactively in real time on a VAX minicomputer.
Air Products and Chemicals has successfully implemented a mini-computer monitoring and control system at its LaPorte Syngas Complex as an integral part of its energy management strategy. The production complex consists of two separate plants at La...
J. A. Balter; A. Tomlinson
This report describes the evolution of the FTD User Communications System concept from an expanded query facility for the specific FTD Scientific and Technical Information System (STIS) to a facility running on a minicomputer and supporting a range of computer-based activities, including a generalized data base and file access, text handling and report generation, calculation, simple plots and graphics, and access
A minicomputer controlled automotive emissions sampling and analysis system (the Real-Time System) was developed to determine vehicular modal emissions over various test cycles. This data acquisition system can sample real-time emissions at a rate of 10 samples/s. A buffer utiliz...
Peter M. Bell
The National Weather Service (NWS) is modernizing its systems for collecting and processing the upper-air information used to make weather predictions. The program will take 5 years and cost $6.5 million. Under the program, NWS will convert from a manual to an automatic system for entering the data collected at its 114 nationwide field stations into minicomputers for processing. Electronic
Rappaport, Wanda; Olenbush, Elizabeth
Time-shared, Interactive, Computer-Controlled Information Television (TICCIT) is a computer-based system of instruction designed to provide low-cost, high-quality education that is completely individualized. Using inexpensive minicomputers, color television sets, and typewriter-like keyboards, TICCIT can serve as many as 128 students…
Grosch, Audrey N.
Alternative approaches to the building of monographic bibliography files for an on-line data management system using minicomputers at the University of Minnesota Libraries' Twin Cities Campus center are described. Secondary and primary sources of the Machine-Readable Cataloging (MARC) II records are considered--including Blackwell-North America,…
John P. Glaser; Robert F. Beckley; Pasha Roberts; James K. Marra; Frederick L. Hiltz; Jean Hurley
Brigham and Women's Hospital is converting its financial, administrative and clinical information systems from a mini-computer environment to a platform based on MUMPS and a network of several thousand personal computers. This article describes the project rationale and status and provides an overview of the architecture of the new system. The initial results of the project indicate that the personal
Kenyon, Robert V.
A bite bar covered with dental impression material stabilized the head. A minicomputer was used. During monocular testing, the nontested eye was prevented from seeing the target by occlusion with either a black eyepatch or a large partition. These methods of occlusion had no effect on eye movements. Subjects were
Tai, M. H.
The mathematical properties of Hadamard matrices and their application to spectroscopy are discussed. A comparison is made between Fourier and Hadamard transform encoding in spectrometry. The spectrometer is described and its laboratory performance evaluated. The algorithm and programming of inverse transform are given. A minicomputer is used to recover the spectrum.
David F. Cahn; Stephen R. Phillips
An algorithm has been developed that efficiently solves a large class of robot navigation and obstacle avoidance problems using range information as its sole input from the environment. The system resides in a minicomputer and requires very small memory (1500 words) and computing time (1.35 s) allocations while solving simulated problems of broadly ranging spatial complexity and operational intricacy. It
Arsenic and 25 other elements are simultaneously determined in ambient air samples collected on glass-fiber filter composites at 250 United States sites. The instrumental neutron activation analysis (NAA) technique combined with the power of a dedicated mini-computer resulted in...
May, Donald M.; And Others
The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…
A data acquisition system has been developed to collect, analyze and store large volumes of rapid kinetic data measured from a stopped-flow spectrophotometer. A digital minicomputer, with an A/D converter, tape drive unit and formatter, analog recorder, oscilloscope, and input/ou...
DATMAN is a data management system which runs on a variety of minicomputers. Currently, versions are supported on the following computers: PRIME and PDP 11/70 under IAS. DATMAN has facilities for creating data bases, retrieving selected data from data bases, retrieving se...
Miller, Jo; Janovsky, Kathy
During the 2001 summer holidays, the main Social Science classroom at St Ursula's College, a Catholic Secondary Girls' school of 740 pupils in Toowoomba, Queensland, was renovated. A mini-computer laboratory of four nests of computers was incorporated into the traditional teaching space. (See Diagram 1 and photograph). This room was named the…
James A. Johnson
The computer control system for the Fusion Materials Irradiation Test Facility (FMIT) prototype accelerator was designed using distributed intelligence driven by a distributed database. The system consists of two minicomputers in the central control room and four microcomputers residing in CAMAC crates located near appropriate subsystems of the accelerator. The system uses single vendor hardware as much as practical in
Juckiewicz, Robert; Kroculick, Joseph
Columbia University's major program to distribute its central administrative data processing to its various schools and departments is described. The Distributed Administrative Management Information System (DAMIS) will link every department and school within the university via microcomputers, terminals, and/or minicomputers to the central…
H. Bassen; J. Silberberg; F. Houston; W. Knight; C. Christman; M. Greberman
Virtually all of the medical devices utilizing electronics will contain a micro or minicomputer by 1990. These devices accounted for $7 billion in U.S. sales in 1984. Their capabilities can provide the means for new or greatly improved medical procedures, and ensure greater patient safety. However, these benefits can easily be compromised if "computer safety" is not practiced in the
Yost, Michael; Bremner, Fred
This document describes the review, analysis, and decision-making process that Trinity University, Texas, went through to develop the three-part computer network that they use to gather and analyze EEG (electroencephalography) and EKG (electrocardiogram) data. The data are gathered in the laboratory on a PDP-11/24 minicomputer. Once…
Graczyk, Sandra L.
Presents an input-process-output (IPO) model that can facilitate the design and implementation of instructional micro and minicomputer systems in school districts. A national survey of school districts with outstanding computer systems is described, a systems approach to develop the model is explained, and evaluation of the system is discussed.…
The MESH research group at the University of Illinois is currently designing a network of minicomputers. Extensions allowing some of the machines making up this network to be dynamically microprogrammable are proposed. These microprogrammable machines will be able to be reconfigured into specialized processors designed to run specific jobs efficiently. Modifications to the network operating system required by this extension
In-Sheng Cheng; S. H. Koozekanani; M. T. Fatehi
A new method for analysis and recording of gait parameters is reported. This method consists of a television camera interfaced with a PDP-11/10 minicomputer. The TV camera picks up anatomical points of interest such as the knee joint, ankle joint, etc., to which small lights are attached, and the computer calculates their coordinates and joint angles as a function of time.
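The joint-angle computation this abstract describes reduces to elementary vector geometry: given image coordinates of three marker lights, the included angle at the middle joint follows from the dot product of the two limb-segment vectors. The sketch below is illustrative; the marker names are assumptions, not the paper's notation.

```python
# Hedged sketch: included angle (in degrees) at `joint` formed by the
# segments joint->proximal and joint->distal, from 2-D marker coordinates.
import math

def joint_angle(proximal, joint, distal):
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))
```

For example, hip, knee, and ankle markers at (0, 1), (0, 0), and (1, 0) give a 90-degree knee angle; evaluated frame by frame this yields the joint angle as a function of time.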
Black, John B.
This outline of new technological developments and their applications in the library and information world considers innovations in three areas: automation, telecommunications, and the publishing industry. There is mention of the growth of online systems, minicomputers, microcomputers, and word processing; the falling costs of automation; the…
Jeane, H. L.
Module contains mask register, line register, primary sync register, secondary sync register, push-pop stacking register, control section, and interrupt address generator. APIM operates in conjunction with logic found in majority of minicomputers to provide fully-vectored interrupt capabilities.
• PERSONAL DIGITAL ASSISTANTS (PDAs) are handheld minicomputers used for tasks such as referencing, documentation, and data storage and retrieval.
• THESE DEVICES can help perioperative nurses solve problems associated with updating, maintaining, and retrieving surgical preference cards and accessing treatment and medication references.
• CHOOSING A PDA and accessories, finding basic software, and writing a perioperative nursing program can be painless, even for a
Haag, Vincent H.; Hammond, Robert P.
This manual is designed for use in inservice work with teachers preparing to use the Comprehensive School Mathematics Program (CSMP) in grades K-2, and as a resource guide for later use. In it, the nonstandard topics used in the CSMP primary program are developed through discussion and a variety of exercises. These topics include the minicomputer…
J. Axelrod; D. A. Sands; B. Akselrod
A Honeywell Level-6 minicomputer has been adapted to the task of conducting the test and calibration of the HOE Homing Sensor, a long wave infrared cryogenically cooled sensor that performs target acquisition and terminal guidance on a tactical ballistic missile interceptor. Essential elements, flow charts and block diagrams are presented for both hardware and software. Assembly language programs are also
Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo
Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…
A directory lists computer software vendors offering software useful in administering college alumni and development programs. Listings include client/server system vendors and minicomputer and mainframe system vendors. Each listing contains the vendor name and address, contact person, software title(s), cost, hardware requirements, and client…
K. Furuta; T. Ochiai; N. Ono
The paper is concerned with the attitude control of a triple inverted pendulum. The lowest hinge is free for rotation, and the torques of the upper two hinges are manipulated not only to stabilize the pendulum but also to control its attitude. The control system is designed using a CAD system developed by the authors and is realized on a minicomputer.
Galyon, Rosalind; And Others
Based on an earlier user's guide to a minicomputer page layout system called PLA (Terrell, 1982), this guide is designed for use in the development and production of text-graphic materials for training relatively unskilled technicians to perform complex procedures. A microcomputer version of PLA, MicroPLA uses the Commodore 8032 microcomputer to…
K. Nagata; N. Basugi; T. Fukushima; T. Tango; I. Suzuki; T. Kaminuma; S. Kurashina
A new method of discriminating pathological cerebral atrophy from physiological atrophy during aging is reported. The authors advocate a pixel counting method using a minicomputer for the quantitative measurement of cerebral atrophy. Five hundred cases were studied with this quantitative method and the normal range of the physiological atrophy was determined statistically. In order to estimate the degree of cerebral
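The pixel-counting measure this abstract reports can be sketched in a few lines, assuming a conventional formulation: atrophy is quantified as the fraction of intracranial pixels whose CT value falls within a cerebrospinal-fluid range. The threshold values and the index definition below are illustrative assumptions, not the authors' published parameters.

```python
# Hedged sketch: fraction of intracranial pixels in the CSF attenuation
# range, as a simple quantitative index of cerebral atrophy.
def atrophy_index(hu_values, csf_range=(0, 20)):
    """`hu_values` is a flat list of intracranial pixel values (Hounsfield units)."""
    lo, hi = csf_range
    csf = [v for v in hu_values if lo <= v <= hi]
    return len(csf) / len(hu_values)
```

Computing this index over a series of scans, as the abstract describes for 500 cases, allows a normal range for physiological atrophy to be established statistically.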
Hart, Russ A.
Of necessity, adult educators will be turning to technological delivery forms to meet the insistent call for increasing numbers of programs. As teleconferencing, television, microwave, minicomputer, satellite, fiberoptic, and laser technologies continue to expand, they hold promise of educating millions of adult students on and off campus. A…
C. T. Russell
The development of an interactive graphics system for speeding up the identification and analysis of magnetic field data received from spacecraft is described. A flat file system was designed with a standard header file which contained space for user notes. The system featured a minicomputer with the capacity to handle up to 20 remote terminal requests simultaneously. Time series of
R. Noonan; V. Basili; R. Hamlet; M. Lay; D. Mills; J. Turner; M. Zelkowitz
In recent years there has been a phenomenal growth of interest in networks of computers sharing both data and programs. For example, Cmdr. Grace Hopper has advocated a hierarchical network of minicomputers to perform the data processing function in which the tasks to be done would be distributed over the network, with only summary or extract information being passed from
King, Susan G.
This manual provides guidance in the use of the Integrated Library System (ILS), a library minicomputer system in which all automated library functions are processed against a single database. It is oriented toward ILS users with no ADP training or experience. Written in MUMPS, a higher-level language, the system includes the following…
Discusses design features of the Online Catalog of LS/2000, OCLC's enhanced version of Integrated Library System. This minicomputer-based system provides bibliographic file maintenance, circulation control, and online catalog searching. Examples of available displays--holdings, full MARC, work forms, keyword entry, index selection, brief citation,…
Sandia's Digital Systems Development Division 1521 has developed a new functional relay tester. Capabilities of this tester include the measurement of coil and contact resistance, hipot, operate current, and contact operation and bounce times. The heart of the tester is a Hewlett-Packard 21MX minicomputer that uses BASIC or FORTRAN programming languages. All measurements are made by means of simple program
David R. Cheriton; Michael A. Malcolm; Lawrence S. Melen; Gary R. Sager
Thoth is a portable real-time operating system which has been developed at the University of Waterloo. Various configurations of Thoth have been running since May 1976; it is currently running on two minicomputers with quite different architectures (Texas Instruments 990 and Data General NOVA). This research is motivated by the difficulties encountered when moving application programs from one system to
Current and Retrospective Sources of Machine Readable Monograph Cataloging Records: A Study of Their Potential Cost and Utility in Automated System Development at the University of Minnesota. Revised Edition.
Grosch, Audrey N.
A discussion of alternatives and costs for building monographic bibliographic files for an on-line management system using minicomputers at the University of Minnesota Libraries, Twin Cities Campus, considers secondary and primary sources of MARC II records, including BLACKWELL-North America, Information Dynamics Corporation BIBNET and Ohio…
Feldman, G. H.; Johnson, J. A.
Structural-programming language is especially tailored for producing assembly-language programs for MODCOMP II and IV mini-computers. The language consists of a set of simple and powerful control structures that include sequencing, alternative selection, looping, sub-module linking, comment insertion, statement continuation, and compilation termination capabilities.
G. Persky; D. N. Deutsch; D. G. Schweikert
LTX is a minicomputer-based design system for large-scale integrated circuit chip layout which offers a flexible set of interactive and automatic procedures for translating a circuit connectivity description into a finished mask design. The system encompasses algorithms for two-dimensional placement, string placement, exploitation of equivalent terminals, decomposition of routing into channels, and channel routing. Circuit connectivity is preserved during interactive
Miller, R. L.
Primarily designed to acquire data at steady state test conditions, the system can also monitor slow transients such as those generated in moving to a new test condition. The system configuration makes use of a microcomputer at the test site which acts as a communications multiplexer between the measurement and display devices and a centrally located minicomputer. A variety of measurement and display devices are supported using a modular approach. This allows each system to be configured with the proper combination of devices to meet the specific test requirements, while still leaving the option to add special interfaces when needed. Centralization of the minicomputer improves utilization through sharing. The creation of a pool of minis to provide data acquisition and display services to a variable number of running tests also offers other important advantages.
Based on the need for a more accurate data acquisition system, a proposal was submitted by the Department of Mechanical Engineering at Colorado State University (CSU) to the Department of Energy for the development of a minicomputer-based system. One attractive feature of this approach is that a curve fit of data is available in real time. Also, because the system is fairly small in physical dimensions it can be easily carried and installed as a portable unit. The project consisted of two phases: development of the algorithm for data analysis and provision of the software for the Hewlett Packard HP9845T minicomputer; and testing of the algorithm on a Darrieus wind turbine generator located at the Colorado State University dairy farm.
A minicomputer-based data acquisition system was developed that supplies a curve fit of data in real time and is small enough to be easily carried and installed as a portable unit. The algorithm for data analysis was developed and the software provided for a Hewlett Packard HP9845T minicomputer. The algorithm was then tested on a Darrieus wind turbine generator located at the Colorado State University dairy farm. The measurements include: wind speed and direction, air temperature, and ac power for each phase of the three-phase induction generator. A linear model is described for the power vs. wind speed curve, and the effect of wind variation (gustiness) is considered. Software for the system is listed and discussed. Results are then presented and compared. (LEW)
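The linear power-versus-wind-speed model mentioned above lends itself to a compact sketch. Assuming an ordinary least-squares line P = a + b·v (the abstract confirms a linear model but not this exact formulation), the closed-form normal equations for one predictor need only running sums, which suits real-time fitting on a small machine like the HP9845T.

```python
# Hedged sketch: ordinary least-squares fit of power P against wind
# speed v using the single-predictor normal equations.
def linear_fit(speeds, powers):
    n = len(speeds)
    sx = sum(speeds)
    sy = sum(powers)
    sxx = sum(v * v for v in speeds)
    sxy = sum(v * p for v, p in zip(speeds, powers))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a, b
```

Because only the four sums are needed, they can be accumulated sample by sample and the fit refreshed continuously, which is what makes a real-time curve fit feasible on a portable minicomputer system.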
Collins, L.L.; Chulick, E.T.
A continuous on-line fission product monitor has been installed at the Oyster Creek Nuclear Generating Station, Forked River, New Jersey. The on-line monitor is a minicomputer-controlled high-resolution gamma-ray spectrometer system. An intrinsic Ge detector scans a collimated sample line of coolant from one of the plant's recirculation loops. The minicomputer is a Nuclear Data 6620 system. Data were accumulated for the period from April 1979 through January 1980, the end of cycle 8 for the Oyster Creek plant. Accumulated spectra, an average of three a day, were stored on magnetic disk and subsequently analyzed for fission products. Because of difficulties in measuring absolute detector efficiency, quantitative fission product concentrations in the coolant could not be determined. Data for iodine fission products are reported as a function of time. The data indicate the existence of fuel defects in the Oyster Creek core during cycle 8.
Mackin, T. F.; Sulester, J. M. (principal investigators)
As LACIE Procedure 1 evolved from the Classification and Mensuration Subsystem small-fields procedures, it became evident that two computational systems would have merit: the LACIE/Earth Resources Interactive Processing System based on a large IBM-360 computer oriented for operational use with high computational throughput, and a smaller, highly interactive system based on a PDP 11-45 minicomputer and its display system, the IMAGE-100. The latter had advantages for certain phases; notably, interactive spectral aids could be implemented quite rapidly. This would allow testing and development of Procedure 1 before its implementation on the LACIE/Earth Resources Interactive Processing System. The resulting minicomputer system, called the Classification and Mensuration Subsystem IMAGE-100 Hybrid System, allowed Procedure-1 operations to be performed interactively, except for clustering, classification, and automatic selection of best acquisitions, which were offloaded to the LACIE/Earth Resources Interactive Processing System.
Stack, S. H.
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
Harrington, J. A., Jr.; Cartin, K. F.
The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Particularly convenient for evaluation are generally digital data stored on computer compatible tapes (CCT's). The major advantages of CCT's are the quality of the data and the accessibility to computer manipulation. Minicomputer systems are widely used for the required computer processing operations. However, microprocessor-related technological advances now make it possible to process CCT data with computing systems which can be obtained at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms which are incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
Lewis, R. A.; Johnston, A. R.
A scanning Laser Rangefinder (LRF) which operates in conjunction with a minicomputer as part of a robotic vehicle is described. The description, in sufficient detail for replication, modification, and maintenance, includes both hardware and software. Also included is a discussion of functional requirements relative to a detailing of the instrument and its performance, a summary of the robot system in which the LRF functions, the software organization, interfaces and description, and the applications to which the LRF has been put.
Characterization of coal-derived liquids and other fossil fuel related materials employing mass spectrometry. Mass spectrometry and fossil-energy conversion technology: a review. Quarterly report, March 30-June 29, 1978
The following activities in regard to the development of micromolecular probe distillation in combination with field-ionization mass spectrometry (FI/MS) for quantitative analysis are reported. The temperature-control module for the direct-introduction probe was received and successfully interfaced to both the probe and the NOVA 3/12. Both temperatures and FI/MS data were minicomputer acquired for probe distillation of a 19 component synthetic
E. Ball; Jerome A. Feldman; James R. Low; Richard F. Rashid; Paul Rovner
The RIG system provides convenient access to a wide range of computing facilities. The system includes five large mini-computers in a very fast internal network, disk and tape storage, a printer/plotter, and a number of display terminals. These are connected to larger campus machines (IBM 360/65 and DEC KL10) and to the ARPANET. The operating system and other software support
Fromm, F. R.; Northouse, R. A.
A possible solution to the analysis of the massive amounts of multi-spectral scanner data from the Earth Resources Technology Satellite (ERTS) program is proposed. This solution is offered as an adaptive on-line classification scheme. The classifier is described as well as its controller, which is based on ground truth data. Cluster analysis is presented as an alternative approach to the ground truth data. Adaptive feature selection is discussed and possible mini-computer implementations are offered.
Enrico Sangiorgi; Bruno Riccò; Franco Venturi
An efficient Monte Carlo device simulator has been developed as a postprocessor of a two-dimensional numerical analyzer based on the drift-diffusion model. The Monte Carlo package analyzes real VLSI MOSFETs in a minicomputer environment, overcoming some existing theoretical and practical problems. In particular, the particle free-flight time distribution is obtained by a new algorithm, leading to a CPU time saving
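As background to the free-flight-time distribution mentioned in this abstract, the standard approach in Monte Carlo device simulation is the self-scattering method: with a constant total rate Γ bounding the true scattering rate, free flights are exponentially distributed and can be drawn as t = -ln(r)/Γ from a uniform random number r. The sketch below shows this generic textbook step, not the paper's new algorithm.

```python
# Hedged sketch: exponential free-flight sampling with a constant
# (self-scattering) total rate gamma, t = -ln(r) / gamma.
import math
import random

def free_flight_time(gamma, rng=random.random):
    """Draw one free-flight duration for total scattering rate `gamma` (1/s)."""
    return -math.log(rng()) / gamma
```

The CPU-time saving claimed in the paper comes from improving on exactly this step, since in simple Monte Carlo transport a large share of the sampled events are fictitious self-scatterings.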
Rosen, C. A.
Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.
Allen, Bradley P.
The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.
Bou-Saada, T. E.; Haberl, J. S.
...heating, ventilating, and air-conditioning (HVAC) design tools to supplement tedious manual energy calculations. Initially, super mini-computers or mainframe computers were required. Unfortunately, simulation was restricted to research organizations supported by public... an hourly MBE of -0.7% and a CV(RMSE) of 23.1%, which compare favorably with the most accurate hourly neural network models. INTRODUCTION: Computers and programmable calculators have been used extensively during the past three decades as effective heating...
John H. Wensley; L. Lamport; J. Goldberg; M. W. Green; K. N. Levitt; P. M. Melliar-Smith; R. E. Shostak; C. B. Weinstock
SIFT (Software Implemented Fault Tolerance) is an ultrareliable computer for critical aircraft control applications that achieves fault tolerance by the replication of tasks among processing units. The main processing units are off-the-shelf minicomputers, with standard microcomputers serving as the interface to the I/O system. Fault isolation is achieved by using a specially designed redundant bus system to interconnect the processing
Zabiyakin, G. I.; Rykovanov, S. N.
A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. Two dimensional graphic display of telemetric information and interaction with the computer, in analysis and processing of telemetric parameters displayed on the screen is provided. The running parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen and the user language are discussed and illustrated.
Wissink, T. L.
Novel hardware configuration makes it possible for Space Shuttle launch processing system to monitor pulse-code-modulated data in real time. Using two microprogrammable "option planes," incoming PCM data are monitored for changes at a rate of one frame of data (80 16-bit words) every 10 milliseconds. Real-time PCM processor utilizes CPU in mini-computer and CPUs in two option planes.
Rennier, A. D.; Bowhill, S. A.
A real time collection system was developed for the Urbana coherent scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described as well as the processing software used to analyze the data. In addition a magnetic tape format for coherent scatter data exchange is given.
Sverre Holm; A. Maoy; E.-A. Herland
A high-performance processing facility for Synthetic Aperture Radar (SAR) is described. The SAR processor is designed for the ERS-1 remote sensing satellite and will process one 100 km by 100 km scene in six to seven minutes. The SAR processor is built around a 320 MFLOPS parallel processor. The front-end processor is a mini-computer which provides the input/output capacity necessary
A. Mura; M. Tomljanovich
An integrated interactive system for the P.C. boards production is described. The system comprises a minicomputer, a teletype, a paper tape puncher and reader, and a package of analysis, control and post-processing programs. The input data to the system consist of circuit schematics, coded in terms of electrical components and connections, and manual layouts of the P.C.B.'s. Both inputs are internally
Garin, J.; Bolfing, B.J.; Satterlee, P.E.; Babcock, S.M.
A three-axis closed-loop position control system has been designed and installed on an overhead bridge, carriage, and tube hoist for automatic positioning of a manipulator at a remotely maintained work site. The system provides accurate (within 3 min) and repeatable three-axis positioning of the manipulator. The position control system has been interfaced to a supervisory minicomputer system that provides teach-playback capability of manipulator positioning and color graphic display of the three-axis system position.
Tisdale, G. E.
Image registration techniques were developed to perform a geometric quality assessment of multispectral and multitemporal image pairs. Based upon LANDSAT tapes, accuracies to a small fraction of a pixel were demonstrated. Because it is insensitive to the choice of registration areas, the technique is well suited to performance in an automatic system. It may be implemented at megapixel-per-second rates using a commercial minicomputer in combination with a special purpose digital preprocessor.
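The correlation-based registration this abstract describes can be illustrated with a brute-force sketch: the offset between two image patches is taken as the shift that maximizes their cross-correlation over the overlap. This integer-pixel search is purely illustrative; the actual system reached sub-pixel accuracy at megapixel-per-second rates with a special-purpose digital preprocessor.

```python
# Hedged sketch: exhaustive search for the (dy, dx) shift that maximizes
# the cross-correlation of `img` against the reference patch `ref`.
def best_shift(ref, img, max_shift):
    """Return (dy, dx) maximizing the overlap correlation of img against ref."""
    h, w = len(ref), len(ref[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < len(img) and 0 <= xx < len(img[0]):
                        score += ref[y][x] * img[yy][xx]
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```

For registration of multispectral or multitemporal pairs, the correlation peak location gives the geometric misalignment between the two acquisitions.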
Murphy, June I.; Matte, Walter B.; Broz, Thomas
In 1980 the Ontario Cancer Treatment and Research Foundation embarked upon an ambitious program to introduce computing in its seven original cancer centres. Considerable expansion of its computer facilities to handle the Ontario Cancer Registry was also required. This paper describes the current status of the program which involves a combination of centralized and distributed computing using an IBM mainframe, Honeywell minicomputers and IBM microcomputers in local area networks. Included are brief descriptions of the major application areas.
A. E. Leybourne; K. S. Ali
The electronics and computer engineering programs at the University of Southern Mississippi offer two process control courses. One is based on classical continuous-domain techniques while the other utilizes the discrete domain. Historically, Distributed Process Control (DPC) tasks have required mainframes or minicomputers, usually with proprietary software. Our experience with an LC-4 Controller (microcomputer) which has been effectively utilized in the discrete
Brown, J. S.; Weinrich, S. S.
A prototype coaxial cable bus communications system was designed to be used in the Trend Monitoring System (TMS) to connect intelligent graphics terminals (based around a Data General NOVA/3 computer) to a MODCOMP IV host minicomputer. The direct memory access (DMA) interfaces which were utilized for each of these computers are identified. It is shown that for the MODCOMP, an off-the-shelf board was suitable, while for the NOVAs, custom interface circuitry was designed and implemented.
Brown, J. S.; Lenker, M. D.
A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.
P. R. Bannister
Extremely low frequency (ELF) measurements are made of the transverse horizontal magnetic field strength received in Connecticut. The AN/BSR-1 receiver consists of an AN/UYK-20 minicomputer, a signal timing and interface unit (STIU), a rubidium frequency time standard, two magnetic tape recorders, and a preamplifier. The transmission source of these farfield (1.6-Mm range) measurements is the U.S. Navy's ELF Wisconsin Test
Nesel, Michael C.; Hammons, Kevin R.
Enhancements to the real-time processing and display systems of the NASA Western Aeronautical Test Range are described. Display processing has been moved out of the telemetry and radar acquisition processing systems super-minicomputers into user/client interactive graphic workstations. Real-time data is provided to the workstations by way of Ethernet. Future enhancement plans include use of fiber optic cable to replace the Ethernet.
J. P. Kahler
A computerized aid for use in making predictions of far-field sound propagation in the atmosphere is described. The computer program, which was written for a PDP-11/45 minicomputer, traces acoustic rays through a moving, stratified, vertically-inhomogeneous atmosphere to provide the information needed to predict the locations and intensities of focused and lesser amplified propagated sound generated by high explosive detonations
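The ray-tracing idea this abstract describes, Snell's law applied layer by layer in a stratified atmosphere, can be sketched in a few lines. The following is a minimal illustration, not the PDP-11/45 program; the sound-speed profile, launch angle, and layer thickness are made-up inputs.

```python
import math

def trace_ray(theta0_deg, c_profile, dz=100.0):
    """Trace one acoustic ray through horizontally stratified layers.

    theta0_deg : launch angle from the vertical at the ground
    c_profile  : sound speed (m/s) at the bottom of each layer, ground first
    dz         : layer thickness in metres

    Snell's law for a layered medium keeps sin(theta)/c constant; the ray
    turns over (is refracted back toward the ground) wherever that
    invariant would force sin(theta) above 1.  Returns (x, z) points.
    """
    snell = math.sin(math.radians(theta0_deg)) / c_profile[0]
    x, z = 0.0, 0.0
    path = [(x, z)]
    for c in c_profile:
        s = snell * c
        if s >= 1.0:          # turning point: ray refracts back down
            break
        theta = math.asin(s)
        x += dz * math.tan(theta)
        z += dz
        path.append((x, z))
    return path

# Upward-refracting profile (speed decreasing with height): the ray climbs
# through all four layers without turning over.
path = trace_ray(45.0, [340.0, 338.0, 336.0, 334.0])
```

A real focusing prediction would run many such rays over a fan of launch angles and look for caustics where adjacent rays converge.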
J. G. Schneider; M. D. Risley; M. J. Reazer; A. V. Serrano; J. L. Hebert
The vulnerability of current and future aircraft to lightning-induced EMP and electrostatic discharges can be assessed by the techniques developed at Wright Aeronautical Laboratories. The methods feature cost-effective data acquisition, processing and storage systems, and software of exceptionally high productivity. The microcomputer-controlled minicomputers used are capable of transient as well as CW measurements, and analog fiber
Dobrotin, B. M.
A brief outline of NASA's current robotics program is presented. Efforts are being concentrated on a roving surface vehicle for Mars exploration. This vehicle will integrate manipulative, locomotive, and visual functions and will feature an electromechanical manipulator, stereo TV cameras, a laser rangefinder, a minicomputer, and a remote off-line computer. The program hinges on the iterative development of complex scenarios describing the robot's mission and the interrelationships among its various subsystems.
R. E. Gibbs; R. A. Whitby; J. D. Hyde; R. E. Johnson; B. J. Hill; P. L. Werner; T. E. Hoffman; P. A. Gabele
The emission test capability developed at New York's Automotive Emissions Laboratory is described as applied to realtime measurements from in-use three-way-catalyst vehicles. Realtime data from tailpipe emissions analyzers, engine air/fuel ratio analyzer, and vehicle speed/acceleration/power from an electric chassis dynamometer are polled several times per second by a minicomputer data acquisition system. These data are obtained in parallel with conventional
Under sponsorship of ASHRAE and TC-1.5 (Computer Applicatons), a bibliography of computer programs has been developed that inventories available software in the general area of heating, refrigerating, air conditioning, and ventilating. The bibliography contains annotated software abstracts for all sizes of computers (programmable calculators, microcomputers, minicomputers, and mainframes) and includes the topical areas of acoustics, computer-aided design, mechanical equipment design,
E. T. Barron; R. M. Glorioso
This paper discusses the design and construction of a microprogram-controlled minicomputer used as a peripheral processor unit for the PDP-11/20 in the Electrical and Computer Engineering Laboratory at the University of Massachusetts at Amherst. The instruction set for this computer is determined by the microcode in a read-only memory (ROM) and is therefore flexible: changing the ROM results
The first VLSI device to embody a practical data-flow architecture, the μPD7281 image pipelined processor, performs repetitive data-intensive tasks extremely quickly at the bidding of a conventional host system. Its speed of 5 million instructions/s allows the construction of a system equivalent to a medium- to high-performance minicomputer and makes it a natural for image-processing applications, but it is also well suited for
Scharrer, G. L.
Thermal battery functional test data are stored in an HP3000 minicomputer operated by the Power Sources Department. A program was written to read data from a battery data base, compute simple statistics (mean, minimum, maximum, standard deviation, and K-factor), print out the results, and store the data in a file for subsequent plotting. A separate program was written to plot the data. The programs were written in the Pascal programming language.
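The statistics this program computes are simple enough to sketch. The following is a hypothetical reconstruction in Python rather than the HP3000 Pascal program; in particular, the abstract does not define its K-factor, so the capability-style margin below is purely illustrative.

```python
import math

def battery_stats(readings, spec_limit=None):
    """Summary statistics in the spirit of the battery data-base program.

    Returns mean, min, max and sample standard deviation.  The original
    report also computes a 'K-factor'; its definition is not given, so
    here it is illustrated as a capability-style margin,
    (mean - spec_limit) / std, which is a hypothetical stand-in.
    """
    n = len(readings)
    mean = sum(readings) / n
    var = sum((r - mean) ** 2 for r in readings) / (n - 1)
    std = math.sqrt(var)
    stats = {"mean": mean, "min": min(readings),
             "max": max(readings), "std": std}
    if spec_limit is not None and std > 0:
        stats["k_factor"] = (mean - spec_limit) / std
    return stats

# Illustrative battery voltages (not measured data) against a 26.0 V spec.
s = battery_stats([27.1, 27.4, 26.9, 27.3, 27.0], spec_limit=26.0)
```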
A silicon valley museum-in-the-making showcases a half-century of innovation in computing. The museum at Moffett contains perhaps the most complete collection of groundbreaking hardware and software in the world-from the Hollerith punch-card tabulating machine that rescued the 1890 US census to the LINC laboratory minicomputer to a prototype of the Palm Pilot PDA and an early copy of IBM's gigabyte
Alley, P. L.
Significant considerations are described for performing a Severe Storms Measurement program in real time. Particular emphasis is placed on the sizing and timing requirements for a minicomputer-based system. Analyses of several factors which could impact the effectiveness of the system are presented. The analyses encompass the problems of data acquisition, data storage, data registration, correlation, and flow field computation, and errors induced by aircraft motion, moment estimation, and pulse integration.
Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.
The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.
G. Persky; D. N. Deutsch; D. G. Schweikert
LTX is a minicomputer-based design system for large-scale integrated circuit chip layout which offers a flexible set of interactive and automatic procedures for translating a circuit connectivity description into a finished mask design. The system encompasses algorithms for two-dimensional placement, string placement, exploitation of equivalent terminals, decomposition of routing into channels, and channel routing. Circuit connectivity is preserved during interactive
The VuGRAPH code offers possibly the fastest method of generating professional-quality viewgraph transparencies. The viewgraphs are constructed on a four-color plotter controlled by an HP-9825A desk-top minicomputer. The program uses a unique line enhancement scheme to produce bold characters optimally suited for projection in most auditoriums. Twenty-four 'single-key' commands simplify operation for users. This feature and the fast-response typewriter keyboard have
White, P. R.; Little, R. R.
A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
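The lumped-mass formulation used in such vibration codes reduces to the generalized eigenvalue problem det(K − ω²M) = 0. A minimal two-degree-of-freedom sketch (not the authors' codes, which handle full blade and counterweight models) shows the computation; masses and stiffnesses below are invented.

```python
import math

def two_dof_frequencies(m1, m2, k1, k2):
    """Natural frequencies (rad/s) of a two-mass spring chain:
    ground --k1-- m1 --k2-- m2.

    det(K - w^2 M) = 0 reduces here to a quadratic in L = w^2:
        m1*m2*L**2 - (m1*k2 + m2*(k1 + k2))*L + k1*k2 = 0
    Returns the two frequencies in ascending order.
    """
    a = m1 * m2
    b = m1 * k2 + m2 * (k1 + k2)
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    lam1 = (b - disc) / (2 * a)
    lam2 = (b + disc) / (2 * a)
    return math.sqrt(lam1), math.sqrt(lam2)

# Equal masses and springs: the classic result w = sqrt(k/m) times
# 0.618... and 1.618... (the golden-ratio pair).
w1, w2 = two_dof_frequencies(1.0, 1.0, 100.0, 100.0)
```

A production code builds K and M for many lumped stations and calls a general eigensolver, but the structure of the problem is the same.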
Harris, C. G.; Haris, D. K.
A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.
Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.
A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results for a BWR plant simulation are shown to demonstrate computing capacity, accuracy, and speed. Simulation speeds have been achieved which are 110 times those of a CDC-7600 mainframe computer, or ten times greater than real-time speed.
Harwood, Ann Elizabeth Gelber
programs written in FORTRAN. They were developed on a VAX 11/780 minicomputer and are currently run on a VAX 11/785 under the VMS operating system. All the programs use TAMLIF design files as their input. TAMLIF uses syntax similar to the widely... using a series of nested FORTRAN "IF" statements that compare their bit patterns and the bit pattern of the current cell to patterns defined by the rules. When violations are detected, the location and type of the error, along with its associated...
Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.
Utsman, T. E.
A significant element of the Kennedy Space Center's ground support equipment for the Space Shuttle is the Launch Processing System, which provides a high level of automation for all operations, including the checkout of the Orbiter, Solid Rocket Boosters, and External Tank. Other direct support elements of the Ground Support Equipment accomplish environmental conditioning, provide and control power, gases, and fluids, and supply vehicle facility and personnel fire protection. Attention is given to the prelaunch functions of the Launch Control Center's Firing Rooms, which contain minicomputers, a data recording area, the Hardware Interface Modules, a Common Data Buffer, and Front End Processors.
Kantak, Anil V.
A new method is presented for handling data resulting from Mobile Satellite propagation experiments such as the Pilot Field Experiment (PiFEx) conducted by JPL. This method uses the UNIX operating system and C programming language. The data management system is implemented on a VAX minicomputer. The system automatically divides the large data file housing data from various experiments under a predetermined format into various individual files containing data from each experiment. The system also has a number of programs written in C and FORTRAN languages to allow the researcher to obtain meaningful quantities from the data at hand.
on a modified Ohio Nuclear rectilinear scanner capable of making a single-pass or multi-pass scan over a subject. The multi-channel analyzer was coupled to a PDP11-34A minicomputer... [Table 1: physical data on dogs at time of injection.] In an effort to determine the equilibration time, eight dogs were injected with a K-chloride solution. Two curves were generated. The first curve showed the serum specific activity as a function of time, and the second curve demonstrated the exchangeable...
Butera, M. K.
Results are given for three separate investigations of remote sensing over wetlands, including the delineations of roseau cane and mangrove from both Landsat and aircraft MSS data, and the delineation of wetland communities for potential waste assimilation in a coastal river floodplain from Landsat MSS data only. Attention is also given to data processing and analysis techniques of varying levels of sophistication, which must increase with surface cover diversity. All computer processing in these studies was performed on a minicomputer configured with the adequate memory, image display capability, and associated peripherals, using state-of-the-art digital MSS data analysis software.
Weinstein, Berthold W. [Livermore, CA; Willenborg, David L. [Livermore, CA
A manipulator which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, which allows inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a mini-computer which can be programmed to generate any desired scan pattern.
Weinstein, B.W.; Willenborg, D.L.
A manipulator is disclosed which provides fast, accurate rotational positioning of a small sphere, such as an inertial confinement fusion target, which allows inspection of the entire surface of the sphere. The sphere is held between two flat, flexible tips which move equal amounts in opposite directions. This provides rolling of the ball about two orthogonal axes without any overall translation. The manipulator may be controlled, for example, by an x- and y-axis drive controlled by a mini-computer which can be programmed to generate any desired scan pattern. 8 figs.
O'Neill, Pat; Volkert, J. Jay; Koop, Gerald O.
The exploding technology in micro and personal computers has stimulated knowledgeable occupational health professionals to examine their potential applications in their own work. Commercially available health surveillance systems are currently being offered in large minicomputers or mainframes. Does the revolution in hardware technology now mean that a comprehensive occupational health system can be supported by a small inexpensive computer? Such a machine-independent information system has been developed to perform data base management functions for tracking employee health status to discern potential health effects from workplace exposures.
Allen, Bradley P.
The plan is described for the integrated testing and benchmarking of the Ada-based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX mini-computers and VAX stations running the VMS operating system.
Chibon, P; Brugal, G; Chassery, J M; Bouttaz, R; Adelh, D; Garbay, C; Giroud, F
The system SAMBA has been designed for automatic analysis of biological images, at the cellular or subcellular level. The examination is performed at the maximum resolution power of the microscope. It makes it possible to discriminate, by means of a pool of morphological and densitometric parameters, between cells in the different phases of the cell cycle, or between cells belonging to various types in heterogeneous populations. Other recognition programs are at present in progress, in order to promote the use of SAMBA in other fields of fundamental research and clinical application. Owing to the minicomputer now in use, SAMBA is an autonomous system, capable of being routinely used in diagnostic centers. PMID:161188
Neuhold, E. J.
In the last few years a number of research and advanced development projects have resulted in distributed data base management prototypes. POREL, developed at the University of Stuttgart, is a multiuser, distributed, relational system developed for wide and local area networks of minicomputers and advanced micros. The general objectives of such data base systems and the architecture of POREL are discussed. In addition, a comparison of some of the existing distributed DBMSs is included to provide the reader with information about the current state of the art.
Anderson, R. E.; Lewis, J. R.; Trudell, B. J.
A variety of techniques potentially useful to data collection have been tested. An automatic data collection platform with a minicomputer collects and preprocesses data, then sends desired information when interrogated through a communication satellite. Position surveillance by tone-code ranging through communication satellites is automatic, real time and accurate. Emergency medical data transmissions from ambulances to hospitals can be extended to rural and remote areas by direct satellite links. A small platform can send emergency-related data through a satellite while the satellite is routinely relaying powerful communication signals. A low orbit satellite provides means to locate existing emergency locator beacons.
Eppler, W. G.
The table look-up approach to pattern recognition has been used for 3 years at several research centers in a variety of applications. A new version has been developed which is faster, requires significantly less core memory, and retains full precision of the input data. The new version can be used on low-cost minicomputers having 32K words (16 bits each) of core memory and fixed-point arithmetic; no special-purpose hardware is required. An initial FORTRAN version of this system can classify an ERTS computer-compatible tape into 24 classes in less than 15 minutes.
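The speed of the table look-up approach comes from classifying each distinct quantized feature vector only once, then reusing the stored answer for every matching pixel. A schematic Python sketch follows; the class names and decision rule are invented for illustration and are not from the original system.

```python
def make_index(pixel, bits=4):
    """Pack a 4-channel pixel (values 0-255) into one table index by
    keeping only the top `bits` of each channel.  Coarse quantization
    keeps the table small enough for a 32K-word memory budget."""
    idx = 0
    for band in pixel:
        idx = (idx << bits) | (band >> (8 - bits))
    return idx

def classify(pixels, table, fallback_rule, bits=4):
    """Classify pixels via a lookup table, invoking the (expensive)
    fallback_rule only for indices not seen before."""
    out = []
    for p in pixels:
        i = make_index(p, bits)
        if i not in table:
            table[i] = fallback_rule(p)   # full classification, done once
        out.append(table[i])
    return out

# Toy rule: 'veg' if band 3 dominates band 2, else 'soil' (hypothetical).
rule = lambda p: "veg" if p[3] > p[2] else "soil"
table = {}
labels = classify([(10, 20, 30, 200), (10, 20, 31, 201), (90, 80, 70, 10)],
                  table, rule)
```

The first two pixels quantize to the same index, so the full rule runs only twice for three pixels; on real multispectral scenes the hit rate is far higher.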
Klenke, D. J.; Hemsch, M. J.
Data from plume interaction tests, nose mounted canard configuration tests, and high angle of attack tests on the Army Generalized Missile model are consolidated in a computer program which makes them readily accessible for plotting, listing, and evaluation. The program is written in FORTRAN and will run on an ordinary minicomputer. It has the capability of retrieving any coefficient from the existing DATAMAN tapes and displaying it in tabular or plotted form. Comparisons of data taken in several wind tunnels and of data with the predictions of Program MISSILE2 are also presented.
Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.
One hundred and seventy-four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP 11/03 mini-computer with dual floppy disc drive, a DECwriter high-speed printer, and a field-image feature interface module. Three versions of a computer program that controls the data acquisition and analysis on the QTM-720 were written. Procedures for the chemical polishing and etching were also developed.
Antares, a large, experimental laser fusion facility under construction at Los Alamos National Laboratory in New Mexico, is controlled by a network of PDP-11 minicomputers and microprocessors. The remote nodes of the Antares control network are based on an LSI-11/2 microcomputer interfaced to an STD Bus. This machine interface or MI forms the intelligent process controller located directly adjacent to the many diverse laser subsystem devices. The STD Bus, linked to the LSI-11/2 microcomputer, offers a standardized, cost effective means for the development of the specialized interface functions required for the high energy laser environment.
Pepper, D W
A sophisticated emergency response system was developed to aid in the evaluation of accidental releases of hazardous materials from the Savannah River Plant to the environment. A minicomputer system collects and archives data from both onsite meteorological towers and the National Weather Service. In the event of an accidental release, the computer rapidly calculates the trajectory and dispersion of pollutants in the atmosphere. Computer codes have been developed which provide a graphic display of predicted concentration profiles downwind from the source, as functions of time and distance.
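The dispersion calculation such a system performs is typically a Gaussian plume model. The sketch below is a generic illustration, not the Savannah River codes; the power-law dispersion coefficients are hypothetical stand-ins for the stability-class curves a real emergency-response code would use.

```python
import math

def plume_concentration(q, u, x, y, z, h, a=0.08, b=0.06):
    """Gaussian plume concentration (g/m^3) with ground reflection.

    q : source strength (g/s), u : wind speed (m/s), h : release height (m),
    (x, y, z) : downwind, crosswind and vertical receptor coordinates (m).
    Dispersion widths grow with downwind distance as simple power laws,
    sigma = coeff * x**0.9 (an illustrative assumption).
    """
    sy = a * x ** 0.9
    sz = b * x ** 0.9
    cross = math.exp(-y * y / (2 * sy * sy))
    vert = (math.exp(-((z - h) ** 2) / (2 * sz * sz))
            + math.exp(-((z + h) ** 2) / (2 * sz * sz)))  # image source term
    return q / (2 * math.pi * u * sy * sz) * cross * vert

# For a ground-level release, centreline ground concentration decays
# monotonically with distance.
c_near = plume_concentration(100.0, 5.0, 500.0, 0.0, 0.0, 0.0)
c_far = plume_concentration(100.0, 5.0, 5000.0, 0.0, 0.0, 0.0)
```

Evaluating this formula on a grid of receptors, at successive times along the wind trajectory, yields exactly the kind of downwind concentration profiles the abstract describes.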
This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.
Moshkina, T.I.; Nakhmanson, M.S.
This paper discusses the problem of distinguishing the analytical line from the background and approximates the background component. One of the constituent parts of the program package in the procedural-mathematical software for x-ray investigations of polycrystalline substances in application to the DRON-3, DRON-2 and ADP-1 diffractometers is the SSF system of programs, which is designed for determining the parameters of the substructure of materials. The SSF system is tailored not only to Unified Series (ES) computers, but also to the M-6000 and SM-1 minicomputers.
Crankshaw, D P; Hall, B R
A medium sized minicomputer system is presented as an attractive way of balancing the cost of computing equipment with that of programme development. A versatile circuit has been coupled to a computer system and programmes have been developed to control the functions of this circuit. With such a system it is possible for relatively untrained biological workers to use FORTRAN programming language to control sampling of analogue signals, manipulation of data and finally graphical presentation of results. The system is proposed for use both in anaesthetic research and in on-line monitoring of critically ill patients. PMID:596620
Schellart, N A; Zweijpfenning, R C; van Marle, J; Huijsmans, D P
Using a video-image system coupled to a minicomputer with commercial image-handling software, autoradiographic grains displayed in dark-field are counted with a fast (ca. 3.5 min for 120,000 μm²) and reliable (false scores less than 5%) grain-recognizing FORTRAN program executed in the user's memory. The grain counts are printed in a raster of adjustable size overlying a bright-field image, so that the counts can be related directly to the underlying histological structures. PMID:3640681
Ling, Y.C.; Bernardo, D.N.; Morrison, G.H.
There has been a steadily increasing demand for more computational power in surface and interface analysis. This paper reports attempts to meet this demand through the use of different computing systems, ranging from minicomputer to supercomputer. Representative laboratory data-processing programs for ion-microscopic analysis are used to evaluate the performance of each system. The bottlenecks and other problems involved in running analytical programs on faster machines are identified and discussed. Results indicate that in order to attain the optimal cost-performance ratio, programs must be formulated to exploit available vector and parallel-processing capabilities.
Sandia National Laboratories uses computer-aided design (CAD) equipment for electrical, mechanical, plant facilities and site modeling applications in support of its responsibilities as a prime contractor to the Department of Energy. This paper describes how CAD assists in product design and definition with use of the geometric data base for engineering analysis and computer-aided manufacturing (CAM). The benefits of CAD to these applications are emphasized. Geometric models and their mass properties are passed through a network of CAD computers to a super minicomputer for plot post-processing, analysis, animation and the generation of numerical control codes. Examples of these applications are presented, and some future activities are projected.
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
D. L. Anderson; M. P. Failey; W. H. Zoller; W. B. Walters; G. E. Gordon; R. M. Lindstrom
A facility for neutron-capture γ-ray spectroscopy for analytical purposes has been developed and tested at the National Bureau of Standards reactor. The system consists of an internal beam tube with collimators, an external beam tube and irradiation station, a Compton-suppressed Ge(Li) γ-ray detection system, and a minicomputer-based data-collection and analysis system. Detection limits have been established for many elements and errors
Shope, William G., Jr.
The U. S. Geological Survey maintains the basic hydrologic data collection system for the United States. The Survey is upgrading the collection system with electronic communications technologies that acquire, telemeter, process, and disseminate hydrologic data in near real-time. These technologies include satellite communications via the Geostationary Operational Environmental Satellite, Data Collection Platforms in operation at over 1400 Survey gaging stations, Direct-Readout Ground Stations at nine Survey District Offices, and a network of powerful minicomputers that allows data to be processed and disseminated quickly.
Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.
A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions though a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.
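The prediction step the abstract mentions is straight-line extrapolation: a track marked in two emulsion layers fixes a line whose intersection with the next layer follows directly. A minimal sketch, with illustrative coordinates rather than real emulsion measurements:

```python
def predict_track(p1, p2, z_next):
    """Extrapolate a straight particle track, measured at two depths,
    to the depth of the next emulsion layer.

    p1, p2 : (x, y, z) measurements of the same track in two layers.
    Returns the predicted (x, y, z_next) position, which the operator
    can then confirm or correct against the microscope image.
    """
    x1, y1, z1 = p1
    x2, y2, z2 = p2
    t = (z_next - z1) / (z2 - z1)
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1), z_next)

# A track through (0, 0, 0) and (1, 2, 100) continues to depth 200.
pred = predict_track((0.0, 0.0, 0.0), (1.0, 2.0, 100.0), 200.0)
```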
The results are presented of a program to identify technical innovations which would have an impact on NASA data processing and describe as fully as possible the development work necessary to exploit them. Seven of these options for NASA development, as the opportunities to participate in and enhance the advancing information system technology were called, are reported. A detailed treatment is given of three of the options, involving minicomputers, mass storage devices and software development techniques. These areas were picked by NASA as having the most potential for improving their operations.
Wu, H. C.
Creep tests were conducted by means of a closed loop servocontrolled materials test system. The strain history prior to creep is carefully monitored. Tests were performed for aluminum alloy 6061-O at 150 C and were monitored by a PDP 11/04 minicomputer at a preset constant plastic strain rate prehistory. The results show that the plastic strain rate prior to creep plays a significant role in creep behavior. The endochronic theory of viscoplasticity was applied to describe the observed creep curves. Intrinsic time and strain rate sensitivity function concepts are employed and modified according to the present observation.
Watson, K.; Hummer-Miller, S.; Sawatzky, D. L. (principal investigators)
Neither iterative registration, using drainage intersection maps for control, nor cross correlation techniques were satisfactory in registering day and night HCMM imagery. A procedure was developed which registers the image pairs by selecting control points and mapping the night thermal image to the daytime thermal and reflectance images using an affine transformation on a 1300 by 1100 pixel image. The resulting image registration is accurate to better than two pixels (RMS) and does not exhibit the significant misregistration that was noted in the temperature-difference and thermal-inertia products supplied by NASA. The affine transformation was determined using simple matrix arithmetic, a step that can be performed rapidly on a minicomputer.
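Determining an affine transformation from control points by "simple matrix arithmetic" amounts to two small least-squares problems, one per output coordinate. A self-contained sketch in pure Python follows; the control points are invented, and a minicomputer implementation would differ only in language, not in the arithmetic.

```python
def fit_affine(src, dst):
    """Least-squares affine transform mapping src -> dst control points.

    Solves the normal equations (A^T A) p = A^T b twice, once for each
    output coordinate, by Gaussian elimination on a 3x3 system.  Returns
    (a, b, c, d, e, f) with x' = a*x + b*y + c and y' = d*x + e*y + f.
    """
    def solve3(m, v):
        # Gaussian elimination with partial pivoting, then back-substitution.
        m = [row[:] + [v[i]] for i, row in enumerate(m)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            for r in range(col + 1, 3):
                f = m[r][col] / m[col][col]
                for c in range(col, 4):
                    m[r][c] -= f * m[col][c]
        out = [0.0] * 3
        for r in (2, 1, 0):
            out[r] = (m[r][3] - sum(m[r][c] * out[c]
                                    for c in range(r + 1, 3))) / m[r][r]
        return out

    rows = [[x, y, 1.0] for x, y in src]
    ata = [[sum(rows[k][i] * rows[k][j] for k in range(len(rows)))
            for j in range(3)] for i in range(3)]
    params = []
    for coord in (0, 1):
        atb = [sum(rows[k][i] * dst[k][coord] for k in range(len(rows)))
               for i in range(3)]
        params += solve3(ata, atb)
    return tuple(params)

# A pure translation by (+5, -2) is recovered exactly from four points.
p = fit_affine([(0, 0), (10, 0), (0, 10), (10, 10)],
               [(5, -2), (15, -2), (5, 8), (15, 8)])
```

With more than three control points the fit averages out marking error, which is why selecting several well-spread points gives sub-two-pixel RMS registration.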
Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.
Parrish, David J.
A flexible manufacturing system (FMS) consists of production equipment organized by a host computer and physically connected by a central transport system which must be sufficiently versatile to allow a variety of parts to be simultaneously manufactured; this can constitute a computer-integrated manufacturing (CIM) system. CIM achieves automation at a total of five distinct levels, from production equipment, through microcomputers and NC equipment, production cell host computers, and minicomputers and mainframes whose functions encompass planning and design. Attention is given to FMS machining center dialogues, mode-changing, host functions, and coupled functions.
Luckey, Richard R.
During the past 20 years, the ground-water data base of the U. S. Geological Survey has evolved from paper files in local offices, to a national data base on a central mainframe computer, to a distributed data base on a network of 49 minicomputers throughout the United States. Users in local offices have easy, inexpensive access to the distributed data base. The distributed data base has caused some problems in data management but has increased the overall quality of the data base.
Boyle, R. J.; Jensen, R. N.; Knoll, R. H.
The thermal performance of the solar collector field for the NASA Langley Solar Building Test Facility is given for October 1976 through January 1977. A 1,180 square meter solar collector field with seven collector designs helped to provide hot water for the building heating system and absorption air conditioner. The collectors were arranged in 12 rows with nominally 51 collectors per row. Heat transfer rates for each row were calculated and recorded along with sensor, insolation, and weather data every five minutes using a minicomputer. The agreement between the experimental and predicted collector efficiencies was generally within five percentage points.
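The per-row heat transfer rates logged every five minutes presumably follow from the standard sensible-heat balance, Q = ṁ·c_p·ΔT, with efficiency as collected heat over incident solar power. The following is a minimal sketch of that calculation; the flow rate, temperatures, insolation, and row area are illustrative values, not the facility's sensor data.

```python
# Hedged sketch: per-row collector heat rate from flow and temperature
# sensors, Q = m_dot * c_p * (T_out - T_in), and the resulting efficiency.
CP_WATER = 4186.0  # J/(kg*K), specific heat of water

def row_heat_rate(mass_flow_kg_s: float, t_in_c: float, t_out_c: float) -> float:
    """Heat collected by one row of collectors, in watts."""
    return mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)

def efficiency(heat_rate_w: float, insolation_w_m2: float, area_m2: float) -> float:
    """Collector efficiency: useful heat over incident solar power."""
    return heat_rate_w / (insolation_w_m2 * area_m2)

q = row_heat_rate(0.8, 45.0, 55.0)   # 0.8 kg/s flow, 10 K temperature rise
eta = efficiency(q, 800.0, 98.0)     # ~98 m^2 per row (illustrative)
```

Evaluating one such pair of expressions per row every five minutes is a light load, which is why a single minicomputer could both compute and record the data.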
Sallam, A.A.; Dineley, J.L.
A mathematical method, Catastrophe Theory, is applied to the problem of electrical power system dynamic stability. It is suggested that this offers a method for continually monitoring power system stability margins through a visual graphic display produced by a dedicated minicomputer using information monitored from the power system. The approach arises from long experience in the field of power system stability and a preoccupation with visualising this multi-dimensional dynamic problem in a way that enhances comprehension, both as an aid to understanding and as a means of rapidly assimilating the significance of changes in the system.
Likens, W. C.; Wrigley, R. C.
Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.
Hunt, G. H.; Shelton, G. B.
A description is presented of an experimental measurement system for the study of the stray radiation performance of a 50-cm aperture astronomical telescope model. The model incorporates a very high performance baffle system. To simulate the space environment of the actual orbiting telescope, the telescope model was placed in a vacuum chamber. The stray radiation source for the experiment was a ruby laser system. A photomultiplier tube was used for the detector. A minicomputer system was used to control the experiment and to gather and process the data. A computer program was used to model the telescope baffle system. Experimentally and analytically determined stray radiation attenuation data are compared.
Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.
The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX(HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
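The method-of-moments statistics mentioned above are the standard weighted moments of the grain-size distribution in phi units (phi = -log2 of the diameter in millimeters). A minimal sketch, assuming a binned weight-percent distribution, is given below; the bin midpoints and weights are invented illustrative values, not output of the described system.

```python
import numpy as np

# Hedged sketch of method-of-moments grain-size statistics from a weight
# distribution binned in phi units. Illustrative data, not measured sediment.
phi_midpoints = np.array([0.5, 1.5, 2.5, 3.5, 4.5, 5.5])
weight_pct = np.array([5.0, 20.0, 35.0, 25.0, 10.0, 5.0])

f = weight_pct / weight_pct.sum()              # normalized weight fractions
mean = np.sum(f * phi_midpoints)               # first moment: mean size (phi)
var = np.sum(f * (phi_midpoints - mean) ** 2)  # second central moment
sorting = np.sqrt(var)                         # standard deviation (sorting)
skewness = np.sum(f * (phi_midpoints - mean) ** 3) / sorting ** 3
kurtosis = np.sum(f * (phi_midpoints - mean) ** 4) / var ** 2
```

The verbal sediment classification and the plots the abstract mentions would then be driven by these same four moment statistics.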
Forman, Royce G.; Shivakumar, V.; Newman, James C., Jr.
Fatigue Crack Growth (NASA/FLAGRO) computer program developed as aid in predicting growth of preexisting flaws and cracks in structural components of space systems. Is enhanced version of FLAGRO4 and incorporates state-of-the-art improvements in both fracture mechanics and computer technology. Provides fracture-mechanics analyst with computerized method of evaluating "safe-crack-growth-life" capabilities of structural components. Also used to evaluate tolerance to damage of structure of given design. Designed modular to facilitate revisions and operation on minicomputers. Written in FORTRAN 77.
Romine, Peter L.
This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, a Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completing the implementation of the Rocketdyne weld tracking control are discussed, as is the procedure for downloading programs from the Intergraph over the network. Conclusions are drawn from the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.
Owen, E.W.; Shimer, D.W.; Lindquist, W.B.; Peterson, R.L.; Wyman, R.H.
The paper describes the control and instrumentation for the Mirror Fusion Test Facility at the Lawrence Livermore National Laboratory, California, USA. This large-scale scientific experiment in controlled thermonuclear fusion, which is currently being expanded, originally had 3000 devices to control and 7000 sensors to monitor. A hierarchical computer control system is used, with nine minicomputers forming the supervisory system. There are approximately 55 local control and instrumentation microcomputers. In addition, each device has its own monitoring equipment, which in some cases consists of a small computer. After describing the overall system, a more detailed account is given of the control and instrumentation for two large superconducting magnets.
Caldwell, S.E.; Gentry, R.A.; White, R.W.; Allison, S.W.
Laser induced fluorescence (LIF) can be used to determine the pressure and temperature of a UF6 gas sample. An external pulsed laser is used to excite the gas, and a multichannel fiber optics system simultaneously collects fluorescence signals emanating from a number of points in the gas. The signals are digitized and presented to a minicomputer for data reduction. Both fluorescence intensity and lifetime are used to deduce temperature and pressure. The LIF probe system is described. Analysis of the data is discussed, and representative results are presented.
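The lifetime half of the data reduction described above amounts to fitting an exponential decay to each digitized fluorescence trace. One common minicomputer-friendly approach is a log-linear least-squares fit, sketched below on a synthetic noiseless trace; the lifetime value and sample spacing are illustrative assumptions, not the experiment's numbers.

```python
import numpy as np

# Hedged sketch: extracting a fluorescence lifetime from a digitized decay
# trace via a log-linear least-squares fit, I(t) = I0 * exp(-t / tau).
# The synthetic trace stands in for the digitized photomultiplier signal.
tau_true = 2.5e-7                      # 250 ns, illustrative lifetime
t = np.linspace(0.0, 1.0e-6, 200)      # sample times, s
signal = 1000.0 * np.exp(-t / tau_true)

# ln I(t) = ln I0 - t/tau, so a degree-1 polynomial fit recovers -1/tau.
slope, intercept = np.polyfit(t, np.log(signal), 1)
tau_fit = -1.0 / slope
```

With real, noisy signals a weighted or nonlinear fit would be preferable, but the log-linear form shows why lifetime extraction was tractable on the minicomputer of the period.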
Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.
This presentation demonstrates the feasibility of simulating plant transients and severe abnormal transients in nuclear power plants, at much faster than real-time computing speeds, on a low-cost, dedicated, interactive minicomputer. This is achieved by implementing advanced modeling techniques in modern, special-purpose peripheral processors for high-speed system simulation. The results of this demonstration will affect safety analyses, parametric studies, and studies of operator responses and control system failures, and will make possible the continuous on-line monitoring of plant performance and the detection and diagnosis of system or component failures.
Browell, E. V.; Carter, A. F.; Wilkerson, T. D.
Range-resolved water vapor measurements using the differential-absorption lidar (DIAL) technique are described in detail. The system uses two independently tunable optically pumped lasers operating in the near infrared, with laser pulses separated by less than 100 microseconds to minimize concentration errors caused by atmospheric scattering. Water vapor concentration profiles are calculated for each measurement by a minicomputer, in real time. The work is needed in the study of atmospheric motion and thermodynamics as well as in forestry and agriculture problems.
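The real-time profile calculation mentioned above is, in the standard DIAL formulation, a per-range-cell logarithm of the on-line/off-line return ratio divided by twice the differential absorption cross section and cell width. The sketch below demonstrates that computation on synthetic returns; the cross section, cell width, and density are illustrative assumptions, not values from this system.

```python
import numpy as np

# Hedged sketch of the standard DIAL ratio computation for one profile.
delta_sigma = 2.0e-26      # differential absorption cross section, m^2
delta_r = 150.0            # range cell width, m

def dial_density(p_on: np.ndarray, p_off: np.ndarray) -> np.ndarray:
    """Number density (m^-3) per range cell from on/off-line lidar returns."""
    ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Synthetic returns: off-line falls as 1/R^2; on-line sees extra two-way
# absorption from a uniform water vapor density n_true.
r = np.arange(1, 11) * delta_r
n_true = 3.0e23                                   # m^-3, illustrative
p_off = 1.0 / r ** 2
p_on = p_off * np.exp(-2.0 * delta_sigma * n_true * r)
n_est = dial_density(p_on, p_off)
```

Because the geometric 1/R^2 factor and aerosol backscatter cancel in the ratio, only the differential absorption survives, which is what makes the per-shot computation light enough for a minicomputer in real time.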
Nash, Reuel William
, the autoradiographs are photographed through a microscope. The resulting 35-millimeter slides are placed in an optical bench which includes a light source, the digitizing camera, and associated optics. The video output of the camera is digitized at 256x256... resolution by the Colorado Video Incorporated CVI-274 real-time frame store under the control of a Digital Equipment Corporation PDP 11/40 minicomputer. Then, the data in the frame buffer are transferred to disk or tape media for processing. After all...
Sandia's Digital Systems Development Division 1521 has developed a new functional relay tester. Capabilities of this tester include the measurement of coil and contact resistance, hipot, operate current, and contact operation and bounce times. The heart of the tester is a Hewlett-Packard 21MX minicomputer that uses BASIC or FORTRAN programming languages. All measurements are made by means of simple program calls, and all measurement standards are traceable to the National Bureau of Standards. Functional relay test data are stored on a disc drive and can be output as hard copy, manipulated in the computer, or sent over a distributed-system link to other Sandia computers. 17 figures, 4 tables.
SNLA developed and implemented a nuclear material control and accountability system on an HP 3000 minicomputer. The Sandia Nuclear Materials Computer System (SNMCS), which became operative in January 1980, provides: control of shipments and receipts of nuclear material, control of internal transfers of nuclear material, automated inventory with a bar code system, control of inventory adjustments, automated reporting/transmitting to other contractors and operations offices, automated ledgers and journals for material weights and costs, and an interface to the Albuquerque Operations Office (ALO) Automated 741 System.
A minicomputer-based data acquisition and analysis system was implemented for materials research on the long-term creep and time-dependent failure of fibers. The measurement system design allows for data acquisition from multiple creep test stations operating asynchronously. System control of the environment (temperature, stress, humidity) of each test station is also available. System hardware consists of an HP 1000-L series minicomputer operating under RTE-XL control linked to an HP 2250 Measurement and Control Processor based on the L-series microcomputer. The 2250 computer is configured for detection of stress loading and creep initiation, long-term creep and temperature data acquisition, and detection of fiber failure, while the 1000-L computer is used for overall system operation, data analysis, and program development. An operating system for fiber testing was developed to allow the user to interactively control, examine, and modify the status and functions of the multiple test stations in real time. Asynchronous time scheduling of the data acquisition is accomplished with a non-RTE timelist. Because of the long-term nature of these tests (thousands of hours), special system procedures were required to recover from disruption of either computer due to power failures, system failures, and scheduled system maintenance. The fiber test system provides higher data resolution, improved acquisition speed, real time examination and analysis of data, and greater test flexibility and expansion than was previously available with dedicated microprocessors. Documentation is provided which serves as a user's manual for the system operator.
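The "time list" used above for asynchronous scheduling of the test stations is, in essence, a priority queue keyed on each station's next due time, with the loop always servicing the earliest-due station. The sketch below illustrates that pattern; the class and names are hypothetical, since the abstract does not describe the RTE-XL implementation's internals.

```python
import heapq

# Hedged sketch of a "time list" scheduler for asynchronous test stations:
# each station posts its next acquisition time into a heap, and the loop
# always services the earliest-due entry. Names are illustrative.
class TimeList:
    def __init__(self):
        self._heap = []  # entries: (due_time, station_id, action)

    def schedule(self, due_time, station_id, action):
        heapq.heappush(self._heap, (due_time, station_id, action))

    def run_until(self, stop_time):
        """Service every entry due at or before stop_time, in time order."""
        fired = []
        while self._heap and self._heap[0][0] <= stop_time:
            due, station, action = heapq.heappop(self._heap)
            fired.append((due, station))
            action(due, station)
        return fired

tl = TimeList()
log = []
for sid, period in [(1, 10.0), (2, 7.0), (3, 13.0)]:
    tl.schedule(period, sid, lambda t, s: log.append((t, s)))
order = tl.run_until(15.0)   # stations serviced in due-time order
```

In a production version each action would reschedule itself (push its next due time back onto the heap), and the recovery procedures the abstract mentions would rebuild the heap from checkpointed station state after a power or system failure.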
Beck, David L.; Bennett, Robert G.
With the rapid increase in computational power of the standard personal computer, many tasks that could only be performed by a mini-computer or mainframe can now be performed by the common personal computer. Ten years ago, computational and data transfer requirements for a real-time hardware-in-the-loop simulator could only be met by specialized high performance mini-computers. Today, personal computers shoulder the bulk of the computational load in the U.S. Army Aviation and Missile Command's Radio Frequency Simulation System, and one of the U.S. Army Aviation and Missile Command's millimeter wave simulation systems is currently undergoing a transition to personal computers. This paper discusses how personal computers have been used as the computational backbone for a real-time hardware-in-the-loop simulator, and some of the advantages and disadvantages of a PC based simulation. This paper also provides some general background on what the Radio Frequency Simulation System (RFSS) is and how it works, since the RFSS has successfully implemented a PC based real-time hardware-in-the-loop simulator.
Oshima, T.; Matsuda, T.; Tsugita, T.; Sakata, S.; Sato, M.; Koiwa, M.
In the data processing system for the JT-60 tokamak, a unique mass data acquisition system with fast sampling, the transient mass data storage system (TMDS), has been used since 1988. It is composed of a minicomputer and 61 channels of 4/6 MB memory modules with sampling rates up to 200 kHz; about 300 MB of data are transferred to a main computer over a special LAN developed by Fujitsu Ltd. TMDS can handle a large amount of data, but its capabilities, such as CPU power and the number of channels, cannot be expanded. To solve the problems of TMDS, a new fast VME data acquisition system (FDS) has been developed. It can acquire 6 MB of data per channel at a sampling rate of 200 kHz or 1 MHz and consists of a workstation with VMEbus memory modules. To date there are three FDSs with 24 channels. The minicomputer of TMDS has been replaced with a new system based on the technology of FDS. To cope with mass data transfer to a data server, they are connected by a Gigabit Ethernet switch.
Tonn, B.; Edwards, R.; Goeltz, R.; Hake, K.
This report contains a set of recommendations prepared by Oak Ridge National Laboratory (ORNL) for the US Bureau of the Census pertaining to technology innovation and management. Technology has the potential to benefit the Bureau's data collection, capture, processing, and analysis activities. The entire Bureau was represented, from the Decennial Census to Economic Programs, along with various levels of Bureau management and numerous experts in technology. Throughout the Bureau, workstations, minicomputers, and microcomputers have found their place alongside the Bureau's mainframes. The Bureau's new computer file structure, the Topologically Integrated Geographic Encoding and Referencing data base (TIGER), represents a major innovation in geographic information systems, and impressive progress has been made with Computer Assisted Telephone Interviewing (CATI). Other innovations, such as SPRING, which aims to provide Bureau demographic analysts with the capability of interactive data analysis on minicomputers, are in the initial stages of development. Recommendations fall into five independent, but mutually beneficial, categories: (1) the ADP Steering Committee should be disbanded and replaced with the Technology Forum; (2) a Technology Review Committee (TRC) should be established, composed of technology experts from outside the Bureau; (3) technological gurus should be designated, to serve as the Bureau's experts in new and innovative technologies; (4) a technology innovation process should be adopted; and (5) an Advanced Technology Studies Staff (ATSS) should be established to promote technology transfer, obtain funding for technological innovation, manage innovation projects unable to find a home in other divisions, evaluate innovations that cut across Bureau organizational boundaries, and provide input into Bureau technology analyses. (JF)
During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management; hydrologic analysis; and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis accounts for 40% of the computational workload of WRD. Cost accounting, payroll, personnel records, and planning for WRD programs occupy an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)
Wright, C. W.; Bailey, S. A.; Heath, G. E.; Piazza, C. R.
A new multiprocessor data acquisition system was developed for the existing Airborne Oceanographic Lidar (AOL). This implementation simultaneously utilizes five single board 68010 microcomputers, the UNIX system V operating system, and the real time executive VRTX. The original data acquisition system was implemented on a Hewlett Packard HP 21-MX 16 bit minicomputer using a multi-tasking real time operating system and a mixture of assembly and FORTRAN languages. The present collection of data sources produce data at widely varied rates and require varied amounts of burdensome real time processing and formatting. It was decided to replace the aging HP 21-MX minicomputer with a multiprocessor system. A new and flexible recording format was devised and implemented to accommodate the constantly changing sensor configuration. A central feature of this data system is the minimization of non-remote sensing bus traffic. Therefore, it is highly desirable that each micro be capable of functioning as much as possible on-card or via private peripherals. The bus is used primarily for the transfer of remote sensing data to or from the buffer queue.
Satran, D. R.; Holbrook, G. T.; Greene, G. C.; Neuhart, D.
Recent modernization of NASA's Vortex Research Facility is described. The facility has a 300-ft test section, scheduled for a 300-ft extension, with constant test speeds of the model up to 100 ft/sec. The data acquisition hardware and software improvements included the installation of a 24-channel PCM system onboard the research vehicle, and a large dedicated 16-bit minicomputer. Flow visualization of the vortex wake in the test section is by particle seeding, and a thin sheet of argon laser light perpendicular to the line of flight; detailed flow field measurements are made with a laser velocimeter optics system. The improved experimental capabilities of the facility were used in a study of atmospheric stratification effects on wake vortex decay, showing that the effects of temperature gradient must be taken into account to avoid misleading conclusions in wake vortex research.
Wulff, W.; Cheng, H.S.; Mallen, A.N.
The plant analyzer recently developed at Brookhaven National Laboratory embodies a unique combination of high simulation fidelity, convenient interactive access, much faster than real-time simulation speed, and low cost. This has been achieved through a deliberate match of modeling techniques with modern, special-purpose minicomputer architecture designed for high-speed simulations of complex systems. The BNL Plant Analyzer is a powerful engineering tool for carrying out, routinely and cost-effectively, safety analyses, optimizations of procedures, parametric studies to obtain safety margins, and generic training. It is also applicable for operator assistance through plant performance monitoring, automatic fault diagnostics, and on-line support during emergencies. This paper presents five modeling principles, the criteria for selecting numerical integration techniques, and the key features of the special-purpose peripheral processor, the AD10 of Applied Dynamics International. The paper presentation is followed by a remote access demonstration of the plant analyzer.
Wulff, W.; Cheng, H.S.; Mallen, A.N.
Advanced modeling techniques have been combined with modern, special-purpose peripheral minicomputer technology to develop a plant analyzer which provides realistic and accurate predictions of plant transients and severe off-normal events in nuclear power plants through on-line simulations at speeds of approximately 10 times faster than actual process speeds. The new simulation technology serves not only for carrying out routinely and efficiently safety analyses, optimizations of emergency procedures and design changes, parametric studies for obtaining safety margins and for generic training but also for assisting plant operations. Five modeling principles are presented which serve to achieve high-speed simulation of neutron kinetics, thermal conduction, nonhomogeneous and nonequilibrium two-phase flow coolant dynamics, steam line acoustical effects, and the dynamics of the balance of plant and containment systems, control systems and plant protection systems. 21 refs.
Tilton, James C.
Classifiers are often used to produce land cover maps from multispectral Earth observation imagery. Conventionally, these classifiers have been designed to exploit the spectral information contained in the imagery. Very few classifiers exploit the spatial information content of the imagery, and the few that do rarely exploit spatial information content in conjunction with spectral and/or temporal information. A contextual classifier that exploits spatial and spectral information in combination through a general statistical approach was studied. Early test results obtained from an implementation of the classifier on a VAX-11/780 minicomputer were encouraging, but they are of limited meaning because they were produced from small data sets. An implementation of the contextual classifier is presented on the Massively Parallel Processor (MPP) at Goddard that for the first time makes feasible the testing of the classifier on large data sets.
Hansen, J. E.; Larsen, F. Holm
Improvements to the subsystems of the ESA spherical near-field test range are described. An antenna positioner steel tower allowing test antenna diameters of 6 m was installed. A series of 14 measuring probes (dual polarized) was designed, constructed, and tested. The probes cover the frequency range of 3 to 18 GHz. A spherical near-field transformation program SNIFN allowing test antenna diameters of 300 wavelengths was implemented on the system minicomputer. The alignment sequence was updated. A user guide to the measurement system and its software is presented. The information is sufficiently detailed to tell a test engineer how to operate the system manually and in the automatic mode. It also allows the operator to process data, including the near-field to far-field transformation, and to produce radiation patterns and contour plots. Measurements produced are reviewed.
Bragg, M. B.; Zaguli, R. J.; Gregorek, G. M.
A two-phase wind tunnel test was conducted in the 6 by 9 foot Icing Research Tunnel (IRT) at NASA Lewis Research Center to evaluate the effect of ice on the performance of a full scale general aviation wing. In the first IRT tests, rime and glaze shapes were carefully documented as functions of angle of attack and free stream conditions. Next, simulated ice shapes were constructed for two rime and two glaze shapes and used in the second IRT tunnel entry. The ice shapes and the clean airfoil were tapped to obtain surface pressures, and a probe was used to measure the wake characteristics. These data were recorded and processed, on-line, with a minicomputer/digital data acquisition system. The effects of both rime and glaze ice on the pressure distribution, Cl, Cd, and Cm are presented.
Spinal cord stimulators (SCS) are a well-recognised treatment modality in the management of a number of chronic neuropathic pain conditions, particularly failed back syndrome and radiculopathies. The implantable pulse generator (IPG) component of the SCS is designed and operates in a similar fashion to that of a cardiac pacemaker. The IPG consists of an electrical generator, lithium battery, transmitter/receiver and a minicomputer. When stimulated, it generates pulsed electrical signals which stimulate the dorsal columns of the spinal cord, thus alleviating pain. Analogous to a cardiac pacemaker, it can be potentially damaged by ionising radiation from a linear accelerator, in patients undergoing radiotherapy. Herein we report our clinical management of the first reported case of a patient requiring adjuvant breast radiotherapy who had a SCS in situ. We also provide useful practical recommendations on the management of this scenario within a radiation oncology department. PMID:22024340
Nakamura, Y.; Latham, G. V.; Dorman, H. J.
The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until termination in September, 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures which were finally chosen consist of plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes and performing detailed analyses using the latter. Many problems remain, especially in the automatic processing of extraterrestrial seismic signals.
Lauer, J. L.; King, V. W.
A far-infrared interferometer was converted into an emission microspectrophotometer for surface analysis. To cover the mid-infrared as well as the far-infrared, the Mylar beamsplitter was made replaceable by a germanium-coated salt plate, and the Moire fringe counting system used to locate the movable Michelson mirror was improved to read 0.5 micron of mirror displacement. Digital electronics and a dedicated minicomputer were installed for data collection and processing. The most critical element for the recording of weak emission spectra from small areas was, however, a reflecting microscope objective and phase-locked signal detection with simultaneous referencing to a blackbody source. An application of the technique to lubrication problems is shown.
Fulford, Janice M.
A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ input data formats used by the water-surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from measurements of upstream water-surface elevation. The program is based on the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on minicomputers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.
Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.
The use of filamentary composite materials in the design and construction of primary aircraft structures is considered with emphasis on efforts to develop advanced technology in the areas of physical properties, structural concepts and analysis, manufacturing, and reliability and life prediction. The redesign of a main spar/rib region on the Boeing 727 elevator near its actuator attachment point is discussed. A composite fabrication and test facility is described as well as the use of minicomputers for computer aided design. Other topics covered include (1) advanced structural analysis methods for composites; (2) ultrasonic nondestructive testing of composite structures; (3) optimum combination of hardeners in the cure of epoxy; (4) fatigue in composite materials; (5) resin matrix characterization and properties; (6) postbuckling analysis of curved laminate composite panels; and (7) acoustic emission testing of composite tensile specimens.
Gartrell, L. R.; Rhodes, D. B.
A rapid scanning two dimensional laser velocimeter (LV) has been used to measure simultaneously the vortex vertical and axial velocity distributions in the Langley Vortex Research Facility. This system utilized a two dimensional Bragg cell for removing flow direction ambiguity by translating the optical frequency for each velocity component, which was separated by band-pass filters. A rotational scan mechanism provided an incremental rapid scan to compensate for the large displacement of the vortex with time. The data were processed with a digital counter and an on-line minicomputer. Vaporized kerosene (0.5 micron to 5 micron particle sizes) was used for flow visualization and LV scattering centers. The overall measured mean-velocity uncertainty is less than 2 percent. These measurements were obtained from ensemble averaging of individual realizations.
Shreiner, D.P.; Barlai-Kovach, M.
Since scans of cirrhotic livers commonly show a reduction in size and colloid uptake of the right lobe, a quantitative measure of uptake was made using a minicomputer to determine total counts in regions of interest defined over each lobe. Right-to-left ratios were then compared in 103 patients. For normal patients the mean ratio ± 1 s.d. was 2.85 ± 0.65, and the mean for patients with known cirrhosis was 1.08 ± 0.33. Patients with other liver diseases had ratios similar to the normal group. The normal range of the right-to-left lobe ratio was 1.55 to 4.15. The sensitivity of the ratio for alcoholic cirrhosis was 85.7% and the specificity was 100% in this patient population. The right-to-left lobe ratio was more sensitive and specific for alcoholic cirrhosis than any other criterion tested. A hypothesis is described to explain these results.
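The lobe-ratio measure and the sensitivity/specificity figures quoted above can be sketched in a few lines (a minimal illustration; the function names and the use of the 1.55 lower normal limit as a cutoff are assumptions, not the authors' minicomputer code):

```python
# Sketch of the right-to-left lobe ratio and the diagnostic statistics
# described above. Names and the 1.55 cutoff are illustrative assumptions.

def lobe_ratio(right_counts, left_counts):
    """Ratio of total counts in the right-lobe ROI to the left-lobe ROI."""
    return right_counts / left_counts

def sensitivity_specificity(ratios_cirrhosis, ratios_other, cutoff=1.55):
    """Call a ratio below the lower normal limit 'cirrhotic'."""
    true_pos = sum(1 for r in ratios_cirrhosis if r < cutoff)
    true_neg = sum(1 for r in ratios_other if r >= cutoff)
    return true_pos / len(ratios_cirrhosis), true_neg / len(ratios_other)
```

A ratio computed this way is insensitive to absolute dose or scan duration, since both lobes are counted in the same study.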
The vorticity shedding characteristics in attached and separated regions were investigated over three configurations, namely a backward facing circular arc, an ellipse at an angle of attack, and a pitching airfoil. A fully automated data acquisition system was developed, including a two-component laser velocimetry system in backscatter mode, an accurately controlled traversing mechanism, and a MINC-11 minicomputer. Two-component velocity measurements were obtained over the above-mentioned bodies, with steady and unsteady free streams. Emphasis was placed on the separation region, the free-shear layers, and the wake downstream of these bodies. Two inviscid vortex models were developed to predict two different flow phenomena, namely the separated flow over a circular cylinder started impulsively from rest and propagating stall over a linear stationary cascade.
Thompson, M.M.; Mikhail, E.M.
An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery; and the role of photogrammetric phases in a completely automated cartographic system. © 1976.
Allen, J. W.; Hartman, W. F.; Robinson, J. C.
Advanced surveillance diagnostics were applied to key nuclear power plant valves to improve the availability of the power plant. Two types of valves were monitored: BWR three-stage, pilot-operated safety/relief valves and PWR feedwater control valves. Excessive leakage across the pilot-disc seat in BWR safety/relief valves can cause the second-stage pressure to reach the critical value that activates the valve, even though the set pressure was not exceeded. Acoustic emissions created by the leak noise were monitored and calibrated to indicate incipient activation of the safety/relief valve. Hydrodynamic, vibration, control and process signals from PWR feedwater control valves were monitored by a mini-computer based surveillance system. On-line analysis of these signals coupled with earlier analytic modelling identified: (1) cavitation, (2) changes in stem packing tightness, (3) valve stem torquing, (4) transducer oscillations, and (5) peak vibration levels during power transients.
Allen, J. W.; Hartman, W. F.; Robinson, J. C.
Advanced surveillance diagnostics were applied to key nuclear power plant valves to improve the availability of the power plant. Two types of valves were monitored: boiling water reactor (BWR) three-stage, pilot-operated safety/relief valves and pressurized water reactor (PWR) feedwater control valves. Excessive leakage across the pilot-disc seat in BWR safety/relief valves can cause the second-stage pressure to reach the critical value that activates the valve, even though the set pressure was not exceeded. Acoustic emissions created by the leak noise were monitored and calibrated to indicate incipient activation of the safety/relief valve. Hydrodynamic, vibration, control and process signals from PWR feedwater control valves were monitored by a mini-computer based surveillance system.
Padilla, Peter A.
A fault injector system, called an in-circuit injector, was designed and developed to facilitate fault injection experiments performed at NASA-Langley's Avionics Integration Research Lab (AIRLAB). The in-circuit fault injector (ICFI) allows fault injections to be performed on electronic systems without special test features, e.g., sockets. The system supports stuck-at-zero, stuck-at-one, and transient fault models. The ICFI system is interfaced to a VAX-11/750 minicomputer, on which an interface program has been developed. The computer code required to access the interface program is presented. Also presented are the procedure for connecting the ICFI system to a circuit under test and the ICFI front-panel controls, which allow manual control of fault injections.
The purpose of this report is to suggest ideas for the technology and procurement policy that would be appropriate for SNAP III in the next decade. Both technology and procurement policy are considered because it would be difficult to implement some of the technology proposed in this report without a change in procurement policy. The report describes the recommended architecture of SNAP III and the software acquisition and procurement policies to support the architecture. Major recommendations are: transition from a minicomputer to a microcomputer system; transition to a proven commercial office system; use of local area network technology; acquisition of mass storage capability; acquisition of improved graphics capability; consideration of automating ship-to-shore communications; and development of a procurement policy to support the acquisition of the above technology.
Kriegler, F. J.
The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
Mason, R.R.; Hill, C.L.
The U.S. Geological Survey has developed software that interfaces with the Automated Data Processing System to facilitate and expedite preparation of the annual water-resources data report. This software incorporates a feature that prepares daily values tables and appends them to previously edited files containing station manuscripts. Other features collate the merged files with miscellaneous sections of the report. The report is then printed as page-size, camera-ready copy. All system components reside on a minicomputer; this provides easy access and use by remote field offices. Automation of the annual report preparation process results in significant savings of labor and cost. Use of the system for producing the 1986 annual report in the North Carolina District realized a labor savings of over two man-months. A fully implemented system would produce a greater savings and speed release of the report to users.
Drotning, W. D.
A dilatometer was designed and constructed to perform high precision thermal expansion measurements over the temperature range from ambient to 900 K. The design, construction, operation, and performance of the device are detailed. The measurement of length change employs an optical interferometer system which consists of a Michelson interferometer, a two frequency HeNe laser, and ac fringe detection. Length change precision of better than ±0.05 μm is achieved. Dilatometer control, status, data acquisition, and analysis functions are handled by a minicomputer to provide automated control. Further design features are incorporated which minimize sample preparation and assembly time to achieve rapid sample measurements. The interferometer alignment is under computer control to assist the operator during final adjustment and to maintain alignment throughout the course of a measurement. The performance of the device was evaluated by measurement of three thermal expansion standards (fused silica, platinum, and copper).
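The length-change measurement rests on the standard Michelson relation: each full interference fringe corresponds to a half-wavelength change in optical path. A minimal sketch, assuming the 632.8 nm HeNe line (the abstract does not give the actual conversion code):

```python
# Michelson interferometer fringe-counting sketch: one full fringe per
# half-wavelength of length change. The default wavelength (632.8 nm HeNe)
# is an assumption based on the laser named in the abstract.

def length_change(fringe_count, wavelength=632.8e-9):
    """Length change (meters) inferred from a fringe count."""
    return fringe_count * wavelength / 2
```

At 632.8 nm a single fringe represents about 0.316 μm, so achieving ±0.05 μm precision requires resolving fractions of a fringe, which is what the ac fringe detection provides.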
The objective of the Hot Corrosion Project in the LLNL Metals and Ceramics Division is to study the physical and chemical mechanisms of corrosion of nickel, iron, and some of their alloys when these metals are subjected to oxidizing or sulfidizing environments at temperatures between 850 and 950 °C. To obtain meaningful data in this study, we must rigidly control many parameters. These parameters, and the methods chosen to control them in this laboratory, are discussed. Some of the mechanics and manipulative procedures that are specifically related to data access and repeatability are covered. The method of recording and processing the data from each experiment using an LSI-11 minicomputer is described. The analytical procedures used to evaluate the specimens after the corrosion tests are enumerated and discussed.
Ramapriyan, H. K.; Strong, J. P.
The architectures, programming characteristics, and ranges of application of past, present, and planned array processors for the digital processing of remote-sensing images are compared. Such functions as radiometric and geometric corrections, principal-components analysis, cluster coding, histogram generation, grey-level mapping, convolution, classification, and mensuration and modeling operations are considered, and both pipeline-type and single-instruction/multiple-data-stream (SIMD) arrays are evaluated. Numerical results are presented in a table, and it is found that the pipeline-type arrays normally used with minicomputers increase their speed significantly at low cost, while even further gains are provided by the more expensive SIMD arrays. Most image-processing operations become I/O-limited when SIMD arrays are used with current I/O devices.
Chang, Hsiao Tsu; Sun, Geng-tian; Zhang, Yan
The paper explores the collision recognition of two objects in both crisscross and revolution motions. A mathematical model has been established based on continuation theory. Objects of any shape may be regarded as being built of many 3-simplexes or their convex hulls; therefore the collision problem of two objects in motion can be reduced to the collision of two corresponding 3-simplexes on the respective objects. An optimized algorithm is thus developed for collision avoidance which is suitable for computer control and eliminates the need for vision aid. With this algorithm, computation time has been reduced significantly. The algorithm is applicable to the path planning of mobile robots, and also to collision avoidance for anthropomorphic arms grasping two objects of complicated shape. The algorithm is implemented in LISP on a VAX 8350 minicomputer.
Budd, Jeffrey R.; Finkelstein, Stanley M.; Warwick, Warren J.
A non-invasive, non-effort-dependent pulmonary function test has been created which can be used on preschool subjects. The integration of a mini-computer system with the test procedure allows extensive analysis of flow and gas concentration data. This analysis not only supplies lung volume measurements but also gas mixing efficiency, which quantifies the evenness of gas distribution, and alveolar efficiency, which indicates the extent of ventilation-perfusion inequalities and diffusion abnormalities. The test has been performed on a sample of control subjects and cystic fibrosis patients aged 1 to 23 years. The results indicate that the measurements are not only sensitive and specific to lung disease but also that they should prove useful for following the extent of lung disease over time.
Buzbee, B.; Slocomb, C.
The Los Alamos National Laboratory is known for its interest in large-scale and scientific computation. However, since 1978, the Laboratory has developed distributed processing and is now developing a framework for support of microprocessor-based intelligent workstations. As practiced at Los Alamos, distributed processing means the distribution of minicomputers across a relatively large geographical area and the integration of them into a single network. Discussed in this paper are the motivations for the network, the communications technology used, some salient implementation factors, and advantages and disadvantages of distributed processing. Also discussed are the motivations for intelligent workstations, our categorization of workstations, the goals for each category, implementation plans, and some critical issues associated with them.
Cochran, R. P.; Norris, J. W.; Jones, R. E.
NASA-Lewis Research Center is presently constructing a new test facility for developing turbine-cooling and combustor technology for future generation aircraft gas turbine engines. Prototype engine hardware will be investigated in this new facility at gas stream conditions up to 2480 K average turbine inlet temperature and 4,140,000 N per sq m turbine inlet pressure. The facility will have the unique feature of fully-automated control and data acquisition through the use of an integrated system of minicomputers and programmable controllers, which will result in more effective use of operating time, will limit the number of operators required, and will provide a built-in self-protection safety system. The paper describes the facility and the planning and design considerations involved.
The Savannah River Laboratory's WIND minicomputer system allows quick and accurate assessment of an accidental release at the Savannah River Plant using data from eight meteorological towers. The accuracy of the assessment is largely determined by the accuracy of the meteorological data; therefore quality control is important in an emergency response system. Real-time quality control of this data will be added to the WIND system to automatically identify inaccurate data. Currently, the system averages the measurements from the towers to minimize the influence of inaccurate data being used in calculations. The computer code used in the real-time quality control has been previously used to identify inaccurate measurements from the archived tower data.
Cochran, R. P.; Norris, J. W.
A new test facility is being constructed for developing turbine-cooling and combustor technology for future generation aircraft gas turbine engines. Prototype engine hardware will be investigated in this new facility at gas stream conditions up to 2480 K average turbine inlet temperature and 4.14 million N per sq m turbine inlet pressure. The facility will have the unique feature of fully automated control and data acquisition through the use of an integrated system of minicomputers and programmable controllers, which will result in more effective use of operating time, will limit the number of operators required, and will provide built-in self-protection safety systems. The facility and the planning and design considerations are described.
Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.
The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.
Plummer, R. P.; Coler, C. R.
The speech recognition system under development is a trainable pattern classifier based on a maximum-likelihood technique. An adjustable uncertainty threshold allows the rejection of borderline cases for which the probability of misclassification is high. The syntax of the command language spoken may be used as an aid to recognition, and the system adapts to changes in pronunciation if feedback from the user is available. Words must be separated by 0.25-second gaps. The system runs in real time on a mini-computer (PDP 11/10) and was tested on 120,000 speech samples from 10- and 100-word vocabularies. The results of these tests were 99.9% correct recognition for a vocabulary consisting of the ten digits, and 99.6% recognition for a 100-word vocabulary of flight commands, with a 5% rejection rate in each case. With no rejection, the recognition accuracies for the same vocabularies were 99.5% and 98.6% respectively.
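The rejection mechanism described, classifying only when the likelihood margin clears an adjustable uncertainty threshold, can be sketched as follows (the diagonal-Gaussian class model and all names here are illustrative assumptions, not the system's actual implementation):

```python
import math

# Sketch of maximum-likelihood classification with an uncertainty
# threshold, as described above. The per-class diagonal Gaussian model
# and all names are illustrative assumptions.

def log_likelihood(x, mean, var):
    """Log-likelihood of feature vector x under a diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def classify(x, classes, threshold):
    """Return the best-matching word, or None (reject) when the margin
    between the top two log-likelihoods falls below the threshold."""
    scored = sorted(((log_likelihood(x, m, v), word)
                     for word, (m, v) in classes.items()), reverse=True)
    if len(scored) > 1 and scored[0][0] - scored[1][0] < threshold:
        return None  # borderline case: reject rather than risk an error
    return scored[0][1]
```

Raising the threshold trades a higher rejection rate for fewer misclassifications, which is exactly the 5%-rejection versus no-rejection tradeoff reported in the test results.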
The Mirror Fusion Test Facility (MFTF) is a complex facility requiring a highly computerized Supervisory Control and Diagnostics System (SCDS) to monitor and provide control over ten subsystems, three of which require true process control. SCDS will provide physicists with a method of studying machine and plasma behavior by acquiring and processing up to four megabytes of plasma diagnostic information every five minutes. A high degree of availability and throughput is provided by a distributed computer system (nine 32-bit minicomputers on shared memory). Data distributed across SCDS is managed by a high-bandwidth Distributed Database Management System. The MFTF operators' control room consoles use color television monitors with touch-sensitive screens, a totally new approach. The method of handling deviations from normal machine operation, and how the operator should be notified and assisted in the resolution of problems, has been studied and a system designed.
Holdaway, R.; McPherson, P.; Ely, R.
The U.K. Science and Engineering Research Council operates a spacecraft Operations Control Centre at its Rutherford Appleton Laboratory at Chilton in Oxfordshire. This paper describes tracking aspects of the Control Centre during operations of the IRAS and AMPTE missions. Tracking is carried out using a 12 meter antenna with electric drive and S-band monopulse feed. A static pointing error of less than 3 arcminutes is achieved. The antenna is controlled by software on a minicomputer which has access to error and AGC signals from the receivers as well as encoder and status outputs from the antenna. The software is included in the control servo loop, and outputs velocity requests to the motor drive system. Descriptions are given of the tracking system hardware, position control, tracking modes (program and autotrack), orbit predictions, operator interface, and an analysis of accuracies for the two missions.
Doane, R.W.; Berven, B.A.; Blair, M.S.
A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the ²²⁶Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a mini-computer with a dual floppy disk storage medium. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC language is used for data processing.
Wallace, P.L.; Shimamoto, F.Y.; Quick, T.M.
Lawrence Livermore Laboratory (LLL) has undertaken an ambitious plan to automate its x-ray analytical equipment. This project ultimately will automate 15 x-ray diffraction and 3 x-ray spectrometric systems. All automation is being done by retrofitting existing equipment and combining it with minicomputers to produce smart instruments. Two types of smart instruments have been developed: one that controls an experiment and acquires data, and another that analyzes data and communicates with LLL's large computer center. Three of the former type have been built and are operating; seven more will soon be put into service. Only two of the latter type are needed, and both are currently in service. We describe the details of our overall plan, the smart instruments, the retrofitting, our current status, and our software.
Lancraft, R. E.; Caglayan, A. K.
The computer program FINDS is written in FORTRAN-77 and is intended for operation on a VAX-11/780 or VAX-11/750 superminicomputer, using the VMS operating system. The program detects, isolates, and compensates for failures in navigation aid instruments and onboard flight control and navigation sensors of a Terminal Configured Vehicle aircraft in a Microwave Landing System environment. In addition, FINDS provides sensor fault tolerant estimates for the aircraft states which are then used by an automatic guidance and control system to land the aircraft along a prescribed path. FINDS monitors for failures by evaluating all sensor outputs simultaneously using the nonlinear analytic relationships between the various sensor outputs arising from the aircraft point mass equations of motion. Hence, FINDS is an integrated sensor failure detection and isolation system.
Cahill, P. T.; Knowles, R. J.; Tsen, O.
To meet the complex needs of a nuclear medicine division serving an 1100-bed hospital, a computer information system has been developed in sequential phases. This database management system is based on a time-shared minicomputer linked to a broadband communications network. The database contains information on patient histories, billing, types of procedures, doses of radiopharmaceuticals, times of study, scanning equipment used, and the technician performing the procedure. These patient records are cycled through three levels of storage: (a) an active file of 100 studies for those patients currently scheduled, (b) a temporary storage level of 1000 studies, and (c) an archival level of 10,000 studies containing selected information. Merging of this information with reports and various statistical analyses is possible. This first phase has been in operation for well over a year. The second phase is an upgrade of the size of the various storage levels by a factor of ten.
Delivering over 280 billion cubic feet of natural gas per year through some 17,800 miles of pipeline, the vast grid of the East Ohio Gas Co. (EOG), Cleveland, requires constant monitoring and control to properly serve 1.1 million customers. The supervisory control and data acquisition (SCADA) system used by EOG has a series of instruments located at strategic intervals along the pipeline. These instruments monitor temperature, pressure, flow, odorization, compressor status, and alarms. Linked to the instruments are a series of 85 remote terminal units (RTUs) that are polled every 15 secs. for data. These RTUs also control the valves and compressors that deliver natural gas. These, in turn, are linked by radio to a series of minicomputer hosts. To revamp its system, EOG needed both new hardware and application software. This paper reviews the decision making process for selecting both hardware and software systems and the resulting performance.
Title, A. M.
A tunable birefringent filter using an alternate partial polarizer design has been built. The filter has a transmission of 38% in polarized light. Its full width at half maximum is 0.09 Å at 5500 Å. It is tunable from 4500 to 8500 Å by means of stepping-motor-actuated rotating half wave plates and polarizers. Wavelength commands and thermal compensation commands are generated by a PDP 11/10 minicomputer. The alternate partial polarizer universal filter is compared with the universal birefringent filter, and the design techniques, construction methods, and filter performance are discussed in some detail. Based on experience with this filter, some conclusions regarding the future of birefringent filters are offered.
Polansky, A. C.
A method for diagnosing surface parameters on a regional scale via geosynchronous satellite imagery is presented. Moisture availability, thermal inertia, atmospheric heat flux, and total evaporation are determined from three infrared images obtained from the Geostationary Operational Environmental Satellite (GOES). Three GOES images (early morning, midafternoon, and night) are obtained from computer tape. Two temperature-difference images are then created. The boundary-layer model is run, and its output is inverted via cubic regression equations. The satellite imagery is efficiently converted into output-variable fields. All computations are executed on a PDP 11/34 minicomputer. Output fields can be produced within one hour of the availability of aligned satellite subimages of a target area.
Hammond, W. Ed; Stead, W. W.
This paper presents the eighteen year history leading to the development of a computerized medical information system and discusses the factors which influenced its philosophy, design and implementation. This system, now called TMR, began as a single-user, tape-oriented minicomputer package and now exists as a multi-user, multi-database, multi-computer system capable of supporting a full range of users in both the inpatient and outpatient settings. The paper discusses why we did what we did, what worked, and what didn't work. Current projects are emphasized including networking and the integration of inpatient and outpatient functions into a single system. A theme of the paper is how hardware and software technological advancements, increasing sophistication of our users, our increasing experience, and just plain luck contributed to the success of TMR.
Stepka, F. S.
NASA Lewis Research Center is presently constructing a test facility for developing turbine-cooling and combustor technology for future-generation aircraft gas-turbine engines. Prototype engine hardware will be investigated in this facility at gas-stream conditions up to 2480 K average turbine inlet temperature and 4.14 million N/sq m turbine inlet pressure. The facility will have the unique features of fully automated control and data acquisition through the use of an integrated system of minicomputers and programmable controllers, which will result in more effective use of operating time and operators and will provide a built-in self-protection safety system. The facility, turbine rig, and turbine-cooling test program are described.
Wensley, J. H.; Lamport, L.; Goldberg, J.; Green, M. W.; Levitt, K. N.; Melliar-Smith, P. M.; Shostak, R. E.; Weinstock, C. B.
SIFT (Software Implemented Fault Tolerance) is an ultrareliable computer for critical aircraft control applications that achieves fault tolerance by the replication of tasks among processing units. The main processing units are off-the-shelf minicomputers, with standard microcomputers serving as the interface to the I/O system. Fault isolation is achieved by using a specially designed redundant bus system to interconnect the processing units. Error detection and analysis and system reconfiguration are performed by software. Iterative tasks are redundantly executed, and the results of each iteration are voted upon before being used. Thus, any single failure in a processing unit or bus can be tolerated with triplication of tasks, and subsequent failures can be tolerated after reconfiguration. Independent execution by separate processors means that the processors need only be loosely synchronized, and a novel fault-tolerant synchronization method is described.
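The voting step that SIFT applies to replicated task results can be sketched as follows (a minimal illustration of majority voting over triplicated outputs; SIFT's actual voter ran in software on each processor and is not reproduced here):

```python
from collections import Counter

# Minimal sketch of majority voting over replicated task results, the
# mechanism by which SIFT masks a single faulty processor or bus.

def vote(replica_results):
    """Return (majority value, fault_detected). Raises if no strict
    majority exists, i.e., too many replicas are faulty."""
    counts = Counter(replica_results)
    value, n = counts.most_common(1)[0]
    if n <= len(replica_results) // 2:
        raise RuntimeError("no majority: too many faulty replicas")
    fault_detected = n != len(replica_results)  # any disagreement at all
    return value, fault_detected
```

With triplication, one bad result is outvoted and simultaneously flagged, which is what lets SIFT reconfigure before a second failure can defeat the vote.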
Byard, P. L.; Foltz, C. B.; Jenkner, H.; Peterson, B. M.
The Ohio State University Image-Dissector Scanner (IDS) is now routinely operational on the Perkins 1.8-m reflector of Ohio Wesleyan University and The Ohio State University at the Lowell Observatory. The electro-optical design of the detector is similar to that of the image-tube scanner constructed by Robinson and Wampler at the Lick Observatory. The detector is mounted in a Cassegrain spectrograph with 10-cm-diameter optics. Operation of both the detector and the spectrograph is controlled by a PDP-11/20 minicomputer through a CAMAC interface. Performance of the IDS is adequate for the observational programs currently underway, and the overall sensitivity of the system is competitive with similar instruments in use on other telescopes. Auxiliary television acquisition and guiding systems, data reduction, and plans for future development are also discussed.
Yang, P. Y.; Gyorki, J. R.; Wynveen, R. A.
Advanced instrumentation concepts for improving the performance of manned spacecraft Environmental Control and Life Support Systems (EC/LSS) have been developed at Life Systems, Inc. The differences in specific EC/LSS instrumentation requirements and hardware during the transition from exploratory development to flight production stages are discussed. Details of prior control and monitor instrumentation designs are reviewed and an advanced design presented. The latter features a minicomputer-based approach having the flexibility to meet process hardware test programs and the capability to be refined to include the control dynamics and fault diagnostics needed in future flight systems, where long duration, reliable operation requires in-flight hardware maintenance. The emphasis is on lowering EC/LSS hardware life cycle costs through simplicity in instrumentation and by using the instrumentation to save crew time during flight operation.
Erickson, W. K.; Hofman, L. B.; Donovan, W. E.
Difficulties regarding the digital image analysis of remotely sensed imagery can arise in connection with the extensive calculations required. In the past, an expensive large- to medium-scale mainframe computer system was needed for performing these calculations. For image-processing applications, smaller minicomputer-based systems are now used by many organizations; the costs for such systems are still in the range from $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appears to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and the design and implementation of a workstation network.
Jordan, A. J.
The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering-unit plots on a user-selected plotting device. The programs are written in HP FORTRAN IV and HP Assembly Language, with the graphics software using HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four-color pen plotter, and the HP 2608A matrix line printer.
Badavi, F. F.; Copeland, G. E.
The listing and description of software routines used to analyze the analog data obtained from the LIDAR system are given. All routines are written in FORTRAN IV on an HP-1000/F minicomputer, which serves as the heart of the data acquisition system for the LIDAR program. This particular system has 128 kilobytes of high-speed memory and is equipped with a Vector Instruction Set (VIS) firmware package, used in all the routines to speed the execution of long loops. The system handles floating point arithmetic in hardware in order to enhance the speed of execution. This computer is a 2177 C/F series version of the HP-1000 RTE-IVB data acquisition computer system, which is designed for real-time data capture/analysis and a disk/tape mass storage environment.
Bush, J.A.; Newby, J.W.
Shear modulus testing is performed on the torsion pendulum at the General Electric Neutron Devices Department (GEND) as a means of gauging the state of cure for a polymer system. However, collection and reduction of the data to obtain the elastic modulus necessitated extensive operator-performed measurements and calculations, which were subject to error. To improve the reliability of the test, an analog-to-digital interface was designed and built to connect the torsion pendulum with a minicomputer. After the necessary programming was prepared, the system was tested and found to be an improvement over the old procedure in both quality and time of operation. An analysis of the data indicated that the computer-generated modulus data were equivalent to the hand-method data, but potential operator errors in frequency measurements and calculations were eliminated. The interfacing of the pendulum with the computer resulted in an overall time savings of 52 percent.
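The modulus computation that the minicomputer automates can be sketched as follows. This is a hedged illustration, not GEND's actual program: it assumes a cylindrical specimen and the standard torsion-pendulum relation G = 8πIL/(r⁴T²), where I is the pendulum moment of inertia, L and r are the specimen length and radius, and T is the measured oscillation period.

```python
import math

def shear_modulus(inertia, length, radius, period):
    """Shear modulus of a cylindrical specimen on a torsion pendulum.

    Uses torsional stiffness k = pi * r**4 * G / (2 * L) together with
    T = 2 * pi * sqrt(I / k), which rearranges to
    G = 8 * pi * I * L / (r**4 * T**2).  SI units throughout (result in Pa).
    """
    return 8.0 * math.pi * inertia * length / (radius**4 * period**2)

# As cure proceeds the oscillation period shortens, so the computed
# modulus rises (illustrative numbers only).
g_soft = shear_modulus(1.0e-4, 0.05, 0.005, 2.0)   # long period, low modulus
g_cured = shear_modulus(1.0e-4, 0.05, 0.005, 0.5)  # short period, higher modulus
```

Measuring the period electronically through the A/D interface is what removes the manual frequency measurements that the abstract identifies as the main source of operator error.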
Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion is comprised of a data acquisition portion - two Pascal programs run on a personal computer - and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.
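The constraint-reduction idea — pruning each request's feasible slots against equipment availability before committing an assignment — can be illustrated with a toy greedy pass. This is a sketch only: the SES scheduler itself is a CLIPS rule base, and the request fields and names below are invented for illustration.

```python
def schedule(requests, n_slots):
    """Assign each (name, required-equipment-set) request to the first slot
    where none of its required equipment is already in use.  Returns a dict
    mapping request name -> slot index; unsatisfiable requests are omitted."""
    in_use = [set() for _ in range(n_slots)]
    assignment = {}
    for name, equipment in requests:
        for slot in range(n_slots):
            if in_use[slot].isdisjoint(equipment):
                in_use[slot] |= set(equipment)
                assignment[name] = slot
                break
    return assignment

# Hypothetical weekly requests contending for shared simulation equipment.
requests = [("crew-training", {"dome", "host-A"}),
            ("eng-assessment", {"host-A"}),
            ("sw-checkout", {"host-B"})]
plan = schedule(requests, n_slots=2)
```

A rule-based scheduler refines this picture by expressing each equipment conflict as a rule that retracts infeasible slot candidates, rather than scanning slots in a fixed order.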
Erickson, J. D.
The state-of-the-art of automatic multispectral scanner data analysis and interpretation is reviewed. Sources of system variability which tend to obscure the spectral characteristics of the classes under consideration are discussed, and examples of the application of spatial and temporal discrimination bases are given. Automatic processing functions, techniques and methods, and equipment are described with particular attention to those that are applicable to large land surveys using satellite data. The development and characteristics of the Multivariate Interactive Digital Analysis System (MIDAS) for processing aircraft or satellite multispectral scanning data are discussed in detail. The MIDAS system combines the parallel digital implementation capabilities of a low-cost processor with a general purpose PDP-11/45 minicomputer to provide near-real-time data processing. The preprocessing functions are user-selectable. The input subsystem accepts data stored on high density digital tape, computer compatible tape, and analog tape.
Clark, R. P.; Goff, M. R.; Culley, J. E.
A high resolution medical thermal imaging system using an 8-element SPRITE detector is described. Image processing is by an Intellect 100 processor and is controlled by a DEC LSI 11/23 minicomputer. Image storage is on a 170 Mbyte Winchester disc together with archival storage on 12-inch-diameter optical discs having a capacity of 1 Gbyte per side. The system is currently being evaluated for use in physiology and medicine. Applications outlined include the potential of thermographic screening to identify genetic carriers in X-linked hypohidrotic ectodermal dysplasia (XED), detailed vascular perfusion studies in health and disease, and the relationship between cutaneous blood flow, peripheral neurological function and skin surface temperature.
Parise, R. A.; Blum, A.; Budney, T. J.; Stone, R. W.
The high level language FORTH is used for the electronic control of the Space Shuttle-based Ultraviolet Imaging Telescope, in a flight computer system which minimizes costs. The greater part of the breadboard version of the flight computer is assembled from commercially available components, reducing novel circuit design features and permitting simultaneous development of both hardware and software. The commercial boards are then refabricated on aluminum core heat conducting stock, using high reliability parts to produce the flight versions of the system. The system's ground support equipment employs a MINC-25 minicomputer which performs such functions as flight computer software development, PROM programming, test and integration support, and flight operations support. The implementation of these concepts in flight computer telescope controls is described.
Paroskie, R. M.; Blau, H. H., Jr.; Blinn, J. C., III
The laser nephelometer data system was updated to provide magnetic tape recording of data, and real time or near real time processing of data to provide particle size distribution and liquid water content. Digital circuits were provided to interface the laser nephelometer to a Data General Nova 1200 minicomputer. Communications are via a teletypewriter. A dual Linc Magnetic Tape System is used for program storage and data recording. Operational programs utilize the Data General Real-Time Operating System (RTOS) and the ERT AIRMAP Real-Time Operating System (ARTS). The programs provide for acquiring data from the laser nephelometer, acquiring data from auxiliary sources, keeping time, performing real time calculations, recording data and communicating with the teletypewriter.
Wirtschafter, David D.; Gams, Richard; Ferguson, Carol; Blackwell, William; Boackle, Paul
The Clinical Protocol Information System (CPIS) supports the clinical research and patient care objectives of the SouthEastern Cancer Study Group (SEG). The information system goals are to improve the evaluability of clinical trials, decrease the frequency of adverse patient events, implement drug toxicity surveillance, improve the availability of study data and demonstrate the criteria for computer networks that can impact on the general medical care of the community. Nodes in the network consist of Data General MicroNova MP-100 minicomputers that drive the interactive data dialogue and communicate with the network concentrator (another DG MicroNova) in Birmingham. Functions supported include: source data editing, care “advice,” care “audit,” care “explanation,” and treatment note printing. The complete database is updated nightly and resides on UAB's IBM 370/158-AP.
The ISL-11 (Intelligent Serial Line) card is a quad height LSI-11 module with an on-board Intel 8039 microprocessor which supplies intelligence between a differential (RS-422) serial line and a DMA interface to the LSI-11. It is compatible with LSI-11's, LSI-11/2's, and LSI-11/23's. The fact that there is a microprocessor on-board allows the use of this module for many different applications. One application will involve networking of minicomputers using the in-house SCULL protocol. The entire SCULL protocol can be handled by the ISL-11, thus taking a significant load off the LSI-11 and simplifying the software interface to the link. The serial line is asynchronous and can be driven up to 38.4K baud if the microcode can handle such speeds. A block diagram of the ISL-11 is shown on the next page.
Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.
A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transients and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feed-water train. Point kinetics incorporate reactivity feedback due to void fraction, fuel temperature, and coolant temperature. Control systems and trip logic are simulated for the nuclear steam supply system.
Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Wulff, W.; Cerbone, R.J.
A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology was utilized to develop a plant analyzer which affords realistic predictions of plant transients and severe off-normal events in LWR power plants through on-line simulations at speeds up to 10 times faster than actual process speeds. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the entire balance of the plant. Reactor core models include point kinetics with reactivity feedback due to void fraction, fuel temperature, coolant temperature, and boron concentration as well as a conduction model for predicting fuel and clad temperatures. Control systems and trip logic for plant protection systems are also simulated. The AD10 of Applied Dynamics International, a special-purpose peripheral processor, is used as the principal hardware of the plant analyzer.
The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.
Daniels, T. S.; Berry, J. D.; Park, S.
The paper reports the development and initial testing of a digital resolver to replace existing analog signal processing instrumentation. Radiometers, mounted directly on one of the fully articulated blades, are electrically connected through a slip ring to analog signal processing circuitry. The measured signals are periodic in azimuth angle and are resolved into harmonic components, with 0 deg over the tail. The periodic nature of the helicopter blade motion restricts the frequency content of each flapping and yaw signal to the fundamental and harmonics of the rotor rotational frequency. A minicomputer is employed to collect these data and then plot them graphically in real time. With this and other information generated by the instrumentation, a helicopter test pilot can then adjust the helicopter model's controls to achieve the desired aerodynamic test conditions.
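The harmonic resolution described above — decomposing a signal that is periodic in rotor azimuth into its mean and harmonic components — amounts to computing Fourier coefficients over one revolution. The following is a minimal sketch of that computation, not the actual instrumentation software; the sample count and variable names are assumptions.

```python
import math

def harmonic_components(samples, n_harmonics):
    """Resolve a signal sampled at N equally spaced azimuth angles over one
    revolution (0 deg over the tail) into a mean value plus cosine/sine
    coefficients of the rotor fundamental and its harmonics."""
    n = len(samples)
    mean = sum(samples) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        a = 2.0 / n * sum(y * math.cos(k * 2.0 * math.pi * i / n)
                          for i, y in enumerate(samples))
        b = 2.0 / n * sum(y * math.sin(k * 2.0 * math.pi * i / n)
                          for i, y in enumerate(samples))
        coeffs.append((a, b))
    return mean, coeffs

# Synthetic flapping-like signal: steady value plus 1/rev and 2/rev content.
signal = [1.0 + 2.0 * math.cos(2.0 * math.pi * i / 64)
              + 0.5 * math.sin(4.0 * math.pi * i / 64)
          for i in range(64)]
mean, coeffs = harmonic_components(signal, 2)
```

Because blade motion contains only the rotor fundamental and its harmonics, a small number of coefficients captures the whole signal, which is what makes real-time plotting on a minicomputer tractable.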
Martin, M. R.
A prototype high speed balancing system was developed for assembled gas turbine engine modules. The system permits fully assembled gas turbine modules to be operated and balanced at selected speeds up to full turbine speed. The balancing system is a complete stand-alone system providing all necessary lubrication and support hardware for full speed operation. A variable speed motor provides the drive power. A drive belt and gearbox provide rotational speeds up to 21,000 rpm inside a vacuum chamber. The heart of the system is a dedicated minicomputer with attendant data acquisition, storage, and I/O devices. The computer is programmed to be completely interactive with the operator. The system was installed at CCAD and evaluated by testing 20 T55 power turbines and 20 T53 power turbines. Engine test results verified the performance of the high speed balanced turbines.
Jackson, Colonel T. F.
The technology of computer systems and the need for their effective use poses a series of significant problems in the health care delivery environment. There is a profusion of hardware, significant decreases in cost coupled with more significant increases in capability, a need for adequate application and support software, and a dire shortage of programmers. The onslaught of microcomputers and stand-alone minicomputer systems poses a serious challenge to the development of cohesive, coordinated and interfaced institution-wide systems. The differences in approach between word processing and data processing systems and the obvious need for their future compatibility sets up an arena for turf competition that has serious potential consequences. A blurring of the distinction between voice and data communication systems further complicates the situation. These problems, along with the competition with industry for data processing professionals, present a formidable challenge for leadership. This paper discusses this challenge and how it can be met.
This paper discusses issues and implications in the development of a minicomputer-based, distributed intelligence data acquisition and control system to support complex experimental facilities. Atomic Vapor Laser Isotope Separation is typical of programs that require the large-scale experimental facilities that have led to the design philosophy described in this paper. Incorporating computer systems in the instrumentation of ORNL's experimental facilities has increased productivity. However, several challenging issues must be addressed in developing such systems: (1) scientific as well as engineering goals; (2) ultimate goals with intermediate capabilities; (3) budget, scope, and schedule fluctuations; and (4) overlapping maintenance and development. The techniques and philosophy used to provide the computer system hardware, software, architecture, and interfaces for the instrumentation and control system for large-scale developmental facilities were developed for use at ORNL, but can be related to developing any large-scale experimental facility.
Lumb, Alan M.; Kittle, John L.
ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.
Wu, H. C.; Yao, J. C.
Creep tests were conducted by means of a closed loop servo-controlled materials test system. These tests are different from the conventional creep tests in that the strain history prior to creep may be carefully monitored. Tests were performed for aluminum alloy 6061-O at 150 C and monitored by a PDP 11/04 minicomputer at a preset constant plastic-strain rate prehistory. The results show that the plastic-strain rate prior to creep plays a significant role in creep behavior. The endochronic theory of viscoplasticity was applied to describe the observed creep curves. The concepts of intrinsic time and strain rate sensitivity function are employed and modified according to the present observation.
Harrison, Eugene T.; Pickren, John D.; Mangum, Patricia; Tomlins, Helen F.; Pickren, Ann S.
Over the last eight years, a teleprocessing/database Hospital Information System (HIS) for the Medical College of Georgia hospital and clinics has evolved to include approximately 200 terminal functions, 100 cathode ray tube terminals (CRTs) and 25 printers running on an IBM 4341. Concurrent with this development, several specialized standalone departmental minicomputer systems have evolved in response to specific requirements. In early 1981, a networking concept was proposed whereby the various information management systems within the hospital and clinics could communicate with each other. This concept has been successfully applied to several systems including a Local Area Network (LAN) of 24 processors and 45 workstations. This paper will outline the evolution from a single host processor multiple terminal oriented HIS, to a communications network of computers, to the integration of the host terminal network and a flexible Local Area Network.
Benner, William H. (Danville, CA)
An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a minicomputer to quantitate oxygen in the decomposition products and control oven heating.
Litaker, Harry L., Jr.; Thompson, Shelby; Archer, Ronald D.
The Mobile Information SysTem (MIST) had its origins in the need to determine whether commercial off the shelf (COTS) technologies could improve intervehicular activities (IVA) on International Space Station (ISS) crew maintenance productivity. It began with an exploration of head mounted displays (HMDs), but quickly evolved to include voice recognition, mobile personal computing, and data collection. The unique characteristic of the MIST lies within its mobility, in which a vest is worn that contains a mini-computer and supporting equipment, and a headband with attachments for a HMD, lipstick camera, and microphone. Data is then captured directly by the computer running Morae(TM) or similar software for analysis. To date, the MIST system has been tested in numerous environments such as two parabolic flights on NASA's C-9 microgravity aircraft and several mockup facilities ranging from ISS to the Altair Lunar Sortie Lander. Functional capabilities have included its lightweight and compact design, commonality across systems and environments, and usefulness in remote collaboration. Human Factors evaluations of the system have proven the MIST's ability to be worn for long durations of time (approximately four continuous hours) with no adverse physical deficits, moderate operator compensation, and low workload being reported as measured by Corlett Bishop Discomfort Scale, Cooper-Harper Ratings, and the NASA Total Workload Index (TLX), respectively. Additionally, through development of the system, it has spawned several new applications useful in research. For example, by only employing the lipstick camera, microphone, and a compact digital video recorder (DVR), we created a portable, lightweight data collection device. Video is recorded from the participants point of view (POV) through the use of the camera mounted on the side of the head. Both the video and audio is recorded directly into the DVR located on a belt around the waist. 
These data are then transferred to another computer for video editing and analysis. Another application has been discovered using simulated flight, in which a kneeboard is replaced with a mini-computer and the HMD to project flight paths and glide slopes for lunar ascent. As technologies evolve, so will the system and its applications for research and space system operations.
Ivanova, T. N.; Bercovich, Yu. A.; Mashinskiy, A. L.; Meleshko, G. I.
The paper describes the "SVET" project—a new generation of space greenhouse with small dimensions. Through the use of a minicomputer, "SVET" is fully capable of automatically operating and controlling environmental systems for higher plant growth. A number of preliminary studies have shown the radish and cabbage to be potentially important crops for CELSS (Closed Environmental Life Support System). The "SVET" space greenhouse was mounted on the "CRYSTAL" technological module docked to the Mir orbital space station on 10 June 1990. Soviet cosmonauts Balandin and Solovyov started the first experiments with the greenhouse on 15 June 1990. Preliminary results of seed cultivation over an initial 54-day period in "SVET" are presented. Morphometrical characteristics of plants brought back to Earth are given. Alterations in plant characteristics, such as growth and developmental changes and morphological content, were noted. A crop of radish plants was harvested under microgravity conditions. Characteristics of plant environmental control parameters and an estimation of the functional properties of the control and regulation systems of the "SVET" greenhouse in space flight, as received via telemetry data, are reported.
Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Wulff, W.; Cerbone, R.J.
A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times faster than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feedwater train. Point kinetics incorporate reactivity feedback due to void fraction, fuel temperature, coolant temperature, and boron concentration. Control systems and trip logic are simulated for the nuclear steam supply system. The AD10 of Applied Dynamics International is the special-purpose peripheral processor. It is specifically designed for high-speed digital system simulation, accommodates hardware (instrumentation) in the input/output loop, and operates interactively on-line, like an analog computer. Results are shown to demonstrate computing capacity, accuracy, and speed. Simulation speeds have been achieved which are orders of magnitude faster than those of a CDC-7600 mainframe computer or ten times faster than real-time speed.
Ricklefs, R. L.; Shelus, P. J.
The University of Texas McDonald Observatory has long been a pioneer in acquiring laser ranging data, a data type which has substantially improved our knowledge of the dynamics of the earth-moon system as well as various aspects of geophysics and general relativity. (See Mulholland, 1980; Shelus, 1985; Shelus, 1987.) The McDonald Laser Ranging System (MLRS) is one of only two laser ranging stations worldwide having the capability of routine data acquisition on both lunar and artificial satellite targets (Shelus, IEEE, 1985). In this paper we discuss the current applications of modern computer technology to the problems of acquiring and reducing that ranging data. As technology continues to improve, the logical upgrade is the replacement of obsolescent station minicomputers with the resource-rich environment of microcomputers. The goal is to allow the automation of many station ranging functions as well as the enhancement of onsite data quality control, filtering, and analysis. Plans for such upgrades and their implications for dynamical astronomy are discussed.
Dayton, J. A., Jr.; Ebihara, B. T.
An electron beam test facility, which consists of a precision multidimensional manipulator built into an ultra-high-vacuum bell jar, was designed, fabricated, and operated at Lewis Research Center. The position within the bell jar of a Faraday cup, which samples current in the electron beam under test, is controlled by the manipulator. Three orthogonal axes of motion are controlled by stepping motors driven by digital indexers, and the positions are displayed on electronic totalizers. In the transverse directions, the limits of travel are approximately + or - 2.5 cm from the center with a precision of 2.54 micron (0.0001 in.); in the axial direction, approximately 15.0 cm of travel are permitted with an accuracy of 12.7 micron (0.0005 in.). In addition, two manually operated motions are provided: the pitch and yaw of the Faraday cup with respect to the electron beam can be adjusted to within a few degrees. The current is sensed by pulse transformers and the data are processed by a dual channel box car averager with a digital output. The beam tester can be operated manually or it can be programmed for automated operation. In the automated mode, the beam tester is controlled by a microcomputer (installed at the test site) which communicates with a minicomputer at the central computing facility. The data are recorded and later processed by computer to obtain the desired graphical presentations.
Eskenazi, R.; Williams, D. S.
Feature extraction involves the transformation of a raw video image to a more compact representation of the scene in which relevant information about objects of interest is retained. The task of the low-level processor is to extract object outlines and pass the data to the high-level process in a format that facilitates pattern recognition tasks. Due to the immense computational load caused by processing a 256x256 image, even a fast minicomputer requires a few seconds to complete this low-level processing. It is, therefore, necessary to consider hardware implementation of these low-level functions to achieve real-time processing speeds. The project's objective was to implement a system in which the continuous feature extraction process is not affected by dynamic changes in the scene, varying lighting conditions, or object motion relative to the cameras. Due to the high bandwidth (3.5 MHz) and serial nature of the TV data, a pipeline processing scheme was adopted as the overall architecture of this system. Modularity in the system is achieved by designing circuits that are generic within the overall system.
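The kind of low-level operation whose cost motivates the hardware pipeline can be made concrete with a simple gradient-magnitude edge pass: a 256x256 frame requires roughly 65,000 neighborhood computations per frame, which is what pushed such work into dedicated circuits. The sketch below is a generic illustration, not the system's actual operators, which the abstract does not specify.

```python
def edge_magnitude(image):
    """Approximate gradient magnitude |dx| + |dy| at each interior pixel of a
    2-D image given as a list of rows; border pixels are left at zero."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            dx = image[r][c + 1] - image[r][c - 1]  # horizontal difference
            dy = image[r + 1][c] - image[r - 1][c]  # vertical difference
            out[r][c] = abs(dx) + abs(dy)
    return out

# A vertical step edge: the response is nonzero only near the step column.
img = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
edges = edge_magnitude(img)
```

Because each output pixel depends only on a fixed local neighborhood of the serial video stream, the computation maps naturally onto the pipeline architecture the abstract describes.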
Phillips, J. C.; Penaflor, B. G.; Pham, N. Q.; Piglowski, D. A.
This operating year marks an upgrade to the computer system charged with control and data acquisition for the neutral beam injection heating system at the DIII-D National Fusion Facility, funded by the US Department of Energy and operated by General Atomics (GA). This upgrade represents the third and latest major revision to a system which has been in service for over twenty years. The first control and data acquisition computers were four 16-bit minicomputers running a proprietary operating system. Each of the four controlled two ion sources over a dedicated CAMAC highway. In a 1995 upgrade, the system evolved to two 32-bit Motorola minicomputers running a version of UNIX. Each computer controlled four ion sources with two CAMAC highways per CPU. This latest upgrade builds on the same logical organization, but makes significant advances in cost, maintainability, and the degree to which the system is open to future modification. The new control and data acquisition system is formed of two 2 GHz Intel Pentium 4-based PCs running the LINUX operating system. Each PC drives two CAMAC serial highways using a combination of Kinetic Systems PCI-standard CAMAC hardware drivers and a low-level software driver written in-house expressly for this device. This paper discusses the overall system design and implementation details, describing actual operating experience for the initial six months of operation.
Hasanul Basher, A.M.
The CAMAC-based control system for the 25-MV Tandem Accelerator at HHIRF uses two Perkin-Elmer 32-bit minicomputers: a message-switching computer and a supervisory computer. Two operator consoles are located on one of the six serial highways. Operator control is provided by means of a console CRT, trackball, assignable shaft encoders, and meters. The message-switching computer transmits and receives control information on the serial highways. At present, the CRT pages with updated parameters can be displayed, and parameters can be controlled, only from the two existing consoles, one in the Tandem control room and the other in the ORIC control room. It has become necessary to expand the control capability to several other locations in the building. With the expansion of control and monitoring capability of accelerator parameters to other locations, the operators will be able to control and observe the result of the control action at the same time. This capability will be useful in the division's new Radioactive Ion Beam project. Since the new control console will be PC-based, the existing page format will be changed. The PC will communicate with the Perkin-Elmer through RS-232 with the aid of a communication protocol. The hardware configuration has been established, and a software program that reads the pages from shared memory and a communication protocol have been developed. The following sections present the implementation strategy, work completed, future action plans, and the functional details of the communication protocol.
The acoustic wave generated by sudden thermal stress is used to obtain information non-invasively on the composition and structure of the stressed body. One or more acoustic transducers are coupled with the surface of the body to intercept the acoustic wave and generate a corresponding electrical signal. The sudden thermal stress is induced by a pulse of radiation which deposits energy causing a rapid, but very small, rise of temperature. The radiation may be ionizing radiation, such as high energy electrons, photons (x-rays), neutrons, or other charged particles. The radiation may also be non-ionizing radiation, such as rf and microwave electromagnetic radiation and ultrasonic radiation. The electrical signal from the acoustic transducer is amplified and supplied to a digitizer, which provides a continuous stream of digital words corresponding to samples of the amplified signal. Because in most situations of practical interest the signal-to-noise ratio of a single pulse is much less than unity, it is necessary to signal-average the signals from many successive pulses. This is accomplished with a minicomputer or data processor suitably interfaced with the digitizer. The resulting data can then be suitably displayed as an image on a CRT display, plotted, or numerically printed out.
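The signal-averaging step rests on the fact that uncorrelated noise shrinks as the square root of the number of pulses averaged, while the repeatable acoustic signal does not. A minimal sketch with invented pulse and noise parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_pulses = 512, 400
t = np.linspace(0.0, 1.0, n_samples)
signal = 0.1 * np.exp(-((t - 0.3) / 0.05) ** 2)  # weak acoustic pulse

acc = np.zeros(n_samples)
for _ in range(n_pulses):
    # each digitized shot: the true signal buried in unit-variance noise (s/n << 1)
    acc += signal + rng.normal(0.0, 1.0, n_samples)
avg = acc / n_pulses

# residual noise std shrinks roughly by a factor of sqrt(n_pulses) = 20
residual_std = float((avg - signal).std())
```

Averaging 400 pulses pulls the per-sample noise from unit variance down to roughly 0.05, enough to make the 0.1-amplitude pulse visible.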
Daniel, I. M.
Experimental methods were developed for testing and characterization of composite materials at strain rates ranging from quasi-static to over 500 s(sup -1). Three materials were characterized, two graphite/epoxies and a graphite/S-glass/epoxy. Properties were obtained by testing thin rings 10.16 cm (4 in.) in diameter, 2.54 cm (1 in.) wide, and six to eight plies thick under internal pressure. Unidirectional 0 degree, 90 degree, and 10 degree off-axis rings were tested to obtain longitudinal, transverse, and in-plane shear properties. In the dynamic tests internal pressure was applied explosively through a liquid and the pressure was measured with a calibrated steel ring. Strains in the calibration and specimen rings were recorded with a digital processing oscilloscope. The data were processed and the equation of motion solved numerically by the mini-computer attached to the oscilloscope. Results were obtained and plotted in the form of dynamic stress-strain curves. Longitudinal properties which are governed by the fibers do not vary much with strain rate with only a moderate (up to 20 percent) increase in modulus. Transverse modulus and strength increase sharply with strain rate reaching values up to three times the static values. The in-plane shear modulus and shear strength increase noticeably with strain rate by up to approximately 65 percent. In all cases ultimate strains do not vary significantly with strain rates.
We describe a convenient interface between UNIX-based workstations or minicomputers and supercomputers such as the CRAY series machines. Using this interface, the user can issue commands entirely on the UNIX system, with remote compilation, loading, and execution performed on the supercomputer. The interface is not a remote login interface. Rather, the domains of various UNIX utilities such as compilers, archivers, and loaders are extended to include the CRAY. The user need know essentially nothing about the CRAY operating system, commands, or filename restrictions. Standard UNIX utilities will perform CRAY operations transparently. UNIX command names and arguments are mapped to corresponding CRAY equivalents, suitable options are selected as needed, UNIX directory tree filenames are coerced to allowable CRAY names, and all source and output files are automatically transferred between the machines. The primary purpose of the software is to allow the programmer to benefit from the interactive features of UNIX systems, including screen editors, software maintenance utilities such as make and SCCS, and in general to avail themselves of the large set of UNIX text manipulation features. The interface was designed particularly to support development of very large multi-file programs, possibly consisting of hundreds of files and hundreds of thousands of lines of code. All CRAY source is kept on the workstation. We have found that, using the software, the complete program development phase for a large CRAY application may be performed entirely on a workstation.
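The command-name and argument mapping might be sketched as a translation table. The CRAY tool names and flag mappings below are illustrative assumptions, not the interface's actual table:

```python
# Hypothetical UNIX -> CRAY command map; the real interface covered many
# utilities and also handled file transfer and option selection.
CRAY_MAP = {
    "cc": ("CFT77", {"-c": "COMPILE", "-O": "OPT=FULL"}),
    "ld": ("SEGLDR", {"-o": "BIN="}),
}

def to_cray(argv):
    """Translate a UNIX-style command line into an assumed CRAY equivalent,
    coercing unmapped arguments (e.g. filenames) to upper-case CRAY form."""
    cmd, *args = argv
    cray_cmd, flag_map = CRAY_MAP[cmd]
    out = [cray_cmd]
    for a in args:
        out.append(flag_map.get(a, a.upper()))
    return out
```

The point of the design is that the user types only the left-hand (UNIX) side; the right-hand side, and the file shuttling around it, stays invisible.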
Hull, M L; Mote, C D
Measurement problems can be classified into instrumentation, data transmission and recording, and analysis. This paper focuses on the transmission of multichannel, high-volume, high-frequency, high-accuracy data. Boot-ski dynamometer and skier velocity anemometer data provide 13 channels of 8-mV maximum signals requiring 8-microvolt resolution, or 4.45-Newton dynamometer resolution. The data transmission system features durability, power consumption of approximately 10 Watts, weight of 4.54 kp, range greater than 3,500 m, frequency response of 250 Hz, accuracy of 1 percent, temperature stability, and dynamic range of plus or minus 2 inches. The transducer signals are amplified to plus or minus 10 V for the 100-kbps PCM system. Special AC amplifiers, driven by an amplitude-stabilized power oscillator, were designed for elimination of radio frequency interference (RFI), improved stability, and high signal/noise ratio. Sixteen words are sequentially sampled at 521/sec: 13 data, 2 frame counters, and 1 sync. The ground station consists of the PCM decoder with real-time capability and an analog tape recorder. Data are subsequently buffered and formatted onto digital tape by minicomputer. PMID:4469157
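Frame alignment for such a PCM stream, with 16 words per frame (1 sync, 13 data, 2 counters), can be sketched as follows; the sync-word value and the software realization are assumptions, since the actual decoding was done in ground-station hardware:

```python
SYNC = 0xFAF3  # illustrative sync-word value

def deframe(words):
    """Align a PCM word stream on the sync word and split it into
    16-word frames: 1 sync, 13 data channels, 2 frame counters."""
    frames, i = [], 0
    while i + 16 <= len(words):
        if words[i] == SYNC:
            frames.append({"data": words[i + 1:i + 14],
                           "counters": words[i + 14:i + 16]})
            i += 16
        else:
            i += 1  # slip one word until sync is reacquired
    return frames

# one junk word followed by two well-formed frames
stream = [0x1234] + ([SYNC] + list(range(13)) + [0, 1]) * 2
frames = deframe(stream)
```

The frame counters let the ground station detect dropped frames before the data are committed to digital tape.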
Leung, K.; Bicknell, T.; Vines, K.
To take full advantage of the synthetic aperture radar (SAR) to be flown on board the European Space Agency's Remote Sensing Satellite (ERS-1) (1989) and the Canadian Radarsat (1990), the implementation of a receiving station in Alaska is being studied to gather and process SAR data pertaining in particular to regions within the station's range of reception. The current SAR data processing requirement is estimated to be on the order of 5 minutes per day. The Interim Digital SAR Processor (IDP), which was under continual development through Seasat (1978) and SIR-B (1984), can process slightly more than 2 minutes of ERS-1 data per day. On the other hand, the Advanced Digital SAR Processor (ADSP), currently under development for the Shuttle Imaging Radar C (SIR-C, 1988) and the Venus Radar Mapper (VRM, 1988), is capable of processing ERS-1 SAR data at a real-time rate. To better suit the anticipated ERS-1 SAR data processing requirement, both a modified IDP and an ADSP derivative are being examined. For the modified IDP, a pipelined architecture is proposed for the minicomputer plus array processor arrangement to improve throughput. For the ADSP derivative, a simplified version is proposed to enhance ease of implementation and maintainability while maintaining real-time throughput rates. These processing systems are discussed and evaluated.
Shahidi, A. K.; Crapo, J. A.; Schlegelmilch, R. F.; Reinhart, R. C.; Petrik, E. J.; Walters, J. L.; Jones, R. E.
NASA Lewis Research Center has applied artificial intelligence to an advanced ground terminal. This software application is being deployed as an experimenter interface to the link evaluation terminal (LET) and was named Space Communication Artificial Intelligence for the Link Evaluation Terminal (SCAILET). The high-burst-rate (HBR) LET provides 30-GHz transmit and 20-GHz receive capability at 220 Mbps for wide-band communications technology experiments with the Advanced Communication Technology Satellite (ACTS). The HBR-LET terminal consists of seven major subsystems. A minicomputer controls and monitors these subsystems through an IEEE-488 or RS-232 protocol interface. Programming scripts (test procedures defined by design engineers) configure the HBR-LET and permit data acquisition. However, the scripts are cryptic, difficult to use and maintain, and require a steep learning curve, which discourages experimenters from utilizing the full capabilities of the HBR-LET system. An intelligent assistant module was developed as part of the SCAILET software. The intelligent assistant addresses critical experimenter needs by solving and resolving problems encountered during configuration of the HBR-LET system. The intelligent assistant is a graphical user interface with an expert system running in the background. To further assist and familiarize an experimenter, an on-line hypertext documentation module was developed and included in the SCAILET software.
In this HBR interview, CEO Michael Ruettgers speaks in detail about the managerial practices that have allowed EMC to anticipate and exploit disruptive technologies, market opportunities, and business models ahead of its competitors. He recounts how the company repeatedly ventured into untested markets, ultimately transforming itself from a struggling maker of minicomputer memory boards into a data storage powerhouse and one of the most successful companies of the past decade. The company has achieved sustained and nearly unrivaled revenue, profit, and share-price growth through a number of means. Emphasizing timing and speed, Ruettgers says, is critical. That's meant staggering products rather than developing them sequentially and avoiding the excessive refinements that slow time to market. Indeed, a sense of urgency, Ruettgers explains, has been critical to EMC's success. Processes such as quarterly goal setting and monthly forecasting meetings help maintain a sense of urgency and allow managers to get early glimpses of changes in the market. So does an environment in which personal accountability is stressed and the corporate focus is single-minded. Perhaps most important, the company has procedures to glean insights from customers. Intensive forums involving EMC engineers and leading-edge customers, who typically push for unconventional solutions to their problems, often yield new product features. Similarly, a customer service system that includes real-time monitoring of product use enables EMC to understand customer needs firsthand. PMID:11189457
Software for data archiving and data display was developed on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high-radiant-energy scanning device designed to measure the radiant flux energies expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site, with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.
Wentz, D.L. Jr.
The Multi-Access Storage Subnetwork (MASS) is the latest addition to the Octopus computer network at Lawrence Livermore Laboratory. The subnetwork provides shared mass storage for the Laboratory's multiple-host computer configuration. A Control Data Corp. 38500 Mass Storage facility is interfaced by MASS to the large, scientific worker computers to provide an on-line capacity of 1 trillion bits of user-accessible data. The MASS architecture offers a very high performance approach to the management of large data storage, as well as the high degree of reliability needed for operation in the Laboratory's timesharing environment. MASS combines state-of-the-art digital hardware with an innovative system philosophy. The key LLL design features of the subnetwork that contribute to the high performance include the following: a data transmission scheme that provides a 40-Mbit/s channel over distances of up to 1000 ft, a large metal-oxide-semiconductor (MOS) memory buffer controlled by a 24-port memory multiplexer with an aggregate data rate of 280 Mbit/s, and a set of high-speed microprocessor-based controllers driving the commercial mass storage units. Reliability of the system is provided by a completely redundant network, including two control minicomputer systems. Also enhancing reliability is error detection and correction in the MOS memory. A hardware-generated checksum is carried with each file throughout the entire network to ensure integrity of user files. 6 figures, 1 table.
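A software stand-in for the per-file checksum idea (the real MASS checksum was hardware-generated, and its algorithm is not given in the abstract; the simple additive scheme below is an assumption):

```python
def checksum32(data: bytes) -> int:
    """Simple 32-bit additive checksum: a software analogue of a value
    computed once, carried with the file, and re-verified at each hop."""
    s = 0
    for b in data:
        s = (s + b) & 0xFFFFFFFF
    return s

# computed when the file enters the network, carried alongside it thereafter
stored = checksum32(b"user file contents")
```

Any node along the path can recompute the checksum and compare it with the stored value; a mismatch flags corruption before the file reaches the user.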
Legerton, V. N.; Mottinger, N. A.
An interactive, menu-driven computer program was written to streamline the orbit determination process during the critical launch support phase of a mission. Residing on a virtual memory minicomputer, this program retains in core the quantities needed to obtain a least squares estimate of the spacecraft trajectory, with interactive displays to assist in rapid radio metric data evaluation. Menu-driven displays allow real-time filter and data strategy development. Graphical and tabular displays can be sent to a laser printer for analysis without exiting the program. Products generated by this program feed back to the main orbit determination program in order to further refine the estimate of the trajectory. The final estimate provides a spacecraft ephemeris which is transmitted to the mission control center and used for antenna pointing and frequency predict generation by the Deep Space Network. The development and implementation process of this program differs from that used for most other navigation software by allowing the users to check important operating features during development and have changes made as needed.
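The least-squares core of such an orbit estimate, linearized about a reference trajectory, reduces to solving an overdetermined linear system for a state correction. A toy sketch with an invented three-parameter state (the program's actual filter and data types are not described here):

```python
import numpy as np

# Linearized least squares: observation residuals dz = H @ dx + noise,
# where H holds partials of the radio metric observables with respect
# to the state correction dx. All numbers below are made up.
rng = np.random.default_rng(1)
dx_true = np.array([0.5, -0.2, 0.1])        # "true" trajectory correction
H = rng.normal(size=(50, 3))                # observation partials (50 obs)
dz = H @ dx_true + rng.normal(0.0, 0.01, size=50)

dx_hat, *_ = np.linalg.lstsq(H, dz, rcond=None)
err = float(np.abs(dx_hat - dx_true).max())
```

Keeping H and the residuals in core is what makes the interactive re-fitting fast: changing the data strategy only requires re-solving, not re-reading the tracking data.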
Egan, J. T.; Hart, J.; Burt, S. K.; Macelroy, R. D.
A visualization of molecular models can lead to a clearer understanding of the models. Sophisticated graphics devices supported by minicomputers make it possible for the chemist to interact with the display of a very large model, altering its structure. In addition to user interaction, the need arises also for other ways of displaying information. These include the production of viewgraphs, film presentation, as well as publication quality prints of various models. To satisfy these needs, the display capability of the Ames Interactive Modeling System (AIMS) has been enhanced to provide a wide range of graphics and plotting capabilities. Attention is given to an overview of the AIMS system, graphics hardware used by the AIMS display subsystem, a comparison of graphics hardware, the representation of molecular models, graphics software used by the AIMS display subsystem, the display of a model obtained from data stored in molecule data base, a graphics feature for obtaining single frame permanent copy displays, and a feature for producing multiple frame displays.
Lawrence Livermore National Laboratory's (LLNL) Computer Integrated Manufacturing (CIM) project's goal is to implement a wide variety of Computer Aided Engineering (CAE) systems to support our engineering staff. As we move to routine operation, we are addressing the problems of integrated information flow. This paper describes how Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), analysis, and information systems interact and provide vital information, such as drawing release status, production job information, and analytical data. LLNL's information systems must handle a wide spectrum of classified and unclassified data in both paper and electronic form. The range of systems includes terminals, PCs, minicomputers, networks, and mainframe supercomputers. A natural progression toward stand-alone engineering workstations, PC-based CAD systems, and multiple vendors is occurring. Thus, we are taking steps to ensure that we retain system compatibility. Many such information systems have been attempted. Because results have not always been positive, we are using a pragmatic bottom-up approach to assure success. By beginning with small subsystems, and progressing to full integration, we ensure smooth information flow and provide users with information necessary for decision making. The path to data integration is strewn with obstacles and hazards. We describe many of these and the steps we are taking to remove them.
Ball, J.W.; Nordstrom, D.K.; Zachmann, D.W.
A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available: one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. The U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE also have been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)
The measurement control program includes control of the analytical procedures, analyst training and qualification, and mass and balance control. A DEC 11/750 minicomputer and an interfaced Hewlett Packard 9836 desktop computer control these functions. The training and qualification program requires the analyst to analyze known and "blind" standards. Statistical tests for precision and bias are used to determine the analyst's qualification. An analyst must be qualified before he/she is permitted to use an analytical procedure for production samples. The qualified analyst must verify the operation of the measurement system prior to analyzing production samples: balances must be tested and check-weighed, and the analytical procedure must be verified by analyzing a "blind" standard. The data from the analysis are entered into the laboratory computer, where they are determined to "pass" or "fail." The analyst is notified immediately by the laboratory computer if the standard, chemical or mass, has passed. The results of the chemical standard are transferred to a Hewlett Packard computer, where the data are evaluated routinely. The evaluation determines the bias and the uncertainty of each analytical procedure as used by the analysts. These parameters are applied to the sample results to bias-correct the results and calculate the measurement uncertainty. 1 ref., 2 figs.
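A minimal sketch of a precision-and-bias qualification check of the kind described (the acceptance limits, replicate counts, and function name below are invented, not the program's actual criteria):

```python
from statistics import mean, stdev

def qualify(results, known_value, bias_limit=0.5, precision_limit=0.4):
    """Qualify an analyst only if replicate analyses of a known standard
    show acceptable bias (mean offset from the known value) and
    acceptable precision (scatter). Limits here are illustrative."""
    bias = mean(results) - known_value
    return abs(bias) <= bias_limit and stdev(results) <= precision_limit
```

The same bias and scatter statistics, accumulated over routine "blind" standards, are what the evaluation step uses to bias-correct production results and attach measurement uncertainties.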
Sharpe, William N., Jr.
A system for measuring the relative in-plane displacement over a gage length as short as 100 micrometers is described. Two closely spaced indentations are placed in a reflective specimen surface with a Vickers microhardness tester. Interference fringes are generated when they are illuminated with a He-Ne laser. As the distance between the indentations expands or contracts with applied load, the fringes move. This motion is monitored with a minicomputer-controlled system using linear diode arrays as sensors. Characteristics of the system are: (1) gage length ranging from 50 to 500 micrometers, but 100 micrometers is typical; (2) least-count resolution of approximately 0.0025 micrometer; and (3) sampling rate of 13 points per second. In addition, the measurement technique is non-contacting and non-reinforcing. It is useful for strain measurements over small gage lengths and for crack opening displacement measurements near crack tips. This report is a detailed description of a new system recently installed in the Mechanisms of Materials Branch at the NASA Langley Research Center. The intent is to enable a prospective user to evaluate the applicability of the system to a particular problem and assemble one if needed.
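For this class of interferometric gage, fringe motion is commonly converted to displacement as delta = m * lambda / (2 sin alpha), with m the observed fringe order and alpha the angle of the diffracted fringe patterns; a sketch with an assumed alpha (the report's actual geometry may differ):

```python
from math import sin, radians

WAVELENGTH_UM = 0.6328   # He-Ne laser wavelength, micrometers
ALPHA_DEG = 42.0         # fringe-pattern angle; illustrative value

def displacement_um(fringe_orders):
    """Relative displacement between the two indentations inferred
    from fringe motion: delta = m * lambda / (2 * sin(alpha))."""
    return fringe_orders * WAVELENGTH_UM / (2.0 * sin(radians(ALPHA_DEG)))

def strain(fringe_orders, gage_length_um=100.0):
    """Average strain over the gage length between the indentations."""
    return displacement_um(fringe_orders) / gage_length_um
```

With these assumed values, one full fringe corresponds to roughly half a micrometer of displacement, which over a 100-micrometer gage length is a strain of a few times 10^-3; the quoted 0.0025-micrometer least count implies resolving a small fraction of a fringe.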
As part of the Network Consolidation Program, the 26-meter Tracking and Communication Subnet was transferred to JPL. Along with this transfer JPL assumed responsibility for tracking and navigation support for Earth orbiter missions normally tracked by the 26-meter sites. The High Earth Orbiter (HEO) Multimission Navigation Facility was formed as a component of the Deep Space Network (DSN) Tracking System for the purpose of supporting Earth orbiter missions and certain classes of deep space missions. This facility has been implemented on a dedicated VAX 11/780 minicomputer within the Network Operations Control Center (NOCC). The primary function of the system is to process radio metric data and estimate the orbit of a spacecraft in near-Earth or deep space environment. The system is capable of processing radio metric data in near-real time and providing the quick turnaround required for Earth orbiter operations. It is also capable of generating precision spacecraft ephemeris for use by the NOCC Support Subsystem and external agencies. This article discusses the implementation and functional operation of the Multimission Navigation Subsystem and describes the support that has been provided for an array of missions.
Nyholm, R.A.; Brough, W.G.; Rector, N.L.
The Spent Fuel Test - Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granitic rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. This multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system collects and processes data from more than 900 analog instruments. This report documents the design and functions of the hardware and software elements of the Data Acquisition System and describes the supporting facilities which include environmental enclosures, heating/air-conditioning/humidity systems, power distribution systems, fire suppression systems, remote terminal stations, telephone/modem communications, and workshop areas. 9 figures.
The Spent Fuel Test-Climax (SFT-C) is a test of the retrievable, deep geologic storage of commercially generated, spent nuclear reactor fuel in granite rock. Eleven spent fuel assemblies, together with 6 electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the US Department of Energy Nevada Test Site. On June 2, 1978, Lawrence Livermore National Laboratory (LLNL) secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. The multi-year duration test is located in a remote area and is unattended much of the time. An extensive array of radiological safety and geotechnical instrumentation is deployed to monitor the test performance. A dual minicomputer-based data acquisition system (DAS) collects and processes data from more than 900 analog instruments. This report documents the software element of the LLNL developed SFT-C Data Acquisition System. It defines the operating system and hardware interface configurations, the special applications software and data structures, and support software.
This report presents the results of work aimed at understanding the hydrodynamic behavior of the H-Coal reactor. A summary of the literature search related to the fluid dynamic behavior of gas/liquid/solid systems has been presented. Design details of a cold flow unit were discussed. The process design of this cold flow model followed practices established by HRI in their process development unit. The cold flow unit has been used to conduct experiments with nitrogen, kerosene, or kerosene/coal char slurries, and HDS catalyst, which at room temperature have properties similar to those existing in the H-Coal reactor. Mineral oil, a high-viscosity liquid, was also used. The volume fractions occupied by gas/liquid slurries and catalyst particles were determined by several experimental techniques. The use of a minicomputer for data collection and calculation has greatly accelerated the analysis and reporting of data. Data on nitrogen/kerosene/HDS catalyst and coal char fines are presented in this paper. Correlations identified in the literature search were utilized to analyze the data. From this analysis it became evident that the Richardson-Zaki correlation describes the effect of slurry flow rate on catalyst expansion. Three-phase fluidization data were analyzed with two models.
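The Richardson-Zaki correlation relates superficial velocity to bed voidage as u = u_t * eps^n, with u_t the particle terminal velocity; solving for voidage gives a one-line expansion estimate (the exponent and velocities below are illustrative, not the report's data):

```python
def richardson_zaki_voidage(u, u_t, n=4.65):
    """Invert the Richardson-Zaki correlation u = u_t * eps**n for the
    bed voidage eps (n = 4.65 applies in the creeping-flow limit;
    it decreases at higher particle Reynolds numbers)."""
    return (u / u_t) ** (1.0 / n)

# illustrative: slurry velocity 1 cm/s, particle terminal velocity 10 cm/s
eps = richardson_zaki_voidage(u=0.01, u_t=0.10)
```

Raising the slurry flow rate raises eps and hence the expanded height of the catalyst bed, which is the effect the correlation was found to describe.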
Bly, S.; Buzzell, C.; Smith, G.
JANUS is a two-sided interactive color graphic simulation in which human commanders can direct their forces, each trying to accomplish their mission. This competitive synthetic battlefield is used to explore the range of human ingenuity under conditions of incomplete information about enemy strength and deployment. Each player can react to new situations by planning new unit movements, using conventional and nuclear weapons, or modifying unit objectives. Conventional direct fire among tanks, infantry fighting vehicles, helicopters, and other units is automated subject to constraints of target acquisition, reload rate, range, suppression, etc. Artillery and missile indirect fire systems deliver conventional munitions, smoke, and nuclear weapons. Players use reconnaissance units, helicopters, or fixed wing aircraft to search for enemy unit locations. Counter-battery radars acquire enemy artillery. The JANUS simulation at LLL has demonstrated the value of the computer as a sophisticated blackboard. A small dedicated minicomputer is adequate for detailed calculations, and may be preferable to sharing a more powerful machine. Real-time color interactive graphics are essential to allow realistic command decision inputs. Competitive human-versus-human synthetic experiences are intense and well-remembered. 2 figures.
In a boiling water reactor (BWR), when there is closure of the main steam isolation valves (MSIVs), the energy generated in the core will be transferred to the pressure suppression pool (PSP) via steam that flows out the relief valves. The pool has limited heat capacity as a heat sink and, hence, if there is no reactor trip, there is the possibility that the pool temperature may rise beyond acceptable limits. In the past few years there have been several studies of this problem with emphasis on calculating the power level in the core. In the present study the authors consider the power level as well as the resulting PSP temperature and take into account different assumptions regarding plant parameters and operator actions. The Brookhaven National Laboratory (BNL) Plant Analyzer (BPA) with a BWR/4 plant model was used to do the calculations. The BPA is a special-purpose minicomputer in combination with state-of-the-art thermal-hydraulic modeling that can calculate the behavior of a BWR plant at faster-than-real-time speeds.
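The pool's limited heat capacity can be illustrated with a lumped energy balance, dT/dt = P / (M * cp), ignoring heat removal and pool stratification; all numbers below are invented, not BPA results:

```python
def pool_temperature(T0_c, power_w, mass_kg, cp=4186.0, dt_s=1.0, steps=3600):
    """Lumped suppression-pool heatup: dT/dt = P / (M * cp), assuming
    all relief-valve steam energy is deposited in the pool and none
    is removed (a deliberately pessimistic simplification)."""
    T = T0_c
    for _ in range(steps):
        T += power_w * dt_s / (mass_kg * cp)
    return T

# one hour of 100 MW into a 3,000-tonne pool (illustrative numbers only)
T_final = pool_temperature(T0_c=30.0, power_w=1.0e8, mass_kg=3.0e6)
```

Even this crude balance shows a rise of tens of degrees per hour at modest decay-power levels, which is why the analysis tracks both the core power and the resulting pool temperature.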
Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O'Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Wilt, M.; Goldstein, N.E.; Haught, J.R.; Morrison, H.F.
Sixteen shallow and deep controlled source electromagnetic soundings were performed in Buena Vista Valley, near Winnemucca, Nevada, to investigate an intra-basement conductor previously detected with magnetotellurics. The survey was carried out with the LBL EM-60 system using a remote magnetic reference for low-frequency geomagnetic noise cancellation, 100-m- and 2.8-km-diameter transmitter loops, and a minicomputer for in-field processing. EM soundings were made at distances from 0.5 to 30 km from three loops over the frequency range 0.02 to 500 Hz. Data were interpreted by means of 1-D inversions and the resulting layered models were pieced together to yield an approximate 2-D geoelectric model along the N-S axis of the valley. The EM soundings and one MT sounding show a 3 to 7 ohm-m zone at a depth of four to seven km. The conductor appears to be deepest at the northern end of the valley and shallowest beneath a basement ridge that seems to divide Buena Vista Valley into two basinal structures. Similar intra-basement conductors are also reported 50 to 75 miles south in the Carson Sink-Fallon areas, suggesting a common source, probably related to an anomalously hot, thin crust.
Dayton, J. A., Jr.; Ebihara, B. T.
An electron beam test facility, consisting of a precision multidimensional manipulator built into an ultra-high-vacuum bell jar, was designed, fabricated, and operated at Lewis Research Center. The position within the bell jar of a Faraday cup, which samples current in the electron beam under test, is controlled by the manipulator. Three orthogonal axes of motion are controlled by stepping motors driven by digital indexers, and the positions are displayed on electronic totalizers. In the transverse directions, the limits of travel are approximately ±2.5 cm from the center with a precision of 2.54 microns (0.0001 in.); in the axial direction, approximately 15.0 cm of travel is permitted with an accuracy of 12.7 microns (0.0005 in.). In addition, two manually operated motions are provided: the pitch and yaw of the Faraday cup with respect to the electron beam can be adjusted to within a few degrees. The current is sensed by pulse transformers and the data are processed by a dual channel box car averager with a digital output. The beam tester can be operated manually or it can be programmed for automated operation. In the automated mode, the beam tester is controlled by a microcomputer (installed at the test site) which communicates with a minicomputer at the central computing facility. The data are recorded and later processed by computer to obtain the desired graphical presentations.
Wheeler, D. J.; Ridd, M. K.
Procedures followed in developing a test case geographic information system derived primarily from remotely sensed data for the North Cache Soil Conservation District (SCD) in northern Utah are outlined. The North Cache SCD faces serious problems regarding water allocation, flood and geologic hazards, urban encroachment into prime farmland, soil erosion, and wildlife habitat. Four fundamental data planes were initially entered into the geo-referenced data base: (1) land use/land cover information for the agricultural and built-up areas of the valley obtained from various forms of aerial photography; (2) vegetation/land cover in the mountains classified digitally from LANDSAT; (3) geomorphic terrain units derived from aerial photography and soil maps; and (4) digital terrain maps obtained from DMA digital data. The land use/vegetation/land cover information from manual photographic and LANDSAT interpretation was joined digitally into a single data plane with an integrated legend and segmented into quadrangle units. These were merged with the digitized geomorphic units and the digital terrain data using a Prime 400 minicomputer. All data planes were geo-referenced to a UTM coordinate grid.
Bannister, P. R.
Extremely low frequency (ELF) measurements are made of the transverse horizontal magnetic field strength received in Connecticut. The AN/BSR-1 receiver consists of an AN/UYK-20 minicomputer, a signal timing and interface unit (STIU), a rubidium frequency time standard, two magnetic tape recorders, and a preamplifier. The transmission source of these far-field (1.6-Mm range) measurements is the U.S. Navy's ELF Wisconsin Test Facility (WTF), located in the Chequamegon National Forest in north central Wisconsin, about 8 km south of the village of Clam Lake. The WTF consists of two 22.5-km antennas, one of which is situated approximately in the north-south (NS) direction and the other approximately in the east-west (EW) direction. Each antenna is grounded at both ends. The electrical axis of the WTF EW antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The electrical axis of the WTF NS antenna is 11 deg east of north at 45 Hz and 14 deg east of north at 75 Hz. The WTF array can be steered electrically. Its radiated power is approximately 0.5 W at 45 Hz and 1 W at 75 Hz. This report compares results of 45-Hz-band data taken during 1983 to 1984 with previous 45-Hz-band measurements.
Marshall, R. E.; Kriegler, F. J.
A hybrid processor is described offering enough handling capacity and speed to process efficiently the large quantities of multispectral data that can be gathered by scanner systems such as MSDS, SKYLAB, ERTS, and ERIM M-7. Combinations of general-purpose and special-purpose hybrid computers were examined, including both analog and digital types as well as all-digital configurations. The current trend toward lower costs for medium-scale digital circuitry suggests that the all-digital approach may offer the better solution within the time frame of the next few years. The study recommends and defines such a hybrid digital computing system, in which both special-purpose and general-purpose digital computers would be employed. The tasks of recognizing surface objects would be performed in a parallel, pipelined digital system, while the tasks of control and monitoring would be handled by a medium-scale minicomputer system. A program to design and construct a small, prototype, all-digital system has been started.
Baylor, L.R.; Burris, R.D.; Greenwood, D.E.; Stewart, K.A.
A stand-alone control and data acquisition system for pellet injectors has been designed and implemented to support pellet injector development at Oak Ridge National Laboratory (ORNL) and to enable ORNL pellet injectors to be installed on various fusion experimental devices. The stand-alone system permits LOCAL operation of the injector from a nearby panel and REMOTE operation from the experiment control room. Major components of the system are (1) an Allen-Bradley PLC 2/30 programmable controller, (2) a VAX minicomputer, and (3) a CAMAC serial highway interface. The programmable logic controller (PLC) is used to perform all control functions of the injector. In LOCAL, the operator interface is provided by an intelligent panel system that has a keypad and pushbutton module programmed from the PLC. In REMOTE, the operator interfaces via a VAX-based color graphics display and uses a trackball and keyboard to issue commands. Communications between the remote and local controls and to the fusion experiment supervisory system are via the CAMAC highway. The VAX archives transient data from pellet shots and trend data acquired from the PLC. Details of the hardware and software design and the operation of the system are presented in this paper. 3 refs., 1 fig.
Clem, R. G.; Park, F. W.; Kirsten, F. A.; Phillips, S. L.; Binnall, E. P.
The construction and use of a portable, microprocessor-controlled anodic stripping voltammeter for on-site simultaneous analysis of copper, lead, and cadmium in tap water is discussed. The instrumental system comprises a programmable controller, which permits keying in analytical parameters such as sparge time and plating time; a rotating cell for efficient oxygen removal and amalgam formation; and data handling via a minicomputer or analog pen recorder. Plating and stripping potentials are controlled by a digital potentiostat; stripping is done using a staircase waveform with measurement of the current after a 1-msec delay. In this way charging current effects are minimized. Analysis of a Berkeley, California tap water showed 3 ± 1 µg/L lead, 22 ± 0.3 µg/L copper, and less than 0.2 µg/L cadmium; ten samples of Seattle, Washington tap water showed 1-1000 µg/L Cu and 1-2 µg/L Pb. Recommendations are given for a next-generation instrument system.
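The charging-current suppression described above follows from the exponential decay of the capacitive spike after each potential step. A minimal sketch, with a hypothetical step size and cell time constant (not the instrument's actual firmware):

```python
import math

def staircase_scan(e_start, e_end, step_mv=5.0):
    """Yield the potential (V) at each step of a staircase waveform."""
    n = round(abs(e_end - e_start) / (step_mv / 1000.0))
    sign = 1.0 if e_end > e_start else -1.0
    for i in range(n + 1):
        yield e_start + sign * i * step_mv / 1000.0

def sampled_current(i_faradaic, i_charging0, t_sample_s=1e-3, rc_s=2e-4):
    """Current measured t_sample after a step: the capacitive charging
    spike decays as exp(-t/RC), so a 1-ms delay largely removes it."""
    return i_faradaic + i_charging0 * math.exp(-t_sample_s / rc_s)

# With a 0.2-ms cell time constant, a 1-ms delay suppresses the charging
# spike by exp(-5), i.e. to under 1% of its initial value.
```

The design choice is that only the slowly decaying faradaic (stripping) current survives the delay, which is what makes the technique sensitive at µg/L levels.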
Tanaka, T.; Ogawa, T.; Igarashi, K.; Fujii, R.
The 50-MHz Doppler radar installed at Syowa Station (69 deg 00'S, 39 deg 35'E), Antarctica, in 1982 can detect meteor echoes continuously if an operator assigns the meteor-mode operation to the radar. The radar has two narrow antenna beams (4 deg in the horizontal plane), one directed toward geomagnetic south and the other toward approximately geographic south, with a crossing angle of about 33 deg. The minicomputer annexed to the radar controls the transmission and reception of the 50-MHz wave. If the receiver detects a meteor echo, a flag signal is sent to the computer. The computer then determines the echo range (R) with a time resolution of 1 µs and samples the Doppler signal and echo intensity at that range every 200 µs for 1 s. The line-of-sight velocity (V_D) of the echo trail is calculated from the output of the Doppler signal detection circuit, which has an offset frequency, by using the so-called zero-crossing method. The echo amplitude decay time, calculated by a least-mean-square method, is used to obtain the ambipolar diffusion coefficient (D) and then to calculate the echo height (H). The data are stored on magnetic tapes together with V_D, D, H, and R for later analysis in Japan. About 120 days of observations were made during 1982-1983. Some early results are presented.
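The zero-crossing velocity estimate and the decay-time-to-diffusion step can be sketched as follows. The underdense-echo relation tau = lambda^2 / (16 pi^2 D) is the standard meteor-radar assumption, not a detail taken from this radar's firmware, and the offset frequency used in the test is hypothetical:

```python
import math

LAMBDA = 3e8 / 50e6   # 50-MHz radar wavelength: 6 m

def zero_crossing_freq(samples, dt):
    """Estimate signal frequency from the zero-crossing count
    (each full cycle contributes two sign changes)."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / (2.0 * (len(samples) - 1) * dt)

def line_of_sight_velocity(f_meas, f_offset):
    """V_D = (lambda / 2) * Doppler shift; the offset frequency keeps
    the sign of the shift recoverable."""
    return 0.5 * LAMBDA * (f_meas - f_offset)

def ambipolar_diffusion(decay_time_s):
    """Underdense-echo relation tau = lambda^2 / (16 pi^2 D), inverted
    for the ambipolar diffusion coefficient D (m^2/s)."""
    return LAMBDA ** 2 / (16.0 * math.pi ** 2 * decay_time_s)
```

At 50 MHz (lambda = 6 m) each hertz of Doppler shift corresponds to 3 m/s of line-of-sight velocity; the echo height H then follows from D via an atmospheric model, as in the abstract.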
Whitley, S. L.
The NASA Earth Resources Laboratory has developed a transferable system for processing Landsat and disparate data with capabilities for digital data classification, georeferencing, overlaying, and data base management. This system is known as the Earth Resources Data Analysis System. The versatility of the system has been demonstrated with applications in several disciplines. A description is given of a low-cost data system concept that is suitable for transfer to an available in-house minicomputer or to a low-cost computer purchased for this purpose. Software packages are described that process Landsat data to produce surface cover classifications and that geographically reference the data to the UTM projection. Programs are also described that incorporate several sets of Landsat derived information, topographic information, soils information, rainfall information, etc., into a data base. Selected application algorithms are discussed and sample products are presented. The types of computers on which the low-cost data system concept has been implemented are identified, typical implementation costs are given, and the source where the software may be obtained is identified.
Methods for the minicomputer-based measurement and display of cardiac images obtained from fluoroscopy, permitting an accurate assessment of functional changes, are discussed. Heart contours and discrete points can be digitized automatically or manually, with the recorded image in a video, cine, or print format. As each frame is digitized it is assigned a code name identifying the data source, experiment, run, view, and frame, and the images are filed for future reference in any sequence. Two views taken at the same point in the heart cycle are used to compute the spatial position of the ventricle apex and the midpoint of the aortic valve. The remainder of the points on the chamber border are corrected for the linear distortion of the X-rays by projection to a plane containing the chord between the apex and the aortic valve center and oriented so that lines perpendicular to the chord are parallel to the image intensifier face. The image of the chamber surface is obtained by generating circular cross sections with diameters perpendicular to the major chord. The transformed two- and three-dimensional imagery can be displayed in either static or animated form using a graphics terminal.
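Once circular cross sections have been generated along the apex-to-valve chord, a chamber volume follows by disc summation. This is an illustrative sketch assuming equally spaced sections; the paper's actual sampling along the chord is not specified:

```python
import math

def chamber_volume(diameters_cm, chord_length_cm):
    """Disc-summation volume: stack circular cross sections of equal
    thickness along the apex-to-aortic-valve chord."""
    dh = chord_length_cm / len(diameters_cm)
    return sum(math.pi * (d / 2.0) ** 2 * dh for d in diameters_cm)
```

For a uniform chamber 2 cm in diameter and 10 cm long this reduces to the cylinder volume, 10*pi cubic centimetres; real border data would give a tapered stack of discs.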
During 1991, the Space Simulation Facility conducted a survey to assess the requirements and analyze the merits for purchasing a new thermal vacuum data processing system for its facilities. A new, integrated, cost effective PC-based system was purchased which uses commercial off-the-shelf software for operation and control. This system can be easily reconfigured and allows its users to access a local area network. In addition, it provides superior performance compared to that of the former system which used an outdated mini-computer and peripheral hardware. This paper provides essential background on the old data processing system's features, capabilities, and the performance criteria that drove the genesis of its successor. This paper concludes with a detailed discussion of the thermal vacuum data processing system's components, features, and its important role in supporting our space-simulation environment and our capabilities for spacecraft testing. The new system was tested during the ANIK E spacecraft test, and was fully operational in November 1991.
Skiles, J. W.; Schulbach, C. H.
Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently on microcomputers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorizing and in-lining capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.
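The observation that vectorizing part of the runtime yields only a modest overall gain is an instance of Amdahl's law, sketched below with illustrative numbers (not measurements from the paper):

```python
def amdahl_speedup(vector_fraction, vector_factor):
    """Overall speedup when only `vector_fraction` of the runtime is
    accelerated by `vector_factor`; the scalar remainder limits the
    total gain no matter how fast the vector hardware is."""
    return 1.0 / ((1.0 - vector_fraction) + vector_fraction / vector_factor)
```

For example, a 10x-faster vector unit applied to 90 percent of the runtime gives only about a 5.3x overall speedup, which is why restructuring the remaining scalar code is where the projected six- to ten-fold gain would come from.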
A single unified control system is used for all of the Fermilab accelerators and storage rings, from the LINAC to the Tevatron and antiproton source. A review of the general features is given: these include a 'host' system consisting of a number of minicomputers integrated with many distributed microprocessors in a variety of subsystems; use of an in-house-developed protocol, GAS, for communication between the two classes of machines; and a Parameter Page program, designed in conjunction with the system database, which allows a wide variety of quantities to be read and set in a coherent fashion. Recent developments include the implementation of a block transfer and 'fast time plot' facility through CAMAC, inclusion of several new computers in the host, a better understanding of system throughput, greatly improved reliability, advent of programs which sequence a large number of independent operations, and the construction of new hardware subsystems. Possible future system upgrades are briefly presented. Finally, the utilization of a quite large software staff, at a time when the system is no longer under construction, is discussed.
Pandya, J.M.; Marinkovich, P.S.; Mysore, R.K.
The nuclear industry, beset with the increasing costs of constructing new nuclear plants, needs new initiatives. One, recently developed, is the use of CAD-IGS techniques, which have proven beneficial in reducing construction costs. This methodology has been used successfully in the aerospace and automotive industries for many years. Westinghouse CAD-IGS experience with an overseas nuclear plant, now under construction, demonstrates it to be an effective tool for design verification and project management, with excellent capabilities for application to new designs and operating-plant support. This is accomplished through the CAD plant model and associated data base, which result in reduced human error, more complete preengineering before the start of construction, and effective space utilization. Furthermore, the data base accurately represents the as-built plant, which is essential for expediting future plant upgrades. Computer-based modeling is less expensive than conventional scale modeling, and current technology developments, viz., optical scanners, photogrammetry, IGES, and advanced minicomputers, will further improve its cost-effectiveness.
Gat, N.; Cohen, L.M.; Witte, A.B.; Denison, M.R.
A laser pyrolysis technique for assessing the influence of rapid heating on volatile release from pulverized coal is described. Coal particles entrained in an inert gas jet are irradiated from two opposing directions by a c-w HF chemical laser beam providing an energy flux of up to 2.2 kW/cm^2. Particle temperature as a function of position in the beam is measured using a novel scanning three-color pyrometer operating at 0.8, 0.95, and 1.1 µm. Data are reduced in near real time on a MINC minicomputer using a nonlinear least-squares fit to a gray-body radiation curve. Particle velocity is obtained from LDV measurements. Heating rates between 10^5 and 10^6 K/sec with asymptotic particle temperatures between 2000 and 2900 K were measured. The evolution of volatile release during the heating of the particles was observed for residence times between 1 and 20 msec. An added 0.5% tungsten carbide powder was used to evaluate the mass balance obtained from natural tracers such as Ti. It was found that some degree of vaporization of all the ash elements occurs and that the natural mineral matter could not serve as a reliable reference for mass balance determination. The experimental data were reduced to yield global kinetic rate constants.
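Under the gray-body assumption the emissivity cancels in the ratio of intensities at two of the pyrometer's wavelengths, so temperature can be recovered from the ratio alone. The sketch below uses the closed-form two-color Wien inversion as a simplified stand-in for the paper's three-color nonlinear least-squares fit:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(lam_m, temp_k):
    """Gray-body spectral intensity in the Wien approximation; the
    emissivity prefactor is omitted since it cancels in the ratio."""
    return lam_m ** -5 * math.exp(-C2 / (lam_m * temp_k))

def ratio_temperature(lam1_m, lam2_m, ratio):
    """Invert the two-color ratio I(lam1)/I(lam2) for temperature:
    ln R = 5 ln(lam2/lam1) + (C2/T)(1/lam2 - 1/lam1)."""
    return C2 * (1.0 / lam2_m - 1.0 / lam1_m) / (
        math.log(ratio) - 5.0 * math.log(lam2_m / lam1_m))
```

With three wavelengths (0.8, 0.95, and 1.1 µm) the system is overdetermined, which is why the actual instrument fits all channels by nonlinear least squares instead of inverting a single ratio.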
Studies to be performed by the French NADIR project on techniques to interconnect Danube local-area networks via the Telecom-1 satellite communications system are described. Danube links up to 256 stations by coaxial cable at a predicted rate of 1.8 Mbit/sec in CSMA/CD mode and offers both connectionless and connection-oriented service; Telecom 1 (as simulated by ANIS) provides call-per-call or semipermanent TDMA simplex or full-duplex linkage (point-to-point or multipoint) at 2.4-2000 kbit/sec with a delay of 300 msec and a bit error rate (BER) lower than 10^-6 for 99 percent of the time (or 10^-10 with forward-error correction). The problems of routing, error control, and flow control are considered. A simple scheme involving routing by filtering the address fields, no error and flow control, and minicomputers serving as gateways in each Danube system is chosen for the point-to-point simulations, while the multipoint connections will be made using receiver half-gates and single sender half-gates at each Danube system. Block diagrams are provided.
Benson, James William
SpaceDev is in the market for a deep space launch, and we are not going to pay $50 million for it. There is an ongoing debate about the elasticity of demand related to launch costs. On the one hand there are the "big iron" NASA and DoD contractors who say that there is no market for small or inexpensive launchers, that lowering launch costs will not result in significantly more launches, and that the current uncompetitive pricing scheme is appropriate. On the other hand are commercial companies which compete in the real world, and who say that there would be innumerable new launches if prices were to drop dramatically. I participated directly in the microcomputer revolution, and saw first hand what happened to the big iron computer companies who failed to see or heed the handwriting on the wall. We are at the same stage in the space access revolution that personal computers were in the late '70s and early '80s. The global economy is about to be changed in ways that are just as unpredictable as those changes wrought after the introduction of the personal computer. Companies which fail to innovate and keep producing only big iron will suffer the same fate as IBM and all the now-extinct mainframe and minicomputer companies. A few will remain, but with a small share of the market, never again to be in a position to dominate.
Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.
Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) program and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970's-vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of controller computers at a relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are being designed around IBM-PC-compatible 80486-based microcomputers and a real-time Unix operating system (LynxOS); X-windows/Motif, IB, and serial interfaces have been chosen. This design minimizes short- and long-term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage, with the first systems targeted for production in mid-1993.
Though computers were still housed in large, air-conditioned rooms and were often programmed via decks of punched cards, a number of chemists were making effective use of them in teaching as well as research. Eight papers in this issue reported on computer programs. Castleberry, Culp, and Lagowski described an educational experiment in which the effectiveness of computer-based instruction was evaluated in a general chemistry course. Breneman reported on minicomputer-aided instruction, and others described programs that normalized grades, calculated heats of combustion, analyzed results of physical chemistry experiments, solved secular equations, calculated mass spectra, and calculated rate constants. Output devices were usually character based and graphics were rudimentary, as exemplified by the teletype plots of hydrogenic orbitals shown above. The editorial, "On Abandoning Grading and Reconsidering Standards" advocated neither and presented four arguments for maintaining traditional standards and realistic grades. This immediately followed half a decade when poor grades might result in being drafted and serving in Vietnam and student protests were based on government policy rather than whether or not to enforce rules against student drinking. Editor Lippincott pointed out that after several years few students return to thank a professor for making things easy, but many express appreciation for challenges that proved they could do more than they thought they could.
Strand, R.; Cox, T.L.; Sjoreen, A.; Alvic, D.
The National Center for Toxicological Research (NCTR) is the basic research arm of the US Food and Drug Administration (FDA). The NCTR has upgraded and standardized its computer operations on Digital Equipment Corporation VAX minicomputers using Software AG's ADABAS data base management system for all research applications. The NCTR is currently performing a large study to improve the functionality of the animal husbandry systems and applications called Breeding/Multigeneration Support System (BMSS). When functional, it will operate on VAX equipment using the ADABAS data base management system, TDMS, and COBOL. Oak Ridge National Laboratory (ORNL) is supporting NCTR in the design, prototyping, and software engineering of the BMSS. This document summarizes the internal design elements that include data structures, file structures, and system attributes that were required to facilitate the decision support requirements defined in the external design work. Prototype pseudocode then was developed for the recommended system attributes and file and data structures. Finally, ORNL described the processing requirements including the initial access of the BMSS, integration of the existing INLIFE system and the STUDY DEFINITION system under development, data system initialization and maintenance, and BMSS testing and verification. This document describes ORNL's recommendations for the internal design of the BMSS. ORNL will provide research support to NCTR in the additional phases of systems life cycle development for BMSS. ORNL has prepared this document according to NCTR's Standard Operating Procedures for Systems Development. 6 figs., 5 tabs.
Buhrmaster, M.A.; Duncan, L.D.; Hume, R.; Huntley, A.F.
In designing and implementing a computer aided instruction (CAI) prototype for the Navy Management System Support Office (NAVMASSO) as part of the Shipboard Nontactical ADP Program (SNAP), Data Systems Engineering Organization (DSEO) personnel developed techniques for automating the production of COBOL source code for CAI applications. This report discusses the techniques applied, which incorporate the use of a database management system (DBMS) to store, access, and manipulate the data necessary for producing COBOL source code automatically. The objective in developing the code generation techniques is to allow for the production of future applications in an efficient and reliable manner. This report covers the standards and conventions defined, the database tables created, and the host language interface program used for generating COBOL source files. The approach is responsible for producing 85 percent of an 830,000-line COBOL application in approximately one year's time. This code generation program generated transaction processing routines to be executed under the DM6TP NAVMASSO distributed processing environment on the Honeywell DPS-6 minicomputers, representing the standard SNAP-I environment.
Lietzke, E T
Minicomputers, microcomputers, supercomputers, disks, disk drives, diskettes, bits, bytes, input, output, COBOL, FORTRAN, RPG, memory, "K", CRT, CRU, etc., etc. What does all this mean, and more importantly, what can it do for you, the medical group manager, possibly faced with your first hard decision regarding the use of computers in the day-to-day business of running your medical group? The author provides a glossary of these and many other terms for you and puts them in proper perspective so you are more easily able to wade through the morass of computerese jargon. Further, a number of considerations are presented which will give you a base from which to operate, and a series of questions are presented relative to each of the considerations. Your answers will generate enough information relative to any system to enable you to make a reasoned decision about that system. It is not the intent of this article to tell you which system is best--you must judge that for yourself as you evaluate the questions and considerations presented in the light of your unique setting. PMID:10261827
This paper describes the use of DATATRIEVE to automatically generate large blocks of COBOL code for use in a computer aided instruction application. Pro/DATATRIEVE was used to store information describing the attributes of screens that make up the instruction and tests, and to maintain data on expected student responses for each screen. DATATRIEVE procedures were used to create standard, error-free COBOL code that controls the presentation and order of flow of the screens within the lessons and tests. The code generated by DATATRIEVE was incorporated into skeleton COBOL programs through use of the COPY statement. This project involves the use of DATATRIEVE on a DEC Professional-380. The generated code segments were electronically transmitted to a Honeywell DPS-6 minicomputer where the application was compiled and executed. However, the procedures used for the code generation could be applied to any DATATRIEVE implementation and any application system that requires standardized procedures for processing a variety of functions (i.e., transactions).
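The same template-driven pattern can be sketched in Python. The screen records and the COBOL fragment below are hypothetical stand-ins; the actual DATATRIEVE domains and generated code are not reproduced here:

```python
# Hypothetical screen-attribute records standing in for the DATATRIEVE domain
SCREENS = [
    {"name": "LESSON-01", "next": "TEST-01"},
    {"name": "TEST-01", "next": "LESSON-02"},
]

# One standardized COBOL paragraph per screen; fixing an error in the
# template fixes it in every generated paragraph at once.
TEMPLATE = """\
       DISPLAY-{name} SECTION.
           PERFORM SHOW-SCREEN USING '{name}'.
           GO TO DISPLAY-{next}.
"""

def generate_cobol(screens):
    """Emit the presentation/flow-control code for every screen record."""
    return "".join(TEMPLATE.format(**s) for s in screens)

print(generate_cobol(SCREENS))
```

As in the paper, the generated fragments would then be pulled into skeleton programs (there via the COBOL COPY statement), so only the screen data, not the control logic, varies between lessons.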
Strand, R.; Cox, T.L.; Sjoreen, A.; Alvic, D.
The National Center for Toxicological Research (NCTR) is the basic research arm of the US Food and Drug Administration (FDA). The NCTR has upgraded and standardized its computer operations on Digital Equipment Corporation VAX minicomputers using Software AG's ADABAS data base management system for all research applications. The NCTR is currently performing a large study to improve the functionality of the animal husbandry systems and applications called Breeding/Multigeneration Support System (BMSS). When functional, it will operate on VAX equipment using the ADABAS data base management system, TDMS, and COBOL. Oak Ridge National Laboratory (ORNL) is supporting NCTR in the design, prototyping, and software engineering of the BMSS. This document summarizes the external design elements that include data entry screens, screen reports, summary and status reports, and functional definitions of screen and report data. ORNL will provide research support to NCTR in the additional phases of systems life cycle development for BMSS. ORNL has prepared this document according to NCTR's Standard Operating Procedures for Systems Development. 8 figs., 7 tabs.
Potel, Michael J.; MacKay, Steven A.; Sayre, Richard E.
Extracting quantitative information from movie film and video recordings has always been a difficult process. The Galatea motion analysis system represents an application of some powerful interactive computer graphics capabilities to this problem. A minicomputer is interfaced to a stop-motion projector, a data tablet, and real-time display equipment. An analyst views a film and uses the data tablet to track a moving position of interest. Simultaneously, a moving point is displayed in an animated computer graphics image that is synchronized with the film as it runs. Using a projection CRT and a series of mirrors, this image is superimposed on the film image on a large front screen. Thus, the graphics point lies on top of the point of interest in the film and moves with it at cine rates. All previously entered points can be displayed simultaneously in this way, which is extremely useful in checking the accuracy of the entries and in avoiding omission and duplication of points. Furthermore, the moving points can be connected into moving stick figures, so that such representations can be transcribed directly from film. There are many other tools in the system for entering outlines, measuring time intervals, and the like. The system is equivalent to "dynamic tracing paper" because it is used as though it were tracing paper that can keep up with running movie film. We have applied this system to a variety of problems in cell biology, cardiology, biomechanics, and anatomy. We have also extended the system using photogrammetric techniques to support entry of three-dimensional moving points from two (or more) films taken simultaneously from different perspective views. We are also presently constructing a second, lower-cost, microcomputer-based system for motion analysis in video, using digital graphics and video mixing to achieve the graphics overlay for any composite video source image.
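For the idealized case of two orthogonal, distortion-free views, combining digitized coordinates into a 3-D moving point reduces to the sketch below; the Galatea system's actual photogrammetric model handles perspective views and is more involved:

```python
import math

def biplane_point(xa, za, yb, zb):
    """Combine two orthogonal digitized views into one 3-D point:
    view A supplies (x, z), view B supplies (y, z); the z coordinate,
    seen in both films, is averaged to absorb digitizing error."""
    return (xa, yb, (za + zb) / 2.0)

def displacement(p, q):
    """Straight-line distance a tracked point moved between two frames."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
```

Dividing such frame-to-frame displacements by the film's frame interval gives the velocities that motion analysis is ultimately after.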
Hoffman, John A.; Gluck, R.; Sridhar, S.
The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for analysis of wind energy systems in real time. The new processor has been named the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel. The modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is very much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CU's) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CU's are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU causes several tasks to be done in each cycle, including an IO operation and the combination of a multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CU's are interfaced to each other and to other portions of the simulation using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CU's can be added in any number to share a given computational load. This flexible bus feature is very different from many other parallel processors, which usually have a throughput limit because of rigid bus architecture.
Riddick, J. C.; Greenwood, A. C.; Stuart, W. F.
ARGOS, an instrument and recording system for performing standard geomagnetic observatory functions at the three U.K. observatories, is described. Operations are controlled by a minicomputer at each observatory communicating by modem through the public telephone system to a central computer in Edinburgh. A fluxgate magnetometer provides 10-s samples of the variation field at ±1 nT resolution. These values are filtered to produce 1-min values (centred on the minute), which in turn are used to compute hourly mean values. These and other derivatives of the raw data are stored in the observatory computer and transmitted to Edinburgh daily by operator command, where they are transferred to a data file which can be accessed by users via the Joint Academic Network (JANET). Observatory data are available to users by this means within 24 h and can be made available in near real time by special arrangement. ARGOS performs standardization measurements of the values of the field components remotely, using a proton magnetometer and employing standard techniques for absolute observations. Comparison of these Baseline Reference Measurements with manual absolute observations shows them to be acceptable for baseline adoption. In the first year of operation it has been established that ARGOS produces data of comparable quality to the classical standards expected from the U.K. observatories. Data loss has been less than 1%. Further automation of routine procedures (e.g. magnetogram plotting, editing, baseline adoption and the adjustment of minute values day by day) will be the focus of attention in the next 2 years.
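The data-reduction chain described above (10-s samples to centred 1-min values to hourly means) can be sketched with plain boxcar averages. This is a minimal illustration only: the abstract does not give ARGOS's actual filter weights, so a simple mean stands in for them.

```python
import numpy as np

def minute_values(samples_10s):
    # Reduce six 10-s field samples to one 1-min value per minute.
    # A plain mean is used here; the real ARGOS filter weights are not
    # given in the abstract, so this is illustrative only.
    s = np.asarray(samples_10s, dtype=float)
    n = len(s) // 6
    return s[:n * 6].reshape(n, 6).mean(axis=1)

def hourly_mean(minute_vals):
    # Hourly mean from sixty 1-min values.
    return float(np.mean(np.asarray(minute_vals, dtype=float)[:60]))
```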
Hayward, M J; Robandt, P V; Meek, J T; Thomson, M L
Several Finnigan-MAT mass spectrometer data systems were networked together to achieve the following two primary objectives: (1) to allow access to mass spectrometry data and data processing functions from remote locations without affecting simultaneous data acquisition at the instruments, and (2) to electronically archive mass spectrometry data at a central location on a high-capacity, fast-access device that allows rapid retrieval of archived data for all data processing operations at all locations. UNIX workstations, IBM PC/AT-compatible computers, and Data General Nova minicomputers were connected via Ethernet interfaces to allow rapid data transfer among all systems as well as X-Windows access to UNIX-based systems. Bridging techniques were used to isolate possible high-traffic areas of the network and to enable security measures for adequate protection of files. Additionally, serial connections were made through a Northern Telecom phone system to provide remote terminal access to the Data General Nova-based systems. Use of these connectivity techniques significantly improved productivity by allowing retrieval, processing, and printing of data from remote locations, such as office areas, without affecting data acquisition, processing, and printing performed simultaneously at the instruments. For archival purposes, data files are electronically stored on high-capacity magneto-optical disks for rapid retrieval. A high-capacity fixed disk is also available for centralized temporary data file storage. A Digital Equipment Corporation DECstation 2100 UNIX workstation was used as the file server for centralized data storage while being simultaneously utilized as the data system computer for one of the mass spectrometers.
Utilization of this UNIX-based file server system in conjunction with Ethernet connectivity techniques provides a centralized, rapid-access, high-capacity, cost- and space-efficient method for electronic archival of mass spectrometry raw data recorded at all of the instruments. PMID:24226001
Laenen, Antonius; Smith, Winchell
The acoustic velocity meter (AVM), also referred to as an ultrasonic flowmeter, has been an operational tool for the measurement of streamflow since 1965. Very little information is available concerning AVM operation, performance, and limitations. The purpose of this report is to consolidate information in such a manner as to provide a better understanding of the application of this instrumentation to streamflow measurement. AVM instrumentation is highly accurate and nonmechanical. Most commercial AVM systems that measure streamflow use the time-of-travel method to determine a velocity between two points. The systems operate on the principle that the point-to-point upstream travel time of sound is longer than the downstream travel time, and this difference can be monitored and measured accurately by electronics. AVM equipment has no practical upper limit of measurable velocity if sonic transducers are securely placed and adequately protected. AVM systems used in streamflow measurement generally operate with a resolution of ±0.01 meter per second, but this depends on system frequency, path length, and signal attenuation. In some applications the performance of AVM equipment may be degraded by multipath interference, signal bending, signal attenuation, and variable streamline orientation. Presently used minicomputer systems, although expensive to purchase and maintain, perform well. Increased use of AVM systems probably will be realized as smaller, less expensive, and more conveniently operable microprocessor-based systems become readily available. Available AVM equipment should be capable of flow measurement in a wide variety of situations heretofore untried. New signal-detection techniques and communication linkages can provide additional flexibility to the systems so that operation is possible in more river and estuary situations.
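The time-of-travel principle stated above reduces to a standard relation: for an acoustic path of length L crossing the flow at an angle theta, the line velocity follows from the difference of the reciprocal upstream and downstream travel times. A minimal sketch (variable names are illustrative, not taken from the report):

```python
import math

def path_velocity(L, theta_deg, t_down, t_up):
    # Line velocity along the acoustic path, from the standard
    # time-of-travel relation v = L/(2 cos(theta)) * (1/t_down - 1/t_up).
    # L: path length (m); theta_deg: angle between path and flow direction.
    cos_t = math.cos(math.radians(theta_deg))
    return (L / (2.0 * cos_t)) * (1.0 / t_down - 1.0 / t_up)
```

Note that the speed of sound cancels out of the reciprocal difference, which is why the method is insensitive, to first order, to temperature-driven changes in sound speed.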
This second annual report under Contract No. 31-109-39-4200 covers the period July 1, 1978 through August 31, 1979. The program demonstrates the feasibility of the nickel-zinc battery for electric vehicle propulsion. The program is divided into seven distinct but highly interactive tasks collectively aimed at the development and commercialization of nickel-zinc technology. These basic technical tasks are separator development, electrode development, product design and analysis, cell/module battery testing, process development, pilot manufacturing, and thermal management. A Quality Assurance Program has also been established. Significant progress has been made in the understanding of separator failure mechanisms, and a generic category of materials has been specified for the 300+ deep discharge (100% DOD) applications. Shape change has been reduced significantly. A methodology has been generated with the resulting hierarchy: cycle life cost, volumetric energy density, peak power at 80% DOD, gravimetric energy density, and sustained power. Generation I design full-sized 400-Ah cells have yielded in excess of 70 W/lb at 80% DOD. Extensive testing of cells, modules, and batteries is done in a minicomputer-based testing facility. The best life attained with electric vehicle-size cell components is 315 cycles at 100% DOD (1.0V cutoff voltage), while four-cell (approx. 6V) module performance has been limited to about 145 deep discharge cycles. The scale-up of processes for production of components and cells has progressed to facilitate component production rates of thousands per month. Progress in the area of thermal management has been significant, with the development of a model that accurately represents heat generation and rejection rates during battery operation. For the balance of the program, cycle life of > 500 has to be demonstrated in modules and full-sized batteries. 40 figures, 19 tables. (RWR)
Chapter 1 discusses the quantum mechanical formalism used for describing the interaction between magnetic dipoles that dictates the appearance of a spectrum. The NMR characteristics of liquids and liquid crystals are stressed. Chapter 2 reviews the theory of multiple quantum and two-dimensional NMR. Properties of typical spectra and phase cycling procedures are discussed. Chapter 3 describes a specific application of heteronuclear double quantum coherence to the removal of inhomogeneous broadening in liquids. Pulse sequences have been devised which cancel out any contribution from this inhomogeneity to the final spectrum. An interpretation of various pulse sequences for the case of ¹³C and ¹H is given, together with methods of spectral editing by removal or retention of the homo- or heteronuclear J coupling. The technique is applied to a demonstration of high resolution in both frequency and spatial dimensions with a surface coil. In Chapter 4, multiple quantum filtered 2-D spectroscopy is demonstrated as an effective means of studying randomly deuterated molecules dissolved in a nematic liquid crystal. Magnitudes of dipole coupling constants have been determined for benzene and hexane, and their signs and assignments found from high order multiple quantum spectra. For the first time, a realistic impression of the conformation of hexane can be estimated from these results. Chapter 5 is a technical description of the MDB DCHIB-DR11W parallel interface which has been set up to transfer data between the Data General Nova 820 minicomputer, interfaced to the 360 MHz spectrometer, and the VAX 11/730. It covers operation of the boards, physical specifications and installation, and programs for testing and running the interface.
Cok, Ronald S.
A prototype digital image processor for enhancing photographic images has been built in the Research Laboratories at Kodak. This image processor implements a particular version of each of the following algorithms: photographic grain and noise removal, edge sharpening, multidimensional image segmentation, image-tone reproduction adjustment, and image-color saturation adjustment. All processing, except for segmentation and analysis, is performed by massively parallel and pipelined special-purpose hardware. This hardware runs at 10 MHz and can be adjusted to handle any size of digital image. The segmentation circuits run at 30 MHz. The segmentation data are used by three single-board computers for calculating the tonescale adjustment curves. The system, as a whole, has the capability of completely processing 10 million three-color pixels per second. The grain removal and edge enhancement algorithms represent the largest part of the pipelined hardware, operating at over 8 billion integer operations per second. The edge enhancement is performed by unsharp masking, and the grain removal is done using a collapsed Walsh-Hadamard transform filtering technique (U.S. Patent No. 4549212). These two algorithms can be realized using four basic processing elements, some of which have been implemented as VLSI semicustom integrated circuits. These circuits implement the algorithms with a high degree of efficiency, modularity, and testability. The digital processor is controlled by a Digital Equipment Corporation (DEC) PDP 11 minicomputer and can be interfaced to electronic printing and/or electronic scanning devices. The processor has been used to process over a thousand diagnostic images.
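Unsharp masking, named above as the edge-enhancement method, adds back the difference between an image and a blurred copy of itself. The sketch below uses a 3x3 box blur; the actual kernel and gain in the Kodak hardware are not specified in the text, so both are illustrative.

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    # Sharpen by adding (image - blurred image) back to the image.
    # The blur here is a 3x3 box filter with edge padding; 'amount'
    # scales the high-frequency boost. Both choices are illustrative.
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    blur = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img + amount * (img - blur)
```

A uniform image is unchanged by this operation, since the blurred copy equals the original and the correction term vanishes.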
Farrukh, U. O.
Managing the thermal energy that accumulates within a solid-state laser material under active pumping is of critical importance in the design of laser systems. Earlier models that calculated the temperature distribution in laser rods were one-dimensional and assumed laser rods of infinite length. This program presents a new model which solves the temperature distribution problem for finite-dimensional laser rods and calculates both the radial and axial components of temperature distribution in these rods. The modeled rod is either side-pumped or end-pumped by a continuous or a single-pulse pump beam. (At the present time, the model cannot handle a multiple-pulse pump source.) The optical axis is assumed to be along the axis of the rod. The program also assumes that it is possible to cool different surfaces of the rod at different rates. The user defines the laser rod material characteristics, determines the types of cooling and pumping to be modeled, and selects the time frame desired via the input file. The program contains several self-checking schemes to prevent overwriting memory blocks and to provide simple tracing of information in case of trouble. Output for the program consists of 1) an echo of the input file, 2) diffusion properties, radius and length, and time for each data block, 3) the radial increments from the center of the laser rod to the outer edge of the laser rod, and 4) the axial increments from the front of the laser rod to the other end of the rod. This program was written in Microsoft FORTRAN77 and implemented on a Tandon AT with a 287 math coprocessor. The program can also run on a VAX 750 minicomputer. It has a memory requirement of about 147 KB and was developed in 1989.
Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.
Brower, R W; van Dorp, W G; Vogel, J A; Roelandt, J R
A computer-assisted system is described which speeds and extends the quantitative interpretation of M-mode echocardiographic recordings. The system consists of a digitizing tablet, minicomputer, TV monitor and a hard copy device. M-mode echocardiograms are placed on the digitizing surface and traced using the digitizing pen. The entered signal includes the endocardial surfaces of the anterior and posterior left ventricular wall for at least one cycle, and two Q waves from a simultaneously recorded ECG to identify end diastole and heart rate. End systole is determined automatically as corresponding to the minimum LV dimension. Results of analysis include continuous plots of estimated volume and circumferential fiber shortening rate (CFSR) vs time. Determinations of special interest are also displayed: end-diastolic volume (EDV) and end-systolic volume (ESV), ejection fraction, cardiac output, mean and peak CFSR. M-mode echocardiograms obtained from 25 normal volunteers are used to evaluate the system. The standard error of the estimate of the computer-assisted system is comparable to the error between observers; furthermore, the computer system adds no significant systematic or random error. Comparison between M-mode estimated volumes and angiographically determined values has been described previously, and the standard error of the estimate (Sy.x) here is significantly greater. The main advantages of this system are: 1. a continuous plot of estimated LV volume and CFSR is provided; 2. beat-to-beat analyses are facilitated; 3. the automatic determination of end systole removes possible errors in judgement made previously; 4. it is time-saving when one considers the amount of data obtained. With these advantages and the generally satisfactory performance in the clinical trials, this system appears to have extended the clinical quantitative capabilities of M-mode echocardiograms. PMID:1102317
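The derived quantities listed above follow from standard definitions; for example, ejection fraction and cardiac output are computed from the end-diastolic and end-systolic volumes as:

```python
def ejection_fraction(edv, esv):
    # EF = (EDV - ESV) / EDV, the standard definition.
    return (edv - esv) / edv

def cardiac_output(edv, esv, heart_rate):
    # Stroke volume (EDV - ESV) times heart rate;
    # mL volumes and beats/min give mL/min.
    return (edv - esv) * heart_rate
```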
Zumberge, Mark Andrew
We have developed a new, portable apparatus for making absolute measurements of the acceleration due to the earth's gravity. We use the method of interferometrically determining the acceleration of a freely falling corner-cube prism. The falling object is surrounded by a chamber which is driven vertically inside a fixed vacuum chamber. This falling chamber is servoed to track the falling corner-cube to shield it from drag due to background gas. In addition, the drag-free falling chamber removes the need for a magnetic release, shields the falling object from electrostatic forces, and provides a means of both gently arresting the falling object and quickly returning it to its start position, to allow rapid acquisition of data. A synthesized long-period isolation device reduces the noise due to seismic oscillations. A new type of Zeeman laser is used as the light source in the interferometer, and its wavelength is compared with that of an iodine-stabilized laser. The times of occurrence of 45 interference fringes are measured to within 0.2 nsec over a 20 cm drop and are fit to a quadratic by an on-line minicomputer. 150 drops can be made in ten minutes, resulting in a value of g having a precision of 3 to 6 parts in 10^9. Systematic errors have been determined to be less than 5 parts in 10^9 through extensive tests. Three months of gravity data have been obtained with a reproducibility ranging from 5 to 10 parts in 10^9. The apparatus has been designed to be easily portable. Field measurements are planned for the immediate future. An accuracy of 6 parts in 10^9 corresponds to a height sensitivity of 2 cm. Vertical motions in the earth's crust and tectonic density changes that may precede earthquakes are to be investigated using this apparatus.
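The quadratic fit mentioned above recovers g as twice the leading coefficient of x(t) = x0 + v0*t + (g/2)*t^2 fitted to the fringe times. A minimal sketch (the on-line minicomputer's actual algorithm is not described in the abstract, so a generic least-squares fit stands in):

```python
import numpy as np

def g_from_fringes(times, positions):
    # Least-squares quadratic fit x(t) = x0 + v0*t + (g/2)*t**2;
    # twice the leading coefficient estimates g.
    a, _, _ = np.polyfit(times, positions, 2)
    return 2.0 * a
```

In the apparatus, each fringe corresponds to a half-wavelength of fall, so the position of fringe i would be i times lambda/2.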
Slootweg, A. Peter
The multistreamer Side-Looking Seismic system presented in this paper makes a sonograph of uncovered or buried crustal topography, thus revealing the structural fabric of the oceanic basement, even when this is covered with a sedimentary layer. Major elements of the system are an airgun as a sound source, five single-channel parallel streamers and two minicomputers for signal capture and processing. The system is used simultaneously for enhanced single-channel seismic profiling and for side-looking seismics. A vertical section with an improved signal-to-noise ratio and a suppression of side-echoes is produced on a digital seismic recorder. Primary side-looking seismic output in the form of 5 profiles with different angles of incidence is obtained within 10 seconds. This part of the processing can be done in real time. In sediment-covered areas the low frequencies used cause the slanted profiles (the side beams in the primary output) to be side-looking sonar images of buried topography. The projection process yielding final side-looking output corrects for slant range deformation caused by the water column and, if necessary, for deformation caused by refraction within the sedimentary column. The result approaches a conformal map of the structure of the traversed basement. Swath width is mainly determined by water depth and refraction effects in the sediment. In Madeira abyssal plain a swath width of 8000 m was attained in a water depth of 5000 m. Within the swath, oceanic basement structures are recognized in the form of elongate more or less parallel reflectors. They are interpreted as buried spreading topography. The lack of side-echoes within fracture zones combined with typical wall signatures can be used to trace fracture zones. These features are demonstrated for an area in Madeira abyssal plain.
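The slant-range correction for the water column mentioned above is, in the simplest flat-seafloor geometry, a right-triangle relation. This sketch ignores refraction within the sedimentary column, which the system can also correct for:

```python
import math

def ground_range(slant_range, water_depth):
    # Horizontal (across-track) distance from slant range and water depth,
    # assuming a flat seafloor and straight ray paths.
    if slant_range < water_depth:
        raise ValueError("slant range cannot be shorter than the vertical path")
    return math.sqrt(slant_range ** 2 - water_depth ** 2)
```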
Cheng, Charles C.
This paper describes the design and engineering of a laser scanning system for production applications. The laser scanning techniques, the timing control, the logic design of the pattern recognition subsystem, the digital computer servo control for the loading and unloading of parts, and the laser probe rotation and its synchronization are discussed. The laser inspection machine is designed to automatically inspect the surface of precision-bored holes, such as those in automobile master cylinders, without contacting the machined surface. Although the controls are relatively sophisticated, operation of the laser inspection machine is simple. A laser light beam from a commercially available gas laser, directed through a probe, scans the entire surface of the bore. Reflected light, picked up through optics by photoelectric sensors, generates signals that are fed to a minicomputer for processing. A pattern-recognition program in the computer determines acceptance or rejection of the part being inspected. The system's acceptance specifications are adjustable and are set to the user's established tolerances. The computer-controlled laser system is capable of resolving surface finishes from 10 to 75 rms and voids or flaws from 0.0005 to 0.020 inch. Following the successful demonstration of an engineering prototype, the described laser machine has proved its capability to consistently ensure high-quality master brake cylinders. It thus provides a safety improvement for the automotive braking system. Flawless, smooth cylinder bores eliminate premature wearing of the rubber seals, resulting in a longer-lasting master brake cylinder and a safer and more reliable automobile. The results obtained from use of this system, which has been in operation for about a year as a replacement for a tedious, manual operation on one of the high-volume lines at the Bendix Hydraulics Division, have been very satisfactory.
Bonacuse, Peter J.; Kalluri, Sreeramesh
Multiaxial loading, especially at elevated temperature, can cause the inelastic response of a material to differ significantly from that predicted by simple flow rules, i.e., von Mises or Tresca. To quantify some of these differences, the cyclic high-temperature deformation behavior of a wrought cobalt-based superalloy, Haynes 188, is investigated under combined axial and torsional loads. Haynes 188 is currently used in many aerospace gas turbine and rocket engine applications, e.g., the combustor liner for the T800 turboshaft engine for the RAH-66 Comanche helicopter and the liquid oxygen posts in the main injector of the space shuttle main engine. The deformation behavior of this material is assessed through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue data base has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gauge section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73.
The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase and out-of-phase axial-torsional fatigue tests. These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for cyclic, axial stress-strain and shear stress-engineering shear strain curves.
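The von Mises equivalents and Ramberg-Osgood curves used above have the following standard forms; the incompressibility assumption in the equivalent strain and the constants in the Ramberg-Osgood fit are illustrative, not values from the paper.

```python
import math

def von_mises_equivalents(sigma, tau, eps, gamma):
    # Equivalent stress and strain for combined axial (sigma, eps) and
    # torsional (tau, engineering shear gamma) loading; the strain form
    # assumes incompressible (nu = 0.5) deformation.
    s_eq = math.sqrt(sigma ** 2 + 3.0 * tau ** 2)
    e_eq = math.sqrt(eps ** 2 + gamma ** 2 / 3.0)
    return s_eq, e_eq

def ramberg_osgood_strain(sigma, E, K, n):
    # Total strain = elastic + plastic: eps = sigma/E + (sigma/K)**(1/n).
    # E, K, n are fitted material constants.
    return sigma / E + (sigma / K) ** (1.0 / n)
```

With these equivalents, a pure torsion test and a pure axial test collapse onto one curve if the material obeys the von Mises flow rule; deviations from that collapse are what the comparisons above quantify.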
Vavrus, J. L.
The LOOK program was developed to permit a user to examine a text file in a pseudo-random access manner. Many engineering and scientific programs generate large amounts of printed output. Often this output needs to be examined in only a few places. On minicomputers (like the DEC VAX) high-speed printers are usually at a premium. One alternative is to save the output in a text file and examine it with a text editor. The slowness of a text editor, the possibility of inadvertently changing the output, and other factors make this an unsatisfactory solution. The LOOK program provides the user with a means of rapidly examining the contents of an ASCII text file. LOOK's basis of operation is to open the text file for input only and then access it in a block-wise fashion. LOOK handles the text formatting and displays the text lines on the screen. The user can move forward or backward in the file by a given number of lines or blocks. LOOK also provides the ability to "scroll" the text at various speeds in the forward or backward directions. The user can perform a search for a string (or a combination of up to 10 strings) in a forward or backward direction. Also, user-selected portions of text may be extracted and submitted for printing or placed in a file. Additional features available to the LOOK user include: cancellation of an operation with a keystroke, user-definable keys, switching the mode of operation (e.g. 80/132 column), an on-line help facility, trapping broadcast messages, and the ability to spawn a sub-process to carry out DCL functions without leaving LOOK. The LOOK program is written in FORTRAN 77 and MACRO ASSEMBLER for interactive execution and has been implemented on a DEC VAX computer using VAX/VMS with a central memory requirement of approximately 430K of 8 bit bytes. LOOK operation is terminal independent but will take advantage of the features of the DEC VT100 terminal if available. LOOK was developed in 1983.
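LOOK's read-only, block-wise access pattern can be illustrated in a few lines; the block size and ASCII decoding below are illustrative stand-ins for the VMS-specific record handling LOOK actually uses.

```python
def read_block(path, block_index, block_size=512):
    # Seek directly to one fixed-size block and read only that block,
    # so arbitrary positions in a large file are reachable without
    # loading (or being able to modify) the rest of it.
    with open(path, "rb") as f:  # read-only: the file cannot be altered
        f.seek(block_index * block_size)
        return f.read(block_size).decode("ascii", errors="replace")
```

Because only one block is read per request, the cost of jumping to the end of a very large listing is the same as reading its first page, which is the property that makes the text-editor alternative unattractive by comparison.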
Bonacuse, Peter J.; Kalluri, Sreeramesh
The cyclic, high-temperature deformation behavior of a wrought cobalt-base superalloy, Haynes 188, is investigated under combined axial and torsional loads. This is accomplished through the examination of hysteresis loops generated from a biaxial fatigue test program. A high-temperature axial, torsional, and combined axial-torsional fatigue database has been generated on Haynes 188 at 760 C. Cyclic loading tests have been conducted on uniform gage section tubular specimens in a servohydraulic axial-torsional test rig. Test control and data acquisition were accomplished with a minicomputer. The fatigue behavior of Haynes 188 at 760 C under axial, torsional, and combined axial-torsional loads and the monotonic and cyclic deformation behaviors under axial and torsional loads have been previously reported. In this paper, the cyclic hardening characteristics and typical hysteresis loops in the axial stress versus axial strain, shear stress versus engineering shear strain, axial strain versus engineering shear strain, and axial stress versus shear stress spaces are presented for cyclic in-phase and out-of-phase axial-torsional tests. For in-phase tests, three different values of the proportionality constant, lambda (the ratio of engineering shear strain amplitude to axial strain amplitude), are examined, viz., 0.86, 1.73, and 3.46. In the out-of-phase tests, three different values of the phase angle, phi (between the axial and engineering shear strain waveforms), are studied, viz., 30, 60, and 90 deg with lambda = 1.73. The cyclic hardening behaviors of all the tests conducted on Haynes 188 at 760 C are evaluated using the von Mises equivalent stress-strain and the maximum shear stress-maximum engineering shear strain (Tresca) curves. Comparisons are also made between the hardening behaviors of cyclic axial, torsional, and combined in-phase (lambda = 1.73 and phi = 0) and out-of-phase (lambda = 1.73 and phi = 90 deg) axial-torsional fatigue tests.
These comparisons are accomplished through simple Ramberg-Osgood type stress-strain functions for cyclic, axial stress-strain and shear stress-engineering shear strain curves.
Maraldi, N M; Marinelli, F; Squarzoni, S; Santi, S; Barbieri, M
The application of image analysis methods to conventional thin sections for electron microscopy, for the purpose of analyzing chromatin arrangement, is quite limited. We developed a method which utilizes freeze-fractured samples; the results indicate that the method is suitable for identifying the changes in chromatin arrangement which occur in physiological, experimental and pathological conditions. The modern era of image analysis began in 1964, when pictures of the moon transmitted by Ranger 7 were processed by a computer. This processing improved the original picture by enhancing and restoring the image, which was affected by various types of distortion. These capabilities were made possible by third-generation computers having the speed and the storage capacity required for practical use of image processing algorithms. Each image can be converted into a two-dimensional light intensity function f(x, y), where x and y are the spatial coordinates and the value of f is proportional to the gray level of the image at that point. The digital image is therefore a matrix whose elements are the pixels (picture elements). A typical digital image can be obtained with a quality comparable to monochrome TV, with a 512×512 pixel array with 64 gray levels. The magnetic disks of commercial minicomputers are thus capable of storing some tens of images, which can be processed by the image processor, converting the signal into digital form. In biological images obtained by light microscopy, digitization converts chromatic differences into gray-level intensities, allowing the contours of the cytoplasm, the nucleus and the nucleoli to be defined. The use of a quantitative staining method for DNA, the Feulgen reaction, permits evaluation of the ratio between condensed chromatin (stained) and euchromatin (unstained). The digitized images obtained by transmission electron microscopy are rich in detail at high resolution.
However, the application of image analysis techniques to these images, and especially to those of nuclei, is limited by several drawbacks: i) the thin section represents only a small fraction of the nuclear volume entirely visible in optical microscope specimens; ii) the identification of nucleosomes, of the solenoid fibres and of the higher levels of compaction of the heterochromatin is not possible in thin-sectioned specimens; iii) the differences between heterochromatin and euchromatin are based only on their grey level and do not reveal possible variations of their structural organization. Therefore, the application of image analysis to the nuclear content does not utilize the high resolution power of e.m. images, and simply quantifies the areas occupied by electron-dense chromatin with respect to the more electron-transparent ones. This result is less significant than that obtainable by optical microscopy, since electron staining is not quantitative in the way the Feulgen reaction is. On the other hand, the following problems still remain unresolved and can be clarified only by the use of quantitative image analysis: the ultrastructural organization of the different types of heterochromatin (1); the relationships between gene activation, transcription and chromatin decondensation; and the transformation of chromatin arrangement induced by exogenous agents. In order to address these problems, in recent years we have applied image analysis to cell or tissue specimens frozen in liquid nitrogen and then fractured in order to expose the inner content of the nucleus (Fig. 1). The metal replicas obtained are very suitable specimens for digitized image processing, since the fibers which give rise to the chromatin domains are exposed by the fracturing and highlighted by the shadowing as black dots with a clear white shadow (Fig. 2).
Therefore, their size and shape can be quantitatively evaluated by a digital image processor; in this way the structural elements of the chromatin fibres are also detectable inside a fractured nucleus and their relative percentage can be determined in each nuclear area (Fig. 3). This type of analysis has been initially u…
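Measuring the size of the shadowed dots amounts to labeling each connected region of "dot" pixels in a binary mask and counting its area. A minimal stdlib-only Python sketch follows; the toy mask and the choice of 4-neighbour connectivity are illustrative assumptions, not a reproduction of the authors' actual image processor.

```python
from collections import deque

# Toy binary mask: 1 = "black dot" pixel after thresholding, 0 = background.
mask = [
    [0, 1, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0],
]

def blob_areas(mask):
    """Return the pixel area of each 4-connected region of 1s,
    in scan order, via breadth-first flood fill."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

print(blob_areas(mask))  # two 3-pixel dots in this toy mask
```

From the list of areas, per-region shape descriptors and the relative percentage of each structural class within a nuclear area could be accumulated in the same pass.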
Marché, Jordan D., II
Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. He was subsequently hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen as a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly thereafter became one of five principal investigators of the original NASA Space Science Working Group. 
In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). 
This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight, thereby supporting its possible composition as graphite. Code was the recipie…
Forman, R. G.
Structural flaws and cracks may grow under fatigue-inducing loads and, upon reaching a critical size, cause structural failure to occur. The growth of these flaws and cracks may occur at load levels well below the ultimate load-bearing capability of the structure. The Fatigue Crack Growth Computer Program, NASA/FLAGRO, was developed as an aid in predicting the growth of pre-existing flaws and cracks in structural components of space systems. The earlier version of the program, FLAGRO4, was the primary analysis tool used by Rockwell International and the Shuttle subcontractors for fracture control analysis on the Space Shuttle. NASA/FLAGRO is an enhanced version of the program and incorporates state-of-the-art improvements in both fracture mechanics and computer technology. NASA/FLAGRO provides the fracture mechanics analyst with a computerized method of evaluating the "safe crack growth life" capabilities of structural components. NASA/FLAGRO could also be used to evaluate the damage tolerance aspects of a given structural design. The propagation of an existing crack is governed by the stress field in the vicinity of the crack tip. The stress intensity factor is defined in terms of the relationship between the stress field magnitude and the crack size. The propagation of the crack becomes catastrophic when the local stress intensity factor reaches the fracture toughness of the material. NASA/FLAGRO predicts crack growth using a two-dimensional model which predicts growth independently in two directions based on the calculation of stress intensity factors. The analyst can choose to use either a crack growth rate equation or a nonlinear interpolation routine based on tabular data. The growth rate equation is a modified Forman equation which can be converted to a Paris or Walker equation by substituting different values into the exponents. This equation provides accuracy and versatility and can be fit to data using standard least squares methods. 
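For reference, the classic Forman relation on which the program's modified equation is based is da/dN = C·ΔK^n / ((1 − R)·Kc − ΔK), where ΔK is the stress-intensity range, R the stress ratio, and Kc the fracture toughness. The sketch below uses illustrative placeholder constants, not NASA/FLAGRO material data.

```python
# Sketch of the classic Forman crack-growth-rate equation:
#   da/dN = C * dK**n / ((1 - R) * Kc - dK)
# C, n and Kc below are illustrative placeholders, not real material data.

def forman_rate(delta_k, r, c=1.0e-9, n=3.0, kc=60.0):
    """Crack growth per cycle, da/dN, for stress-intensity range delta_k
    (e.g. MPa*sqrt(m)), stress ratio r, and fracture toughness kc."""
    denom = (1.0 - r) * kc - delta_k
    if denom <= 0.0:
        raise ValueError("delta_k at or beyond instability; growth is unstable")
    return c * delta_k ** n / denom

# The denominator captures acceleration toward fracture: the rate grows
# without bound as delta_k approaches (1 - r) * kc.
for dk in (5.0, 20.0, 50.0):
    print(dk, forman_rate(dk, r=0.1))
```

Well below instability the denominator is nearly constant, so the relation reduces to the Paris form C′·ΔK^n, which is the sense in which substituting different exponent values recovers the simpler equations.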
Stress-intensity factor numerical values can be computed for making comparisons or checks of solutions. NASA/FLAGRO can check for failure of a part-through crack in the mode of a through crack when net ligament yielding occurs. NASA/FLAGRO has a number of special subroutines and files which provide enhanced capabilities and easy entry of data. These include crack case solutions, cyclic load spectra, nondestructive examination initial flaw sizes, table interpolation, and material properties. The material properties files are divided into two types, a user-defined file and a fixed file. Data are entered and stored in the user-defined file during program execution, while the fixed file contains already coded-in property value data for many different materials. Prompted input from CRT terminals consists of initial crack definition (which can be defined automatically), rate solution type, flaw type and geometry, material properties (if they are not in the built-in tables of material data), load spectrum data (if not included in the loads spectrum file), and design limit stress levels. NASA/FLAGRO output includes an echo of the input with any error or warning messages, the final crack size, whether or not critical crack size has been reached for the specified stress level, and a life history profile of the crack propagation. NASA/FLAGRO is modularly designed to facilitate revisions and operation on minicomputers. The program was implemented on a DEC VAX 11/780 with the VMS operating system. NASA/FLAGRO is written in FORTRAN77 and has a memory requirement of 1.4 MB. The program was developed in 1986.
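Crack-growth-rate data are conventionally tabulated as (ΔK, da/dN) pairs and interpolated on log-log axes, since the data span several decades. The sketch below shows that common convention in Python; the table values are invented for illustration, and this is not a reproduction of NASA/FLAGRO's own interpolation routine.

```python
import math

# Hypothetical (delta_K, da/dN) table, delta_K ascending, values invented.
table = [
    (5.0, 1.0e-9),
    (10.0, 1.0e-8),
    (20.0, 1.0e-7),
    (40.0, 1.0e-6),
]

def rate_from_table(delta_k, table):
    """da/dN at delta_k by linear interpolation in log-log space.
    Raises if delta_k lies outside the tabulated range (no extrapolation)."""
    if not table[0][0] <= delta_k <= table[-1][0]:
        raise ValueError("delta_k outside tabulated range")
    for (k0, r0), (k1, r1) in zip(table, table[1:]):
        if k0 <= delta_k <= k1:
            t = (math.log(delta_k) - math.log(k0)) / (math.log(k1) - math.log(k0))
            return math.exp(math.log(r0) + t * (math.log(r1) - math.log(r0)))

print(rate_from_table(14.0, table))
```

Interpolating in log-log space keeps intermediate rates consistent with the power-law character of crack-growth data, whereas linear interpolation of the raw values would overestimate rates between widely spaced points.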