Sample records for mainframe concepts computer

  1. The role of the host in a cooperating mainframe and workstation environment, volumes 1 and 2

    NASA Technical Reports Server (NTRS)

    Kusmanoff, Antone; Martin, Nancy L.

    1989-01-01

    In recent years, advancements in computer systems have prompted a move from centralized computing based on timesharing a large mainframe computer to distributed computing based on a connected set of engineering workstations. A major factor in this advancement is the increased performance and lower cost of engineering workstations. The shift from centralized to distributed computing has led to challenges associated with the residency of application programs within the system. In a combined system of multiple engineering workstations attached to a mainframe host, the question arises of how a system designer should assign applications between the larger mainframe host and the smaller, yet powerful, workstation. The concepts related to real-time data processing are analyzed, and systems are presented which use a host mainframe and a number of engineering workstations interconnected by a local area network. In most cases, distributed systems can be classified as having a single function or multiple functions and as executing programs in real time or nonreal time. In a system of multiple computers, the degree of autonomy of the computers is important; a system with one master control computer generally differs in reliability, performance, and complexity from a system in which all computers share the control. This research is concerned with generating general criteria for software residency decisions (host or workstation) for a diverse yet coupled group of users (the clustered workstations) which may need the use of a shared resource (the mainframe) to perform their functions.

  2. Alive and Kicking: Making the Case for Mainframe Education

    ERIC Educational Resources Information Center

    Murphy, Marianne C.; Sharma, Aditya; Seay, Cameron; McClelland, Marilyn K.

    2010-01-01

    As universities continually update and assess their curricula, mainframe computing is quite often overlooked, as it is often thought of as a "legacy computer." Mainframe computing appears to be either uninteresting or thought of as a technology past its prime. However, both assumptions are leading to a shortage of IS professionals in the…

  3. Performance Comparison of Mainframe, Workstations, Clusters, and Desktop Computers

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2005-01-01

    A performance evaluation of a variety of computers frequently found in a scientific or engineering research environment was conducted using synthetic and application-program benchmarks. From a performance perspective, emerging commodity processors have superior performance relative to legacy mainframe computers. In many cases, the PC clusters exhibited performance comparable to traditional mainframe hardware when 8-12 processors were used. The main advantage of the PC clusters was their cost. Regardless of whether the clusters were built from new computers or created from retired computers, their performance-to-cost ratio was superior to that of the legacy mainframe computers. Finally, the typical annual maintenance cost of legacy mainframe computers is several times the cost of new equipment such as multiprocessor PC workstations. The savings from eliminating the annual maintenance fee on legacy hardware can result in a yearly increase in total computational capability for an organization.

  4. The Rise of the CISO

    ERIC Educational Resources Information Center

    Gale, Doug

    2007-01-01

    The late 1980s was an exciting time to be a CIO in higher education. Computing was being decentralized as microcomputers replaced mainframes, networking was emerging, and the National Science Foundation Network (NSFNET) was introducing the concept of an "internet" to hundreds of thousands of new users. Security wasn't much of an issue;…

  5. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1982-01-01

    The feasibility and cost effectiveness of solving thermal analysis problems on superminicomputers is demonstrated. Conventional thermal analysis and the changing computer environment, computer hardware and software used, six thermal analysis test problems, performance of superminicomputers (CPU time, accuracy, turnaround, and cost) and comparison with large computers are considered. Although the CPU times for superminicomputers were 15 to 30 times greater than the fastest mainframe computer, the minimum cost to obtain the solutions on superminicomputers was from 11 percent to 59 percent of the cost of mainframe solutions. The turnaround (elapsed) time is highly dependent on the computer load, but for large problems, superminicomputers produced results in less elapsed time than a typically loaded mainframe computer.

  6. The Future SBSS: Migration of SBSS Functions from a Mainframe Environment to a Distributed PC-Based LAN Environment

    DTIC Science & Technology

    1993-09-01

    ...replace SBLC mainframes (1:A-8). RPC and SBLC computers are, in general, UNISYS mainframes (1:A-6). In 1997, the UNISYS mainframe contract will expire, and RPC systems will move to open systems architectures (1:A-8). At this time, the UNISYS mainframe platforms may be replaced with other platforms

  7. 45 CFR Appendix C to Part 1355 - Electronic Data Transmission Format

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... mainframe-to-mainframe data exchange system using the Sterling Software data transfer package called “SUPERTRACS.” This package will allow data exchange between most computer platforms (both mini and mainframe... 45 Public Welfare 4 2010-10-01 2010-10-01 false Electronic Data Transmission Format C Appendix C...

  8. 45 CFR Appendix C to Part 1355 - Electronic Data Transmission Format

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... mainframe-to-mainframe data exchange system using the Sterling Software data transfer package called “SUPERTRACS.” This package will allow data exchange between most computer platforms (both mini and mainframe... 45 Public Welfare 4 2011-10-01 2011-10-01 false Electronic Data Transmission Format C Appendix C...

  9. DFLOW USER'S MANUAL

    EPA Science Inventory

    DFLOW is a computer program for estimating design stream flows for use in water quality studies. The manual describes the use of the program on both the EPA's IBM mainframe system and on a personal computer (PC). The mainframe version of DFLOW can extract a river's daily flow rec...

  10. (Some) Computer Futures: Mainframes.

    ERIC Educational Resources Information Center

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  11. Teach or No Teach: Is Large System Education Resurging?

    ERIC Educational Resources Information Center

    Sharma, Aditya; Murphy, Marianne C.

    2011-01-01

    Legacy or not, mainframe education is being taught at many U.S. universities. Some computer science programs have always had some large system content, but there does appear to be a resurgence of mainframe-related content in business programs such as Management Information Systems (MIS) and Computer Information Systems (CIS). Many companies such as…

  12. CD-ROM source data uploaded to the operating and storage devices of an IBM 3090 mainframe through a PC terminal.

    PubMed

    Boros, L G; Lepow, C; Ruland, F; Starbuck, V; Jones, S; Flancbaum, L; Townsend, M C

    1992-07-01

    A powerful method of processing MEDLINE and CINAHL source data uploaded to the IBM 3090 mainframe computer through an IBM/PC is described. Data are first downloaded from the CD-ROM's PC devices to floppy disks. These disks are then uploaded to the mainframe computer through an IBM/PC equipped with the WordPerfect text editor and a computer network connection (SONNGATE). Before downloading, keywords specifying the information to be accessed are typed at the FIND prompt of the CD-ROM station. The resulting abstracts are downloaded into a file called DOWNLOAD.DOC. The floppy disks containing the information are simply carried to an IBM/PC which has a terminal emulation (TELNET) connection to the university-wide computer network (SONNET) at the Ohio State University Academic Computing Services (OSU ACS). WordPerfect (5.1) processes and saves the text in DOS format. Using the File Transfer Protocol (FTP, 130,000 bytes/s) of SONNET, the entire text containing the information obtained through the MEDLINE and CINAHL search is transferred to the remote mainframe computer for further processing. At this point, abstracts in the specified area are ready for immediate access and multiple retrieval by any PC having a network switch or dial-in connection after the USER ID, PASSWORD, and ACCOUNT NUMBER are specified by the user. The system provides the user with an on-line, very powerful and quick method of searching for words specifying diseases, agents, experimental methods, animals, authors, and journals in the research area downloaded. The user can also copy the TItles, AUthors, and SOurce, with optional parts of abstracts, into manuscripts in preparation. This arrangement serves the special demands of a research laboratory by handling MEDLINE and CINAHL source data resulting after a search is performed with keywords specified for ongoing projects. Since the Ohio State University has a centrally funded mainframe system, the data upload, storage, and mainframe operations are free.
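
    The keyword-retrieval step described above (scanning downloaded abstracts for user-specified terms) can be sketched as follows. The record layout, two-letter field tags, and sample data are assumptions for illustration, not the actual DOWNLOAD.DOC format.

```python
# Sketch of keyword retrieval over downloaded bibliographic records,
# in the spirit of the MEDLINE/CINAHL workflow described above.
# The record layout (TI/AU/SO/AB field tags, blank-line separators)
# is an assumption for illustration, not the actual file format.

def parse_records(text):
    """Split a download file into records keyed by 2-letter field tags."""
    records = []
    for chunk in text.strip().split("\n\n"):
        rec = {}
        for line in chunk.splitlines():
            tag, _, value = line.partition("- ")
            rec[tag.strip()] = value.strip()
        records.append(rec)
    return records

def search(records, keyword):
    """Return records whose abstract (AB) or title (TI) mentions keyword."""
    kw = keyword.lower()
    return [r for r in records
            if kw in r.get("AB", "").lower() or kw in r.get("TI", "").lower()]

sample = """\
TI - Soot formation in hydrocarbon combustion
AU - Miller, D. L.
SO - Combust. Flame
AB - Laser diagnostics of soot formation.

TI - Perinatal data processing
AU - Yeh, S. Y.
SO - Am. J. Obstet. Gynecol.
AB - Data reduction on a mainframe."""

hits = search(parse_records(sample), "soot")
print([r["TI"] for r in hits])   # ['Soot formation in hydrocarbon combustion']
```

    Once the records live on the shared system, the same search can be rerun with any keyword without another CD-ROM session, which is the convenience the abstract emphasizes.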

  13. Computer-generated mineral commodity deposit maps

    USGS Publications Warehouse

    Schruben, Paul G.; Hanley, J. Thomas

    1983-01-01

    This report describes an automated method of generating deposit maps of mineral commodity information. In addition, it serves as a user's manual for the authors' mapping system. Procedures were developed which allow commodity specialists to enter deposit information, retrieve selected data, and plot deposit symbols in any geographic area within the conterminous United States. The mapping system uses both micro- and mainframe computers. The microcomputer is used to input and retrieve information, thus minimizing computing charges. The mainframe computer is used to generate map plots, which are printed by a Calcomp plotter. The Selector V database system is employed for input and retrieval on the microcomputer. A general mapping program (Genmap) was written in FORTRAN for use on the mainframe computer. Genmap can plot fifteen symbol types (for point locations) in three sizes. The user can assign symbol types to data items interactively. Individual map symbols can be labeled with a number or the deposit name. Genmap also provides several geographic boundary file and window options.
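
    The interactive symbol assignment and geographic windowing that Genmap provides can be sketched roughly as follows; the commodity names, symbol names, coordinates, and window bounds are all hypothetical, chosen only to illustrate the idea.

```python
# Sketch of Genmap-style symbol assignment and geographic windowing.
# Deposit names, commodities, symbols, and coordinates are illustrative.

deposits = [
    {"name": "Deposit A", "commodity": "copper", "lat": 39.5, "lon": -105.0},
    {"name": "Deposit B", "commodity": "gold",   "lat": 44.1, "lon": -110.3},
    {"name": "Deposit C", "commodity": "copper", "lat": 35.2, "lon": -111.6},
]

# "interactive" assignment of a plot symbol to each commodity
symbols = {"copper": "triangle", "gold": "star"}

def in_window(d, lat_min, lat_max, lon_min, lon_max):
    """Genmap-style window option: keep only deposits inside the bounds."""
    return lat_min <= d["lat"] <= lat_max and lon_min <= d["lon"] <= lon_max

# plot-ready (name, symbol) pairs for a hypothetical southwestern window
plotted = [(d["name"], symbols[d["commodity"]])
           for d in deposits if in_window(d, 30, 42, -115, -100)]
print(plotted)
```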

  14. Rotordynamics on the PC: Transient Analysis With ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

    Personal computers can now do many jobs that formerly required a large mainframe computer. An example is NASA Lewis Research Center's program Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze the dynamic motion of up to five rotating shafts. As originally written in the early 1980's, this program was considered large for the mainframe computers of the time. ARDS, which was written in Fortran 77, has been successfully ported to a 486 personal computer. Plots appear on the computer monitor via calls programmed for the original CALCOMP plotter; plots can also be output on a standard laser printer. The executable code, which uses the full array sizes of the mainframe version, easily fits on a high-density floppy disk. The program runs under DOS with an extended memory manager. In addition to transient analysis of blade loss, step turns, and base acceleration, with simulation of squeeze-film dampers and rubs, ARDS calculates natural frequencies and unbalance response.

  15. 1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.

  16. ARDS User Manual

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis. Their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady-state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension. Thus, exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  17. SPIRES Tailored to a Special Library: A Mainframe Answer for a Small Online Catalog.

    ERIC Educational Resources Information Center

    Newton, Mary

    1989-01-01

    Describes the design and functions of a technical library database maintained on a mainframe computer and supported by the SPIRES database management system. The topics covered include record structures, vocabulary control, input procedures, searching features, time considerations, and cost effectiveness. (three references) (CLB)

  18. Report of the Defense Science Board Task Force on Military Applications of New-Generation Computing Technologies.

    DTIC Science & Technology

    1984-12-01

    During the 1980's we are seeing enhancement of breadth, power, and accessibility of computers in many dimensions: (1) powerful, costly, fragile mainframes for... MEMORANDUM FOR THE CHAIRMAN, DEFENSE... SUBJECT: Defense Science Board Task Force on Supercomputer Applications. You are requested to

  19. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  20. Disciplines, models, and computers: the path to computational quantum chemistry.

    PubMed

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  1. An Evaluation of the Availability and Application of Microcomputer Software Programs for Use in Air Force Ground Transportation Squadrons

    DTIC Science & Technology

    1988-09-01

    software programs capable of being used on a microcomputer will be considered for analysis. No software intended for use on a miniframe or mainframe...Dial-A-Log consists of a program written in a computer language called L-10 that is run on a DEC-20 miniframe . The combination of the specific...proliferation of software dealing with microcomputers. Instead, they were geared more towards managing the use of miniframe or mainframe computer

  2. From micro to mainframe. A practical approach to perinatal data processing.

    PubMed

    Yeh, S Y; Lincoln, T

    1985-04-01

    A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.
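
    The division of labor described above, with a microcomputer capturing records and a mainframe reducing them to summary statistics, can be sketched as follows. The field names and sample values are illustrative, not the actual SODA entry form or SAS schema.

```python
# Sketch of the micro-for-entry / mainframe-for-reduction split described
# above: records are captured as simple delimited rows, then reduced to
# summary statistics. Field names and values are hypothetical.
import csv
import io
import statistics

# "data entry" side: rows as a microcomputer might store them on diskette
entry_file = io.StringIO("""\
patient_id,gestational_age_weeks,birth_weight_g
001,38,3200
002,40,3550
003,36,2700
""")

rows = list(csv.DictReader(entry_file))
weights = [int(r["birth_weight_g"]) for r in rows]

# "data reduction" side: the kind of summary a SAS job would report
print(len(rows), statistics.mean(weights), round(statistics.stdev(weights), 1))
```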

  3. Networking the Home and University: How Families Can Be Integrated into Proximate/Distant Computer Systems.

    ERIC Educational Resources Information Center

    Watson, J. Allen; And Others

    1989-01-01

    Describes study that was conducted to determine the feasibility of networking home microcomputers with a university mainframe system in order to investigate a new family process research paradigm, as well as the design and function of the microcomputer/mainframe system. Test instrumentation is described and systems' reliability and validity are…

  4. A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation

    PubMed Central

    Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.

    1984-01-01

    A low-cost, microcomputer-based local area network for medical office automation is described which makes use of an array of multiple, different personal computers interconnected by a local area network. Each computer on the network functions as a fully potent workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through “Gateways” the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.

  5. Hospital mainframe computer documentation of pharmacist interventions.

    PubMed

    Schumock, G T; Guenette, A J; Clark, T; McBride, J M

    1993-07-01

    The hospital mainframe computer pharmacist intervention documentation system described has successfully facilitated the recording, communication, analysis, and reporting of interventions at our hospital. It has proven to be time-efficient, accessible, and user-friendly from the standpoint of both the pharmacist and administrator. The advantages of this system greatly outweigh those of manual documentation and justify the initial time investment in its design and development. In the future, it is hoped that the system can have even broader impact. Documented interventions/recommendations can be made accessible to medical and nursing staff, and as such further increase interdepartmental communication. As pharmacists embrace the pharmaceutical care mandate, documenting interventions in patient care will continue to grow in importance. Complete documentation is essential if pharmacists are to assume responsibility for patient outcomes. With time at an ever-increasing premium, and with economic and human resources dwindling, an efficient and effective means of recording and tracking pharmacist interventions will become imperative for survival in the fiscally challenged health care arena. Documentation of pharmacist intervention using a hospital mainframe computer at UIH has proven both efficient and effective.

  6. Workstations take over conceptual design

    NASA Technical Reports Server (NTRS)

    Kidwell, George H.

    1987-01-01

    Workstations provide sufficient computing memory and speed for early evaluations of aircraft design alternatives to identify those worthy of further study. It is recommended that the programming of such machines permit integrated calculations of the configuration and performance analysis of new concepts, along with the capability of changing up to 100 variables at a time and swiftly viewing the results. Computations can be augmented through links to mainframes and supercomputers. Programming, particularly debugging, is enhanced by the capability of working with one program line at a time and having on-screen error indices available. Workstation networks permit on-line communication among users and with persons and computers outside the facility. Application of these capabilities is illustrated through a description of NASA-Ames design efforts for an oblique wing for a jet, performed on a MicroVAX network.

  7. PERSONAL COMPUTERS AND ENVIRONMENTAL ENGINEERING

    EPA Science Inventory

    This article discusses how personal computers can be applied to environmental engineering. After explaining some of the differences between mainframe and personal computers, we will review the development of personal computers and describe the areas of data management, interactive...

  8. Advanced On-the-Job Training System: System Specification

    DTIC Science & Technology

    1990-05-01

    3.1.5.2.10 Evaluation Subsystem support for the Training Development and Delivery Subsystem ..... 22 3.1.5.2.11 Training Development and Delivery Subsystem ... e. Alsys Ada compiler f. Ethernet Local Area Network reference manual(s) g. Infotron 992 network reference manual(s) h. Computer Program Source...1989 a. Daily check of mainframe components, including all elements critical to support the terminal network. b. Restoration of mainframe equipment

  9. A data acquisition and storage system for the ion auxiliary propulsion system cyclic thruster test

    NASA Technical Reports Server (NTRS)

    Hamley, John A.

    1989-01-01

    A nine-track tape drive interfaced to a standard personal computer was used to transport data from a remote test site to the NASA Lewis mainframe computer for analysis. The Cyclic Ground Test of the Ion Auxiliary Propulsion System (IAPS), which successfully achieved its goal of 2557 cycles and 7057 hr of beam-on time, generated several megabytes of test data over many months of continuous testing. A flight-like controller and power supply were used to control the thruster and acquire data. Thruster data were converted to RS232 format and transmitted to a personal computer, which stored the raw digital data on the nine-track tape. The tape format was such that, with minor modifications, mainframe flight data analysis software could be used to analyze the Cyclic Ground Test data. The personal computer also converted the digital data to engineering units and displayed real-time thruster parameters. Hardcopy data were printed at a rate dependent on thruster operating conditions. The tape drive provided a convenient means to transport the data to the mainframe for analysis and avoided a development effort for new data analysis software for the Cyclic test. This paper describes the data system, interfacing, and software requirements.
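
    The conversion of raw digital telemetry to engineering units mentioned above can be sketched as a simple linear calibration per channel; the channel names, scale factors, and offsets below are hypothetical, not IAPS values.

```python
# Sketch of converting raw digital thruster telemetry to engineering
# units, as in the IAPS ground-test data system described above.
# Channel names, scale factors, and offsets are hypothetical.

CHANNELS = {
    # name: (scale, offset) for linear conversion: value = scale*raw + offset
    "beam_voltage_V":  (0.5,   0.0),
    "beam_current_mA": (0.02,  0.0),
    "tank_temp_C":     (0.1, -50.0),
}

def to_engineering_units(name, raw_counts):
    """Apply the channel's linear calibration to a raw ADC count."""
    scale, offset = CHANNELS[name]
    return scale * raw_counts + offset

print(to_engineering_units("beam_voltage_V", 2400))   # 1200.0
print(to_engineering_units("tank_temp_C", 750))       # 25.0
```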

  10. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  11. Transferring data from an oscilloscope to an IBM using an Apple II+

    NASA Technical Reports Server (NTRS)

    Miller, D. L.; Frenklach, M. Y.; Laughlin, P. J.; Clary, D. W.

    1984-01-01

    A set of PASCAL programs permitting the use of a laboratory microcomputer to facilitate and control the transfer of data from a digital oscilloscope (used with photomultipliers in experiments on soot formation in hydrocarbon combustion) to a mainframe computer and the subsequent mainframe processing of these data is presented. Advantages of this approach include the possibility of on-line computations, transmission flexibility, automatic transfer and selection, increased capacity and analysis options (such as smoothing, averaging, Fourier transformation, and high-quality plotting), and more rapid availability of results. The hardware and software are briefly characterized, the programs are discussed, and printouts of the listings are provided.
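
    Among the analysis options listed above, smoothing is the simplest to illustrate: a centered moving average over a digitized trace can be sketched as follows, with the window size and sample data chosen purely for illustration.

```python
# Sketch of the "smoothing" analysis option mentioned above: a simple
# centered moving average over a digitized oscilloscope trace.
# Window size and sample data are illustrative.

def moving_average(samples, window=3):
    """Centered moving average; endpoints keep their original values."""
    half = window // 2
    out = list(samples)
    for i in range(half, len(samples) - half):
        out[i] = sum(samples[i - half:i + half + 1]) / window
    return out

trace = [0.0, 1.0, 5.0, 1.0, 0.0]   # a noisy spike
print([round(v, 2) for v in moving_average(trace)])   # [0.0, 2.0, 2.33, 2.0, 0.0]
```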

  12. Mass Storage Systems.

    ERIC Educational Resources Information Center

    Ranade, Sanjay; Schraeder, Jeff

    1991-01-01

    Presents an overview of the mass storage market and discusses mass storage systems as part of computer networks. Systems for personal computers, workstations, minicomputers, and mainframe computers are described; file servers are explained; system integration issues are raised; and future possibilities are suggested. (LRW)

  13. COMPUTER MODELS/EPANET

    EPA Science Inventory

    Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...

  14. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  15. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing the same software as large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  16. 36 CFR 1236.2 - What definitions apply to this part?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...

  17. 36 CFR 1236.2 - What definitions apply to this part?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...

  18. 36 CFR 1236.2 - What definitions apply to this part?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...

  19. 36 CFR 1236.2 - What definitions apply to this part?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...

  20. Organizational Strategies for End-User Computing Support.

    ERIC Educational Resources Information Center

    Blackmun, Robert R.; And Others

    1988-01-01

    Effective support for end users of computers has been an important issue in higher education from the first applications of general purpose mainframe computers through minicomputers, microcomputers, and supercomputers. The development of end user support is reviewed and organizational models are examined. (Author/MLW)

  1. Supervisors with Micros: Trends and Training Needs.

    ERIC Educational Resources Information Center

    Bryan, Leslie A., Jr.

    1986-01-01

    Results of a study conducted by Purdue University concerning the use of computers by supervisors in manufacturing firms are presented and discussed. Examines access to computers, minicomputers versus mainframes, training time on computers, replacement of staff, creation of personnel problems, and training methods. (CT)

  2. Experience with a UNIX based batch computing facility for H1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhards, R.; Kruener-Marquis, U.; Szkutnik, Z.

    1994-12-31

    A UNIX based batch computing facility for the H1 experiment at DESY is described. The ultimate goal is to replace the DESY IBM mainframe by a multiprocessor SGI Challenge series computer, using the UNIX operating system, for most of the computing tasks in H1.

  3. Specification of Computer Systems by Objectives.

    ERIC Educational Resources Information Center

    Eltoft, Douglas

    1989-01-01

    Discusses the evolution of mainframe and personal computers, and presents a case study of a network developed at the University of Iowa called the Iowa Computer-Aided Engineering Network (ICAEN) that combines Macintosh personal computers with Apollo workstations. Functional objectives are stressed as the best measure of system performance. (LRW)

  4. 36 CFR 1236.2 - What definitions apply to this part?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...

  5. Using a Cray Y-MP as an array processor for a RISC Workstation

    NASA Technical Reports Server (NTRS)

    Lamaster, Hugh; Rogallo, Sarah J.

    1992-01-01

    As microprocessors increase in power, the economics of centralized computing has changed dramatically. At the beginning of the 1980's, mainframes and supercomputers were often considered to be cost-effective machines for scalar computing. Today, microprocessor-based RISC (reduced-instruction-set computer) systems have displaced many uses of mainframes and supercomputers. Supercomputers are still cost competitive when processing jobs that require both large memory size and high memory bandwidth. One such application is array processing. Certain numerical operations are appropriate to use in a Remote Procedure Call (RPC)-based environment. Matrix multiplication is an example of an operation that can have a sufficient number of arithmetic operations to amortize the cost of an RPC call. An experiment demonstrating that matrix multiplication can be executed remotely on a large system, speeding execution over that experienced on a workstation, is described.
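
    The amortization argument in this abstract can be made concrete: an n x n matrix multiply performs about 2n^3 arithmetic operations but moves only 3n^2 matrix elements, so the compute-to-communication ratio grows linearly with n. The sketch below estimates local versus remote execution time; the machine rates and link speed are invented round numbers for illustration, not measurements from the Cray Y-MP experiment.

```python
def remote_matmul_payoff(n, local_mflops=10.0, remote_mflops=200.0,
                         link_mbytes_per_s=1.0, bytes_per_word=8):
    """Estimate local vs. remote (RPC) time for an n x n matrix multiply.

    All rates are hypothetical round numbers, not figures measured
    in the experiment described in the abstract.
    """
    flops = 2.0 * n**3                      # multiply-adds in C = A * B
    words_moved = 3.0 * n**2                # ship A and B, return C
    t_local = flops / (local_mflops * 1e6)
    t_remote = (flops / (remote_mflops * 1e6)
                + words_moved * bytes_per_word / (link_mbytes_per_s * 1e6))
    return t_local, t_remote

for n in (64, 256, 1024):
    t_l, t_r = remote_matmul_payoff(n)
    print(f"n={n:5d}  local {t_l:9.3f}s  remote {t_r:9.3f}s")
```

    With these assumed rates, small matrices are faster locally (the RPC transfer dominates) while large ones pay off remotely, which is the break-even behavior the abstract describes.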

  6. The Ghost of Computers Past, Present, and Future: Computer Use for Preservice/Inservice Reading Programs.

    ERIC Educational Resources Information Center

    Prince, Amber T.

    Computer assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration, provide simulation lessons on diagnosing reading difficulties, construct informal…

  7. Networked Microcomputers--The Next Generation in College Computing.

    ERIC Educational Resources Information Center

    Harris, Albert L.

    The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…

  8. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, workstations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, workstations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  9. Cooperative processing data bases

    NASA Technical Reports Server (NTRS)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  10. Integrated Computer-Aided Drafting Instruction (ICADI).

    ERIC Educational Resources Information Center

    Chen, C. Y.; McCampbell, David H.

    Until recently, computer-aided drafting and design (CAD) systems were almost exclusively operated on mainframes or minicomputers and their cost prohibited many schools from offering CAD instruction. Today, many powerful personal computers are capable of performing the high-speed calculation and analysis required by the CAD application; however,…

  11. Working with Computers: Computer Orientation for Foreign Students.

    ERIC Educational Resources Information Center

    Barlow, Michael

    Designed as a resource for foreign students, this book includes instructions not only on how to use computers, but also on how to use them to complete academic work more efficiently. Part I introduces the basic operations of mainframes and microcomputers and the major areas of computing, i.e., file management, editing, communications, databases,…

  12. Computers, Remote Teleprocessing and Mass Communication.

    ERIC Educational Resources Information Center

    Cropley, A. J.

    Recent developments in computer technology are reducing the limitations of computers as mass communication devices. The growth of remote teleprocessing is one important step. Computers can now interact with users via terminals which may be hundreds of miles from the actual mainframe machine. Many terminals can be in operation at once, so that many…

  13. The transition of GTDS to the Unix workstation environment

    NASA Technical Reports Server (NTRS)

    Carter, D.; Metzinger, R.; Proulx, R.; Cefola, P.

    1995-01-01

    Future Flight Dynamics systems should take advantage of the possibilities provided by current and future generations of low-cost, high performance workstation computing environments with Graphical User Interface. The port of the existing mainframe Flight Dynamics systems to the workstation environment offers an economic approach for combining the tremendous engineering heritage that has been encapsulated in these systems with the advantages of the new computing environments. This paper will describe the successful transition of the Draper Laboratory R&D version of GTDS (Goddard Trajectory Determination System) from the IBM Mainframe to the Unix workstation environment. The approach will be a mix of historical timeline notes, descriptions of the technical problems overcome, and descriptions of associated SQA (software quality assurance) issues.

  14. A note on the computation of antenna-blocking shadows

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1993-01-01

    A simple and readily applied method is provided to compute the shadow on the main reflector of a Cassegrain antenna, when cast by the subreflector and the subreflector supports. The method entails some convenient minor approximations that will produce results similar to results obtained with a lengthier, mainframe computer program.

  15. The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.

    ERIC Educational Resources Information Center

    Rumble, Greville

    Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…

  16. Computer Yearbook 72.

    ERIC Educational Resources Information Center

    1972

    Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…

  17. Unix becoming healthcare's standard operating system.

    PubMed

    Gardner, E

    1991-02-11

    An unfamiliar buzzword is making its way into healthcare executives' vocabulary, as well as their computer systems. Unix is being touted by many industry observers as the most likely candidate to be a standard operating system for minicomputers, mainframes and computer networks.

  18. The DART dispersion analysis research tool: A mechanistic model for predicting fission-product-induced swelling of aluminum dispersion fuels. User`s guide for mainframe, workstation, and personal computer applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rest, J.

    1995-08-01

    This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to aluminum fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.

  19. Closely Spaced Independent Parallel Runway Simulation.

    DTIC Science & Technology

    1984-10-01

    facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. CENTRAL COMPUTER FACILITY. The Central... Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition...in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where

  20. Using Microcomputers for Communication. Summary Report: Sociology 110 Distance Education Pilot Project.

    ERIC Educational Resources Information Center

    Misanchuk, Earl R.

    A pilot project involved off-campus (distance education) students creating their assignments on Macintosh computers and "mailing" them electronically to a campus mainframe computer. The goal of the project was to determine what is necessary to implement and to evaluate the potential of computer communications for university-level…

  1. Using Microcomputers Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  2. Technology in the College Classroom.

    ERIC Educational Resources Information Center

    Earl, Archie W., Sr.

    An analysis was made of the use of computing tools at the graduate and undergraduate levels in colleges and universities in the United States. Topics ranged from hand-held calculators to the use of mainframe computers and the assessment of the SPSSX, SPSS, LINDO, and MINITAB computer software packages. Hand-held calculators are being increasingly…

  3. Wireless Computers: Radio and Light Communications May Bring New Freedom to Computing.

    ERIC Educational Resources Information Center

    Hartmann, Thom

    1984-01-01

    Describes systems which use wireless terminals to communicate with mainframe computers or minicomputers via radio band, discusses their limitations, and gives examples of networks using such systems. The use of communications satellites to increase their range and the possibility of using light beams to transmit data are also discussed. (MBR)

  4. Microcomputers: Communication Software. Evaluation Guides. Guide Number 13.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This guide discusses four types of microcomputer-based communication programs that could prove useful to evaluators: (1) the direct communication of information generated by one computer to another computer; (2) using the microcomputer as a terminal to a mainframe computer to input, direct the analysis of, and/or output data using a statistical…

  5. An Automated Approach to Departmental Grant Management.

    ERIC Educational Resources Information Center

    Kressly, Gaby; Kanov, Arnold L.

    1986-01-01

    Installation of a small computer and the use of specially designed programs has proven a cost-effective solution to the data processing needs of a university medical center's ophthalmology department, providing immediate access to grants accounting information and avoiding dependence on the institution's mainframe computer. (MSE)

  6. NASA Advanced Supercomputing (NAS) User Services Group

    NASA Technical Reports Server (NTRS)

    Pandori, John; Hamilton, Chris; Niggley, C. E.; Parks, John W. (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides an overview of NAS (NASA Advanced Supercomputing), its goals, and its mainframe computer assets. Also covered are its functions, including systems monitoring and technical support.

  7. SIPP ACCESS: Information Tools Improve Access to National Longitudinal Panel Surveys.

    ERIC Educational Resources Information Center

    Robbin, Alice; David, Martin

    1988-01-01

    A computer-based, integrated information system incorporating data and information about the data, SIPP ACCESS systematically links technologies of laser disk, mainframe computer, microcomputer, and electronic networks, and applies relational technology to provide access to information about complex statistical data collections. Examples are given…

  8. The microcomputer workstation - An alternate hardware architecture for remotely sensed image analysis

    NASA Technical Reports Server (NTRS)

    Erickson, W. K.; Hofman, L. B.; Donovan, W. E.

    1984-01-01

    Digital image analysis of remotely sensed imagery can be difficult because of the extensive calculations required. In the past, an expensive large to medium mainframe computer system was needed to perform these calculations. For image-processing applications, smaller minicomputer-based systems are now used by many organizations; the costs for such systems are still in the range of $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appeared to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and the design and implementation of a workstation network.

  9. PLATO Based Computer Assisted Instruction: An Exploration.

    ERIC Educational Resources Information Center

    Wise, Richard L.

    This study focuses on student response to computer-assisted instruction (CAI) after it was introduced into a college level physical geography course, "Introduction to Weather and Climate." PLATO, a University of Illinois mainframe network developed in the 1960s, was selected for its user friendliness, its large supply of courseware, its…

  10. GSTARS computer models and their applications, part I: theoretical development

    USGS Publications Warehouse

    Yang, C.T.; Simoes, F.J.M.

    2008-01-01

    GSTARS is a series of computer models developed by the U.S. Bureau of Reclamation for alluvial river and reservoir sedimentation studies while the authors were employed by that agency. The first version of GSTARS was released in 1986 using Fortran IV for mainframe computers. GSTARS 2.0 was released in 1998 for personal computer application with most of the code in the original GSTARS revised, improved, and expanded using Fortran IV/77. GSTARS 2.1 is an improved and revised GSTARS 2.0 with graphical user interface. The unique features of all GSTARS models are the conjunctive use of the stream tube concept and of the minimum stream power theory. The application of minimum stream power theory allows the determination of optimum channel geometry with variable channel width and cross-sectional shape. The use of the stream tube concept enables the simulation of river hydraulics using one-dimensional numerical solutions to obtain a semi-two-dimensional presentation of the hydraulic conditions along and across an alluvial channel. According to the stream tube concept, no water or sediment particles can cross the walls of stream tubes, which is valid for many natural rivers. At and near sharp bends, however, sediment particles may cross the boundaries of stream tubes. GSTARS3, based on FORTRAN 90/95, addresses this phenomenon and further expands the capabilities of GSTARS 2.1 for cohesive and non-cohesive sediment transport in rivers and reservoirs. This paper presents the concepts, methods, and techniques used to develop the GSTARS series of computer models, especially GSTARS3. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  11. Economic Comparison of Processes Using Spreadsheet Programs

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.

    1986-01-01

    Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.

  12. Real-time data system: Incorporating new technology in mission critical environments

    NASA Technical Reports Server (NTRS)

    Muratore, John F.; Heindel, Troy A.

    1990-01-01

    If the Space Station Freedom is to remain viable over its 30-year life span, it must be able to incorporate new information systems technologies. These technologies are necessary to enhance mission effectiveness and to enable new NASA missions, such as supporting the Lunar-Mars Initiative. High-definition television (HDTV), neural nets, model-based reasoning, advanced languages, CPU designs, and computer networking standards are areas which have been forecasted to make major strides in the next 30 years. A major challenge to NASA is to bring these technologies online without compromising mission safety. In past programs, NASA managers have been understandably reluctant to rely on new technologies for mission critical activities until they are proven in noncritical areas. NASA must develop strategies to allow inflight confidence building and migration of technologies into the trusted tool base. NASA has successfully met this challenge and developed a winning strategy in the Space Shuttle Mission Control Center. This facility, which is clearly among NASA's most critical, is based on 1970s mainframe architecture. Changes to the mainframe are very expensive due to the extensive testing required to prove that changes do not have unanticipated impact on critical processes. Systematic improvement efforts in this facility have been delayed due to this 'risk to change.' In the real-time data system (RTDS) we have introduced a network of engineering computer workstations which run in parallel to the mainframe system. These workstations are located next to flight controller operating positions in mission control and, in some cases, the display units are mounted in the traditional mainframe consoles. This system incorporates several major improvements over the mainframe consoles including automated fault detection by real-time expert systems and color graphic animated schematics of subsystems driven by real-time telemetry.
The workstations have the capability of recording telemetry data and providing 'instant replay' for flight controllers. RTDS also provides unique graphics animated by real-time telemetry such as workstation emulation of the shuttle's flight instruments and displays of the remote manipulator system (RMS) position. These systems have been used successfully as prime operational tools since STS-26 and have supported seven shuttle missions.
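
    The automated fault detection by real-time expert systems described in this abstract can be illustrated with a minimal rule-based limit check over a telemetry snapshot. The parameter names and limits below are invented for illustration; they are not actual Shuttle telemetry or RTDS rules.

```python
# Each rule: (parameter, low limit, high limit, fault message).
# All names and limits here are hypothetical, not real telemetry.
RULES = [
    ("fuel_cell_1_volts", 27.5, 32.5, "Fuel cell 1 voltage out of limits"),
    ("cabin_press_psia", 13.9, 15.2, "Cabin pressure out of limits"),
    ("freon_loop_temp_f", 32.0, 65.0, "Freon loop temperature out of limits"),
]

def check_telemetry(frame):
    """Return fault messages for any parameter missing or out of limits."""
    faults = []
    for name, lo, hi, msg in RULES:
        value = frame.get(name)
        if value is None or not (lo <= value <= hi):
            faults.append(f"{msg}: {value}")
    return faults

frame = {"fuel_cell_1_volts": 26.8,    # below the assumed 27.5 V floor
         "cabin_press_psia": 14.7,
         "freon_loop_temp_f": 48.0}
print(check_telemetry(frame))
```

    A production expert system would add persistence checks, rule chaining, and sensor-validity logic; the sketch only shows the shape of a telemetry-driven rule scan.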

  13. An Assessment of Security Vulnerabilities Comprehension of Cloud Computing Environments: A Quantitative Study Using the Unified Theory of Acceptance and Use

    ERIC Educational Resources Information Center

    Venkatesh, Vijay P.

    2013-01-01

    The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…

  14. Cloud Computing and the Power to Choose

    ERIC Educational Resources Information Center

    Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo

    2010-01-01

    Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…

  15. The Value Proposition in Institutional Repositories

    ERIC Educational Resources Information Center

    Blythe, Erv; Chachra, Vinod

    2005-01-01

    In the education and research arena of the late 1970s and early 1980s, a struggle developed between those who advocated centralized, mainframe-based computing and those who advocated distributed computing. Ultimately, the debate reduced to whether economies of scale or economies of scope are more important to the effectiveness and efficiency of…

  16. In-House Automation of a Small Library Using a Mainframe Computer.

    ERIC Educational Resources Information Center

    Waranius, Frances B.; Tellier, Stephen H.

    1986-01-01

    An automated library routine management system was developed in-house to create a system unique to the Library and Information Center, Lunar and Planetary Institute, Houston, Texas. A modular approach was used to allow continuity in operations and services as the system was implemented. Acronyms, computer accounts, and file names are appended.…

  17. How to Get from Cupertino to Boca Raton.

    ERIC Educational Resources Information Center

    Troxel, Duane K.; Chiavacci, Jim

    1985-01-01

    Describes seven methods to transfer data from Apple computer disks to IBM computer disks and vice versa: print out data and retype; use a commercial software package, optical-character reader, homemade cable, or modem to pass or transfer data directly; pay commercial data-transfer service; or store files on mainframe and download. (MBR)

  18. Don't Gamble with Y2K Compliance.

    ERIC Educational Resources Information Center

    Sturgeon, Julie

    1999-01-01

    Examines one school district's (Clark County, Nevada) response to the Y2K computer problem and provides tips on time-saving Y2K preventive measures other school districts can use. Explains how the district debugged its computer system, including mainframe considerations and client-server applications. Highlights office equipment and teaching…

  19. International Futures (IFs): A Global Issues Simulation for Teaching and Research.

    ERIC Educational Resources Information Center

    Hughes, Barry B.

    This paper describes the International Futures (IFs) computer assisted simulation game for use with undergraduates. Written in Standard Fortran IV, the model currently runs on mainframe or mini computers, but has not been adapted for micros. It has been successfully installed on Harris, Burroughs, Telefunken, CDC, Univac, IBM, and Prime machines.…

  20. PLATO and the English Curriculum.

    ERIC Educational Resources Information Center

    Macgregor, William B.

    PLATO differs from other computer assisted instruction in that it is truly a system, employing a powerful mainframe computer and connecting its users to each other and to the people running it. The earliest PLATO materials in English were drill and practice programs, an improvement over written texts, but a small one. Unfortunately, game lessons,…

  1. Organizational Communication: Theoretical Implications of Communication Technology Applications.

    ERIC Educational Resources Information Center

    Danowski, James A.

    Communication technology (CT), which involves the use of computers in private and group communication, has had a major impact on theory and research in organizational communication over the past 30 years. From the 1950s to the early 1970s, mainframe computers were seen as managerial tools in creating more centralized organizational structures.…

  2. Officer Computer Utilization Report

    DTIC Science & Technology

    1992-03-01

    Shipboard Non-tactical ADP Program (SNAP), Navy Intelligence Processing System (NIPS), Retail Operation Management (ROM)). Mainframe - An extremely...ADP Program (SNAP), Navy Intelligence Processing System (NIPS), Retail Operation Management (ROM), etc.) 7. Technical/tactical systems (e.g

  3. Report on the Acceptance Test of the CRI Y-MP 8128, 10 February - 12 March 1990

    NASA Technical Reports Server (NTRS)

    Carter, Russell; Kutler, Paul (Technical Monitor)

    1998-01-01

    The NAS Numerical Aerodynamic Simulation Facility's HSP 2 computer system, a CRI Y-MP 832 SN #1002, underwent a major hardware upgrade in February of 1990. The 32 MWord, 6.3 ns mainframe component of the system was replaced with a 128 MWord, 6.0 ns CRI Y-MP 8128 mainframe, SN #1030. A 30 day Acceptance Test of the computer system was performed by the NAS RND HSP group from 08:00 February 10, 1990 to 08:00 March 12, 1990. Overall responsibility for the RND HSP Acceptance Test was assumed by Duane Carbon. The terms of the contract required that the SN #1030 achieve an effectiveness level of greater than or equal to ninety (90) percent for 30 consecutive days within a 60 day time frame. After the first thirty days, the effectiveness level of SN #1030 was 94.4 percent, hence the acceptance test was passed.
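
    The 94.4 percent figure in this report is an effectiveness level over the 30-day window. A minimal sketch of the arithmetic (productive hours divided by scheduled hours), assuming a simple uptime/downtime accounting rather than the contract's actual weighting rules:

```python
def effectiveness(total_hours, downtime_hours):
    """Effectiveness level (%) = productive hours / scheduled hours."""
    return 100.0 * (total_hours - downtime_hours) / total_hours

# 30-day window is 720 scheduled hours; under this simplified
# accounting a 94.4% result would imply roughly 40 hours lost.
total = 30 * 24
print(round(effectiveness(total, 40.3), 1))  # → 94.4
```

    The 90 percent contractual threshold then corresponds, under the same simplification, to at most 72 hours of downtime in the 720-hour window.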

  4. Development of 3-Year Roadmap to Transform the Discipline of Systems Engineering

    DTIC Science & Technology

    2010-03-31

    quickly humans could physically construct them. Indeed, magnetic core memory was entirely constructed by human hands until it was superseded by...For their mainframe computers, IBM develops the applications, operating system, computer hardware and microprocessors (off the shelf standard memory ...processor developers work on potential computational and memory pipelines to support the required performance capabilities and use the available transistors

  5. STATLIB: NSWC Library of Statistical Programs and Subroutines

    DTIC Science & Technology

    1989-08-01

    Uncorrelated Weighted Polynomial Regression 41 .WEPORC Correlated Weighted Polynomial Regression 45 MROP Multiple Regression Using Orthogonal Polynomials could not and should not be converted to the new general purpose computer (the current CDC 995). Some were designed to compute...personal computers. They are referred to as SPSSPC+, BMDPC, and SASPC and in general are less comprehensive than their mainframe counterparts. The basic

  6. Extending the Human Mind: Computers in Education. Program and Proceedings of the Annual Summer Computer Conference (6th, Eugene, Oregon, August 6-9, 1987).

    ERIC Educational Resources Information Center

    Oregon Univ., Eugene. Center for Advanced Technology in Education.

    Presented in this program and proceedings are the following 27 papers describing a variety of educational uses of computers: "Learner Based Tools for Special Populations" (Barbara Allen); "Micros and Mainframes: Practical Administrative Applications at the Building Level" (Jeannine Bertrand and Eric Schiff); "Logo and Logowriter in the Curriculum"…

  7. Computer ray tracing speeds.

    PubMed

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark, which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
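
    The LINPACK correlation described in this abstract amounts to fitting measured ray-trace throughput against LINPACK ratings and then predicting unmeasured machines from the fit. A least-squares sketch, using invented (LINPACK MFLOPS, rays per second) pairs rather than the paper's actual measurements:

```python
# Hypothetical (LINPACK MFLOPS, rays traced per second) pairs;
# the numbers are invented to illustrate the fitting idea, not
# taken from the fifty-seven measured configurations.
data = [(0.5, 120.0), (2.0, 500.0), (10.0, 2400.0), (40.0, 9800.0)]

# Least-squares slope for a line through the origin: rays ~ k * mflops
num = sum(m * r for m, r in data)
den = sum(m * m for m, r in data)
k = num / den

def estimate_rays_per_s(linpack_mflops):
    """Predict ray trace speed from a machine's LINPACK rating."""
    return k * linpack_mflops

print(round(estimate_rays_per_s(25.0)))
```

    A through-the-origin fit is the simplest model consistent with "speed scales with LINPACK"; a fit with an intercept, or separate fits per machine class, would capture the compute-bound versus memory-bound distinction the abstract notes.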

  8. Ps and Cs of PCs.

    ERIC Educational Resources Information Center

    Raitt, David I.

    1987-01-01

    Considers pros and cons of using personal computers or microcomputers in a library and information setting. Highlights include discussions about the physical environment, security, effects on users, costs in terms of time and money, micro-mainframe links, and standardization considerations. (Author/LRW)

  9. The Computerized Reference Department: Buying the Future.

    ERIC Educational Resources Information Center

    Kriz, Harry M.; Kok, Victoria T.

    1985-01-01

    Basis for systematic computerization of academic research library's reference, collection development, and collection management functions emphasizes productivity enhancement for librarians and support staff. Use of microcomputer and university's mainframe computer to develop applications of database management systems, electronic spreadsheets,…

  10. Micro and Mainframe Computer Models for Improved Planning in Awarding Financial Aid to Disadvantaged Students.

    ERIC Educational Resources Information Center

    Attinasi, Louis C., Jr.; Fenske, Robert H.

    1988-01-01

    Two computer models used at Arizona State University recognize the tendency of students from low-income and minority backgrounds to apply for assistance late in the funding cycle. They permit administrators to project the amount of aid needed by such students. The Financial Aid Computerized Tracking System is described. (Author/MLW)

  11. An Introduction To PC-TRIM.

    Treesearch

    John R. Mills

    1989-01-01

The timber resource inventory model (TRIM) has been adapted to run on personal computers. The personal computer version of TRIM (PC-TRIM) is more widely used than its mainframe parent. Errors that existed in previous versions of TRIM have been corrected. Information is presented to help users with program input and output management in the DOS environment, to...

  12. Designing Programs for Multiple Configurations: "You Mean Everyone Doesn't Have a Pentium or Better!"

    ERIC Educational Resources Information Center

    Conkright, Thomas D.; Joliat, Judy

    1996-01-01

    Discusses the challenges, solutions, and compromises involved in creating computer-delivered training courseware for Apollo Travel Services, a company whose 50,000 agents must access a mainframe from many different computing configurations. Initial difficulties came in trying to manage random access memory and quicken response time, but the future…

  13. 24 CFR 15.110 - What fees will HUD charge?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

... duplicating machinery. The computer run time includes the cost of operating a central processing unit for that... Applies. (6) Computer run time (includes only mainframe search time, not printing) The direct cost of... estimated fee is more than $250.00 or you have a history of failing to pay FOIA fees to HUD in a timely...

  14. Intelligent buildings.

    PubMed

    Williams, W E

    1987-01-01

The maturing of technologies in computer capabilities, particularly direct digital signals, has provided an exciting variety of new communication and facility control opportunities. These include telecommunications, energy management systems, security systems, office automation systems, local area networks, and video conferencing. New applications are developing continuously. The so-called "intelligent" or "smart" building concept evolves from the development of this advanced technology in building environments. Automation has had a dramatic effect on facility planning. For decades, communications were limited to the telephone, the typewritten message, and copy machines. The office itself and its functions had been essentially unchanged for decades. Office automation systems began to surface during the energy crisis and, although their newer technology was timely, they were, for the most part, designed separately from other new building systems. For example, most mainframe computer systems were originally stand-alone, as were word processing installations. In the last five years, the advances in distributed systems, networking, and personal computer capabilities have provided opportunities to make such dramatic improvements in productivity that the Selectric typewriter has gone from being the most advanced piece of office equipment to nearly total obsolescence.

  15. Computing at DESY — current setup, trends and strategic directions

    NASA Astrophysics Data System (ADS)

    Ernst, Michael

    1998-05-01

Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing PC management and support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY at the current rate of about 30 per month will otherwise absorb all available manpower in central computing and still leave hundreds of users unsupported. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  16. The Changing Environment of Management Information Systems.

    ERIC Educational Resources Information Center

    Tagawa, Ken

    1982-01-01

    The promise of mainframe computers in the 1970s for management information systems (MIS) is largely unfulfilled, and newer office automation systems and data communication systems are designed to be responsive to MIS needs. The status of these innovations is briefly outlined. (MSE)

  17. Macroeconomic Activity Module - NEMS Documentation

    EIA Publications

    2016-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 2016 (AEO2016). The report catalogues and describes the module assumptions, computations, methodology, parameter estimation techniques, and mainframe source code

  18. A Study of Organizational Downsizing and Information Management Strategies.

    DTIC Science & Technology

    1992-09-01

Projected $100,000 per month savings at plants from using miniframes. Networked. Standardized "best practices." 3 mainframes and applications. Whatever worked...The projected $1.2 million savings realized from going from mainframes to miniframes is to avoid having to reduce the budget by that amount in other...network hardware. Turned in mainframe and replaced it with two miniframes; networked new minis with systems at plants. Networked mainframes and PCs. Acquired

  19. Recipe for Regional Development.

    ERIC Educational Resources Information Center

    Baldwin, Fred D.

    1994-01-01

    The Ceramics Corridor has created new jobs in New York's Appalachian region by fostering ceramics research and product development by small private companies. Corridor business incubators offer tenants low overhead costs, fiber-optic connections to Alfred University's mainframe computer, rental of lab space, and use of equipment small companies…

  20. Hardware Development for a Mobile Educational Robot.

    ERIC Educational Resources Information Center

    Mannaa, A. M.; And Others

    1987-01-01

    Describes the development of a robot whose mainframe is essentially transparent and walks on four legs. Discusses various gaits in four-legged motion. Reports on initial trials of a full-sized model without computer-control, including smoothness of motion and actual obstacle crossing features. (CW)

  1. Providing Information Services in Videotex.

    ERIC Educational Resources Information Center

    Harris, Gary L.

    1986-01-01

    The provision of information through videotex in West Germany is described. Information programs and services of the Gesellschaft fur Information und Dokumentation (GID) and its cooperative partners are reviewed to illustrate program contents, a marketing strategy, and the application of gateway technology with mainframe and personal computers.…

  2. Electronic Campus Meets Today's Education Mission.

    ERIC Educational Resources Information Center

    Swalec, John J.; And Others

    Waubonsee Community College (WCC) employs electronic technology to meet the needs of its students and community in virtually every phase of campus operations. WCC's Information System Center, housing three mainframe computers, drives an online registration system, a computerized self-registration system that can be accessed by telephone from…

  3. TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...

  4. Interconnection requirements in avionic systems

    NASA Astrophysics Data System (ADS)

    Vergnolle, Claude; Houssay, Bruno

    1991-04-01

The future aircraft generation will have thousands of smart electromagnetic sensors distributed all over. Each sensor is connected by fiber links to the mainframe computer in charge of the real-time correlation of the signals. Such a computer must be compactly built and massively parallel: it needs 3-D optical free-space interconnects between neighbouring boards and reconfigurable interconnects via a holographic backplane. The optical interconnect facilities will also be used to build a fault-tolerant computer through large redundancy.

  5. Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory

    USGS Publications Warehouse

    Green, J. Wayne

    1991-01-01

This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data-base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.
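The hierarchical-to-relational redesign this report describes boils down to linking tables by a shared key instead of nesting records. A minimal illustration under assumed, invented field names (not the actual dBASE IV schema):

```python
# Hypothetical sketch of the relational WDSD layout: organizations in one
# table, their data-providing offices in another, linked by an org code.
# Field names and values are invented for illustration only.

organizations = [
    {"org_code": "A01", "org_type": "Federal", "orientation": "Surface water"},
    {"org_code": "B02", "org_type": "State", "orientation": "Ground water"},
]

offices = [
    {"org_code": "A01", "office": "District Office", "phone": "555-0100"},
    {"org_code": "A01", "office": "Field Office", "phone": "555-0101"},
    {"org_code": "B02", "office": "Central Office", "phone": "555-0200"},
]

def offices_for(org_code):
    """Relational join: return every office row matching an organization key."""
    return [row for row in offices if row["org_code"] == org_code]

print([row["office"] for row in offices_for("A01")])
```

In the hierarchical version the offices would be stored inside each organization record; the relational version keeps flat tables and recovers the nesting at query time through the key.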

  6. Workshop on Office Automation and Telecommunication: Applying the Technology.

    ERIC Educational Resources Information Center

    Mitchell, Bill

    This document contains 12 outlines that forecast the office of the future. The outlines cover the following topics: (1) office automation definition and objectives; (2) functional categories of office automation software packages for mini and mainframe computers; (3) office automation-related software for microcomputers; (4) office automation…

  7. Air traffic control : good progress on interim replacement for outage-plagued system, but risks can be further reduced

    DOT National Transportation Integrated Search

    1996-10-01

Certain air traffic control (ATC) centers experienced a series of major outages, some of which were caused by the Display Channel Complex or DCC, a mainframe computer system that processes radar and other data into displayable images on controlle...

  8. Fiber Optics and Library Technology.

    ERIC Educational Resources Information Center

    Koenig, Michael

    1984-01-01

    This article examines fiber optic technology, explains some of the key terminology, and speculates about the way fiber optics will change our world. Applications of fiber optics to library systems in three major areas--linkage of a number of mainframe computers, local area networks, and main trunk communications--are highlighted. (EJS)

  9. Colleges' Effort To Prepare for Y2K May Yield Benefits for Many Years.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2000-01-01

    Suggests that the money spent ($100 billion) to fix the Y2K bug in the United States resulted in improved campus computer systems. Reports from campuses around the country indicate that both mainframe and desktop systems experienced fewer problems than expected. (DB)

  10. Modified Laser and Thermos cell calculations on microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.

    1987-01-01

In the course of designing and operating nuclear reactors, many fuel pin cell calculations are required to obtain homogenized cell cross sections as a function of burnup. In the interest of convenience and cost, it would be very desirable to be able to make such calculations on microcomputers. In addition, such a microcomputer code would be very helpful for educational course work in reactor computations. To establish the feasibility of making detailed cell calculations on a microcomputer, a mainframe cell code was compiled and run on a microcomputer. The computer code Laser, originally written in Fortran IV for the IBM-7090 class of mainframe computers, is a cylindrical, one-dimensional, multigroup lattice cell program that includes burnup. It is based on the MUFT code for epithermal and fast group calculations, and Thermos for the thermal calculations. There are 50 fast and epithermal groups and 35 thermal groups. Resonances are calculated assuming a homogeneous system and then corrected for self-shielding, Dancoff, and Doppler by self-shielding factors. The Laser code was converted to run on a microcomputer. In addition, the Thermos portion of Laser was extracted and compiled separately to have available a stand-alone thermal code.

  11. Computing Services and Assured Computing

    DTIC Science & Technology

    2006-05-01

fighters’ ability to execute the mission.” Computing Services: We run IT systems that provide medical care, pay the warfighters, manage maintenance...users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • Virtually every type of mainframe and...

  12. Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution

    PubMed Central

    Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.

    1987-01-01

    Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS- the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.
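The MAPS pattern described in this abstract (a PC request names an existing host application, which a gateway routes to and executes unchanged) can be sketched as a name-based dispatcher. The registry, application name, and return values below are invented for illustration, not part of the Duke system:

```python
# Hypothetical sketch of the MAPS idea: PC requests are routed by name to
# unmodified host applications through a small gateway registry.

host_applications = {}

def register(name):
    """Register an existing host application under its public name."""
    def wrap(fn):
        host_applications[name] = fn
        return fn
    return wrap

@register("patient-lookup")
def patient_lookup(patient_id):
    # Stand-in for an unmodified mainframe ADS application.
    return {"patient_id": patient_id, "status": "found"}

def handle_pc_request(app_name, *args):
    """Gateway: route a PC request to the named host application."""
    app = host_applications.get(app_name)
    if app is None:
        return {"error": f"unknown application {app_name!r}"}
    return app(*args)

print(handle_pc_request("patient-lookup", "12345"))
```

The point of such a design, as the abstract notes, is that the host application set is extended to the PC without modifying either side: the gateway adds routing, not logic.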

13. Developing Computer Software for Use in the Speech/Communications Classroom.

    ERIC Educational Resources Information Center

    Krauss, Beatrice J.

Appropriate software can turn the microcomputer from a dumb box into a teaching tool. One resource for finding appropriate software is the organization Edunet. It allows the user to access the mainframes of 18 major universities and has developed a communications network with 130 colleges. It also handles billing, does periodic software…

  14. 21. SITE BUILDING 002 SCANNER BUILDING LOOKING AT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. SITE BUILDING 002 - SCANNER BUILDING - LOOKING AT DISC STORAGE SYSTEMS A AND B (A OR B ARE REDUNDANT SYSTEMS), ONE MAINFRAME COMPUTER ON LINE, ONE ON STANDBY WITH STORAGE TAPE, ONE ON STANDBY WITHOUT TAPE INSTALLED. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA

  15. Microcomputers, Software and Foreign Languages for Special Purposes: An Analysis of TXTPRO.

    ERIC Educational Resources Information Center

    Tang, Michael S.

    TXTPRO, a computer program developed as a graduate-level research tool for descriptive linguistic analysis, produces simple alphabetic and word frequency lists, analyzes word combinations, and develops concordances. With modifications, a teacher could enter the program into a mainframe or a microcomputer and use it for text analyses to develop…

  16. Rotordynamics on the PC: Further Capabilities of ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

Rotordynamics codes for personal computers are now becoming available. One of the most capable codes is Analysis of RotorDynamic Systems (ARDS) which uses the component mode synthesis method to analyze a system of up to 5 rotating shafts. ARDS was originally written for a mainframe computer but has been successfully ported to a PC; its basic capabilities for steady-state and transient analysis were reported in an earlier paper. Additional functions have now been added to the PC version of ARDS. These functions include: 1) Estimation of the peak response following blade loss without resorting to a full transient analysis; 2) Calculation of response sensitivity to input parameters; 3) Formulation of optimum rotor and damper designs to place critical speeds in desirable ranges or minimize bearing loads; 4) Production of Poincaré plots so the presence of chaotic motion can be ascertained. ARDS produces printed and plotted output. The executable code uses the full array sizes of the mainframe version and fits on a high density floppy disc. Examples of all program capabilities are presented and discussed.

  17. An Analysis of Graduate Nursing Students' Innovation-Decision Process

    PubMed Central

    Kacynski, Kathryn A.; Roy, Katrina D.

    1984-01-01

This study's purpose was to examine the innovation-decision process used by graduate nursing students when deciding to use computer applications. Graduate nursing students enrolled in a mandatory research class were surveyed before and after their use of a mainframe computer for beginning data analysis about their general attitudes towards computers, individual characteristics such as “cosmopoliteness”, and their desire to learn more about a computer application. It was expected that an experimental intervention, a videotaped demonstration of interactive video instruction of cardiopulmonary resuscitation (CPR); previous computer experience; and the subject's “cosmopoliteness” would influence attitudes towards computers and the desire to learn more about a computer application.

  18. [A survey of the best bibliographic searching system in occupational medicine and discussion of its implementation].

    PubMed

    Inoue, J

    1991-12-01

When occupational health personnel, especially occupational physicians, search bibliographies, they usually have to search bibliographies by themselves. Also, if a library is not available because of the location of their work place, they might have to rely on online databases. Although there are many commercial databases in the world, people who seldom use them will have problems with on-line searching, such as the user-computer interface, keywords, and so on. The present study surveyed the best bibliographic searching system in the field of occupational medicine by questionnaire through the use of DIALOG OnDisc MEDLINE as a commercial database. In order to ascertain the problems involved in determining the best bibliographic searching system, a prototype bibliographic searching system was constructed and then evaluated. Finally, solutions for the problems were discussed. These led to the following conclusions: to construct the best bibliographic searching system at the present time, 1) a concept of micro-to-mainframe links (MML) is needed for the computer hardware network; 2) multi-lingual font standards and an excellent common user-computer interface are needed for the computer software; 3) a short course and education in database management systems, and support of personal information processing for retrieved data, are necessary for the practical use of the system.

  19. NASTRAN migration to UNIX

    NASA Technical Reports Server (NTRS)

    Chan, Gordon C.; Turner, Horace Q.

    1990-01-01

COSMIC/NASTRAN, as it is supported and maintained by COSMIC, runs on four mainframe computers - CDC, VAX, IBM and UNIVAC. COSMIC/NASTRAN on other computers, such as CRAY, AMDAHL, PRIME, CONVEX, etc., is available commercially from a number of third party organizations. All these computers, with their own one-of-a-kind operating systems, make NASTRAN machine dependent. The job control language (JCL), the file management, and the program execution procedures of these computers are vastly different, although 95 percent of NASTRAN source code was written in standard ANSI FORTRAN 77. The advantage of the UNIX operating system is that it has no machine boundary. UNIX is becoming widely used in many workstations, minis, super-PCs, and even some mainframe computers. NASTRAN for the UNIX operating system is definitely the way to go in the future, and makes NASTRAN available to a host of computers, big and small. Since 1985, many NASTRAN improvements and enhancements were made to conform to the ANSI FORTRAN 77 standards. A major UNIX migration effort was incorporated into the COSMIC NASTRAN 1990 release. As pioneer work for the UNIX environment, a version of COSMIC 89 NASTRAN was officially released in October 1989 for the DEC ULTRIX VAXstation 3100 (with VMS extensions). A COSMIC 90 NASTRAN version for the DEC ULTRIX DECstation 3100 (with RISC) is planned for April 1990 release. Both workstations are UNIX based computers. The COSMIC 90 NASTRAN will be made available on a TK50 tape for the DEC ULTRIX workstations. Previously, in 1988, an 88 NASTRAN version was tested successfully on a Silicon Graphics workstation.

  20. Dynamic gas temperature measurements using a personal computer for data acquisition and reduction

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Oberle, Lawrence G.; Greer, Lawrence C., III

    1993-01-01

    This report describes a dynamic gas temperature measurement system. It has frequency response to 1000 Hz, and can be used to measure temperatures in hot, high pressure, high velocity flows. A personal computer is used for collecting and processing data, which results in a much shorter wait for results than previously. The data collection process and the user interface are described in detail. The changes made in transporting the software from a mainframe to a personal computer are described in appendices, as is the overall theory of operation.

  1. Internal controls over computer-processed financial data at Boeing Petroleum Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-02-14

    The Strategic Petroleum Reserve (SPR) is responsible for purchasing and storing crude oil to mitigate the potential adverse impact of any future disruptions in crude oil imports. Boeing Petroleum Services, Inc. (BPS) operates the SPR under a US Department of Energy (DOE) management and operating contract. BPS receives support for various information systems and other information processing needs from a mainframe computer center. The objective of the audit was to determine if the internal controls implemented by BPS for computer systems were adequate to assure processing reliability.

  2. Front End Software for Online Database Searching Part 1: Definitions, System Features, and Evaluation.

    ERIC Educational Resources Information Center

    Hawkins, Donald T.; Levy, Louise R.

    1985-01-01

    This initial article in series of three discusses barriers inhibiting use of current online retrieval systems by novice users and notes reasons for front end and gateway online retrieval systems. Definitions, front end features, user interface, location (personal computer, host mainframe), evaluation, and strengths and weaknesses are covered. (16…

  3. 48 CFR 2452.204-70 - Preservation of, and access to, contract records (tangible and electronically stored information...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

... or data storage). ESI devices and media include, but are not limited to: (1) Computers (mainframe...) Personal data assistants (PDAs); (5) External data storage devices including portable devices (e.g., flash drive); and (6) Data storage media (magnetic, e.g., tape; optical, e.g., compact disc, microfilm, etc...

  4. 48 CFR 2452.204-70 - Preservation of, and access to, contract records (tangible and electronically stored information...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

... or data storage). ESI devices and media include, but are not limited to: (1) Computers (mainframe...) Personal data assistants (PDAs); (5) External data storage devices including portable devices (e.g., flash drive); and (6) Data storage media (magnetic, e.g., tape; optical, e.g., compact disc, microfilm, etc...

  5. Information resources assessment of a healthcare integrated delivery system.

    PubMed Central

    Gadd, C. S.; Friedman, C. P.; Douglas, G.; Miller, D. J.

    1999-01-01

    While clinical healthcare systems may have lagged behind computer applications in other fields in the shift from mainframes to client-server architectures, the rapid deployment of newer applications is closing that gap. Organizations considering the transition to client-server must identify and position themselves to provide the resources necessary to implement and support the infrastructure requirements of client-server architectures and to manage the accelerated complexity at the desktop, including hardware and software deployment, training, and maintenance needs. This paper describes an information resources assessment of the recently aligned Pennsylvania regional Veterans Administration Stars and Stripes Health Network (VISN4), in anticipation of the shift from a predominantly mainframe to a client-server information systems architecture in its well-established VistA clinical information system. The multimethod assessment study is described here to demonstrate this approach and its value to regional healthcare networks undergoing organizational integration and/or significant information technology transformations. PMID:10566414

  6. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (18th) Held in Portland, Oregon on 23-27 April 1990

    DTIC Science & Technology

    1990-04-01

Maxwell (Texas A&M University) 4. ACCURACY OF THE QUAD& THICK SHELL ELEMENT... by William R. Case, Tiffany D. Bowles, Alia K. Croft and...Computer Literacy: Mainframe Monsters and Pacman. Symposium on Advances and Trends in Structures and Dynamics, Washington, D.C., October 1984. 4. Woodward...No. 1, 1985. 5. Wilson, E.L., and M. Holt: CAL-80-Computer Assisted Learning of Structural Engineering. Symposium on Advances and Trends in

  7. Interfacing the VAX 11/780 Using Berkeley Unix 4.2.BSD and Ethernet Based Xerox Network Systems. Volume 1.

    DTIC Science & Technology

    1984-12-01

3Com Corporation ... A-18; Ethernet Controller Support ... A-19; Host Systems Support ... A-20; Personal Computers Support ... A-23; VAX EtherSeries Software ... A-23; Network Research Corporation ... A-24; File Transfer Service ... A-25; Virtual Terminal Service ... Control office is planning to acquire a Digital Equipment Corporation VAX 11/780 mainframe computer with the Unix Berkeley 4.2BSD operating system. They

  8. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation.

    PubMed

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate a high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 μm can be achieved.

  9. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation

    NASA Astrophysics Data System (ADS)

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate a high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 μm can be achieved.

  10. The Design of an Interactive Computer Based System for the Training of Signal Corps Officers in Communications Network Management

    DTIC Science & Technology

    1985-08-01

    from the mainframe to the terminals is approximately 56k bits per second (21:3). Score: 8. Expandability. The number of terminals available to the 0...the systems controllers may access any files. For modem link up, a callback system is to be implemented to prevent unauthorized off post access (10:2

  11. Practical applications of remote sensing technology

    NASA Technical Reports Server (NTRS)

    Whitmore, Roy A., Jr.

    1990-01-01

    Land managers are becoming increasingly dependent upon remote sensing and automated analysis techniques for information gathering and synthesis. Remote sensing and geographic information system (GIS) techniques provide quick and economical information gathering for large areas. The outputs of remote sensing classification and analysis are most effective when combined with a total natural resources data base within the capabilities of a computerized GIS. Some examples are presented of the successes, as well as the problems, in integrating remote sensing and geographic information systems. The need to exploit remotely sensed data and the potential that geographic information systems offer for managing and analyzing such data continues to grow. New microcomputers with vastly enlarged memory, multi-fold increases in operating speed, and storage capacity previously available only on mainframe computers are now a reality. Improved raster GIS software systems have been developed for these high performance microcomputers. Vector GIS systems previously reserved for mini and mainframe systems can now operate on these enhanced microcomputers. One of the more exciting areas beginning to emerge is the integration of both raster and vector formats on a single computer screen. This technology will allow satellite imagery or digital aerial photography to be presented as a background to a vector display.

  12. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  13. Reanalysis, compatibility and correlation in analysis of modified antenna structures

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1989-01-01

    A simple computational procedure is synthesized to process changes in the microwave-antenna pathlength-error measure when there are changes in the antenna structure model. The procedure employs structural modification reanalysis methods combined with new extensions of correlation analysis to provide the revised rms pathlength error. Mainframe finite-element-method processing of the structure model is required only for the initial unmodified structure, and elementary postprocessor computations develop and deal with the effects of the changes. Several illustrative computational examples are included. The procedure adapts readily to processing spectra of changes for parameter studies or sensitivity analyses.

  14. Marshal Wrubel and the Electronic Computer as an Astronomical Instrument

    NASA Astrophysics Data System (ADS)

    Mutschlecner, J. P.; Olsen, K. H.

    1998-05-01

    In 1960, Marshal H. Wrubel, professor of astrophysics at Indiana University, published an influential review paper under the title "The Electronic Computer as an Astronomical Instrument." This essay pointed out the enormous potential of the electronic computer as an instrument of observational and theoretical research in astronomy, illustrated programming concepts, and made specific recommendations for the increased use of computers in astronomy. He noted that, with a few scattered exceptions, computer use by the astronomical community had heretofore been "timid and sporadic." This situation was to improve dramatically in the next few years. By the late 1950s, general-purpose, high-speed "mainframe" computers were just emerging from the experimental, developmental stage, but few were affordable by or available to academic and research institutions not closely associated with large industrial or national defense programs. Yet by 1960 Wrubel had spent a decade actively pioneering and promoting the imaginative application of electronic computation within the astronomical community. Upper-level undergraduate and graduate astronomy students at Indiana were introduced to computing, and Ph.D. candidates whom he supervised applied computer techniques to problems in theoretical astrophysics. He wrote an early textbook on programming, taught programming classes, and helped establish and direct the Research Computing Center at Indiana, later named the Wrubel Computing Center in his honor. He and his students created a variety of algorithms and subroutines and exchanged these throughout the astronomical community by distributing the Astronomical Computation News Letter. Nationally as well as internationally, Wrubel actively cooperated with other groups interested in computing applications for theoretical astrophysics, often through his position as secretary of the IAU Commission on Stellar Constitution.

  15. Measurement of Loneliness Among Clients Representing Four Stages of Cancer: An Exploratory Study.

    DTIC Science & Technology

    1985-03-01

    status, and membership in organizations for each client were entered into an SPSS program on a mainframe computer, and the means and a one-way analysis of variance were computed.
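    The one-way analysis of variance referenced in the snippet compares between-group variance to within-group variance through the F ratio, F = (SS_between/(k-1)) / (SS_within/(n-k)). A minimal Python sketch of that computation (the example group data are invented for illustration):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of groups of numeric values."""
    k = len(groups)                           # number of groups
    n = sum(len(g) for g in groups)           # total observations
    grand = sum(sum(g) for g in groups) / n   # grand mean
    means = [sum(g) / len(g) for g in groups] # group means
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f = one_way_anova_f([[1, 2, 3], [7, 8, 9]])  # two invented groups of clients
```

    With identical group means the statistic is zero; large values suggest the group means differ by more than within-group scatter would explain.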

  16. The Effects of Word Processing Software on User Satisfaction: An Empirical Study of Micro, Mini, and Mainframe Computers Using an Interactive Artificial Intelligence Expert-System.

    ERIC Educational Resources Information Center

    Rushinek, Avi; Rushinek, Sara

    1984-01-01

    Describes results of a system rating study in which users responded to WPS (word processing software) questions. Study objectives were data collection and evaluation of variables; statistical quantification of WPS's contribution (along with other variables) to user satisfaction; design of an expert system to evaluate WPS; and database update and…

  17. Environmental Gradient Analysis, Ordination, and Classification in Environmental Impact Assessments.

    DTIC Science & Technology

    1987-09-01

    agglomerative clustering algorithms for mainframe computers: (1) the unweighted pair-group method that uses arithmetic averages (UPGMA), (2) the...hierarchical agglomerative unweighted pair-group method using arithmetic averages (UPGMA), which is also called average linkage clustering. This method was...dendrograms produced by weighted clustering (93). Sneath and Sokal (94), Romesburg (84), and Seber (90) also strongly recommend the UPGMA. A dendrogram
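    The UPGMA procedure recommended in these snippets repeatedly merges the two closest clusters, recomputing distances to the merged cluster as the size-weighted arithmetic average of the distances to its parts. A minimal Python sketch (the three-point distance matrix is invented for illustration):

```python
def upgma(dist, labels):
    """UPGMA (average-linkage) agglomerative clustering.
    `dist` is a symmetric distance matrix; returns a nested-tuple dendrogram."""
    clusters = {i: (labels[i], 1) for i in range(len(labels))}  # id -> (tree, size)
    d = {(i, j): dist[i][j]
         for i in range(len(labels)) for j in range(len(labels)) if i < j}
    next_id = len(labels)
    while len(clusters) > 1:
        i, j = min(d, key=d.get)          # closest pair of clusters
        ti, ni = clusters.pop(i)
        tj, nj = clusters.pop(j)
        for k in list(clusters):
            # size-weighted average of distances to the two merged clusters
            dik = d.pop((min(i, k), max(i, k)))
            djk = d.pop((min(j, k), max(j, k)))
            d[(min(next_id, k), max(next_id, k))] = (ni * dik + nj * djk) / (ni + nj)
        del d[(i, j)]
        clusters[next_id] = ((ti, tj), ni + nj)
        next_id += 1
    (tree, _), = clusters.values()
    return tree

# invented example: a and b are close, c is distant
tree = upgma([[0, 2, 8], [2, 0, 8], [8, 8, 0]], ["a", "b", "c"])
```

    The size weighting is what makes this the *unweighted* pair-group method: every original observation contributes equally to the averaged distance.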

  18. Conversion and Retrievability of Hard Copy and Digital Documents on Optical Disks

    DTIC Science & Technology

    1992-03-01

    53 B. CURRENT THESIS PREPARATION TOOLS . ....... 54 1. Thesis Preparation using G-Thesis ..... 55 2. Thesis Preparation using Framemaker ...School mainframe. • Computer Science department students can use a software package called Framemaker, available on Sun workstations in their...by most thesis typists and students. For this reason, the discussion of thesis preparation tools will be limited to G-Thesis, Framemaker, and

  19. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
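    An LCG of the kind RANDOM tests produces each value from the previous one by a fixed multiply-add-modulo step, x(n+1) = (a*x(n) + c) mod m. A minimal Python sketch of the idea (the multiplier, increment, and modulus below are common illustrative choices, not the parameters the RANDOM program uses):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x(n+1) = (a*x(n) + c) mod m.
    Parameter defaults are illustrative, not those of the RANDOM program."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale to the unit interval [0, 1)

gen = lcg(seed=42)
sample = [next(gen) for _ in range(5)]  # five pseudorandom values in [0, 1)
```

    The statistical quality of an LCG depends entirely on the choice of a, c, and m, which is why companion tools like RANCYCLE and ARITH exist to assist in selecting them.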

  20. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, to link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources available to help perform tasks, and personal computer output that was used in the presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  1. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  2. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1982-01-01

    The use of superminicomputers for solving a series of increasingly complex thermal analysis problems is investigated. The approach involved (1) installation and verification of the SPAR thermal analyzer software on superminicomputers at Langley Research Center and Goddard Space Flight Center, (2) solution of six increasingly complex thermal problems on this equipment, and (3) comparison of the solutions (accuracy, CPU time, turnaround time, and cost) with solutions obtained on large mainframe computers.

  3. Health care informatics research implementation of the VA-DHCP Spanish version for Latin America.

    PubMed Central

    Samper, R.; Marin, C. J.; Ospina, J. A.; Varela, C. A.

    1992-01-01

    The VA DHCP hospital computer program represents an integral solution to the complex clinical and administrative functions of any hospital worldwide. Developed by the Veterans Administration, it has until recently run exclusively on mainframe platforms. The recent implementation on PCs opens the opportunity for use in Latin America. A detailed description of the strategy for the Spanish-language, local implementation in Colombia is given. PMID:1482994

  4. Health care informatics research implementation of the VA-DHCP Spanish version for Latin America.

    PubMed

    Samper, R; Marin, C J; Ospina, J A; Varela, C A

    1992-01-01

    The VA DHCP hospital computer program represents an integral solution to the complex clinical and administrative functions of any hospital worldwide. Developed by the Veterans Administration, it has until recently run exclusively on mainframe platforms. The recent implementation on PCs opens the opportunity for use in Latin America. A detailed description of the strategy for the Spanish-language, local implementation in Colombia is given.

  5. A DNA sequence analysis package for the IBM personal computer.

    PubMed Central

    Lagrimini, L M; Brentano, S T; Donelson, J E

    1984-01-01

    We present here a collection of DNA sequence analysis programs, called "PC Sequence" (PCS), which are designed to run on the IBM Personal Computer (PC). These programs are written in IBM PC compiled BASIC and take full advantage of the IBM PC's speed, error handling, and graphics capabilities. For a modest initial expense in hardware any laboratory can use these programs to quickly perform computer analysis on DNA sequences. They are written with the novice user in mind and require very little training or previous experience with computers. Also provided are a text editing program for creating and modifying DNA sequence files and a communications program which enables the PC to communicate with and collect information from mainframe computers and DNA sequence databases. PMID:6546433

  6. Conversion of Mass Storage Hierarchy in an IBM Computer Network

    DTIC Science & Technology

    1989-03-01

    storage devices GUIDE IBM users’ group for DOS operating systems IBM International Business Machines IBM 370/145 CPU introduced in 1970 IBM 370/168 CPU...February 12, 1985, Information Systems Group, International Business Machines Corporation. "IBM 3090 Processor Complex" and "Mass Storage System," Mainframe Journal, pp. 15-26, 64-65, Dallas, Texas, September-October 1987. 3. International Business Machines Corporation, Introduction to IBM 3850 Storage

  7. CICS Region Virtualization for Cost Effective Application Development

    ERIC Educational Resources Information Center

    Khan, Kamal Waris

    2012-01-01

    Mainframe is used for hosting large commercial databases, transaction servers and applications that require a greater degree of reliability, scalability and security. Customer Information Control System (CICS) is a mainframe software framework for implementing transaction services. It is designed for rapid, high-volume online processing. In order…

  8. Fast methods to numerically integrate the Reynolds equation for gas fluid films

    NASA Technical Reports Server (NTRS)

    Dimofte, Florin

    1992-01-01

    The alternating direction implicit (ADI) method is adopted, modified, and applied to the Reynolds equation for thin, gas fluid films. An efficient code is developed to predict both the steady-state and dynamic performance of an aerodynamic journal bearing. An alternative approach is shown for hybrid journal gas bearings by using Liebmann's iterative solution (LIS) for elliptic partial differential equations. The results are compared with known design criteria from experimental data. The developed methods show good accuracy and very short computer running time in comparison with methods based on inverting a matrix. The computer codes need only a small amount of memory and can be run either on personal computers or on mainframe systems.
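    Liebmann's iterative solution is the classical Gauss-Seidel sweep applied to the finite-difference form of an elliptic equation: each interior grid point is repeatedly replaced by the average of its four neighbors until the largest change in a sweep falls below a tolerance. A minimal Python sketch on Laplace's equation (the grid size and boundary values are illustrative, not the bearing model from the paper):

```python
def liebmann_laplace(n=20, top=1.0, tol=1e-6, max_iter=10000):
    """Solve Laplace's equation on an n x n grid with fixed boundaries
    (top edge = `top`, other edges = 0) by Gauss-Seidel (Liebmann) iteration."""
    u = [[0.0] * n for _ in range(n)]
    for j in range(n):
        u[0][j] = top  # hot top boundary
    for _ in range(max_iter):
        delta = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new = 0.25 * (u[i-1][j] + u[i+1][j] + u[i][j-1] + u[i][j+1])
                delta = max(delta, abs(new - u[i][j]))
                u[i][j] = new  # in-place update: uses freshly computed neighbors
        if delta < tol:
            break
    return u

u = liebmann_laplace()
```

    The in-place updates are what distinguish Gauss-Seidel from Jacobi iteration and are part of why such solvers fit in the small memory footprint noted above.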

  9. System analysis for the Huntsville Operation Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.

    1986-01-01

    A simulation model of the NASA Huntsville Operational Support Center (HOSC) was developed. This simulation model emulates the HYPERchannel Local Area Network (LAN) that ties together the various computers of HOSC. The HOSC system is a large installation of mainframe computers such as the Perkin-Elmer 3200 series and the DEC VAX series. A series of six simulation exercises of the HOSC model is described using data sets provided by NASA. Analytical analyses of the ETHERNET LAN and the video terminal (VT) distribution system are presented. An interface analysis of the smart terminal network model, which allows the data flow requirements due to VTs on the ETHERNET LAN to be estimated, is presented.

  10. Access control and privacy in large distributed systems

    NASA Technical Reports Server (NTRS)

    Leiner, B. M.; Bishop, M.

    1986-01-01

    Large scale distributed systems consist of workstations, mainframe computers, supercomputers, and other types of servers, all connected by a computer network. These systems are being used in a variety of applications including the support of collaborative scientific research. In such an environment, issues of access control and privacy arise. Access control is required for several reasons, including the protection of sensitive resources and cost control. Privacy is also required for similar reasons, including the protection of a researcher's proprietary results. A possible architecture for integrating available computer and communications security technologies into a system that meets these requirements is described. This architecture is meant as a starting point for discussion, rather than the final answer.

  11. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  12. HNET - A National Computerized Health Network

    PubMed Central

    Casey, Mark; Hamilton, Richard

    1988-01-01

    The HNET system demonstrated conceptually and technically a national text (and limited bit-mapped graphics) computer network for use between innovative members of the health care industry. The HNET configuration of a leased high speed national packet switching network connecting any number of mainframe, mini, and micro computers was unique in its relatively low capital costs and freedom from obsolescence. With multiple simultaneous conferences, databases, bulletin boards, calendars, and advanced electronic mail and surveys, it is marketable to innovative hospitals, clinics, physicians, health care associations and societies, nurses, multisite research projects, libraries, etc. Electronic publishing and education capabilities along with integrated voice and video transmission are identified as future enhancements.

  13. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  14. Operationally Efficient Propulsion System Study (OEPSS) Data Book. Volume 8; Integrated Booster Propulsion Module (BPM) Engine Start Dynamics

    NASA Technical Reports Server (NTRS)

    Kemp, Victoria R.

    1992-01-01

    A fluid-dynamic, digital-transient computer model of an integrated, parallel propulsion system was developed for the CDC mainframe and the SUN workstation computers. Since all STME component designs were used for the integrated system, computer subroutines were written characterizing the performance and geometry of all the components used in the system, including the manifolds. Three transient analysis reports were completed. The first report evaluated the feasibility of integrated engine systems in regards to the start and cutoff transient behavior. The second report evaluated turbopump out and combined thrust chamber/turbopump out conditions. The third report presented sensitivity study results in staggered gas generator spin start and in pump performance characteristics.

  15. Solving subsurface structural problems using a computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.M.

    1987-02-01

    Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.

  16. A SLAM II simulation model for analyzing space station mission processing requirements

    NASA Technical Reports Server (NTRS)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM II simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives, and an 8087 coprocessor chip. Using a time-phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle), and ground facility databases, estimates for ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.

  17. System Documentation for the U.S. Army Ambulatory Care Data Base (ACDB) Study: Mainframe, Personal Computer and Optical Scanner File Structure

    DTIC Science & Technology

    1988-11-01


  18. Multi-Core Processor Memory Contention Benchmark Analysis Case Study

    NASA Technical Reports Server (NTRS)

    Simon, Tyler; McGalliard, James

    2009-01-01

    Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.

  19. A computer-aided design system geared toward conceptual design in a research environment. [for hypersonic vehicles]

    NASA Technical Reports Server (NTRS)

    Stack, S. H.

    1981-01-01

    A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system and the major factors that contributed to its particular form, and presents examples of its application.

  20. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  1. Automated mainframe data collection in a network environment

    NASA Technical Reports Server (NTRS)

    Gross, David L.

    1994-01-01

    The progress and direction of the computer industry have resulted in widespread use of dissimilar and incompatible mainframe data systems. Data collection from these multiple systems is a labor intensive task. In the past, data collection had been restricted to the efforts of personnel specially trained on each system. Information is one of the most important resources an organization has. Any improvement in an organization's ability to access and manage that information provides a competitive advantage. This problem of data collection is compounded at NASA sites by multi-center and contractor operations. The Centralized Automated Data Retrieval System (CADRS) is designed to provide a common interface that would permit data access, query, and retrieval from multiple contractor and NASA systems. The methods developed for CADRS have a strong commercial potential in that they would be applicable for any industry that needs inter-department, inter-company, or inter-agency data communications. The widespread use of multi-system data networks, that combine older legacy systems with newer decentralized networks, has made data retrieval a critical problem for information dependent industries. Implementing the technology discussed in this paper would reduce operational expense and improve data collection on these composite data systems.

  2. Close to real life. [solving for transonic flow about lifting airfoils using supercomputers

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Bailey, F. Ron

    1988-01-01

    NASA's Numerical Aerodynamic Simulation (NAS) facility for CFD modeling of highly complex aerodynamic flows employs as its basic hardware two Cray-2s, an ETA-10 Model Q, an Amdahl 5880 mainframe computer that furnishes both support processing and access to 300 Gbytes of disk storage, several minicomputers and superminicomputers, and a Thinking Machines 16,000-device 'connection machine' processor. NAS, which was the first supercomputer facility to standardize operating-system and communication software on all processors, has done important Space Shuttle aerodynamics simulations and will be critical to the configurational refinement of the National Aerospace Plane and its integrated powerplant, which will involve complex, high temperature reactive gasdynamic computations.

  3. Testing and validating the CERES-wheat (Crop Estimation through Resource and Environment Synthesis-wheat) model in diverse environments

    NASA Technical Reports Server (NTRS)

    Otter-Nacke, S.; Godwin, D. C.; Ritchie, J. T.

    1986-01-01

    CERES-Wheat is a computer simulation model of the growth, development, and yield of spring and winter wheat. It was designed to be used in any location throughout the world where wheat can be grown. The model is written in Fortran 77, operates on a daily time step, and runs on a range of computer systems from microcomputers to mainframes. Two versions of the model were developed: one, CERES-Wheat, assumes nitrogen to be nonlimiting; in the other, CERES-Wheat-N, the effects of nitrogen deficiency are simulated. The report provides comparisons of simulations with measurements from about 350 wheat data sets collected throughout the world.
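    A daily time step means the model advances one day per iteration, accumulating driving variables such as thermal time until a development stage is reached. A toy Python sketch of that loop structure (the base temperature, growing-degree-day target, and weather series are invented for illustration and are not CERES-Wheat's actual parameters):

```python
def thermal_time_to_stage(daily_mean_temps, base_temp=0.0, target_gdd=150.0):
    """Accumulate growing degree-days on a daily time step and return the
    day index on which the (illustrative) target is reached, else None."""
    gdd = 0.0
    for day, t in enumerate(daily_mean_temps):
        gdd += max(0.0, t - base_temp)  # days below base contribute nothing
        if gdd >= target_gdd:
            return day
    return None

day = thermal_time_to_stage([10.0] * 30)  # constant 10 degrees: 10 GDD per day
```

    Real crop models layer water and nitrogen balances onto the same daily loop, which is what CERES-Wheat-N adds over the nitrogen-nonlimiting version.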

  4. Assessment of the information content of patterns: an algorithm

    NASA Astrophysics Data System (ADS)

    Daemi, M. Farhang; Beurle, R. L.

    1991-12-01

    A preliminary investigation confirmed the possibility of assessing the translational and rotational information content of simple artificial images. The calculation is tedious, and for more realistic patterns it is essential to implement the method on a computer. This paper describes an algorithm developed for this purpose which confirms the results of the preliminary investigation. Use of the algorithm facilitates much more comprehensive analysis of the combined effect of continuous rotation and fine translation, and paves the way for analysis of more realistic patterns. Owing to the volume of calculation involved in these algorithms, extensive computing facilities were necessary. The major part of the work was carried out using an ICL 3900 series mainframe computer, as well as powerful workstations such as a RISC-architecture MIPS machine.

  5. User's manual for the Macintosh version of PASCO

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Davis, Randall C.

    1991-01-01

    A user's manual for Macintosh PASCO is presented. Macintosh PASCO is an Apple Macintosh version of PASCO, an existing computer code for structural analysis and optimization of longitudinally stiffened composite panels. PASCO combines a rigorous buckling analysis program with a nonlinear mathematical optimization routine to minimize panel mass. Macintosh PASCO accepts the same input as mainframe versions of PASCO. As output, Macintosh PASCO produces a text file and mode shape plots in the form of Apple Macintosh PICT files. Only the user interface for Macintosh is discussed here.

  6. Computing at h1 - Experience and Future

    NASA Astrophysics Data System (ADS)

    Eckerlin, G.; Gerhards, R.; Kleinwort, C.; Krüner-Marquis, U.; Egli, S.; Niebergall, F.

    The H1 experiment has now been successfully operating at the electron proton collider HERA at DESY for three years. During this time the computing environment has gradually shifted from a mainframe oriented environment to the distributed server/client Unix world. This transition is now almost complete. Computing needs are largely determined by the present amount of 1.5 TB of reconstructed data per year (1994), corresponding to 1.2 × 10^7 accepted events. All data are centrally available at DESY. In addition to data analysis, which is done in all collaborating institutes, most of the centrally organized Monte Carlo production is performed outside of DESY. New software tools to cope with offline computing needs include CENTIPEDE, a tool for the use of distributed batch and interactive resources for Monte Carlo production, and H1 UNIX, a software package for automatic updates of H1 software on all UNIX platforms.

  7. Supercomputing resources empowering superstack with interactive and integrated systems

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2012-09-01

    This paper presents the results from the development and implementation of Superstack algorithms to be used dynamically with integrated systems and supercomputing resources. Processing of geophysical data, here termed geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms and their practical application on modern computing architectures were inspired by developments that began with the processing of seismic data on mainframes and, in recent years, have led to high-end scientific computing applications. Several stacking algorithms are known, but for seismic data with a low signal-to-noise ratio, iterative algorithms like the Superstack can support analysis and interpretation. The new Superstack algorithms are in use with wave theory and optical phenomena on high-performance computing resources, both for huge data sets and for sophisticated application scenarios in geosciences and archaeology.
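    The abstract does not give the Superstack update rule, but the general idea of an iterative stack that re-weights noisy traces against the current stack estimate can be sketched as follows. This is a minimal illustration only: `iterative_superstack`, the correlation-based weighting, and the synthetic data are all hypothetical, not the authors' algorithm.

    ```python
    import numpy as np

    def iterative_superstack(traces, n_iter=5):
        """Illustrative iterative stack: weight each trace by its
        correlation with the current stack estimate, then re-stack."""
        stack = traces.mean(axis=0)            # plain stack as starting point
        for _ in range(n_iter):
            # similarity of each trace to the current stack
            w = np.array([np.corrcoef(t, stack)[0, 1] for t in traces])
            w = np.clip(w, 0.0, None)          # ignore anti-correlated traces
            stack = (w[:, None] * traces).sum(axis=0) / w.sum()
        return stack

    rng = np.random.default_rng(0)
    signal = np.sin(np.linspace(0, 4 * np.pi, 200))
    traces = signal + rng.normal(0.0, 1.0, size=(40, 200))  # low-SNR records
    stack = iterative_superstack(traces)
    ```

    With 40 noisy traces, the stacked estimate tracks the underlying signal far more closely than any single trace, which is the property that makes iterative stacking useful at low signal-to-noise ratios.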

  8. Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide

    NASA Technical Reports Server (NTRS)

    Bartrand, Timothy A.; Willis, Edward A.

    1993-01-01

    This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.

  9. Cost/Schedule Control Systems Criteria: A Reference Guide to C/SCSC information

    DTIC Science & Technology

    1992-09-01

    Smith, Larry A. "Mainframe ARTEMIS: More than a Project Management Tool -- Earned Value Analysis (PEVA)," Project Management Journal, 19:23-28 (April 1988). 17. Trufant, Thomas M. and Robert

  10. Composite Cores

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Spang & Company's new configuration of converter transformer cores is a composite of gapped and ungapped cores assembled together in concentric relationship. The net effect of the composite design is to combine the protection from saturation offered by the gapped core with the lower magnetizing requirement of the ungapped core. The uncut core functions under normal operating conditions and the cut core takes over during abnormal operation to prevent power surges and their potentially destructive effect on transistors. Principal customers are aerospace and defense manufacturers. Cores also have applicability in commercial products where precise power regulation is required, as in the power supplies for large mainframe computers.

  11. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Sano, Hikomaro

    This report outlines “Repoir” (Report information retrieval) system of Toyota Central R & D Laboratories, Inc. as an example of in-house information retrieval system. The online system was designed to process in-house technical reports with the aid of a mainframe computer and has been in operation since 1979. Its features are multiple use of the information for technical and managerial purposes and simplicity in indexing and data input. The total number of descriptors, specially selected for the system, was minimized for ease of indexing. The report also describes the input items, processing flow and typical outputs in kanji letters.

  12. A Big RISC

    DTIC Science & Technology

    1983-07-18

    architecture. Design, performance, and cost of BRISC are presented. Performance is shown to be better than high-end mainframes such as the IBM 3081 and Amdahl 470V/8 on integer benchmarks written in C, Pascal, and LISP. The cost, conservatively estimated to be $132,400, is about the same as a high-end minicomputer such as the VAX-11/780. BRISC has a CPU cycle time of 46 ns, providing a RISC I instruction execution rate of greater than 15 MIPS. BRISC is designed with a Structured Computer Aided Logic Design System (SCALD) by Valid Logic Systems. An evaluation of the utility of

  13. Users Guide to the JPL Doppler Gravity Database

    NASA Technical Reports Server (NTRS)

    Muller, P. M.; Sjogren, W. L.

    1986-01-01

    Local gravity accelerations and gravimetry have been determined directly from spacecraft Doppler tracking data near the Moon and various planets by the Jet Propulsion Laboratory. Researchers in many fields have an interest in planet-wide global gravimetric mapping and its applications. Many of them use their own computers in support of their studies and would benefit from being able to directly manipulate these gravity data for inclusion in their own modeling computations. Publication of some 150 Apollo 15 subsatellite low-altitude, high-resolution, single-orbit data sets is covered. The Doppler residuals, together with a determination of the derivative function providing line-of-sight gravity, are both listed and plotted (on microfilm), and can be ordered in computer-readable form (tape and floppy disk). The form and format of this database, as well as the methods of data reduction, are explained and referenced. A skeleton computer program is provided which can be modified to support re-reductions and re-formatted presentations suitable to a wide variety of research needs undertaken on mainframe or PC-class microcomputers.
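    The reduction step described above (line-of-sight gravity obtained as the derivative of the Doppler residual velocity) can be sketched numerically. The record below is purely illustrative: the time grid, the residual velocity profile, and the units are invented stand-ins, not data from the JPL database.

    ```python
    import numpy as np

    # Hypothetical single-orbit record: time (s) and Doppler residual
    # expressed as a line-of-sight velocity; values are illustrative only.
    t = np.linspace(0.0, 60.0, 61)
    v_los = 0.5 * t**2 * 1e-3        # pretend residual velocity profile

    # Line-of-sight gravity is the time derivative of the residual velocity;
    # np.gradient uses second-order central differences in the interior.
    g_los = np.gradient(v_los, t)
    ```

    For this quadratic profile the central difference is exact, so `g_los[30]` recovers the analytic derivative `1e-3 * t` at `t = 30`.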

  14. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capability, CAD/CAM provides options to automatically produce tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  15. Evaluation of a patient centered e-nursing and caring system.

    PubMed

    Tsai, Lai-Yin; Shan, Huang; Mei-Bei, Lin

    2006-01-01

    This study aims to develop an electronic nursing and caring system to manage patients' information and provide patients with safe and efficient services. By transmitting data among wireless cards, optical network, and mainframe computer, nursing care will be delivered more systematically and patients' safety centered caring will be delivered more efficiently and effectively. With this system, manual record keeping time was cut down, and relevant nursing and caring information was linked up. With the development of an electronic nursing system, nurses were able to make the best use of the Internet resources, integrate information management systematically and improve quality of nursing and caring service.

  16. Instructional image processing on a university mainframe: The Kansas system

    NASA Technical Reports Server (NTRS)

    Williams, T. H. L.; Siebert, J.; Gunn, C.

    1981-01-01

    An interactive digital image processing program package was developed that runs on the University of Kansas central computer, a Honeywell Level 66 multi-processor system. The module form of the package allows easy and rapid upgrades and extensions of the system and is used in remote sensing courses in the Department of Geography, in regional five-day short courses for academics and professionals, and also in remote sensing projects and research. The package comprises three self-contained modules of processing functions: Subimage extraction and rectification; image enhancement, preprocessing and data reduction; and classification. Its use in a typical course setting is described. Availability and costs are considered.

  17. Lessons learned in transitioning to an open systems environment

    NASA Technical Reports Server (NTRS)

    Boland, Dillard E.; Green, David S.; Steger, Warren L.

    1994-01-01

    Software development organizations, both commercial and governmental, are undergoing rapid change spurred by developments in the computing industry. To stay competitive, these organizations must adopt new technologies, skills, and practices quickly. Yet even for an organization with a well-developed set of software engineering models and processes, transitioning to a new technology can be expensive and risky. Current industry trends are leading away from traditional mainframe environments and toward the workstation-based, open systems world. This paper presents the experiences of software engineers on three recent projects that pioneered open systems development for NASA's Flight Dynamics Division of the Goddard Space Flight Center (GSFC).

  18. The "big bang" implementation: not for the faint of heart.

    PubMed

    Anderson, Linda K; Stafford, Cynthia J

    2002-01-01

    Replacing a hospital's obsolete mainframe computer system with a modern integrated clinical and administrative information system presents multiple challenges. When the new system is activated in one weekend, in "big bang" fashion, the challenges are magnified. Careful planning is essential to ensure that all hospital staff are fully prepared for this transition, knowing this conversion will involve system downtime, procedural changes, and the resulting stress that naturally accompanies change. Implementation concerns include staff preparation and training, process changes, continuity of patient care, and technical and administrative support. This article outlines how the University of Missouri Health Care addressed these operational concerns during this dramatic information system conversion.

  19. A comparison of TSS and TRASYS in form factor calculation

    NASA Technical Reports Server (NTRS)

    Golliher, Eric

    1993-01-01

    As the workstation and personal computer become more popular than a centralized mainframe to perform thermal analysis, the methods for space vehicle thermal analysis will change. Already, many thermal analysis codes are now available for workstations, which were not in existence just five years ago. As these changes occur, some organizations will adopt the new codes and analysis techniques, while others will not. This might lead to misunderstandings between thermal shops in different organizations. If thermal analysts make an effort to understand the major differences between the new and old methods, a smoother transition to a more efficient and more versatile thermal analysis environment will be realized.

  20. Evolutionary Development of the Simulation by Logical Modeling System (SIBYL)

    NASA Technical Reports Server (NTRS)

    Wu, Helen

    1995-01-01

    Through the evolutionary development of the Simulation by Logical Modeling System (SIBYL), we have re-engineered the expensive and complex IBM mainframe-based Long-term Hardware Projection Model (LHPM) into a robust, cost-effective, computer-based model that is easy to use. We achieved significant cost reductions and improved productivity in preparing long-term forecasts of Space Shuttle Main Engine (SSME) hardware. The LHPM for the SSME is a stochastic simulation model that projects hardware requirements over 10 years. SIBYL is now the primary modeling tool for developing SSME logistics proposals and the Program Operating Plan (POP) for NASA and divisional marketing studies.

  1. Power supply standardization and optimization study

    NASA Technical Reports Server (NTRS)

    Ware, C. L.; Ragusa, E. V.

    1972-01-01

    A comprehensive design study of a power supply for use in the space shuttle and other space flight applications is presented. The design specifications are established for a power supply capable of supplying over 90 percent of the anticipated voltage requirements for future spacecraft avionics systems. Analyses and tradeoff studies were performed on several alternative design approaches to assure that the selected design would provide near optimum performance of the planned applications. The selected design uses a dc-to-dc converter incorporating regenerative current feedback with a time-ratio controlled duty cycle to achieve high efficiency over a wide variation in input voltage and output loads. The packaging concept uses an expandable mainframe capable of accommodating up to six inverter/regulator modules with one common input filter module.

  2. The Spatial and Temporal Variability of the Arctic Atmospheric Boundary Layer and Its Effect on Electromagnetic (EM) Propagation.

    DTIC Science & Technology

    1987-12-01

    could be run on the IBM 3033 mainframe at the Naval Postgraduate School. I would also like to thank Lt. Mike Dotson whose thesis provided the...29 pressure levels by using an interactive graphing package, Grafstat, available on the IBM 3033 mainframe at the Naval Postgraduate School. Profiles...Polarstern should probably have been 65.

  3. Pc as Physics Computer for Lhc ?

    NASA Astrophysics Data System (ADS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project of the Physics Data Processing group of CERN's CN division, active since March this year, is described in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results when compared to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  4. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software has been under development at the Goddard Space Flight Center (GSFC) for the last three years. The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  5. The changing nature of spacecraft operations: From the Vikings of the 1970's to the great observatories of the 1990's and beyond

    NASA Technical Reports Server (NTRS)

    Ledbetter, Kenneth W.

    1992-01-01

    Four trends in spacecraft flight operations are discussed which will reduce overall program costs. These trends are the use of high-speed, highly reliable data communications systems for distributing operations functions to more convenient and cost-effective sites; the improved capability for remote operation of sensors; a continued rapid increase in memory and processing speed of flight qualified computer chips; and increasingly capable ground-based hardware and software systems, notably those augmented by artificial intelligence functions. Changes reflected by these trends are reviewed starting from the NASA Viking missions of the early 70s, when mission control was conducted at one location using expensive and cumbersome mainframe computers and communications equipment. In the 1980s, powerful desktop computers and modems enabled the Magellan project team to operate the spacecraft remotely. In the 1990s, the Hubble Space Telescope project uses multiple color screens and automated sequencing software on small computers. Given a projection of current capabilities, future control centers will be even more cost-effective.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radtke, M.A.

    This paper will chronicle the activity at Wisconsin Public Service Corporation (WPSC) that resulted in the complete migration of a traditional, late-1970s-vintage Energy Management System (EMS). The new environment includes networked microcomputers, minicomputers, and the corporate mainframe, and provides on-line access to employees outside the energy control center and some WPSC customers. In the late 1980s, WPSC was forecasting an EMS computer upgrade or replacement to address both capacity and technology needs. Reasoning that access to diverse computing resources would best position the company to accommodate the uncertain needs of the energy industry in the 1990s, WPSC chose to investigate an in-place migration to a network of computers able to support heterogeneous hardware and operating systems. The system was developed in a modular fashion, with individual modules being deployed as soon as they were completed. The functional and technical specification was continuously enhanced as operating experience was gained from each operational module. With the migration off the original EMS computers complete, the networked system called DEMAXX (Distributed Energy Management Architecture with eXtensive eXpandability) has exceeded expectations in the areas of cost, performance, flexibility, and reliability.

  8. System analysis in rotorcraft design: The past decade

    NASA Technical Reports Server (NTRS)

    Galloway, Thomas L.

    1988-01-01

    Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analyses on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.

  9. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  10. The IBM PC at NASA Ames

    NASA Technical Reports Server (NTRS)

    Peredo, James P.

    1988-01-01

    Like many large companies, Ames relies heavily on its computing power to get work done. And, like many other large companies that find the IBM PC a reliable tool, Ames uses it for many of the same types of functions. Presentation and clarification needs demand much from graphics packages. Programming and text editing needs require simpler, more powerful packages. The storage space needed by NASA's scientists and users for the monumental amounts of data that Ames must keep demands database packages that are large and easy to use. Access to the Micom Switching Network combines the power of the IBM PC with the capabilities of other computers and mainframes and allows users to communicate electronically. These four primary capabilities of the PC are vital to the needs of NASA's users and help to continue and support the vast amounts of work done by NASA employees.

  11. The Air Force's central reference laboratory: maximizing service while minimizing cost.

    PubMed

    Armbruster, D A

    1991-11-01

    The Laboratory Services Branch (Epi Lab) of the Epidemiology Division, Brooks AFB, Texas, is designated by regulation to serve as the Air Force's central reference laboratory, providing clinical laboratory testing support to all Air Force medical treatment facilities (MTFs). Epi Lab recognized that it was not offering the MTFs a service comparable to civilian reference laboratories and that, as a result, the Air Force medical system was spending hundreds of thousands of dollars yearly for commercial laboratory support. An in-house laboratory upgrade program was proposed to and approved by the USAF Surgeon General, as a Congressional Efficiencies Add project, to launch a two-phase initiative consisting of a 1-year field trial of 30 MTFs, followed by expansion to another 60 MTFs. Major components of the program include overnight air courier service to deliver patient samples to Epi Lab, a mainframe computer laboratory information system and electronic reporting of results to the MTFs throughout the CONUS. Application of medical marketing concepts and the Total Quality Management (TQM) philosophy allowed Epi Lab to provide dramatically enhanced reference service at a cost savings of about $1 million to the medical system. The Epi Lab upgrade program represents an innovative problem-solving approach, combining technical and managerial improvements, resulting in substantial patient care service and financial dividends. It serves as an example of successful application of TQM and marketing within the military medical system.

  12. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently, on microcomputers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorizing and 'in-line' capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.
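    The reported gains compound multiplicatively, which a quick back-of-the-envelope check makes explicit. The numbers are the ones quoted in the abstract; the variable names are ours.

    ```python
    # Figures quoted in the abstract; compounding them is just multiplication.
    vax_to_cray = 30.0         # Cray executed the model 30 times faster than the VAX
    compiler_gain = 1.30       # additional 30% from vectorizing and in-lining
    effective_speedup = vax_to_cray * compiler_gain
    print(effective_speedup)   # 39.0
    ```

    The same arithmetic shows why the authors see further headroom: at about 5 percent of peak, even the projected six- to tenfold gain from restructuring would leave the code at only 30 to 50 percent of the machine's theoretical maximum.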

  13. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    PubMed

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version, released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based version 3.0, released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all 496, 35 radionuclides listed in version 1.0 are not included in version 3.0. The majority of these have either extremely short or extremely long half-lives or are no longer in production; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 differing by more than 3 percent and 12 by more than 10 percent.
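    The kind of cross-version comparison described above can be sketched with small hypothetical tables. The nuclides and half-life values below are illustrative stand-ins only, not the actual CAP88 database contents.

    ```python
    # Hypothetical half-lives in seconds; NOT the real CAP88 values.
    v1 = {"H-3": 3.891e8, "Co-60": 1.663e8, "Cs-137": 9.467e8, "I-131": 6.95e5}
    v3 = {"H-3": 3.888e8, "Co-60": 1.663e8, "Cs-137": 9.90e8, "Sr-90": 9.10e8}

    # Nuclides present in version 1.0 but dropped from version 3.0
    # (analogous to the 35 orphans found in the study).
    only_in_v1 = sorted(set(v1) - set(v3))

    def pct_diff(a, b):
        """Percent difference relative to the version 1.0 value."""
        return abs(a - b) / a * 100.0

    # Shared nuclides whose half-lives differ by more than 3 percent.
    shared = set(v1) & set(v3)
    over_3_percent = sorted(n for n in shared if pct_diff(v1[n], v3[n]) > 3.0)
    ```

    With these toy tables, `only_in_v1` contains the orphaned nuclide and `over_3_percent` flags the one whose half-life shifted beyond the 3 percent threshold, mirroring the two kinds of discrepancy the study tabulates.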

  14. Justification for, and design of, an economical programmable multiple flight simulator

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Wittenber, J.; Macdonald, G.

    1981-01-01

The considered research interests in air traffic control (ATC) studies revolve around the concept of distributed ATC management, based on the assumption that the pilot has a cockpit display of traffic and navigation information (CDTI) via CRT graphics. The basic premise is that a CDTI-equipped pilot can, in coordination with a controller, manage part of his local traffic situation, thereby improving important aspects of ATC performance. A modularly designed programmable flight simulator system is prototyped as a means of providing an economical facility of up to eight simulators that interface with a mainframe/graphics system for ATC experimentation, particularly CDTI-distributed management, in which pilot-pilot interaction can have a determining effect on system performance. The need for a multi-simulator facility is predicated on results from an earlier three-simulator facility.

  15. A report on the ST ScI optical disk workstation

    NASA Technical Reports Server (NTRS)

    1985-01-01

The STScI optical disk project was designed to explore the options, opportunities, and problems presented by optical disk technology, and to determine whether optical disks are a viable and inexpensive means of storing the large amounts of data found in astronomical digital imagery. A separate workstation was purchased on which the development could be done; it also serves as an astronomical image processing computer, incorporating the optical disks into the solution of standard image processing tasks. It is indicated that small workstations can be powerful tools for image processing, and that astronomical image processing may be more conveniently and cost-effectively performed on microcomputers than on mainframes and super-minicomputers. The optical disks provide unique capabilities in data storage.

  16. Arterial signal timing optimization using PASSER II-87

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, E.C.P.; Messer, C.J.; Garza, R.U.

    1988-11-01

PASSER is the acronym for the Progression Analysis and Signal System Evaluation Routine. PASSER II was originally developed by the Texas Transportation Institute (TTI) for the Dallas Corridor Project. The Texas State Department of Highways and Public Transportation (SDHPT) has sponsored the subsequent program development on both mainframe computers and microcomputers. The theory, model structure, methodology, and logic of PASSER II have been evaluated and well documented. PASSER II is widely used because of its ability to easily select multiple-phase sequences by adjusting the background cycle length and progression speeds to find the optimal timing plans, such as cycle, green split, phase sequence, and offsets, that can efficiently maximize the two-way progression bands.

  17. Modelling milk production from feed intake in dairy cattle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, D.L.

    1985-05-01

Predictive models were developed for both Holstein and Jersey cows. Since Holsteins comprised eighty-five percent of the data, the predictive models developed for Holsteins were used for the development of a user-friendly computer model. Predictive models included: milk production (squared multiple correlation .73), natural log (ln) of milk production (.73), four percent fat-corrected milk (.67), ln four percent fat-corrected milk (.68), fat-free milk (.73), ln fat-free milk (.73), dry matter intake (.61), ln dry matter intake (.60), milk fat (.52), and ln milk fat (.56). The predictive models for ln milk production, ln fat-free milk, and ln dry matter intake were incorporated into a computer model. The model was written in standard Fortran for use on mainframe or micro-computers. Daily milk production, fat-free milk production, and dry matter intake were predicted on a daily basis, with the previous day's dry matter intake serving as an independent variable in the prediction of the daily milk and fat-free milk production. 21 refs.
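The day-by-day structure described, a log-linear prediction driven by the previous day's dry matter intake, can be sketched as follows. The abstract reports fit quality but not the fitted coefficients, so the numbers below are placeholders for illustration only:

```python
import math

# Hypothetical coefficients -- the thesis's fitted values are not given in
# the abstract, so these are illustrative stand-ins.
B0, B_DMI = 2.0, 0.045

def ln_milk(prev_day_dmi_kg):
    """ln(daily milk) as a linear function of the previous day's dry matter
    intake, mirroring the model structure the abstract describes."""
    return B0 + B_DMI * prev_day_dmi_kg

def predict_daily_milk(dmi_series_kg):
    """Step forward one day at a time; each day's milk prediction is driven
    by the previous day's intake."""
    return [math.exp(ln_milk(d)) for d in dmi_series_kg]

milk = predict_daily_milk([18.0, 20.0, 22.0])
print([round(m, 1) for m in milk])
```

Exponentiating the ln-scale prediction recovers milk in the original units, which is why the ln models could be carried on mainframe or microcomputer Fortran with only elementary functions.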

  18. Two-dimensional nonsteady viscous flow simulation on the Navier-Stokes computer miniNode

    NASA Technical Reports Server (NTRS)

    Nosenchuck, Daniel M.; Littman, Michael G.; Flannery, William

    1986-01-01

    The needs of large-scale scientific computation are outpacing the growth in performance of mainframe supercomputers. In particular, problems in fluid mechanics involving complex flow simulations require far more speed and capacity than that provided by current and proposed Class VI supercomputers. To address this concern, the Navier-Stokes Computer (NSC) was developed. The NSC is a parallel-processing machine, comprised of individual Nodes, each comparable in performance to current supercomputers. The global architecture is that of a hypercube, and a 128-Node NSC has been designed. New architectural features, such as a reconfigurable many-function ALU pipeline and a multifunction memory-ALU switch, have provided the capability to efficiently implement a wide range of algorithms. Efficient algorithms typically involve numerically intensive tasks, which often include conditional operations. These operations may be efficiently implemented on the NSC without, in general, sacrificing vector-processing speed. To illustrate the architecture, programming, and several of the capabilities of the NSC, the simulation of two-dimensional, nonsteady viscous flows on a prototype Node, called the miniNode, is presented.

  19. Pre- and post-processing for Cosmic/NASTRAN on personal computers and mainframes

    NASA Technical Reports Server (NTRS)

    Kamel, H. A.; Mobley, A. V.; Nagaraj, B.; Watkins, K. W.

    1986-01-01

An interface between Cosmic/NASTRAN and GIFTS has recently been released, combining the powerful pre- and post-processing capabilities of GIFTS with Cosmic/NASTRAN's analysis capabilities. The interface operates on a wide range of computers, even linking Cosmic/NASTRAN and GIFTS when the two are on different computers. GIFTS offers a wide range of elements for use in model construction, each translated by the interface into the nearest Cosmic/NASTRAN equivalent; and the options of automatic or interactive modelling and loading in GIFTS make pre-processing easy and effective. The interface itself includes three programs: GFTCOS, which creates the Cosmic/NASTRAN input deck (and, if desired, control deck) from the GIFTS Unified Data Base; COSGFT, which translates the displacements from the Cosmic/NASTRAN analysis back into GIFTS; and HOSTR, which handles stress computations for a few higher-order elements available in the interface but not supported by the GIFTS processor STRESS. Finally, the versatile display options in GIFTS post-processing allow the user to examine the analysis results through an especially wide range of capabilities, including such possibilities as creating composite loading cases, plotting in color, and animating the analysis.

  20. Networked Instructional Chemistry: Using Technology To Teach Chemistry

    NASA Astrophysics Data System (ADS)

    Smith, Stanley; Stovall, Iris

    1996-10-01

Networked multimedia microcomputers provide new ways to help students learn chemistry and to help instructors manage the learning environment. This technology is used to replace some traditional laboratory work, collect on-line experimental data, enhance lectures and quiz sections with multimedia presentations, provide prelaboratory training for the beginning nonchemistry-major organic laboratory, provide electronic homework for organic chemistry students, give graduate students access to real NMR data for analysis, and provide access to molecular modeling tools. The integration of all of these activities into an active learning environment is made possible by a client-server network of hundreds of computers. This requires not only instructional software but also classroom and course management software, computers, networking, and room management. Combining computer-based work with traditional course material is made possible with software management tools that allow the instructor to monitor the progress of each student and make available an on-line gradebook so students can see their grades and class standing. This client-server based system extends the capabilities of the earlier mainframe-based PLATO system, which was used for instructional computing. This paper outlines the components of a technology center used to support over 5,000 students per semester.

  1. CLARET user's manual: Mainframe Logs. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frobose, R.H.

    1984-11-12

CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV language and can be run on a mainframe or a mini-computer system. An abbreviated version of the model is being developed for usage in conjunction with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, requirement of supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single stage systems as well as two stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.

  3. Multi-crop area estimation and mapping on a microprocessor/mainframe network

    NASA Technical Reports Server (NTRS)

    Sheffner, E.

    1985-01-01

    The data processing system is outlined for a 1985 test aimed at determining the performance characteristics of area estimation and mapping procedures connected with the California Cooperative Remote Sensing Project. The project is a joint effort of the USDA Statistical Reporting Service-Remote Sensing Branch, the California Department of Water Resources, NASA-Ames Research Center, and the University of California Remote Sensing Research Program. One objective of the program was to study performance when data processing is done on a microprocessor/mainframe network under operational conditions. The 1985 test covered the hardware, software, and network specifications and the integration of these three components. Plans for the year - including planned completion of PEDITOR software, testing of software on MIDAS, and accomplishment of data processing on the MIDAS-VAX-CRAY network - are discussed briefly.

  4. Normalizing the causality between time series.

    PubMed

    Liang, X San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
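The raw (unnormalized) rate of information flow underlying this abstract has a closed-form linear estimator from Liang's earlier maximum-likelihood derivation. A minimal sketch, assuming that formula; the 2015 normalization against the stretching-rate and noise terms described above is omitted, and the toy driven series is our own construction:

```python
import numpy as np

def info_flow_2_to_1(x1, x2, dt=1.0):
    """Sample estimate of the rate of information flow T(2->1) between two
    time series (linear maximum-likelihood form). A nonzero value indicates
    x2 -> x1 causality; this sketch omits the normalization step."""
    x1 = np.asarray(x1, float)
    x2 = np.asarray(x2, float)
    dx1 = (x1[1:] - x1[:-1]) / dt          # Euler forward difference of x1
    a, b = x1[:-1], x2[:-1]
    C = np.cov(a, b)
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d = np.mean((a - a.mean()) * (dx1 - dx1.mean()))
    c2d = np.mean((b - b.mean()) * (dx1 - dx1.mean()))
    return (c11 * c12 * c2d - c12**2 * c1d) / (c11**2 * c22 - c11 * c12**2)

# Toy check: x2 drives x1 with a one-step lag, but not vice versa.
rng = np.random.default_rng(1)
n = 20000
x2 = rng.standard_normal(n)
x1 = np.zeros(n)
for t in range(1, n):
    x1[t] = 0.6 * x1[t - 1] + 0.5 * x2[t - 1] + 0.1 * rng.standard_normal()
print(abs(info_flow_2_to_1(x1, x2)) > abs(info_flow_2_to_1(x2, x1)))
```

Because `x2` is white noise, the reverse flow estimate hovers near zero while the forward flow does not, which is the one-way asymmetry the IBM-to-GE finding illustrates.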

  5. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.

    1993-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.

  6. Normalizing the causality between time series

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  7. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.

    1995-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.

  8. Arterial Catheterization

    MedlinePlus

    ... and Their Families , ATS Website: www.thoracic.org/assemblies/cc/ccprimer/mainframe2.html Additional Information American Thoracic ... Have the ICU nurse show you how the line is bandaged and how it is watched to ...

  9. Quantifying the potential export flows of used electronic products in Macau: a case study of PCs.

    PubMed

    Yu, Danfeng; Song, Qingbin; Wang, Zhishi; Li, Jinhui; Duan, Huabo; Wang, Jinben; Wang, Chao; Wang, Xu

    2017-12-01

Used electronic products (UEPs) have attracted worldwide attention because part of the e-waste stream may be exported from developed countries to developing countries under the name of UEPs. On the basis of extensive foreign trade data for electronic products (e-products), this study adopted the trade data approach (TDA) to quantify the potential exports of UEPs in Macau, taking personal computers (PCs) as a case study. The results show that desktop mainframes, LCD monitors, and CRT monitors had more low-unit-value trades with higher trade volumes over the past 10 years, while laptop and tablet PCs, as the newer technologies, had higher ratios of high-unit-value trades. During the period 2005-2015, the total mean exports of used laptop and tablet PCs, desktop mainframes, and LCD monitors were approximately 18,592, 79,957, and 43,177 units, respectively, while the possible export volume of used CRT monitors was higher, up to 430,098 units in 2000-2010. Note that these potential export volumes could be lower bounds, because not all used PCs may be shipped under the PC trade code. For all four kinds of used PCs, the majority (61.6-98.82%) of the export volumes went to Hong Kong, followed by Mainland China and Taiwan. Since 2011 there have been no CRT monitor exports; however, the other kinds of used PC exports will likely continue in Macau. The outcomes are helpful for understanding and managing the current export situation of used products in Macau, and can also provide a reference for other countries and regions.

  10. The Electronic Hermit: Trends in Library Automation.

    ERIC Educational Resources Information Center

    LaRue, James

    1988-01-01

    Reviews trends in library software development including: (1) microcomputer applications; (2) CD-ROM; (3) desktop publishing; (4) public access microcomputers; (5) artificial intelligence; (6) mainframes and minicomputers; and (7) automated catalogs. (MES)

  11. Benchmarked analyses of gamma skyshine using MORSE-CGA-PC and the DABL69 cross-section set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reichert, P.T.; Golshani, M.

    1991-01-01

Design for gamma-ray skyshine is a common consideration for a variety of nuclear and accelerator facilities. Many of these designs can benefit from a more accurate and complete treatment than can be provided by simple skyshine analysis tools. Those methods typically require a number of conservative, simplifying assumptions in modeling the radiation source and shielding geometry. This paper considers the benchmarking of one analytical option. The MORSE-CGA Monte Carlo radiation transport code system provides the capability for detailed treatment of virtually any source and shielding geometry. Unfortunately, the mainframe computer costs of MORSE-CGA analyses can prevent cost-effective application to small projects. For this reason, the MORSE-CGA system was converted to run on IBM personal computer (PC)-compatible computers using the Intel 80386 or 80486 microprocessors. The DLC-130/DABL69 cross-section set (46n,23g) was chosen as the most suitable, readily available, broad-group library. The most important reason is the relatively high (P5) Legendre order of expansion for angular distribution. This is likely to be beneficial in the deep-penetration conditions modeled in some skyshine problems.
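A P5 library represents each group-to-group scattering angular distribution by its first six Legendre moments. The standard reconstruction can be sketched as follows; the sample moments are invented for illustration, not taken from DABL69:

```python
import numpy as np
from numpy.polynomial import legendre

def angular_pdf(mu, moments):
    """Reconstruct an angular distribution f(mu) from Legendre moments f_l,
    l = 0..L, using f(mu) = sum_l (2l + 1)/2 * f_l * P_l(mu).
    A P5 library carries moments through l = 5."""
    coeffs = [(2 * l + 1) / 2.0 * f for l, f in enumerate(moments)]
    return legendre.legval(mu, coeffs)

mu = np.linspace(-1.0, 1.0, 201)

# Isotropic scattering keeps only the l = 0 moment: f(mu) = 1/2 everywhere.
iso = angular_pdf(mu, [1.0])

# Higher moments (hypothetical values) skew scattering toward forward angles,
# the behavior that matters in deep-penetration skyshine transport.
fwd = angular_pdf(mu, [1.0, 0.6, 0.3, 0.12, 0.04, 0.01])
print(iso[0], fwd[-1] > fwd[0])
```

Truncating the expansion at a low order flattens strongly forward-peaked distributions, which is why the P5 order of DABL69 was the deciding factor cited above.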

  12. A data-management system for detailed areal interpretive data

    USGS Publications Warehouse

    Ferrigno, C.F.

    1986-01-01

    A data storage and retrieval system has been developed to organize and preserve areal interpretive data. This system can be used by any study where there is a need to store areal interpretive data that generally is presented in map form. This system provides the capability to grid areal interpretive data for input to groundwater flow models at any spacing and orientation. The data storage and retrieval system is designed to be used for studies that cover small areas such as counties. The system is built around a hierarchically structured data base consisting of related latitude-longitude blocks. The information in the data base can be stored at different levels of detail, with the finest detail being a block of 6 sec of latitude by 6 sec of longitude (approximately 0.01 sq mi). This system was implemented on a mainframe computer using a hierarchical data base management system. The computer programs are written in Fortran IV and PL/1. The design and capabilities of the data storage and retrieval system, and the computer programs that are used to implement the system are described. Supplemental sections contain the data dictionary, user documentation of the data-system software, changes that would need to be made to use this system for other studies, and information on the computer software tape. (Lantz-PTT)
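The finest level of the hierarchy, a 6-arc-second by 6-arc-second block, implies a simple integer indexing of coordinates. A sketch of one possible block key, hypothetical since the report's actual database key layout is not given here:

```python
from math import floor

def block_key(lat_deg, lon_deg, block_sec=6):
    """Index a coordinate into its storage block of the hierarchy's finest
    level: 6 sec of latitude by 6 sec of longitude (about 0.01 sq mi).
    This key layout is illustrative, not the report's actual format."""
    per_degree = 3600 // block_sec        # 600 six-second blocks per degree
    return floor(lat_deg * per_degree), floor(lon_deg * per_degree)

# Two points 3 arc-seconds apart in latitude share one 6-second block.
print(block_key(38.0, -77.0) == block_key(38.0 + 3 / 3600, -77.0))
```

Coarser levels of such a hierarchy fall out of the same arithmetic by using a larger `block_sec`, so a parent block's key is recoverable from any child's coordinates.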

  13. Finite Element Analysis (FEA) in Design and Production.

    ERIC Educational Resources Information Center

    Waggoner, Todd C.; And Others

    1995-01-01

    Finite element analysis (FEA) enables industrial designers to analyze complex components by dividing them into smaller elements, then assessing stress and strain characteristics. Traditionally mainframe based, FEA is being increasingly used in microcomputers. (SK)

  14. Security in Full-Force

    NASA Technical Reports Server (NTRS)

    2002-01-01

When fully developed for NASA, Vanguard Enforcer(TM) software, which emulates the activities of highly technical security system programmers, auditors, and administrators, was among the first intrusion detection programs to keep human errors from affecting security, and to ensure the integrity of a computer's operating systems as well as the protection of mission-critical resources. Vanguard Enforcer was delivered in 1991 to Johnson Space Center and has been protecting systems and critical data there ever since. In August of 1999, NASA granted Vanguard exclusive rights to commercialize the Enforcer system for the private sector. In return, Vanguard continues to supply NASA with ongoing research, development, and support of Enforcer. The Vanguard Enforcer 4.2 is one of several surveillance technologies that make up the Vanguard Security Solutions line of products. Using a mainframe environment, Enforcer 4.2 achieves previously unattainable levels of automated security management.

  15. Functional requirements document for NASA/MSFC Earth Science and Applications Division: Data and information system (ESAD-DIS). Interoperability, 1992

    NASA Technical Reports Server (NTRS)

    Stephens, J. Briscoe; Grider, Gary W.

    1992-01-01

These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications among personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements, both for operational support and for management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.

  16. Impact of workstations on criticality analyses at ABB combustion engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarko, L.B.; Freeman, R.S.; O'Donnell, P.F.

    1993-01-01

During 1991, ABB Combustion Engineering (ABB C-E) made the transition from a CDC Cyber 990 mainframe to Hewlett Packard (HP)/Apollo workstations for nuclear criticality safety analyses. The primary motivation for this change was the improved economics of the workstation and maintaining state-of-the-art technology. The Cyber 990 utilized the NOS operating system with a 60-bit word size. The CPU memory size was limited to 131,100 words of directly addressable memory, with an extended 250,000 words available. The Apollo workstation environment at ABB consists of HP/Apollo-9000/400 series desktop units used by most application engineers, networked with HP/Apollo DN10000 platforms that use a 32-bit word size and function as the computer servers and network administrative CPUs, providing a virtual memory system.

  17. The third level trigger and output event unit of the UA1 data-acquisition system

    NASA Astrophysics Data System (ADS)

    Cittolin, S.; Demoulin, M.; Fucci, A.; Haynes, W.; Martin, B.; Porte, J. P.; Sphicas, P.

    1989-12-01

The upgraded UA1 experiment utilizes twelve 3081/E emulators for its third-level trigger system. The system is interfaced to VME, and is controlled by 68000-microprocessor VME boards on the input and output. The output controller communicates with an IBM 9375 mainframe via the CERN-IBM-developed VICI interface. The events selected by the emulators are output on IBM-3480 cassettes. The user interface to this system is based on a series of Macintosh personal computers connected to the VME bus. These Macs are also used for developing software for the emulators and for monitoring the entire system. The same configuration has also been used for offline event reconstruction. A description of the system, together with details of both the online and offline modes of operation and an evaluation of its performance, is presented.

  18. Avoid Disaster: Use Firewalls for Inter-Intranet Security.

    ERIC Educational Resources Information Center

    Charnetski, J. R.

    1998-01-01

    Discusses the use of firewalls for library intranets, highlighting the move from mainframes to PCs, security issues and firewall architecture, and operating systems. Provides a glossary of basic networking terms and a bibliography of suggested reading. (PEN)

  19. Developing a Telecommunications Curriculum for Students with Physical Disabilities.

    ERIC Educational Resources Information Center

    Gandell, Terry S.; Laufer, Dorothy

    1993-01-01

    A telecommunications curriculum was developed for students (ages 15-21) with physical disabilities. Curriculum content included an internal mailbox program (Mailbox), interactive communication system (Blisscom), bulletin board system (Arctel), and a mainframe system (Compuserv). (JDD)

  20. NASA Lewis steady-state heat pipe code users manual

    NASA Technical Reports Server (NTRS)

    Tower, Leonard K.; Baker, Karl W.; Marks, Timothy S.

    1992-01-01

    The NASA Lewis heat pipe code was developed to predict the performance of heat pipes in the steady state. The code can be used as a design tool on a personal computer or with a suitable calling routine, as a subroutine for a mainframe radiator code. A variety of wick structures, including a user input option, can be used. Heat pipes with multiple evaporators, condensers, and adiabatic sections in series and with wick structures that differ among sections can be modeled. Several working fluids can be chosen, including potassium, sodium, and lithium, for which monomer-dimer equilibrium is considered. The code incorporates a vapor flow algorithm that treats compressibility and axially varying heat input. This code facilitates the determination of heat pipe operating temperatures and heat pipe limits that may be encountered at the specified heat input and environment temperature. Data are input to the computer through a user-interactive input subroutine. Output, such as liquid and vapor pressures and temperatures, is printed at equally spaced axial positions along the pipe as determined by the user.

  1. NASA Lewis steady-state heat pipe code users manual

    NASA Astrophysics Data System (ADS)

    Tower, Leonard K.; Baker, Karl W.; Marks, Timothy S.

    1992-06-01

    The NASA Lewis heat pipe code was developed to predict the performance of heat pipes in the steady state. The code can be used as a design tool on a personal computer or with a suitable calling routine, as a subroutine for a mainframe radiator code. A variety of wick structures, including a user input option, can be used. Heat pipes with multiple evaporators, condensers, and adiabatic sections in series and with wick structures that differ among sections can be modeled. Several working fluids can be chosen, including potassium, sodium, and lithium, for which monomer-dimer equilibrium is considered. The code incorporates a vapor flow algorithm that treats compressibility and axially varying heat input. This code facilitates the determination of heat pipe operating temperatures and heat pipe limits that may be encountered at the specified heat input and environment temperature. Data are input to the computer through a user-interactive input subroutine. Output, such as liquid and vapor pressures and temperatures, is printed at equally spaced axial positions along the pipe as determined by the user.

  2. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  3. Maxdose-SR and popdose-SR routine release atmospheric dose models used at SRS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannik, G. T.; Trimor, P. P.

MAXDOSE-SR and POPDOSE-SR are used to calculate dose to the offsite Reference Person and to the surrounding Savannah River Site (SRS) population, respectively, following routine releases of atmospheric radioactivity. These models are currently accessed through the Dose Model Version 2014 graphical user interface (GUI). MAXDOSE-SR and POPDOSE-SR are personal computer (PC) versions of MAXIGASP and POPGASP, which both resided on the SRS IBM mainframe. These two codes follow U.S. Nuclear Regulatory Commission (USNRC) Regulatory Guides 1.109 and 1.111 (1977a, 1977b). The bases for MAXDOSE-SR and POPDOSE-SR are the USNRC-developed codes XOQDOQ (Sagendorf et al. 1982) and GASPAR (Eckerman et al. 1980). Both of these codes have previously been verified for use at SRS (Simpkins 1999 and 2000). The revisions incorporated into MAXDOSE-SR and POPDOSE-SR Version 2014 (hereafter referred to as MAXDOSE-SR and POPDOSE-SR unless otherwise noted) were made per Computer Program Modification Tracker (CPMT) number Q-CMT-A-00016 (Appendix D). Version 2014 was verified for use at SRS in Dixon (2014).

  4. From the genetic to the computer program: the historicity of 'data' and 'computation' in the investigations on the nematode worm C. elegans (1963-1998).

    PubMed

    García-Sancho, Miguel

    2012-03-01

    This paper argues that the history of the computer, of the practice of computation and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers, which transformed the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology (in-house and easily programmable minicomputers) which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Video Information Communication and Retrieval/Image Based Information System (VICAR/IBIS)

    NASA Technical Reports Server (NTRS)

    Wherry, D. B.

    1981-01-01

    The acquisition, operation, and planning stages of installing a VICAR/IBIS system are described. The system operates in an IBM mainframe environment, and provides image processing of raster data. System support problems with software and documentation are discussed.

  6. SIMOS feasibility report, task 4 : sign inventory management and ordering system

    DOT National Transportation Integrated Search

    1997-12-01

    The Sign Inventory Management and Ordering System (SIMOS) design is a merger of existing manually maintained information management systems married to PennDOT's GIS and department-wide mainframe database to form a logical connection for enhanced sign...

  7. Evaluation of the finite element fuel rod analysis code (FRANCO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K.; Feltus, M.A.

    1994-12-31

    Knowledge of temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing temperature distribution and mechanical deformation of a single light water reactor fuel rod.

  8. Man-machine interfaces in LACIE/ERIPS

    NASA Technical Reports Server (NTRS)

    Duprey, B. B. (Principal Investigator)

    1979-01-01

    One of the most important aspects of the interactive portion of the LACIE/ERIPS software system is the way in which the analysis and decision-making capabilities of a human being are integrated with the speed and accuracy of a computer to produce a powerful analysis system. The three major man-machine interfaces in the system are (1) the use of menus for communications between the software and the interactive user; (2) the checkpoint/restart facility to recreate in one job the internal environment achieved in an earlier one; and (3) the error recovery capability for errors that would normally cause job termination. This interactive system, which executes on an IBM 360/75 mainframe, was adapted for use in noninteractive (batch) mode. A case study is presented to show how the interfaces work in practice by defining some fields based on an image screen display, noting the field definitions, and obtaining a film product of the classification map.

  9. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-01-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normalpointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normalpointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. The lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  10. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Astrophysics Data System (ADS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-06-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normalpointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normalpointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. The lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  11. Jackson State University's Center for Spatial Data Research and Applications: New facilities and new paradigms

    NASA Technical Reports Server (NTRS)

    Davis, Bruce E.; Elliot, Gregory

    1989-01-01

    Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but is one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.

  12. WATEQ4F - a personal computer Fortran translation of the geochemical model WATEQ2 with revised data base

    USGS Publications Warehouse

    Ball, J.W.; Nordstrom, D. Kirk; Zachmann, D.W.

    1987-01-01

    A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available, one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE also have been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)

  13. Computer Supported Indexing: A History and Evaluation of NASA's MAI System

    NASA Technical Reports Server (NTRS)

    Silvester, June P.

    1997-01-01

    Computer supported or machine aided indexing (MAI) can be categorized in multiple ways. The system used by the National Aeronautics and Space Administration's (NASA's) Center for AeroSpace Information (CASI) is described as semantic and computational. It is based on the co-occurrence of domain-specific terminology in parts of a sentence, and the probability that an indexer will assign a particular index term when a given word or phrase is encountered in text. The NASA CASI system is run on demand by the indexer and responds in 3 to 9 seconds with a list of suggested, authorized terms. The system was originally based on a syntactic system used in the late 1970s by the Defense Technical Information Center (DTIC). The NASA mainframe-supported system consists of three components: two programs and a knowledge base (KB). The evolution of the system is described and flow charts illustrate the MAI procedures. Tests used to evaluate NASA's MAI system were limited to those that would not slow production. A very early test indicated that MAI saved about 3 minutes and provided several additional terms for each document indexed. It also was determined that time and other resources spent in careful construction of the KB pay off with high-quality output and indexer acceptance of MAI results.

  14. A PC-based multispectral scanner data evaluation workstation: Application to Daedalus scanners

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; James, Mark W.; Smith, Matthew R.; Atkinson, Robert J.

    1991-01-01

    In late 1989, a personal computer (PC)-based data evaluation workstation was developed to support post flight processing of Multispectral Atmospheric Mapping Sensor (MAMS) data. The MAMS Quick View System (QVS) is an image analysis and display system designed to provide the capability to evaluate Daedalus scanner data immediately after an aircraft flight. Even in its original form, the QVS offered the portability of a personal computer with the advanced analysis and display features of a mainframe image analysis system. It was recognized, however, that the original QVS had its limitations, both in speed and processing of MAMS data. Recent efforts are presented that focus on overcoming earlier limitations and adapting the system to a new data tape structure. In doing so, the enhanced Quick View System (QVS2) will accommodate data from any of the four spectrometers used with the Daedalus scanner on the NASA ER2 platform. The QVS2 is designed around the AST 486/33 MHz CPU personal computer and comes with 10 EISA expansion slots, keyboard, and 4.0 mbytes of memory. Specialized PC-McIDAS software provides the main image analysis and display capability for the system. Image analysis and display of the digital scanner data is accomplished with PC-McIDAS software.

  15. Market research for Idaho Transportation Department linear referencing system.

    DOT National Transportation Integrated Search

    2009-09-02

    For over 30 years, the Idaho Transportation Department (ITD) has had an LRS called MACS : (MilePoint And Coded Segment), which is being implemented on a mainframe using a : COBOL/CICS platform. As ITD began embracing newer technologies and moving tow...

  16. Design and implementation of scalable tape archiver

    NASA Technical Reports Server (NTRS)

    Nemoto, Toshihiro; Kitsuregawa, Masaru; Takagi, Mikio

    1996-01-01

    In order to reduce costs, computer manufacturers try to use commodity parts as much as possible. Mainframes using proprietary processors are being replaced by high performance RISC microprocessor-based workstations, which are further being replaced by the commodity microprocessors used in personal computers. Highly reliable disks for mainframes are also being replaced by disk arrays, which are complexes of disk drives. In this paper we try to clarify the feasibility of a large scale tertiary storage system composed of 8-mm tape archivers utilizing robotics. In the near future, the 8-mm tape archiver will be widely used and become a commodity part, since the recent rapid growth of multimedia applications requires much larger storage than disk drives can provide. We designed a scalable tape archiver which connects as many 8-mm tape archivers (element archivers) as possible. In the scalable archiver, robotics can exchange a cassette tape between two adjacent element archivers mechanically. Thus, we can build a large scalable archiver inexpensively. In addition, a sophisticated migration mechanism distributes frequently accessed tapes (hot tapes) evenly among all of the element archivers, which improves the throughput considerably. Even with the failure of some tape drives, the system dynamically redistributes hot tapes to the other element archivers which have live tape drives. Several kinds of specially tailored huge archivers are on the market; however, the 8-mm tape scalable archiver could replace them. To maintain high performance in spite of high access locality when a large number of archivers are attached to the scalable archiver, it is necessary to scatter frequently accessed cassettes among the element archivers and to use the tape drives efficiently. For this purpose, we introduce two cassette migration algorithms, foreground migration and background migration. Background migration transfers cassettes between element archivers to redistribute frequently accessed cassettes, thus balancing the load of each archiver. Background migration occurs while the robotics are idle. Both migration algorithms are based on the access frequency and space utility of each element archiver. Because these parameters are normalized according to the number of drives in each element archiver, it is possible to maintain high performance even if some tape drives fail. We found that the foreground migration is efficient at reducing access response time. Besides the foreground migration, the background migration makes it possible to track the transition of spatial access locality quickly.
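    The load-balancing idea described above can be sketched in a few lines. The data layout (`access_freq`, `live_drives`, `hot_tapes`) and the greedy heuristic below are illustrative assumptions, not the authors' actual algorithm; the sketch only shows how normalizing load by the number of live drives lets hot cassettes flow away from element archivers with failed drives.

```python
# Illustrative sketch of background migration: redistribute frequently
# accessed ("hot") cassettes so that each element archiver's access
# frequency, normalized by its live drive count, is balanced.
# All field names and the heuristic are hypothetical.

def normalized_load(archiver):
    """Access frequency per live drive; an archiver with dead drives
    should hold proportionally fewer hot cassettes."""
    if archiver["live_drives"] == 0:
        return float("inf")
    return archiver["access_freq"] / archiver["live_drives"]

def plan_background_migrations(archivers, max_moves=10):
    """Greedily move one hot cassette at a time from the most loaded
    archiver to the least loaded one, while the move still helps."""
    moves = []
    for _ in range(max_moves):
        src = max(archivers, key=normalized_load)
        dst = min(archivers, key=normalized_load)
        if src is dst or not src["hot_tapes"]:
            break
        tape = min(src["hot_tapes"], key=lambda t: t["freq"])
        # Move only if it actually lowers the peak normalized load.
        if (dst["access_freq"] + tape["freq"]) / dst["live_drives"] >= normalized_load(src):
            break
        src["hot_tapes"].remove(tape)
        src["access_freq"] -= tape["freq"]
        dst["hot_tapes"].append(tape)
        dst["access_freq"] += tape["freq"]
        moves.append((src["id"], dst["id"], tape["id"]))
    return moves
```

    A real implementation would also have to respect the adjacency constraint (cassettes move mechanically only between neighboring element archivers) and fold in space utility, both of which this sketch omits.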

  17. Web-Enabled Systems for Student Access.

    ERIC Educational Resources Information Center

    Harris, Chad S.; Herring, Tom

    1999-01-01

    California State University, Fullerton is developing a suite of server-based, Web-enabled applications that distribute the functionality of its student information system software to external customers without modifying the mainframe applications or databases. The cost-effective, secure, and rapidly deployable business solution involves using the…

  18. Beyond Information Retrieval: Ways To Provide Content in Context.

    ERIC Educational Resources Information Center

    Wiley, Deborah Lynne

    1998-01-01

    Provides an overview of information retrieval from mainframe systems to Web search engines; discusses collaborative filtering, data extraction, data visualization, agent technology, pattern recognition, classification and clustering, and virtual communities. Argues that rather than huge data-storage centers and proprietary software, we need…

  19. Installing an Integrated System and a Fourth-Generation Language.

    ERIC Educational Resources Information Center

    Ridenour, David; Ferguson, Linda

    1987-01-01

    In the spring of 1986, Indiana State University converted to Information Associates' Series Z software on an IBM mainframe and Information Builders' FOCUS fourth-generation language. The process from the beginning of the planning stage through product selection, training, and implementation is described. (Author/MLW)

  20. What Lies Beyond the Online Catalog?

    ERIC Educational Resources Information Center

    Matthews, Joseph R.; And Others

    1985-01-01

    Five prominent consultants project technological advancements that, in some cases, will enhance current library systems, and in many cases will cause them to become obsolete. Major trends include advances in mainframe and microcomputing technology, development of inexpensive local area networks and telecommunications gateways, and the advent of…

  1. Hardware Support for Malware Defense and End-to-End Trust

    DTIC Science & Technology

    2017-02-01

    Internet of Things (IoT) sensors and actuators, mobile devices and servers; cloud-based, stand-alone, and traditional mainframes. The prototype developed demonstrated...virtual machines. For mobile platforms we developed and prototyped an architecture supporting separation of personalities on the same platform...

  2. Curriculum Development through YTS Modular Credit Accumulation.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    This document reports the evaluation of the collaboratively developed Modular Training Framework (MainFrame), a British curriculum development project built around a commitment to a competency-based, modular credit accumulation program. The collaborators were three local education authorities (LEAs), those of Bedfordshire, Haringey, and Sheffield,…

  3. Kodak Optical Disk and Microfilm Technologies Carve Niches in Specific Applications.

    ERIC Educational Resources Information Center

    Gallenberger, John; Batterton, John

    1989-01-01

    Describes the Eastman Kodak Company's microfilm and optical disk technologies and their applications. Topics discussed include WORM technology; retrieval needs and cost effective archival storage needs; engineering applications; jukeboxes; optical storage options; systems for use with mainframes and microcomputers; and possible future…

  4. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  5. Analysis of differential and active charging phenomena on ATS-5 and ATS-6

    NASA Technical Reports Server (NTRS)

    Olsen, R. C.; Whipple, E. C., Jr.

    1980-01-01

    Spacecraft charging was studied in the differential charging and artificial particle emission experiments on ATS 5 and ATS 6. Differential charging of spacecraft surfaces generated large electrostatic barriers to spacecraft-generated electrons from photoemission, secondary emission, and thermal emitters. The electron emitter could partially or totally discharge the satellite, but the mainframe recharged negatively in a few tens of seconds. The time dependence of the charging behavior was explained by the relatively large capacitance for differential charging in comparison to the small spacecraft-to-space capacitance. A daylight charging event on ATS 6 was shown to have a charging behavior suggesting the dominance of differential charging on the absolute potential of the mainframe. Ion engine operations and plasma emission experiments on ATS 6 were shown to be an effective means of controlling the spacecraft potential in eclipse and sunlight. Eliminating barrier effects around the detectors and improving the quality of the particle data are discussed.

  6. Effect of Joule heating and current crowding on electromigration in mobile technology

    NASA Astrophysics Data System (ADS)

    Tu, K. N.; Liu, Yingxia; Li, Menglu

    2017-03-01

    In the present era of big data and internet of things, the use of microelectronic products in all aspects of our life is manifested by the ubiquitous presence of mobile devices such as iPhones and wearable i-products. These devices are facing the need for higher power and greater functionality in applications such as i-health, yet they are limited by physical size. At the moment, software (Apps) is much ahead of hardware in mobile technology. To advance hardware, the end of Moore's law in two-dimensional integrated circuits can be extended by three-dimensional integrated circuits (3D ICs). The concept of 3D ICs has been with us for more than ten years. The challenge in 3D IC technology is dense packing by using both vertical and horizontal interconnections. Mass production of 3D IC devices is behind schedule due to cost, stemming from low yield and uncertain reliability. Joule heating is serious in a dense structure because of heat generation and dissipation. A change of reliability paradigm has advanced from failure at a specific circuit component to failure at a system level weak-link. Currently, the electronic industry is introducing 3D IC devices in mainframe computers, where cost is not an issue, for the purpose of collecting field data of failure, especially the effect of Joule heating and current crowding on electromigration. This review will concentrate on the positive feedback between Joule heating and electromigration, resulting in an accelerated system level weak-link failure. A new driving force of electromigration, the electric potential gradient force due to current crowding, will be reviewed critically. The induced failure tends to occur in the low current density region.

  7. An overview of the NASA electronic components information management system

    NASA Technical Reports Server (NTRS)

    Kramer, G.; Waterbury, S.

    1991-01-01

    The NASA Parts Project Office (NPPO) comprehensive data system to support all NASA Electric, Electronic, and Electromechanical (EEE) parts management and technical data requirements is described. A phase delivery approach is adopted, comprising four principal phases. Phases 1 and 2 support Space Station Freedom (SSF) and use a centralized architecture with all data and processing kept on a mainframe computer. Phases 3 and 4 support all NASA centers and projects and implement a distributed system architecture, in which data and processing are shared among networked database servers. The Phase 1 system, which became operational in February of 1990, implements a core set of functions. Phase 2, scheduled for release in 1991, adds functions to the Phase 1 system. Phase 3, to be prototyped beginning in 1991 and delivered in 1992, introduces a distributed system, separate from the Phase 1 and 2 system, with a refined semantic data model. Phase 4 extends the data model and functionality of the Phase 3 system to provide support for the NASA design community, including integration with Computer Aided Design (CAD) environments. Phase 4 is scheduled for prototyping in 1992 to 93 and delivery in 1994.

  8. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
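    The fine-grained "unit" testing style discussed above is easy to illustrate on a small numerical kernel. The centered-difference routine and tolerances below are illustrative assumptions, not code from any climate model; the point is that properties such as accuracy and convergence order can be checked in isolation, in milliseconds, without a full simulation run.

```python
# Sketch: unit tests that pin down properties of one numerical kernel
# in isolation, rather than diffing output of a whole model run.
import math

def centered_diff(f, x, h=1e-5):
    """Second-order centered finite difference approximating df/dx."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def test_matches_known_derivative():
    # Isolates one property: d/dx sin(x) = cos(x).
    err = abs(centered_diff(math.sin, 0.3) - math.cos(0.3))
    assert err < 1e-9, err

def test_convergence_order():
    # Halving h should cut the error roughly 4x for a second-order scheme.
    e1 = abs(centered_diff(math.exp, 1.0, 1e-3) - math.exp(1.0))
    e2 = abs(centered_diff(math.exp, 1.0, 5e-4) - math.exp(1.0))
    assert 3.0 < e1 / e2 < 5.0, (e1, e2)
```

    A system-level regression test would only reveal that some output field changed; tests like these localize the defect to a single routine and state the numerical property that was violated.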

  9. The Navy/NASA Engine Program (NNEP89): A user's manual

    NASA Technical Reports Server (NTRS)

    Plencner, Robert M.; Snyder, Christopher A.

    1991-01-01

    An engine simulation computer code called NNEP89 was written to perform 1-D steady state thermodynamic analysis of turbine engine cycles. By using a very flexible method of input, a set of standard components are connected at execution time to simulate almost any turbine engine configuration that the user could imagine. The code was used to simulate a wide range of engine cycles from turboshafts and turboprops to air turborockets and supersonic cruise variable cycle engines. Off design performance is calculated through the use of component performance maps. A chemical equilibrium model is incorporated to adequately predict chemical dissociation as well as model virtually any fuel. NNEP89 is written in standard FORTRAN77 with clear structured programming and extensive internal documentation. The standard FORTRAN77 programming allows it to be installed onto most mainframe computers and workstations without modification. The NNEP89 code was derived from the Navy/NASA Engine program (NNEP). NNEP89 provides many improvements and enhancements to the original NNEP code and incorporates features which make it easier to use for the novice user. This is a comprehensive user's guide for the NNEP89 code.

  10. Exhaustive Versus Randomized Searches for Nonlinear Optimization in 21st Century Computing: Solar Application

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; AliShaykhian, Gholam

    2010-01-01

    We present a simple multi-dimensional exhaustive search method to obtain, in a reasonable time, the optimal solution of a nonlinear programming problem. It is more relevant in the present day non-mainframe computing scenario, where an estimated 95% of computing resources remains unutilized and computing speed touches petaflops. The processor speed is doubling every 18 months, the bandwidth every 12 months, and the hard disk space every 9 months. A randomized search algorithm or, equivalently, an evolutionary search method is often used instead of an exhaustive search algorithm. The reason is that a randomized approach is usually polynomial-time, i.e., fast, while an exhaustive search method is exponential-time, i.e., slow. We discuss the increasing importance of exhaustive search in optimization with the steady increase of computing power for solving many real-world problems of reasonable size. We also discuss the computational error and complexity of the search algorithm, focusing on the fact that no measuring device can usually measure a quantity with an accuracy greater than 0.005%. We stress the fact that the quality of solution of the exhaustive search - a deterministic method - is better than that of randomized search. In the 21st century computing environment, exhaustive search cannot be set aside as untouchable, and it is not always exponential. We also describe a possible application of these algorithms in improving the efficiency of solar cells - a topic of pressing interest in the current energy crisis. These algorithms could be excellent tools in the hands of experimentalists and could not only save a large amount of time needed for experiments but also validate the theory against experimental results quickly.
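    The contrast drawn above can be made concrete with a toy objective. The function, bounds, and step size below are illustrative assumptions, not the paper's solar-cell application; note that the exhaustive scan is deterministic and its error is bounded by the grid step, while the randomized result varies from sample to sample.

```python
# Sketch: exhaustive grid search versus randomized search on a simple
# two-dimensional nonlinear objective with its minimum at (1, -2).
import itertools
import random

def f(x, y):
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

def exhaustive_search(lo, hi, step):
    """Deterministic grid scan: cost grows as (range/step)**dims,
    but the result is reproducible and its error is bounded by step."""
    grid = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    return min(itertools.product(grid, grid), key=lambda p: f(*p))

def randomized_search(lo, hi, n, seed=0):
    """Random sampling: usually cheap, but solution quality varies."""
    rng = random.Random(seed)
    pts = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(n)]
    return min(pts, key=lambda p: f(*p))
```

    With bounds [-5, 5] and step 0.5 the grid here has only 441 points, so the exhaustive scan is instant; the paper's argument is that growing computing power keeps pushing up the problem size for which such a scan remains affordable.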

  11. 32 CFR 1700.6 - Fees for records services.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Fees for records services. 1700.6 Section 1700.6 National Defense Other Regulations Relating to National Defense OFFICE OF THE DIRECTOR OF NATIONAL... CD (recordable) Each 20.00 Telecommunications Per minute .50 Paper (mainframe printer) Per page .10...

  12. 32 CFR 1700.6 - Fees for records services.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Fees for records services. 1700.6 Section 1700.6 National Defense Other Regulations Relating to National Defense OFFICE OF THE DIRECTOR OF NATIONAL... CD (recordable) Each 20.00 Telecommunications Per minute .50 Paper (mainframe printer) Per page .10...

  13. Implementation of Parallel Computing Technology to Vortex Flow

    NASA Technical Reports Server (NTRS)

    Dacles-Mariani, Jennifer

    1999-01-01

    Mainframe supercomputers such as the Cray C90 were invaluable in obtaining large scale computations using several million grid points to resolve salient features of a tip vortex flow over a lifting wing. However, real flight configurations require tracking not only the flow over several lifting wings but also its growth and decay in the near- and intermediate-wake regions, not to mention the interaction of these vortices with each other. Resolving and tracking the evolution and interaction of these vortices shed from complex bodies is computationally intensive. Parallel computing technology is an attractive option in solving these flows. In planetary science, vortical flows are also important in studying how planets and protoplanets form when cosmic dust and gases become gravitationally unstable and eventually form planets or protoplanets. The current paradigm for the formation of planetary systems maintains that the planets accreted from the nebula of gas and dust left over from the formation of the Sun. Traditional theory also indicates that such a preplanetary nebula took the form of a flattened disk. The coagulation of dust led to the settling of aggregates toward the midplane of the disk, where they grew further into asteroid-like planetesimals. Some of the issues still remaining in this process are the onset of gravitational instability, the role of turbulence in the damping of particles, and radial effects. In this study the focus will be on the role of turbulence and the radial effects.

  14. Commercial space development needs cheap launchers

    NASA Astrophysics Data System (ADS)

    Benson, James William

    1998-01-01

    SpaceDev is in the market for a deep space launch, and we are not going to pay $50 million for it. There is an ongoing debate about the elasticity of demand related to launch costs. On the one hand there are the ``big iron'' NASA and DoD contractors who say that there is no market for small or inexpensive launchers, that lowering launch costs will not result in significantly more launches, and that the current uncompetitive pricing scheme is appropriate. On the other hand are commercial companies which compete in the real world, and who say that there would be innumerable new launches if prices were to drop dramatically. I participated directly in the microcomputer revolution, and saw first hand what happened to the big iron computer companies who failed to see or heed the handwriting on the wall. We are at the same stage in the space access revolution that personal computers were in the late '70s and early '80s. The global economy is about to be changed in ways that are just as unpredictable as those changes wrought after the introduction of the personal computer. Companies which fail to innovate and keep producing only big iron will suffer the same fate as IBM and all the now-extinct mainframe and minicomputer companies. A few will remain, but with a small share of the market, never again to be in a position to dominate.

  15. 76 FR 6839 - ActiveCore Technologies, Inc., Battery Technologies, Inc., China Media1 Corp., Dura Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] ActiveCore Technologies, Inc., Battery Technologies, Inc., China Media1 Corp., Dura Products International, Inc. (n/k/a Dexx Corp.), Global Mainframe... Battery Technologies, Inc. because it has not filed any periodic reports since the period ended December...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao

    Sparx, a new environment for Cryo-EM image processing; Cryo-EM, single-particle reconstruction, principal component analysis. Hardware requirements: PC, Mac, supercomputer, mainframe, multiplatform, workstation. Software requirements: Unix operating system; C++ compiler; file types: source code, object library, executable modules, compilation instructions, sample problem input data. Location/transmission: http://sparx-em.org; user manual & paper: http://sparx-em.org

  17. Spectral analysis of airflow sounds in patent versus occluded tracheostomy tubes: a pilot study in tracheostomized adult patients.

    PubMed

    Rao, A J; Niwa, H; Watanabe, Y; Fukuta, S; Yanagita, N

    1990-05-01

    Cannula occlusion is a life-threatening postoperative complication of tracheostomy. Current management relies largely on nursing care to prevent fatalities, because no proven mechanical, machine-based monitoring exists. The objective of this paper was to address the problem of monitoring the state of cannula patency, based on analysis of airflow acoustic spectral patterns in tracheostomized adult patients with patent and partially occluded cannulae. Tracheal airflow sounds were picked up via a condenser microphone air-coupled to the skin just below the tracheal stoma. The microphone signal was amplified, high-pass filtered, recorded on digital tape, and analyzed on a mainframe computer. Whereas airflow frequencies for patent cannulae were predominantly low-pitched (0.1 to 0.3 kHz), occluded tubes showed discrete high-pitched spectral peaks (1.3 to 1.6 kHz). These results suggest that frequency analysis of airflow sounds can identify a change in the status of cannula patency.
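
    The discrimination described above rests on locating the dominant spectral peak and checking which band it falls in. A minimal sketch of that idea (not the authors' actual pipeline; the sampling rate, window length, and test tones below are assumptions for illustration):

```python
import cmath
import math

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest DFT magnitude peak.

    Naive O(N^2) DFT; fine for short illustrative signals.
    """
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):          # skip DC, stop below Nyquist
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * fs / n

fs, n = 4096, 512                        # assumed sampling rate / window
# Stand-ins for airflow sounds: a low-pitched and a high-pitched tone.
patent = [math.sin(2 * math.pi * 200 * t / fs) for t in range(n)]     # 0.2 kHz
occluded = [math.sin(2 * math.pi * 1504 * t / fs) for t in range(n)]  # ~1.5 kHz

print(dominant_frequency(patent, fs))    # low-pitched peak
print(dominant_frequency(occluded, fs))  # high-pitched peak
```

    A monitor built on this idea would flag a shift of the dominant peak from the low band into the high band as a possible occlusion.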

  18. Databank Software for the 1990s and Beyond--Part 1: The User's Wish List.

    ERIC Educational Resources Information Center

    Basch, Reva

    1990-01-01

    Describes desired software enhancements identified by the Southern California Online Users Group in the areas of search language, database selection, document retrieval and display, user interface, customer support, and cost and economic issues. The need to prioritize these wishes and to determine whether features should reside in the mainframe or…

  19. Checking the Goldbach conjecture up to 4·10^11

    NASA Astrophysics Data System (ADS)

    Sinisalo, Matti K.

    1993-10-01

    One of the most studied problems in additive number theory, Goldbach's conjecture, states that every even integer greater than or equal to 4 can be expressed as a sum of two primes. This paper reports the checking of this conjecture up to 4·10^11 on an IBM 3083 mainframe with a vector processor.
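
    The verification is conceptually simple: sieve the primes up to the bound and, for each even n, search for a prime p with n − p also prime. A small-scale sketch of that procedure follows; the actual 4·10^11 run was of course far more optimized and exploited the vector processor:

```python
def sieve(limit):
    """Boolean sieve of Eratosthenes: is_prime[i] for 0 <= i <= limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, limit + 1, i):
                is_prime[j] = False
    return is_prime

def goldbach_pair(n, is_prime):
    """Return (p, n - p) with both prime, or None if no pair exists."""
    for p in range(2, n // 2 + 1):
        if is_prime[p] and is_prime[n - p]:
            return p, n - p
    return None

LIMIT = 100_000
primes = sieve(LIMIT)
# Every even n in [4, LIMIT] should have at least one decomposition.
assert all(goldbach_pair(n, primes) for n in range(4, LIMIT + 1, 2))
print(goldbach_pair(10_000, primes))
```

    In practice the smallest prime in a pair is tiny (usually under 100), which is why the search terminates quickly even for large even numbers.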

  20. TDRSS-user orbit determination using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, Mina V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-02-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the January 17-23, 1991, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. Independent assessments were made of the consistencies (overlap comparisons for the batch case; covariances and first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were less than 40 meters after the filter had reached steady state.
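
    The two estimation styles being compared can be illustrated on a toy linear problem. This is a generic sketch, not the GTDS or RTOD/E algorithms: the straight-line measurement model and noise-free data are assumptions for the example. A batch least-squares fit processes all measurements at once; a sequential (recursive) estimator folds them in one at a time and, once it reaches steady state, converges to essentially the same solution:

```python
def batch_least_squares(ts, ys):
    """Fit y = a + b*t by solving the 2x2 normal equations directly."""
    n = len(ts)
    s_t = sum(ts); s_tt = sum(t * t for t in ts)
    s_y = sum(ys); s_ty = sum(t * y for t, y in zip(ts, ys))
    det = n * s_tt - s_t * s_t
    a = (s_tt * s_y - s_t * s_ty) / det
    b = (n * s_ty - s_t * s_y) / det
    return a, b

def sequential_least_squares(ts, ys, p0=1e9):
    """Recursive least squares: update (a, b) one measurement at a time."""
    theta = [0.0, 0.0]                       # initial state estimate
    P = [[p0, 0.0], [0.0, p0]]               # large initial covariance
    for t, y in zip(ts, ys):
        h = [1.0, t]                         # measurement row: y = h . theta
        Ph = [P[0][0] * h[0] + P[0][1] * h[1],
              P[1][0] * h[0] + P[1][1] * h[1]]
        s = 1.0 + h[0] * Ph[0] + h[1] * Ph[1]
        K = [Ph[0] / s, Ph[1] / s]           # gain
        resid = y - (h[0] * theta[0] + h[1] * theta[1])
        theta = [theta[0] + K[0] * resid, theta[1] + K[1] * resid]
        P = [[P[i][j] - K[i] * Ph[j] for j in range(2)] for i in range(2)]
    return tuple(theta)

ts = [float(t) for t in range(20)]
ys = [3.0 + 0.5 * t for t in ts]             # noise-free truth: a=3, b=0.5
print(batch_least_squares(ts, ys))
print(sequential_least_squares(ts, ys))
```

    The batch solver needs the whole data arc before producing an answer, while the sequential filter yields an estimate after every measurement, which is what makes it attractive for real-time orbit determination.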

  1. Verification of a national water data base using a geographic information system

    USGS Publications Warehouse

    Harrison, H.E.

    1994-01-01

    The National Water Data Exchange (NAWDEX) was developed to assist users of water-resource data in the identification, location, and acquisition of data. The Master Water Data Index (MWDI) of NAWDEX currently indexes the data collected by 423 organizations from nearly 500,000 sites throughout the United States. New computer technologies permit the distribution of the MWDI to the public on compact disc. In addition, geographic information systems (GIS) are now available that can store and analyze these data in a spatial format. These recent innovations could increase access and add new capabilities to the MWDI. Before either of these technologies could be employed, however, a quality-assurance check of the MWDI needed to be performed. The MWDI resides on a mainframe computer in a tabular format. It was copied onto a workstation and converted to a GIS format. The GIS was used to identify errors in the MWDI and produce reports that summarized these errors. The summary reports were sent to the responsible contributing agencies along with instructions for submitting their corrections to the NAWDEX Program Office. The MWDI administrator received reports that summarized all of the errors identified. Of the 494,997 sites checked, 93,440 sites had at least one error (18.9 percent error rate).

  2. Comparison of ERBS orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Fabien, S. M.; Mistretta, G. D.; Hart, R. C.; Doll, C. E.

    1991-01-01

    The Flight Dynamics Div. (FDD) at NASA-Goddard commissioned a study to develop the Real Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination of spacecraft on a DOS-based personal computer (PC). An overview is presented of RTOD/E capabilities, along with the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E on a PC with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. RTOD/E was used to perform sequential orbit determination for the Earth Radiation Budget Satellite (ERBS), and GTDS was used to perform the batch least-squares orbit determination. The estimated ERBS ephemerides were obtained for the Aug. 16 to 22, 1989, timeframe, during which intensive TDRSS tracking data for ERBS were available. Independent assessments were made to examine the consistencies of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for ERBS; the solution differences were less than 40 meters after the filter had reached steady state.

  3. TDRSS-user orbit determination using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, Mina V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the January 17-23, 1991, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. Independent assessments were made of the consistencies (overlap comparisons for the batch case; covariances and first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were less than 40 meters after the filter had reached steady state.

  4. Comparison of ERBS orbit determination accuracy using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Fabien, S. M.; Mistretta, G. D.; Hart, R. C.; Doll, C. E.

    1991-10-01

    The Flight Dynamics Div. (FDD) at NASA-Goddard commissioned a study to develop the Real Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination of spacecraft on a DOS-based personal computer (PC). An overview is presented of RTOD/E capabilities, along with the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E on a PC with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. RTOD/E was used to perform sequential orbit determination for the Earth Radiation Budget Satellite (ERBS), and GTDS was used to perform the batch least-squares orbit determination. The estimated ERBS ephemerides were obtained for the Aug. 16 to 22, 1989, timeframe, during which intensive TDRSS tracking data for ERBS were available. Independent assessments were made to examine the consistencies of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for ERBS; the solution differences were less than 40 meters after the filter had reached steady state.

  5. Data Recording Room in the 10-by 10-Foot Supersonic Wind Tunnel

    NASA Image and Video Library

    1973-04-21

    The test data recording equipment located in the office building of the 10-by 10-Foot Supersonic Wind Tunnel at the NASA Lewis Research Center. The data system was the state of the art when the facility began operating in 1955 and was upgraded over time. NASA engineers used solenoid valves to measure pressures from different locations within the test section. Up to 48 measurements could be fed into a single transducer. The 10-by 10 data recorders could handle up to 200 data channels at once. The Central Automatic Digital Data Encoder (CADDE) converted the raw direct-current data from the test section into digital format on magnetic tape. The digital information was sent to the Lewis Central Computer Facility for additional processing. It could also be displayed in the control room via strip charts or oscillographs. The 16- by 56-foot ERA 1103 UNIVAC mainframe computer processed most of the digital data. The paper tape with the raw data was fed into the ERA 1103, which performed the needed calculations. The information was then sent back to the control room. There was a lag of several minutes before the computed information was available, but it was still far faster than the hand calculations performed by the female computers. The 10- by 10-foot tunnel, which had its official opening in May 1956, was built under the Congressional Unitary Plan Act, which coordinated wind tunnel construction at the NACA, Air Force, industry, and universities. The 10- by 10 was the largest of the three NACA tunnels built under the act.

  6. Standard high-reliability integrated circuit logic packaging. [for deep space tracking stations

    NASA Technical Reports Server (NTRS)

    Slaughter, D. W.

    1977-01-01

    A family of standard, high-reliability hardware used for packaging digital integrated circuits is described. The design transition from early prototypes to production hardware is covered, and future plans are discussed. Interconnection techniques are described, as well as connectors and related hardware available at both the microcircuit-packaging and mainframe level. General applications information is also provided.

  7. The challenge of a data storage hierarchy

    NASA Technical Reports Server (NTRS)

    Ruderman, Michael

    1992-01-01

    A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system that is implemented on a mainframe and manages the data into permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can be connected through the Unix interface of the data archiving system.

  8. Thematic mapper flight model preshipment review data package. Volume 2, part A: Subsystem data

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Performance and acceptance data are presented for the multiplexer, scan mirror, power supply, mainframe/top mechanical, and aft optics assemblies. Other major subsystems evaluated include the relay optics, the electronic module, the radiative cooler, and the cable harness. Reference lists of nonconforming materials reports, failure reports, and requests for deviation/waiver are also given.

  9. A modular finite-element model (MODFE) for areal and axisymmetric ground-water-flow problems, Part 3: Design philosophy and programming details

    USGS Publications Warehouse

    Torak, L.J.

    1993-01-01

    A MODular Finite-Element, digital-computer program (MODFE) was developed to simulate steady-state or unsteady-state, two-dimensional or axisymmetric ground-water flow. The modular structure of MODFE places the computationally independent tasks that are performed routinely by digital-computer programs simulating ground-water flow into separate subroutines, which are executed from the main program by control statements. Each subroutine consists of complete sets of computations, or modules, which are identified by comment statements and can be modified by the user without affecting unrelated computations elsewhere in the program. Simulation capabilities can be added or modified by adding or modifying subroutines that perform specific computational tasks, and the modular program structure allows the user to create versions of MODFE that contain only the simulation capabilities that pertain to the ground-water problem of interest. MODFE is written in a Fortran programming language that makes it virtually device independent and compatible with desktop personal computers and large mainframes. MODFE uses computer storage and execution time efficiently by taking advantage of symmetry and sparseness within the coefficient matrices of the finite-element equations. Parts of the matrix coefficients are computed and stored as single-subscripted variables, which are assembled into a complete coefficient matrix just prior to solution. Computer storage is reused during simulation to decrease storage requirements. Descriptions of subroutines that execute the computational steps of the modular program structure are given in tables that cross-reference the subroutines with particular versions of MODFE. Programming details of linear and nonlinear hydrologic terms are provided. Structure diagrams for the main programs show the order in which subroutines are executed for each version and illustrate some of the linear and nonlinear versions of MODFE that are possible. Computational aspects of changing stresses and boundary conditions with time and of mass-balance and error terms are given for each hydrologic feature. Program variables are listed and defined according to their occurrence in the main programs and in subroutines. Listings of the main programs and subroutines are given.
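
    The storage idea described above, keeping only part of a symmetric coefficient matrix in single-subscripted (1-D) variables and assembling the full matrix just prior to solution, can be sketched as follows. The row-major upper-triangle indexing convention is an assumption for illustration, not MODFE's actual layout:

```python
def pack_upper(matrix):
    """Store the upper triangle of a symmetric n x n matrix in a 1-D list."""
    n = len(matrix)
    return [matrix[i][j] for i in range(n) for j in range(i, n)]

def unpack_index(i, j, n):
    """Map (i, j), with i <= j, into the packed 1-D array."""
    return i * n - i * (i - 1) // 2 + (j - i)

def assemble(packed, n):
    """Rebuild the full symmetric matrix just prior to solution."""
    full = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            r, c = (i, j) if i <= j else (j, i)   # mirror the lower triangle
            full[i][j] = packed[unpack_index(r, c, n)]
    return full

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
packed = pack_upper(A)            # 6 stored entries instead of 9
assert assemble(packed, 3) == A   # round-trips to the original matrix
```

    For a symmetric n x n system this halves storage, roughly, which mattered greatly on the memory-constrained machines the program targeted.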

  10. Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1994-01-01

    Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently on microcomputers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer, since there are significant differences between the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To match the application to the architecture appropriately (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We describe the Cray's shared-memory vector architecture and discuss our rationale for selecting the Cray. We describe porting the model to the Cray, executing and verifying a baseline version, and the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
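
    The kind of rewrite that yields a vectorization speed-up can be sketched in modern terms, with NumPy's whole-array operations standing in for the Cray's vectorizing compiler. The logistic-growth update over grid cells below is an assumed stand-in for the grassland model's inner loop, not the actual model:

```python
import numpy as np

def grow_loop(biomass, rate, capacity):
    """Scalar loop over grid cells: one logistic-growth step per cell."""
    out = np.empty_like(biomass)
    for i in range(biomass.size):
        b = biomass[i]
        out[i] = b + rate * b * (1.0 - b / capacity)
    return out

def grow_vectorized(biomass, rate, capacity):
    """Same update expressed as whole-array (vector) operations."""
    return biomass + rate * biomass * (1.0 - biomass / capacity)

rng = np.random.default_rng(0)
cells = rng.uniform(0.1, 5.0, size=10_000)    # biomass per grid cell
a = grow_loop(cells, rate=0.3, capacity=10.0)
b = grow_vectorized(cells, rate=0.3, capacity=10.0)
assert np.allclose(a, b)                      # same results, vector form
```

    The two versions compute identical results; the vector form simply exposes the per-cell independence so the hardware (or library) can process many cells at once, which is the same property the Cray's vector units exploited.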

  11. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing

    PubMed Central

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are “in situ.” In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that would disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired “blackboards.” The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power limited and in situ concept computing. PMID:27242504

  12. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    PubMed

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are "in situ." In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that would disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power limited and in situ concept computing.

  13. s95-16439

    NASA Image and Video Library

    2014-08-14

    S95-16439 (13-22 July 1995) --- An overall view from the rear shows activity in the new Mission Control Center (MCC), opened for operation and dedicated during the STS-70 mission. The new MCC, developed at a cost of about $50 million, replaces the mainframe-based, NASA-unique design of the old Mission Control with a standard workstation-based, local area network system commonly in use today.

  14. MULTIVARIATE ANALYSIS OF DRINKING BEHAVIOUR IN A RURAL POPULATION

    PubMed Central

    Mathrubootham, N.; Bashyam, V.S.P.; Shahjahan

    1997-01-01

    This study was carried out to determine the drinking pattern in a rural population using multivariate techniques. 386 current users identified in a community were assessed with regard to their drinking behaviours using a structured interview. For purposes of the study, the questions were condensed into 46 meaningful variables. In bivariate analysis, 14 variables, including dependent variables such as dependence, MAST & CAGE (measuring alcoholic status), Q.F. Index, and troubled drinking, were found to be significant. Taking these variables, multivariate techniques such as ANOVA, correlation, regression analysis, and factor analysis were applied using both SPSS PC+ and an HCL Magnum mainframe computer with the FOCUS package under UNIX. Results revealed that a number of factors, such as drinking style, duration of drinking, pattern of abuse, Q.F. Index, and various problems, influenced drinking, and some of them set up a vicious circle. Factor analysis revealed mainly three factors: abuse, dependence, and social drinking. Dependence could be divided into low/moderate dependence. The implications and practical applications of these tests are also discussed. PMID:21584077

  15. Development of the functional simulator for the Galileo attitude and articulation control system

    NASA Technical Reports Server (NTRS)

    Namiri, M. K.

    1983-01-01

    A simulation program for verifying and checking the performance of the Galileo spacecraft's Attitude and Articulation Control Subsystem (AACS) flight software is discussed. The program, called the Functional Simulator (FUNSIM), provides a simple method of interfacing user-supplied mathematical models, coded in FORTRAN, that describe spacecraft dynamics, sensors, and actuators with the AACS flight software, coded in HAL/S (High-level Advanced Language/Shuttle). It is thus able to simulate the AACS flight software accurately to the HAL/S statement level in the environment of a mainframe computer system. FUNSIM also has a command and data subsystem (CDS) simulator. The input/output data and timing are simulated with the same precision as the flight microprocessor. FUNSIM uses a variable-stepsize numerical integration algorithm, complete with individual error-bound control on the state variables, to solve the equations of motion. The program has been designed to provide both line-printer and matrix dot plotting of the variables requested in the run section and to provide error diagnostics.

  16. The SEL Adapts to Meet Changing Times

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose S.; Basili, Victor R.

    1997-01-01

    Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. It has done this by developing and refining a continual process improvement approach that allows an organization such as the FDD to fine-tune its process for its particular domain. Experimental software engineering and measurement play a significant role in this approach. The SEL is a partnership of NASA Goddard, its major software contractor, Computer Sciences Corporation (CSC), and the University of Maryland's (UM) Department of Computer Science. The FDD primarily builds software systems that provide ground-based flight dynamics support for scientific satellites. They fall into two sets: ground systems and simulators. Ground systems are midsize systems that average around 250 thousand source lines of code (KSLOC). Ground system development projects typically last 1-2 years. Recent systems have been rehosted from IBM mainframes to workstations and also contain significant new subsystems written in C and C++. The simulators are smaller systems, averaging around 60 KSLOC, that provide the test data for the ground systems. Simulator development lasts up to 1 year. Most of the simulators have been built in Ada on workstations. The SEL is responsible for the management and continual improvement of the software engineering processes used on these FDD projects.

  17. CD-ROM technology at the EROS data center

    USGS Publications Warehouse

    Madigan, Michael E.; Weinheimer, Mary C.

    1993-01-01

    The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium that has been recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CD's are a versatile, dynamic, and low-cost method for providing a variety of data on a single media device and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected, or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD's, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD's. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data either for archiving or for one-time distribution can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM replication and mastering.

  18. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  19. JANE, A new information retrieval system for the Radiation Shielding Information Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trubey, D.K.

    A new information storage and retrieval system has been developed for the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory to replace mainframe systems that have become obsolete. The database contains citations and abstracts of literature which were selected by RSIC analysts and indexed with terms from a controlled vocabulary. The database, begun in 1963, has been maintained continuously since that time. The new system, called JANE, incorporates automatic indexing techniques and on-line retrieval using the RSIC Data General Eclipse MV/4000 minicomputer. Automatic indexing and retrieval techniques based on fuzzy-set theory allow the presentation of results in order of Retrieval Status Value. The fuzzy-set membership function depends on term frequency in the titles and abstracts and on Term Discrimination Values, which indicate the resolving power of the individual terms. These values are determined by the Cover Coefficient method. The use of a commercial database to store and retrieve the indexing information permits rapid retrieval of the stored documents. Comparisons of the new and presently used systems for actual searches of the literature indicate that it is practical to replace the mainframe systems with a minicomputer system similar to the present version of JANE. 18 refs., 10 figs.

  20. Yong-Ki Kim — His Life and Recent Work

    NASA Astrophysics Data System (ADS)

    Stone, Philip M.

    2007-08-01

    Dr. Kim made internationally recognized contributions in many areas of atomic physics research and applications, and was still very active when he was killed in an automobile accident. He joined NIST in 1983 after 17 years at the Argonne National Laboratory following his Ph.D. work at the University of Chicago. Much of his early work at Argonne and especially at NIST was the elucidation and detailed analysis of the structure of highly charged ions. He developed a sophisticated, fully relativistic atomic structure theory that accurately predicts atomic energy levels, transition wavelengths, lifetimes, and transition probabilities for a large number of ions. This information has been vital to model the properties of the hot interior of fusion research plasmas, where atomic ions must be described with relativistic atomic structure calculations. In recent years, Dr. Kim worked on the precise calculation of ionization and excitation cross sections of numerous atoms, ions, and molecules that are important in fusion research and in plasma processing for manufacturing semiconductor chips. Dr. Kim greatly advanced the state of the art of calculations for these cross sections through development and implementation of highly innovative methods, including his Binary-Encounter-Bethe (BEB) theory and a scaled plane wave Born (scaled PWB) theory. His methods, using closed quantum mechanical formulas and no adjustable parameters, avoid tedious large-scale computations with mainframe computers. His calculations closely reproduce the results of benchmark experiments as well as large-scale calculations requiring hours of computer time. This recent work on BEB and scaled PWB is reviewed and examples of its capabilities are shown.
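The "closed quantum mechanical formulas" the abstract refers to include the standard per-orbital BEB expression published by Kim and Rudd. A sketch of that formula; the hydrogen 1s constants in the usage example are illustrative inputs, not values from this abstract:

```python
import math

A0 = 0.5291772e-8   # Bohr radius, cm
RYD = 13.6057       # Rydberg energy, eV

def beb_cross_section(T, B, U, N):
    """Binary-Encounter-Bethe electron-impact ionization cross section
    (cm^2) for one orbital: binding energy B (eV), average orbital
    kinetic energy U (eV), occupation N, incident electron energy T (eV).
    """
    if T <= B:
        return 0.0          # below the ionization threshold
    t = T / B
    u = U / B
    S = 4.0 * math.pi * A0**2 * N * (RYD / B)**2
    lnt = math.log(t)
    return (S / (t + u + 1.0)) * (
        0.5 * lnt * (1.0 - 1.0 / t**2)   # Bethe (dipole) term
        + 1.0 - 1.0 / t                  # binary-encounter term
        - lnt / (t + 1.0)                # interference term
    )
```

No adjustable parameters appear: only B, U, and N for each orbital, which is why the method avoids large-scale computation. For hydrogen 1s (B = U = 13.6057 eV, N = 1) at T = 100 eV the formula gives a cross section of roughly 0.6e-16 cm².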

  1. Computer-Based Tutoring of Visual Concepts: From Novice to Experts.

    ERIC Educational Resources Information Center

    Sharples, Mike

    1991-01-01

    Description of ways in which computers might be used to teach visual concepts discusses hypermedia systems; describes computer-generated tutorials; explains the use of computers to create learning aids such as concept maps, feature spaces, and structural models; and gives examples of visual concept teaching in medical education. (10 references)…

  2. Simulation of human decision making

    DOEpatents

    Forsythe, J Chris [Sandia Park, NM; Speed, Ann E [Albuquerque, NM; Jordan, Sabina E [Albuquerque, NM; Xavier, Patrick G [Albuquerque, NM

    2008-05-06

    A method for computer emulation of human decision making defines a plurality of concepts related to a domain and a plurality of situations related to the domain, where each situation is a combination of at least two of the concepts. Each concept and situation is represented in the computer as an oscillator output, and each situation and concept oscillator output is distinguishable from all other oscillator outputs. Information is input to the computer representative of detected concepts, and the computer compares the detected concepts with the stored situations to determine if a situation has occurred.
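The patent abstract specifies situations as combinations of two or more concepts and a comparison of detected concepts against stored situations. A minimal sketch of that comparison step, with set containment standing in for the patent's oscillator-output matching; all names and data are hypothetical:

```python
def situations_occurred(detected, situations):
    """Return the names of stored situations whose defining concepts are
    all present among the detected concepts (a set-containment stand-in
    for matching distinguishable oscillator outputs)."""
    detected = set(detected)
    return [name for name, concepts in situations.items()
            if set(concepts) <= detected]

# Hypothetical domain: each situation is a combination of >= 2 concepts.
situations = {
    "engine_fire": {"smoke", "heat"},
    "coolant_leak": {"low_pressure", "puddle"},
}
```

In the patented method each concept and situation is a distinguishable oscillator output rather than a symbol, but the decision logic reduces to the same question: do the detected concepts jointly activate a stored combination?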

  3. Concept Learning through Image Processing.

    ERIC Educational Resources Information Center

    Cifuentes, Lauren; Yi-Chuan, Jane Hsieh

    This study explored computer-based image processing as a study strategy for middle school students' science concept learning. Specifically, the research examined the effects of computer graphics generation on science concept learning and the impact of using computer graphics to show interrelationships among concepts during study time. The 87…

  4. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    ERIC Educational Resources Information Center

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  5. Hierarchical polymerized high internal phase emulsions synthesized from surfactant-stabilized emulsion templates.

    PubMed

    Wong, Ling L C; Villafranca, Pedro M Baiz; Menner, Angelika; Bismarck, Alexander

    2013-05-21

    In building construction, structural elements, such as lattice girders, are positioned specifically to support the mainframe of a building. This arrangement provides additional structural hierarchy, facilitating the transfer of load to its foundation while keeping the building weight down. We applied the same concept when synthesizing hierarchical open-celled macroporous polymers from high internal phase emulsion (HIPE) templates stabilized by varying concentrations of a polymeric non-ionic surfactant from 0.75 to 20 w/vol %. These hierarchical poly(merized)HIPEs have multimodally distributed pores, which are efficiently arranged to enhance the load transfer mechanism in the polymer foam. As a result, hierarchical polyHIPEs produced from HIPEs stabilized by 5 vol % surfactant showed a 93% improvement in Young's moduli compared to conventional polyHIPEs produced from HIPEs stabilized by 20 vol % of surfactant with the same porosity of 84%. The finite element method (FEM) was used to determine the effect of pore hierarchy on the mechanical performance of porous polymers under small periodic compressions. Results from the FEM showed a clear improvement in Young's moduli for simulated hierarchical porous geometries. This methodology could be further adapted as a predictive tool to determine the influence of hierarchy on the mechanical properties of a range of porous materials.

  6. Business Process Reengineering With Knowledge Value Added in Support of the Department of the Navy Chief Information Officer

    DTIC Science & Technology

    2003-09-01

    …processes use people and systems (hardware, software, machinery, etc.), and these people and systems contain the "corporate" knowledge of the… server architecture was also a high-maintenance item. Data was no longer contained on one mainframe but was distributed throughout the enterprise…

  7. Evaluation of Landsat-4 orbit determination accuracy using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Feiertag, R.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite (TDRS) System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the May 18-24, 1992, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. During this period, there were two separate orbit-adjust maneuvers on one of the TDRSS spacecraft (TDRS-East) and one small orbit-adjust maneuver for Landsat-4. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were generally less than 30 meters after the filter had reached steady state.

  8. Using the network to achieve energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giglio, M.

    1995-12-01

    Novell, the third largest software company in the world, has developed Netware Embedded Systems Technology (NEST). NEST will take the network deeper into non-traditional computing environments and will embed networking into more intelligent devices. Ultimately, this will lead to energy efficiencies in the office. NEST can make point-of-sale terminals, alarm systems, televisions, traffic controls, printers, lights, fax machines, copiers, HVAC controls, PBX machines, etc., either intelligent or more intelligent than they are currently. The mission statement for this particular group is to integrate over 30 million new intelligent devices into the workplace and the home with Novell networks by 1997. Computing trends have progressed from mainframes in the 1960s to keys, security systems, and airplanes in the year 2000. In fact, the new Boeing 777 has NEST in it, and it also has network servers on board. NEST enables the embedded network with the ability to put intelligence into devices. This gives one more control of the devices from wherever one is. For example, the pharmaceutical industry could use NEST to coordinate what the consumer is buying, what is in the warehouse, what the manufacturing plant is tooled for, and so on. Through NEST technology, the pharmaceutical industry now uses a camera that takes pictures of the pills. It can see whether an "overdose" or "underdose" of a particular type of pill is being manufactured. The plant can be shut down and corrections made immediately.

  9. Evaluation of Landsat-4 orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Feiertag, R.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite (TDRS) System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the May 18-24, 1992, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. During this period, there were two separate orbit-adjust maneuvers on one of the TDRSS spacecraft (TDRS-East) and one small orbit-adjust maneuver for Landsat-4. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were generally less than 30 meters after the filter had reached steady state.

  10. A novel dismantling process of waste printed circuit boards using water-soluble ionic liquid.

    PubMed

    Zeng, Xianlai; Li, Jinhui; Xie, Henghua; Liu, Lili

    2013-10-01

    Recycling processes for waste printed circuit boards (WPCBs) have been well established in terms of scientific research and field pilots. However, current dismantling procedures for WPCBs have restricted the recycling process, due to their low efficiency and negative impacts on environmental and human health. This work aimed to develop an environmentally friendly dismantling process through heating with a water-soluble ionic liquid to separate electronic components and tin solder from two main types of WPCBs: cathode ray tubes and computer mainframes. The work systematically investigates the influencing factors, heating mechanism, and optimal parameters for opening solder connections on WPCBs during the dismantling process, and addresses its environmental performance and economic assessment. The results obtained demonstrate that the optimal temperature, retention time, and turbulence resulting from impeller rotation during the dismantling process were 250 °C, 12 min, and 45 rpm, respectively. Nearly 90% of the electronic components were separated from the WPCBs under the optimal experimental conditions. This novel process offers the possibility of large industrial-scale operations for separating electronic components and recovering tin solder, and a more efficient and environmentally sound process for WPCB recycling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and has been implemented in hardware in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  12. Constraints and Opportunities in GCM Model Development

    NASA Technical Reports Server (NTRS)

    Schmidt, Gavin; Clune, Thomas

    2010-01-01

    Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colbert, C.; Moles, D.R.

    This paper reports that the authors developed for the Air Force the Mark VI Personal Identity Verifier (PIV) for controlling access to a fixed or mobile ICBM site, a computer terminal, or a mainframe. The Mark VI records the digitized silhouettes of four fingers of each hand on an AT&T smart card. Like fingerprints, finger shapes, lengths, and widths constitute an unguessable biometric password. A Security Officer enrolls an authorized person, who places each hand, in turn, on a backlighted panel. An overhead scanning camera records the right and left hand reference templates on the smart card. The Security Officer adds to the card: name, personal identification number (PIN), and access restrictions such as permitted days of the week, times of day, and doors. To gain access, the card owner inserts the card into a reader slot and places either hand on the panel. The resulting access template is matched to the reference template by three sameness algorithms. The final match score is an average of 12 scores (each of the four fingers, matched for shape, length, and width), expressing the degree of sameness. (A perfect match would score 100.00.) The final match score is compared to a predetermined score (threshold), generating an accept or reject decision.
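The scoring step described above is simple to state precisely. A small sketch, assuming the 12 per-feature sameness scores (4 fingers x shape/length/width) are already computed on a 0-100 scale; the function names are mine, not the Mark VI's:

```python
def final_match_score(scores):
    """Average the 12 per-feature sameness scores: four fingers, each
    matched for shape, length, and width, on a 0-100 scale."""
    assert len(scores) == 12, "expected 4 fingers x 3 features"
    return sum(scores) / len(scores)

def access_decision(scores, threshold):
    """Compare the final match score to a predetermined threshold."""
    return "accept" if final_match_score(scores) >= threshold else "reject"
```

Averaging all 12 scores lets one poorly matched feature (a bandaged finger, say) be offset by the others, while the threshold sets the trade-off between false accepts and false rejects.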

  14. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning

    NASA Astrophysics Data System (ADS)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually generated and collaboratively generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight seventh-grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in the experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually generated concept mapping group, and the collaboratively generated concept mapping group had equivalent learning experiences for 50 minutes during five days, except that students in the control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh-grade middle school science concept learning, but neither strategy was more effective than the other.
However, the students who collaboratively generated concept maps created significantly higher quality concept maps than those who individually generated concept maps. The researcher concluded that the concept mapping software, Inspiration(TM), fostered construction of students' concept maps individually or collaboratively for science learning and helped students capture their evolving creative ideas and organize them for meaningful learning. Students in both the individual and the collaborative concept mapping groups had positive attitudes toward concept mapping using Inspiration(TM) software.

  15. Velocity Profile Characterization for the 5-CM Agent Fate Wind Tunnels

    DTIC Science & Technology

    2008-01-01

    …denominator in the turbulence intensity) decreases near the floor. As can be seen, the turbulence intensity ranges from about 0.5 to 2% for the low… The friction velocity calculated by the above procedure is a factor of two larger than the operational profile. It is difficult to see how the… the toolbar, see Figure 5. 2. Connect an appropriate length of co-axial cable and probe holder to the desired input channel on the IFA300 mainframe. 3. Install…

  16. Worldwide Report, Telecommunications Policy, Research and Development, No. 277.

    DTIC Science & Technology

    1983-07-01

    JPRS 83810, July 1983. Approved for public release; distribution unlimited. Worldwide Report: Telecommunications… business would choose to attack would lead to increased charges for consumers, especially in rural areas. ('The Phone Book', by Ian Reinecke)… hefty minicomputer power for distributed data processing, and it is in this field that the low-end mainframe market is being squeezed out by 32…

  17. Enterprise storage report for the 1990's

    NASA Technical Reports Server (NTRS)

    Moore, Fred

    1991-01-01

    Data processing has become an increasingly vital function, if not the most vital function, in most businesses today. No longer only a mainframe domain, the data processing enterprise also includes the midrange and workstation platforms, either local or remote. This expanded view of the enterprise has encouraged more and more businesses to take a strategic, long-range view of information management rather than the short-term tactical approaches of the past. Some of the significant aspects of data storage in the enterprise for the 1990's are highlighted.

  18. METRRA Signature - Radar Cross Section Measurements. Final Report/ Instruction Manual

    DTIC Science & Technology

    1978-12-01

    Contents fragments: 1.5 Condensed System Parameters; 1.5.1 Transmitter; 1.5.2 Receiver; 2.0 Description; 2.1 Transmitter; 2.3 Receiver; 2.4 Antennas… System Configuration. 1.4.1 See Figure 1.4.2. 1.5 Condensed System Parameters. 1.5.1 Transmitter. Mainframe: Applied Microwave Laboratory, Model… for Cubic Defense by Addington Laboratories. Tchebychev designs are used for both filters to provide the steepest skirts for given numbers of reactive…

  19. Students' Misconceptions about Medium-Scale Integrated Circuits

    ERIC Educational Resources Information Center

    Herman, G. L.; Loui, M. C.; Zilles, C.

    2011-01-01

    To improve instruction in computer engineering and computer science, instructors must better understand how their students learn. Unfortunately, little is known about how students learn the fundamental concepts in computing. To investigate student conceptions and misconceptions about digital logic concepts, the authors conducted a qualitative…

  20. The Assessment of Taiwanese College Students' Conceptions of and Approaches to Learning Computer Science and Their Relationships

    ERIC Educational Resources Information Center

    Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2015-01-01

    The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…

  1. Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.

    ERIC Educational Resources Information Center

    Murray, David R.

    This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…

  2. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  3. Evolving technologies for Space Station Freedom computer-based workstations

    NASA Technical Reports Server (NTRS)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  4. Measuring the Computer-Related Self-Concept

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Schönfelder, Mona; Bogner, Franz X.

    2016-01-01

    A positive self-concept supposedly affects a student's well-being as well as his or her perception of individual competence at school. As computer-based learning is becoming increasingly important in school, a positive computer-related self-concept (CSC) might help to enhance cognitive achievement. Consequently, we focused on establishing a short,…

  5. Comparative Effects of Computer-Based Concept Maps, Refutational Texts, and Expository Texts on Science Learning

    ERIC Educational Resources Information Center

    Adesope, Olusola O.; Cavagnetto, Andy; Hunsu, Nathaniel J.; Anguiano, Carlos; Lloyd, Joshua

    2017-01-01

    This study used a between-subjects experimental design to examine the effects of three different computer-based instructional strategies (concept map, refutation text, and expository scientific text) on science learning. Concept maps are node-link diagrams that show concepts as nodes and relationships among the concepts as labeled links.…

  6. Prospective Computer Teachers' Mental Images about the Concepts of "School" and "Computer Teacher"

    ERIC Educational Resources Information Center

    Saban, Aslihan

    2011-01-01

    In this phenomenological study, prospective computer teachers' mental images related to the concepts of "school" and "computer teacher" were examined through metaphors. Participants were all the 45 seniors majoring in the Department of Computer and Instructional Technologies at Selcuk University, Ahmet Kelesoglu Faculty of…

  7. The Effect of Mode of CAI and Individual Learning Differences on the Understanding of Concept Relationships.

    ERIC Educational Resources Information Center

    Rowland, Paul McD.

    The effect of mode of computer-assisted instruction (CAI) and individual learning differences on the learning of science concepts was investigated. University elementary education majors learned about home energy use from either a computer simulation or a computer tutorial. Learning of science concepts was measured using achievement and…

  8. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    ERIC Educational Resources Information Center

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  9. Using the Tower of Hanoi Puzzle to Infuse Your Mathematics Classroom with Computer Science Concepts

    ERIC Educational Resources Information Center

    Marzocchi, Alison S.

    2016-01-01

    This article suggests that logic puzzles, such as the well-known Tower of Hanoi puzzle, can be used to introduce computer science concepts to mathematics students of all ages. Mathematics teachers introduce their students to computer science concepts that are enacted spontaneously and subconsciously throughout the solution to the Tower of Hanoi…
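The divide-and-conquer recursion behind the puzzle is the computer science concept the article has in mind, and it can be sketched in a few lines; the peg names are arbitrary:

```python
def hanoi(n, source="A", target="C", spare="B"):
    """Return the move list solving the n-disk Tower of Hanoi.

    The recursive structure (move n-1 disks to the spare peg, move the
    largest disk, move the n-1 disks onto it) yields the minimal
    solution of 2**n - 1 moves.
    """
    if n == 0:
        return []
    return (hanoi(n - 1, source, spare, target)   # clear the top n-1 disks
            + [(source, target)]                  # move the largest disk
            + hanoi(n - 1, spare, target, source))  # restack on top of it
```

Tracing why the move count satisfies M(n) = 2 M(n-1) + 1, hence M(n) = 2**n - 1, is itself a natural bridge from the puzzle to recurrence relations in a mathematics classroom.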

  10. Interface design of VSOP'94 computer code for safety analysis

    NASA Astrophysics Data System (ADS)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimates of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed using Fortran 65, and presents several problems in use: it runs only on DEC Alpha mainframe platforms, provides only text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and presents the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing, and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to simplify and speed up the processing and analysis of safety aspects.

  11. Computers and Play.

    ERIC Educational Resources Information Center

    Colker, Larry

    Viewing computers in various forms as developmentally appropriate objects for children, this discussion provides a framework for integrating conceptions of computers and conceptions of play. Several instances are cited from the literature in which explicit analogies have been made between computers and playthings or play environments.…

  12. s95-16445

    NASA Image and Video Library

    2014-08-07

    S95-16445 (13-22 July 1995) --- A wide angle view from the rear shows activity in the new Mission Control Center (MCC), opened for operation and dedicated during the STS-70 mission. The Space Shuttle Discovery was just passing over Florida at the time this photo was taken (note the Mercator map and TV scene on the screens). The new MCC, developed at a cost of about 50 million, replaces the mainframe-based, NASA-unique design of the old Mission Control with a standard workstation-based, local area network system commonly in use today.

  13. Enterprise storage report for the 1990's

    NASA Technical Reports Server (NTRS)

    Moore, Fred

    1992-01-01

    Data processing has become an increasingly vital function, if not the most vital function, in most businesses today. No longer only a mainframe domain, the data processing enterprise also includes the midrange and workstation platforms, either local or remote. This expanded view of the enterprise has encouraged more and more businesses to take a strategic, long-range view of information management rather than the short-term tactical approaches of the past. This paper will highlight some of the significant aspects of data storage in the enterprise for the 1990's.

  14. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    'ADMS+/-' is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of workstation data base systems, designated ADMS-; no communications exist between these workstations. The use of this system radically decreases the response time of locally processed queries, since the workstation runs in a single-user mode, and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy used reduces overhead due to update synchronization in message traffic.

  15. Navy Enterprise Resource Planning Program: Governance Challenges in Deploying an Enterprise-Wide Information Technology System in the Department of the Navy

    DTIC Science & Technology

    2010-12-01

    ...signaling the move away from mainframe systems. However, it was the year 2000 (Y2K) dilemma that ushered in unprecedented growth in the development of ERP...software and IT systems of the 1990s. The possibility of non-Y2K-compliant legacy systems failing at the turn of the century resulted in the...

  16. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  17. Can Computer Animations Affect College Biology Students' Conceptions about Diffusion and Osmosis?

    ERIC Educational Resources Information Center

    Sanger, Michael J.; Brecheisen, Dorothy M.; Hynek, Brian M.

    2001-01-01

    Investigates whether viewing computer animations representing the process of diffusion and osmosis affects students' conceptions. Discusses the difficulties of implementing computer animations in the classroom. (Contains 27 references.) (YDS)

  18. Toward using games to teach fundamental computer science concepts

    NASA Astrophysics Data System (ADS)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.
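    The two concepts the games target can be illustrated briefly. These examples are my own, not material from the games themselves.

```python
# Illustrative (invented) examples of the two concepts the study's games teach:
# compound boolean expressions and recursion.

def unlocks(has_key, door_open, guard_asleep):
    """A compound boolean expression of the kind a puzzle might exercise."""
    return (has_key or door_open) and guard_asleep

def depth(nested):
    """Recursion: the nesting depth of a list-of-lists."""
    if not isinstance(nested, list):
        return 0          # base case: a non-list element contributes no depth
    if not nested:
        return 1          # an empty list is one level deep
    return 1 + max(depth(item) for item in nested)
```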

  19. Mathematics Objectives and Measurement Specifications 1986-1990. Exit Level. Texas Educational Assessment of Minimum Skills (TEAMS).

    ERIC Educational Resources Information Center

    Texas Education Agency, Austin. Div. of Educational Assessment.

    This document lists the objectives for the Texas educational assessment program in mathematics. Eighteen objectives for exit level mathematics are listed, by category: number concepts (4); computation (3); applied computation (5); statistical concepts (3); geometric concepts (2); and algebraic concepts (1). Then general specifications are listed…

  20. Integration of communications with the Intelligent Gateway Processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hampel, V.E.

    1986-01-01

    The Intelligent Gateway Processor (IGP) software is being used to interconnect users equipped with different personal computers and ASCII terminals to mainframe machines of different make. This integration is made possible by the IGP's unique user interface and networking software. Prototype systems of the table-driven, interpreter-based IGP have been adapted to very different programmatic requirements and have demonstrated substantial increases in end-user productivity. Procedures previously requiring days can now be carried out in minutes. The IGP software has been under development by the Technology Information Systems (TIS) program at Lawrence Livermore National Laboratory (LLNL) since 1975 and has been in use by several federal agencies since 1983: The Air Force is prototyping applications which range from automated identification of spare parts for aircraft to office automation and the controlled storage and distribution of technical orders and engineering drawings. Other applications of the IGP are the Information Management System (IMS) for aviation statistics in the Federal Aviation Administration (FAA), the Nuclear Criticality Information System (NCIS) and a nationwide Cost Estimating System (CES) in the Department of Energy, the library automation network of the Defense Technical Information Center (DTIC), and the modernization program in the Office of the Secretary of Defense (OSD). 31 refs., 9 figs.

  1. Sources of Cryogenic Data and Information

    NASA Astrophysics Data System (ADS)

    Mohling, R. A.; Hufferd, W. L.; Marquardt, E. D.

    It is commonly known that cryogenic data, technology, and information are applied across many military, National Aeronautics and Space Administration (NASA), and civilian product lines. Before 1950, however, there was no centralized US source of cryogenic technology data. The Cryogenic Data Center of the National Bureau of Standards (NBS) maintained a database of cryogenic technical documents that served the national need well from the mid 1950s to the early 1980s. The database, maintained on a mainframe computer, was a highly specific bibliography of cryogenic literature and thermophysical properties that covered over 100 years of data. In 1983, however, the Cryogenic Data Center was discontinued when NBS's mission and scope were redefined. In 1998, NASA contracted with the Chemical Propulsion Information Agency (CPIA) and Technology Applications, Inc. (TAI) to reconstitute and update Cryogenic Data Center information and establish a self-sufficient entity to provide technical services for the cryogenic community. The Cryogenic Information Center (CIC) provided this service until 2004, when it was discontinued due to a lack of market interest. The CIC technical assets were distributed to NASA Marshall Space Flight Center and the National Institute of Standards and Technology. Plans are under way in 2006 for CPIA to launch an e-commerce cryogenic website to offer bibliography data with capability to download cryogenic documents.

  2. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    ERIC Educational Resources Information Center

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  3. Effects of Multidimensional Concept Maps on Fourth Graders' Learning in Web-Based Computer Course

    ERIC Educational Resources Information Center

    Huang, Hwa-Shan; Chiou, Chei-Chang; Chiang, Heien-Kun; Lai, Sung-Hsi; Huang, Chiun-Yen; Chou, Yin-Yu

    2012-01-01

    This study explores the effect of multidimensional concept mapping instruction on students' learning performance in a web-based computer course. The subjects consisted of 103 fourth graders from an elementary school in central Taiwan. They were divided into three groups: multidimensional concept map (MCM) instruction group, Novak concept map (NCM)…

  4. The effects of a computer skill training programme adopting social comparison and self-efficacy enhancement strategies on self-concept and skill outcome in trainees with physical disabilities.

    PubMed

    Tam, S F

    2000-10-15

    The aim of this controlled, quasi-experimental study was to evaluate the effects of both self-efficacy enhancement and social comparison training strategies on the computer skills learning and self-concept outcomes of trainees with physical disabilities. The self-efficacy enhancement group comprised 16 trainees, the tutorial training group comprised 15 trainees, and there were 25 subjects in the control group. Both the self-efficacy enhancement group and the tutorial training group received a 15-week computer skills training course, including generic Chinese computer operation, Chinese word processing and Chinese desktop publishing skills. The self-efficacy enhancement group received training with tutorial instructions that incorporated self-efficacy enhancement strategies and experienced self-enhancing social comparisons. The tutorial training group received behavioural learning-based tutorials only, and the control group did not receive any training. The following measurements were employed to evaluate the outcomes: the Self-Concept Questionnaire for the Physically Disabled Hong Kong Chinese (SCQPD), the computer self-efficacy rating scale and the computer performance rating scale. The self-efficacy enhancement group showed significantly better computer skills learning outcome, total self-concept, and social self-concept than the tutorial training group. The self-efficacy enhancement group did not show significant changes in their computer self-efficacy; however, the tutorial training group showed a significant lowering of their computer self-efficacy. The training strategy that incorporated self-efficacy enhancement and positive social comparison experiences maintained the computer self-efficacy of trainees with physical disabilities. This strategy was more effective in improving the learning outcome (p = 0.01) and self-concept (p = 0.05) of the trainees than the conventional tutorial-based training strategy.

  5. Introducing Hospital Staff to Computer Concepts: An Educational Program

    PubMed Central

    Kaplan, Bonnie

    1981-01-01

    An in-house computer education program for hospital staff ran for two years at a large, metropolitan hospital. The program drew physicians, administrators, department heads, secretaries, technicians, and data managers to courses, seminars, and workshops on medical computing. Two courses, an introduction to computer concepts and a programming course, are described and evaluated.

  6. Structured Interviews on Children's Conceptions of Computers. Technical Report No. 19.

    ERIC Educational Resources Information Center

    Mawby, Ronald; And Others

    The terms and concepts children used to explain their beliefs about computers before and after classroom exposure to microcomputers were studied to identify misconceptions about computers that could interfere with computer-based learning. Children in each of two classrooms at the Bank Street School for Children were interviewed individually on…

  7. The Effects of Integrating Computer-Based Concept Mapping for Physics Learning in Junior High School

    ERIC Educational Resources Information Center

    Chang, Cheng-Chieh; Yeh, Ting-Kuang; Shih, Chang-Ming

    2016-01-01

    It is generally accepted that concept mapping has a noticeable impact on learning, but the literature shows that it does not benefit all learners. The present study explored the effects of incorporating computer-based concept mapping into physics instruction. A total of 61 ninth-grade students participated in this study. By using a…

  8. A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles

    ERIC Educational Resources Information Center

    Kafai, Yasmin B.; Lee, Eunkyoung; Searle, Kristin; Fields, Deborah; Kaplan, Eliot; Lui, Debora

    2014-01-01

    In this article, we examine the use of electronic textiles (e-textiles) for introducing key computational concepts and practices while broadening perceptions about computing. The starting point of our work was the design and implementation of a curriculum module using the LilyPad Arduino in a pre-AP high school computer science class. To…

  9. Integration of Ausubelian Learning Theory and Educational Computing.

    ERIC Educational Resources Information Center

    Heinze-Fry, Jane A.; And Others

    1984-01-01

    Examines possible benefits when Ausubelian learning approaches are integrated into computer-assisted instruction, presenting an example of this integration in a computer program dealing with introductory ecology concepts. The four program parts (tutorial, interactive concept mapping, simulations, and vee-mapping) are described. (JN)

  10. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  11. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  12. Review and Discussion of Children's Conceptions of Computers

    NASA Astrophysics Data System (ADS)

    Rücker, Michael T.; Pinkwart, Niels

    2016-04-01

    Today's children grow up surrounded by computers. They observe them, interact with them and, as a consequence, start forming conceptions of how they work and what they can do. Any constructivist approach to learning requires that we gain an understanding of such preconceived ideas and beliefs in order to use computers as learning tools in an effective and informed manner. In this paper, we present five such conceptions that children reportedly form about computers, based on an interdisciplinary literature review. We then evaluate how persistent these conceptions appear to be over time and in light of new technological developments. Finally, we discuss the relevance and implications of our findings for education in the contexts of conceptual pluralism and conceptual categorisation.

  13. The business of demographics.

    PubMed

    Russell, C

    1984-06-01

    Over the past 15 years, "demographics" has emerged as a vital tool for American business research and planning. Tracing demographic trends became important for businesses when traditional consumer markets splintered under the enormous changes since the 1960s in US population growth, age structure, geographic distribution, income, education, living arrangements, and life-styles. The mass of reliable, small-area demographic data needed for market estimates and projections became available with the electronic census: the public release of Census Bureau census and survey data on computer tape, beginning with the 1970 census. Census Bureau tapes, as well as printed reports and microfiche, are now widely accessible at low cost through summary tape processing centers designated by the bureau and its 12 regional offices and State Data Center Program. Data accessibility, plummeting computer costs, and businesses' unfamiliarity with demographics spawned the private data industry. By 1984, 70 private companies were offering demographic services to business clients: customized information repackaged from public data or drawn from proprietary data bases created from such data. Critics protest the for-profit use of public data by companies able to afford expensive mainframe computer technology. Business people defend their rights to public data as taxpaying citizens, but they must ensure that the data are indeed used for the public good. They must also question the quality of demographic data generated by private companies. Business' demographic expertise will improve when business schools offer training in demography, as few now do, though 40 of 88 graduate-level demographic programs now include business-oriented courses. Lower-cost, easier access to business demographics is growing as more census data become available on microcomputer diskettes and through on-line linkages with large data bases, from private data companies and the Census Bureau itself.
A directory of private and public demographic resources is appended, including forecasting, consulting and research services available.

  14. Section 1. Simulation of surface-water integrated flow and transport in two-dimensions: SWIFT2D user's manual

    USGS Publications Warehouse

    Schaffranek, Raymond W.

    2004-01-01

    A numerical model for simulation of surface-water integrated flow and transport in two (horizontal-space) dimensions is documented. The model solves vertically integrated forms of the equations of mass and momentum conservation and solute transport equations for heat, salt, and constituent fluxes. An equation of state for salt balance directly couples solution of the hydrodynamic and transport equations to account for the horizontal density gradient effects of salt concentrations on flow. The model can be used to simulate the hydrodynamics, transport, and water quality of well-mixed bodies of water, such as estuaries, coastal seas, harbors, lakes, rivers, and inland waterways. The finite-difference model can be applied to geographical areas bounded by any combination of closed land or open water boundaries. The simulation program accounts for sources of internal discharges (such as tributary rivers or hydraulic outfalls), tidal flats, islands, dams, and movable flow barriers or sluices. Water-quality computations can treat reactive and (or) conservative constituents simultaneously. Input requirements include bathymetric and topographic data defining land-surface elevations, time-varying water level or flow conditions at open boundaries, and hydraulic coefficients. Optional input includes the geometry of hydraulic barriers and constituent concentrations at open boundaries. Time-dependent water level, flow, and constituent-concentration data are required for model calibration and verification. Model output consists of printed reports and digital files of numerical results in forms suitable for postprocessing by graphical software programs and (or) scientific visualization packages. The model is compatible with most mainframe, workstation, mini- and micro-computer operating systems and FORTRAN compilers. 
This report defines the mathematical formulation and computational features of the model, explains the solution technique and related model constraints, describes the model framework, documents the type and format of inputs required, and identifies the type and format of output available.
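    The vertically integrated mass-conservation equation the abstract refers to has, in a common shallow-water formulation, the form below. The notation here is assumed for illustration (ζ the water-surface elevation, H the total water depth, and u, v the depth-averaged velocity components); SWIFT2D's own documentation defines its exact variables and terms.

\[
\frac{\partial \zeta}{\partial t}
  + \frac{\partial (Hu)}{\partial x}
  + \frac{\partial (Hv)}{\partial y} = 0
\]

    The momentum and solute-transport equations are depth-averaged analogously, with the equation of state coupling salt concentration back into the momentum balance through horizontal density gradients.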

  15. Self-Concept, Computer Anxiety, Gender and Attitude towards Interactive Computer Technologies: A Predictive Study among Nigerian Teachers

    ERIC Educational Resources Information Center

    Agbatogun, Alaba Olaoluwakotansibe

    2010-01-01

    Interactive Computer Technologies (ICTs) have entered the education industry, dramatically transforming the instructional process. This study examined the relative and combined contributions of computer anxiety, self-concept and gender to teachers' attitude towards the use of ICT(s). 454 Nigerian teachers constituted the sample. Three…

  16. THE COMPUTER CONCEPT OF SELF-INSTRUCTIONAL DEVICES.

    ERIC Educational Resources Information Center

    SILBERMAN, HARRY F.

    THE COMPUTER SYSTEM CONCEPT WILL BE DEVELOPED IN TWO WAYS--FIRST, A DESCRIPTION WILL BE MADE OF THE SMALL COMPUTER-BASED TEACHING MACHINE WHICH IS BEING USED AS A RESEARCH TOOL, SECOND, A DESCRIPTION WILL BE MADE OF THE LARGE COMPUTER LABORATORY FOR AUTOMATED SCHOOL SYSTEMS WHICH ARE BEING DEVELOPED. THE FIRST MACHINE CONSISTS OF THREE ELEMENTS--…

  17. Computer Science Concept Inventories: Past and Future

    ERIC Educational Resources Information Center

    Taylor, C.; Zingaro, D.; Porter, L.; Webb, K. C.; Lee, C. B.; Clancy, M.

    2014-01-01

    Concept Inventories (CIs) are assessments designed to measure student learning of core concepts. CIs have become well known for their major impact on pedagogical techniques in other sciences, especially physics. Presently, there are no widely used, validated CIs for computer science. However, considerable groundwork has been performed in the form…

  18. Analyzing the Effects of Various Concept Mapping Techniques on Learning Achievement under Different Learning Styles

    ERIC Educational Resources Information Center

    Chiou, Chei-Chang; Lee, Li-Tze; Tien, Li-Chu; Wang, Yu-Min

    2017-01-01

    This study explored the effectiveness of different concept mapping techniques on the learning achievement of senior accounting students and whether achievements attained using various techniques are affected by different learning styles. The techniques are computer-assisted construct-by-self-concept mapping (CACSB), computer-assisted…

  19. Conceptions of Programming: A Study into Learning To Program.

    ERIC Educational Resources Information Center

    Booth, Shirley

    This paper reports the results of a phenomenographic study which focused on identifying and describing the conceptions of programming and related phenomena of about 120 computer science and computer engineering students learning to program. The report begins by tracing developments in the students' conceptions of programming and its parts, and…

  20. Computer Graphics and Metaphorical Elaboration for Learning Science Concepts.

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan; Chan, Kung-Chi

    This study explores the instructional impact of using computer multimedia to integrate metaphorical verbal information into graphical representations of biotechnology concepts. The combination of text and graphics into a single metaphor makes concepts dual-coded, and therefore more comprehensible and memorable for the student. Visual stimuli help…

  1. UNIX based client/server hospital information system.

    PubMed

    Nakamura, S; Sakurai, K; Uchiyama, M; Yoshii, Y; Tachibana, N

    1995-01-01

    SMILE (St. Luke's Medical Center Information Linkage Environment) is an HIS built as a client/server system using UNIX workstations on an open network, a LAN (FDDI & 10BASE-T). It provides a multivendor environment, high performance at low cost, and a user-friendly GUI. However, the client/server architecture with a UNIX workstation does not offer the same OLTP environment (e.g., a TP monitor) as the mainframe. Our system's problems and the steps taken to solve them are therefore reviewed, and several points necessary for future client/server systems built on UNIX workstations are presented.

  2. Web client and ODBC access to legacy database information: a low cost approach.

    PubMed Central

    Sanders, N. W.; Mann, N. H.; Spengler, D. M.

    1997-01-01

    A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously this data was stored only in the medical center's mainframe DB2 database, it is now additionally stored in a departmental SQL database. Access to this data is available via any ODBC compliant front-end or a web client. With a small budget and no full time staff, we were able to give our department on-line access to many years worth of patient data that was previously inaccessible. PMID:9357735
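    The "departmental mirror" pattern the abstract describes (rows extracted from a legacy store loaded into a local SQL database that any SQL-capable client can query) can be sketched as follows. This is an illustration under stated assumptions: sqlite3 stands in for the departmental SQL server, and the table and column names are invented, not Vanderbilt's schema.

```python
# Sketch of mirroring extracted legacy rows into a local SQL database so that
# ODBC front-ends or a web client can query them with ordinary SQL.
# sqlite3 is a stand-in here; table/column names are hypothetical.
import sqlite3

def load_mirror(rows):
    """Create an in-memory mirror table and bulk-load extracted rows."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE visits (patient_id TEXT, visit_date TEXT, dx TEXT)")
    con.executemany("INSERT INTO visits VALUES (?, ?, ?)", rows)
    return con

def visits_for(con, patient_id):
    """The kind of ad-hoc query a front-end would issue against the mirror."""
    cur = con.execute(
        "SELECT visit_date, dx FROM visits WHERE patient_id = ? ORDER BY visit_date",
        (patient_id,))
    return cur.fetchall()
```

    The extraction from the mainframe side would run as a periodic batch job, so the mirror trades a little staleness for cheap, unrestricted departmental access.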

  3. Remote control canard missile with a free-rolling tail brake torque system

    NASA Technical Reports Server (NTRS)

    Blair, A. B., Jr.

    1981-01-01

    An experimental wind-tunnel investigation has been conducted at supersonic Mach numbers to determine the static aerodynamic characteristics of a cruciform canard-controlled missile with fixed and free-rolling tail-fin afterbodies. Mechanical coupling effects of the free-rolling tail afterbody were investigated using an electronic/electromagnetic brake system that provides arbitrary tail-fin brake torques with continuous measurements of tail-to-mainframe torque and tail-roll rate. Results are summarized to show the effects of fixed and free-rolling tail-fin afterbodies that include simulated measured bearing friction torques on the longitudinal and lateral-directional aerodynamic characteristics.

  4. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  5. Assessing Cognitive Learning of Analytical Problem Solving

    NASA Astrophysics Data System (ADS)

    Billionniere, Elodie V.

    Introductory programming courses, also known as CS1, have a specific set of expected outcomes related to the learning of the most basic and essential computational concepts in computer science (CS). However, two of the most often heard complaints in such courses are that (1) they are divorced from the reality of application and (2) they make the learning of the basic concepts tedious. The concepts introduced in CS1 courses are highly abstract and not easily comprehensible. In general, the difficulty is intrinsic to the field of computing, often described as "too mathematical or too abstract." This dissertation presents a small-scale mixed method study conducted during the fall 2009 semester of CS1 courses at Arizona State University. This study explored and assessed students' comprehension of three core computational concepts---abstraction, arrays of objects, and inheritance---in both algorithm design and problem solving. Through this investigation students' profiles were categorized based on their scores and based on their mistakes categorized into instances of five computational thinking concepts: abstraction, algorithm, scalability, linguistics, and reasoning. It was shown that even though the notion of computational thinking is not explicit in the curriculum, participants possessed and/or developed this skill through the learning and application of the CS1 core concepts. Furthermore, problem-solving experiences had a direct impact on participants' knowledge skills, explanation skills, and confidence. Implications for teaching CS1 and for future research are also considered.
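    The three core concepts the dissertation assesses can be illustrated in a few lines. These toy classes are my own, not the study's materials.

```python
# Toy illustrations of three CS1 concepts: abstraction (a common interface),
# inheritance (subtypes refining it), and arrays of objects (a collection
# processed through the shared interface).

class Shape:
    """Abstraction: a common interface that hides each shape's details."""
    def area(self):
        raise NotImplementedError

class Square(Shape):              # inheritance: a Square is-a Shape
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

class Circle(Shape):              # inheritance again, with a different rule
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return 3.14159 * self.radius ** 2

def total_area(shapes):
    """Arrays of objects: iterate a collection, dispatching via the base type."""
    return sum(s.area() for s in shapes)
```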

  6. Process-Based Development of Competence Models to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  7. City public service learns to speed read. [Computerized routing system for meter reading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitken, E.L.

    1994-02-01

    City Public Service (CPS) of San Antonio, TX is a municipally owned utility that serves a densely populated 1,566 square miles in and around San Antonio. CPS's service area is divided into 21 meter reading districts, each of which is broken down into no more than 99 regular routes. Every day, a CPS employee reads one of the districts, following one or more routes. In 1991, CPS began using handheld computers to record reads for regular routes, which are stored on the devices themselves. In contrast, rereads and final reads occur at random throughout the service area. Because they change every day, the process of creating routes that can be loaded onto a handheld device is difficult. Until recently, rereads and final reads were printed on paper orders, and route schedulers would spend close to two hours sorting the paper orders into routes. Meter readers would then hand-sequence the orders on their routes, often using a city map, before taking them into the field in stacks. When the meter readers returned, their completed orders had to be separated by type of reread, and then keyed into the mainframe computer before bill processing could begin. CPS's data processing department developed a computerized routing system of its own that saves time and labor, as well as paper. The system eliminates paper orders entirely, enabling schedulers to create reread and final read routes graphically on a PC. Information no longer needs to be keyed from hard copy, reducing the margin of error and streamlining bill processing by incorporating automated data transfer between systems.
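    The scheduling step the article says was automated (sorting the day's reread and final-read orders into sequenced routes for the handhelds) can be sketched as below. The data shape, field names, and per-route stop limit are assumptions for illustration, not CPS's actual system.

```python
# Hypothetical sketch of automated route building: group the day's orders by
# district, sort each district's orders by a walking-sequence key, then split
# them into routes small enough to load onto one handheld device.

def build_routes(orders, max_stops=25):
    """Return a list of routes, each a sequenced list of order dicts."""
    by_district = {}
    for order in orders:
        by_district.setdefault(order["district"], []).append(order)
    routes = []
    for district in sorted(by_district):
        stops = sorted(by_district[district], key=lambda o: o["seq"])
        for i in range(0, len(stops), max_stops):
            routes.append(stops[i:i + max_stops])
    return routes
```

    Replacing the two hours of manual paper sorting with a grouping-and-sorting pass like this is what lets the schedulers work graphically on a PC instead.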

  8. Restructuring VA ambulatory care and medical education: the PACE model of primary care.

    PubMed

    Cope, D W; Sherman, S; Robbins, A S

    1996-07-01

    The Veterans Health Administration (VHA) Western Region and associated medical schools formulated a set of recommendations for an improved ambulatory health care delivery system during a 1988 strategic planning conference. As a result, the Department of Veterans Affairs (VA) Medical Center in Sepulveda, California, initiated the Pilot (now Primary) Ambulatory Care and Education (PACE) program in 1990 to implement and evaluate a model program. The PACE program represents a significant departure from traditional VA and non-VA academic medical center care, shifting the focus of care from the inpatient to the outpatient setting. From its inception, the PACE program has used an interdisciplinary team approach with three independent global care firms. Each firm is interdisciplinary in composition, with a matrix management structure that expands role function and empowers team members. Emphasis is on managed primary care, stressing a biopsychosocial approach and cost-effective comprehensive care emphasizing prevention and health maintenance. Information management is provided through a network of personal computers that serve as a front end to the VHA Decentralized Hospital Computer Program (DHCP) mainframe. In addition to providing comprehensive and cost-effective care, the PACE program educates trainees in all health care disciplines, conducts research, and disseminates information about important procedures and outcomes. Undergraduate and graduate trainees from 11 health care disciplines rotate through the PACE program to learn an integrated approach to managed ambulatory care delivery. All trainees are involved in a problem-based approach to learning that emphasizes shared training experiences among health care disciplines. This paper describes the transitional phases of the PACE program (strategic planning, reorganization, and quality improvement) that are relevant for other institutions that are shifting to training programs emphasizing primary and ambulatory care.

  9. Constructing conceptual knowledge and promoting "number sense" from computer-managed practice in rounding whole numbers

    NASA Astrophysics Data System (ADS)

    Hativa, Nira

    1993-12-01

    This study sought to identify how high achievers learn and understand new concepts in arithmetic from computer-based practice which provides full solutions to examples but without verbal explanations. Four high-achieving second graders were observed in their natural school settings throughout all their computer-based practice sessions which involved the concept of rounding whole numbers, a concept which was totally new to them. Immediate post-session interviews inquired into students' strategies for solutions, errors, and their understanding of the underlying mathematical rules. The article describes the process through which the students construct their knowledge of the rounding concepts and the errors and misconceptions encountered in this process. The article identifies the cognitive abilities that promote student self-learning of the rounding concepts, their number concepts and "number sense." Differences in the ability to generalise, "mathematical memory," mindfulness of work and use of cognitive strategies are shown to account for the differences in patterns of, and gains in, learning and in maintaining knowledge among the students involved. Implications for the teaching of estimation concepts and of promoting students' "number sense," as well as for classroom use of computer-based practice are discussed.

  10. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  11. Minimizing the Free Energy: A Computer Method for Teaching Chemical Equilibrium Concepts.

    ERIC Educational Resources Information Center

    Heald, Emerson F.

    1978-01-01

    Presents a computer method for teaching chemical equilibrium concepts using material balance conditions and minimization of the free energy. The method for calculating chemical equilibrium, the computer program used to solve equilibrium problems, and applications of the method are also included. (HM)
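
    Heald's original program is not reproduced in the record, but the core idea, finding equilibrium by minimizing free energy subject to material balance, can be sketched for the simplest case: an ideal A <-> B isomerization, where the material balance reduces to mole fractions x and 1-x. The function names and the ternary-search minimizer below are illustrative choices, not the article's method.

```python
import math

def gibbs(x, dG0_RT):
    """Dimensionless free energy G/RT of an ideal A <-> B mixture at extent x:
    entropy-of-mixing terms plus the standard reaction free energy."""
    return x * math.log(x) + (1 - x) * math.log(1 - x) + x * dG0_RT

def equilibrium_extent(dG0_RT, lo=1e-9, hi=1 - 1e-9):
    """G is convex in x on (0, 1), so a ternary search finds its minimum."""
    for _ in range(200):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if gibbs(m1, dG0_RT) < gibbs(m2, dG0_RT):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

# Setting dG/dx = 0 gives x/(1-x) = exp(-dG0/RT), the familiar equilibrium
# constant; with dG0 = 0 the minimum sits at x = 0.5.
print(round(equilibrium_extent(0.0), 3))  # 0.5
```

    The same minimization viewpoint generalizes to multi-species systems, where the material balance becomes a set of element-conservation constraints.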

  12. Using Problem Solving to Teach a Programming Language.

    ERIC Educational Resources Information Center

    Milbrandt, George

    1995-01-01

    Computer studies courses should incorporate as many computer concepts and programming language experiences as possible. A gradual increase in problem difficulty will help the student to understand various computer concepts, and the programming language's syntax and structure. A sidebar provides two examples of how to establish a learning…

  13. Using Real-Life Experiences to Teach Computer Concepts

    ERIC Educational Resources Information Center

    Read, Alexis

    2012-01-01

    Teaching computer concepts to individuals with visual impairments (that is, those who are blind or visually impaired) presents some unique challenges. Students often have difficulty remembering to perform certain steps or have difficulty remembering specific keystrokes when using computers. Many cannot visualize the way in which complex computing…

  14. Teaching Topographic Map Skills and Geomorphology Concepts with Google Earth in a One-Computer Classroom

    ERIC Educational Resources Information Center

    Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming

    2018-01-01

    Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…

  15. Using the Tower of Hanoi puzzle to infuse your mathematics classroom with computer science concepts

    NASA Astrophysics Data System (ADS)

    Marzocchi, Alison S.

    2016-07-01

    This article suggests that logic puzzles, such as the well-known Tower of Hanoi puzzle, can be used to introduce computer science concepts to mathematics students of all ages. Mathematics teachers introduce their students to computer science concepts that are enacted spontaneously and subconsciously throughout the solution to the Tower of Hanoi puzzle. These concepts include, but are not limited to, conditionals, iteration, and recursion. Lessons, such as the one proposed in this article, are easily implementable in mathematics classrooms and extracurricular programmes as they are good candidates for 'drop in' lessons that do not need to fit into any particular place in the typical curriculum sequence. As an example for readers, the author describes how she used the puzzle in her own Number Sense and Logic course during the federally funded Upward Bound Math/Science summer programme for college-intending low-income high school students. The article explains each computer science term with real-life and mathematical examples, applies each term to the Tower of Hanoi puzzle solution, and describes how students connected the terms to their own solutions of the puzzle. It is timely and important to expose mathematics students to computer science concepts. Given the rate at which technology is currently advancing, and our increased dependence on technology in our daily lives, it has become more important than ever for children to be exposed to computer science. Yet, despite the importance of exposing today's children to computer science, many children are not given adequate opportunity to learn computer science in schools. In the United States, for example, most students finish high school without ever taking a computing course. Mathematics lessons, such as the one described in this article, can help to make computer science more accessible to students who may have otherwise had little opportunity to be introduced to these increasingly important concepts.
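
    The recursion the article describes can be made concrete. This is a minimal sketch, not code from the article: each call parks n-1 disks on the auxiliary peg, moves the largest disk, then recurses again, which yields the classic 2^n - 1 move count.

```python
def hanoi(n, src="A", aux="B", dst="C", moves=None):
    """Recursively solve the Tower of Hanoi, returning the list of moves."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, src, dst, aux, moves)   # move n-1 disks out of the way
    moves.append((src, dst))             # move the largest disk
    hanoi(n - 1, aux, src, dst, moves)   # stack the n-1 disks back on top
    return moves

# n disks require exactly 2**n - 1 moves.
print(len(hanoi(3)))  # 7
```

    The conditional (the base case), the iteration implicit in repeated calls, and the recursion itself are exactly the three concepts the article draws out of students' hand solutions.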

  16. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    NASA Astrophysics Data System (ADS)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are to master physics concepts, to cultivate a scientific (including critical) attitude, and to develop inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies show that both competences are under-achieved: student learning outcomes are low, and the learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase mastery of concepts and train CTS is the inquiry learning model aided by computer simulations. In this model, students were given the opportunity to be actively involved in experiments while also receiving good explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations significantly improves students' mastery of concepts compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students showed high CTS, 63.3% medium, and 16.7% low. CTS contributes greatly to students' mastery of concepts, with a correlation coefficient of 0.697, and contributes moderately to the enhancement of mastery of concepts, with a correlation coefficient of 0.603.

  17. Computer-assisted concept mapping: Visual aids for knowledge construction

    PubMed Central

    Mammen, Jennifer R.

    2016-01-01

    Background Concept mapping is a visual representation of ideas that facilitates critical thinking and is applicable to many areas of nursing education. Computer-Assisted Concept Maps are more flexible and less constrained than traditional paper methods, allowing for analysis and synthesis of complex topics and larger amounts of data. Ability to iteratively revise and collaboratively create computerized maps can contribute to enhanced interpersonal learning. However, there is limited awareness of free software that can support these types of applications. Discussion This educational brief examines affordances and limitations of Computer-Assisted Concept Maps and reviews free software for development of complex, collaborative malleable maps. Free software such as VUE, Xmind, MindMaple, and others can substantially contribute to utility of concept-mapping for nursing education. Conclusions Computerized concept-mapping is an important tool for nursing and is likely to hold greater benefit for students and faculty than traditional pen and paper methods alone. PMID:27351610

  18. Examining Trust, Forgiveness and Regret as Computational Concepts

    NASA Astrophysics Data System (ADS)

    Marsh, Stephen; Briggs, Pamela

    The study of trust has advanced tremendously in recent years, to the extent that the goal of a more unified formalisation of the concept is becoming feasible. To that end, we have begun to examine the closely related concepts of regret and forgiveness and their relationship to trust and its siblings. The resultant formalisation allows computational tractability in, for instance, artificial agents. Moreover, regret and forgiveness, when allied to trust, are very powerful tools in the Ambient Intelligence (AmI) security area, especially where Human Computer Interaction and concrete human understanding are key. This paper introduces the concepts of regret and forgiveness, exploring them from a social psychological as well as a computational viewpoint, and presents an extension to Marsh's original trust formalisation that takes them into account. It discusses and explores work in the AmI environment, and further potential applications.

  19. Using Rasch Measurement to Develop a Computer Modeling-Based Instrument to Assess Students' Conceptual Understanding of Matter

    ERIC Educational Resources Information Center

    Wei, Silin; Liu, Xiufeng; Wang, Zuhao; Wang, Xingqiao

    2012-01-01

    Research suggests that difficulty in making connections among three levels of chemical representations--macroscopic, submicroscopic, and symbolic--is a primary reason for student alternative conceptions of chemistry concepts, and computer modeling is promising to help students make the connections. However, no computer modeling-based assessment…

  20. Teachers' Perceptions on the Use of ICT in a CAL Environment to Enhance the Conception of Science Concepts

    ERIC Educational Resources Information Center

    George, Frikkie; Ogunniyi, M.

    2016-01-01

    Instructional methodologies increasingly require teachers' efficacy and implementation of computer-assisted learning (CAL) practices in general and particularly in the science classroom. The South African National Education Department's e-Education[1] policy also encourages the use of computers and computer software in implementing outcome-based…

  1. Using a Virtual Class to Demonstrate Computer-Mediated Group Dynamics Concepts

    ERIC Educational Resources Information Center

    Franz, Timothy M.; Vicker, Lauren A.

    2010-01-01

    We report about an active learning demonstration designed to use a virtual class to present computer-mediated group communication course concepts to show that students can learn about these concepts in a virtual class. We designated 1 class period as a virtual rather than face-to-face class, when class members "attended" virtually using…

  2. Resequencing Skills and Concepts in Applied Calculus Using the Computer as a Tool.

    ERIC Educational Resources Information Center

    Heid, M. Kathleen

    1988-01-01

    During the first 12 weeks of an applied calculus course, two classes of college students studied calculus concepts using graphical and symbol-manipulation computer programs to perform routine manipulations. Three weeks were spent on skill development. Students showed better understanding of concepts and performed almost as well on routine skills.…

  3. Assessing Changes in High School Students' Conceptual Understanding through Concept Maps before and after the Computer-Based Predict-Observe-Explain (CB-POE) Tasks on Acid-Base Chemistry at the Secondary Level

    ERIC Educational Resources Information Center

    Yaman, Fatma; Ayas, Alipasa

    2015-01-01

    Although concept maps have been used as alternative assessment methods in education, there has been an ongoing debate on how to evaluate students' concept maps. This study discusses how to evaluate students' concept maps as an assessment tool before and after 15 computer-based Predict-Observe-Explain (CB-POE) tasks related to acid-base chemistry.…

  4. Forecasting the need for physicians in the United States: the Health Resources and Services Administration's physician requirements model.

    PubMed Central

    Greenberg, L; Cultice, J M

    1997-01-01

    OBJECTIVE: The Health Resources and Services Administration's Bureau of Health Professions developed a demographic utilization-based model of physician specialty requirements to explore the consequences of a broad range of scenarios pertaining to the nation's health care delivery system on need for physicians. DATA SOURCE/STUDY SETTING: The model uses selected data primarily from the National Center for Health Statistics, the American Medical Association, and the U.S. Bureau of Census. Forecasts are national estimates. STUDY DESIGN: Current (1989) utilization rates for ambulatory and inpatient medical specialty services were obtained for the population according to age, gender, race/ethnicity, and insurance status. These rates are used to estimate specialty-specific total service utilization expressed in patient care minutes for future populations and converted to physician requirements by applying per-physician productivity estimates. DATA COLLECTION/EXTRACTION METHODS: Secondary data were analyzed and put into matrixes for use in the mainframe computer-based model. Several missing data points, e.g., for HMO-enrolled populations, were extrapolated from available data by the project's contractor. PRINCIPAL FINDINGS: The authors contend that the Bureau's demographic utilization model represents improvements over other data-driven methodologies that rely on staffing ratios and similar supply-determined bases for estimating requirements. The model's distinct utility rests in offering national-level physician specialty requirements forecasts. Images Figure 1 PMID:9018213
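
    The model's core arithmetic, applying per-group utilization rates to population projections and converting total care minutes to head counts via productivity, can be sketched as follows. The group names and every number here are invented for illustration; they are not drawn from the Bureau's data.

```python
def physicians_needed(populations, minutes_per_person, minutes_per_physician):
    """Utilization-based requirement: total demanded patient-care minutes
    across demographic groups, divided by one physician's annual
    patient-care minutes (the productivity estimate)."""
    total_minutes = sum(populations[g] * minutes_per_person[g]
                        for g in populations)
    return total_minutes / minutes_per_physician

# Illustrative, made-up figures for two demographic groups:
pop = {"under_65": 1_000_000, "over_65": 200_000}
use = {"under_65": 60, "over_65": 180}   # annual care minutes per person
print(physicians_needed(pop, use, 100_000))  # 960.0
```

    The real model runs this kind of calculation per specialty and per scenario, with the utilization matrix further split by age, gender, race/ethnicity, and insurance status.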

  5. Dense wavelength division multiplexing devices for metropolitan-area datacom and telecom networks

    NASA Astrophysics Data System (ADS)

    DeCusatis, Casimer M.; Priest, David G.

    2000-12-01

    Large data processing environments in use today can require multi-gigabyte or terabyte capacity in the data communication infrastructure; these requirements are being driven by storage area networks with access to petabyte databases, new architectures for parallel processing that require high-bandwidth optical links, and rapidly growing network applications such as electronic commerce over the Internet or virtual private networks. These datacom applications require high availability, fault tolerance, security, and the capacity to recover from any single point of failure without relying on traditional SONET-based networking. These requirements, coupled with fiber exhaust in metropolitan areas, are driving the introduction of dense optical wavelength division multiplexing (DWDM) in data communication systems, particularly for large enterprise servers or mainframes. In this paper, we examine the technical requirements for emerging next-generation DWDM systems. Protocols for storage area networks and computer architectures such as Parallel Sysplex are presented, including their fiber bandwidth requirements. We then describe two commercially available DWDM solutions, a first-generation 10-channel system and a recently announced next-generation 32-channel system. Technical requirements, network management and security, fault-tolerant network designs, new network topologies enabled by DWDM, and the role of time division multiplexing in the network are all discussed. Finally, we present a description of testing conducted on these networks and future directions for this technology.

  6. Utility Computing: Reality and Beyond

    NASA Astrophysics Data System (ADS)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. Much like water, gas, electricity and telecommunications, computing as a public utility was announced as early as 1955, yet Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable `on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?

  7. The Influence of Using Momentum and Impulse Computer Simulation to Senior High School Students’ Concept Mastery

    NASA Astrophysics Data System (ADS)

    Kaniawati, I.; Samsudin, A.; Hasopa, Y.; Sutrisno, A. D.; Suhendi, E.

    2016-08-01

    This research is based on students' lack of mastery of abstract physics concepts. The study therefore aims to improve senior high school students' mastery of momentum and impulse concepts through the use of computer simulation. To achieve this objective, the research method employed was a pre-experimental design with a one-group pretest-posttest. A total of 36 grade-11 science students in a public senior high school in Bandung served as the sample. The instruments utilized to determine the increase in students' concept mastery were a pretest and a posttest in the form of multiple-choice items. After using computer simulations in physics learning, students' mastery of momentum and impulse concepts increased, as indicated by a normalized gain of 0.64 (medium category).
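
    The normalized gain reported here is conventionally Hake's <g>: the fraction of the possible pretest-to-posttest improvement that was actually achieved. A small sketch, with invented scores chosen to be consistent with g = 0.64:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: achieved improvement divided by the
    maximum improvement still available at pretest."""
    return (post - pre) / (max_score - pre)

# Hypothetical class means: pretest 40, posttest 78.4 (out of 100).
print(round(normalized_gain(40, 78.4), 2))  # 0.64
```

    Gains of roughly 0.3 to 0.7 are customarily labeled "medium", matching the category the abstract assigns to 0.64.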

  8. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    PubMed Central

    Calvo, Iñaki

    2014-01-01

    Authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 course are described. This study contributes to the field of concept mapping as these kinds of cognitive tools have proved to be valid to support learning in computer engineering education. It contributes to the field of computer engineering education, providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above mentioned objectives. PMID:25538957

  9. Conceptions of Effective Teaching and Perceived Use of Computer Technologies in Active Learning Classrooms

    ERIC Educational Resources Information Center

    Gebre, Engida; Saroyan, Alenoush; Aulls, Mark W.

    2015-01-01

    This paper examined professors' conceptions of effective teaching in the context of a course they were teaching in active learning classrooms and how the conceptions related to the perceived role and use of computers in their teaching. We interviewed 13 professors who were teaching in active learning classrooms in winter 2011 in a large research…

  10. Fog-computing concept usage as means to enhance information and control system reliability

    NASA Astrophysics Data System (ADS)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and the data processing components. In an ICS, workload can likewise be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the "traditional" ICS architecture and for one using elements of the fog-computing concept. The paper contains some models, selected simulation results and conclusions about the prospects of fog computing as a means to enhance ICS reliability.

  11. Defining a New 21st Century Skill-Computational Thinking: Concepts and Trends

    ERIC Educational Resources Information Center

    Haseski, Halil Ibrahim; Ilic, Ulas; Tugtekin, Ufuk

    2018-01-01

    Computational Thinking is a skill that guides the 21st-century individual through the problems experienced in daily life, and it has ever-increasing significance. Multifarious definitions have been attempted to explain the concept of Computational Thinking. However, it was determined that there was no consensus on this matter in the literature and…

  12. Studies Relating to Computer Use of Spelling and Grammar Checkers and Educational Achievement

    ERIC Educational Resources Information Center

    Radi, Odette Bourjaili

    2015-01-01

    The content of this paper will focus on both language and computer practices and how school age students develop their literacy skills in the two domains of "language" and "computers." The term literacy is a broad concept that has attracted many interpretations over the years. Some of the concepts raised by the literature apply…

  13. An Investigation of the Effectiveness of Computer Simulation Programs as Tutorial Tools for Teaching Population Ecology at University.

    ERIC Educational Resources Information Center

    Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.

    1999-01-01

    Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…

  14. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  15. Easy boundary definition for EGUN

    NASA Astrophysics Data System (ADS)

    Becker, R.

    1989-06-01

    The relativistic electron optics program EGUN [1] has reached a broad distribution, and many users have asked for an easier way of boundary input. A preprocessor to EGUN has been developed that accepts polygonal input of boundary points, and offers features such as rounding off of corners, shifting and squeezing of electrodes and simple input of slanted Neumann boundaries. This preprocessor can either be used on a PC that is linked to a mainframe using the FORTRAN version of EGUN, or in connection with the version EGNc, which also runs on a PC. In any case, direct graphic response on the PC greatly facilitates the creation of correct input files for EGUN.
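
    The abstract does not describe the preprocessor's rounding algorithm. As one generic way to "round off corners" of a polygonal boundary, Chaikin's corner-cutting scheme (an illustrative stand-in, not necessarily what the EGUN preprocessor uses) replaces each vertex with two points placed 1/4 and 3/4 of the way along its adjacent edges:

```python
def chaikin(points, iterations=1):
    """Chaikin corner cutting on a closed polygon.

    Each pass doubles the vertex count and trims every corner, so the
    polygon converges toward a smooth (quadratic B-spline) outline.
    """
    for _ in range(iterations):
        out = []
        n = len(points)
        for i in range(n):
            (x0, y0), (x1, y1) = points[i], points[(i + 1) % n]
            out.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            out.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        points = out
    return points

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(len(chaikin(square)))  # 8
```

    A boundary smoothed this way can then be rasterized onto the finite-difference mesh that EGUN expects.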

  16. Applied Research Study

    NASA Technical Reports Server (NTRS)

    Leach, Ronald J.

    1997-01-01

    The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
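
    The study's duplicate-detection procedures are not detailed in the abstract. A common approach, sketched here with invented file names, is to hash file contents so that copies that differ only in name (the inadequate-naming problem described above) hash identically:

```python
import hashlib

def find_duplicates(files):
    """Group files by a digest of their contents; file names are ignored,
    so renamed copies are still detected as duplicates.

    files maps a file name to its text content.
    """
    by_hash = {}
    for name, content in files.items():
        digest = hashlib.sha256(content.encode()).hexdigest()
        by_hash.setdefault(digest, []).append(name)
    return [sorted(names) for names in by_hash.values() if len(names) > 1]

modules = {
    "orbit_v1.f": "CALL PROPAGATE(STATE)",
    "orbit_old.f": "CALL PROPAGATE(STATE)",   # same code, different name
    "telemetry.f": "CALL DOWNLINK(FRAME)",
}
print(find_duplicates(modules))  # [['orbit_old.f', 'orbit_v1.f']]
```

    In practice one would normalize whitespace and comments before hashing, since near-duplicates produced by trivial edits would otherwise slip through.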

  17. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. HIS priorities in developing countries.

    PubMed

    Amado Espinosa, L

    1995-04-01

    Looking for a solution to fulfill the requirements that the new global economical system demands, developing countries face a reality of poor communications infrastructure, a delay in applying information technology to the organizations, and a semi-closed political system avoiding the necessary reforms. HIS technology has been developed more for transactional purposes on mini and mainframe platforms. Administrative modules are the most frequently observed and physicians are now requiring more support for their activities. The second information systems generation will take advantage of PC technology, client-server models and telecommunications to achieve integration. International organizations, academic and industrial, public and private, will play a major role to transfer technology and to develop this area.

  19. A Collaborative Knowledge Management Process for Implementing Healthcare Enterprise Information Systems

    NASA Astrophysics Data System (ADS)

    Cheng, Po-Hsun; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    This paper illustrates a feasible health informatics domain knowledge management process which helps gather useful technology information and reduce knowledge misunderstandings among the engineers who participated in the IBM mainframe rightsizing project at National Taiwan University (NTU) Hospital. We design an asynchronous sharing mechanism to facilitate knowledge transfer, and our health informatics domain knowledge management process can be used to publish and retrieve documents dynamically. It effectively creates an acceptable discussion environment and even lessens the traditional meeting burden among development engineers. An overall description of the current software development status is presented. Then, the knowledge management implementation of health information systems is proposed.

  20. Computer-Assisted Concept Mapping: Visual Aids for Knowledge Construction.

    PubMed

    Mammen, Jennifer R

    2016-07-01

    Concept mapping is a visual representation of ideas that facilitates critical thinking and is applicable to many areas of nursing education. Computer-assisted concept maps are more flexible and less constrained than traditional paper methods, allowing for analysis and synthesis of complex topics and larger amounts of data. Ability to iteratively revise and collaboratively create computerized maps can contribute to enhanced interpersonal learning. However, there is limited awareness of free software that can support these types of applications. This educational brief examines affordances and limitations of computer-assisted concept maps and reviews free software for development of complex, collaborative malleable maps. Free software, such as VUE, XMind, MindMaple, and others, can substantially contribute to the utility of concept mapping for nursing education. Computerized concept-mapping is an important tool for nursing and is likely to hold greater benefit for students and faculty than traditional pen-and-paper methods alone. [J Nurs Educ. 2016;55(7):403-406.]. Copyright 2016, SLACK Incorporated.

  1. Development of Interactive Computer Programs To Help Students Transfer Basic Skills to College Level Science and Behavioral Science Courses.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    Interactive computer programs, developed at Indiana University's Learning Skills Center, were designed to model effective strategies for reading biology and psychology textbooks. For each subject area, computer programs and textbook passages were used to instruct and model for students how to identify key concepts, compare and contrast concepts,…

  2. "Games Are Made for Fun": Lessons on the Effects of Concept Maps in the Classroom Use of Computer Games

    ERIC Educational Resources Information Center

    Charsky, Dennis; Ressler, William

    2011-01-01

    Does using a computer game improve students' motivation to learn classroom material? The current study examined students' motivation to learn history concepts while playing a commercial, off-the-shelf computer game, Civilization III. The study examined the effect of using conceptual scaffolds to accompany game play. Students from three ninth-grade…

  3. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

This paper presents the design and analysis of a computer mouse using the Design for Disassembly (DFD) methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the DFD methodology, proceeding through sketch generation, concept selection, and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine their environmental impact. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while retaining a high maximum stress value.
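
The concept-scoring step described above can be sketched as a simple weighted decision matrix. The criteria, weights, and ratings below are invented for illustration (chosen so that concept B wins, matching the abstract's outcome); they are not the paper's actual data.

```python
# Hypothetical Pugh-style concept-scoring matrix for mouse design concepts.
# Criteria weights sum to 1.0; ratings run from 1 (poor) to 5 (good).
criteria = {"disassembly time": 0.4, "part count": 0.3, "cost": 0.3}
ratings = {
    "concept A": {"disassembly time": 3, "part count": 2, "cost": 4},
    "concept B": {"disassembly time": 5, "part count": 4, "cost": 3},
    "concept C": {"disassembly time": 2, "part count": 3, "cost": 5},
}

# Weighted score per concept, then pick the highest.
scores = {c: sum(w * ratings[c][k] for k, w in criteria.items())
          for c in ratings}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

The weighting makes the trade-off explicit: concept B is not the cheapest option, but it dominates on the heavily weighted disassembly-time criterion.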

  4. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    NASA Astrophysics Data System (ADS)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-10-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the prediction-observation-explanation (POE) strategy (White and Gunstone in Probing understanding. Routledge, New York, 1992) on facilitating preschoolers' acquisition of scientific concepts regarding light and shadow. The children's alternative conceptions were explored as well. Fifty participants were randomly assigned into either an experimental group that played a computer game integrating the POE model or a control group that played a non-POE computer game. By assessing the students' conceptual understanding through interviews, this study revealed that the students in the experimental group significantly outperformed their counterparts in the concepts regarding "shadow formation in daylight" and "shadow orientation." However, children in both groups, after playing the games, still expressed some alternative conceptions such as "Shadows always appear behind a person" and "Shadows should be on the same side as the sun."

  5. ORCA Project: Research on high-performance parallel computer programming environments. Final report, 1 Apr-31 Mar 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, L.; Notkin, D.; Adams, L.

    1990-03-31

This task relates to research on programming massively parallel computers. Previous work on the Ensemble concept of programming was extended, and an investigation into nonshared-memory models of parallel computation was undertaken. Earlier work on the Ensemble concept defined a set of programming abstractions and organized the programming task into three distinct levels: composition of machine instructions, composition of processes, and composition of phases. It was applied to shared-memory models of computation. During the present research period, these concepts were extended to nonshared-memory models, and one Ph.D. thesis was completed and one book chapter and six conference proceedings were published.

  6. X-Ray Radiography of Gas Turbine Ceramics.

    DTIC Science & Technology

    1979-10-20

Microfocus X-ray equipment; the definition of equipment concepts for a computer assisted tomography (CAT) system; and the development of a CAT ... were obtained from these test coupons using Microfocus X-ray and image enhancement techniques. A Computer Assisted Tomography (CAT) design concept ... monitor. Computer reconstruction algorithms were investigated with respect to CAT, and a preferred approach was determined. An appropriate CAT algorithm

  7. Do We Need to Understand the Technology to Get to the Science? A Systematic Review of the Concept of Computer Literacy in Preventive Health Programs

    ERIC Educational Resources Information Center

    Dominick, Gregory M.; Friedman, Daniela B.; Hoffman-Goetz, Laurie

    2009-01-01

    Objective: To systematically review definitions and descriptions of computer literacy as related to preventive health education programs. Method: A systematic review of the concept of computer literacy as related to preventive health education was conducted. Empirical studies published between 1994 and 2007 on prevention education programs with a…

  8. On introduction of artificial intelligence elements to heat power engineering

    NASA Astrophysics Data System (ADS)

    Dregalin, A. F.; Nazyrova, R. R.

    1993-10-01

The basic problems of the 'thermodynamic intelligence' of personal computers are outlined. The thermodynamic intelligence of personal computers is introduced as a concept for the heat processes occurring in engines of flying vehicles. In particular, the thermodynamic intelligence of computers is determined by the possibility of deriving formal relationships between thermodynamic functions. In chemical thermodynamics, the concept of a characteristic function has been introduced.

  9. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  10. Combinatorial computational chemistry approach for materials design: applications in deNOx catalysis, Fischer-Tropsch synthesis, lanthanoid complex, and lithium ion secondary battery.

    PubMed

    Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira

    2007-02-01

    Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
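
The high-throughput screening idea behind combinatorial computational chemistry can be sketched as an exhaustive enumeration over a composition space. The element lists and the `surrogate_score` function below are invented stand-ins for real quantum-chemical evaluations, not the authors' actual systems.

```python
from itertools import product

# Hypothetical catalyst composition space: dopant x support x metal loading.
DOPANTS = ["Cu", "Fe", "Co", "Ni"]
SUPPORTS = ["ZSM-5", "Al2O3"]
LOADINGS = [0.5, 1.0, 2.0]

def surrogate_score(dopant, support, loading):
    # Placeholder for a computed activity (in practice, an expensive
    # quantum-chemistry calculation would be run per candidate).
    base = {"Cu": 3.0, "Fe": 2.5, "Co": 2.0, "Ni": 1.5}[dopant]
    bonus = 0.5 if support == "ZSM-5" else 0.0
    return base + bonus - (loading - 1.0) ** 2  # optimum near loading 1.0

# Theoretical high-throughput screening: evaluate and rank every combination.
ranked = sorted(product(DOPANTS, SUPPORTS, LOADINGS),
                key=lambda c: surrogate_score(*c), reverse=True)
print("best candidate:", ranked[0])
```

The combinatorial part is the systematic enumeration; the computational part is that each candidate is scored by simulation rather than synthesized and tested in the laboratory.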

  11. Man/terminal interaction evaluation of computer operating system command and control service concepts. [in Spacelab

    NASA Technical Reports Server (NTRS)

    Dodson, D. W.; Shields, N. L., Jr.

    1978-01-01

    The Experiment Computer Operating System (ECOS) of the Spacelab will allow the onboard Payload Specialist to command experiment devices and display information relative to the performance of experiments. Three candidate ECOS command and control service concepts were reviewed and laboratory data on operator performance was taken for each concept. The command and control service concepts evaluated included a dedicated operator's menu display from which all command inputs were issued, a dedicated command key concept with which command inputs could be issued from any display, and a multi-display concept in which command inputs were issued from several dedicated function displays. Advantages and disadvantages are discussed in terms of training, operational errors, task performance time, and subjective comments of system operators.

  12. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  13. Computing Realized Compound Yield with a Financial Calculator: A Note

    ERIC Educational Resources Information Center

    Moy, Ronald L.; Terregrossa, Ralph

    2011-01-01

    This note points out that realized compound yield (RCY) has a similar concept from capital budgeting; namely, modified internal rate of return. Recognizing this relationship makes it easier to teach the concept and allows students to easily compute RCY using a financial calculator.
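
The RCY/MIRR relationship the note describes can be illustrated numerically. The bond price, coupon schedule, and reinvestment rate below are made-up figures for the sketch, not taken from the article.

```python
# Realized compound yield (RCY) computed as an MIRR-style calculation.
price = 1000.0          # purchase price of the bond
coupons = [60.0] * 5    # five annual coupons over the holding period
reinvest = 0.04         # assumed reinvestment rate for the coupons
face = 1000.0           # redemption value at maturity

# Terminal value: every coupon grows at the reinvestment rate to the horizon,
# plus the redemption value received at maturity.
n = len(coupons)
terminal = face + sum(c * (1 + reinvest) ** (n - 1 - t)
                      for t, c in enumerate(coupons))

# RCY is the single compound rate that grows the price to the terminal value,
# which is exactly the modified-internal-rate-of-return idea from capital
# budgeting (and a two-keystroke future-value problem on a financial calculator).
rcy = (terminal / price) ** (1 / n) - 1
print(f"terminal value = {terminal:.2f}, RCY = {rcy:.4%}")
```

Note that the RCY here (about 5.8%) is below the 6% coupon rate because the assumed reinvestment rate is only 4%, which is the point of the realized-yield concept.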

  14. Voter comparator switch provides fail safe data communications system - A concept

    NASA Technical Reports Server (NTRS)

    Koczela, L. J.; Wilgus, D. S.

    1971-01-01

    System indicates status of computers and controls operational modes. Two matrices are used - one relating to permissible system states, the other relating to requested system states. Concept is useful to designers of digital data transmission systems and time shared computer systems.
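
The two-matrix idea (permissible system states versus requested system states) can be sketched as a small lookup. The mode names and the permissibility table below are hypothetical illustrations, not the concept's actual state definitions.

```python
# Sketch of the voter/comparator switch idea: a requested mode change is
# granted only if the (current, requested) pair appears in the matrix of
# permissible system states; otherwise the system holds its current mode.
PERMISSIBLE = {
    "triplex": {"duplex"},   # a triplex computer set may degrade to duplex
    "duplex":  {"simplex"},  # a duplex set may degrade to simplex
    "simplex": set(),        # no further degradation possible
}

def switch(current, requested):
    """Return the new mode if the request is permissible; otherwise hold."""
    return requested if requested in PERMISSIBLE.get(current, set()) else current

print(switch("triplex", "duplex"))   # request is in the matrix: granted
print(switch("duplex", "triplex"))   # not permissible: hold current mode
```

Separating the permissible-state matrix from the requested-state matrix is what makes the scheme fail-safe: a faulty request can never drive the system into a state the designers did not enumerate.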

  15. [Georg Schlöndorff-the father of computer-assisted surgery].

    PubMed

    Mösges, R

    2016-09-01

Georg Schlöndorff (1931-2011) developed the idea of computer-assisted surgery (CAS) during his time as professor and chairman of the Department of Otorhinolaryngology at the Medical Faculty of the University of Aachen, Germany. In close cooperation with engineers and physicists, he succeeded in translating this concept into a functional prototype that was applied in live surgery in the operating theatre. The first intervention performed with this image-guided navigation system was a skull base surgical procedure in 1987. During the following years, this concept was extended to orbital surgery, neurosurgery, mid-facial traumatology, and brachytherapy of solid tumors in the head and neck region. Further technical developments of this first prototype included touchless optical positioning and the computer vision concept with three orthogonal images, which is still common in contemporary navigation systems. After becoming emeritus professor in 1996, Georg Schlöndorff further pursued his concept of CAS by developing technical innovations such as computational fluid dynamics (CFD).

  16. State-Transition Structures in Physics and in Computation

    NASA Astrophysics Data System (ADS)

    Petri, C. A.

    1982-12-01

    In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.

  17. Educational Technology: Best Practices from America's Schools.

    ERIC Educational Resources Information Center

    Bozeman, William C.; Baumbach, Donna J.

    This book begins with an overview of computer technology concepts, including computer system configurations, computer communications, and software. Instructional computer applications are then discussed; topics include computer-assisted instruction, computer-managed instruction, computer-enhanced instruction, LOGO, authoring programs, presentation…

  18. Turbulence modeling for compressible flows

    NASA Technical Reports Server (NTRS)

    Marvin, J. G.

    1977-01-01

Material prepared for a course on Applications and Fundamentals of Turbulence given at the University of Tennessee Space Institute, January 10 and 11, 1977, is presented. A complete concept of turbulence modeling is described, and examples of progress in its use in computational aerodynamics are given. Modeling concepts, experiments, and computations using the concepts are reviewed in a manner that provides an up-to-date statement on the status of this problem for compressible flows.

  19. A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Buonanno, Michael; Mavris, Dimitri

    2005-01-01

The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criterion of aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimum baseline vehicle. These concept decisions, such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements.
In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.
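
A genetic algorithm of the kind the paper applies can be sketched minimally. The parameter count, population size, and the stand-in `fitness` function below are invented; they are not the paper's sixty-plus design variables or its actual objective.

```python
import random

# Minimal concept-selection GA sketch with invented settings.
random.seed(1)
N_PARAMS, POP, GENS = 6, 30, 40

def fitness(x):
    # Hypothetical quantitative criterion (e.g. a gross-weight surrogate);
    # higher is better, with the optimum at every parameter equal to 0.5.
    return -sum((xi - 0.5) ** 2 for xi in x)

def crossover(a, b):
    cut = random.randrange(1, N_PARAMS)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(x, rate=0.1):
    # Each parameter is resampled uniformly with probability `rate`.
    return [random.random() if random.random() < rate else xi for xi in x]

pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]               # keep the better half (elitism)
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]

best = max(pop, key=fitness)
print("best design:", [round(v, 3) for v in best])
```

In the paper's hybrid method, part of each design's evaluation would come from interactive expert judgment of qualities such as manufacturability rather than from code alone, which is what distinguishes it from a purely quantitative GA like this sketch.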

  20. Procedural Quantum Programming

    NASA Astrophysics Data System (ADS)

    Ömer, Bernhard

    2002-09-01

    While classical computing science has developed a variety of methods and programming languages around the concept of the universal computer, the typical description of quantum algorithms still uses a purely mathematical, non-constructive formalism which makes no difference between a hydrogen atom and a quantum computer. This paper investigates, how the concept of procedural programming languages, the most widely used classical formalism for describing and implementing algorithms, can be adopted to the field of quantum computing, and how non-classical features like the reversibility of unitary transformations, the non-observability of quantum states or the lack of copy and erase operations can be reflected semantically. It introduces the key concepts of procedural quantum programming (hybrid target architecture, operator hierarchy, quantum data types, memory management, etc.) and presents the experimental language QCL, which implements these principles.

  1. Computer Systems for Teaching Complex Concepts.

    ERIC Educational Resources Information Center

    Feurzeig, Wallace

    Four Programing systems--Mentor, Stringcomp, Simon, and Logo--were designed and implemented as integral parts of research into the various ways computers may be used for teaching problem-solving concepts and skills. Various instructional contexts, among them medicine, mathematics, physics, and basic problem-solving for elementary school children,…

  2. The Use of Force Sensors and a Computer System to Introduce the Concept of Inertia at a School

    ERIC Educational Resources Information Center

    Bogacz, Bogdan F.; Pedziwiatr, Antoni T.

    2014-01-01

    A classical experiment used to introduce the concept of body inertia, breaking of a thread below and above a hanging weight, is described mathematically and presented in a new way, using force sensors and a computer system.

  3. Beyond the Clock--Using the Computer to Teach the Abstract Concept of Time.

    ERIC Educational Resources Information Center

    Drysdale, Julie

    1993-01-01

    Discusses several projects to help teach and reinforce the concept of time, using the books "The Very Hungry Caterpillar" (by Eric Carle) and "Charlotte's Web (by E. B. White) as well as the computer software program "Timeliner" (by Tom Snyder). (SR)

  4. Use of Networked Collaborative Concept Mapping To Measure Team Processes and Team Outcomes.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.; Herl, Howard E.; Dennis, Robert A.

    The feasibility of using a computer-based networked collaborative concept mapping system to measure teamwork skills was studied. A concept map is a node-link-node representation of content, where the nodes represent concepts and links represent relationships between connected concepts. Teamwork processes were examined for a group concept mapping…
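
A node-link-node concept map is naturally represented as a set of (concept, relation, concept) propositions. The example propositions and the simple expert-overlap score below are illustrative assumptions, not the instrument used in the study.

```python
# A concept map as a set of node-link-node propositions: each tuple links two
# concept nodes through a labeled relationship.
student_map = {
    ("photosynthesis", "produces", "oxygen"),
    ("plants", "perform", "photosynthesis"),
    ("oxygen", "needed for", "respiration"),
}
expert_map = {
    ("photosynthesis", "produces", "oxygen"),
    ("plants", "perform", "photosynthesis"),
    ("photosynthesis", "requires", "sunlight"),
}

# One simple outcome measure: the fraction of expert propositions the team's
# collaboratively built map matched.
score = len(student_map & expert_map) / len(expert_map)
print(f"match score = {score:.2f}")
```

Scoring a team's shared map against an expert referent gives a team-outcome measure, while logging who contributed which propositions gives the team-process measures the study examines.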

  5. Liminal Spaces and Learning Computing

    ERIC Educational Resources Information Center

    McCartney, Robert; Boustedt, Jonas; Eckerdal, Anna; Mostrom, Jan Erik; Sanders, Kate; Thomas, Lynda; Zander, Carol

    2009-01-01

    "Threshold concepts" are concepts that, among other things, transform the way a student looks at a discipline. Although the term "threshold" might suggest that the transformation occurs at a specific point in time, an "aha" moment, it seems more common (at least in computing) that a longer time period is required.…

  6. Web-Compatible Graphics Visualization Framework for Online Instruction and Assessment of Hardware Concepts

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Chittamuru, Siva-Teja

    2016-01-01

    This paper explains the design of a graphics-based virtual environment for instructing computer hardware concepts to students, especially those at the beginner level. Photorealistic visualizations and simulations are designed and programmed with interactive features allowing students to practice, explore, and test themselves on computer hardware…

  7. The Concept of Nondeterminism: Its Development and Implications for Teaching

    ERIC Educational Resources Information Center

    Armoni, Michal; Ben-Ari, Mordechai

    2009-01-01

    Nondeterminism is a fundamental concept in computer science that appears in various contexts such as automata theory, algorithms and concurrent computation. We present a taxonomy of the different ways that nondeterminism can be defined and used; the categories of the taxonomy are domain, nature, implementation, consistency, execution and…

  8. Academic Information Management--A New Synthesis.

    ERIC Educational Resources Information Center

    Drummond, Marshall Edward

    The design concept and initial development phases of academic information management (AIM) are discussed. The AIM concept is an attempt to serve three segments of academic management with data and models to support decision making. AIM is concerned with management and evaluation of instructional computing in areas other than direct computing (data…

  9. Integrating Computer Concepts into Principles of Accounting.

    ERIC Educational Resources Information Center

    Beck, Henry J.; Parrish, Roy James, Jr.

    A package of instructional materials for an undergraduate principles of accounting course at Danville Community College was developed based upon the following assumptions: (1) the principles of accounting student does not need to be able to write computer programs; (2) computerized accounting concepts should be presented in this course; (3)…

  10. Virtual Bioinformatics Distance Learning Suite

    ERIC Educational Resources Information Center

    Tolvanen, Martti; Vihinen, Mauno

    2004-01-01

    Distance learning as a computer-aided concept allows students to take courses from anywhere at any time. In bioinformatics, computers are needed to collect, store, process, and analyze massive amounts of biological and biomedical data. We have applied the concept of distance learning in virtual bioinformatics to provide university course material…

  11. The geo-control system for station keeping and colocation of geostationary satellites

    NASA Technical Reports Server (NTRS)

    Montenbruck, O.; Eckstein, M. C.; Gonner, J.

    1993-01-01

    GeoControl is a compact but powerful and accurate software system for station keeping of single and colocated satellites, which has been developed at the German Space Operations Center. It includes four core modules for orbit determination (including maneuver estimation), maneuver planning, monitoring of proximities between colocated satellites, and interference and event prediction. A simple database containing state vector and maneuver information at selected epochs is maintained as a central interface between the modules. A menu driven shell utilizing form screens for data input serves as the central user interface. The software is written in Ada and FORTRAN and may be used on VAX workstations or mainframes under the VMS operating system.

  12. Design of cryogenic tanks for launch vehicles

    NASA Technical Reports Server (NTRS)

    Copper, Charles; Pilkey, Walter D.; Haviland, John K.

    1990-01-01

During the period since January 1990, work was concentrated on the problem of the buckling of the structure of an ALS (advanced launch systems) tank during the boost phase. The primary problem was to analyze a proposed hat stringer made by superplastic forming, and to compare it with an integrally stiffened stringer design. A secondary objective was to determine whether structural rings having the same section as the stringers will provide adequate support against overall buckling. All of the analytical work was carried out with the TESTBED program on the CONVEX computer, using PATRAN programs to create models. Analyses of skin/stringer combinations have shown that the proposed stringer design is an adequate substitute for the integrally stiffened stringer. Using a highly refined mesh to represent the corrugations in the vertical webs of the hat stringers, effective values were obtained for cross-sectional area, moment of inertia, centroid height, and torsional constant. Not only can these values be used for comparison with experimental values, but they can also be used for beams to replace the stringers and frames in analytical models of complete sections of tank. The same highly refined model was used to represent a section of skin reinforced by a stringer and a ring segment in the configuration of a cross. It was intended that this would provide a baseline buckling analysis representing a basic mode; however, the analysis proved to be beyond the scope of the CONVEX computer. One quarter of this model was analyzed, however, to provide information on buckling between the spot welds. Models of large sections of the tank structure were made, using beam elements to model the stringers and frames. In order to represent the stiffening effects of pressure, stresses and deflections under pressure should first be obtained, and then the buckling analysis should be made on the structure so deflected. 
So far, uncharacteristic deflections under pressure were obtained from the TESTBED program using two types of structural elements. Similar results were obtained using the ANSYS program on a mainframe computer, although two finite element programs on microcomputers have yielded realistic results.

  13. Algorithmic complexity of quantum capacity

    NASA Astrophysics Data System (ADS)

    Oskouei, Samad Khabbazi; Mancini, Stefano

    2018-04-01

We analyze the notion of quantum capacity from the perspective of algorithmic (descriptive) complexity. To this end, we resort to the concept of semi-computability in order to describe quantum states and quantum channel maps. We introduce algorithmic entropies (like algorithmic quantum coherent information) and derive relevant properties for them. Then we show that quantum capacity based on the semi-computability concept equals the entropy rate of algorithmic coherent information, which in turn equals the standard quantum capacity. Thanks to this, we finally prove that the quantum capacity, for a given semi-computable channel, is limit computable.

  14. Words and Concepts. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2007

    2007-01-01

    "Words and Concepts" is a computer software program that focuses on building oral language skills related to vocabulary, comprehension, word relationships, and other concepts in six units--vocabulary, categorization, word identification by function, word association, concept of same, and concept of different. It can be used by adults and…

  15. The Integrated Library System Design Concepts for a Complete Serials Control Subsystem.

    DTIC Science & Technology

    1984-08-20

The Integrated Library System: Design Concepts for a Complete Serials Control Subsystem. Presented to: The Pentagon Library, The Pentagon, Washington, DC 20310. Prepared by: Online Computer Systems, Inc., Germantown, MD. Contract MDA903-82-C-0535.

  16. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 2: User's manual and program listing

    NASA Technical Reports Server (NTRS)

    Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. 
In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no modifications are needed in the grid generation part of the program. The theory and method used in GRID2D/3D is described.
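
The transfinite-interpolation idea at the core of GRID2D/3D can be sketched in two dimensions. The boundary curves below are invented, and this is a bare linear Coons-style blend rather than the program's full stretching-function and spline machinery.

```python
import math

# 2-D transfinite interpolation sketch: interior grid points are blended from
# four boundary curves, here an invented domain with one bumped top wall.
def bottom(s): return (s, 0.0)
def top(s):    return (s, 1.0 + 0.2 * math.sin(math.pi * s))
def left(t):   return (0.0, t)
def right(t):  return (1.0, t)

def tfi(s, t):
    """Linear blend of the four boundaries minus the corner correction."""
    xb, yb = bottom(s); xt, yt = top(s)
    xl, yl = left(t);   xr, yr = right(t)
    cx = ((1 - s) * (1 - t) * bottom(0)[0] + s * (1 - t) * bottom(1)[0]
          + (1 - s) * t * top(0)[0] + s * t * top(1)[0])
    cy = ((1 - s) * (1 - t) * bottom(0)[1] + s * (1 - t) * bottom(1)[1]
          + (1 - s) * t * top(0)[1] + s * t * top(1)[1])
    x = (1 - t) * xb + t * xt + (1 - s) * xl + s * xr - cx
    y = (1 - t) * yb + t * yt + (1 - s) * yl + s * yr - cy
    return x, y

# An 11 x 11 single grid over the parameter square; grid[j][i] = tfi(i/10, j/10).
grid = [[tfi(i / 10, j / 10) for i in range(11)] for j in range(11)]
print("center point:", grid[5][5])
```

By construction, grid lines terminate exactly on the boundary curves; the stretching functions GRID2D/3D adds would simply replace the uniform `i / 10`, `j / 10` parameter spacing to cluster points where resolution is needed.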

  17. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 1: Theory and method

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order, except at interfaces where different single grid systems meet; there the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coons interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer.
In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no modifications are needed in the grid generation part of the program. This technical memorandum describes the theory and method used in GRID2D/3D.
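The transfinite-interpolation step at the heart of GRID2D/3D's single-grid generation can be illustrated in two dimensions. The sketch below is a generic Coons-style transfinite interpolation, not the program's actual FORTRAN 77 implementation; the function name and array layout are illustrative, and the four boundary curves are assumed to share corner points. Grid clustering via stretching functions would be applied to the parameter distributions before calling it.

```python
import numpy as np

def transfinite_grid(bottom, top, left, right):
    """Generate a structured 2-D grid inside four boundary curves by
    transfinite (Coons-style) interpolation.

    bottom, top : arrays of shape (ni, 2) -- points along eta = 0 and eta = 1
    left, right : arrays of shape (nj, 2) -- points along xi = 0 and xi = 1
    The curves must agree at the four corners. Returns an (ni, nj, 2) array
    of grid-point coordinates.
    """
    ni, nj = len(bottom), len(left)
    xi = np.linspace(0.0, 1.0, ni)[:, None, None]    # parameter along bottom/top
    eta = np.linspace(0.0, 1.0, nj)[None, :, None]   # parameter along left/right
    # Linear blend of opposite boundary pairs, minus the doubly counted
    # bilinear corner contribution (the classic Coons-patch formula).
    grid = ((1 - eta) * bottom[:, None, :] + eta * top[:, None, :]
            + (1 - xi) * left[None, :, :] + xi * right[None, :, :]
            - ((1 - xi) * (1 - eta) * bottom[0]
               + xi * (1 - eta) * bottom[-1]
               + (1 - xi) * eta * top[0]
               + xi * eta * top[-1]))
    return grid
```

On a unit square with straight boundaries this reduces to the identity mapping; with curved boundary data it produces boundary-conforming interior grid lines, which is the property the abstract's algebraic method exploits.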

  18. Computational Analyses of Offset Stream Nozzles for Noise Reduction

    NASA Technical Reports Server (NTRS)

    Dippold, Vance, III; Foster, Lancert; Wiese, Michael

    2007-01-01

    The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation for tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.

  19. Solar-System Tests of Gravitational Theories

    NASA Technical Reports Server (NTRS)

    Shapiro, Irwin

    1997-01-01

    We are engaged in testing gravitational theory by means of observations of objects in the solar system. These tests include an examination of the Principle Of Equivalence (POE), the Shapiro delay, the advances of planetary perihelia, the possibility of a secular variation (G-dot) of the "gravitational constant" G, and the rate of the de Sitter (geodetic) precession of the Earth-Moon system. These results are consistent with our preliminary results focusing on the contribution of Lunar Laser Ranging (LLR), which were presented at the seventh Marcel Grossmann meeting on general relativity. The largest improvement over previous results comes in the uncertainty for (eta): a factor of five better than our previous value. This improvement reflects the increasing strength of the LLR data. A similar analysis presented at the same meeting by a group at the Jet Propulsion Laboratory gave a similar result for (eta). Our value for (beta) represents our first such result determined simultaneously with the solar quadrupole moment from the dynamical data set. These results are being prepared for publication. We have shown how positions determined from different planetary ephemerides can be compared and how the combination of VLBI and pulse timing information can yield a direct tie between planetary and radio frames. We have continued to include new data in our analysis as they became available. Finally, we have made improvements in our analysis software (PEP) and ported it from its former home on a "mainframe" computer to a network of modern workstations.

  20. On the Value of Computer-aided Instruction: Thoughts after Teaching Sales Writing in a Computer Classroom.

    ERIC Educational Resources Information Center

    Hagge, John

    1986-01-01

    Focuses on problems encountered with computer-aided writing instruction. Discusses conflicts caused by the computer classroom concept, some general paradoxes and ethical implications of computer-aided instruction. (EL)

  1. Effect of Computer-Assisted Instruction on Secondary School Students' Achievement in Ecological Concepts

    ERIC Educational Resources Information Center

    Nkemdilim, Egbunonu Roseline; Okeke, Sam O. C.

    2014-01-01

    This study investigated the effects of computer-assisted instruction (CAI) on students' achievement in ecological concepts. Quasi-experimental design, specifically the pre-test post test non-equivalent control group design was adopted. The sample consisted of sixty-six (66) senior secondary year two (SS II) biology students, drawn from two…

  2. Playing Violent Video and Computer Games and Adolescent Self-Concept.

    ERIC Educational Resources Information Center

    Funk, Jeanne B.; Buchman, Debra D.

    1996-01-01

    Documents current adolescent electronic game-playing habits, exploring associations among preference for violent games, frequency and location of play, and self-concept. Identifies marked gender differences in game-playing habits and in scores on a self-perception profile. Finds that for girls, more time playing video or computer games is…

  3. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…

  4. The ChemViz Project: Using a Supercomputer To Illustrate Abstract Concepts in Chemistry.

    ERIC Educational Resources Information Center

    Beckwith, E. Kenneth; Nelson, Christopher

    1998-01-01

    Describes the Chemistry Visualization (ChemViz) Project, a Web venture maintained by the University of Illinois National Center for Supercomputing Applications (NCSA) that enables high school students to use computational chemistry as a technique for understanding abstract concepts. Discusses the evolution of computational chemistry and provides a…

  5. Computer-Based Exercises To Supplement the Teaching of Stereochemical Aspects of Drug Action.

    ERIC Educational Resources Information Center

    Harrold, Marc W.

    1995-01-01

    At the Duquesne University (PA) school of pharmacy, five self-paced computer exercises using a molecular modeling program have been implemented to teach stereochemical concepts. The approach, designed for small-group learning, has been well received and found effective in enhancing students' understanding of the concepts. (Author/MSE)

  6. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  7. When Practice Doesn't Make Perfect: Effects of Task Goals on Learning Computing Concepts

    ERIC Educational Resources Information Center

    Miller, Craig S.; Settle, Amber

    2011-01-01

    Specifying file references for hypertext links is an elementary competence that nevertheless draws upon core computational thinking concepts such as tree traversal and the distinction between relative and absolute references. In this article we explore the learning effects of different instructional strategies in the context of an introductory…

  8. Designing for Interaction: Six Steps to Designing Computer-Supported Group-Based Learning

    ERIC Educational Resources Information Center

    Strijbos, J. W.; Martens, R. L.; Jochems, W. M. G.

    2004-01-01

    At present, the design of computer-supported group-based learning (CSGBL) is often based on subjective decisions regarding tasks, pedagogy and technology, or concepts such as "cooperative learning" and "collaborative learning." Critical review reveals these concepts as insufficiently substantial to serve as a basis for CSGBL design. Furthermore,…

  9. An Innovative Improvement of Engineering Learning System Using Computational Fluid Dynamics Concept

    ERIC Educational Resources Information Center

    Hung, T. C.; Wang, S. K.; Tai, S. W.; Hung, C. T.

    2007-01-01

    An innovative concept of an electronic learning system has been established in an attempt to achieve a technology that provides engineering students with an instructive and affordable framework for learning engineering-related courses. This system utilizes an existing Computational Fluid Dynamics (CFD) package, Active Server Pages programming,…

  10. Learning Computer Science Concepts with Scratch

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Armoni, Michal; Ben-Ari, Mordechai

    2013-01-01

    Scratch is a visual programming environment that is widely used by young people. We investigated if Scratch can be used to teach concepts of computer science (CS). We developed learning materials for middle-school students that were designed according to the constructionist philosophy of Scratch and evaluated them in a few schools during two…

  11. Interactive systems design and synthesis of future spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.

    1984-01-01

    An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced spacecraft (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze and conduct parametric studies and modify Earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.

  12. Improving learning with science and social studies text using computer-based concept maps for students with disabilities.

    PubMed

    Ciullo, Stephen; Falcomata, Terry S; Pfannenstiel, Kathleen; Billingsley, Glenna

    2015-01-01

    Concept maps have been used to help students with learning disabilities (LD) improve literacy skills and content learning, predominantly in secondary school. However, despite increased access to classroom technology, no previous studies have examined the efficacy of computer-based concept maps to improve learning from informational text for students with LD in elementary school. In this study, we used a concurrent delayed multiple probe design to evaluate the interactive use of computer-based concept maps on content acquisition with science and social studies texts for Hispanic students with LD in Grades 4 and 5. Findings from this study suggest that students improved content knowledge during intervention relative to a traditional instruction baseline condition. Learning outcomes and social validity information are considered to inform recommendations for future research and the feasibility of classroom implementation. © The Author(s) 2014.

  13. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  14. First-in-Man Computed Tomography-Guided Percutaneous Revascularization of Coronary Chronic Total Occlusion Using a Wearable Computer: Proof of Concept.

    PubMed

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Szpak, Marcin; Staruch, Adam D; Kepka, Cezary; Witkowski, Adam

    2016-06-01

    We report a case of successful computed tomography-guided percutaneous revascularization of a chronically occluded right coronary artery using a wearable, hands-free computer with a head-mounted display worn by interventional cardiologists in the catheterization laboratory. The projection of 3-dimensional computed tomographic reconstructions onto the screen of virtual reality glass allowed the operators to clearly visualize the distal coronary vessel, and verify the direction of the guide wire advancement relative to the course of the occluded vessel segment. This case provides proof of concept that wearable computers can improve operator comfort and procedure efficiency in interventional cardiology. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  15. Computational communities: African-American cultural capital in computer science education

    NASA Astrophysics Data System (ADS)

    Lachney, Michael

    2017-10-01

    Enrolling the cultural capital of underrepresented communities in PK-12 technology and curriculum design has been a primary strategy for broadening the participation of students of color in U.S. computer science (CS) fields. This article examines two ways that African-American cultural capital and computing can be bridged in CS education. The first is community representation, using cultural capital to highlight students' social identities and networks through computational thinking. The second, computational integration, locates computation in cultural capital itself. I survey two risks - the appearance of shallow computing and the reproduction of assimilationist logics - that may arise when constructing one bridge without the other. To avoid these risks, I introduce the concept of computational communities by exploring areas in CS education that employ both strategies. This concept is then grounded in qualitative data from an after school program that connected CS to African-American cosmetology.

  16. Multiple Instance Fuzzy Inference

    DTIC Science & Technology

    2015-12-02

    very small probabilities. To compute Pr(t | B_i) for a given bag B_i, a conjunction measure of all its instances B_ij, j = 1, ..., M, is computed using ... the noisy-or operator

        Pr(t | B_i) = 1 − ∏_{1≤j≤M} (1 − Pr(B_ij ∈ t)),        (2.5)

    where Pr(B_ij ∈ t) is computed from a Gaussian distribution centred at the concept ... X_nk to target concept C_i, and it is computed using

        Pr(X_nk ∈ C_i) = exp(−∑_{j=1}^{D} s_ij (x_nkj − c_ij)²)        (2.9)

    In (4.5), s_ij is a scaling parameter that weights the
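The excerpt's two formulas, the Gaussian instance membership (Eq. 2.9) and the noisy-or bag probability (Eq. 2.5), compose directly. The sketch below mirrors those equations; the function names and plain-Python data layout are illustrative, not the report's implementation:

```python
import math

def instance_prob(x, concept, scales):
    """Pr(x in concept): Gaussian-style similarity of one instance to a
    target concept point, with per-dimension scaling weights (Eq. 2.9)."""
    return math.exp(-sum(s * (xi - ci) ** 2
                         for xi, ci, s in zip(x, concept, scales)))

def bag_prob(bag, concept, scales):
    """Noisy-or combination over all instances in a bag (Eq. 2.5): the bag
    is labeled positive if at least one instance belongs to the concept."""
    prod = 1.0
    for x in bag:
        prod *= 1.0 - instance_prob(x, concept, scales)
    return 1.0 - prod
```

A bag containing one instance exactly at the concept point gets probability 1 regardless of its other instances, which is the intended multiple-instance semantics.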

  17. Re-engineering Nascom's network management architecture

    NASA Technical Reports Server (NTRS)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 kbps) were developed following existing standards; but there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist, and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as X Windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network.
The MSS, CAP, MACS, and Ecom projects have indicated the potential value of commercial-off-the-shelf (COTS) and standards through reduced cost and high quality. The FARM will allow the application of the lessons learned from these projects to all future Nascom systems.

  18. CET89 - CHEMICAL EQUILIBRIUM WITH TRANSPORT PROPERTIES, 1989

    NASA Technical Reports Server (NTRS)

    Mcbride, B.

    1994-01-01

    Scientists and engineers need chemical equilibrium composition data to calculate the theoretical thermodynamic properties of a chemical system. This information is essential in the design and analysis of equipment such as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical processing equipment. The substantial amount of numerical computation required to obtain equilibrium compositions and transport properties for complex chemical systems led scientists at NASA's Lewis Research Center to develop CET89, a program designed to calculate the thermodynamic and transport properties of these systems. CET89 is a general program which will calculate chemical equilibrium compositions and mixture properties for any chemical system with available thermodynamic data. Generally, mixtures may include condensed and gaseous products. CET89 performs the following operations: it 1) obtains chemical equilibrium compositions for assigned thermodynamic states, 2) calculates dilute-gas transport properties of complex chemical mixtures, 3) obtains Chapman-Jouguet detonation properties for gaseous species, 4) calculates incident and reflected shock properties in terms of assigned velocities, and 5) calculates theoretical rocket performance for both equilibrium and frozen compositions during expansion. The rocket performance function allows the option of assuming either a finite area or an infinite area combustor. CET89 accommodates problems involving up to 24 reactants, 20 elements, and 600 products (400 of which may be condensed). The program includes a library of thermodynamic and transport properties in the form of least squares coefficients for possible reaction products. It includes thermodynamic data for over 1300 gaseous and condensed species and transport data for 151 gases. The subroutines UTHERM and UTRAN convert thermodynamic and transport data to unformatted form for faster processing. 
The program conforms to the FORTRAN 77 standard, except for some input in NAMELIST format. It requires about 423 KB of memory, and is designed to be used on mainframe, workstation, and minicomputers. Due to its memory requirements, this program does not readily lend itself to implementation on MS-DOS based machines.

  19. Content Structure as a Design Strategy Variable in Concept Acquisition.

    ERIC Educational Resources Information Center

    Tennyson, Robert D.; Tennyson, Carol L.

    Three methods of sequencing coordinate concepts (simultaneous, collective, and successive) were investigated with a Bayesian, computer-based, adaptive control system. The data analysis showed that when coordinate concepts are taught simultaneously (contextually similar concepts presented at the same time), student performance is superior to either…

  20. The Information Science Experiment System - The computer for science experiments in space

    NASA Technical Reports Server (NTRS)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  1. Improving Students’ Science Process Skills through Simple Computer Simulations on Linear Motion Conceptions

    NASA Astrophysics Data System (ADS)

    Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.

    2017-02-01

    The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing simple computer simulations. To simplify the learning process, the concept can be divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion and 3) uniformly accelerated motion. This research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents were 23 seventh-grade students in one of the junior high schools in Bandung City. The improvement in students' science process skills is examined based on normalized gain analysis of the pretest and posttest scores for all sub-concepts. The results show that students' science process skills improved by 47% (moderate) on observation skill, 43% (moderate) on summarizing skill, 70% (high) on prediction skill, 44% (moderate) on communication skill and 49% (moderate) on classification skill. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills at a moderate level.
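The normalized gain analysis used in abstracts like the one above is conventionally Hake's ⟨g⟩ = (post − pre) / (max − pre), with "low", "moderate", and "high" bands split at 0.3 and 0.7, which matches the abstract's labeling of 47% as moderate and 70% as high. A minimal sketch (function names are illustrative):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre):
    the fraction of the available improvement actually achieved."""
    return (post - pre) / (max_score - pre)

def gain_category(g):
    """Conventional interpretation bands for <g>."""
    if g >= 0.7:
        return "high"
    if g >= 0.3:
        return "moderate"
    return "low"
```

For example, moving from 40 to 70 out of 100 gives ⟨g⟩ = 0.5, a moderate gain.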

  2. On the importance of a rich embodiment in the grounding of concepts: perspectives from embodied cognitive science and computational linguistics.

    PubMed

    Thill, Serge; Padó, Sebastian; Ziemke, Tom

    2014-07-01

    The recent trend in cognitive robotics experiments on language learning, symbol grounding, and related issues necessarily entails a reduction of sensorimotor aspects from those provided by a human body to those that can be realized in machines, limiting robotic models of symbol grounding in this respect. Here, we argue that there is a need for modeling work in this domain to explicitly take into account the richer human embodiment even for concrete concepts that prima facie relate merely to simple actions, and illustrate this using distributional methods from computational linguistics which allow us to investigate grounding of concepts based on their actual usage. We also argue that these techniques have applications in theories and models of grounding, particularly in machine implementations thereof. Similarly, considering the grounding of concepts in human terms may be of benefit to future work in computational linguistics, in particular in going beyond "grounding" concepts in the textual modality alone. Overall, we highlight the overall potential for a mutually beneficial relationship between the two fields. Copyright © 2014 Cognitive Science Society, Inc.

  3. Design and analysis of advanced flight planning concepts

    NASA Technical Reports Server (NTRS)

    Sorensen, John A.

    1987-01-01

    The objectives of this continuing effort are to develop and evaluate new algorithms and advanced concepts for flight management and flight planning. This includes the minimization of fuel or direct operating costs, the integration of the airborne flight management and ground-based flight planning processes, and the enhancement of future traffic management systems design. Flight management (FMS) concepts are for on-board profile computation and steering of transport aircraft in the vertical plane between a city pair and along a given horizontal path. Flight planning (FPS) concepts are for the pre-flight ground based computation of the three-dimensional reference trajectory that connects the city pair and specifies the horizontal path, fuel load, and weather profiles for initializing the FMS. As part of these objectives, a new computer program called EFPLAN has been developed and utilized to study advanced flight planning concepts. EFPLAN represents an experimental version of an FPS. It has been developed to generate reference flight plans compatible as input to an FMS and to provide various options for flight planning research. This report describes EFPLAN and the associated research conducted in its development.

  4. Comparing Computer-Supported Dynamic Modeling and "Paper & Pencil" Concept Mapping Technique in Students' Collaborative Activity

    ERIC Educational Resources Information Center

    Komis, Vassilis; Ergazaki, Marida; Zogza, Vassiliki

    2007-01-01

    This study aims at highlighting the collaborative activity of two high school students (age 14) in the cases of modeling the complex biological process of plant growth with two different tools: the "paper & pencil" concept mapping technique and the computer-supported educational environment "ModelsCreator". Students' shared activity in both cases…

  5. The Effects of Computer-Supported Inquiry-Based Learning Methods and Peer Interaction on Learning Stellar Parallax

    ERIC Educational Resources Information Center

    Ruzhitskaya, Lanika

    2011-01-01

    The presented research study investigated the effects of computer-supported inquiry-based learning and peer interaction methods on effectiveness of learning a scientific concept. The stellar parallax concept was selected as a basic, and yet important in astronomy, scientific construct, which is based on a straightforward relationship of several…
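The "straightforward relationship" underlying the stellar parallax concept is the defining relation of the parsec: a star's distance in parsecs is the reciprocal of its annual parallax in arcseconds. A minimal sketch (function name is illustrative):

```python
def parallax_distance_pc(p_arcsec):
    """Distance in parsecs from annual parallax: d [pc] = 1 / p [arcsec].
    This is the defining relation of the parsec (1 pc is the distance at
    which a 1 AU baseline subtends 1 arcsecond)."""
    if p_arcsec <= 0:
        raise ValueError("parallax must be positive")
    return 1.0 / p_arcsec
```

So a measured parallax of 0.5 arcsec corresponds to a distance of 2 pc.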

  6. Analyzing the Use of Concept Maps in Computer Science: A Systematic Mapping Study

    ERIC Educational Resources Information Center

    dos Santos, Vinicius; de Souza, Érica F.; Felizardo, Katia R.; Vijaykumar, Nandamudi L.

    2017-01-01

    Context: Concept Maps (CMs) enable the creation of a schematic representation of domain knowledge. For this reason, CMs have been applied in different research areas, including Computer Science. Objective: the objective of this paper is to present the results of a systematic mapping study conducted to collect and evaluate existing research on…

  7. Teachers and Students' Conceptions of Computer-Based Models in the Context of High School Chemistry: Elicitations at the Pre-Intervention Stage

    ERIC Educational Resources Information Center

    Waight, Noemi; Gillmeister, Kristina

    2014-01-01

    This study examined teachers' and students' initial conceptions of computer-based models--Flash and NetLogo models--and documented how teachers and students reconciled notions of multiple representations featuring macroscopic, submicroscopic and symbolic representations prior to actual intervention in eight high school chemistry…

  8. The Concept of Energy in Psychological Theory. Cognitive Science Program, Technical Report No. 86-2.

    ERIC Educational Resources Information Center

    Posner, Michael I.; Rothbart, Mary Klevjord

    This paper describes a basic framework for integration of computational and energetic concepts in psychological theory. The framework is adapted from a general effort to understand the neural systems underlying cognition. The element of the cognitive system that provides the best basis for attempting to relate energetic and computational ideas is…

  9. Effects of Geographic Information System on the Learning of Environmental Education Concepts in Basic Computer-Mediated Classrooms in Nigeria

    ERIC Educational Resources Information Center

    Adeleke, Ayobami Gideon

    2017-01-01

    This research paper specifically examined the impact of Geographic Information System (GIS) integration in a learning method and on the performance and retention of Environmental Education (EE) concepts in basic social studies. Non-equivalent experimental research design was employed. 126 pupils in four intact, computer-mediated classrooms were…

  10. The Virtual Solar System Project: Developing Conceptual Understanding of Astronomical Concepts through Building Three-Dimensional Computational Models.

    ERIC Educational Resources Information Center

    Keating, Thomas; Barnett, Michael; Barab, Sasha A.; Hay, Kenneth E.

    2002-01-01

    Describes the Virtual Solar System (VSS) course which is one of the first attempts to integrate three-dimensional (3-D) computer modeling as a central component of introductory undergraduate education. Assesses changes in student understanding of astronomy concepts as a result of participating in an experimental introductory astronomy course in…

  11. Towards a computational- and algorithmic-level account of concept blending using analogies and amalgams

    NASA Astrophysics Data System (ADS)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Plaza, Enric

    2017-10-01

    Concept blending - a cognitive process which allows for the combination of certain elements (and their relations) from originally distinct conceptual spaces into a new unified space combining these previously separate elements, and enables reasoning and inference over the combination - is taken as a key element of creative thought and combinatorial creativity. In this article, we summarise our work towards the development of a computational-level and algorithmic-level account of concept blending, combining approaches from computational analogy-making and case-based reasoning (CBR). We present the theoretical background, as well as an algorithmic proposal integrating higher-order anti-unification matching and generalisation from analogy with amalgams from CBR. The feasibility of the approach is then exemplified in two case studies.
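The anti-unification step mentioned above can be illustrated at first order (the article itself integrates higher-order anti-unification, which is more general). The following toy least-general-generalisation over tuple-encoded terms is an assumption-laden sketch, not the authors' system; term encoding, variable naming, and the function name are all illustrative:

```python
def anti_unify(t1, t2, subst=None):
    """Least general generalisation (first-order anti-unification) of two
    terms. A term is an atom (str/int) or a tuple (functor, arg1, ...).
    Each distinct mismatching pair of subterms gets its own variable, and
    the same pair always maps back to the same variable."""
    if subst is None:
        subst = {}
    if t1 == t2:
        return t1                        # identical subterms stay concrete
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # Same functor and arity: generalise argument-wise.
        return (t1[0],) + tuple(anti_unify(a, b, subst)
                                for a, b in zip(t1[1:], t2[1:]))
    key = (t1, t2)
    if key not in subst:
        subst[key] = f"X{len(subst)}"    # fresh variable for a new mismatch
    return subst[key]
```

For example, f(a, a) and f(b, b) generalise to f(X0, X0): the repeated mismatch reuses one variable, preserving the shared structure that blending then builds on.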

  12. STS-41 mission charts, computer-generated and artist concept drawings, photos

    NASA Technical Reports Server (NTRS)

    1990-01-01

    STS-41 related charts, computer-generated and artist concept drawings, and photos of the Ulysses spacecraft and mission flight path provided by the European Space Agency (ESA). Charts show the Ulysses mission flight path and encounter with Jupiter (45980, 45981) and sun (illustrating cosmic dust, gamma ray burst, magnetic field, x-rays, solar energetic particles, visible corona, interstellar gas, plasma wave, cosmic rays, solar radio noise, and solar wind) (45988). Computer-generated view shows the Ulysses spacecraft (45983). Artist concept illustrates the Ulysses spacecraft deploying from the space shuttle payload bay (PLB) with the inertial upper stage (IUS) and payload assist module (PAM-S) visible (45984). The Ulysses spacecraft is also shown undergoing preflight testing in the manufacturing facility (45985, 45986, 45987).

  13. Extension of transonic flow computational concepts in the analysis of cavitated bearings

    NASA Technical Reports Server (NTRS)

    Vijayaraghavan, D.; Keith, T. G., Jr.; Brewe, D. E.

    1990-01-01

    An analogy between the mathematical modeling of transonic potential flow and the flow in a cavitating bearing is described. Based on the similarities, characteristics of the cavitated region and jump conditions across the film reformation and rupture fronts are developed using the method of weak solutions. The mathematical analogy is extended by utilizing a few computational concepts of transonic flow to numerically model the cavitating bearing. Methods of shock fitting and shock capturing are discussed. Various procedures used in transonic flow computations are adapted to bearing cavitation applications, for example, type differencing, grid transformation, an approximate factorization technique, and Newton's iteration method. These concepts have proved to be successful and have vastly improved the efficiency of numerical modeling of cavitated bearings.
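    Of the borrowed concepts, type differencing is the easiest to sketch: the finite-difference stencil is switched node by node according to the local character of the equation, central where the full-film region behaves elliptically and upwind where the cavitated region behaves hyperbolically. The sketch below, on a periodic 1-D grid with a binary switch array, is our own illustration of the idea, not the paper's bearing algorithm:

```python
import numpy as np

def switched_derivative(u, g, dx):
    """Approximate du/dx on a periodic 1-D grid: second-order central
    differencing where g == 1 (full-film, 'elliptic' nodes) and
    first-order backward (upwind) differencing where g == 0
    (cavitated, 'hyperbolic' nodes)."""
    central = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    upwind = (u - np.roll(u, 1)) / dx
    # Select the stencil per node according to the switch function.
    return np.where(g == 1, central, upwind)
```

    In an Elrod-style cavitation scheme the switch function itself is part of the solution, flipping as the film ruptures and reforms; here it is supplied as input purely to show the stencil selection.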

  14. A Quantitative Empirical Analysis of the Abstract/Concrete Distinction

    ERIC Educational Resources Information Center

    Hill, Felix; Korhonen, Anna; Bentz, Christian

    2014-01-01

    This study presents original evidence that abstract and concrete concepts are organized and represented differently in the mind, based on analyses of thousands of concepts in publicly available data sets and computational resources. First, we show that abstract and concrete concepts have differing patterns of association with other concepts.…

  15. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  16. Technical report: precisely fitting bars on implants in five steps-a CAD/CAM concept for the edentulous mandible.

    PubMed

    Beuer, Florian; Schweiger, Josef; Huber, Martin; Engels, Jörg; Stimmelmayr, Michael

    2014-06-01

    Various treatment concepts have been presented for the edentulous mandible. Manufacturing tension-free and precisely fitting bars on dental implants was previously a great challenge in prosthetic dentistry and required great effort. Modern computer aided design/computer aided manufacturing technology in combination with some clinical modifications of the established workflow enables the clinician to achieve precise results in a very efficient way. The innovative five-step concept is presented in a clinical case. © 2014 by the American College of Prosthodontists.

  17. Advanced ETC/LSS computerized analytical models, CO2 concentration. Volume 1: Summary document

    NASA Technical Reports Server (NTRS)

    Taylor, B. N.; Loscutoff, A. V.

    1972-01-01

    Computer simulations have been prepared for the concepts of CO2 concentration which have the potential for maintaining a CO2 partial pressure of 3.0 mmHg, or less, in a spacecraft environment. The simulations were performed using the G-189A Generalized Environmental Control computer program. In preparing the simulations, new subroutines to model the principal functional components for each concept were prepared and integrated into the existing program. Sample problems were run to demonstrate the methods of simulation and performance characteristics of the individual concepts. Comparison runs for each concept can be made for parametric values of cabin pressure, crew size, cabin air dry and wet bulb temperatures, and mission duration.

  18. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
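    The rank-ordering calculation described above (sort concepts by benefit-to-cost ratio, then accumulate the ratio down the list) is simple to sketch. The data layout below is a hypothetical illustration, not the format used by the NASA program:

```python
def rank_concepts(concepts):
    """Rank system concepts by benefit-to-cost ratio (descending) and
    compute the cumulative benefit-to-cost ratio in that order.

    `concepts` is a list of (name, benefit, cost) tuples; the result is
    a list of (name, ratio, cumulative_ratio) tuples in preferred order
    of implementation.
    """
    ordered = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
    total_benefit = 0.0
    total_cost = 0.0
    result = []
    for name, benefit, cost in ordered:
        total_benefit += benefit
        total_cost += cost
        result.append((name, benefit / cost, total_benefit / total_cost))
    return result
```

    Running it on three made-up concepts with ratios 5, 2, and 1 returns them in that order, with the cumulative ratio after each addition showing how overall payoff erodes as lower-ratio concepts are appended.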

  19. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers for use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It also has been said that there needs to be more research in determining the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. This study also assessed the differential effect on students' understanding of the volume displacement concept using sex of the students as another independent variable. In addition, it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer simulated experiences were as effective as hands-on laboratory experiences, and that males, having had hands-on laboratory experiences, performed better on the posttest than females having had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in the learning of the displacement concept. This study also showed that there were no significant differences in the retention levels when the retention scores of the computer simulation groups were compared to those that had the hands-on laboratory experiences. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.
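    The group comparisons reported above rest on analysis of variance; the F statistic for the one-way case can be computed in a few lines of plain Python (the scores in the usage note are invented for illustration only):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for a list of groups of scores.

    F = (between-group mean square) / (within-group mean square),
    with k - 1 and n - k degrees of freedom for k groups and n scores.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    F is 0 when all group means coincide and grows as between-group variance outweighs within-group variance; the factorial (treatment by sex) analysis in the study generalizes the same idea to two factors.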

  20. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    NASA Astrophysics Data System (ADS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  1. Tell it like it is.

    PubMed

    Lee, S L

    2000-05-01

    Nurses, therapists and case managers were spending too much time each week on the phone waiting to read patient reports to live transcriptionists who would then type the reports for storage in VNSNY's clinical management mainframe database. A speech recognition system helped solve the problem by providing the staff 24-hour access to an automated transcription service any day of the week. Nurses and case managers no longer wait in long queues to transmit patient reports or to retrieve information from the database. Everything is done automatically within minutes. VNSNY saved both time and money by updating its transcription strategy. Now nurses can spend more time with patients and less time on the phone transcribing notes. It also means fewer staff members are needed on weekends to do manual transcribing.

  2. Japanese experiment module data management and communication system

    NASA Astrophysics Data System (ADS)

    Iizuka, Isao; Yamamoto, Harumitsu; Harada, Minoru; Eguchi, Iwao; Takahashi, Masami

    The data management and communications system (DMCS) for the Japanese experiment module (JEM) being developed for the Space Station is described. Data generated by JEM experiments will be transmitted via TDRS (primary link) to the NASDA Operation Control Center. The DMCS will provide data processing, test and graphics handling, schedule planning support, and data display, and will facilitate management of subsystems, payloads, emergency operations, status, and diagnostics/health checks. The ground segment includes a mainframe, mass storage, a workstation, and a LAN, with the capability of receiving and manipulating data from the JEM, the Space Station, and the payload. Audio and alert functions are also included. The DMCS will be connected to the interior of the module with through-bulkhead optical fibers.

  3. Setting the Scope of Concept Inventories for Introductory Computing Subjects

    ERIC Educational Resources Information Center

    Goldman, Ken; Gross, Paul; Heeren, Cinda; Herman, Geoffrey L.; Kaczmarczyk, Lisa; Loui, Michael C.; Zilles, Craig

    2010-01-01

    A concept inventory is a standardized assessment tool intended to evaluate a student's understanding of the core concepts of a topic. In order to create a concept inventory it is necessary to accurately identify these core concepts. A Delphi process is a structured multi-step process that uses a group of experts to achieve a consensus opinion. We…

  4. Computer Analogies: Teaching Molecular Biology and Ecology.

    ERIC Educational Resources Information Center

    Rice, Stanley; McArthur, John

    2002-01-01

    Suggests that computer science analogies can aid the understanding of gene expression, including the storage of genetic information on chromosomes. Presents a matrix of biology and computer science concepts. (DDR)

  5. Tutorial: Computer architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.D.; Milutinovic, V.M.; Siegel, H.J.

    1986-01-01

    This book presents the state-of-the-art in advanced computer architecture. It deals with the concepts underlying current architectures and covers approaches and techniques being used in the design of advanced computer systems.

  6. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  7. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    ERIC Educational Resources Information Center

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  8. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    ERIC Educational Resources Information Center

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  9. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    ERIC Educational Resources Information Center

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  10. Effects of a Structured Resource-Based Web Issue-Quest Approach on Students' Learning Performances in Computer Programming Courses

    ERIC Educational Resources Information Center

    Hsu, Ting-Chia; Hwang, Gwo-Jen

    2017-01-01

    Programming concepts are important and challenging to novices who are beginning to study computer programming skills. In addition to the textbook content, students usually learn the concepts of programming from the web; however, it could be difficult for novice learners to effectively derive helpful information from such non-structured open…

  11. Student Conceptions about the DNA Structure within a Hierarchical Organizational Level: Improvement by Experiment- and Computer-Based Outreach Learning

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Bogner, Franz X.

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both, to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…

  12. Self-Concept of Computer and Math Ability: Gender Implications across Time and within ICT Studies

    ERIC Educational Resources Information Center

    Sainz, Milagros; Eccles, Jacquelynne

    2012-01-01

    The scarcity of women in ICT-related studies has been systematically reported by the scientific community for many years. This paper has three goals: to analyze gender differences in self-concept of computer and math abilities along with math performance in two consecutive academic years; to study the ontogeny of gender differences in self-concept…

  13. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomy Concepts: A Qualitative Analysis. Research Report

    ERIC Educational Resources Information Center

    Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas

    2004-01-01

    In this study, we explore an alternate mode for teaching and learning the dynamic, three-dimensional (3D) relationships that are central to understanding astronomical concepts. To this end, we implemented an innovative undergraduate course in which we used inexpensive computer modeling tools. As the second of a two-paper series, this report…

  14. Curriculum-Based Measurement of Mathematics Competence: From Computation to Concepts and Applications to Real-Life Problem Solving

    ERIC Educational Resources Information Center

    Fuchs, Lynn S.; Fuchs, Douglas; Courey, Susan J.

    2005-01-01

    In this article, the authors explain how curriculum-based measurement (CBM) differs from other forms of classroom-based assessment. The development of CBM is traced from computation to concepts and applications to real-life problem solving, with examples of the assessments and illustrations of research to document technical features and utility…

  15. The Effects of Computer-Aided Concept Cartoons and Outdoor Science Activities on Light Pollution

    ERIC Educational Resources Information Center

    Aydin, Güliz

    2015-01-01

    The purpose of this study is to create an awareness of light pollution on seventh grade students via computer aided concept cartoon applications and outdoor science activities and to help them develop solutions; and to determine student opinions on the practices carried out. The study was carried out at a middle school in Mugla province of Aegean…

  16. The Effectiveness of Interactive Computer Assisted Modeling in Teaching Study Strategies and Concept Mapping of College Textbook Material.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    A study evaluated the effectiveness of a series of print materials and interactive computer-guided study programs designed to lead undergraduate students to apply basic textbook reading and concept mapping strategies to the study of science and social science textbooks. Following field testing with 25 learning skills students, 50 freshman biology…

  17. GAMMON: An Approach to the Concept of Strategy in Game-Playing Programs.

    ERIC Educational Resources Information Center

    Bushey, William Edward

    In order to investigate the use of strategies in a game-playing computer program, "Gammon," a computer program that plays Backgammon, was developed. It focuses on the play of a given strategy, as well as the process of strategy selection, and examines the concept of strategy as an integrating and driving force in the play of a game. A…

  18. Evaluation of Airframe Noise Reduction Concepts via Simulations Using a Lattice Boltzmann Approach

    NASA Technical Reports Server (NTRS)

    Fares, Ehab; Casalino, Damiano; Khorrami, Mehdi R.

    2015-01-01

    Unsteady computations are presented for a high-fidelity, 18% scale, semi-span Gulfstream aircraft model in landing configuration, i.e. flap deflected at 39 degrees and main landing gear deployed. The simulations employ the lattice Boltzmann solver PowerFLOW® to simultaneously capture the flow physics and acoustics in the near field. Sound propagation to the far field is obtained using a Ffowcs Williams and Hawkings acoustic analogy approach. In addition to the baseline geometry, which was presented previously, various noise reduction concepts for the flap and main landing gear are simulated. In particular, care is taken to fully resolve the complex geometrical details associated with these concepts in order to capture the resulting intricate local flow field, thus enabling accurate prediction of their acoustic behavior. To determine aeroacoustic performance, the far-field noise predicted with the concepts applied is compared to high-fidelity simulations of the untreated baseline configurations. To assess the accuracy of the computed results, the aerodynamic and aeroacoustic impact of the noise reduction concepts is evaluated numerically and compared to experimental results for the same model. The trends and effectiveness of the simulated noise reduction concepts compare well with measured values and demonstrate that the computational approach is capable of capturing the primary effects of the acoustic treatment on a full aircraft model.

  19. PACS-Graz, 1985-2000: from a scientific pilot to a state-wide multimedia radiological information system

    NASA Astrophysics Data System (ADS)

    Gell, Guenther

    2000-05-01

    Implementation of a computerized radiological documentation system began in 1971/72 at the Department of Radiology of the University of Graz and developed over the years into a full RIS. In 1985, a scientific cooperation with SIEMENS to develop a PACS was started. The two systems were linked and evolved into a highly integrated RIS-PACS for the state-wide hospital system in Styria. During its lifetime the RIS, originally implemented in FORTRAN on a UNIVAC 494 mainframe, migrated to a PDP15, then a PDP11, then VAX and Alpha machines. The flexible original record structure with variable-length fields and the powerful retrieval language were retained. The data acquisition part with the user interface was rewritten several times and many service programs have been added. During our PACS cooperation, many ideas such as the folder concept and functionalities of the GUI were designed and tested and were then implemented in the SIENET product. The current RIS/PACS supports the whole workflow of the Radiology Department. It is installed in a 2,300-bed university hospital and the smaller hospitals of the State of Styria. Modalities from different vendors are connected via DICOM to the RIS (modality worklist) and to the PACS. PACS subsystems from other vendors have been integrated. Images are distributed to referring clinics and for teleconsultation and image processing, and reports are available online to all connected hospitals. Great effort was spent to guarantee optimal support of the workflow and to ensure an enhanced cost/benefit ratio for each user (class). Another special feature is selective image distribution: using the high-level retrieval language, individual filters can be constructed easily to implement any image distribution policy agreed upon by radiologists and referring clinicians.

  20. Numerical propulsion system simulation

    NASA Technical Reports Server (NTRS)

    Lytle, John K.; Remaklus, David A.; Nichols, Lester D.

    1990-01-01

    The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept which is described will provide that framework.

  1. Incorporating Concept Mapping in Project-Based Learning: Lessons from Watershed Investigations

    ERIC Educational Resources Information Center

    Rye, James; Landenberger, Rick; Warner, Timothy A.

    2013-01-01

    The concept map tool set forth by Novak and colleagues is underutilized in education. A meta-analysis has encouraged teachers to make extensive use of concept mapping, and researchers have advocated computer-based concept mapping applications that exploit hyperlink technology. Through an NSF sponsored geosciences education grant, middle and…

  2. Coherent concepts are computed in the anterior temporal lobes.

    PubMed

    Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J

    2010-02-09

    In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.

  3. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. The results indicate that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  4. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design and analysis of integrated spacecraft and on automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline application modules, the executive and data management software, and the graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run-time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  5. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  6. Concept of software interface for BCI systems

    NASA Astrophysics Data System (ADS)

    Svejda, Jaromir; Zak, Roman; Jasek, Roman

    2016-06-01

    Brain Computer Interface (BCI) technology is intended to control an external system by brain activity. One of the main parts of such a system is the software interface, which ensures clear communication between the brain and either the computer or additional devices connected to it. This paper is organized as follows. Firstly, current knowledge about the human brain is briefly summarized to point out its complexity. Secondly, a concept of a BCI system is described, which is then used to build the architecture of the proposed software interface. Finally, disadvantages of the sensing technology discovered during the sensing part of our research are mentioned.

  7. Nuclear Analysis

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Kirby, K. D.

    1973-01-01

    Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.

  8. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings and the elasticity and scalability of ICT use. The aim of this paper is to explore the concept of cloud computing and its use in the area of biomedicine. The authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform, and service layers, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm aspects of cloud computing are analysed. Finally, the potential and current use of this technology in biomedical scientific research is discussed.

  9. The challenge of computer mathematics.

    PubMed

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
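    The division of labour the abstract describes, the human supplying definitions and proofs while the machine checks them, can be illustrated with a tiny machine-checked proof. The example below (in Lean 4) is ours, not from the paper:

```lean
-- The human states the theorem and supplies a proof term;
-- the proof assistant verifies that the term has the stated type,
-- so an incorrect proof is rejected at check time.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

    Libraries of such checked results are exactly the "mathematician-friendly" infrastructure the authors identify as the remaining challenge.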

  10. A cross-disciplinary introduction to quantum annealing-based algorithms

    NASA Astrophysics Data System (ADS)

    Venegas-Andraca, Salvador E.; Cruz-Santos, William; McGeoch, Catherine; Lanzagorta, Marco

    2018-04-01

    A central goal in quantum computing is the development of quantum hardware and quantum algorithms to analyse challenging scientific and engineering problems. Research in quantum computation involves contributions from both physics and computer science; hence this article presents a concise introduction to basic concepts from both fields that are used in annealing-based quantum computation, an alternative to the more familiar quantum gate model. We introduce some concepts from computer science required to define difficult computational problems and to realise the potential relevance of quantum algorithms to finding novel solutions to those problems. We introduce the structure of quantum annealing-based algorithms as well as two examples of such algorithms for solving instances of the max-SAT and Minimum Multicut problems. An overview of the quantum annealing systems manufactured by D-Wave Systems is also presented.
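    To make the MAX-SAT side concrete: the problem asks for an assignment satisfying as many clauses as possible, and an annealer searches for the assignment of minimum "energy" (fewest unsatisfied clauses). The toy instance below is invented for illustration, and an exhaustive loop stands in for the annealer:

```python
from itertools import product

# Toy MAX-SAT instance over variables x1..x3 (made up for illustration):
# each clause is a tuple of literals; i means x_i, -i means NOT x_i.
clauses = [(1, 2), (-1, 3), (-2, -3), (1, -3)]

def satisfied(clause, assignment):
    # assignment maps a variable index to a boolean value
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

def count_satisfied(assignment):
    return sum(satisfied(c, assignment) for c in clauses)

# Exhaustive search stands in for the annealer: minimizing the number of
# unsatisfied clauses is the same as maximizing the satisfied count.
best = max(
    (dict(zip((1, 2, 3), bits)) for bits in product([False, True], repeat=3)),
    key=count_satisfied,
)
print(count_satisfied(best))  # -> 4: this toy instance is fully satisfiable
```

    A real annealing formulation would encode the same objective as a QUBO or Ising energy function, which the hardware then minimizes physically instead of by enumeration.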

  11. Semantic Similarity between Web Documents Using Ontology

    NASA Astrophysics Data System (ADS)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is a source of information structured as interlinked web pages. However, extracting significant information with the assistance of a search engine is extremely difficult, because web information is written mainly in natural language and is primarily intended for human readers. Several efforts have been made to compute semantic similarity between documents using words, concepts, and relationships between concepts, but the results still fall short of user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts present in the documents but also the relationships between those concepts. In our approach, documents are processed by constructing an ontology for each document using a base ontology and a dictionary of concept records, where each record consists of the probable words that represent a given concept. Finally, the document ontologies are compared, taking the relationships among concepts into account, to find their semantic similarity. Relevant concepts and relations between concepts are identified by capturing author and user intention. The proposed semantic analysis technique provides improved results compared to existing techniques.
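    The concept-overlap part of such a comparison can be sketched briefly: map words to concepts through a dictionary, then compare the resulting concept-frequency vectors. The dictionary and documents below are invented, and the paper's full method additionally weighs relationships between concepts, which this sketch omits:

```python
import math
from collections import Counter

# Hypothetical word-to-concept dictionary (standing in for the paper's
# "dictionary containing concepts records"; contents are made up).
concept_dict = {"car": "vehicle", "automobile": "vehicle",
                "road": "infrastructure", "highway": "infrastructure"}

def concept_vector(text):
    # Map each known word to its concept and count concept occurrences.
    return Counter(concept_dict[w] for w in text.lower().split() if w in concept_dict)

def cosine(u, v):
    # Standard cosine similarity over sparse concept-count vectors.
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

a = concept_vector("the car left the highway")
b = concept_vector("an automobile on the road")
print(round(cosine(a, b), 3))  # same concept profile despite different words -> 1.0
```

    Working at the concept level is what lets "car" and "automobile" match; a purely word-based comparison of these two sentences would score near zero.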

  13. Potential application of artificial concepts to aerodynamic simulation

    NASA Technical Reports Server (NTRS)

    Kutler, P.; Mehta, U. B.; Andrews, A.

    1984-01-01

    The concept of artificial intelligence as it applies to computational fluid dynamics simulation is investigated. How expert systems can be adapted to speed the numerical aerodynamic simulation process is also examined. A proposed expert grid generation system is briefly described which, given flow parameters, configuration geometry, and simulation constraints, uses knowledge about the discretization process to determine grid point coordinates, computational surface information, and zonal interface parameters.
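    The rule-based style of expert system described above can be sketched as a tiny forward-chaining loop. The rules, facts, and conclusions below are invented for illustration and are not taken from the proposed grid generation system:

```python
# Toy forward-chaining sketch of a rule-based expert system: given facts
# about the flow and geometry, rules fire and add discretization decisions.
rules = [
    (lambda f: f.get("flow") == "viscous", {"wall_spacing": "fine"}),
    (lambda f: f.get("geometry") == "multi-body", {"topology": "zonal"}),
]

def infer(facts):
    # Apply every rule whose condition holds, accumulating conclusions.
    derived = dict(facts)
    for condition, conclusion in rules:
        if condition(derived):
            derived.update(conclusion)
    return derived

result = infer({"flow": "viscous", "geometry": "multi-body"})
print(result)
```

    An actual expert grid generator would encode far richer knowledge about the discretization process, but the pattern, declarative rules applied to a fact base, is the same.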

  14. Staff Perspectives on the Use of a Computer-Based Concept for Lifestyle Intervention Implemented in Primary Health Care

    ERIC Educational Resources Information Center

    Carlfjord, Siw; Johansson, Kjell; Bendtsen, Preben; Nilsen, Per; Andersson, Agneta

    2010-01-01

    Objective: The aim of this study was to evaluate staff experiences of the use of a computer-based concept for lifestyle testing and tailored advice implemented in routine primary health care (PHC). Design: The design of the study was a cross-sectional, retrospective survey. Setting: The study population consisted of staff at nine PHC units in the…

  15. The Effect of 3D Computer Modeling and Observation-Based Instruction on the Conceptual Change regarding Basic Concepts of Astronomy in Elementary School Students

    ERIC Educational Resources Information Center

    Kucukozer, Huseyin; Korkusuz, M. Emin; Kucukozer, H. Asuman; Yurumezoglu, Kemal

    2009-01-01

    This study has examined the impact of teaching certain basic concepts of astronomy through a predict-observe-explain strategy, which includes three-dimensional (3D) computer modeling and observations on conceptual changes seen in sixth-grade elementary school children (aged 11-13; number of students: 131). A pre- and postastronomy instruction…

  16. Pathology informatics questions and answers from the University of Pittsburgh pathology residency informatics rotation.

    PubMed

    Harrison, James H

    2004-01-01

    Effective pathology practice increasingly requires familiarity with concepts in medical informatics that may cover a broad range of topics, for example, traditional clinical information systems, desktop and Internet computer applications, and effective protocols for computer security. To address this need, the University of Pittsburgh (Pittsburgh, Pa) includes a full-time, 3-week rotation in pathology informatics as a required component of pathology residency training. The rotation aims to teach pathology residents general informatics concepts important in pathology practice. We assess the efficacy of the rotation in communicating these concepts using a short-answer examination administered at the end of the rotation. Because the increasing use of computers and the Internet in education and general communications prior to residency training has the potential to communicate key concepts that might not need additional coverage in the rotation, we have also evaluated incoming residents' informatics knowledge using a similar pretest. This article lists 128 questions that cover a range of topics in pathology informatics at a level appropriate for residency training. These questions were used for pretests and posttests in the pathology informatics rotation in the Pathology Residency Program at the University of Pittsburgh for the years 2000 through 2002. With slight modification, the questions are organized here into 15 topic categories within pathology informatics. The answers provided are brief and are meant to orient the reader to the question and suggest the level of detail appropriate in an answer from a pathology resident. A previously published evaluation of the test results revealed that pretest scores did not increase during the 3-year evaluation period, and self-assessed computer skill level correlated with pretest scores, but all pretest scores were low. Posttest scores increased substantially and did not correlate with the self-assessed computer skill level recorded at pretest time. Even residents who rated themselves high in computer skills lacked many concepts important in pathology informatics, and posttest scores showed that residents with both high and low self-assessed skill levels learned pathology informatics concepts effectively.

  17. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  18. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between the fields of chemistry and computer graphics and technology, developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video games and an authentic chemistry context into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game, including possible learning outcomes and game design elements. A chemistry-based game of mixed genre, a single-player first-person game with embedded action-adventure and puzzle components, was developed to determine whether students' level of understanding of chemistry concepts changes after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts before and after the gameplay intervention. Two main assessment instruments (a pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game, and that the game served both as a review of previously learned concepts and as a way to apply that prior knowledge. To ensure a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game appear to be needed. The use of visuals to represent chemical processes, the game genre, and the game idea appear to be the design elements that students like most about the current computer video game.

  19. The development of a computer-assisted instruction system for clinical nursing skills with virtual instruments concepts: A case study for intra-aortic balloon pumping.

    PubMed

    Chang, Ching-I; Yan, Huey-Yeu; Sung, Wen-Hsu; Shen, Shu-Cheng; Chuang, Pao-Yu

    2006-01-01

    The purpose of this research was to develop a computer-aided instruction system for intra-aortic balloon pumping (IABP) skills in clinical nursing using virtual instrument (VI) concepts. Computer graphic technologies were incorporated to provide not only static clinical nursing education but also a simulated function for operating an expensive medical instrument with VI techniques. The content of the nursing knowledge was adapted from current, well-accepted clinical training materials. The VI functions were developed using computer graphic technology with photos of real medical instruments taken by a digital camera. We hope the system can provide important teaching assistance to beginners in nursing education.

  20. Stored program concept for analog computers

    NASA Technical Reports Server (NTRS)

    Hannauer, G., III; Patmore, J. R.

    1971-01-01

    Optimization of three-stage matrices, modularization, and black-box design techniques provide for automatically interconnecting computing-component inputs and outputs in a general purpose analog computer. The design also produces a relatively inexpensive and less complex automatic patching system.
