Science.gov

Sample records for microcomputer database management

  1. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  2. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  3. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  4. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  5. Managing Microcomputer Technology as an Organizational Resource.

    ERIC Educational Resources Information Center

    Khosrowpour, Mehdi; Amoroso, Donald

    With the realization that microcomputers provide an extraordinary value to the organization follows the need to address a variety of issues in order to more effectively manage these resources. Each of the 14 chapters, consisting of papers written by different authors, represents a different perspective existing in organizations with respect to the…

  6. Database Manager

    ERIC Educational Resources Information Center

    Martin, Andrew

    2010-01-01

    It is normal practice today for organizations to store large quantities of records of related information as computer-based files or databases. Purposeful information is retrieved by performing queries on the data sets. The purpose of DATABASE MANAGER is to communicate to students the method by which the computer performs these queries. This…
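
    The query mechanism this abstract alludes to can be sketched in a few lines: a minimal scan-and-filter over stored records, written here in Python rather than in whatever language DATABASE MANAGER itself uses. The records and predicate are invented for illustration.

```python
# Minimal sketch (not the DATABASE MANAGER program itself) of how a
# query engine retrieves purposeful information: scan every stored
# record and keep those matching a predicate.

records = [
    {"id": 1, "name": "Ada",   "dept": "physics"},
    {"id": 2, "name": "Grace", "dept": "chemistry"},
    {"id": 3, "name": "Alan",  "dept": "physics"},
]

def query(table, predicate):
    """Linear scan: test every record against the predicate."""
    return [row for row in table if predicate(row)]

physicists = query(records, lambda r: r["dept"] == "physics")
```

    A real DBMS adds indexes so it can avoid scanning every record, but the selection step is the same idea.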

  7. An Evaluator's Guide to Using DB MASTER: A Microcomputer Based File Management Program. Research on Evaluation Program, Paper and Report Series No. 91.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    Ways a microcomputer can be used to establish and maintain an evaluation database and types of data management features possible on a microcomputer are described in this report, which contains step-by-step procedures and numerous examples for establishing a database, manipulating data, and designing and printing reports. Following a brief…

  8. The Microcomputer in the Library: III. Information Retrieval from External and Internal Databases.

    ERIC Educational Resources Information Center

    Leggate, Peter; Dyer, Hilary

    1986-01-01

    Identifies two types of applications software for microcomputers: (1) communications, file transfer, and search assistance software, which facilitates intelligent access to external databases; and (2) software to support local database creation and searching. Functions of each type of software are described and examples of commercial packages are…

  9. Library Micro-Computing, Vol. 1. Reprints from the Best of "ONLINE" [and]"DATABASE."

    ERIC Educational Resources Information Center

    Online, Inc., Weston, CT.

    Reprints of 18 articles pertaining to library microcomputing appear in this collection, the first of two volumes on this topic in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1) an integrated library…

  10. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  11. Microcomputers and the Library: A Planning Guide for Managers.

    ERIC Educational Resources Information Center

    Walton, Robert A.

    This manual was designed to provide the library manager or supervisor with a basic understanding of microcomputer hardware, software, procurement, and supervision. While developed for a summer workshop series, it can also serve as an introductory text. Separate chapters cover 10 major topics: (1) the microtechnology revolution; (2) how a…

  12. Microcomputers & Educational Researchers: Writing, Project Management, Statistics Software, & Data Retrieval.

    ERIC Educational Resources Information Center

    Knirk, Frederick G.

    Designed to assist educational researchers in utilizing microcomputers, this paper presents information on four types of computer software: writing tools for educators, statistical software designed to perform analyses of small and moderately large data sets, project management tools, and general education/research oriented information services…

  13. Microcomputer Software for Online Information Management: An Overview.

    ERIC Educational Resources Information Center

    Nieuwenhuysen, Paul

    1988-01-01

    Summary of the findings of a critical overview of more than 200 microcomputer programs using PC-DOS/MS-DOS for online information management discusses software for telecommunications, conversion/reformatting, storage and retrieval of text information, and word processing. (6 references) (MES)

  14. Chesapeake Bay database (version 1. 00) (for microcomputers). Data file

    SciTech Connect

    Not Available

    1988-11-11

    The Chesapeake Bay Database contains 337 records of discrete water quality observations, collected on 3 oceanographic cruises during the summers of 1985, 1986, and 1987. Each record contains 64 fields listing the hydrographic, chemical and biological data measured for each observation.
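
    A record-per-observation data file of this kind can be loaded and summarized with a few lines of standard tooling; the field names and layout below are illustrative assumptions, not the actual 64-field record format of the Chesapeake Bay Database.

```python
# Hypothetical sketch of reading discrete water-quality observations
# like those the Chesapeake Bay Database describes (cruise, station,
# hydrographic and chemical fields) and computing a simple summary.
import csv
import io

raw = """cruise,station,depth_m,temp_c,salinity_psu
1985-07,CB1,5.0,24.1,12.3
1986-07,CB1,5.0,25.0,11.8
"""

observations = list(csv.DictReader(io.StringIO(raw)))
mean_temp = sum(float(o["temp_c"]) for o in observations) / len(observations)
```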

  15. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  16. Assurance Program for Remedial Action (APRA) microcomputer-operated bibliography management system

    SciTech Connect

    Stenner, R.D.; Washburn, D.K.; Denham, D.H.

    1985-06-01

    Pacific Northwest Laboratory (PNL) provided technical assistance to the Office of Operational Safety (OOS) in developing their Assurance Program for Remedial Action (APRA). The APRA Bibliography Management System (BMS), a microcomputer-operated system designed to file, locate and retrieve project-specific bibliographic data, was developed to manage the documentation associated with APRA. The BMS uses APRABASE, a PNL-developed computer program written in dBASE II language, which is designed to operate using the commercially available dBASE II database software. This document describes the APRABASE computer program, its associated subprograms, and the dBASE II APRA file. A User's Manual is also provided in the document. Although the BMS was designed to manage APRA-associated documents, it could be easily adapted for use in handling bibliographic data associated with any project.
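
    The file/locate/retrieve pattern a bibliography manager like APRABASE provides can be sketched with SQLite standing in for dBASE II; the table layout, column names, and sample records are assumptions for illustration only.

```python
# Sketch of filing project-specific bibliographic data and retrieving
# it by project, in the spirit of the APRA Bibliography Management
# System (SQLite here, not the original dBASE II implementation).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE biblio (doc_id TEXT, title TEXT, project TEXT)")
con.executemany("INSERT INTO biblio VALUES (?, ?, ?)", [
    ("PNL-001", "Remedial action survey",  "APRA"),
    ("PNL-002", "Soil sampling protocol",  "APRA"),
    ("XYZ-001", "Unrelated memo",          "OTHER"),
])

# Locate all documents filed under a given project.
rows = con.execute(
    "SELECT doc_id FROM biblio WHERE project = ? ORDER BY doc_id",
    ("APRA",),
).fetchall()
```

    As the abstract notes, nothing in this pattern is APRA-specific: pointing the same table at another project's documents adapts it immediately.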

  17. Microcomputers in Small Business Management. Leadership and Training Series No. 64.

    ERIC Educational Resources Information Center

    Heath, Betty; Camp, William G.

    This guide is designed to assist vocational educators in training individuals at the secondary, postsecondary, and adult levels to use microcomputers in small business management. An overview of the use of microcomputers in the small business setting is provided in the introduction. Included in the next section is a multi-page matrix dealing with…

  18. Small Business Microcomputer Programs: Tools for Library Media Center Management.

    ERIC Educational Resources Information Center

    Yerkey, A. Neil

    1984-01-01

    This examination of ways to use general-purpose commercial software to assist in the management of library media centers focuses on database management programs, word processing, authoring systems, and calculator (spreadsheet) programs. System requirements and purchasing information for representative programs (address, price) are provided. Six…

  19. Online Searching of Bibliographic Databases: Microcomputer Access to National Information Systems.

    ERIC Educational Resources Information Center

    Coons, Bill

    This paper describes the range and scope of various information databases available for technicians, researchers, and managers employed in forestry and the forest products industry. Availability of information on reports of field and laboratory research, business trends, product prices, and company profiles through national distributors of…

  20. A Soft Sell for Hardware: The Use of Microcomputer Technology for Cost Effective Special Education Management.

    ERIC Educational Resources Information Center

    Miller, Rosemary; Ragghianti, Suzanne

    The role of the computer as manager of instruction in special education is discussed. In Part 1, basic computer terms are introduced under the major headings of data measurement, hardware, and software. Part 2 focuses on selection criteria for microcomputers. Suggestions for evaluating software are given in terms of ease of use, documentation,…

  1. An End-User Approach to Using Microcomputers in Teaching Production/Operations Management.

    ERIC Educational Resources Information Center

    Luebbe, Richard L.; Finch, Byron J.

    1989-01-01

    Advocates using an end-user approach to microcomputers in production/operations management. This approach requires the student to design and create spreadsheet models to solve problems, enhancing subject area knowledge, creativity, and problem-solving and computer skills. (SK)

  2. Development of a Microcomputer/Videodisc Aided Math Instructional Management System for Mildly Handicapped Children. Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.

    This final report describes activities and accomplishments of a project which developed, implemented, and evaluated the effectiveness of a microcomputer/videodisc math instructional management system for grades K-4. The system was designed to operate on an APPLE II microcomputer, videodisc player, and input-output devices. It included three…

  3. Requirements Management Database

    2009-08-13

    This application is a simplified and customized version of the RBA and CTS databases to capture federal, site, and facility requirements and link them to actions that must be performed to maintain compliance with contractual and other requirements.

  4. Urological history-taking and management recommendations by microcomputer.

    PubMed

    Glen, E S; Small, D R; Morrison, L M; Pollock, K

    1989-02-01

    A system has been developed to acquire a complete urological history using an Apple microcomputer. The system can ask up to 300 multiple choice questions which the patient answers using a light pen. The questions are grouped into blocks for urological symptoms and complicating factors. A printout summarises the history and the recommended further investigations. A copy of this is given to the patient and the referring doctor. The clinician discusses the printout and proposed investigations with the patient. The system has been tested against experienced clinicians and the results are presented. The computer system was evaluated for 26 patients and was found to record all of the important information. The system is now in regular use in the out-patient clinic as the first part of the diagnostic work-up in suitable referrals. This system has shortened waiting times for first appointments. To date 261 patients have used the system. The consultant urologist continues his practice of reading all referral letters and allocating priorities. Conditions requiring immediate physical examination (e.g. testicular swelling) are not suitable for this type of approach.

  5. Designing a Decision Support System (DSS) for Academic Library Managers Using Preprogrammed Application Software on a Microcomputer.

    ERIC Educational Resources Information Center

    McDonald, Joseph

    1986-01-01

    Focusing on management decisions in academic libraries, this article compares management information systems (MIS) with decision support systems (DSS) and discusses the decision-making process, information needs of library managers, sources of data, reasons for choosing microcomputer, preprogrammed application software, prototyping a system, and…

  6. Administrative Uses of Microcomputers.

    ERIC Educational Resources Information Center

    Crawford, Chase

    1987-01-01

    This paper examines the administrative uses of the microcomputer, stating that high performance educational managers are likely to have microcomputers in their organizations. Four situations that would justify the use of a computer are: (1) when massive amounts of data are processed through well-defined operations; (2) when data processing is…

  7. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the "future computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database managing system(s) that permits use of relational, hierarchical, object-oriented, GIS, et al., databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  8. TWRS technical baseline database manager definition document

    SciTech Connect

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  9. Construction of file database management

    SciTech Connect

    Merrill, Kyle J.

    2000-03-01

    This work created a database for tracking data analysis files from multiple lab techniques and equipment stored on a central file server. Experimental details appropriate for each file type are pulled from the file header and stored in a searchable database. The database also stores the specific location and sub-directory structure for each data file. Queries can be run on the database according to file type, sample type, or other experimental parameters. The database was constructed in Microsoft Access, and Visual Basic was used for extraction of information from the file header.
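
    The approach the abstract describes, pulling experimental details from each data file's header and storing them with the file's location in a searchable table, can be sketched as follows (Python and SQLite in place of Access and Visual Basic; the header format, paths, and column names are assumptions).

```python
# Sketch: extract "key: value" metadata from data-file headers and
# index it, with each file's location, in a searchable table.
import sqlite3

files = {  # path -> file contents (hypothetical examples)
    "/data/run1.dat": "# technique: XRD\n# sample: alloy-A\n1.0 2.0\n",
    "/data/run2.dat": "# technique: SEM\n# sample: alloy-B\n3.0 4.0\n",
}

def parse_header(text):
    """Read 'key: value' pairs from leading '#' comment lines."""
    meta = {}
    for line in text.splitlines():
        if not line.startswith("#"):
            break
        key, _, value = line.lstrip("# ").partition(": ")
        meta[key] = value
    return meta

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE datafile (path TEXT, technique TEXT, sample TEXT)")
for path, text in files.items():
    h = parse_header(text)
    con.execute("INSERT INTO datafile VALUES (?, ?, ?)",
                (path, h["technique"], h["sample"]))

# Query by experimental parameter, as the abstract describes.
xrd = con.execute(
    "SELECT path FROM datafile WHERE technique = 'XRD'").fetchall()
```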

  10. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  11. Current and Future Microcomputer Capabilities: Selecting the Hardware.

    ERIC Educational Resources Information Center

    Mason, Robert M.

    1984-01-01

    Suggests guidelines for selecting microcomputers, reviews present status of microcomputer industry, projects microcomputer capabilities for 1988 and 1998, and discusses implications of recent trends on microcomputer purchasing. It is concluded that buyers interested in a microcomputer for information management will make a safe decision selecting…

  12. Accessing ERIC with Your Microcomputer. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This fact sheet offers basic instructions on connecting to the ERIC database for individuals who own or have access to a microcomputer and are familiar with ERIC and how to search it through a database terminal. Software, hardware, and telephone line components necesary to make a microcomputer act as a database terminal are outlined. The…

  13. Frame-Based Approach To Database Management

    NASA Astrophysics Data System (ADS)

    Voros, Robert S.; Hillman, Donald J.; Decker, D. Richard; Blank, Glenn D.

    1989-03-01

    Practical knowledge-based systems need to reason in terms of knowledge that is already available in databases. This type of knowledge is usually represented as tables acquired from external databases and published reports. Knowledge based systems provide a means for reasoning about entities at a higher level of abstraction. What is needed in many of today's expert systems is a link between the knowledge base and external databases. One such approach is a frame-based database management system. Package Expert (PEx) designs packages for integrated circuits. The thrust of our work is to bring together diverse technologies, data and design knowledge in a coherent system. PEx uses design rules to reason about properties of chips and potential packages, including dimensions, possible materials and packaging requirements. This information is available in existing databases. PEx needs to deal with the following types of information consistently: material databases which are in several formats; technology databases, also in several formats; and parts files which contain dimensional information. It is inefficient and inelegant to have rules access the database directly. Instead, PEx uses a frame-based hierarchical knowledge management approach to databases. Frames serve as the interface between rule-based knowledge and databases. We describe PEx and the use of frames in database retrieval. We first give an overview and the design evolution of the expert system. Next, we describe the system implementation. Finally, we describe how the rules in the expert system access the databases via frames.
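
    The frame idea described above, a layer that lets rules ask for slot values without knowing which backing table supplies them, can be sketched as below. The class, slot names, and material data are illustrative assumptions, not PEx's actual representation.

```python
# Sketch of a frame as the interface between rule-based knowledge and
# an external database: a slot lookup tries local values, then the
# parent frame, then falls through to the backing table.

MATERIALS_DB = {"alumina": {"thermal_conductivity": 30.0}}  # "external" table

class Frame:
    def __init__(self, name, parent=None, slots=None):
        self.name, self.parent = name, parent
        self.slots = dict(slots or {})

    def get(self, slot):
        if slot in self.slots:                     # local slot value
            return self.slots[slot]
        if self.parent is not None:                # inherit from parent frame
            return self.parent.get(slot)
        return MATERIALS_DB.get(self.name, {}).get(slot)  # database fallback

ceramic = Frame("alumina")                          # backed by the database
package = Frame("dip-40", parent=ceramic, slots={"pin_count": 40})

k = package.get("thermal_conductivity")             # resolved via inheritance
```

    A rule reasoning about the package never touches the materials table directly, which is the inefficiency the abstract says frames are meant to avoid.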

  14. Microcomputer Data Base Programs in Social Research.

    ERIC Educational Resources Information Center

    Tate, C. Neal

    1986-01-01

    Microcomputer uses by social researchers include writing programs, standard spreadsheets and data base management. In addition, microcomputers can increase the effectiveness and efficiency of information gathering by improving notetaking and organizing. Software developments will help make microcomputer data base management tools, now not…

  15. Ridesharing and the database management system

    SciTech Connect

    Taasevigen, D.

    1981-08-01

    Lawrence Livermore National Laboratory has operated a ridesharing program since 1977. As the volume of recordkeeping and information tracking for the program became more extensive, the need for an easily altered and operated database system became apparent. The following report describes the needs of the ridesharing program and how our database management system answers those needs.

  16. The land management and operations database (LMOD)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for land management and operations reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  17. A Bibliography of Text Information Management Software for IBM Microcomputers and Compatibles.

    ERIC Educational Resources Information Center

    Nieuwenhuysen, Paul

    1988-01-01

    This bibliography lists 754 books and articles which focus on programs that can run on IBM microcomputers and compatibles using PC DOS/MS DOS, and that can be used in online information and documentation work. An index lists the software packages alphabetically by title and references them to the main entries, which are listed by author. (CLB)

  18. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a lethal threat to, and a research focus of, the security of network information. As new viruses emerge, the number of viruses grows and virus classification becomes increasingly complex. Because different agencies capture samples at different times, virus naming cannot be unified. Although each agency has its own virus database, communication between them is lacking, virus information is incomplete, or only a small number of samples is described. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and complete the description of virus characteristics, and then gives a computer virus database design scheme providing information integrity, storage security, and manageability.

  19. Expert systems identify fossils and manage large paleontological databases

    SciTech Connect

    Beightol, D.S.; Conrad, M.A.

    1988-02-01

    EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.
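
    Identification from morphological descriptors, the core of the expert system described above, can be sketched as matching an unknown specimen's descriptor set against stored taxa. The taxa and descriptors below are invented for illustration and are not EXPAL's actual knowledge base.

```python
# Sketch of descriptor-based identification in the spirit of EXPAL:
# rank stored taxa by how many morphological descriptors they share
# with the unknown specimen.

TAXA = {  # hypothetical dasycladalean taxa and descriptors
    "Cylindroporella": {"whorled_branches", "calcified_thallus"},
    "Salpingoporella": {"calcified_thallus", "perforated_wall"},
}

def identify(observed):
    """Return candidate taxa, best match first."""
    scored = [(len(observed & descriptors), name)
              for name, descriptors in TAXA.items()]
    scored.sort(reverse=True)
    return [name for score, name in scored if score > 0]

best = identify({"calcified_thallus", "perforated_wall"})[0]
```

    A production system like EXPAL would also ask follow-up questions to discriminate between equally scored candidates rather than simply ranking them.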

  20. Organizing a breast cancer database: data management.

    PubMed

    Yi, Min; Hunt, Kelly K

    2016-06-01

    Developing and organizing a breast cancer database can provide data and serve as valuable research tools for those interested in the etiology, diagnosis, and treatment of cancer. Depending on the research setting, the quality of the data can be a major issue. Assuring that the data collection process does not contribute inaccuracies can help to assure the overall quality of subsequent analyses. Data management is work that involves the planning, development, implementation, and administration of systems for the acquisition, storage, and retrieval of data while protecting it by implementing high security levels. A properly designed database provides you with access to up-to-date, accurate information. Database design is an important component of application design. If you take the time to design your databases properly, you'll be rewarded with a solid application foundation on which you can build the rest of your application.

  2. Inside Microcomputers.

    ERIC Educational Resources Information Center

    Frederick, Franz J.

    1982-01-01

    The internal processes of microcomputer functioning are explained. Components include: (1) a central processing unit; (2) memories which store programs and data; (3) a clock which determines the order in which a computer performs its operations; (4) a bus consisting of receptacles for additional installations; (5) interfaces which connect the…

  3. Microcomputer Guide.

    ERIC Educational Resources Information Center

    Fors, George, Ed.

    Designed for use by school districts introducing computer mathematics into the curriculum, this manual provides guidelines for selecting a microcomputer system, as well as objectives and an outline for an introductory course in computer programming. Also presented are topics for computer applications in science, mathematics, chemistry, and…

  4. Microcomputer Acquisition Standards and Controls.

    ERIC Educational Resources Information Center

    Wold, Geoffrey H.

    1987-01-01

    Increased use of microcomputers in schools can be implemented more effectively when management develops acquisitions standards and controls. Technical standards as well as operational and documentation standards are outlined. (MLF)

  5. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  6. How Should we Manage all These Databases?

    SciTech Connect

    Langley, K.E.

    1998-11-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  7. How Should We Manage All Those Databases?

    SciTech Connect

    Langley, K.E.

    1998-10-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  8. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  9. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
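
    The structured event logging the abstract describes, classified events stored in tables that monitoring tools can query over historical archives, can be sketched as below. The schema, severity names, and sample events are assumptions for illustration, not SMDB's actual design.

```python
# Sketch of a classified event log queried for monitoring and
# troubleshooting (SQLite standing in for the Oracle RDBMS).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE event_log (
    ts INTEGER, severity TEXT, source TEXT, message TEXT)""")

def log_event(ts, severity, source, message):
    """Append one classified event to the archive."""
    con.execute("INSERT INTO event_log VALUES (?, ?, ?, ?)",
                (ts, severity, source, message))

log_event(100, "INFO",  "scheduler", "track configured")
log_event(101, "ERROR", "antenna",   "pointing fault")
log_event(102, "INFO",  "scheduler", "track complete")

# A monitoring application pulls only the events it cares about.
errors = con.execute(
    "SELECT source FROM event_log WHERE severity = 'ERROR'").fetchall()
```

    SMDB additionally publishes such events to a JMS messaging system for delivery to subscribers; this sketch covers only the stored-table side.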

  10. Processing requirements of secure C3/I and battle management systems - Development of Gemini trusted multiple microcomputer base

    NASA Astrophysics Data System (ADS)

    Tao, T. F.; Schell, R. R.

    The present investigation is concerned with the potential applications of trusted computer system technologies in space. It is suggested that the rapidly expanding roles of new space defense missions will require space-borne command, control, communication, intelligence, and battle management (C3/I-BM) systems. Trusted computer system technology can be extended to develop new computer architectures able to support the broader requirements of C3/I-BM processing. The Gemini Trusted Multiple Microcomputer Base product is being developed to meet these demanding requirements and to support multiple capabilities simultaneously. Attention is given to recent important events in trusted computer system development and to the Gemini system architecture.

  11. Pre-Validated Signal Database Management System

    1996-12-18

    SPRT/DBMS is a pre-validated experimental database management system for industries where large volumes of process signals are acquired and archived. This system implements a new and powerful pattern recognition method, the spectrum transformed sequential testing (STST or ST2) procedure. A network of interacting ST2 modules deployed in parallel is integrated with a relational DBMS to fully validate process signals as they are archived. This reliable, secure DBMS then provides system modelers, code developers, and safety analysts with an easily accessible source of fully validated process data.
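    The ST2 procedure itself is not detailed in the abstract; it builds on the sequential probability ratio test (SPRT). A minimal sketch of the classical Wald SPRT for detecting a mean shift in a Gaussian signal, with illustrative parameters:

    ```python
    import math

    def sprt_gaussian(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
        """Wald's SPRT: decide between H0 (mean mu0) and H1 (mean mu1).

        Returns ("H0" | "H1" | "undecided", number of samples consumed).
        alpha/beta are the target false-alarm and missed-detection rates.
        """
        upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
        lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            # log-likelihood ratio increment for one Gaussian observation
            llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "undecided", len(samples)
    ```

    A strongly shifted signal is flagged after only a couple of samples, e.g. `sprt_gaussian([5.0] * 10)` decides for H1 at the second observation; this early-decision property is what makes sequential tests attractive for validating signals as they stream into an archive.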

  12. HGDBMS: a human genetics database management system.

    PubMed

    Seuchter, S A; Skolnick, M H

    1988-10-01

    Human genetics research involves a large number of complex data sets naturally organized in hierarchical structures. Data collection is performed on different levels, e.g., the project level, pedigree level, individual level, and sample level. Different aspects of a study utilize different views of the data, requiring a flexible database management system (DBMS) which satisfies these different needs for data collection and retrieval. We describe HGDBMS, a comprehensive relational DBMS, implemented as an application of the GENISYS I DBMS, which allows embedding the hierarchical structure of pedigrees in a relational structure. The system's file structure is described in detail. Currently our Melanoma and Chromosome 17 map studies are managed with HGDBMS. Our initial experience demonstrates the value of a flexible system which supports the needs for data entry, update, storage, reporting, and analysis required during different phases of genetic research. Further developments will focus on the integration of HGDBMS with a human genetics expert system shell and analysis programs. PMID:3180747

  14. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
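    As a worked illustration of the kind of processing described (the paper's own algorithms are not given, so this is an assumed two-body sketch): a TLE's mean-motion field alone yields the orbital period and, via Kepler's third law, the semi-major axis:

    ```python
    import math

    MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

    def orbit_from_mean_motion(n_revs_per_day):
        """Derive period and semi-major axis from a TLE's mean motion field.

        Illustrative only: real TLE propagation (e.g. SGP4) accounts for drag
        and perturbations that this two-body calculation ignores.
        """
        period_s = 86400.0 / n_revs_per_day
        a_km = (MU_EARTH * (period_s / (2 * math.pi)) ** 2) ** (1.0 / 3.0)
        return period_s, a_km

    period, a = orbit_from_mean_motion(15.5)  # roughly ISS-like mean motion
    # period is about 5574 s, a about 6795 km
    ```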

  15. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  16. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS, including six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  17. The Microcomputer and School Transportation.

    ERIC Educational Resources Information Center

    Dembowski, Frederick L.

    1984-01-01

    Microcomputers have many cost- and time-saving uses in school transportation management. Applications include routing and scheduling, demographic analysis, fleet maintenance, and personnel and contract management. Word processing is especially promising for storing and updating documents like specifications. Enrollment forecasting and inventory…

  18. Advanced Scientific Computing Environment Team new scientific database management task. Progress report

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) permitting use of relational, hierarchical, object-oriented, GIS, and other databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  19. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  20. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
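    The KFD's actual schema is not reproduced in the abstract; a minimal sketch of the SQL-transaction pattern it describes, using SQLite and invented feature attributes:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE karst_feature (
        feature_id   INTEGER PRIMARY KEY,
        feature_type TEXT NOT NULL,   -- e.g. 'sinkhole', 'spring'
        county       TEXT NOT NULL,
        utm_easting  REAL,
        utm_northing REAL)""")

    rows = [("sinkhole", "Fillmore", 567100.0, 4838200.0),
            ("spring",   "Olmsted",  551300.0, 4876900.0)]

    # the connection as a context manager gives transactional consistency:
    # all rows commit together, or the whole batch rolls back on error
    with conn:
        conn.executemany(
            "INSERT INTO karst_feature (feature_type, county, utm_easting, utm_northing) "
            "VALUES (?, ?, ?, ?)", rows)

    count = conn.execute(
        "SELECT COUNT(*) FROM karst_feature WHERE feature_type = 'sinkhole'"
    ).fetchone()[0]  # 1
    ```

    Access permissions, logging, and scheduled backups, as described in the abstract, would sit on top of this core transactional layer in the production DBMS.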

  1. Managing an Elementary School Reading Program: How a Microcomputer Can Help.

    ERIC Educational Resources Information Center

    Einhorn, Edith

    The purpose of the computerized Reading Program Management System is to assist reading specialists of the District Heights Elementary School (Maryland) to perform four specific reading program tasks: placement, grouping, monitoring, and materials supply. The system uses a general purpose, commercial data management software package called DB…

  2. Database Design for Preservation Project Management: The California Newspaper Project.

    ERIC Educational Resources Information Center

    Hayman, Lynne M.

    1997-01-01

    Describes a database designed to manage a serials preservation project in which issues from multiple repositories are gathered and collated for preservation microfilming. Management information, added to bibliographic and holdings records, supports the production of reports tracking preservation activity. (Author)

  3. Microcomputers in Geography.

    ERIC Educational Resources Information Center

    Snaden, James N.; And Others

    Geographers in the United States rely heavily on microcomputers. They employ microcomputers to enhance three general categories of tasks: word processing and other productivity needs, geographic instruction, and discipline-specific applications. Word processing and desktop publishing continue to be the primary uses of microcomputers by…

  4. Basic Information on Microcomputers.

    ERIC Educational Resources Information Center

    Dembowski, Frederick L.

    1983-01-01

    The second in a series of articles on the use of microcomputers in the school business office contains a summary of the most important concepts and issues concerning the central processing unit and internal storage aspects of the microcomputer. All microcomputer jargon is italicized for easy recognition. (MLF)

  5. Microcomputer Applications in Agriculture.

    ERIC Educational Resources Information Center

    Hilgenberg, Gene; And Others

    This curriculum guide is intended to assist persons teaching a course in microcomputer applications in agriculture. (These applications are designed to be used on Apple IIe or TRS-80 microcomputers.) Addressed in the individual units of instruction are the following topics: microcomputer operating procedures; procedures for evaluating and…

  6. Automating Reference Desk Files with Microcomputers in a Public Library: An Exploration of Data Resources, Methods, and Software.

    ERIC Educational Resources Information Center

    Miley, David W.

    Many reference librarians still rely on manual searches to access vertical files, ready reference files, and other information stored in card files, drawers, and notebooks scattered around the reference department. Automated access to these materials via microcomputers using database management software may speed up the process. This study focuses…

  7. Microcomputers: "A New Era at Ramapo Catskill."

    ERIC Educational Resources Information Center

    Freund, Alfred L.

    1983-01-01

    Discussion of the use of microcomputers in a cooperative public library system notes library management applications in areas of clerical work, word processing, book ordering, inventories, special collection catalogs, mailing lists, and a union list of serials. (EJS)

  8. Integrated Space Asset Management Database and Modeling

    NASA Astrophysics Data System (ADS)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques, and user interfaces for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government Off-the-Shelf information sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data is shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  10. The role of databases in areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A database is a comprehensive collection of related data organized for convenient access, generally in a computer. The evolution of computer software, and the need to distinguish specialized computer systems for storing and manipulating data, stimulated development of database management systems...

  11. Adapting the rangeland database for managing ecological site description data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Field data collection for writing Ecological Site Descriptions (ESDs) creates a paperwork burden that reduces the efficiency of ESD preparation. The recently developed Rangeland Database and Field Data Entry System is well suited to managing ESD data. This database was developed to automate data entry an...

  12. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  13. Managing a large database of camera fingerprints

    NASA Astrophysics Data System (ADS)

    Goljan, Miroslav; Fridrich, Jessica; Filler, Tomáš

    2010-01-01

    Sensor fingerprint is a unique noise-like pattern caused by slightly varying pixel dimensions and inhomogeneity of the silicon wafer from which the sensor is made. The fingerprint can be used to prove that an image came from a specific digital camera. The presence of a camera fingerprint in an image is usually established using a detector that evaluates cross-correlation between the fingerprint and image noise. The complexity of the detector is thus proportional to the number of pixels in the image. Although computing the detector statistic for a few megapixel image takes several seconds on a single-processor PC, the processing time becomes impractically large if a sizeable database of camera fingerprints needs to be searched through. In this paper, we present a fast searching algorithm that utilizes special "fingerprint digests" and sparse data structures to address several tasks that forensic analysts will find useful when deploying camera identification from fingerprints in practice. In particular, we develop fast algorithms for finding if a given fingerprint already resides in the database and for determining whether a given image was taken by a camera whose fingerprint is in the database.
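    A minimal sketch of the cross-correlation detector the abstract refers to, in pure Python over flattened pixel arrays (a real implementation would first extract noise residuals with a denoising filter, and the threshold below is purely illustrative):

    ```python
    import math

    def ncc(a, b):
        """Normalized cross-correlation between two equal-length signals."""
        n = len(a)
        mean_a = sum(a) / n
        mean_b = sum(b) / n
        num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
        den_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
        den_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
        return num / (den_a * den_b) if den_a and den_b else 0.0

    def same_camera(image_noise, fingerprint, threshold=0.01):
        # declare the fingerprint present when correlation exceeds a threshold;
        # real systems calibrate the threshold to a target false-alarm rate
        return ncc(image_noise, fingerprint) > threshold
    ```

    The cost of `ncc` is linear in the number of pixels, which is exactly why a naive search over a large fingerprint database is impractical and motivates the digest-based sparse search the paper develops.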

  14. Managerial Applications of the Microcomputer for Special Education Teachers. For Your Information.

    ERIC Educational Resources Information Center

    Griffith-Sheriff, Denise; Walter, Virginia

    An overview of the uses of microcomputers in special education management is provided. Following a list of nine applications of microcomputers to educational management is a brief description of microcomputers currently used in education. A listing of five firms currently marketing special education management software includes information of…

  15. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  16. Communications Software for Microcomputers.

    ERIC Educational Resources Information Center

    Bruman, Janet L.

    Focusing on the use of microcomputers as "smart terminals" for accessing time-sharing systems for libraries, this document discusses the communications software needed to allow the microcomputer to appear as a terminal to the remote host. The functions which communications software programs are designed to perform are defined and explained,…

  17. Doing Physics with Microcomputers.

    ERIC Educational Resources Information Center

    Bak, Per

    1983-01-01

    Describes how microcomputers can perform very demanding/large-scale physics calculations at speeds not much slower than those of modern, full-size computers. Among the examples provided are a Monte Carlo simulation of the three-dimensional Ising model and a program (for the Apple microcomputer) using the time-independent Schrodinger Equation. (JN)

  18. Miracles, Microcomputers, and Librarians.

    ERIC Educational Resources Information Center

    Swanson, Don R.

    1982-01-01

    Describes potential uses of microcomputers in library education and library automation based upon experiences with a four-user Altos ACS8000 microcomputer at the University of Chicago Graduate Library School. Word processing, training in online information retrieval using MIRABILIS (Microsystem for Interactive Bibliographic Searching), and…

  19. Microcomputers and Literacy.

    ERIC Educational Resources Information Center

    Grice, R. D.

    1986-01-01

    The nature of literacy associated with the widely used new medium of microcomputers has not been fully exploited by schools to foster development of literacy programs. Microcomputer applications need integration with classroom activities where students construct language meaning. (19 references) (CJH)

  20. Instructional Microcomputer Report.

    ERIC Educational Resources Information Center

    Black, B. R.

    Intended to aid teachers, administrators, and interested parents by informing them about the state of the educational microcomputer market, this report is meant to be a guide to school districts in the acquisition and maintenance of microcomputers and related instructional materials. Trends in educational computing technology are noted in relation…

  1. Expansion of the MANAGE database with forest and drainage studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  2. Development of a Relational Database for Learning Management Systems

    ERIC Educational Resources Information Center

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-Based Distance Education Systems have a great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  3. Microcomputers for Information Storage and Retrieval.

    ERIC Educational Resources Information Center

    Kanters, Ben

    1983-01-01

    Report on use of the microcomputer for information storage and retrieval (ISR) notes hardware (floppy disks, keyboard, screen, printer); functions of ISR software; standardization of the operating system; database creation; data entry; indexing; search process; choice of ISR software package; software market and user; training and instruction; and…

  4. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists who take advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscientific investigations have led to databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  5. Evidence generation from healthcare databases: recommendations for managing change.

    PubMed

    Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C

    2016-07-01

    There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time, so it is vital to consider the impact of changes in data, access methodology, and the environment. The authors discuss change in communication and management, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. PMID:27183900

  6. DOE technology information management system database study report

    SciTech Connect

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  7. TRENDS: The aeronautical post-test database management system

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test database are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  8. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 1970s E. F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.
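    The two properties the abstract attributes to the relational model, flat tables with no physical links, related only by values, and simple declarative queries, can be illustrated with a minimal sketch using Python's built-in sqlite3 module (the table and column names are hypothetical, chosen only for illustration):

```python
import sqlite3

# In-memory relational database: data lives in flat tables with no
# physical or internal links between them; relationships are expressed
# purely by shared values (here, dept_id).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE emp (emp_id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER)")
conn.executemany("INSERT INTO dept VALUES (?, ?)", [(1, "Research"), (2, "Sales")])
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [(10, "Alice", 1), (11, "Bob", 2), (12, "Carol", 1)])

# A declarative query joins the tables by value, with no knowledge of
# physical storage -- the property that made data directly accessible
# to the general database user.
rows = conn.execute(
    "SELECT emp.name, dept.name FROM emp JOIN dept USING (dept_id) "
    "WHERE dept.name = 'Research' ORDER BY emp.name"
).fetchall()
print(rows)  # [('Alice', 'Research'), ('Carol', 'Research')]
```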

  9. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    A DBMS with several features particularly useful to scientists and engineers, RIM5 interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census data, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  10. Kingfisher: a system for remote sensing image database management

    NASA Astrophysics Data System (ADS)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases, with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, for example QBIC and IKONA, but they are not able to extract and properly describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss, independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and the results seem to be very encouraging.
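    Similarity retrieval of the kind described, matching a query image's content descriptors (for example texture parameters) against descriptors stored for each archived image, reduces to a nearest-neighbour search over feature vectors. A minimal sketch, with invented image identifiers and descriptor values:

```python
import math

# Hypothetical archive: image id -> texture descriptor vector
# (e.g. contrast, homogeneity, entropy), precomputed at ingest time.
archive = {
    "xsar_001": [0.82, 0.10, 3.1],
    "xsar_002": [0.20, 0.75, 1.4],
    "xsar_003": [0.78, 0.15, 2.9],
}

def retrieve(query, k=2):
    """Rank archived images by Euclidean distance to the query descriptor."""
    return sorted(archive, key=lambda img: math.dist(query, archive[img]))[:k]

# Query with a descriptor close to images 001 and 003.
print(retrieve([0.80, 0.12, 3.0]))  # ['xsar_001', 'xsar_003']
```

    Because the descriptors are compact compared with the images themselves, this comparison can run over quick-look data without touching the full-resolution archive, which is the time/memory advantage the abstract notes.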

  11. Microcomputer Applications in Analytical Chemistry.

    ERIC Educational Resources Information Center

    Long, Joseph W.

    The first part of this paper addresses the following topics: (1) the usefulness of microcomputers; (2) applications for microcomputers in analytical chemistry; (3) costs; (4) major microcomputer systems and subsystems; and (5) which microcomputer to buy. Following these brief comments, the major focus of the paper is devoted to a discussion of…

  12. Development of the ageing management database of PUSPATI TRIGA reactor

    NASA Astrophysics Data System (ADS)

    Ramli, Nurhayati; Maskin, Mazleha; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum; Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal

    2016-01-01

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as prominent issues. In addressing the ageing issues, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  13. Microcomputer Technical Overview.

    ERIC Educational Resources Information Center

    Moursund, David

    1984-01-01

    A rationale for understanding computer operations is given. An overview of microcomputer technology, including an introduction to computer software, hardware, input and output devices, central processing unit, primary and secondary memory, and videodisk interactive systems is presented. (Author/BS)

  14. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  15. Use of Knowledge Bases in Education of Database Management

    ERIC Educational Resources Information Center

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid in teaching database management. You can follow the order of the course from the beginning, when some topics first appear in elementary school, through the topics covered in secondary…

  16. Interface between astrophysical datasets and distributed database management systems (DAVID)

    NASA Technical Reports Server (NTRS)

    Iyengar, S. S.

    1988-01-01

    This is a status report on the progress of the DAVID (Distributed Access View Integrated Database Management System) project being carried out at Louisiana State University, Baton Rouge, Louisiana. The objective is to implement an interface between Astrophysical datasets and DAVID. Discussed are design details and implementation specifics between DAVID and astrophysical datasets.

  17. Database Management Principles of the UCLA Library's Orion System.

    ERIC Educational Resources Information Center

    Fayollat, James; Coles, Elizabeth

    1987-01-01

    Describes an integrated online library system developed at the University of California at Los Angeles (UCLA) which incorporates a number of database management features that enhance efficiency, for record retrieval and display. Design features related to record storage and retrieval and the design of linked files are described in detail.…

  18. Management and Planning Issues in the Use of Microcomputers in Schools. Occasional Paper in Educational Planning, Management and Statistics No. 11.

    ERIC Educational Resources Information Center

    Lancaster, David

    Reasons underlying the growth of interest in Asia and the Pacific region in educational computing and issues raised by such developments are examined in this paper, which begins by describing three main areas of use of microcomputers in schools--for teaching computer studies, for computer assisted learning, and for school administration. Reasons…

  19. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods in which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
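    The core idea, centralized searchable metadata from which modality-specific case subsets are generated, can be sketched with a small relational schema. The schema and field names below are illustrative only, not the authors' actual MySQL design (sqlite3 stands in for MySQL to keep the sketch self-contained):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE cases (
    case_id   INTEGER PRIMARY KEY,
    modality  TEXT,   -- e.g. 'FFDM', 'ultrasound', 'MRI', 'screen-film'
    finding   TEXT,   -- e.g. 'malignant', 'benign'
    source    TEXT    -- anonymized clinical source
)""")
conn.executemany("INSERT INTO cases VALUES (?, ?, ?, ?)", [
    (1, "ultrasound", "malignant", "site_a"),
    (2, "FFDM",       "benign",    "site_b"),
    (3, "ultrasound", "benign",    "site_a"),
])

# Generate a research subset matching a user's search criteria, as the
# web interface would for a particular CAD project.
subset = conn.execute(
    "SELECT case_id FROM cases WHERE modality = ? AND finding = ?",
    ("ultrasound", "malignant")).fetchall()
print([row[0] for row in subset])  # [1]
```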

  20. Management of Equipment Databases at CERN for the Atlas Experiment

    NASA Astrophysics Data System (ADS)

    Galvão, Kaio Karam; Pommès, Kathy; Molina-Pérez, Jorge; Maidantchik, Carmen; Grael, Felipe Fink

    2008-06-01

    The ATLAS experiment is about to finish its installation phase, entering into operation in the summer of 2008. This installation has represented an enormous challenge in terms of developing, setting up, and administrating the equipment databases, due to the large complexity of the detector, its associated services, and the necessary infrastructure. All major equipment is registered prior to installation, including its electronic description and interconnectivity. This information is stored in Oracle databases. 3D visualization tools, user interfaces for portable devices, and generic retrieval/updating mechanisms have been developed in order to carry out the management of the sub-detector databases. The full traceability of all installed equipment is crucial from the ATLAS organizational point of view, and is also required by the French authorities to fulfill the INB (Installation Nucléaire de Base) protocol.

  1. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  2. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

    Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of the standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.

  3. Region and database management for HANDI 2000 business management system

    SciTech Connect

    Wilson, D.

    1998-08-26

    The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, consisting of PassPort and PeopleSoft software, supports finance, supply, chemical management/Material Safety Data Sheets, and human resources.

  4. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  5. The Oil and Natural Gas Knowledge Management Database from NETL

    DOE Data Explorer

    The Knowledge Management Database (KMD) Portal provides four options for searching the documents and data that NETL-managed oil and gas research has produced over the years for DOE's Office of Fossil Energy. Information covers both historical and ongoing DOE oil and gas research and development (R&D). The Document Repository, the CD/DVD Library, the Project Summaries from 1990 to the present, and the Oil and Natural Gas Program Reference Shelf provide a wide range of flexibility and coverage.

  6. Fifty "Best" Database and File Management Packages for Academic Libraries.

    ERIC Educational Resources Information Center

    Garten, Edward D.

    1985-01-01

    Lists computer programs selected on basis of following criteria: (1) applicability or easy adaptability to academic or research library environment; (2) operative on three microcomputer brands found in academic and research libraries; (3) good or excellent reviews appearing in software magazines; (4) price ranging from $50-$350. Vendor addresses…

  7. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization makes it possible to perform operations, such as query and visualization, over many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check on the working status of each running piece of software through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, providing the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data-access policy for the users.
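    The standardization step, bringing measures from different sources onto a common time scale so they can be queried and visualized together, can be sketched as resampling each raw series onto a shared, equally spaced grid. The series names, sample values, and nearest-sample policy below are all illustrative assumptions, not TSDSystem's actual algorithm:

```python
# Hypothetical raw series: (timestamp_seconds, value), irregularly sampled
# by two different instruments.
seismic = [(0.0, 1.0), (0.9, 1.2), (2.1, 0.8), (3.0, 1.1)]
thermal = [(0.5, 20.0), (2.4, 21.5), (3.6, 22.0)]

def resample(series, t0, t1, step):
    """Nearest-sample resampling onto a common, equally spaced time scale."""
    out = []
    t = t0
    while t <= t1:
        nearest = min(series, key=lambda s: abs(s[0] - t))
        out.append((t, nearest[1]))
        t += step
    return out

# Both series now share one time scale and can be queried, joined, or
# plotted against each other directly.
grid_seismic = resample(seismic, 0.0, 3.0, 1.0)
grid_thermal = resample(thermal, 0.0, 3.0, 1.0)
print(grid_seismic)  # [(0.0, 1.0), (1.0, 1.2), (2.0, 0.8), (3.0, 1.1)]
print(grid_thermal)
```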

  8. Microcomputer Analysis of Children's Language Samples.

    ERIC Educational Resources Information Center

    Rosenkoetter, Sharon E.; Rice, Mabel L.

    The workshop paper examines the use of microcomputer packages to analyze spontaneous language samples of children with communication disorders. Advantages of computerized analysis are seen to include time saving, more efficient data management, and increased objectivity. To help consumers determine which programs to buy, four aspects are…

  9. Teaching Real Science with a Microcomputer.

    ERIC Educational Resources Information Center

    Naiman, Adeline

    1983-01-01

    Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology,""Health Awareness Games,""Heredity Dog," and…

  10. Criteria for the Evaluation of Microcomputer Courseware.

    ERIC Educational Resources Information Center

    Cohen, Vicki Blum

    1983-01-01

    Discusses attributes which are offered as set of standards to judge instructional software--those unique to design of microcomputer courseware and those included in design of all instruction. Curriculum role, modes of interaction, computer managed instruction, graphics, feedback, packaging, and manuals are noted. Fourteen references are included.…

  11. Technology and Microcomputers for an Information Centre/Special Library.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1984-01-01

    Discusses use of microcomputer hardware and software, telecommunications methods, and advanced library methods to create a specialized information center's database of literature relating to farm machinery and food processing. Systems and services (electronic messaging, serials control, database creation, cataloging, collections, circulation,…

  12. Extending the Online Public Access Catalog into the Microcomputer Environment.

    ERIC Educational Resources Information Center

    Sutton, Brett

    1990-01-01

    Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…

  13. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
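    A centralized task database of the kind described, tasks linked to teams, requirements, and products in external repositories, with filtered reports, can be sketched in a couple of tables. The schema, task names, and repository URI below are invented for illustration and are far simpler than the deployed system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tasks (task_id INTEGER PRIMARY KEY, title TEXT,
                    team TEXT, requirement TEXT, due TEXT);
CREATE TABLE products (product_id INTEGER PRIMARY KEY, task_id INTEGER,
                       uri TEXT);   -- link into an external repository
""")
conn.execute("INSERT INTO tasks VALUES (1, 'Fill TBD-042', 'GN&C', 'REQ-17', '2007-03-01')")
conn.execute("INSERT INTO tasks VALUES (2, 'Ascent sim',   'Loads', 'REQ-03', '2007-04-15')")
conn.execute("INSERT INTO products VALUES (1, 1, 'repo://analysis/tbd-042')")

# A filtered report: one team's tasks with their delivered products,
# the kind of view the data filters and export utilities would produce.
report = conn.execute("""
    SELECT t.title, t.due, p.uri
    FROM tasks t LEFT JOIN products p ON p.task_id = t.task_id
    WHERE t.team = 'GN&C'
""").fetchall()
print(report)  # [('Fill TBD-042', '2007-03-01', 'repo://analysis/tbd-042')]
```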

  14. Computerized database management system for breast cancer patients.

    PubMed

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is from 50 to 59 years old. Results suggest that the chance of developing breast cancer is increased in older women and reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
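    The kind of automatic aggregation such a system performs, for example incidence counts grouped by age bracket before plotting, reduces to a grouped query over the patient table. The table and ages below are invented, not the study's records (sqlite3 stands in for MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, age INTEGER)")
conn.executemany("INSERT INTO patients (age) VALUES (?)",
                 [(45,), (52,), (55,), (58,), (61,), (53,)])

# Count incidence per 10-year age bracket, as an embedded calculation
# tool might do before handing the result to a plotting routine.
rows = conn.execute("""
    SELECT (age / 10) * 10 AS bracket, COUNT(*) AS n
    FROM patients GROUP BY bracket ORDER BY bracket
""").fetchall()
print(rows)  # [(40, 1), (50, 4), (60, 1)]
```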

  15. Microcomputers in the Laboratory.

    ERIC Educational Resources Information Center

    Rafert, Bruce; Nicklin, R. C.

    1982-01-01

    A one-semester hour laboratory course introduced junior and senior physics majors to assembly language programing and to interfacing KIM-1 microcomputer to experiments. A general purpose interface to a standard breadboard was developed. Course details, apparatus, and some interfacing projects are given. (Author/SK)

  16. Microcomputer Applications Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 16 subjects appropriate for use in a competency list for the occupation of microcomputer applications specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 16 units are…

  17. Microcomputer Controlled Experiments.

    ERIC Educational Resources Information Center

    Kirkman, John; Knaggs, David

    1982-01-01

    Describes a microcomputer-controlled system which determines the current/voltage characteristics of a resistor, lamp, and diode, detailing system elements, construction, and providing printout of the program developed to provide control and arithmetic functions necessary to complete the experiment. (SK)

  18. Microcomputers in Education.

    ERIC Educational Resources Information Center

    Anderson, Cheryl A.

    Designed to answer basic questions educators have about microcomputer hardware and software and their applications in teaching, this paper describes the revolution in computer technology that has resulted from the development of the microchip processor and provides information on the major computer components; i.e.; input, central processing unit,…

  19. Storage Media for Microcomputers.

    ERIC Educational Resources Information Center

    Trautman, Rodes

    1983-01-01

    Reviews computer storage devices designed to provide additional memory for microcomputers--chips, floppy disks, hard disks, optical disks--and describes how secondary storage is used (file transfer, formatting, ingredients of incompatibility); disk/controller/software triplet; magnetic tape backup; storage volatility; disk emulator; and…

  20. Management Guidelines for Database Developers' Teams in Software Development Projects

    NASA Astrophysics Data System (ADS)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during the DBDs' work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team and could contribute towards an increase in the efficiency of these teams in their work on software development projects.

  1. An examination of electronic file transfer between host and microcomputers for the AMPMODNET/AIMNET (Army Material Plan Modernization Network/Acquisition Information Management Network) classified network environment

    SciTech Connect

    Hake, K.A.

    1990-11-01

    This report presents the results of investigation and testing conducted by Oak Ridge National Laboratory (ORNL) for the Project Manager -- Acquisition Information Management (PM-AIM), and the United States Army Materiel Command Headquarters (HQ-AMC). It concerns the establishment of file transfer capabilities on the Army Materiel Plan Modernization (AMPMOD) classified computer system. The discussion provides a general context for micro-to-mainframe connectivity and focuses specifically upon two possible solutions for file transfer capabilities. The second section of this report contains a statement of the problem to be examined, a brief description of the institutional setting of the investigation, and a concise declaration of purpose. The third section lays a conceptual foundation for micro-to-mainframe connectivity and provides a more detailed description of the AMPMOD computing environment. It gives emphasis to the generalized International Business Machines, Inc. (IBM) standard of connectivity because of the predominance of this vendor in the AMPMOD computing environment. The fourth section discusses two test cases as possible solutions for file transfer. The first solution used is the IBM 3270 Control Program telecommunications and terminal emulation software. A version of this software was available on all the IBM Tempest Personal Computer 3s. The second solution used is Distributed Office Support System host electronic mail software with Personal Services/Personal Computer microcomputer e-mail software running with IBM 3270 Workstation Program for terminal emulation. Test conditions and results are presented for both test cases. The fifth section provides a summary of findings for the two possible solutions tested for AMPMOD file transfer. The report concludes with observations on current AMPMOD understanding of file transfer and includes recommendations for future consideration by the sponsor.

  2. Microcomputers and astronomical navigation.

    NASA Astrophysics Data System (ADS)

    Robin-Jouan, Y.

    1996-04-01

    Experienced navigators remember traditional astronomical navigation and its limitations. Using microcomputers in small packages and selecting up-to-date, efficient methods overcomes many of these limitations. Both features focus attention on the observations themselves and encourage taking more of them. With no intention of competing with satellite navigation, sextant navigation in the open sea thus becomes accessible again to anybody. It can be considered for demonstrative use or as a complement to GPS.

  3. Object and file management in the EXODUS extensible database system

    SciTech Connect

    Carey, M.J.; DeWitt, D.J.; Richardson, J.E.; Shekita, E.J.

    1986-03-01

    This paper describes the design of the object-oriented storage component of EXODUS, an extensible database management system currently under development at the University of Wisconsin. The basic abstraction in the EXODUS storage system is the storage object, an uninterpreted variable-length record of arbitrary size; higher-level abstractions such as records and indices are supported via the storage object abstraction. One of the key design features described here is a scheme for managing large dynamic objects, as storage objects can occupy many disk pages and can grow or shrink at arbitrary points. The data structures and algorithms used to support such objects are described, and performance results from a preliminary prototype of the EXODUS large-object management scheme are presented. A scheme for maintaining versions of large objects is also described. Also discussed are the file structure used in the EXODUS storage system, which provides a mechanism for grouping and sequencing through a set of related storage objects, and the EXODUS approach to buffer management, concurrency control, and recovery for both small and large objects. 30 refs., 13 figs.
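
    The large-object scheme the abstract describes (objects spanning many pages, growing or shrinking at arbitrary points) can be illustrated with a toy sketch. This is not the EXODUS data structure itself, which uses a B+tree-like index over variable-size page segments; it is a minimal Python stand-in that keeps an object as a list of byte chunks so an insertion only rewrites one chunk. The chunk size and class name are invented for illustration.

```python
CHUNK = 4  # tiny "page" size, chosen small only for demonstration

class LargeObject:
    """Toy paged storage object: insertions at arbitrary offsets touch one chunk."""

    def __init__(self, data=b""):
        self.chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)] or [b""]

    def read(self, offset, n):
        # Naive full scan for clarity; a real system would index into pages.
        return b"".join(self.chunks)[offset:offset + n]

    def insert(self, offset, data):
        # Locate the chunk containing `offset`, then splice the new bytes in.
        for i, c in enumerate(self.chunks):
            if offset <= len(c):
                self.chunks[i:i + 1] = [c[:offset] + data + c[offset:]]
                return
            offset -= len(c)

obj = LargeObject(b"ABCDEFGH")
obj.insert(4, b"xy")  # grows the object in the middle; only one chunk rewritten
```

A real implementation would also split oversized chunks and handle deletion, but the point here is only that growth at an arbitrary offset need not rewrite the whole object.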

  4. Representing clinical communication knowledge through database management system integration.

    PubMed

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels form a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough methodology literature review to identify strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument, and we introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository. PMID:22874366

  6. TLC for Growing Minds. Microcomputer Projects. Elementary Intermediate Microcomputer Projects.

    ERIC Educational Resources Information Center

    Buxton, Marilyn

    Designed to improve students' thinking, learning, and creative skills while they learn to program a microcomputer in the BASIC programming language, this book for intermediate learners at the elementary school level provides a variety of microcomputer activities designed to extend the concepts learned in accompanying instructional manuals (Volumes 3…

  7. Enhanced DIII-D Data Management Through a Relational Database

    NASA Astrophysics Data System (ADS)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Metadata about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. The database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
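
    The cross-shot queries described above can be sketched with SQLite standing in for the production database. The table and column names below (a `summaries` table with per-shot physics quantities) are invented for illustration; the point is that one SQL statement mines many discharges at once.

```python
import sqlite3

# Hypothetical stand-in for a per-shot summary table; shot numbers and
# physics values are made up for this example.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE summaries (
    shot   INTEGER PRIMARY KEY,  -- discharge number
    ip_ma  REAL,                 -- plasma current (MA), hypothetical column
    beta_n REAL                  -- normalized beta, hypothetical column
)""")
conn.executemany("INSERT INTO summaries VALUES (?, ?, ?)",
                 [(101234, 1.2, 2.1), (101235, 1.4, 2.4), (101236, 0.9, 1.8)])

# A single declarative query spans all shots -- no per-shot file handling:
rows = conn.execute(
    "SELECT shot, beta_n FROM summaries WHERE ip_ma > 1.0 ORDER BY shot"
).fetchall()
```

The same query, issued through ODBC, is what makes the data reachable from tools such as Excel or Access without any custom code.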

  8. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    The Italian National Levee Database (INLED) collects data on Italian levees and historical breach failures to be exploited in the framework of an operational procedure addressed to the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. For its structure, INLED is a dynamic geospatial database with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database is aimed to provide the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data of levees that are mostly located in the Tiber basin, Central Italy. References Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6, (2), 149-162. Camici S., Barbetta S. & Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H. R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19, (4), 717-731.

  9. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  10. Management of the life and death of an earth-science database: some examples from geotherm

    USGS Publications Warehouse

    Bliss, J.D.

    1986-01-01

    Productive earth-science databases require managers who are familiar with and skilled at using available software developed specifically for database management. There also should be a primary user with a clearly understood mission. The geologic phenomenon addressed by the database must be sufficiently understood, and adequate appropriate data must be available to construct a useful database. The database manager, in concert with the primary user, must ensure that data of adequate quality are available in the database, as well as prepare for mechanisms of releasing the data when the database is terminated. The primary user needs to be held accountable along with the database manager to ensure that a useful database will be created. Quality of data and maintenance of database relevancy to the user's mission are important issues during the database's lifetime. Products prepared at termination may be used more than the operational database and thus are of critical importance. These concepts are based, in part, on both the shortcomings and successes of GEOTHERM, a comprehensive system of databases and software used to store, locate, and evaluate the geology, geochemistry, and hydrology of geothermal systems. © 1986.

  11. Microcomputers: Applications in Vocational Education.

    ERIC Educational Resources Information Center

    Rodenstein, Judith, Ed.; Lambert, Roger, Ed.

    This handbook was assembled for vocational educators so that they can see the applications of microcomputers in both their instructional and administrative tasks. The 22 papers included in the handbook were written by educators who are and have been using microcomputers extensively in their work. The first section of the handbook discusses the…

  12. Microcomputers and the Classroom Order.

    ERIC Educational Resources Information Center

    Olson, John K.

    A preliminary case study used repertory grid and stimulated recall techniques to examine how teachers make sense of the impact of microcomputers on their work. Emphasis was on how teachers construe their classroom influence in relation to the actual use of microcomputers and to idealizations of their use. A version of the Kelly (1955) repgrid was…

  13. Conference Abstracts: Microcomputers in Education.

    ERIC Educational Resources Information Center

    Baird, William E.

    1985-01-01

    Provides abstracts of five papers presented at the Fourth Annual Microcomputers in Education Conference. Papers considered microcomputers in science laboratories, Apple II Plus/e computer-assisted instruction in chemistry, computer solutions for space mechanics concerns, computer applications to problem solving and hypothesis testing, and…

  14. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programing tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  15. Information survey for microcomputer systems integration

    SciTech Connect

    Hake, K.A.

    1991-12-01

    One goal of the PM-AIM is to provide US Army Project Managers (PMs) and Project Executive Officers (PEOs) with a fundamental microcomputing resource to help perform acquisition information management and its concomitant reporting requirements. Providing key application software represents one means of accomplishing this goal. This workstation would furnish a broad range of capabilities needed in the PM and PEO office settings as well as software tools for specific project management and acquisition information. Although still in the conceptual phase, the practical result of this exercise in systems integration will likely be a system called the Project Manager's Information System (PMIS) or the AIM workstation. It would include such software as Project Manager's System Software (PMSS), Defense Acquisition Executive Summary (DAES), and Consolidated Acquisition Reporting System (CARS) and would conform to open systems architecture as accepted by the Department of Defense. ORNL has assisted PM-AIM in the development of technology ideas for the PMIS workstation concept. This paper represents the compilation of information gained during this process. This information is presented as a body of knowledge (or knowledge domain) defining the complex technology of microcomputing. The concept of systems integration, or tying together all hardware and software components, reflects the nature of PM-AIM's task in attempting to field a PMIS or AIM workstation.

  16. Simple Interval Timers for Microcomputers.

    ERIC Educational Resources Information Center

    McInerney, M.; Burgess, G.

    1985-01-01

    Discusses simple interval timers for microcomputers, including (1) the Jiffy clock; (2) CPU count timers; (3) screen count timers; (4) light pen timers; and (5) chip timers. Also examines some of the general characteristics of all types of timers. (JN)

  17. Interfacing Microcomputers with Laboratory Instruments.

    ERIC Educational Resources Information Center

    Long, Joseph W.

    1983-01-01

    Describes development of microcomputer-controlled gamma scintillation spectrometer and chromatographic data analyzer, including design and construction of interface electronics and production of software. Includes diagrams of electric circuits and project evaluation indicating that both instruments functioned as intended. (JN)

  18. Microcomputer Activities and Occupational Therapy.

    ERIC Educational Resources Information Center

    Wall, Nancy

    1984-01-01

    Directed to occupational therapists, the article focuses on the applications of microcomputers to services for developmentally disabled persons. Noted are computer devices (input, output, software, and firmware); computer programs (basic and sophisticated instruction, graphics); and LOGO, a computer language.

  19. A convenient and adaptable microcomputer environment for DNA and protein sequence manipulation and analysis.

    PubMed Central

    Pustell, J; Kafatos, F C

    1986-01-01

    We describe the further development of a widely used package of DNA and protein sequence analysis programs for microcomputers (1,2,3). The package now provides a screen oriented user interface, and an enhanced working environment with powerful formatting, disk access, and memory management tools. The new GenBank floppy disk database is supported transparently to the user and a similar version of the NBRF protein database is provided. The programs can use sequence file annotation to automatically annotate printouts and translate or extract specified regions from sequences by name. The sequence comparison programs can now perform a 5000 X 5000 bp analysis in 12 minutes on an IBM PC. A program to locate potential protein coding regions in nucleic acids, a digitizer interface, and other additions are also described. PMID:3753784

  20. A Vibroacoustic Database Management Center for Shuttle and expendable launch vehicle payloads

    NASA Technical Reports Server (NTRS)

    Thomas, Valerie C.

    1987-01-01

    A Vibroacoustic Database Management Center has recently been established at the Jet Propulsion Laboratory (JPL). The center uses the Vibroacoustic Payload Environment Prediction System (VAPEPS) computer program to maintain a database of flight and ground-test data and structural parameters for both Shuttle and expendable launch-vehicle payloads. Given the launch-vehicle environment, the VAPEPS prediction software, which employs Statistical Energy Analysis (SEA) methods, can be used with or without the database to establish the vibroacoustic environment for new payload components. This paper summarizes the VAPEPS program and describes the functions of the Database Management Center at JPL.

  2. Information flow in the DAMA project beyond database managers: information flow managers

    NASA Astrophysics Data System (ADS)

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
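
    The bill-of-materials explosion the abstract mentions, propagating retail demand down a supply chain, can be sketched as a simple recursion. The product structure and quantities below are invented; in the DAMA scenario the explosion would run over the real textile chain at four levels of detail.

```python
from collections import defaultdict

# Hypothetical bill of materials: each item maps to (component, qty-per-unit).
bom = {
    "shirt":          [("fabric_panel", 4), ("button", 7)],
    "fabric_panel":   [("woven_cloth_m2", 0.5)],
    "woven_cloth_m2": [("yarn_kg", 0.2)],
}

def explode(item, qty, demand=None):
    """Recursively propagate finished-goods demand to every component level."""
    if demand is None:
        demand = defaultdict(float)
    for comp, per_unit in bom.get(item, []):
        demand[comp] += qty * per_unit
        explode(comp, qty * per_unit, demand)
    return demand

d = explode("shirt", 100)  # latest demand estimate for 100 shirts
```

An information flow manager would rerun such an explosion as new point-of-sale estimates arrive, and substitute approximations for any supplier that fails to report.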

  3. Microcomputer log analysis system

    SciTech Connect

    Ostrander, C.

    1984-04-01

    A comprehensive, friendly log analysis system for use on a microcomputer requires only average log analysis skills; most systems require both a log analyst and a computer professional for operation. This one has many capabilities: (1) data entry is handled by office personnel after minimal training; (2) entered data is filed and cataloged for future retrieval and analysis; (3) the system can handle more than 9,000,000 ft (2700 km) of log data in over 60,000 files; (4) all data can be edited; (5) searches and listings can be made using factors such as formation names; (6) facsimile reproductions can be made of any log on file; (7) a screening program turns the system into a sophisticated hand calculator to quickly determine zones of interest; and (8) up to 1100 ft (335 m) of contiguous data from a well can be analyzed in one run. Innovative features include: (1) a discriminating factor to separate reservoirs for individual attention concerning rock type, fluid content and potential reserves; and (2) a written report of each reservoir using artificial intelligence. The report discusses, among other things, the rock type and its consistency, comparing the system's finding with the geologist's opinion. Differences between the two will elicit alternative analyses.

  4. Microcomputers in the Research Office.

    ERIC Educational Resources Information Center

    Daly, Brian E.

    There are three valuable types of computer software needed by the professional researcher--word processing, spreadsheet, and database management. In general, these packages are required for the effective operation of offices; however, researchers may have special needs which can only be met with more specialized software. Recently, word processing…

  5. Database of Pesticides and Off-flavors for Health Crisis Management.

    PubMed

    Ueda, Yasuhito; Itoh, Mitsuo

    2016-01-01

    In this experiment, 351 pesticides and 441 different organic compounds were analyzed by GC/MS, and a database of retention time, retention index, monoisotopic mass, two selected ions, molecular formula, and CAS numbers was created. The database includes compounds such as alcohols, aldehydes, carboxylic acids, esters, ethers and hydrocarbons with unpleasant odors. This database is expected to be useful for health crisis management in the future. PMID:27211918
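
    A database of the kind described, keyed on retention index, monoisotopic mass, selected ions, formula, and CAS number, supports lookups like the following sketch. The records, values, and tolerance here are invented for illustration, not taken from the paper's data.

```python
# Hypothetical records mimicking the database fields named in the abstract.
records = [
    {"name": "pesticide_A", "ri": 1680, "mono_mass": 348.926,
     "ions": (197, 199), "cas": "0000-00-0"},
    {"name": "off_flavor_B", "ri": 1084, "mono_mass": 100.089,
     "ions": (44, 56), "cas": "0000-00-1"},
]

def match(ri_obs, ion_obs, ri_tol=10):
    """Return candidates whose retention index lies within ri_tol of the
    observed value and whose selected ions include the observed ion."""
    return [r["name"] for r in records
            if abs(r["ri"] - ri_obs) <= ri_tol and ion_obs in r["ions"]]
```

In a health-crisis setting, such a two-stage filter (retention index window, then ion confirmation) narrows an unknown GC/MS peak to a short candidate list before any detailed spectral comparison.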

  6. Computer Databases: A Survey. Part 3: Product Databases.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1987-01-01

    Describes five online databases that focus on computer products, primarily software and microcomputing hardware, and compares the databases in terms of record content, product coverage, vertical market coverage, currency, availability, and price. Sample records and searches are provided, as well as a directory of product databases. (CLB)

  7. Use of a microcomputer network for history taking in a prenatal clinic.

    PubMed

    Lilford, R J; Chard, T; Bingham, P; Carrigan, E

    1985-04-01

    A stand-alone microcomputer was installed at St. Bartholomew's Hospital to obtain the booking (first prenatal) history. This system has many well-documented advantages over the manual method, particularly with respect to the completeness and quality of the history produced. However, a single microcomputer system is unable to deal with the load of a busy clinic, and initially, several independent terminals were required. We now describe the installation of a local area network to link several microcomputers with a single database in the Antenatal Clinic at Queen Charlotte's Maternity Hospital.

  8. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  9. GAS CHROMATOGRAPHIC RETENTION PARAMETERS DATABASE FOR REFRIGERANT MIXTURE COMPOSITION MANAGEMENT

    EPA Science Inventory

    Composition management of mixed refrigerant systems is a challenging problem in the laboratory, manufacturing facilities, and large refrigeration machinery. Ths issue of composition management is especially critical for the maintenance of machinery that utilizes zeotropic mixture...

  10. Integrated Standardized Database/Model Management System: Study management concepts and requirements

    SciTech Connect

    Baker, R.; Swerdlow, S.; Schultz, R.; Tolchin, R.

    1994-02-01

    Data-sharing among planners and planning software for utility companies is the motivation for creating the Integrated Standardized Database (ISD) and Model Management System (MMS). The purpose of this document is to define the requirements for the ISD/MMS study management component in a manner that will enhance the use of the ISD. After an analysis period which involved EPRI member utilities across the United States, the study concept was formulated. It is defined in terms of its entities, relationships and its support processes, specifically for implementation as the key component of the MMS. From the study concept definition, requirements are derived. There are unique requirements, such as the necessity to interface with DSManager, EGEAS, IRPManager, MIDAS and UPM and there are standard information systems requirements, such as create, modify, delete and browse data. An initial ordering of the requirements is established, with a section devoted to future enhancements.

  11. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    PubMed

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  12. Military Services Fitness Database: Development of a Computerized Physical Fitness and Weight Management Database for the U.S. Army

    PubMed Central

    Williamson, Donald A.; Bathalon, Gaston P.; Sigrist, Lori D.; Allen, H. Raymond; Friedl, Karl E.; Young, Andrew J.; Martin, Corby K.; Stewart, Tiffany M.; Burrell, Lolita; Han, Hongmei; Hubbard, Van S.; Ryan, Donna

    2009-01-01

    The Department of Defense (DoD) has mandated development of a system to collect and manage data on the weight, percent body fat (%BF), and fitness of all military personnel. This project aimed to (1) develop a computerized weight and fitness database to track individuals and Army units over time allowing cross-sectional and longitudinal evaluations and (2) test the computerized system for feasibility and integrity of data collection over several years of usage. The computer application, the Military Services Fitness Database (MSFD), was designed for (1) storage and tracking of data related to height, weight, %BF for the Army Weight Control Program (AWCP) and Army Physical Fitness Test (APFT) scores and (2) generation of reports using these data. A 2.5-year pilot test of the MSFD indicated that it monitors population and individual trends of changing body weight, %BF, and fitness in a military population. PMID:19216292

  13. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model, which is based on personalized medicine, rapid progress in genome sequencing technology, and the cross application of biological information and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis, and other core issues. A cancer clinical database is important to promote the development of precision medicine; therefore, it is necessary to pay close attention to the construction and management of the database. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. In order to ensure the good quality of the database, its design and management should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data. The construction and management of clinical databases must also be strengthened and innovated.

  14. Microcomputer Network for Computerized Adaptive Testing (CAT): Program Listing.

    ERIC Educational Resources Information Center

    Quan, Baldwin; And Others

    This program listing is a supplement to the Microcomputer Network for Computerized Adaptive Testing (CAT). The driver textfile program allows access to major subprograms of the CAT project. The test administration textfile program gives examinees a prescribed set of subtests. The parameter management textfile program establishes a file containing…

  15. Satellite Doppler data processing using a microcomputer

    NASA Technical Reports Server (NTRS)

    Schmid, P. E.; Lynn, J. J.

    1977-01-01

    A microcomputer which was developed to compute ground radio beacon position locations using satellite measurements of Doppler frequency shift is described. Both the computational algorithms and the microcomputer hardware incorporating these algorithms are discussed. Results are presented in which the microcomputer, in conjunction with the NIMBUS-6 random access measurement system, provides real-time calculation of beacon latitude and longitude.

  16. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    SciTech Connect

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; generate or read into a table CSV data; and similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for smaller databases that might typically be needed in a small research group situation. PylotDB can also be used as a learning tool for database applications in general.
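
    One of the operations listed above, reading CSV data into a table, can be sketched in a few lines of Python. This is an illustration of the general technique using the standard library and SQLite, not PylotDB's own code (which targets MySQL); the table name and data are invented.

```python
import csv
import io
import sqlite3

# Hypothetical CSV payload; in practice this would come from a file.
csv_text = "name,value\nalpha,1\nbeta,2\n"
rows = list(csv.reader(io.StringIO(csv_text)))
header, data = rows[0], rows[1:]

conn = sqlite3.connect(":memory:")
# Build the table from the CSV header (column types assumed for the sketch).
conn.execute(f"CREATE TABLE results ({header[0]} TEXT, {header[1]} REAL)")
conn.executemany("INSERT INTO results VALUES (?, ?)", data)

# Once loaded, the data is open to ordinary SQL analysis and plotting.
total = conn.execute("SELECT SUM(value) FROM results").fetchone()[0]
```

Bringing the data "under control of Python" in this way is exactly what makes the statistics, plotting, and cross-database comparisons in the abstract straightforward to layer on top.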

  18. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards-setting organizations applicable to the design, implementation and operation of a database management system for space-related applications, are identified. The applicability of the standards to a general-purpose, multimission database management system is addressed.

  19. A microcomputer-based emergency response system*.

    PubMed

    Belardo, S; Howell, A; Ryan, R; Wallace, W A

    1983-09-01

    A microcomputer-based system was developed to provide local officials responsible for disaster management with assistance during the crucial period immediately following a disaster, a period when incorrect decisions could have an adverse impact on the surrounding community. While the paper focuses on a potential disaster resulting from an accident at a commercial nuclear power generating facility, the system can be applied to other disaster situations. Decisions involving evacuation, shelter and the deployment of resources must be made in response to floods, earthquakes, accidents in the transportation of hazardous materials, and hurricanes, to name a few examples. As a decision aid, the system was designed to enhance data display by presenting data in the form of representations (i.e. road maps, evacuation routes, etc.) as well as in list or tabular form. The potential impact of the event (i.e. the release of radioactive material) was displayed in the form of a cloud representing the dispersion of the radioactive material. In addition, an algorithm was developed to assist the manager in assigning response resources to demands. The capability for modelling the impact of a disaster is discussed briefly, with reference to a system installed in the communities surrounding the Indian Point nuclear power plant in New York State. Results demonstrate both the technical feasibility of incorporating microcomputers in decision support systems for radiological emergency response, and the acceptance of such systems by those public officials responsible for implementing the response plans.
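
    The abstract mentions an algorithm for assigning response resources to demands but does not give it. As a hedged stand-in, the sketch below shows the simplest form such a decision aid could take, a greedy rule that sends each resource to the nearest unmet demand; all names and coordinates are invented.

```python
# Illustrative greedy assignment of response resources to demands.
# Not the paper's algorithm; a stand-in to show the problem shape.
def assign_resources(resources, demands):
    """resources, demands: dicts of name -> (x, y) map coordinates.
    Returns a list of (resource, demand) pairs."""
    unmet = dict(demands)
    pairs = []
    for r_name, (rx, ry) in resources.items():
        if not unmet:
            break
        # Pick the unmet demand with the smallest squared distance.
        d_name = min(unmet,
                     key=lambda d: (unmet[d][0] - rx) ** 2
                                   + (unmet[d][1] - ry) ** 2)
        pairs.append((r_name, d_name))
        del unmet[d_name]
    return pairs

pairs = assign_resources(
    {"ambulance": (0, 0), "fire_truck": (10, 10)},
    {"shelter_A": (1, 1), "shelter_B": (9, 9)},
)
```

    A production system would use an optimal assignment method rather than a greedy pass, but the greedy rule is easy to run on 1980s-class hardware.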

  20. An improved FORTRAN 77 recombinant DNA database management system with graphic extensions in GKS.

    PubMed

    Van Rompuy, L L; Lesage, C; Vanderhaegen, M E; Telemans, M P; Zabeau, M F

    1986-12-01

    We have improved an existing clone database management system written in FORTRAN 77 and adapted it to our software environment. Among the improvements, the database can now be interrogated for any type of information, not just keywords. Recombinant DNA constructions can also be represented in a simplified 'shorthand', after which a program assembles the full nucleotide sequence from the contributing fragments, which may be obtained from nucleotide sequence databases. Another improvement is the replacement of the database manager by programs running in batch to maintain the databank and verify its consistency automatically. Finally, graphic extensions written in the Graphical Kernel System draw linear and circular restriction maps of recombinants. Besides restriction sites, recombinant features can be presented from the feature lines of recombinant database entries or from the feature tables of nucleotide databases. The clone database management system is fully integrated into the sequence analysis software package from the Pasteur Institute, Paris, and is made accessible through the same menu. As a result, recombinant DNA sequences can be analysed directly by the sequence analysis programs.
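
    The 'shorthand' idea, writing a construct as an ordered list of fragment names and letting a program assemble the full sequence, can be sketched in a few lines. The fragment names and sequences below are invented for illustration; the real system pulls fragments from nucleotide sequence databases.

```python
# Toy fragment store standing in for a nucleotide sequence database.
FRAGMENTS = {
    "pBR322_ori": "ATGCGT",
    "lacZ": "GGATCC",
    "insert_x": "TTAACC",
}

def assemble(construct):
    """Assemble the full sequence of a construct written in shorthand:
    an ordered list of fragment names."""
    return "".join(FRAGMENTS[name] for name in construct)

seq = assemble(["pBR322_ori", "insert_x", "lacZ"])
```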

  1. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    SciTech Connect

    Wolery, T W; Sutton, M

    2011-09-19

    Geochemical modeling codes use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions, incorporating a model for such solutions based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
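
    The saturation index mentioned above has a standard definition, SI = log10(IAP / Ksp), where IAP is the ion activity product and Ksp the solubility product; SI = 0 at equilibrium and SI > 0 for supersaturation. A short sketch, with illustrative (not measured) activity values:

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp): 0 at equilibrium, > 0 if supersaturated."""
    return math.log10(iap / ksp)

# Calcite-like example with invented activities: IAP = 10**-8.0
# against a solubility product of 10**-8.48 gives SI = +0.48,
# i.e. modestly supersaturated.
si = saturation_index(iap=10**-8.0, ksp=10**-8.48)
```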

  2. Special-Interest Microcomputing Publications.

    ERIC Educational Resources Information Center

    Colsher, William L.

    1980-01-01

    This article describes computer journals, newsletters, and cassette magazines that are devoted to a particular brand of personal computer, such as the TRS-80, or to a particular microprocessor, such as the 6502, used in the Apple II, Commodore PET, and other microcomputers. Publishers' addresses and rates are listed. (Author/SJL)

  3. Microcomputer Business Applications. Teacher Edition.

    ERIC Educational Resources Information Center

    James, Marcia; And Others

    This curriculum guide is designed to teach concepts associated with business applications of microcomputers. It can be used in marketing, office education, and computer literacy courses. Most activities can be done in less than 1 hour. The course is organized in eight units that cover the following: (1) systems and software; (2) electronic filing;…

  4. Microcomputer Infusion Project: A Model.

    ERIC Educational Resources Information Center

    Rossberg, Stephen A.; Bitter, Gary G.

    1988-01-01

    Describes the Microcomputer Infusion Project (MIP), which was developed at Arizona State University to provide faculty with the necessary hardware, software, and training to become models of computer use in both lesson development and presentation for preservice teacher education students. Topics discussed include word processing; database…

  5. Microcomputer Unit: Graphing Straight Lines.

    ERIC Educational Resources Information Center

    Hastings, Ellen H.; Yates, Daniel S.

    1983-01-01

    The material is designed to help pupils investigate how the value for slope in the equation of a line affects the inclination for the graph of an equation. A program written in BASIC designed to run on an Apple microcomputer is included. Worksheet masters for duplication are provided. (MP)

  6. Students Discuss Microcomputers and History.

    ERIC Educational Resources Information Center

    Slatta, Richard

    1987-01-01

    Reviews the author's experience in teaching a senior-level undergraduate course entitled "Using a Microcomputer to Enhance Historical Research and Writing." The class used Ashton-Tate's FRAMEWORK, an integrated program that combines filing, outlining, word processing, and other functions. Includes the syllabus and student reactions to the course.…

  7. Microcomputer Applications in Interaction Analysis.

    ERIC Educational Resources Information Center

    Wadham, Rex A.

    The Timed Interval Categorical Observation Recorder (TICOR), a portable, battery powered microcomputer designed to automate the collection of sequential and simultaneous behavioral observations and their associated durations, was developed to overcome problems in gathering subtle interaction analysis data characterized by sequential flow of…

  8. Microcomputer Hardware. Energy Technology Series.

    ERIC Educational Resources Information Center

    Technical Education Research Centre-Southwest, Waco, TX.

    This course in microcomputer hardware is one of 16 courses in the Energy Technology Series developed for an Energy Conservation-and-Use Technology curriculum. Intended for use in two-year postsecondary technical institutions to prepare technicians for employment, the courses are also useful in industry for updating employees in company-sponsored…

  9. Microcomputer Modules for Undergraduate Geography.

    ERIC Educational Resources Information Center

    Groop, Richard; And Others

    1985-01-01

    Described and evaluated are microcomputer units of instruction that were developed for use in undergraduate geography courses. Students responded favorably to the modules: "Socioeconomic Patterns," "Economic Rent," "Sampling Distribution of Sample Means," "Land Use Competition," "Data Classing," "Weather and Climate," and "Landforms." (RM)

  10. Networking and Microcomputers. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane

    Computer networks can fall into three broad categories--local area networks (LAN), microcomputer based messaging systems (this includes computer bulletin board systems), or commercial information systems. Many of the same types of activities take place within the three categories. The major differences are the types of information available and…

  11. History Microcomputer Games: Update 2.

    ERIC Educational Resources Information Center

    Sargent, James E.

    1985-01-01

    Provides full narrative reviews of B-1 Nuclear Bomber (Avalon, 1982); American History Adventure (Social Science Microcomputer Review Software, 1985); Government Simulations (Prentice-Hall, 1985); and The Great War, FDR and the New Deal, and Hitler's War, all from New Worlds Software, 1985. Lists additional information on five other history and…

  12. Microcomputers in the Public Schools.

    ERIC Educational Resources Information Center

    Hayes, Jeanne

    1984-01-01

    A table shows the number of public schools using microcomputers, by brand, in 1982-83 and 1983-84 respectively. It reveals that Apple has extended its market dominance from 51.1 to 66.2 percent during this time. (TE)

  13. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Carrying out the office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review within it. On the basis of an analysis of student evaluation, a student information management database application system is designed in this paper using relational database management system software. To implement the student information management function, the functional requirements, overall structure, data sheets and fields, data sheet associations and software code are designed in detail.
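
    The relational design described, tables, fields, and an association between students and their evaluation records, can be sketched with Python's built-in sqlite3. The table and column names are assumptions for illustration, not the paper's schema.

```python
import sqlite3

# Minimal sketch of a student-information schema: one student table,
# one evaluation table associated by student_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    dept TEXT
);
CREATE TABLE evaluation (
    student_id INTEGER REFERENCES student(id),
    item       TEXT,   -- e.g. 'coursework', 'conduct'
    score      REAL
);
""")
conn.execute("INSERT INTO student VALUES (1, 'Li Wei', 'Physics')")
conn.executemany("INSERT INTO evaluation VALUES (?, ?, ?)",
                 [(1, "coursework", 88.0), (1, "conduct", 92.0)])

# Performance review: average score per student.
avg = conn.execute(
    "SELECT AVG(score) FROM evaluation WHERE student_id = 1").fetchone()[0]
```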

  14. MADMAX - Management and analysis database for multiple ~omics experiments.

    PubMed

    Lin, Ke; Kools, Harrie; de Groot, Philip J; Gavai, Anand K; Basnet, Ram K; Cheng, Feng; Wu, Jian; Wang, Xiaowu; Lommen, Arjen; Hooiveld, Guido J E J; Bonnema, Guusje; Visser, Richard G F; Muller, Michael R; Leunissen, Jack A M

    2011-01-01

    The rapid increase of ~omics datasets generated by microarray, mass spectrometry and next generation sequencing technologies requires an integrated platform that can combine results from different ~omics datasets to provide novel insights into the understanding of biological systems. MADMAX is designed to provide a solution for storage and analysis of complex ~omics datasets. In addition, analysis results (such as lists of genes) can be merged to reveal candidate genes supported by all datasets. The system constitutes an ISA-Tab-compliant LIMS part which is independent of different analysis pipelines. A pilot study of different types of ~omics data in Brassica rapa demonstrates the possible use of MADMAX. The web-based user interface provides easy access to data and analysis tools on top of the database.

  16. Managing Geological Profiles in Databases for 3D Visualisation

    NASA Astrophysics Data System (ADS)

    Jarna, A.; Grøtan, B. O.; Henderson, I. H. C.; Iversen, S.; Khloussy, E.; Nordahl, B.; Rindstad, B. I.

    2016-10-01

    Geology and all geological structures are three-dimensional in space. GIS and databases are common tools used by geologists to interpret and communicate geological data. The NGU (Geological Survey of Norway) is the national institution for the study of bedrock, mineral resources, surficial deposits, groundwater and marine geology. 3D geology is usually described by geological profiles, or vertical sections through a map, which show the rock structure below the surface. The goal is to gradually expand the usability of existing and new geological profiles, making them more available in retrieval applications, and to simplify the entry and registration of profiles. The project target is to develop the methodology for acquisition, modification and use of data and its further presentation on the web by creating a user interface directly linked to NGU's webpage. This will allow users to visualise profiles in a 3D model.

  17. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative database, are discussed.

  18. A Microcomputer Implementation Of An Intelligent Data Acquisition And Load Forecasting System

    NASA Astrophysics Data System (ADS)

    Rahman, Saifur

    1987-01-01

    This paper reports on the hardware and the programming aspects of an intelligent data acquisition and load forecasting system that has been implemented on a desktop microcomputer. The objective was to develop a low-cost and reliable system that would collect forecasted weather data and real-time electric utility load data, archive them, and issue an electric utility load forecast in 1-hour, 6-hour and up to 24-hour increments within a midnight-to-midnight time frame. Data are collected, over commercial telephone lines, from remote locations (often hundreds of miles apart), filtered and then processed. The archived data are used to form monthly summaries of hourly electric utility load (MW) and weather conditions in the area. A set of pre-selected rules is then applied to this database to develop the desired load forecast. All this work is done in a totally automated fashion, i.e., without any human intervention. The data acquisition and load forecasting system is based on an AT&T 3B2/300 UNIX-based desktop microcomputer. The 3B2 serves as the "heart" of the system and performs the functions of data collection, processing, archiving, load forecasting and display. It is a multi-tasking, multi-user machine and in its present configuration can support four users and a "super user", or system manager.
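
    The paper's pre-selected rules are not given in the abstract, but the shape of a rule applied to archived hourly data can be sketched: forecast a given hour's load as the average of past loads at that hour, adjusted by a temperature-sensitivity rule. The rule form and coefficients below are invented for illustration.

```python
# Hypothetical forecasting rule: baseline from the archive, plus a
# linear temperature correction. Not the paper's actual rule set.
def forecast_hour(history_mw, forecast_temp_c,
                  base_temp_c=18.0, mw_per_deg=12.0):
    """history_mw: past loads (MW) archived for this hour of day.
    Each degree above base_temp_c adds mw_per_deg MW (cooling load)."""
    base = sum(history_mw) / len(history_mw)
    return base + mw_per_deg * (forecast_temp_c - base_temp_c)

# Three archived observations for 15:00, with a warm day forecast:
mw = forecast_hour([900.0, 940.0, 920.0], forecast_temp_c=28.0)
```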

  19. Microcomputer authoring systems: valuable tools for health educators.

    PubMed

    Whiteside, M F; Whiteside, J A

    Writing courseware with the aid of an authoring system is a bold step that can bring together the health education content expert and the power of the microcomputer. The microcomputer can be programmed to present essential knowledge to students in a low-pressure setting, specifically geared to their levels of comprehension and rates of progression. Microcomputer-based simulations and patient management problems seem suited to the task of helping students develop adequate problem-solving skills in health education (Lewis, 1983; Peterson, 1984). Furthermore, many lecture hours can be replaced by an infinitely patient tutor with which students can interact at their convenience. Creating self-study materials delivered via microcomputer is also a step toward providing the most effective type of learning experiences for individual students. Despite the fact that putting authoring systems in the hands of well-informed content specialists may meet a number of pressing needs in health education, there is one drawback. Authoring systems have a built-in pedagogical structure that, to some extent, dictates the design of the lesson. However, spending time in the evaluation process prior to purchase will enable educators to identify a system that can be used to develop courseware that very closely matches the desires of the author. Integrating microcomputer courseware into health education courses is certainly an attractive solution to some of the educational problems faced in health education today. An authoring system can be used to develop courseware that can be substituted for lectures on basic concepts. In addition, students will have more opportunities to develop the ability to apply, in problem-solving situations, the factual knowledge they are learning before they are responsible for making judgments in real-life situations. The time is ripe, then, for health educators to investigate how authoring systems can help them utilize the technology of the microcomputer to improve

  20. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938
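
    The multi-database parallel query builder/result combiner mentioned above can be illustrated in miniature: run one query against several databases concurrently and merge the rows. The sketch below uses Python's sqlite3 and a thread pool in place of HID's actual back ends and API; the table and data are invented.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def make_site(rows):
    """Stand-in for one federated site's database."""
    conn = sqlite3.connect(":memory:", check_same_thread=False)
    conn.execute("CREATE TABLE subject (id TEXT, age INTEGER)")
    conn.executemany("INSERT INTO subject VALUES (?, ?)", rows)
    return conn

def federated_query(conns, sql):
    """Run the same query against every site in parallel and
    combine the result rows."""
    with ThreadPoolExecutor() as pool:
        parts = list(pool.map(lambda c: c.execute(sql).fetchall(), conns))
    return [row for part in parts for row in part]

sites = [make_site([("s1", 34)]), make_site([("s2", 41)])]
rows = federated_query(sites, "SELECT id, age FROM subject")
```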

  1. Flight Deck Interval Management Display. [Elements, Information and Annunciations Database User Guide

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  2. An Extensible "SCHEMA-LESS" Database Framework for Managing High-Throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated hybrid cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to handle the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
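
    The schema-less idea can be illustrated in miniature: instead of one table per document type, every node of an XML or HTML document becomes a row (document, path, content), so new document shapes need no schema change, and a keyword search can span both context (the path) and content (the text). The sketch below uses sqlite3 and invented documents; it illustrates the approach, not NETMARK's implementation.

```python
import sqlite3

# Store hierarchical documents as generic node rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (doc TEXT, path TEXT, content TEXT)")
conn.executemany("INSERT INTO node VALUES (?, ?, ?)", [
    ("report.xml", "/report/title",    "Wind tunnel results"),
    ("report.xml", "/report/abstract", "Tests of a scaled wing"),
    ("memo.html",  "/html/body/p",     "Schedule for wind tunnel"),
])

# Keyword search over context (path) and content alike.
hits = conn.execute(
    "SELECT doc, path FROM node WHERE content LIKE ? OR path LIKE ?",
    ("%wind%", "%wind%")).fetchall()
```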

  3. An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    An object-relational database management system is an integrated hybrid cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to handle the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models such as XML and HTML.

  4. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated hybrid cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to handle the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  5. The Golosiiv on-line plate archive database, management and maintenance

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Sergeeva, T.

    2007-08-01

    We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with principles developed by the International Virtual Observatory Alliance, in order to make them available to the world astronomical community. The online version of the log-book database is constructed by means of MySQL+PHP. The data management system provides a user interface, offers detailed traditional form-filling radial search of plates, produces auxiliary samplings and listings of each collection, and permits browsing the detailed descriptions of collections. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed according to the demands and principles of international data archives, and has to be strongly generalized in order to enable data mining by means of standard interfaces and to best fit the demands of the WFPDB Group for databases of plate catalogues. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. The process of data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and hence the variety of approaches to identifying and fixing them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: the enhancement of the log-book database with new sets of observational data, as well as generalized database creation and cross-identification between them. The VO-compatible version of the database is being supplied with digitized data of plates obtained with a MicroTek ScanMaker 9800 XL TMA. The scanning procedure is not total but is conducted selectively in the frames of special
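
    The radial search mentioned above is a cone search: keep plates whose centres lie within a given angular radius of a target position. Below is the standard angular-separation formula in Python; the plate identifiers and coordinates are invented for illustration.

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions,
    via the spherical law of cosines (inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_d = (math.sin(dec1) * math.sin(dec2)
             + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # Clamp against floating-point overshoot before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_d))))

plates = {"A123": (10.0, 20.0), "B456": (50.0, -5.0)}  # RA, Dec (deg)
target = (10.5, 20.2)
found = [name for name, (ra, dec) in plates.items()
         if ang_sep_deg(ra, dec, *target) < 1.0]  # 1-degree cone
```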

  6. Data management and database structure at the ARS Culture Collection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  7. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  8. Rail transit energy management program: Energy database. Volume 2. Final report

    SciTech Connect

    Uher, R.A.

    1995-03-01

    The Rail Transportation Energy Management Program (EMP) is a private/public partnership whose objective is to reduce rail transit energy cost and improve energy efficiency. The Energy Database (EDB) was set up under the program. The purpose of the EDB is to provide information to the members of the program, including rail transit energy and energy cost data and the results of implementing energy cost reduction strategies. The EDB also provides a means for timely exchange of information among transit authorities and others associated with energy management. The database is presently set up on a personal computer and is accessed by users via an 800 telephone line.

  9. MST-80B microcomputer trainer

    SciTech Connect

    Jones, G.D.; Fisher, E.R.; Spann, J.M.

    1980-04-01

    The microcomputer revolution in electronics is spreading so rapidly that it is difficult to educate enough people quickly and thoroughly in the new technology. Lawrence Livermore Laboratory's MST-80B was developed as a way to speed learning in in-house training courses, and it is now being widely used outside LLL. The MST-80B trainer is a complete, self-contained, microcomputer system housed in a briefcase. The trainer uses the Intel 8080A 8-Bit Microprocessor (CPU), and it has its own solid-state memory, a built-in keyboard, input and output ports, and a display for visual output. The trainer is furnished with a permanent Monitor Program (in Read-Only Memory) that allows users to enter, debug, modify, and run programs of their own easily. 8 figures, 3 tables.
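
    The trainer's Monitor Program lets users enter, debug, and run 8080 machine code from the keyboard. As a toy illustration (not a real 8080 emulator), the fetch-decode-execute loop below handles just three opcodes; 0x3E and 0xC6 do match the real 8080 MVI A,d8 and ADI d8 instructions, and 0x76 is HLT, but everything else is omitted.

```python
def run(program):
    """Toy fetch-decode-execute loop over a tiny 8080-style subset."""
    a, pc = 0, 0                     # accumulator, program counter
    while True:
        op = program[pc]
        if op == 0x3E:               # MVI A, d8: load immediate into A
            a = program[pc + 1]; pc += 2
        elif op == 0xC6:             # ADI d8: add immediate to A (8-bit)
            a = (a + program[pc + 1]) & 0xFF; pc += 2
        elif op == 0x76:             # HLT: stop, return accumulator
            return a
        else:
            raise ValueError(f"unhandled opcode {op:#04x}")

result = run([0x3E, 0x20, 0xC6, 0x05, 0x76])  # A = 0x20 + 0x05
```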

  10. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  11. NDT-COMP9 microcomputer

    SciTech Connect

    Dodd, C.V.; Cowan, R.F.

    1980-09-01

    An 8080-based microcomputer system, the NDT-COMP9, has been designed for instrumentation control and data analysis in eddy-current tests. The NDT-COMP9 is a significantly more powerful computer system than the NDT-COMP8 microcomputer from which it was developed. The NDT-COMP9 system is contained on a 240- by 120-mm (9.5- by 4.8-in.) circuit board and will fit in a four-wide Nuclear Instrumentation Module (NIM) BIN with 26-pin edge connectors. In addition to the 8080-compatible central processing unit (CPU), an arithmetic processing unit (APU) is available to provide up to 32-bit fixed- or floating-point, basic or transcendental math functions. A total of 16K of read-only memory (ROM) and random-access memory (RAM), one serial input-output (I/O) port (RS-232-C at a maximum of 9600 baud), and 72 parallel I/O ports are available. The baud rate is under software control. A system monitor and math package are available for use with the microcomputer.

  12. Microcomputers aid pipeline hydraulic analysis

    SciTech Connect

    Hein, M.A.; Brosius, M.

    1984-02-13

    Microcomputer technology has come a long way in the last few years, and inexpensive desktop computers can now be used to analyze fluid and heat flow in even the largest pipeline and networked piping systems. Except for network problems requiring dynamic compositional modeling and extremely large amounts of data storage, all processing, including input, calculation, and output, can be handled by the microcomputer. Even for these large problems, a small personal computer can be used to efficiently build the input files, process the output, and generally enhance the whole computational procedure. Only a few years ago the engineer had to code up his data, give it to the keypunching department, wait several hours or days to get his cards back, attach the appropriate job control language (JCL), submit the deck to the computer department, wait several more hours or days to receive the final results, and finally pore over endless tables of numbers to interpret them. Further, if there was an error in the input, or if several case studies were required, he had to repeat the whole process. With the advent of microcomputers equipped with graphics packages, light pens, graphics pads, tens of megawords of fast-access disk storage, and versatile, user-friendly software, the data preparation, interpretation, and computation times for hydraulic piping simulation are cut by an order of magnitude.

  13. Management of three-dimensional and anthropometric databases: Alexandria and Cleopatra

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; Robinette, Kathleen; Rioux, Marc

    2000-10-01

    This paper describes two systems for managing 3D and anthropometric databases, namely Alexandria and Cleopatra. Each system consists of three parts: the crawler, the analyzer, and the search engine. The crawler retrieves content from the network, while the analyzer automatically describes the shape, scale, and color of each retrieved object and writes a compact descriptor. The search engine applies the query-by-example paradigm to find and retrieve similar or related objects from the database based on different aspects of 3D shape, scale, and color distribution. The descriptors are defined and the implementation of the system is detailed. The application of the system to the CAESAR anthropometric survey is discussed. Experimental results from the CAESAR database and from generic databases are presented.
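    The query-by-example idea above can be reduced to comparing fixed-length descriptor vectors. The following sketch ranks database objects by Euclidean distance to an example's descriptor; the toy 4-bin "color histogram" descriptors and object names are invented, whereas the real Alexandria/Cleopatra descriptors encode 3D shape, scale, and color.

```python
# Minimal query-by-example retrieval over descriptor vectors.
import math

def distance(d1, d2):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

def query_by_example(example, database, k=2):
    """Return the k database entries whose descriptors are closest to `example`."""
    ranked = sorted(database.items(), key=lambda kv: distance(example, kv[1]))
    return [name for name, _ in ranked[:k]]

# Hypothetical objects with toy 4-bin color-histogram descriptors.
database = {
    "red_vase":   (0.9, 0.05, 0.03, 0.02),
    "blue_bowl":  (0.1, 0.10, 0.20, 0.60),
    "red_bottle": (0.8, 0.10, 0.05, 0.05),
}
matches = query_by_example((0.85, 0.07, 0.04, 0.04), database)
```

With these toy descriptors, the two predominantly red objects rank closest to the red example.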

  14. ``STANDARD LIBRARY'': A relational database for the management of electron microprobe standards

    NASA Astrophysics Data System (ADS)

    Diamond, Larryn W.; Schmatz, Dirk; Würsten, Felix

    1994-05-01

    Laboratory collections of well-characterized solid materials are an indispensable basis for the calibration of quantitative electron microprobe analyses. The STANDARD LIBRARY database has been designed to manage the wide variety of information needed to characterize such standards, and to provide a rapid way by which these data can be accessed. In addition to physical storage information, STANDARD LIBRARY includes a full set of chemical and mineralogic characterization variables, and a set of variables specific to microprobe calibration (instrumental setup, standard homogeneity, etc.). Application programs for STANDARD LIBRARY provide a series of interactive screen views for database search, retrieval, and editing operations (including inventories). Search and inventory results can be written as UNIX data files, some of which are formatted to be read directly by the software that controls CAMECA SX50™ electron microprobes. The application programs are coded in OSL for the INGRES™ database-management system, and run within any environment that supports INGRES™ (e.g. UNIX, VMS, DOS, etc.). STANDARD LIBRARY has been generalized, however, such that only the physical storage structure of the database is dependent on the selected database-management system.
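    A standards collection of this kind maps naturally onto a single relational table plus search queries. The sketch below uses Python's sqlite3 in place of the INGRES system the paper describes; the column names and sample minerals are illustrative assumptions, not the actual STANDARD LIBRARY schema.

```python
# Toy relational table of microprobe standards with a search query.
# Schema and data are invented; STANDARD LIBRARY's real schema differs.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE standard (
        name        TEXT,     -- material name
        formula     TEXT,     -- nominal chemical formula
        element     TEXT,     -- element it calibrates
        drawer      TEXT,     -- physical storage location
        homogeneous INTEGER   -- 1 if checked homogeneous at microprobe scale
    )""")
con.executemany(
    "INSERT INTO standard VALUES (?, ?, ?, ?, ?)",
    [("wollastonite", "CaSiO3", "Ca", "A3", 1),
     ("wollastonite", "CaSiO3", "Si", "A3", 1),
     ("rutile",       "TiO2",   "Ti", "B1", 1),
     ("sphalerite",   "ZnS",    "Zn", "B2", 0)],
)
# Retrieve homogeneous standards suitable for calibrating Ca.
rows = con.execute(
    "SELECT name, drawer FROM standard WHERE element = ? AND homogeneous = 1",
    ("Ca",),
).fetchall()
```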

  15. Microcomputer Checks Butt-Weld Accuracy

    NASA Technical Reports Server (NTRS)

    Clisham, W.; Garner, W.; Cohen, C.; Beal, J.; Polen, R.; Lloyd, J.

    1982-01-01

    Electrical gage and microcomputer eliminate time-consuming manual measurements. Alignment and angle of plates on either side of a butt weld are measured and recorded automatically by a hand-held gage and desk-top microcomputer. The gage and microcomputer quickly determine whether the weld is within dimensional tolerances or whether reworking is needed. The microcomputer prints out measurements while the operator moves the gage from point to point along the weld. Out-of-tolerance measurements are marked by an asterisk on the printout.
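    The printout logic amounts to a tolerance check per measurement point. A minimal sketch, assuming invented tolerance limits and point labels (the NASA brief does not give its actual values or format):

```python
# Flag each weld measurement outside its tolerance band with a trailing
# asterisk, as on the gage/microcomputer printout. Limits are invented.

def weld_report(measurements, low, high):
    """Return printout lines; out-of-tolerance readings get a trailing '*'."""
    lines = []
    for point, value in measurements:
        flag = " *" if not (low <= value <= high) else ""
        lines.append(f"{point:>6}  {value:6.2f}{flag}")
    return lines

# Hypothetical plate-misalignment readings (mm) at points along the weld.
report = weld_report([("P1", 0.10), ("P2", 0.55), ("P3", 0.20)],
                     low=0.0, high=0.40)
```

Here only the P2 line would be starred, telling the operator where rework is needed.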

  16. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

    We present Insight as an integrated database and analysis platform for epilepsy self-management research, part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies, with several new data management features and user-friendly functionalities. The features of Insight include: (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying, (2) visualization tools to support real-time exploration of data distribution across research studies, and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represent over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails Web technology and the open source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a Role-Based Access Control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee, which consists of representatives of all current collaborating centers of the Network. New research studies are continuously added to the Insight database, and the size as well as the unique coverage of the dataset allows investigators to conduct

  18. Database Design Learning: A Project-Based Approach Organized through a Course Management System

    ERIC Educational Resources Information Center

    Dominguez, Cesar; Jaime, Arturo

    2010-01-01

    This paper describes an active method for database design learning through practical tasks developed by student teams in a face-to-face course. This method integrates project-based learning with project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…

  19. Two Student Self-Management Techniques Applied to Data-Based Program Modification.

    ERIC Educational Resources Information Center

    Wesson, Caren

    Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…

  20. Functions and Relations: Some Applications from Database Management for the Teaching of Classroom Mathematics.

    ERIC Educational Resources Information Center

    Hauge, Sharon K.

    While functions and relations are important concepts in the teaching of mathematics, research suggests that many students lack an understanding and appreciation of these concepts. The present paper discusses an approach for teaching functions and relations that draws on the use of illustrations from database management. This approach has the…

  1. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    NASA Astrophysics Data System (ADS)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
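    The Image Content Group index described above is managed as XML. The following sketch builds and reads back a minimal ICG index; the element and attribute names (`icg`, `volume`, `modality`) are assumptions for illustration, not NOVA's actual schema.

```python
# Build a toy Image Content Group index as XML and read its members back.
# Element/attribute names are invented, not NOVA's real ICG format.
import xml.etree.ElementTree as ET

def build_icg(name, records):
    """Build an <icg> element cataloging a set of volume records."""
    icg = ET.Element("icg", name=name)
    for rec_id, modality in records:
        ET.SubElement(icg, "volume", id=rec_id, modality=modality)
    return icg

icg = build_icg("thorax-ct", [("v001", "CT"), ("v002", "CT")])
xml_text = ET.tostring(icg, encoding="unicode")   # serialized index file
members = [v.get("id") for v in icg.findall("volume")]
```

Because the index is plain XML mirrored to the Web, a crawler can follow it without going through a database front end, which is the design point the paper makes.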

  2. Information Technologies in Public Health Management: A Database on Biocides to Improve Quality of Life

    PubMed Central

    Roman, C; Scripcariu, L; Diaconescu, RM; Grigoriu, A

    2012-01-01

    Background Biocides for prolonging the shelf life of a large variety of materials have been used extensively over the last decades. Worldwide biocide consumption was estimated at about 12.4 billion dollars in 2011 and was expected to increase in 2012. As biocides are substances we come into contact with in our everyday lives, access to this type of information is of paramount importance in order to ensure an appropriate living environment. Consequently, a database where information may be quickly processed, sorted, and easily accessed, according to different search criteria, is the most desirable solution. The main aim of this work was to design and implement a relational database with complete information about biocides used in public health management to improve the quality of life. Methods: Design and implementation of a relational database for biocides, using the software “phpMyAdmin”. Results: A database that allows for efficient collection, storage, and management of information, including chemical properties and applications of a large number of biocides, as well as its adequate dissemination into the public health environment. Conclusion: The information contained in the database herein presented promotes an adequate use of biocides by means of information technologies, which in consequence may help achieve important improvements in our quality of life. PMID:23113190
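    The "different search criteria" requirement boils down to filtering records on arbitrary field/value pairs. A minimal sketch with invented records and field names (the actual database runs on MySQL via phpMyAdmin):

```python
# Toy multi-criteria search over biocide records; data is invented.
biocides = [
    {"name": "biocide_A", "class": "isothiazolinone",
     "use": "paint preservative"},
    {"name": "biocide_B", "class": "quaternary ammonium",
     "use": "surface disinfectant"},
    {"name": "biocide_C", "class": "isothiazolinone",
     "use": "cooling-water treatment"},
]

def search(records, **criteria):
    """Return records matching every given field=value criterion."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

# `class` is a Python keyword, so it is passed via dict unpacking.
hits = search(biocides, **{"class": "isothiazolinone"})
```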

  3. PRAIRIEMAP: A GIS database for prairie grassland management in western North America

    USGS Publications Warehouse

    ,

    2003-01-01

    The USGS Forest and Rangeland Ecosystem Science Center, Snake River Field Station (SRFS) maintains a database of spatial information, called PRAIRIEMAP, which is needed to address the management of prairie grasslands in western North America. We identify and collect spatial data for the region encompassing the historical extent of prairie grasslands (Figure 1). State and federal agencies, the primary entities responsible for management of prairie grasslands, need this information to develop proactive management strategies to prevent prairie-grassland wildlife species from being listed as Endangered Species, or to develop appropriate responses if listing does occur. Spatial data are an important component in documenting current habitat and other environmental conditions, which can be used to identify areas that have undergone significant changes in land cover and to identify underlying causes. Spatial data will also be a critical component guiding the decision processes for restoration of habitat in the Great Plains. As such, the PRAIRIEMAP database will facilitate analyses of large-scale and range-wide factors that may be causing declines in grassland habitat and populations of species that depend on it for their survival. Therefore, development of a reliable spatial database carries multiple benefits for land and wildlife management. The project consists of 3 phases: (1) identify relevant spatial data, (2) assemble, document, and archive spatial data on a computer server, and (3) develop and maintain the web site (http://prairiemap.wr.usgs.gov) for query and transfer of GIS data to managers and researchers.

  4. GSIMF: a web service based software and database management system for the generation grids.

    SciTech Connect

    Wang, N.; Ananthan, B.; Gieraltowski, G.; May, E.; Vaniachine, A.; Tech-X Corp.

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.
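    One core concern when managing "versioned and interdependent" packages is computing an installation order that respects dependencies. The sketch below is a plain depth-first topological sort with invented package names; it illustrates the ordering problem only, not GSIMF's actual Grid-service logic (and omits cycle detection for brevity).

```python
# Depth-first topological sort: install every package after its dependencies.
# Package names are hypothetical; assumes the dependency graph is acyclic.

def install_order(deps):
    """Return package names ordered so each follows all its dependencies."""
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in deps.get(pkg, []):   # install dependencies first
            visit(dep)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

deps = {
    "analysis-2.1":    ["geometry-db-1.0", "framework-5.2"],
    "framework-5.2":   ["geometry-db-1.0"],
    "geometry-db-1.0": [],
}
plan = install_order(deps)
```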

  5. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
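    The subdivision step is simple to state: a Public Land Survey System section splits into four quarters (NE, NW, SE, SW), each of which splits into four quarter-quarters, giving sixteen cells. A minimal sketch of generating those labels (the "NW of NE" naming follows common PLSS convention; the report's exact labeling may differ):

```python
# Enumerate the 16 idealized quarter-quarter sections of a PLSS section.
# Label style ("NW of NE") is a common convention, assumed here.

def quarter_quarters():
    """Return the 16 quarter-quarter labels of one survey section."""
    quarters = ["NE", "NW", "SE", "SW"]
    return [f"{qq} of {q}" for q in quarters for qq in quarters]

cells = quarter_quarters()   # 16 distinct display-grid cells
```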

  6. The Coral Triangle Atlas: an integrated online spatial database system for improving coral reef management.

    PubMed

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the 'Coral Triangle Area' in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region.

  7. The Coral Triangle Atlas: An Integrated Online Spatial Database System for Improving Coral Reef Management

    PubMed Central

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the ‘Coral Triangle Area’ in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region. PMID:24941442

  8. TheSNPpit—A High Performance Database System for Managing Large Scale SNP Data

    PubMed Central

    Groeneveld, Eildert; Lichtenberg, Helmut

    2016-01-01

    The fast development of high-throughput genotyping has opened up new possibilities in genetics while at the same time producing considerable data-handling issues. TheSNPpit is a database system for managing large amounts of multi-panel SNP genotype data from any genotyping platform. With an increasing rate of genotyping in areas like animal and plant breeding as well as human genetics, hundreds of thousands of individuals already need to be managed. While the common database design with one row per SNP can manage hundreds of samples, this approach becomes progressively slower as data sets grow, until it fails completely once tens or even hundreds of thousands of individuals need to be managed. TheSNPpit implements three ideas to accommodate such large-scale experiments: highly compressed vector storage in a relational database, set-based data manipulation, and a very fast export written in C, with Perl as the base for the framework and PostgreSQL as the database backend. Its novel subset system allows the creation of named subsets based on filtering of SNPs (by major allele frequency, no-calls, and chromosomes) and manually applied sample and SNP lists at negligible storage cost, thus avoiding the issue of proliferating file copies. The named subsets are exported for downstream analysis. PLINK ped and map files are processed as inputs and outputs. TheSNPpit allows management of different panel sizes in the same population of individuals when higher-density panels replace previous lower-density versions, as occurs in animal and plant breeding programs. A completely generalized procedure allows storage of phenotypes. TheSNPpit occupies only 2 bits to store a single SNP, implying a capacity of 4 million SNPs per 1 MB of disk storage. To investigate performance scaling, a database with more than 18.5 million samples has been created with 3.4 trillion SNPs from 12 panels ranging from 1000 through 20 million SNPs, resulting in a
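    The 2-bits-per-SNP figure follows directly from packing four genotype calls into each byte (4 calls/byte × 1,000,000 bytes = 4 million SNPs per MB). A minimal sketch of such packing; the code mapping (0/1/2 copies of an allele, 3 = no-call) is an assumption for illustration, not necessarily TheSNPpit's encoding.

```python
# Pack 2-bit genotype codes four-to-a-byte and unpack them again.
# Code meanings (0/1/2 = allele count, 3 = no-call) are assumed.

def pack(genotypes):
    """Pack a list of 2-bit genotype codes (0-3) into a bytearray."""
    packed = bytearray((len(genotypes) + 3) // 4)
    for i, g in enumerate(genotypes):
        packed[i // 4] |= (g & 0b11) << (2 * (i % 4))
    return packed

def unpack(packed, n):
    """Recover n genotype codes from packed bytes."""
    return [(packed[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

calls = [0, 1, 2, 3, 2, 2, 0, 1, 1]
packed = pack(calls)               # 9 calls fit in 3 bytes
restored = unpack(packed, len(calls))
```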

  9. Microcomputer Software Programs for Vocational Education.

    ERIC Educational Resources Information Center

    Rodenstein, Judith, Ed.; Lambert, Roger, Ed.

    Over 200 microcomputer software packages applicable to vocational education are listed. Most of the programs are available for the Apple, TRS-80, and Commodore microcomputers. The packages have been reviewed, but have not been formally evaluated. Titles of the programs with names and addresses of the distributors are provided. Telephone numbers…

  10. Using Microcomputers to Increase Productivity in Academia.

    ERIC Educational Resources Information Center

    McKenzie, Garry D.

    1984-01-01

    The expanded use of microcomputers, including word processing, to improve productivity of geological educators and students is discussed. Topic areas examined include: computer development and academic use; word processing with microcomputers; instructional uses and other applications; impacts on academia; and acquisition. (BC)

  11. A microcomputer-based preventive maintenance system.

    PubMed

    Rohrer, R A

    1983-01-01

    A medical equipment preventive maintenance system using a Radio Shack microcomputer is described. The system generates a schedule of equipment to be inspected each week. The software is written in BASIC for easy modification or transfer to other commercially available microcomputers. The system has been in use for nine months with good results. PMID:10278151

  12. Advanced Microcomputer Service Technician. Teacher Edition.

    ERIC Educational Resources Information Center

    Brown, A. O., III; Fulkerson, Dan, Ed.

    This manual is the second of a three-text microcomputer service and repair series. This text addresses the training needs of "chip level" technicians who work with digital troubleshooting instruments to solve the complex microcomputer problems that are sent to them from computer stores that do not have full-service facilities. The manual contains…

  13. School Districts Using Microcomputers: A National Survey.

    ERIC Educational Resources Information Center

    Hayes, Jeanne

    1983-01-01

    Discusses results of a comprehensive nationwide survey conducted in the summers of 1982 and 1983 on the number and brands of microcomputers in schools. Findings show a 118 percent increase in the number of schools using microcomputers, with Apple the most popular brand. (TE)

  14. Microcomputers as Social Facilitators in Integrated Preschools.

    ERIC Educational Resources Information Center

    Spiegel-McGill, Phyllis; And Others

    1989-01-01

    The study compared the effects of different play conditions (microcomputer, remote-control robot, or no toys) on the amount of time four dyads of handicapped/nonhandicapped children would interact during structured play. Results suggested that microcomputers may serve as social facilitators for children with significant social and language…

  15. Microcomputers in Education. Report No. 4798.

    ERIC Educational Resources Information Center

    Feurzeig, W.; And Others

    A brief review of the history of computer-assisted instruction and discussion of the current and potential roles of microcomputers in education introduce this review of the capabilities of state-of-the-art microcomputers and currently available software for them, and some speculations about future trends and developments. A survey of current…

  16. Properly Matching Microcomputer Hardware, Software Minimizes "Glitches."

    ERIC Educational Resources Information Center

    Fredenburg, Philip B.

    1986-01-01

    Microcomputer systems for school districts are best obtained by selecting the software, and matching it with hardware. Discusses criteria for software and hardware, monitors, input/output devices, backup devices, and printers. Components of two basic microcomputer systems for the business office are proposed. (MLF)

  17. Distributed Processors and Processing with Microcomputers.

    ERIC Educational Resources Information Center

    Noerr, K. T. Bivins; Noerr, P. L.

    1983-01-01

    This discussion of the role and effects of microcomputers in the information industry considers technological advances of recent times and how they may affect the future. Complexity of systems and systems control, problems of communication (machine-to-machine, machine-to-human), spread of microcomputers, and information networks and systems are…

  18. Using Calculators and Microcomputers with Exceptional Children.

    ERIC Educational Resources Information Center

    Etlinger, Leonard E.; Ogletree, Earl J.

    The focus of this document is on descriptions of calculators, microcomputers, and related educational technology and materials. Calculators are viewed as innovative teaching tools that can have both practical and pedagogical functions in the classroom to enhance understanding and achievement in mathematics. Microcomputers are seen as glorified…

  19. Graduate Education in a Microcomputer Environment. New Delivery Systems for Non-Traditional Graduate Studies.

    ERIC Educational Resources Information Center

    Ammentorp, William; Chaffin, Paulette

    The application of computers and information science to health and human service graduate studies at St. Mary's College is described. The programs are built on a microcomputer hardware and software base and draw on national database utilities to facilitate instructor-student communication as well as access to typical library data. The program…

  20. Libraries in the Information Age: Where Are the Microcomputer and Laser Optical Disc Technologies Taking Us?

    ERIC Educational Resources Information Center

    Chen, Ching-chih

    1986-01-01

    This discussion of information technology and its impact on library operations and services emphasizes the development of microcomputer and laser optical disc technologies. Libraries' earlier responses to bibliographic utilities, online databases, and online public access catalogs are described, and future directions for library services are…

  1. Development of a database management system for Coal Combustion By-Products (CCBs)

    SciTech Connect

    O'Leary, E.M.; Peck, W.D.; Pflughoeft-Hassett, D.F.

    1997-06-01

    Coal combustion by-products (CCBs) are produced in high volumes worldwide. Utilization of these materials is economically and environmentally advantageous and is expected to increase as disposal costs increase. The American Coal Ash Association (ACAA) is developing a database to contain characterization and utilization information on CCBs. This database will provide information for use by managers, marketers, operations personnel, and researchers that will aid in their decision making and long-term planning for issues related to CCBs. The comprehensive nature of the database and the interactive user application will enable ACAA members to efficiently and economically access a wealth of data on CCBs and will promote the technically sound, environmentally safe, and commercially competitive use of CCBs.

  2. Rhode Island Water Supply System Management Plan Database (WSSMP-Version 1.0)

    USGS Publications Warehouse

    Granato, Gregory E.

    2004-01-01

    In Rhode Island, the availability of water of sufficient quality and quantity to meet current and future environmental and economic needs is vital to life and the State's economy. Water suppliers, the Rhode Island Water Resources Board (RIWRB), and other State agencies responsible for water resources in Rhode Island need information about available resources, the water-supply infrastructure, and water use patterns. These decision makers need historical, current, and future water-resource information. In 1997, the State of Rhode Island formalized a system of Water Supply System Management Plans (WSSMPs) to characterize and document relevant water-supply information. All major water suppliers (those that obtain, transport, purchase, or sell more than 50 million gallons of water per year) are required to prepare, maintain, and carry out WSSMPs. An electronic database for this WSSMP information has been deemed necessary by the RIWRB for water suppliers and State agencies to consistently document, maintain, and interpret the information in these plans. Availability of WSSMP data in standard formats will allow water suppliers and State agencies to improve the understanding of water-supply systems and to plan for future needs or water-supply emergencies. In 2002, however, the Rhode Island General Assembly passed a law that classifies some of the WSSMP information as confidential to protect the water-supply infrastructure from potential terrorist threats. Therefore the WSSMP database was designed for an implementation method that will balance security concerns with the information needs of the RIWRB, suppliers, other State agencies, and the public. A WSSMP database was developed by the U.S. Geological Survey in cooperation with the RIWRB. The database was designed to catalog WSSMP information in a format that would accommodate synthesis of current and future information about Rhode Island's water-supply infrastructure. This report documents the design and implementation of

  3. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    PubMed

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha⁻¹ for total N and 2.2 kg ha⁻¹ for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.
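
The regional, crop-specific analyses the authors recommend amount to grouped aggregation over the load records. A minimal sketch in Python; the record layout and field names here are invented for illustration, not the actual MANAGE schema:

```python
# Hypothetical records in the spirit of MANAGE: one row per watershed-year,
# with land use and annual nutrient loads. Values are made up.
records = [
    {"land_use": "corn", "total_n_kg_ha": 18.0, "total_p_kg_ha": 2.9},
    {"land_use": "corn", "total_n_kg_ha": 22.0, "total_p_kg_ha": 3.1},
    {"land_use": "pasture", "total_n_kg_ha": 6.0, "total_p_kg_ha": 1.2},
]

def mean_load(rows, land_use, field):
    """Average an annual-load field over all watershed-years for one land use."""
    vals = [r[field] for r in rows if r["land_use"] == land_use]
    return sum(vals) / len(vals)

print(mean_load(records, "corn", "total_n_kg_ha"))  # 20.0
```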

  4. A Conceptual Model and Database to Integrate Data and Project Management

    NASA Astrophysics Data System (ADS)

    Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.

    2015-12-01

    Data management is critically foundational to doing effective science in our data-intensive research era and, done well, can enhance collaboration, increase the value of research data, and support requirements by funding agencies to make scientific data and other research products available through publicly accessible online repositories. However, there are few examples (but see the Long-term Ecological Research Network Data Portal) of these data being provided in such a manner that allows exploration within the context of the research process: what specific research questions do these data seek to answer? what data were used to answer these questions? what data would have been helpful to answer these questions but were not available? We propose an agile conceptual model and database design, as well as example results, that integrate data management with project management, not only to maximize the value of research data products but to enhance collaboration during the project and the process of project management itself. In our project, which we call 'Data Map,' we used agile principles by adopting a user-focused approach and by designing our database to be simple, responsive, and expandable. We initially designed Data Map for the Idaho EPSCoR project "Managing Idaho's Landscapes for Ecosystem Services (MILES)" (see https://www.idahoecosystems.org//) and will present example results for this work. We consulted with our primary users (project managers, data managers, and researchers) to design the Data Map. Results will be useful to project managers and to funding agencies reviewing progress because they will readily provide answers to the questions "For which research projects/questions are data available and/or being generated by MILES researchers?" and "Which research projects/questions are associated with each of the 3 primary questions from the MILES proposal?" To be responsive to the needs of the project, we chose to streamline our design for the prototype

  5. Watershed Data Management (WDM) Database for Salt Creek Streamflow Simulation, DuPage County, Illinois

    USGS Publications Warehouse

    Murphy, Elizabeth A.; Ishii, Audrey

    2006-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Department of Engineering, Stormwater Management Division, maintains a database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. The majority of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Illinois. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. This report describes a version of the WDM database that was quality-assured and quality-controlled annually to ensure the datasets were complete and accurate. This version of the WDM database contains data from January 1, 1997, through September 30, 2004, and is named SEP04.WDM. This report provides a record of time periods of poor data for each precipitation dataset and describes methods used to estimate the data for the periods when data were missing, flawed, or snowfall-affected. The precipitation dataset data-filling process was changed in 2001, and both processes are described. The other meteorologic and hydrologic datasets in the database are fully described in the annual U.S. Geological Survey Water Data Report for Illinois and, therefore, are described in less detail than the precipitation datasets in this report.
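
The report's mention of estimating values for periods when precipitation data were missing or flawed can be illustrated with one common gap-filling approach: scaling a nearby gauge's reading by a long-term ratio between the two gauges. This is a generic sketch, not the USGS procedure; the series and ratio are invented:

```python
# Fill missing hourly precipitation at a target gauge from a nearby donor
# gauge, scaled by an assumed long-term target/donor ratio. Illustrative only.
def fill_missing(target_series, donor_series, ratio):
    """Replace None entries in target_series with donor value times ratio."""
    return [d * ratio if t is None else t
            for t, d in zip(target_series, donor_series)]

target = [0.0, None, 0.3]   # inches per hour; one missing observation
donor = [0.0, 0.2, 0.3]
filled = fill_missing(target, donor, 1.1)
print(filled)
```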

  6. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    NASA Technical Reports Server (NTRS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays, or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.
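
The data model described here, dense one-dimensional arrays alongside sparse arrays that keep only populated entries, can be sketched with a small key-value store. This illustrates the concept only; it is not Kepler DB's actual API:

```python
# Minimal non-relational store for dense and sparse 1-D arrays, loosely
# inspired by the Kepler DB description above. Keys and values are invented.
class ArrayStore:
    def __init__(self):
        self._data = {}

    def put_dense(self, key, values):
        self._data[key] = ("dense", list(values))

    def put_sparse(self, key, index_value_pairs):
        # A sparse array stores only (index, value) pairs for populated cells.
        self._data[key] = ("sparse", dict(index_value_pairs))

    def get(self, key, index):
        kind, payload = self._data[key]
        if kind == "dense":
            return payload[index]
        return payload.get(index)  # None for an unpopulated sparse cell

store = ArrayStore()
store.put_dense("pixel:42", [101.5, 102.0, 99.8])  # one value per cadence
store.put_sparse("gaps:42", [(1, True)])           # flags only where set
print(store.get("pixel:42", 2))   # 99.8
print(store.get("gaps:42", 0))    # None
```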

  7. Database system for management of health physics and industrial hygiene records.

    SciTech Connect

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-10-05

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection.

  8. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  9. Dynamic Tables: An Architecture for Managing Evolving, Heterogeneous Biomedical Data in Relational Database Management Systems

    PubMed Central

    Corwin, John; Silberschatz, Avi; Miller, Perry L.; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models. PMID:17068350
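
The "vertical" object-attribute-value layout that the paper benchmarks against wide relational tables can be sketched in a few lines of SQL. The table and attribute names below are illustrative only, not the authors' schema:

```python
import sqlite3

# Entity-attribute-value ("vertical") layout: one row per populated fact,
# so sparse attributes and schema evolution cost nothing structurally.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE eav (entity TEXT, attribute TEXT, value TEXT)")
rows = [
    ("patient1", "heart_rate", "72"),
    ("patient1", "allergy", "penicillin"),
    ("patient2", "heart_rate", "88"),  # no allergy row: sparsity is free
]
conn.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# Adding a brand-new attribute needs no ALTER TABLE, only new rows.
conn.execute("INSERT INTO eav VALUES ('patient2', 'blood_type', 'O+')")

hr = conn.execute(
    "SELECT value FROM eav WHERE entity = 'patient1' AND attribute = 'heart_rate'"
).fetchone()[0]
print(hr)  # 72
```

The trade-off the paper measures is that reassembling one entity's full record from this layout requires pivoting many rows, which is where sparse column-store engines aim to do better.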

  10. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
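
The idea of keyword search "across both context and content" can be illustrated by decomposing an XML document into (element path, text) rows and matching a keyword against either column. This sketch uses Python's standard XML parser and invented data; it is not NETMARK's actual schema:

```python
import xml.etree.ElementTree as ET

# Decompose a toy XML document into (context path, text content) rows.
doc = ET.fromstring(
    "<report><title>Engine test</title><status>nominal</status></report>"
)
rows = [("/".join([doc.tag, child.tag]), child.text) for child in doc]

def keyword_search(table, word):
    """Return rows where the keyword appears in the element path (context)
    or the element text (content)."""
    return [(p, t) for p, t in table if word in p or word in (t or "")]

print(keyword_search(rows, "status"))   # matches on context (the path)
print(keyword_search(rows, "Engine"))   # matches on content (the text)
```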

  11. Microcomputer Use in Higher Education. Executive Summary of a Survey.

    ERIC Educational Resources Information Center

    Lukesh, Susan S.; And Others

    This executive summary of the 1986 Survey of Microcomputers in Higher Education presents the highlights of each of the major areas covered by the survey: (1) general policy; (2) microcomputer availability; (3) microcomputer access; (4) microcomputer acquisition; (5) software availability; and (6) software support. The 211 survey respondents were…

  12. Metabolonote: A Wiki-Based Database for Managing Hierarchical Metadata of Metabolome Analyses

    PubMed Central

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics – technology for comprehensive detection of small molecules in an organism – lags behind the other “omics” in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of published data, with the result that submitters (the researchers who generate the data) are insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called “Togo Metabolome Data” (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers’ understanding and use of data but also submitters’ motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http

  13. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  14. Database Administrator

    ERIC Educational Resources Information Center

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  15. Microcomputer logarithmic time base generator

    NASA Astrophysics Data System (ADS)

    Wills, L. J.; Ly, Nhan G.

    1985-11-01

    A new circuit is introduced to generate the logarithmic time base function with good resolution. By using a single-chip microcomputer with EPROM program storage, the circuitry is simplified and can be easily reproduced. The output function covers more than six decades of time and has 590 discrete points per decade with an accuracy of one discrete point per decade, or ±0.16%. The design overcomes two well-known problems in using a logarithmic time base. First, because the time increments are derived from a real-time register, there is a precise reference for zero time; second, a series of time base interval marks is output for correctly calibrating the time axis.
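
The resolution figures quoted above follow from log-spacing with a fixed point count per decade. A numeric sketch, assuming only the 590-points-per-decade and six-decade figures from the abstract (the starting time is arbitrary):

```python
# Logarithmic time base with a fixed number of discrete points per decade.
# 590 points/decade gives a step ratio of 10**(1/590), about 0.39% per step.
PER_DECADE = 590

def log_time_points(t0, decades):
    """Generate log-spaced sample times starting at t0."""
    ratio = 10 ** (1 / PER_DECADE)
    return [t0 * ratio ** n for n in range(decades * PER_DECADE + 1)]

pts = log_time_points(1e-3, 6)    # six decades, as in the circuit
print(len(pts))                   # 3541 points
print(round(pts[-1] / pts[0]))    # spans a factor of 10**6
```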

  16. Microcomputer control of power converters

    SciTech Connect

    Alegria, C.M.; Freris, L.L.; Paiva, J.P.

    1984-08-01

    Thyristor power converters are characterized by an inherent discrete control action and are, therefore, particularly well suited to on-line control by microcomputers. The paper presents a new design of digital controllers based upon powerful 16-bit or bit-slice microprocessors, which provide high firing time resolution and enough computing power to implement sophisticated control strategies. Both hardware and software are discussed, with special emphasis on the firing control algorithms. The properties of Pulse Frequency Control (PFC) and Pulse Phase Control (PPC) are examined and small-signal discrete models are presented. These models are used in the analysis of constant current and constant extinction angle control through the z-transform method.

  17. BDC: Operational Database Management for the SPOT/HELIOS Operations Control System

    NASA Astrophysics Data System (ADS)

    Guiral, P.; Teodomante, S.

    Since the operational database is essential to the executable environment of a satellite control system (e.g., for monitoring and commanding), the French Space Agency (CNES), as ground segment designer and satellite operator, allocates substantial resources to the development of operational database management tools. Indeed, this kind of tool is necessary to generate and maintain the operations control system (OCS) data repository throughout the life of the relevant space system. In this context, the objectives of this paper are firstly to present lessons learnt from the SPOT/Helios product line and secondly to point out the new challenges related to the increasing number of satellite systems to qualify and maintain during the upcoming years. "BDC", a component of the SPOT/Helios operations control system, is an operational database management tool designed and developed by CNES. This tool has been used since 1998 for SPOT4, was then upgraded for Helios 1A/1B and SPOT5, and is currently being customized for Helios 2A. We emphasize the need for CNES to have at its disposal a tool enabling significant flexibility in handling data modification during technical and operational qualification phases. This implies: an evolution of the data exchanges between the satellite contractor, Astrium, and CNES; and constraints on the tool development process, leading to the choice of developing first a prototype and then industrializing it. After a brief data description, the tool is technically described, in particular its architecture and the design choices that allow reusability for different satellite lines. Keywords: satellite operations, operations control system, data management, relational database.

  18. Networking of microcomputers in the radiology department.

    PubMed

    Markivee, C R

    1985-10-01

    A microcomputer may be installed in any of several areas in a radiology department or office to automate data processing. Such areas include the reception desk, the transcription office, the quality-control station, and remote or satellite radiography rooms. Independent microcomputers can be interconnected by networking, using small hardware and software packages and cables, to effect communication between them, afford access to a common data base, and share peripheral devices such as hard disks and printers. A network of microcomputers can perform many of the functions of a larger minicomputer system at lower cost and can be assembled in small modules as budgetary constraints allow. PMID:3876011

  19. 275 C Downhole Microcomputer System

    SciTech Connect

    Chris Hutchens; Hooi Miin Soo

    2008-08-31

    An HC11 controller IC, along with serial SRAM and ROM support ICs, was developed as a chip set to support data acquisition and control under extreme-temperature/harsh-environment conditions greater than 275 C. The 68HC11 microprocessor is widely used in well logging tools for control, data acquisition, and signal processing applications and was the logical choice for a downhole controller. This extreme-temperature version of the 68HC11 enables new high-temperature designs and additionally allows 68HC11-based well logging tools and MWD tools to be upgraded for high-temperature operation in deep gas reservoirs. The microcomputer chip consists of the microprocessor ALU, a small boot ROM, 4 kbyte of data RAM, a counter/timer unit, a serial peripheral interface (SPI), an asynchronous serial interface (SCI), and the A, B, C, and D parallel ports. The chip is code compatible with the single-chip-mode commercial 68HC11 except for the absence of the analog-to-digital converter system. To avoid mask-programmed internal ROM, a boot program is used to load the microcomputer program from an external mask SPI ROM. A SPI RAM IC completes the chip set and allows data RAM to be added in 4 kbyte increments. The HC11 controller IC chip set is implemented in the Peregrine Semiconductor 0.5 micron Silicon-on-Sapphire (SOS) process using a custom high-temperature cell library developed at Oklahoma State University. Yield data are presented for the HC11, SPI RAM, and ROM. The lessons learned in this project were extended to the successful development of two high-temperature versions of the LEON3 and a companion 8 Kbyte SRAM: a 200 C version for the Navy and a 275 C version for the gas industry.

  20. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
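
A fault signature, as described, pairs a fault type with the symptoms an expert would check to detect and confirm it. One plausible minimal representation, with entirely hypothetical field names and symptom labels (not the FW-PHM Suite schema):

```python
# Illustrative fault-signature catalog for one asset class. A diagnostic
# advisor can rank candidate faults by how many signature symptoms match
# the observed condition indicators.
fault_signatures = [
    {
        "asset": "generator step-up transformer",
        "fault": "winding insulation degradation",
        "symptoms": {"dissolved_gas_high", "winding_temp_high"},
    },
    {
        "asset": "generator step-up transformer",
        "fault": "cooling system fouling",
        "symptoms": {"oil_temp_high", "cooling_flow_low"},
    },
]

def diagnose(observed, signatures):
    """Rank candidate faults by the fraction of their symptoms observed."""
    scored = [(len(sig["symptoms"] & observed) / len(sig["symptoms"]),
               sig["fault"]) for sig in signatures]
    return sorted(scored, reverse=True)

print(diagnose({"dissolved_gas_high", "winding_temp_high"}, fault_signatures))
```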

  1. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    DOE PAGES

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population, which is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is of high priority, which requires a broad understanding of the locations of such day care centers. As concentrations of at risk population, the spatial location of day care centers is critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high resolution population distribution and dynamics models and databases.

  3. Kal-Haiti a Research Database for Risks Management and Sustainable Reconstruction in Haiti

    NASA Astrophysics Data System (ADS)

    Giros, A.; Fontannaz, D.; Allenbach, B.; Treinsoutrot, D.; De Michele, M.

    2012-07-01

    Following the 12th January 2010 earthquake in Haiti, the French Agence Nationale de la Recherche has funded a project named KAL-Haiti which aims at gathering remote sensing imagery as well as in-situ and exogenous data into a knowledge base. This database, seen as a shareable resource, can serve as a basis for helping the reconstruction of the country, but also as a reference for scientific studies devoted to all phases of risk management. The project's main outcome will be a geo-referenced database containing a selection of remotely sensed imagery acquired before and after the disastrous event, supplemented with all relevant ancillary data and enriched with in-situ measurements and exogenous data. The resulting reference database is freely available for research and for reconstruction tasks. It is strongly expected that users will also become contributors by sharing their own data production, thus participating in the growth of the initial kernel. The database will also be enriched with new satellite images, monitoring the evolution of the Haitian situation over the next 10 years.

  4. Conceptual database modeling: a method for enabling end users (radiologists) to understand and develop their information management applications.

    PubMed

    Hawkins, H; Young, S K; Hubert, K C; Hallock, P

    2001-06-01

    As medical technology advances at a rapid pace, clinicians become further and further removed from the design of their own technological tools. This is particularly evident with information management. For radiologists, clinical histories, patient reports, and other pertinent information require sophisticated tools for data handling. However, as databases grow more powerful and sophisticated, systems require the expertise of programmers and information technology personnel. The radiologist, the clinician end-user, must maintain involvement in the development of system tools to ensure effective information management. Conceptual database modeling is a design method that serves to bridge the gap between the technological aspects of information management and its clinical applications. Conceptual database modeling involves developing information systems in simple language so that anyone can have input into the overall design. This presentation describes conceptual database modeling, using object role modeling, as a means by which end-users (clinicians) may participate in database development.

  5. A cohort and database study of airway management in patients undergoing thyroidectomy for retrosternal goitre.

    PubMed

    Gilfillan, N; Ball, C M; Myles, P S; Serpell, J; Johnson, W R; Paul, E

    2014-11-01

    Patients undergoing thyroid surgery with retrosternal goitre may raise concerns for the anaesthetist, especially airway management. We reviewed a multicentre prospective thyroid surgery database and extracted data for those patients with retrosternal goitre. Additionally, we reviewed the anaesthetic charts of patients with retrosternal goitre at our institution to identify the anaesthetic induction technique and airway management. Of 4572 patients in the database, 919 (20%) had a retrosternal goitre. Two cases of early postoperative tracheomalacia were reported, one in the retrosternal group. Despite some very large goitres, no patient required tracheostomy or cardiopulmonary bypass and there were no perioperative deaths. In the subset of 133 patients managed at our institution over six years, there were no major adverse anaesthetic outcomes and no patient had a failed airway or tracheomalacia. In the latter cohort, of 32 (24%) patients identified as having a potentially difficult airway, 17 underwent awake fibreoptic tracheal intubation, but two of these were abandoned and converted to intravenous induction and general anaesthesia. Eleven had inhalational induction; two of these were also abandoned and converted to intravenous induction and general anaesthesia. Of those suspected as having a difficult airway, 28 (87.5%) subsequently had direct laryngoscopy where the laryngeal inlet was clearly visible. We found no good evidence that thyroid surgery patients with retrosternal goitre, with or without symptoms and signs of tracheal compression, present the experienced anaesthetist with an airway that cannot be managed using conventional techniques. This does not preclude the need for multidisciplinary discussion and planning. PMID:25342401

  6. A Prescribed Fire Emission Factors Database for Land Management and Air Quality Applications

    NASA Astrophysics Data System (ADS)

    Lincoln, E.; Hao, W.; Baker, S.; Yokelson, R. J.; Burling, I. R.; Urbanski, S. P.; Miller, W.; Weise, D. R.; Johnson, T. J.

    2010-12-01

    Prescribed fire is a significant emissions source in the U.S. that needs to be adequately characterized in atmospheric transport/chemistry models. In addition, the Clean Air Act, its amendments, and air quality regulations require that prescribed fire managers estimate the quantity of emissions that a prescribed fire will produce. Several published papers contain a few emission factors for prescribed fire, and additional results are found in unpublished documents whose quality has to be assessed. In conjunction with three research projects developing detailed new emissions data and meteorological tools to assist prescribed fire managers, the Strategic Environmental Research and Development Program (SERDP) is supporting development of a database that contains emissions information related to prescribed burning. Ultimately, this database will be available on the Internet and will contain older emissions information that has been assessed and newer emissions information that has been developed from both laboratory-scale and field measurements. The database currently contains emissions information from over 300 burns of different wildland vegetation types, including grasslands, shrublands, woodlands, forests, and tundra over much of North America. A summary of the compiled data will be presented, along with suggestions for additional categories.

  7. Head-to-Head Evaluation of the Pro-Cite and Sci-Mate Bibliographic Database Management Systems.

    ERIC Educational Resources Information Center

    Saari, David S.; Foster, George A., Jr.

    1989-01-01

    Compares two full featured database management systems for bibliographic information in terms of programs and documentation; record creation and editing; online database citations; search procedures; access to references in external text files; sorting and printing functions; style sheets; indexes; and file operations. (four references) (CLB)

  8. The Use of SQL and Second Generation Database Management Systems for Data Processing and Information Retrieval in Libraries.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1989-01-01

    Describes Structured Query Language (SQL), the result of an American National Standards Institute effort to standardize language used to query computer databases and a common element in second generation database management systems. The discussion covers implementations of SQL, associated products, and techniques for its use in online catalogs,…
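
    The kind of SQL retrieval the abstract describes for online catalogs can be sketched in a few lines. The schema and titles below are invented for illustration (using Python's built-in sqlite3 in place of a second-generation DBMS):

```python
import sqlite3

# Hypothetical minimal catalog schema; table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, year INTEGER);
CREATE TABLE subject (book_id INTEGER, term TEXT);
""")
conn.executemany("INSERT INTO book VALUES (?, ?, ?)",
                 [(1, "Database Design", 1988), (2, "Library Automation", 1985)])
conn.executemany("INSERT INTO subject VALUES (?, ?)",
                 [(1, "databases"), (2, "libraries"), (2, "databases")])

# A typical catalog retrieval: all titles indexed under a subject term,
# joining the bibliographic table to the subject index.
rows = conn.execute("""
    SELECT b.title FROM book b
    JOIN subject s ON s.book_id = b.id
    WHERE s.term = 'databases'
    ORDER BY b.title
""").fetchall()
print([r[0] for r in rows])  # ['Database Design', 'Library Automation']
```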

  9. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and to promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provide an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. PMID:25682266

  11. Using Relational Data Base Management Systems Capabilities to Increase the Usefulness of Open-Ended Survey Responses. AIR 1985 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Seppanen, Loretta J.

    The use of Relational Database Management Systems (RDBMS), a type of microcomputer application software, to analyze open-ended survey questions is discussed. Using open-ended questions allows researchers to ask respondents to express themselves freely about their attitudes and beliefs. This approach also can elicit a precise answer even though the…

  12. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
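
    The core idea of querying a transform between any two frames in a tree can be sketched as follows. This is a toy illustration, not the FM flight code: frame names and the API are invented, and 2-D translations stand in for the real 6-DOF poses:

```python
# Each frame's transform is expressed in its parent frame (x, y offset).
PARENT = {"rover": "site_1", "site_1": "surface", "rsm_camera": "rover"}
XFORM = {"rover": (5.0, 2.0), "site_1": (100.0, 40.0), "rsm_camera": (0.5, 0.0)}

def to_root(frame):
    """Accumulate offsets from `frame` up to the root of the tree."""
    x, y = 0.0, 0.0
    while frame in PARENT:
        dx, dy = XFORM[frame]
        x, y = x + dx, y + dy
        frame = PARENT[frame]
    return x, y

def transform_between(src, dst):
    """Offset of `src` expressed relative to `dst`, via the common root."""
    sx, sy = to_root(src)
    dx, dy = to_root(dst)
    return sx - dx, sy - dy

print(transform_between("rsm_camera", "site_1"))  # (5.5, 2.0)
```

This is why a centralized tree helps: clients only update the single link they own, and any pairwise transform falls out of composing links toward the common ancestor.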

  13. Utilization of Educationally Oriented Microcomputer Based Laboratories

    ERIC Educational Resources Information Center

    Fitzpatrick, Michael J.; Howard, James A.

    1977-01-01

    Describes one approach to supplying engineering and computer science educators with an economical portable digital systems laboratory centered around microprocessors. Expansion of the microcomputer based laboratory concept to include Learning Resource Aided Instruction (LRAI) systems is explored. (Author)

  14. Introducing a Microcomputer into Adult Education Classes.

    ERIC Educational Resources Information Center

    Bostock, Stephen; Seifert, Roger

    1983-01-01

    There are now plenty of adult education classes on how to use a computer; this article is an account of how microcomputers were actually used as an aid to learning in the biological, natural, and social sciences. (Author/SSH)

  15. The Practical Use of Microcomputers in Rehabilitation.

    ERIC Educational Resources Information Center

    Vanderheiden, Gregg C.

    1983-01-01

    The application of microcomputers for handicapped individuals has tremendous potential. Barriers needing research include: access by the handicapped to the available software, aids that are portable, the problem of obsolescence. (SEW)

  16. The Seven Deadly Sins of Online Microcomputing.

    ERIC Educational Resources Information Center

    King, Alan

    1989-01-01

    Offers suggestions for avoiding common errors in online microcomputer use. Areas discussed include learning the basics; hardware protection; backup options; hard disk organization; software selection; file security; and the use of dedicated communications lines. (CLB)

  17. National information network and database system of hazardous waste management in China

    SciTech Connect

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  18. Microcomputer-Based Pediatric Health Maintenance System

    PubMed Central

    Maxwell, Carron M.; Philipsborn, Herbert F.; Napier, Robin; Nigro, Rise

    1983-01-01

    The Pediatric Evaluation, Research and Tracking System (PERTS) is an automated health maintenance and tracking system implemented on a microcomputer. This summary of significant medical information is used to support more effective patient care, operations research, training and program planning for a pediatric group practice in a suburban teaching hospital. The System's programs are designed to be convenient for use by health professionals and clerical staff. The system is implemented on a 64K microcomputer using MBASIC under MP/M and KSAM.

  19. Integrated computer aided reservoir management (CARM) using Landmark's OpenWorks 3 database and Reservoir Management software

    SciTech Connect

    Ward, L.C.

    1995-08-01

    Multi-disciplinary asset teams in today's oil industry are facing an information revolution. To assist them to more accurately define and develop known reservoirs, to visualise reservoirs in three dimensions, and to communicate more effectively, they require access to a single common dataset and a flexible, comprehensive suite of reservoir description software that allows delineation and refinement of quantitative 3D reservoir models. Landmark's Computer Aided Reservoir Management (CARM) software provides the most complete integrated geo-information solution for data management, and a suite of integrated Reservoir Management software covering 3D and 2D seismic interpretation, 3D geocellular modelling (Stratamodel), geological cross-section building, and deterministic and probabilistic petrophysical log analysis for 3D display. The OpenWorks 3 database provides a common framework not only for the integration of data between Landmark applications, but also with third-party applications. Thus, once the reservoir stratigraphic framework has been built in Stratamodel, it can be used as direct input for stochastic modelling in Odin's STORM, and can also provide data directly to reservoir simulation applications. The key element of this integration is the OpenWorks 3 database, a production-oriented geoscience data model with over 500 tables and in excess of 2500 attributes. The OpenWorks 3 software permits seamless data transfer from one reservoir management application to another, and at every stage of reservoir management the latest updated interpretation is available to every team member. The goal of integrated reservoir management, to achieve effective exploitation of reserves, now draws on multidisciplinary analysis by cross-functional teams, enabling the industry to maximise return on "knowledge assets" and physical reserves.

  20. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    NASA Astrophysics Data System (ADS)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices on how to achieve compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures, as well as guidelines for the use of auditing features.

  1. Performance of online drug information databases as clinical decision support tools in infectious disease medication management.

    PubMed

    Polen, Hyla H; Zapantis, Antonia; Clauson, Kevin A; Clauson, Kevin Alan; Jebrock, Jennifer; Paris, Mark

    2008-01-01

    Infectious disease (ID) medication management is complex, and clinical decision support tools (CDSTs) can provide valuable assistance. This study evaluated the scope and completeness of ID drug information found in online databases by evaluating their ability to answer 147 question/answer pairs. Scope scores produced the highest rankings (%) for: Micromedex (82.3), Lexi-Comp/American Hospital Formulary Service (81.0), and Medscape Drug Reference (81.0); the lowest included: Epocrates Online Premium (47.0), Johns Hopkins ABX Guide (45.6), and PEPID PDC (40.8). PMID:18999059

  2. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    SciTech Connect

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  3. Development of genome viewer (Web Omics Viewer) for managing databases of cucumber genome

    NASA Astrophysics Data System (ADS)

    Wojcieszek, M.; Róż, P.; Pawełkowicz, M.; Nowak, R.; Przybecki, Z.

    Cucumber is an important plant in horticulture and in science. Sequencing projects for the C. sativus genome enable new methodological approaches to further investigation of this species. Accessibility is crucial to fully exploit the obtained information about the detailed structure of genes, markers and other characteristic features such as contigs, scaffolds and chromosomes. A genome viewer is one of the tools providing a plain and easy way to present genome data to users and to administer the databases. GBrowse, the main viewer, has several very useful features but lacks simplicity of management. Our group developed a new genome browser, Web Omics Viewer (WOV), keeping this functionality while improving the usability and accessibility of cucumber genome data.

  4. System configuration management plan for the TWRS controlled baseline database system [TCBD]

    SciTech Connect

    Spencer, S.G.

    1998-09-23

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCW. LHMC, TWRS TCBD Users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the users' requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO.
Once this occurs, the TCBD will be completed and

  5. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of current earth-observing technology, RS image data storage, management and information publication have become a bottleneck for their application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing of a great volume of RS data stored at different nodes in a distributed environment; a heavy burden is placed on the background server. Second, there is no unique, standard and rational organization of multi-sensor RS data for storage and management, and much information is lost or not included at storage time. Facing these two problems, this paper puts forward a framework for a parallel and distributed RS image data management and storage system, aimed at an RS data information system based on a parallel background server and a distributed data management system. Toward these goals, this paper studies the following key techniques and draws some instructive conclusions. The paper puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands and periods is achieved. For data storage, RS data are not divided into binary large objects stored in a conventional relational database system; instead, they are reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. In system architecture, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common web process and the parallel process.

  6. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    SciTech Connect

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.; Ghosh, Dr. Joydeep

    2014-01-01

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
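
    The "shared providers" query mentioned above is a one-hop graph traversal. The sketch below illustrates the idea with plain adjacency maps rather than a graph DBMS; the patient/provider rows and the details of the 3EG transformation are invented for illustration:

```python
from collections import defaultdict

# (patient, provider) rows as they might come out of a normalized RDBMS.
visits = [
    ("p1", "dr_a"), ("p1", "dr_b"),
    ("p2", "dr_a"), ("p2", "dr_b"),
    ("p3", "dr_c"),
]

# Build a bipartite adjacency map: patient -> set of providers seen.
providers_of = defaultdict(set)
for patient, provider in visits:
    providers_of[patient].add(provider)

def shared_providers(p, q):
    """Providers seen by both patients -- a one-hop graph traversal."""
    return sorted(providers_of[p] & providers_of[q])

print(shared_providers("p1", "p2"))  # ['dr_a', 'dr_b']
print(shared_providers("p1", "p3"))  # []
```

In a relational schema the same answer requires a self-join on the visits table; in the graph form it is a neighborhood intersection, which is the kind of access-pattern gain the abstract reports.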

  7. The "How To's" of Using Word Processors and Database Managers with Qualitative Data: A Primer for Professionals.

    ERIC Educational Resources Information Center

    Stuck, M. F.

    This guide provides an introduction to the use of microcomputers with qualitative data. It is deliberately non-specific and rudimentary in order to be of maximum benefit to the widest possible audience of beginning microcomputer users who wish to analyze their data using software that is not specifically designed for qualitative data analysis. The…

  8. Metadata-based generation and management of knowledgebases from molecular biological databases.

    PubMed

    Eccles, J R; Saldanha, J W

    1990-06-01

    Present-day knowledge-based systems (or expert systems) and databases constitute 'islands of computing' with little or no connection to each other. The use of software to provide a communication channel between the two, and to integrate their separate functions, is particularly attractive in certain data-rich domains where there are already pre-existing database systems containing the data required by the relevant knowledge-based system. Our evolving program, GENPRO, provides such a communication channel. The original methodology has been extended to provide interactive Prolog clause input with syntactic and semantic verification. This enables automatic generation of clauses from the source database, together with complete management of subsequent interfacing to the specified knowledge-based system. The particular data-rich domain used in this paper is protein structure, where processes which require reasoning (modelled by knowledge-based systems), such as the inference of protein topology, protein model-building and protein structure prediction, often require large amounts of raw data (i.e., facts about particular proteins) in the form of logic programming ground clauses. These are generated in the proper format by use of the concept of metadata. PMID:2397635
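
    The generation step described above, emitting database rows as logic-programming ground clauses, can be sketched as follows. This is not GENPRO itself: the predicate name, the example rows, and the quoting rules are invented for illustration:

```python
# Example rows as they might come from a protein-structure database.
rows = [("crambin", "alpha", 46), ("ubiquitin", "alpha_beta", 76)]

def to_clauses(predicate, rows):
    """Render each tuple as a Prolog fact: predicate(arg1, arg2, ...)."""
    def term(v):
        # Numbers pass through; strings become quoted Prolog atoms.
        return str(v) if isinstance(v, (int, float)) else f"'{v}'"
    return [f"{predicate}({', '.join(term(v) for v in row)})." for row in rows]

for clause in to_clauses("protein", rows):
    print(clause)
# protein('crambin', 'alpha', 46).
# protein('ubiquitin', 'alpha_beta', 76).
```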

  9. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    PubMed

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-01-01

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS systems is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor-intensive and fallible hand comparisons. PMID:24257281
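
    The comparison step can be sketched in a few lines: the same plan exported from the two TMS versions is flattened to parameter/value pairs and the differences are reported. The element names and values below are invented; real plan documents have far more structure:

```python
import xml.etree.ElementTree as ET

# Hypothetical exports of the same plan from the old and new databases.
OLD = "<plan><dose>200</dose><fractions>30</fractions><energy>6MV</energy></plan>"
NEW = "<plan><dose>200</dose><fractions>28</fractions><energy>6MV</energy></plan>"

def params(xml_text):
    """Flatten a plan document into a {parameter: value} map."""
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

def diff_plans(a_text, b_text):
    """Report parameters whose values differ between the two exports."""
    a, b = params(a_text), params(b_text)
    return {k: (a.get(k), b.get(k)) for k in sorted(set(a) | set(b))
            if a.get(k) != b.get(k)}

print(diff_plans(OLD, NEW))  # {'fractions': ('30', '28')}
```

Run over every active plan in batch, this yields exactly the kind of pass/fail migration report the abstract describes.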

  11. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Disciplined specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycles studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  12. FEMA (Federal Emergency Management Agency) database requirements assessment and resource directory model. Final report 24 Aug 81-15 May 82

    SciTech Connect

    Tenopir, C.; Williams, M.E.

    1982-05-01

    Word-oriented databases (bibliographic, textual, directory, etc.) relevant to various units within the Federal Emergency Management Agency are identified and those of most potential relevance are analyzed. Subject profiles reflecting the interests of each major FEMA unit were developed and tested online on fifteen publicly available databases. The databases were then ranked by the number of citations pertinent to all aspects of emergency management and the number of pertinent citations per year of database coverage. Sample citations from the fifteen databases are included. A model Directory of Databases pertinent to emergency management was developed.

  13. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as blobs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these blobs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available.
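
    The partitioning step common to both strategies, splitting one large dataset into many sub-files tracked by an index table, can be sketched as follows. Tile sizes and the index layout are illustrative, not taken from either system:

```python
def tile_index(width, height, tile):
    """Yield (tile_id, x0, y0, x1, y1) index entries for a 2-D raster
    partitioned into fixed-size tiles; each tile would then be stored
    as a database BLOB or as a separate HDFS file."""
    tid = 0
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (tid, x, y, min(x + tile, width), min(y + tile, height))
            tid += 1

index = list(tile_index(width=1000, height=500, tile=256))
print(len(index))   # 8 tiles: 4 across x, 2 across y
print(index[0])     # (0, 0, 0, 256, 256)
```

Subsetting a region then reduces to an index lookup for the overlapping tiles, whether the tile bodies live in BLOB columns or in a distributed filesystem.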

  14. Information flow in the DAMA Project beyond database managers: Information flow managers

    SciTech Connect

    Russell, L.; Wolfson, O.; Yu, C.

    1996-03-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point-of-sale information, is being considered in the Demand Activated Manufacturing Project of the American Textile Partnership project. A scenario is examined in which 100,000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26,000 suppliers through the use of bill-of-materials explosions at four levels of detail. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced to keep estimates of demand as current as possible.
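
    The bill-of-materials explosion that propagates retail demand down the supply chain can be sketched as follows. The product structure, quantities, and item names here are invented for illustration and are not from the DAMA Project.

```python
# Minimal sketch of a multi-level bill-of-materials (BOM) explosion:
# top-level point-of-sale demand is propagated level by level, accumulating
# the total requirement for every item in the chain.
from collections import defaultdict

# bom[product] -> list of (component, units of component per unit of product)
bom = {
    "shirt": [("fabric_panel", 4), ("button", 7)],
    "fabric_panel": [("finished_fabric", 1.2)],   # factor includes cutting waste
    "finished_fabric": [("yarn", 3.0)],
}

def explode(demand: dict) -> dict:
    """Propagate demand through every BOM level and return total
    requirements per item."""
    total = defaultdict(float)
    frontier = dict(demand)
    while frontier:
        nxt = defaultdict(float)
        for item, qty in frontier.items():
            total[item] += qty
            for comp, per_unit in bom.get(item, []):
                nxt[comp] += qty * per_unit
        frontier = nxt
    return dict(total)

req = explode({"shirt": 1000})
# 1000 shirts need 4000 panels and 7000 buttons; the panels in turn
# need 4800 units of finished fabric, which need 14400 units of yarn.
```

    An information flow manager in the abstract's sense would rerun such an explosion incrementally as fresh point-of-sale estimates arrive, approximating where suppliers fail to report.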

  15. Making the procedure manual come alive: A prototype relational database and dynamic website model for the management of nursing information.

    PubMed

    Peace, Jane; Brennan, Patricia Flatley

    2006-01-01

    The nursing procedural manual is an essential resource for clinical practice, yet ensuring its currency and availability at the point of care remains an unresolved information management challenge for nurses. While standard HTML-based web pages offer significant advantage over paper compilations, employing emerging computer science tools offers even greater promise. This paper reports on the creation of a prototypical dynamic web-based nursing procedure manual driven by a relational database. We created a relational database in MySQL to manage, store, and link the procedure information, and developed PHP files to guide content retrieval, content management, and display on demand in browser-viewable format. This database-driven dynamic website model is an important innovation to meet the challenge of content management and dissemination of nursing information.
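
    The database-driven model described above, normalized procedure content assembled into a page on demand, can be sketched with a toy schema. Here sqlite3 stands in for MySQL and a Python function for the PHP layer; the table design and sample content are illustrative guesses, not the authors' actual schema.

```python
# Sketch: procedures and their ordered steps stored relationally, with a
# render function that assembles a browser-ready document on demand.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE procedure (
    id INTEGER PRIMARY KEY,
    title TEXT NOT NULL,
    last_reviewed TEXT      -- supports the currency requirement
);
CREATE TABLE step (
    procedure_id INTEGER REFERENCES procedure(id),
    seq INTEGER,
    instruction TEXT,
    PRIMARY KEY (procedure_id, seq)
);
""")
db.execute("INSERT INTO procedure VALUES (1, 'IV insertion', '2006-01-01')")
db.executemany("INSERT INTO step VALUES (1, ?, ?)",
               [(1, "Verify order and identify patient."),
                (2, "Perform hand hygiene."),
                (3, "Apply tourniquet and select site.")])

def render(procedure_id: int) -> str:
    """Assemble one procedure as simple HTML, as a PHP page might."""
    (title,) = db.execute("SELECT title FROM procedure WHERE id = ?",
                          (procedure_id,)).fetchone()
    steps = db.execute("SELECT seq, instruction FROM step "
                       "WHERE procedure_id = ? ORDER BY seq",
                       (procedure_id,)).fetchall()
    items = "".join(f"<li>{text}</li>" for _, text in steps)
    return f"<h1>{title}</h1><ol>{items}</ol>"

html = render(1)
```

    Because the page is generated at request time, editing one `step` row updates every future view, which is the currency advantage over static HTML the abstract points to.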

  16. [General-purpose microcomputer for medical laboratory instruments].

    PubMed

    Vil'ner, G A; Dudareva, I E; Kurochkin, V E; Opalev, A A; Polek, A M

    1984-01-01

    The paper presents a microcomputer based on the KP580 microprocessor set. Debugging of the hardware and software using a unique debugging stand, developed on the basis of the "Electronica-60" microcomputer, is discussed.

  17. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit. Much of that effort goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  18. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    NASA Technical Reports Server (NTRS)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.

  19. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    NASA Astrophysics Data System (ADS)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
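
    The kind of secondary geometric calculation the abstract refers to (for example, the area enclosed by a stored contour, which PostGIS would compute with a built-in function such as ST_Area) can be illustrated in plain Python with the shoelace formula. The contour below is a made-up axial slice, not clinical data.

```python
# Sketch: derive enclosed area from DICOM-RT-style contour vertices using
# the signed-area (shoelace) formula over a closed polygon.

def polygon_area(points):
    """Return the absolute area enclosed by a closed contour given as a
    list of (x, y) vertices."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]   # wrap around to close the contour
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

contour = [(0, 0), (40, 0), (40, 30), (0, 30)]  # mm; a 40 x 30 rectangle
area_mm2 = polygon_area(contour)
```

    The point of the PostGIS paradigm is that such routines need not be reimplemented client-side at all: the geometry lives in the database, and the query engine performs the calculation without shipping the full record to the application.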

  20. The DoD Gateway Information System (DGIS): The DoD Microcomputer User's Gateway to the World.

    ERIC Educational Resources Information Center

    Kuhn, Allan D.; Cotter, Gladys A.

    1988-01-01

    Describes the U.S. Department of Defense (DoD) Gateway Information System, which provides communications capabilities and access to online databases for DoD microcomputer end-users. Functions, structure, development, and artificial intelligence applications of the system are discussed. (11 references) (MES)

  1. MAGIC-SPP: a database-driven DNA sequence processing package with associated management tools

    PubMed Central

    Liang, Chun; Sun, Feng; Wang, Haiming; Qu, Junfeng; Freeman, Robert M; Pratt, Lee H; Cordonnier-Pratt, Marie-Michèle

    2006-01-01

    Background Processing raw DNA sequence data is an especially challenging task for relatively small laboratories and core facilities that produce as many as 5000 or more DNA sequences per week from multiple projects in widely differing species. To meet this challenge, we have developed the flexible, scalable, and automated sequence processing package described here. Results MAGIC-SPP is a DNA sequence processing package consisting of an Oracle 9i relational database, a Perl pipeline, and user interfaces implemented either as JavaServer Pages (JSP) or as a Java graphical user interface (GUI). The database not only serves as a data repository, but also controls processing of trace files. MAGIC-SPP includes an administrative interface, a laboratory information management system, and interfaces for exploring sequences, monitoring quality control, and troubleshooting problems related to sequencing activities. Its sequence trimming algorithm employs new features designed to improve performance with respect to concerns such as concatenated linkers, identification of the expected start position of a vector insert, and extending the useful length of trimmed sequences by bridging short regions of low quality when the following high-quality segment is sufficiently long to justify doing so. Conclusion MAGIC-SPP has been designed to minimize human error, while simultaneously being robust, versatile, flexible and automated. It offers a unique combination of features that permit administration by a biologist with little or no informatics background. It is well suited to both individual research programs and core facilities. PMID:16522212
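
    The bridging idea in the trimming algorithm, keeping a read past a short low-quality dip when a long high-quality stretch follows, can be sketched as below. The thresholds, window sizes, and function names are invented for illustration and are not MAGIC-SPP's actual parameters.

```python
# Sketch: quality trimming that bridges short low-quality runs when the
# following high-quality segment is long enough to justify doing so.

QUAL_MIN = 20      # per-base quality threshold (assumed)
MAX_BRIDGE = 5     # longest low-quality run we will bridge (assumed)
MIN_RESCUE = 20    # the high-quality run after a dip must be this long

def trim_end(quals):
    """Return the trimmed length of a read given per-base quality scores."""
    end = 0
    i = 0
    while i < len(quals):
        if quals[i] >= QUAL_MIN:
            i += 1
            end = i
            continue
        # measure the low-quality run and the high-quality run after it
        j = i
        while j < len(quals) and quals[j] < QUAL_MIN:
            j += 1
        k = j
        while k < len(quals) and quals[k] >= QUAL_MIN:
            k += 1
        if j - i <= MAX_BRIDGE and k - j >= MIN_RESCUE:
            i = j          # bridge the dip and keep scanning
        else:
            break          # the dip ends the useful read
    return end

good, dip = [30] * 50, [10] * 3
quals = good + dip + [30] * 25     # short dip, long rescue: bridged
assert trim_end(quals) == 78
assert trim_end(good + dip + [30] * 5) == 50   # rescue too short: cut here
```

    A production trimmer would also handle vector and linker sequence and use windowed rather than per-base thresholds, but the bridging decision has the same shape.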

  2. Developing genomic knowledge bases and databases to support clinical management: current perspectives.

    PubMed

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward.

  3. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program/project's risk management processes. This presentation will briefly cover the standard risk management procedures, then thoroughly cover NASA's Risk Management tool called ePORT. This electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. Covering the full risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  4. Microcomputer Applications for Library Instruction: Automation of Test and Assignment Scoring, and Student Record Keeping.

    ERIC Educational Resources Information Center

    Sugranes, Maria R.; Snider, Larry C.

    1985-01-01

    Describes the development of an automated library instruction records management system using microcomputer technology. Development described includes assessment of need, exploration of options, system design, and operational development. System products are identified and operational results are reported based on actual system performance.…

  5. Multiple on-line data collection and processing for radioimmunoassay using a micro-computer system.

    PubMed Central

    Carter, N W; Davidson, D; Lucas, D F; Griffiths, P D

    1980-01-01

    A micro-computer system is described which has been designed to perform on-line data capture from up to seven radioisotope counters of different types in parallel with interactive results processing and subsequent transmission to a laboratory computer-based data management system. PMID:7400348

  6. Instructional Uses of Microcomputers: The Why, What, and How of the B. C. Approach.

    ERIC Educational Resources Information Center

    Daneliuk, Carl; Wright, Annette

    1981-01-01

    Through cooperation among a number of provincial agencies to facilitate centralized management, coordination, and implementation, 100 microcomputers have been placed in British Columbia public school classrooms. Project rationale, psychological barriers, inservice field support for teachers, hardware and software acquisition and testing, and…

  7. Using Microcomputers To Apply Statewide Standards for Schools and School Systems: Technological Changes over Five Years.

    ERIC Educational Resources Information Center

    Wu, Yi-Cheng; Hebbler, Stephen W.

    The Evaluation and Assessment Laboratory at the University of Alabama (Tuscaloosa) has contracted with the Georgia Department of Education (GDOE) to develop a microcomputer-based data management system for use in applying evaluation standards to schools and school systems. The Comprehensive Evaluation System (CES) was implemented statewide and has…

  8. Microcomputers in Agriculture. A Resource Guide for California Community College Faculty in Agriculture & Natural Resources. Update.

    ERIC Educational Resources Information Center

    California Community Colleges, Sacramento. Office of the Chancellor.

    This resource guide contains descriptions of microcomputer programs that are suitable for use in community college courses in agriculture and natural resources. Product descriptions are organized according to the following subject areas: agricultural business, animal production, farm mechanics, farm management, forestry and natural resources,…

  9. The Power of the New Microcomputers: Challenge and Opportunity.

    ERIC Educational Resources Information Center

    Rumsey, Eric

    1990-01-01

    Describes current developments in microcomputer technology, including CD-ROM, the 80386 microprocessor, and the graphic user interface. It is argued that, as the information handling capacity of microcomputers increases, the microcomputer industry will increasingly market information to end users, and the library will be challenged to provide the…

  10. Exceptional Children and Microcomputers, A Survey of Public School Applications.

    ERIC Educational Resources Information Center

    Crowner, T. Timothy

    Telephone interviews on the use of microcomputers in special education were administered to personnel in 25 districts (drawn from a random sample of the largest school districts in the United States) on the following topics: coordination of microcomputers; numbers of microcomputers owned and used; pragmatic uses; funding; brands used; selection,…

  11. Microcomputer-Administered Research: What it Means for Educational Researchers.

    ERIC Educational Resources Information Center

    Johnson, Craig W.

    1982-01-01

    The development of the microcomputer offers advantages for behavioral science and educational research. Among the potentials of microcomputer use are enhanced capacities for precise replication and extension, measurement, and experimental control and randomization, and the relative economy that microcomputer-administered experimental treatments…

  12. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  13. Fabricating a Microcomputer on a Single Silicon Wafer

    NASA Technical Reports Server (NTRS)

    Evanchuk, V. L.

    1983-01-01

    Concept for "microcomputer on a slice" reduces microcomputer costs by eliminating scribing, wiring, and packaging of individual circuit chips. Low-cost microcomputer on silicon slice contains redundant components. All components, the central processing unit, input/output circuitry, read-only memory, and random-access memory (CPU, I/O, ROM, and RAM), are placed on a single silicon wafer.

  14. On the evaluation of fuzzy quantified queries in a database management system

    NASA Technical Reports Server (NTRS)

    Bosc, Patrick; Pivert, Olivier

    1992-01-01

    Many propositions to extend database management systems have been made in the last decade. Some of them aim at the support of a wider range of queries involving fuzzy predicates. Unfortunately, these queries are somewhat complex and the question of their efficiency is a subject under discussion. In this paper, we focus on a particular subset of queries, namely those using fuzzy quantified predicates. More precisely, we will consider the case where such predicates apply to individual elements as well as to sets of elements. Thanks to some interesting properties of alpha-cuts of fuzzy sets, we are able to show that the evaluation of these queries can be significantly improved with respect to a naive strategy based on exhaustive scans of sets or files.
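
    The alpha-cut optimization the abstract alludes to can be illustrated with a toy query: to retrieve tuples whose fuzzy-predicate satisfaction is at least alpha, it suffices to evaluate a crisp range condition (the alpha-cut) instead of scanning and scoring every tuple. The membership function and data below are invented for illustration.

```python
# Sketch: evaluating "age is young to degree >= alpha" via the alpha-cut of
# the fuzzy predicate, which reduces to a crisp, index-friendly range test.

def mu_young(age):
    """Fuzzy predicate 'young': 1 below 25, 0 above 45, linear between."""
    if age <= 25:
        return 1.0
    if age >= 45:
        return 0.0
    return (45 - age) / 20.0

def alpha_cut_bound(alpha):
    """Invert mu_young: ages in the alpha-cut satisfy age <= this bound."""
    return 45 - 20 * alpha

employees = [("ann", 23), ("bob", 31), ("cam", 44), ("dan", 52)]
alpha = 0.5

# Naive strategy: evaluate the membership function for every tuple.
naive = {name for name, age in employees if mu_young(age) >= alpha}

# Alpha-cut strategy: one crisp comparison an ordinary index could answer.
bound = alpha_cut_bound(alpha)
cut = {name for name, age in employees if age <= bound}

assert naive == cut == {"ann", "bob"}
```

    The gain in a real DBMS comes from the second form: a crisp range predicate can use access paths and avoid exhaustive scans, which is the improvement over the naive strategy the abstract describes.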

  15. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  16. Outcome Management in Cardiac Surgery Using the Society of Thoracic Surgeons National Database.

    PubMed

    Halpin, Linda S; Gallardo, Bret E; Speir, Alan M; Ad, Niv

    2016-09-01

    Health care reform has helped streamline patient care and reimbursement by encouraging providers to provide the best outcome for the best value. Institutions with cardiac surgery programs need a methodology to monitor and improve outcomes linked to reimbursement. The Society of Thoracic Surgeons National Database (STSND) is a tool for monitoring outcomes and improving care. This article identifies the purpose, goals, and reporting system of the STSND and ways these data can be used for benchmarking, linking outcomes to the effectiveness of treatment, and identifying factors associated with mortality and complications. We explain the methodology used at Inova Heart and Vascular Institute, Falls Church, Virginia, to perform outcome management by using the STSND and address our performance-improvement cycle through discussion of data collection, analysis, and outcome reporting. We focus on the revision of clinical practice and offer examples of how patient outcomes have been improved using this methodology. PMID:27568532

  18. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-03-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complexwide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and the level and type of current on-site treatment at Department of Energy installations.

  19. Microcomputer Simulation of Enzyme Kinetic Behaviour.

    ERIC Educational Resources Information Center

    Gill, R. A.

    1984-01-01

    Describes a program which simulates the kinetic behavior of a "typical" enzyme. Program objectives, background to the kinetic model used in the simulation, major program features, typical results obtained, and a note on the availability of the program (written in BASIC for Commodore microcomputer) are included. (JN)

  20. A Modular System of Interfacing Microcomputers.

    ERIC Educational Resources Information Center

    Martin, Peter

    1983-01-01

    Describes a system of interfacing allowing a range of signal conditioning and control modules to be connected to microcomputers, enabling execution of such experiments as: examining rate of cooling; control by light-activated switch; pH measurements; control frequency of signal generators; and making automated measurements of frequency response of…

  1. Sheet Music Index on a Microcomputer.

    ERIC Educational Resources Information Center

    Carter, Nancy F.

    1983-01-01

    Describes procedures followed in the use of a microcomputer to develop an index to a collection of vocal sheet music at the Music Library of the University of Colorado. Indexes generated using four fields--title, composer's last name, lyricist's last name, and first line--are noted. (EJS)

  2. Microcomputers: Tools for Developing Technological Literacy.

    ERIC Educational Resources Information Center

    Liao, Thomas T.

    1983-01-01

    Describes a course in which undergraduate students learn to program microcomputers while learning about their applications and ramifications. Descriptions of software developed for the course are also provided. These include yellow light (traffic flow), domestic electrical energy use/cost, water pollution, and supermarket automation. (CN)

  3. Computense: Verb Drills on a Microcomputer.

    ERIC Educational Resources Information Center

    Dolphin, Emil

    1989-01-01

    A microcomputer program providing extensive and constantly changing practice with 35 irregular French verbs in 6 tenses is described. The program's objectives are to establish lateral associations between tense forms, provide continuous opportunity for recall, and shorten correct response time. (Author/MSE)

  4. Evaluation of Five Microcomputer CAD Packages.

    ERIC Educational Resources Information Center

    Leach, James A.

    1987-01-01

    Discusses the similarities, differences, advanced features, applications and number of users of five microcomputer computer-aided design (CAD) packages. Included are: "AutoCAD (V.2.17)"; "CADKEY (V.2.0)"; "CADVANCE (V.1.0)"; "Super MicroCAD"; and "VersaCAD Advanced (V.4.00)." Describes the evaluation of the packages and makes recommendations for…

  5. Microcomputers: An Available Technology for Special Education.

    ERIC Educational Resources Information Center

    Joiner, Lee Marvin; And Others

    1980-01-01

    The article describes the capabilities and features of basic microcomputer systems and describes special education applications: computer assisted instruction, prosthesis, testing, communication, and enhancing personal relations. Problems such as the availability of authoring languages, high quality educational software, and computer safety are…

  6. Microcomputer Peripheral Service Technician. Teacher Edition.

    ERIC Educational Resources Information Center

    Brown, A. O., III; Fulkerson, Dan, Ed.

    This manual is the third of a three-text microcomputer service and repair series. This text is designed to assist instructors in teaching service and repair procedures for floppy disk drives, printers, and monitors. The manual contains five units. Each instructional unit includes some or all of these basic components: performance objectives,…

  7. Basic Microcomputer Service Technician. Teacher Edition.

    ERIC Educational Resources Information Center

    Brown, A. O., III; Fulkerson, Dan, Ed.

    This manual is the first of three manuals for teaching repair skills to entry-level microcomputer service technicians. Although it focuses on basic computer repair skills, it also highlights the people skills needed by service providers. The manual contains 11 units. Each instructional unit includes some or all of these basic components:…

  8. Micro-Computers in Biology Inquiry.

    ERIC Educational Resources Information Center

    Barnato, Carolyn; Barrett, Kathy

    1981-01-01

    Describes the modification of computer programs (BISON and POLLUT) to accommodate species and areas indigenous to the Pacific Coast area. Suggests that these programs, suitable for PET microcomputers, may foster a long-term, ongoing, inquiry-directed approach in biology. (DS)

  9. Issues and Concerns in Special Education Microcomputing.

    ERIC Educational Resources Information Center

    Maddux, Cleborne D.

    1986-01-01

    Discussion of microcomputer use in public elementary schools focuses on the field of special education. Two main ways of using computers are described: (1) traditional uses, including administration, computer-assisted instruction, and assessment; and (2) new applications, including programming, word processing, simulations, and prosthetic aids…

  10. Computer-Assisted Microcomputer Preventive Maintenance.

    ERIC Educational Resources Information Center

    Ives, David J.

    1993-01-01

    Proposes a solution to the problem of preventive microcomputer maintenance in libraries. The strategy involves a batch program, small paper form, simple WordPerfect macro, and on-diskette log form which saves time, quickly generates standard reports, increases efficiency, eliminates most paperwork, and offers the flexibility for modification and…

  11. Microcomputer Listens to the Coefficient of Restitution.

    ERIC Educational Resources Information Center

    Smith, P. A.; And Others

    1981-01-01

    Describes a procedure for determining the coefficient of restitution using a microcomputer which collects and sends data to a large computer where analysis is done and graphical output is generated. The data collection hardware and software are described, and results are illustrated. (Author/SK)

  12. Microcomputers for Libraries: Features, Descriptions, Evaluations.

    ERIC Educational Resources Information Center

    Warden, William H.; Warden, Bette M.

    1983-01-01

    Discusses major features of microcomputers (word length, memory, size, disk size and capacity, central processing unit speed, peripherals, software) and provides descriptions of 19 models. Comparative analysis on basis of cost, software support, machine capacity, storage capacity, and types of peripherals is presented. A 173-item bibliography is…

  13. Microcomputers in Education: Why Is Earlier Better?

    ERIC Educational Resources Information Center

    Cuffaro, Harriet K.

    1984-01-01

    Microcomputers are not necessarily a desirable teaching/learning tool for young children. Learning styles of the preschool child are not often compatible with computer assisted instruction techniques. An examination of the types of available programing activities and software is presented. (DF)

  14. Microcomputer Software for Teaching German: An Evaluation.

    ERIC Educational Resources Information Center

    Cornick, Lisa

    This report examines the strengths and weaknesses of the following 12 microcomputer programs: (1) Language Teacher Series: TRS-80; (2) Language Teacher Series: Atari; (3) Apfeldeutsch; (4) Author I; (5) Dasher; (6) The Definite Article; (7) Flashcard; (8) German Packages I, II, and III; (9) German Vocabulary Builder; (10) The Linguist; (11)…

  15. The Impact of Microcomputers on Composition Students.

    ERIC Educational Resources Information Center

    Hocking, Joan

    To determine whether computer assisted instruction was just a fad or a viable alternative to traditional methods for teaching English composition, a microcomputer was used in a traditional college freshman English course. The class was divided into small groups: some went to the computer lab, while others worked in the classroom. Interactive…

  16. Developing Resource Support for Educators Using Microcomputers.

    ERIC Educational Resources Information Center

    Martinez, Carole

    This working paper documents the efforts of a professional information center (PIC) in Denver to provide resources that aid teachers and administrators in selecting software programs for varied microcomputer uses. The report outlines the development of PIC as a coordinator and clearinghouse for organizing and disseminating information on computers…

  17. A Microcomputer-Controlled Measurement of Acceleration.

    ERIC Educational Resources Information Center

    Crandall, A. Jared; Stoner, Ronald

    1982-01-01

    Describes apparatus and method used to allow rapid and repeated measurement of acceleration of a ball rolling down an inclined plane. Acceleration measurements can be performed in an hour with the apparatus interfaced to a Commodore PET microcomputer. A copy of the BASIC program is available from the authors. (Author/JN)

  18. Microcomputer-Based Programs for Pharmacokinetic Simulations.

    ERIC Educational Resources Information Center

    Li, Ronald C.; And Others

    1995-01-01

    Microcomputer software that simulates drug-concentration time profiles based on user-assigned pharmacokinetic parameters such as central volume of distribution, elimination rate constant, absorption rate constant, dosing regimens, and compartmental transfer rate constants is described. The software is recommended for use in undergraduate…

  19. Scheduling Software for MS-DOS Microcomputers.

    ERIC Educational Resources Information Center

    Carlson, David H.; Prior, Barbara

    1991-01-01

    Identifies four microcomputer-based software packages for scheduling and evaluates their usefulness for scheduling employees in a library setting. Evaluation criteria are applied to (1) Schedule Master, from Schedule Master Corporation; (2) Schedule Plus, from Cyclesoft, Inc.; (3) Who Works When, from Newport Systems; and (4) Working Hours, from…

  20. The Hidden Costs of Owning a Microcomputer.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Before purchasing computer hardware, individuals must consider the costs associated with the setup and operation of a microcomputer system. Included among the initial costs of purchasing a computer are the costs of the computer, one or more disk drives, a monitor, and a printer as well as the costs of such optional peripheral devices as a plotter…

  1. RESPSYST: An Interactive Microcomputer Program for Education.

    ERIC Educational Resources Information Center

    Boyle, Joseph

    1985-01-01

    RESPSYST is a computer program (written in BASICA and using MS-DOS/PC-DOS microcomputers) incorporating more than 20 of the factors that determine gas transport by the cardio-respiratory system. The five-part program discusses most of these factors, provides several question/answer sections, and relies heavily on graphics to demonstrate…

  2. Microcomputer Courseware: Characteristics and Design Trends.

    ERIC Educational Resources Information Center

    Bialo, Ellen R.; Erickson, Lisa B.

    A total of 163 microcomputer programs evaluated by the Educational Products Information Exchange (EPIE) Institute through December 1983 were examined in order to identify strengths and weaknesses in instructional and technical design. Programs were evaluated in a variety of areas including the arts, business education, computer languages, computer…

  3. [Cystic Fibrosis Cloud database: An information system for storage and management of clinical and microbiological data of cystic fibrosis patients].

    PubMed

    Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra

    2016-01-01

    The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and microorganisms isolated from them. In this work we have designed and developed an on-line database based on an information system that allows users to store, manage and visualize data from clinical studies and microbiological analyses of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named the Cystic Fibrosis Cloud database, is available at http://servoy.infocomsa.com/cfc_database and is composed of a main database and a web-based interface, which uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients. PMID:26895996

  5. Karlsruhe Database for Radioactive Wastes (KADABRA) - Accounting and Management System for Radioactive Waste Treatment - 12275

    SciTech Connect

    Himmerkus, Felix; Rittmeyer, Cornelia

    2012-07-01

    The data management system KADABRA was designed according to the purposes of the Central Decontamination Department (HDB) of the Wiederaufarbeitungsanlage Karlsruhe Rueckbau- und Entsorgungs-GmbH (WAK GmbH), which is specialized in the treatment and conditioning of radioactive waste. The layout considers the major treatment processes of the HDB as well as regulatory and legal requirements. KADABRA is designed as an SAG ADABAS application on an IBM System z mainframe. The main function of the system is the data management of all processes related to treatment, transfer and storage of radioactive material within HDB. KADABRA records the relevant data concerning radioactive residues, interim products and waste products, as well as the production parameters relevant for final disposal. Analytical data from the laboratory and from nondestructive assay systems, which describe the chemical and radiological properties of residues, production batches, interim products and final waste products, can be linked to the respective dataset for documentation and declaration. The system enables the operator to trace the radioactive material through processing and storage. Information on the actual status of the material, as well as radiological data and storage position, can be obtained immediately on request. A variety of programs accessing the database allow the generation of individual reports on periodic or special request. KADABRA offers a high security standard and is constantly adapted to the current requirements of the organization. (authors)

  6. Planning for End-User Database Searching: Drexel and the Mac: A User-Consistent Interface.

    ERIC Educational Resources Information Center

    LaBorie, Tim; Donnelly, Leslie

    Drexel University instituted a microcomputing program in 1984 which required all freshmen to own Apple Macintosh microcomputers. All students were taught database searching on the BRS (Bibliographic Retrieval Services) system as part of the freshman humanities curriculum, and the university library was chosen as the site to house continuing…

  7. PARPs database: A LIMS system for protein-protein interaction data mining and laboratory information management

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  8. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    EPA Science Inventory

    Managing the world’s largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a hierarchi...

  9. A Web-Based Multi-Database System Supporting Distributed Collaborative Management and Sharing of Microarray Experiment Information

    PubMed Central

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database Web based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all multidisciplinary actors involved in spotted microarray experiments. PMID:17238488

  10. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    PubMed

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolutionary rates among pairs of nucleotide positions using phylogenetic and evolutionary relationships of the organisms of aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger and with 50% better sensitivity than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534
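
    The abstract does not reproduce the scoring details, but a standard starting point for covariation analysis of aligned sequences is the mutual information between two alignment columns. The sketch below shows that textbook score only; the published method additionally weights events by the phylogenetic and evolutionary relationships of the organisms and runs against a relational schema in SQL Server, none of which is shown here.

```python
# Hedged sketch: mutual information between two alignment columns,
# M(i,j) = sum_{x,y} f(x,y) * log2( f(x,y) / (f(x) * f(y)) ).
# Illustrative only; not the study's SQL Server implementation.
from collections import Counter
from math import log2

def mutual_information(col_i, col_j):
    """col_i, col_j: strings of aligned nucleotides at two positions."""
    n = len(col_i)
    fi, fj = Counter(col_i), Counter(col_j)
    fij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (x, y), c in fij.items():
        pxy = c / n
        mi += pxy * log2(pxy / ((fi[x] / n) * (fj[y] / n)))
    return mi

if __name__ == "__main__":
    # Perfectly covarying columns (e.g. a conserved base pair swapping
    # A-U for G-C) carry 1 bit of mutual information here.
    print(mutual_information("AAGG", "UUCC"))
```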

  12. Developing a comprehensive database management system for organization and evaluation of mammography datasets.

    PubMed

    Wu, Yirong; Rubin, Daniel L; Woods, Ryan W; Elezaby, Mai; Burnside, Elizabeth S

    2014-01-01

    We aimed to design and develop a comprehensive mammography database system (CMDB) to collect clinical datasets for outcome assessment and development of decision support tools. A Health Insurance Portability and Accountability Act (HIPAA) compliant CMDB was created to store multi-relational datasets of demographic risk factors and mammogram results using the Breast Imaging Reporting and Data System (BI-RADS) lexicon. The CMDB collected both biopsy pathology outcomes, in a breast pathology lexicon compiled by extending BI-RADS, and our institutional breast cancer registry. The audit results derived from the CMDB were in accordance with Mammography Quality Standards Act (MQSA) audits and national benchmarks. The CMDB has managed the challenges of multi-level organization demanded by the complexity of mammography practice and lexicon development in pathology. We foresee that the CMDB will be useful for efficient quality assurance audits and development of decision support tools to improve breast cancer diagnosis. Our procedure of developing the CMDB provides a framework to build a detailed data repository for breast imaging quality control and research, which has the potential to augment existing resources.

  14. Facilitating the nurse practitioner's research role: using a microcomputer for data entry in clinical settings.

    PubMed

    Matteson, P S; Hawkins, J W

    1993-01-01

    Data coding and entry can be a tedious and error-prone component of research. Statistical packages for a microcomputer are now available that enable the researcher to create data entry forms in a spreadsheet format. In this article the use of software that can be used for data entry, manipulation, and analysis is discussed. In a practice setting these features are particularly useful, because the nurse practitioner can take the microcomputer to the clinical facility and enter data directly from client-provider interactions or from records. In clinical settings where client data are entered directly into a computer database as part of assessment and caregiving, data can be downloaded into this or similar programs for manipulation and analysis.

  15. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  16. Printed circuit board layout by microcomputer

    NASA Astrophysics Data System (ADS)

    Krausman, E. W.

    1983-12-01

    Printed circuit board artwork is usually prepared manually because of the unavailability of computer-aided-design tools. This thesis presents the design of a microcomputer-based printed circuit board layout system that is easy to use and inexpensive. Automatic routing and component placement routines will significantly speed up the process. The design satisfies the following requirements: microcomputer implementation, portable, algorithm independent, interactive, and user friendly. When it is fully implemented, a user will be able to select components and a board outline from an automated catalog, enter a schematic diagram, position the components on the board, and completely route the board from a single graphics terminal. Currently, the user interface and the outer-level command processor have been implemented in Pascal. Future versions will be written in C for better portability.

  17. Versatile microcomputer-based temperature controller

    SciTech Connect

    Yarberry, V.R.

    1980-09-01

    The wide range of thermal responses required in laboratory and scientific equipment requires a temperature controller with a great deal of flexibility. While a number of analog temperature controllers are commercially available, they have certain limitations, such as inflexible parameter control or insufficient precision. Most lack digital interface capabilities--a necessity when the temperature controller is part of a computer-controlled automatic data acquisition system. We have developed an extremely versatile microcomputer-based temperature controller to fulfill this need in a variety of equipment. The control algorithm used allows optimal tailoring of parameters to control overshoot, response time, and accuracy. This microcomputer-based temperature controller can be used as a standalone instrument (with a teletype used to enter parameters), or it can be integrated into a data acquisition system (with a computer used to pass parameters by way of an IEEE-488 instrumentation bus).
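
    The abstract does not publish the control algorithm, but a tunable loop of the kind described (parameters governing overshoot, response time, and accuracy) is commonly a PID controller. The following sketch is generic and illustrative; the gains and the toy first-order plant are invented, not taken from the instrument.

```python
# Generic PID step (assumed, not the instrument's published algorithm):
# output = Kp*e + Ki*integral(e) + Kd*de/dt, with state carried between calls.

def pid_step(setpoint, measured, state, kp, ki, kd, dt):
    error = setpoint - measured
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

if __name__ == "__main__":
    # Toy first-order plant: temperature rises with heater power,
    # leaks toward a 20 C ambient. All constants are invented.
    temp, state = 20.0, {"integral": 0.0, "prev_error": 0.0}
    for _ in range(200):
        power = max(0.0, pid_step(100.0, temp, state,
                                  kp=2.0, ki=0.5, kd=0.1, dt=0.1))
        temp += (0.05 * power - 0.02 * (temp - 20.0)) * 0.1
    print(round(temp, 1))  # temperature after the simulated run
```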

  18. Scheduling nursing personnel on a microcomputer.

    PubMed

    Liao, C J; Kao, C Y

    1997-01-01

    Suggests that with the shortage of nursing personnel, hospital administrators have to pay more attention to the needs of nurses to retain and recruit them. Also asserts that improving nurses' schedules is one of the most economic ways for the hospital administration to create a better working environment for nurses. Develops an algorithm for scheduling nursing personnel. Contrary to the current hospital approach, which schedules nurses on a person-by-person basis, the proposed algorithm constructs schedules on a day-by-day basis. The algorithm has inherent flexibility in handling a variety of possible constraints and goals, similar to other non-cyclical approaches. But, unlike most other non-cyclical approaches, it can also generate a quality schedule in a short time on a microcomputer. The algorithm was coded in C language and run on a microcomputer. The developed software is currently implemented at a leading hospital in Taiwan. The response to the initial implementation is quite promising. PMID:10167872
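
    As a toy illustration of the day-by-day idea (not the published algorithm, whose constraints and goals are richer), one can fill each day's demand from the nurses with the fewest shifts so far:

```python
# Illustrative day-by-day scheduler: for each day, assign the required
# number of nurses, picking those with the fewest shifts to date
# (a simple fairness heuristic; names and demands are invented).

def schedule(nurses, demand_per_day):
    load = {n: 0 for n in nurses}
    roster = []
    for demand in demand_per_day:
        # Least-loaded first; name as tie-breaker for determinism.
        chosen = sorted(nurses, key=lambda n: (load[n], n))[:demand]
        for n in chosen:
            load[n] += 1
        roster.append(chosen)
    return roster, load

if __name__ == "__main__":
    roster, load = schedule(["Ann", "Bo", "Cy", "Di"], [2, 2, 2])
    for day, on_duty in enumerate(roster, 1):
        print(f"Day {day}: {', '.join(on_duty)}")
```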

  19. Southern African Treatment Resistance Network (SATuRN) RegaDB HIV drug resistance and clinical management database: supporting patient management, surveillance and research in southern Africa.

    PubMed

    Manasa, Justen; Lessells, Richard; Rossouw, Theresa; Naidu, Kevindra; Van Vuuren, Cloete; Goedhals, Dominique; van Zyl, Gert; Bester, Armand; Skingsley, Andrew; Stott, Katharine; Danaviah, Siva; Chetty, Terusha; Singh, Lavanya; Moodley, Pravi; Iwuji, Collins; McGrath, Nuala; Seebregts, Christopher J; de Oliveira, Tulio

    2014-01-01

    Substantial amounts of data have been generated from patient management and academic exercises designed to better understand the human immunodeficiency virus (HIV) epidemic and design interventions to control it. A number of specialized databases have been designed to manage huge data sets from HIV cohort, vaccine, host genomic and drug resistance studies. Besides databases from cohort studies, most of the online databases contain limited curated data and are thus sequence repositories. HIV drug resistance has been shown to have a great potential to derail the progress made thus far through antiretroviral therapy. Thus, a lot of resources have been invested in generating drug resistance data for patient management and surveillance purposes. Unfortunately, most of the data currently available relate to subtype B even though >60% of the epidemic is caused by HIV-1 subtype C. A consortium of clinicians, scientists, public health experts and policy makers working in southern Africa came together and formed a network, the Southern African Treatment and Resistance Network (SATuRN), with the aim of increasing curated HIV-1 subtype C and tuberculosis drug resistance data. This article describes the HIV-1 data curation process using the SATuRN Rega database. The data curation is a manual and time-consuming process done by clinical, laboratory and data curation specialists. Access to the highly curated data sets is through applications that are reviewed by the SATuRN executive committee. Examples of research outputs from the analysis of the curated data include trends in the level of transmitted drug resistance in South Africa, analysis of the levels of acquired resistance among patients failing therapy and factors associated with the absence of genotypic evidence of drug resistance among patients failing therapy. All these studies have been important for informing first- and second-line therapy. This database is a free password-protected open source database available on

  20. A Microcomputer-Based Neurophysiological Stimulator

    PubMed Central

    Halter, John

    1979-01-01

    A neurophysiological stimulator is presented which utilizes TTL hardware controlled by a microcomputer. Up to four channels of stimulation are provided, each of which consists of a TTL-Based Pulse Generator. Operating parameters are entered into the stimulator via a front panel in a format familiar to the clinician. Operating parameters may be investigated and modified at any time by another computer, thereby enabling the implementation of more complex clinical procedures.

  1. Some Applications of Microcomputers in Observatory Automation

    NASA Astrophysics Data System (ADS)

    Honeycutt, R. K.; Kephart, J. E.

    1982-06-01

    We present here some of the techniques used to automate many of the observing tasks on the 0.91-meter telescope of the Goethe Link Observatory. A description of the method used to calculate the dome position for a telescope which is mounted asymmetrically is included. We also give details of a novel autoguider. This autoguider uses a digitized television image of the star field to enable the microcomputer to generate error signals from a centroid calculation.
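
    The centroid error signal such an autoguider derives can be sketched as follows (an assumed textbook formulation, not the Goethe Link implementation): the intensity-weighted centroid of the digitized star image is compared against a reference lock position.

```python
# Sketch of the centroid idea behind such an autoguider: compute the
# intensity-weighted centroid of a digitized frame and derive X/Y
# error signals relative to a reference (lock) position.
# Frame data and lock position below are invented.

def centroid(image):
    """image: 2D list of pixel intensities; returns (x, y) centroid."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    return sx / total, sy / total

def error_signal(image, lock_x, lock_y):
    cx, cy = centroid(image)
    return cx - lock_x, cy - lock_y

if __name__ == "__main__":
    frame = [[0, 0, 0],
             [0, 4, 4],   # star drifted right of the lock point
             [0, 0, 0]]
    print(error_signal(frame, 1.0, 1.0))  # positive X error
```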

  2. Federal microcomputer software for urban hydrology

    USGS Publications Warehouse

    Jennings, Marshall E.; Smith, Roger H.; Jennings, Ross B.

    1988-01-01

    The purpose of this paper is to describe the development, availability, and general use of selected urban hydrology microcomputer software developed by: U.S. Soil Conservation Service (SCS); U.S. Army Corps of Engineers, Hydrologic Engineering Center (HEC); U.S. Environmental Protection Agency (EPA); and U.S. Geological Survey (USGS). The discussion is limited to software used for design and planning for urban stormwater flows.

  3. Accessing remote data bases using microcomputers

    PubMed Central

    Saul, Peter D.

    1985-01-01

    General practitioners' access to remote data bases using microcomputers is increasing, making even the most obscure information readily available. Some of the systems available to general practitioners in the UK are described and the methods of access are outlined. General practitioners should be aware of the advances in technology; data bases are increasing in size, the cost of access is falling and their use is becoming easier. PMID:4020756

  4. Microcomputer versus mainframe simulations: A case study

    NASA Technical Reports Server (NTRS)

    Bengtson, Neal M.

    1988-01-01

    The research was conducted in two parts. Part one consisted of a study of the feasibility of running the Space Transportation Model simulation on an office IBM-AT. The second part was to design simulation runs so as to study the effects of certain performance factors on the execution of the simulation model. The results of this research are given in the two reports which follow: Microcomputer vs. Mainframe Simulation: A Case Study and Fractional Factorial Designs of Simulation Runs for the Space Transportation System Operations Model. In the first part, a DOS batch job was written in order to simplify the execution of the simulation model on an office microcomputer. A comparison study was then performed of running the model on NASA-Langley's mainframe computer vs. running it on the IBM-AT microcomputer. This was done in order to find the advantages and disadvantages of running the model on each machine, with the objective of determining whether running on the office PC was practical. The study concluded that it was. The large number of performance parameters in the Space Transportation model precluded running the full factorial design needed to determine the most significant design factors. The second report gives several suggested fractional factorial designs which require far fewer simulation runs in order to determine which factors have significant influence on results.

  5. Microcomputers in exploration - perspective and forecast

    SciTech Connect

    Crane, D.C.

    1984-04-01

    General purpose microcomputers have become so inexpensive that powerful systems now are within the budget of a one-person company. The uninitiated have visions of a desktop machine that can do budget reports, royalty accounting, lease control, interactive modeling of the subsurface, contour maps, decision making, dart throwing, and much more. Old hands in major oil companies can see the opposite side: horror stories about expensive programming projects, machines that are overloaded and slow, and enormous staffs required to interface the working geologist with the machine. The truth about microcomputers is somewhere in between. An intelligent search for software packages will turn up a few that pay for themselves and the computer(s) they require. But buying a machine and attempting to make it fit a "grand vision" can lead to disaster. This paper presents a perspective on microcomputer capabilities, both software and hardware, now and for the next few years, and suggests a method and a schedule for small-company geologists to become involved with computers.

  6. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  7. A Parallel Relational Database Management System Approach to Relevance Feedback in Information Retrieval.

    ERIC Educational Resources Information Center

    Lundquist, Carol; Frieder, Ophir; Holmes, David O.; Grossman, David

    1999-01-01

    Describes a scalable, parallel, relational database-driven information retrieval engine. To support portability across a wide range of execution environments, all algorithms adhere to the SQL-92 standard. By incorporating relevance feedback algorithms, accuracy is enhanced over prior database-driven information retrieval efforts. Presents…

  8. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    PubMed

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the needs of managing medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of the patient; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a client/server model, was used to implement medical case and biospecimen management. This system can perform input, browsing, querying and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. The system supports management not only of long-term follow-up of individuals, but also of grouped cases organized according to the aims of research. It can improve the efficiency and quality of clinical research while biospecimens are used coordinately. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.
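
    A miniature of the case/biospecimen linkage described above can be sketched with SQLite (all table and column names here are invented for illustration; the actual system uses an MS SQL Server 2000 schema):

```python
# Hypothetical miniature of a case/biospecimen schema: one medical
# case may own many specimens, linked by case_id.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE medical_case (
    case_id   INTEGER PRIMARY KEY,
    diagnosis TEXT
);
CREATE TABLE biospecimen (
    specimen_id INTEGER PRIMARY KEY,
    case_id     INTEGER REFERENCES medical_case(case_id),
    tissue_type TEXT,
    storage_loc TEXT
);
""")
conn.execute("INSERT INTO medical_case VALUES (1, 'hepatocellular carcinoma')")
conn.executemany(
    "INSERT INTO biospecimen VALUES (?, ?, ?, ?)",
    [(10, 1, 'serum', 'freezer A3'), (11, 1, 'tumor tissue', 'freezer B1')],
)
n = conn.execute(
    "SELECT COUNT(*) FROM biospecimen WHERE case_id = 1"
).fetchone()[0]
print(n)  # number of specimens linked to case 1
```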

  9. Interactive display of molecular models using a microcomputer system

    NASA Technical Reports Server (NTRS)

    Egan, J. T.; Macelroy, R. D.

    1980-01-01

    A simple, microcomputer-based, interactive graphics display system has been developed for the presentation of perspective views of wire frame molecular models. The display system is based on a TERAK 8510a graphics computer system with a display unit consisting of microprocessor, television display and keyboard subsystems. The operating system includes a screen editor, file manager, PASCAL and BASIC compilers and command options for linking and executing programs. The graphics program, written in UCSD Pascal, involves the centering of the coordinate system, the transformation of centered model coordinates into homogeneous coordinates, the construction of a viewing transformation matrix to operate on the coordinates, clipping invisible points, perspective transformation and scaling to screen coordinates; commands available include ZOOM, ROTATE, RESET, and CHANGEVIEW. Data file structure was chosen to minimize the amount of disk storage space. Despite the inherent slowness of the system, its low cost and flexibility suggest general applicability.
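
    The projection step in such a pipeline can be sketched as a simple pinhole model (an assumed formulation, not the TERAK program itself): points farther from the viewer are foreshortened toward the screen center.

```python
# Minimal sketch of perspective projection and scaling to screen
# coordinates. The viewing distance, scale, and screen center below
# are invented parameters for illustration.

def project(points, d=4.0, scale=100.0, cx=160, cy=120):
    """Perspective-project (x, y, z) model points; z measured away from viewer."""
    out = []
    for x, y, z in points:
        f = d / (d + z)            # perspective foreshortening factor
        out.append((cx + scale * x * f, cy - scale * y * f))
    return out

if __name__ == "__main__":
    # Two atoms of a wire-frame model: the nearer one lands farther
    # from the screen center than the more distant one.
    print(project([(1.0, 0.0, 0.0), (1.0, 0.0, 4.0)]))
```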

  10. A mini/microcomputer-based land use information system

    NASA Technical Reports Server (NTRS)

    Seitz, R. N.; Keefer, R. L.; Britton, L. J.; Wilson, J. M.

    1977-01-01

    The paper describes the Multipurpose Interactive NASA Information System (MINIS), a data management system for land-use applications. MINIS is written almost entirely in FORTRAN IV and has a full range of conditional, Boolean, and arithmetic commands, as well as extensive format control and the capability of interactive file creation and updating. It requires a mini- or microcomputer with at least 64 K of core or semiconductor memory. MINIS has its own equation-oriented query language for retrieval from different kinds of data bases, and features graphics output that permits overlay maps. Some experience of the U.S. Department of Agriculture and the Tennessee State Planning Office with MINIS is discussed.

  11. An advanced microcomputer design for processing of semiconductor materials

    NASA Technical Reports Server (NTRS)

    Bjoern, L.; Lindkvist, L.; Zaar, J.

    1988-01-01

    In the Get Away Special 330 payload, two germanium samples doped with gallium will be processed. The aim of the experiments is to create a planar solid/liquid interface and to study the breakdown of this interface as the crystal growth rate increases. For the experiments, a gradient furnace was designed that is heated by resistive heaters; cooling is provided by circulating gas from the atmosphere in the canister through cooling channels in the furnace. The temperatures along the sample are measured by platinum/rhodium thermocouples. The furnace is controlled by a microcomputer system based on the 80C88 processor, with an integrated data acquisition system. To synchronize the different actions in time, a multitask manager is used.

  12. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    NASA Technical Reports Server (NTRS)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.

  13. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    PubMed Central

    Köhl, Karin I; Basler, Georg; Lüdemann, Alexander; Selbig, Joachim; Walther, Dirk

    2008-01-01

    Background For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, meta-analysis of several experiments in systems-biology-based approaches makes it necessary to store this information in a standardised manner, preferentially in relational databases. With the Golm Plant Database system, we devised a data management system based on a classical Laboratory Information Management System, combined with web-based user interfaces for data entry and retrieval, to collect this information in an academic environment. Results The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows such as genetic modification (transformation) and vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures, and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner-based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life-history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique names generated by the system

  14. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (southern Germany) is highly prone to landslides. This became apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy and long-lasting rainfall. The specific climatic situation caused numerous damages with serious impact on settlements and infrastructure. Knowledge of the spatial distribution, processes, and characteristics of landslides is important for evaluating the potential risk that mass movements pose in these areas. In the framework of two projects, about 400 landslides were mapped and detailed data sets were compiled between 2011 and 2014 at the Franconian Alb. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (Bavarian Environment Agency) project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements and should provide the potential for secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which provides the ability to store spatial and geographic objects and to connect to several GIS applications, such as GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe et al. 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured using GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in the R, Python or Perl programming languages, and it is possible to work directly with the (spatial) data entirety of the database in R. The inventory of the database includes (amongst others
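A typical use of such an inventory database is selecting the landslides inside a study-area window. The sketch below illustrates only the idea: the project itself uses PostgreSQL/PostGIS (where the predicate would be a spatial function such as ST_Within on geometry columns), while here stdlib sqlite3 with plain lon/lat columns stands in, and all table, column, and place names are hypothetical.

```python
import sqlite3

# Illustrative stand-in for the project's PostGIS database: a plain
# bounding-box predicate replaces spatial functions, and the records
# are invented, not from the actual inventory.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE landslide (
    id INTEGER PRIMARY KEY, name TEXT, lon REAL, lat REAL, type TEXT)""")
con.executemany("INSERT INTO landslide VALUES (?, ?, ?, ?, ?)", [
    (1, "Site A", 11.18, 49.78, "rotational slide"),
    (2, "Site B", 11.41, 49.77, "rock fall"),
    (3, "Site C", 10.00, 48.00, "creep"),
])

# Select events inside a study-area bounding box (lon/lat window).
aoi = (11.0, 49.5, 11.6, 50.0)   # min_lon, min_lat, max_lon, max_lat
rows = con.execute("""
    SELECT name, type FROM landslide
    WHERE lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?
    ORDER BY id""", (aoi[0], aoi[2], aoi[1], aoi[3])).fetchall()
print(rows)
```

In the real PostGIS schema the same filter would also benefit from a spatial index, which is one of the reasons for choosing a spatial database extender over flat files.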

  15. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…

  16. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and for providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data, has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginner to highly experienced. New and occasional facility users will find these tools extremely useful in developing and maintaining high-quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as the database and PHP/HTML as the scripting languages (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and the agenda. It is configurable for up to five different EMPs in a single lab, each having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. Access can be secured using general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed

  17. [Anemia management in haemodialysis. EuCliD database in Spain].

    PubMed

    Avilés, B; Coronel, F; Pérez-García, R; Marcelli, D; Orlandini, G; Ayala, J A; Rentero, R

    2002-01-01

    We present the results on anaemia management in Fresenius Medical Care Spain dialysis centres as reported by EuCliD (European Clinical Database), evaluating a population of 4,426 patients treated in Spain during 2001. To analyse erythropoietin dose and haemoglobin levels, we divided the population into two groups according to time on dialysis treatment: patients treated for less than six months and patients treated for between six months and four years. We compared our results with the evidence-based guidelines: the European Best Practice Guidelines (EBPG) and the US National Kidney Foundation guidelines (NKF-K/DOQI). We also compared our results with those of the ESAM2 study of 2,618 dialysis patients in Spain, carried out in the second half of 2000. We observed that 70% of the population reached a haemoglobin value higher than 11 g/dl, with a mean erythropoietin (rHu-EPO) dose of 111.9 IU/kg/week (n = 3,700; SD 74.9). However, for patients on treatment for less than six months, the mean haemoglobin reached only 10.65 g/dl (n = 222; SD 1.4). rHu-EPO was administered subcutaneously in 70.2% of patients. Regarding iron therapy, 86% of patients received iron treatment, with intravenous administration in 93% of the population. Ferritin levels were below 100 micrograms/dl in 10% of patients, and 26.4% showed a transferrin saturation index (TSAT) below 20%. The erythropoietin resistance index (ERI), defined as rHu-EPO dose/haemoglobin, was used to evaluate the response to rHu-EPO according to different variables. The following factors were associated with higher rHu-EPO resistance: intravenous rHu-EPO administration, hypoalbuminaemia, increased C-reactive protein, transferrin saturation below 20%, and starting dialysis within the last six months.
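The resistance index the abstract defines is a simple ratio; a minimal sketch, using the cohort means reported above (the function name and units framing are ours, not EuCliD's):

```python
def eri(epo_iu_per_kg_week, haemoglobin_g_dl):
    """Erythropoietin resistance index as defined in the abstract:
    weekly weight-adjusted rHu-EPO dose divided by haemoglobin."""
    return epo_iu_per_kg_week / haemoglobin_g_dl

# Cohort means reported above: 111.9 IU/kg/week at a target Hb of 11 g/dl.
print(round(eri(111.9, 11.0), 1))
```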

  18. Watershed Data Management (WDM) database for Salt Creek streamflow simulation, DuPage County, Illinois, water years 2005-11

    USGS Publications Warehouse

    Bera, Maitreyee

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with the DuPage County Stormwater Management Division, maintains a USGS database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. Most of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. An earlier report describes in detail the development of the WDM database, including the processing of data from January 1, 1997, through September 30, 2004, in the SEP04.WDM database. SEP04.WDM was updated by appending data from October 1, 2004, through September 30, 2011 (water years 2005–11), and renamed SEP11.WDM. This report details the processing of meteorologic and hydrologic data in SEP11.WDM, and provides a record of snow-affected periods and the data used to fill missing-record periods for each precipitation site during water years 2005–11. The meteorologic data filling methods are described in detail in Over and others (2010), and an update is provided in this report.

  19. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    USGS Publications Warehouse

    Wang, Lizhu; Riseng, Catherine M.; Mason, Lacey; Werhrly, Kevin; Rutherford, Edward; McKenna, James E.; Castiglione, Chris; Johnson, Lucinda B.; Infante, Dana M.; Sowa, Scott P.; Robertson, Mike; Schaeffer, Jeff; Khoury, Mary; Gaiot, John; Hollenhurst, Tom; Brooks, Colin N.; Coscarelli, Mark

    2015-01-01

    Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a spatial classification framework and database — Great Lakes Aquatic Habitat Framework (GLAHF). GLAHF consists of catchments, coastal terrestrial, coastal margin, nearshore, and offshore zones that encompass the entire Great Lakes Basin. The catchments captured in the database as river pour points or coastline segments are attributed with data known to influence physicochemical and biological characteristics of the lakes from the catchments. The coastal terrestrial zone consists of 30-m grid cells attributed with data from the terrestrial region that has direct connection with the lakes. The coastal margin and nearshore zones consist of 30-m grid cells attributed with data describing the coastline conditions, coastal human disturbances, and moderately to highly variable physicochemical and biological characteristics. The offshore zone consists of 1.8-km grid cells attributed with data that are spatially less variable compared with the other aquatic zones. These spatial classification zones and their associated data are nested within lake sub-basins and political boundaries and allow the synthesis of information from grid cells to classification zones, within and among political boundaries, lake sub-basins, Great Lakes, or within the entire Great Lakes Basin. This spatially structured database could help the development of basin-wide management plans, prioritize locations for funding and specific management actions, track protection and restoration progress, and conduct research for science-based decision making.

  20. Evolution of microcomputer-based medical instrumentation.

    PubMed

    Tompkins, Willis J

    2009-01-01

    This paper provides a historical review of the evolution of the technologies that led to modern microcomputer-based medical instrumentation. I review the history of the microprocessor-based system because of the importance of the microprocessor in the design of modern medical instruments. I then give some examples of medical instruments in which the microprocessor has played a key role and in some cases has even empowered us to develop new instruments that were not possible before. I include a discussion of the role of the microprocessor-based personal computer in development of medical instruments.

  1. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools for accessing databases, built on Java or .NET, is increasing. However, many of these applications have several drawbacks: they are usually not open source, they provide interfaces only to a specific kind of database, and they are platform-dependent and CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Exploiting fast access libraries such as SQLAlchemy, the tool is easy to install and configure. Its design envisages three layers: the front-end client in the local web browser communicates with a back-end server, and only the server connects to the different databases to perform data definition and manipulation. The server makes the data available to the client so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, the tool supports export of data in different formats, such as XML and JSON. Finally, using a set of pre-defined functions, users can create customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced, since users are not given the ability to directly execute arbitrary SQL statements.
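The database-independent layer such a tool needs can be sketched as backend selection from a connection URL, in the spirit of the SQLAlchemy engine URLs jSPyDB builds on. This is not jSPyDB's code: only sqlite is wired up here (via the stdlib), and the `connect` helper is our own illustration.

```python
import sqlite3

def connect(url):
    """Pick a backend driver from a connection URL, in the spirit of
    SQLAlchemy engine URLs ('sqlite://...', 'mysql://...', etc.).
    Only sqlite is actually wired up in this sketch; other schemes
    would dispatch to their respective drivers in a real deployment."""
    scheme, _, path = url.partition("://")
    if scheme == "sqlite":
        return sqlite3.connect(path or ":memory:")
    raise NotImplementedError(f"no driver wired for {scheme!r}")

# The rest of the tool talks to whatever `connect` returned, without
# caring which database technology sits behind it.
con = connect("sqlite://")
con.execute("CREATE TABLE t (x INTEGER)")
con.execute("INSERT INTO t VALUES (42)")
rows = con.execute("SELECT x FROM t").fetchall()
print(rows)
```

Keeping this dispatch on the server side, as jSPyDB does, is what lets the browser client stay ignorant of database drivers entirely.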

  2. Microcomputers: Communication Software. Evaluation Guides. Guide Number 13.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This guide discusses four types of microcomputer-based communication programs that could prove useful to evaluators: (1) the direct communication of information generated by one computer to another computer; (2) using the microcomputer as a terminal to a mainframe computer to input, direct the analysis of, and/or output data using a statistical…

  3. Evaluator's Guide for Microcomputer-Based Instructional Packages.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This guide developed by MicroSIFT, a clearinghouse for microcomputer-based educational software and courseware, provides background information and forms to aid teachers and other educators in evaluating available microcomputer courseware. The evaluation process comprises four stages: (1) sifting, which screens out those programs that are not…

  4. Microcomputer Telecommunication: Bringing Education Online for an Expanded Classroom.

    ERIC Educational Resources Information Center

    Schooler, Douglas K.

    This document is designed as a comprehensive resource and guide for computer-using educators who wish to expand the scope and magnitude of their computer power through telecommunications. The major focus is on the microcomputer's ability to communicate with other microcomputers via telephone lines. Emphasis is placed upon information services and…

  5. Handbook and Annotated Software Bibliography. Microcomputers in ABE.

    ERIC Educational Resources Information Center

    Holter, Mary Patricia; Johnson, Carmen

    This handbook and annotated bibliography presents discussions, ideas, and resources useful to adult basic education (ABE) program teachers and administrators in implementing educational microcomputing, and describes microcomputer software programs that have been used successfully in ABE. The first part of the book, the handbook, is organized in…

  6. Microcomputers: Introduction to Features and Uses. Final Report.

    ERIC Educational Resources Information Center

    Hecht, Myron; And Others

    This introduction to microcomputers and their implementation and use in federal agencies discusses both basic concepts in microcomputers and their specific uses by clerical, administrative, professional, and technical federal personnel. Issues concerned with the use of specialized application software and integrated packages--word processing, data…

  7. Microcomputers in Education: A Self-Paced Orientation.

    ERIC Educational Resources Information Center

    Carey, Doris; Carey, Regan

    Designed to serve as a self-paced computer course for education students with no experience using microcomputers, this manual contains instructions for operating an Apple IIe microcomputer, its introductory software, and Bank Street Writer, using the DOS 3.3 System Master. The lessons, which contain illustrations and sample screens, include…

  8. Interfacing Commodore Microcomputers with a Laboratory Device: A Thermometer Probe.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1986-01-01

    Describes hardware and software requirements for interfacing a thermometer probe, via an analog and digital converter, to any Commodore PET, VIC-20, or Commodore-64 microcomputer (or other microcomputers with some modifications). Also describes use of the probe in an experiment measuring enthalpies of reaction to determine enthalpy of formation of…

  9. The Effects of Microcomputers on Children's Attention to Reading Tasks.

    ERIC Educational Resources Information Center

    Zuk, Dorie; Danner, Fred

    A study investigated the effects of microcomputers on children's attention to reading tasks and the relationship between previous reading achievement and grade level on such attentional behavior. Fifty-five third and fifth graders read two stories each, one presented on a microcomputer and one presented in print. Television cartoons and rock music…

  10. The Microcomputer in the Library: II. Hardware and Operating Systems.

    ERIC Educational Resources Information Center

    Leggate, Peter; Dyer, Hilary

    1985-01-01

    This second in a series of six articles introducing microcomputer applications in smaller libraries describes the main microcomputer hardware components--processors, internal and external memory, buses, printers, and communications hardware. The importance of ergonomic factors in equipment design, multi-user and network configurations, and the role of…

  11. A Guide to Microcomputer Programs in the California Community Colleges.

    ERIC Educational Resources Information Center

    Dimsdale, Jeffrey M., Ed.

    Designed to assist faculty in California community colleges in sharing microcomputer programs they have written, this guide provides abstracts for 89 teacher-developed microcomputer programs that can be obtained for non-commercial use. Each entry contains information on the title and author of the program, the institution of the author, the…

  12. Creating Microcomputer Graphics with the KoalaPad.

    ERIC Educational Resources Information Center

    White, Dennis W.

    1985-01-01

    The KoalaPad, an advanced graphic tablet introduced in 1983, reduces the cost and the degree of programing background required to create sophisticated images on the microcomputer. The potentials of the KoalaPad for use in an art education program are discussed, and recommendations for creating a microcomputer graphics lab are presented. (RM)

  13. Microcomputing in North Carolina Libraries: A Special Section.

    ERIC Educational Resources Information Center

    Speller, Benjamin F., Jr., Ed.; Burgin, Robert, Ed.

    1982-01-01

    The eight articles in this special section on microcomputer applications in libraries discuss selection of microcomputer courseware for school media collections, public and special libraries, information retrieval, and library education. References and an annotated bibliography are provided. (Request subscription information from W. Robert…

  14. A microcomputer program for analysis of nucleic acid hybridization data

    PubMed Central

    Green, S.; Field, J.K.; Green, C.D.; Beynon, R.J.

    1982-01-01

    The study of nucleic acid hybridization is facilitated by computer-mediated fitting of theoretical models to experimental data. This paper describes a non-linear curve-fitting program, using the 'Patternsearch' algorithm, written in BASIC for the Apple II microcomputer. The advantages and disadvantages of using a microcomputer for local data processing are discussed. PMID:7071017
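A derivative-free direct search of this kind is easy to sketch. The following is a minimal compass-search variant in the spirit of the 'Patternsearch' idea, not a reproduction of the paper's BASIC program; the step sizes, shrink factor, and the toy linear model are our assumptions.

```python
def pattern_search(f, x0, step=0.5, tol=1e-6, shrink=0.5):
    """Minimal direct-search minimiser: try +/- step along each
    coordinate, keep any improvement, and halve the step when no
    move helps. No derivatives are needed, which suited the BASIC
    microcomputer setting the abstract describes."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink
    return x, fx

# Toy use: least-squares fit of y = a*x + b to three exact points.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
sse = lambda p: sum((y - (p[0] * x + p[1])) ** 2 for x, y in data)
params, err = pattern_search(sse, [0.0, 0.0])
print([round(v, 3) for v in params], round(err, 6))
```

In the paper's setting, `f` would be the sum-of-squares misfit between a hybridization model and the measured data rather than this toy line fit.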

  15. TLC for Growing Minds. Microcomputer Projects. Advanced Projects for Adults.

    ERIC Educational Resources Information Center

    Taitt, Henry A.

    Designed to improve students' thinking, learning, and creative skills while they learn to program a microcomputer in BASIC programing language, this book for advanced learners at the high school/adult level provides a variety of microcomputer activities designed to extend the concepts learned in the accompanying instructional manuals (volumes 3…

  16. Enhancing a Mainframe Library System through Microcomputer Technology.

    ERIC Educational Resources Information Center

    Breeding, Marshall

    1988-01-01

    Discusses ways a microcomputer system might enhance the mainframe library system at Vanderbilt University. Topics include: (1) types of microcomputers; (2) types of terminals; (3) characteristics of a dumb terminal; (4) characteristics of an intelligent terminal; (5) which terminals should be given additional features; and (6) designing an…

  17. Microcomputer Selection Guidelines for Administrators. Operations Notebook No. 30.

    ERIC Educational Resources Information Center

    Coombs, Robert W.; And Others

    Designed to assist administrators in making intelligent decisions about microcomputer selection, this nontechnical guide provides information in three areas: how, where, when, and why to use a microcomputer; what questions to ask about software and hardware; and what terminology to use. It provides a framework for answering six questions the…

  18. Microcomputer Technology: Its impact on Teachers in an Elementary School.

    ERIC Educational Resources Information Center

    Hope, Warren C.

    The purpose of this study was to examine the initiation and implementation of microcomputer technology in the educational environment of N.H. Jones Elementary School (Ocala, Florida) and to assess its impact on teachers. Microcomputer technology was configured as a teacher workstation. A conceptual framework was developed to promote microcomputer…

  19. Applications of Local Area Networks of Microcomputers in Libraries.

    ERIC Educational Resources Information Center

    Levert, Virginia M.

    1985-01-01

    Important features of local area networks (LAN) are reviewed, and several microcomputer LANs are described (ARCnet, Hinet, ShareNet, Ethernet, Omninet, PLAN 4000). Results of survey of 10 libraries using or planning to use a microcomputer LAN and considerations in choosing a LAN are reported. Forty-one references are cited. (EJS)

  20. User's manual for levelized power generation cost using a microcomputer

    SciTech Connect

    Fuller, L.C.

    1984-08-01

    Microcomputer programs for the estimation of levelized electrical power generation costs are described. Procedures for light-water reactor plants and coal-fired plants include capital investment cost, operation and maintenance cost, fuel cycle cost, nuclear decommissioning cost, and levelized total generation cost. Programs are written in Pascal and are run on an Apple II Plus microcomputer.
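The cost components the abstract lists combine through standard engineering-economics arithmetic. The sketch below is not the report's procedure (which is more detailed and is not reproduced here); the capital-recovery formula is the textbook one, and every input number is hypothetical.

```python
def capital_recovery_factor(rate, years):
    """Textbook capital recovery factor: converts a present cost into
    an equivalent uniform annual cost at the given discount rate."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def levelized_cost(capital, om, fuel, decommissioning, rate, years, mwh_per_year):
    """Illustrative levelized generation cost ($/MWh) from the cost
    components the abstract names: capital investment, O&M, fuel
    cycle, and decommissioning. Treating decommissioning as an
    upfront cost is a simplification of this sketch."""
    annual_capital = (capital + decommissioning) * capital_recovery_factor(rate, years)
    return (annual_capital + om + fuel) / mwh_per_year

# Hypothetical plant: $2.5e9 capital, $8e7/yr O&M, $6e7/yr fuel,
# $3e8 decommissioning, 30-year life, 8% discount rate, 7 TWh/yr.
print(round(levelized_cost(2.5e9, 8e7, 6e7, 3e8, 0.08, 30, 7.0e6), 2))
```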

  1. Metadata Dictionary Database: A Proposed Tool for Academic Library Metadata Management

    ERIC Educational Resources Information Center

    Southwick, Silvia B.; Lampert, Cory

    2011-01-01

    This article proposes a metadata dictionary (MDD) be used as a tool for metadata management. The MDD is a repository of critical data necessary for managing metadata to create "shareable" digital collections. An operational definition of metadata management is provided. The authors explore activities involved in metadata management in…

  2. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  3. Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard J.; Draper, Jesse M.

    A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…

  4. Development of a grape genomics database using IBM DB2 content manager software

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A relational database was created for the North American Grapevine Genome project at the Viticultural Research Center, at Florida A&M University. The collaborative project with USDA, ARS researchers is an important resource for viticulture production of new grapevine varieties which will be adapted ...

  5. University Management of Research: A Data-Based Policy and Planning. AIR 1989 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Strubbe, J.

    The development of an appropriate research policy for a university as well as for the national and international levels can be accomplished only if quantitative data and qualitative evaluations (scientific contribution, results, goal-achievement) are made available to illustrate research activities. A database is described that would enable…

  6. Effects of group size, gender, and ability grouping on learning science process skills using microcomputers

    NASA Astrophysics Data System (ADS)

    Berge, Zane L.

    What are the effects of group size (individuals, pairs, and quads of students), gender, and ability grouping of 245 seventh- and eighth-grade students on achievement within an environment that uses microcomputers as tools in learning science process skills? A split-plot, multivariate factorial design was used to analyze the above factors and interactions among the factors. Analyses indicated that the only statistically significant result was a main effect of ability for the two response variables measured in the study. Major conclusions included: (1) teams of two and four members working together solved problems as effectively as individuals, (2) the lessons and procedures implemented in the manner described generated a gender-neutral achievement outcome in science, and (3) microcomputers, using a file-management program and structured activities, can be used as tools to promote student learning of science process skills.

  7. A specialized database manager for interpretation of NMR spectra of synthetic glucides: JPD

    NASA Astrophysics Data System (ADS)

    Czaplicki, J.; Ponthus, C.

    1998-02-01

    The current communication presents a program written specifically to create and manage a specialized database containing NMR spectral patterns of various monosaccharide units. The program's database format is compatible with that of the Aurelia/Amix Bruker software package. The software facilitates the search for J patterns included in the database and their comparison with an experimental spectrum, in order to identify the components of the studied sample, including any contaminants.
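    A database of reference J patterns can be screened against an experimental peak list with a simple tolerance match. The sketch below is an illustrative scoring scheme only, not the JPD algorithm; the peak positions, library entries, and tolerance are all hypothetical:

```python
def match_score(experimental_peaks, pattern, tol=0.02):
    """Fraction of a reference pattern's peaks (ppm) found in the
    experimental peak list within a chemical-shift tolerance."""
    hits = sum(any(abs(p - e) <= tol for e in experimental_peaks)
               for p in pattern)
    return hits / len(pattern)

# Rank hypothetical reference patterns by how well they match the spectrum.
library = {"glucose-like": [3.50, 3.73, 4.10], "impurity-X": [1.20, 2.05]}
spectrum = [1.21, 3.51, 3.72, 4.11]
ranking = sorted(library, key=lambda name: -match_score(spectrum, library[name]))
```

    Ranking by score lets both the main components and low-scoring contaminants be flagged from one pass over the library.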

  8. Cognitive and Affective Effects of Various Types of Microcomputer Use by Preschoolers.

    ERIC Educational Resources Information Center

    Goodwin, Laura D.; And Others

    1986-01-01

    The effects of microcomputer use on preschoolers' knowledge of pre-reading concepts and attitudes toward microcomputers were investigated. Seventy-seven preschoolers were randomly assigned to three treatment conditions: (1) adult-assisted microcomputer instruction; (2) unassisted microcomputer use; and (3) no computer use. Analysis of pretest…

  9. Microcomputers and neurobiology: a short review.

    PubMed

    Fraser, P J

    1985-12-01

    A brief history of the application of computing techniques emphasizes a two-part development: expensive minicomputers, available in a few laboratories, now augmented by inexpensive, ubiquitously available microcomputers. Computers are used for microscope control and plotting, serial section reconstruction, morphometric measurement, stereology, video image analysis, photometry and fluorescence microscopy. Basic principles are exemplified by considering nerve cell reconstruction. General principles of computerized electrical measurement, including filtering, averaging and stimulus generation, are discussed. Computerized waveform selection as used for spike discrimination, when considered along with computer control of electrode position and the growing availability of multichannel recording arrays, suggests a possible advance in automatic analyses. With the ability to process more complex waveforms successfully, electrophysiological data such as compound extracellular potentials may usefully replace the cleaner, but more limited, intracellular data. Success with multichannel feedback-controlled stimulators making paraplegics stand and walk points to a developing application with much potential.

  10. Microcomputer based software for biodynamic simulation

    NASA Technical Reports Server (NTRS)

    Rangarajan, N.; Shams, T.

    1993-01-01

    This paper presents a description of a microcomputer based software package, called DYNAMAN, which has been developed to allow an analyst to simulate the dynamics of a system consisting of a number of mass segments linked by joints. One primary application is in predicting the motion of a human occupant in a vehicle under the influence of a variety of external forces, specially those generated during a crash event. Extensive use of a graphical user interface has been made to aid the user in setting up the input data for the simulation and in viewing the results from the simulation. Among its many applications, it has been successfully used in the prototype design of a moving seat that aids in occupant protection during a crash, by aircraft designers in evaluating occupant injury in airplane crashes, and by users in accident reconstruction for reconstructing the motion of the occupant and correlating the impacts with observed injuries.

  11. High resolution image processing on low-cost microcomputers

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1993-01-01

    Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.

  12. Forest management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1995-04-01

    The bibliography contains citations concerning forest management practices. Planning that evaluates the sustainability of timber harvest, habitat availability, and recreation over long periods of time is covered. Topics include silviculture, tree diseases and pests, timber cutting methods, and watershed management. (Contains a minimum of 141 citations and includes a subject term index and title list.)

  13. Forest management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-05-01

    The bibliography contains citations concerning forest management practices. Planning that evaluates the sustainability of timber harvest, habitat availability, and recreation over long periods of time is covered. Topics include silviculture, tree diseases and pests, timber cutting methods, and watershed management. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  14. Sustainable forest management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1996-12-01

    The bibliography contains citations concerning developments in sustainable forestry management. Topics include international regulations, economics, strategies, land use rights, ecological impact, political developments, and evaluations of sustainable forestry resource management programs. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  15. A new database on contaminant exposure and effects in terrestrial vertebrates for natural resource managers

    USGS Publications Warehouse

    Rattner, B.A.; Pearson, J.L.; Garrett, L.J.; Erwin, R.M.; Walz, A.; Ottinger, M.A.; Barrett, H.R.

    1997-01-01

    The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior focuses on identifying and understanding effects of contaminant stressors on biological resources under its stewardship. Despite the desire of many to continuously monitor the environmental health of our estuaries, much can be learned by summarizing existing temporal, geographic, and phylogenetic contaminant information. To this end, retrospective contaminant exposure and effects data for amphibians, reptiles, birds, and mammals residing within 30 km of Atlantic coast estuaries are being assembled through searches of published literature (e.g., Fisheries Review, Wildlife Review, BIOSIS Previews) and databases (e.g., US EPA Ecological Incident Information System; USGS Diagnostic and Epizootic Databases), and compilation of summary data from unpublished reports of government natural resource agencies, private conservation groups, and universities. These contaminant exposure and effect data for terrestrial vertebrates (CEE-TV) are being summarized using Borland dBASE in a 96-field format, including species, collection time and site coordinates, sample matrix, contaminant concentration, biomarker and bioindicator responses, and source of information (N>1500 records). This CEE-TV database has been imported into the ARC/INFO geographic information system (GIS), for purposes of examining geographic coverage and trends, and to identify critical data gaps. A preliminary risk assessment will be conducted to identify and characterize contaminants and other stressors potentially affecting terrestrial vertebrates that reside in, migrate through, or reproduce in these estuaries. Evaluations are underway, using specific measurement and assessment endpoints, to rank and prioritize estuarine ecosystems in which terrestrial vertebrates are potentially at risk for purposes of prediction and focusing future biomonitoring efforts.

  16. ScriptWriter. A relational database to manage outpatient medical treatment.

    PubMed Central

    Tanner, T. B.

    1994-01-01

    ScriptWriter is database software designed to replicate the process of a physician writing a prescription. The software also includes standard demographic and progress note information; however, the focus of the software is on automating the process of writing prescriptions. The software is especially adept at creating patient medication lists, generating medication histories and keeping track of medication expiration dates. Other strengths include its ability to organize patient assignments and assist in the generation of progress notes. The application is network capable and fully graphical. A psychiatric outpatient clinic is currently using the software. Practitioners in non-psychiatric settings can also benefit from the software. PMID:7949872

  17. COMPILATION AND MANAGEMENT OF ORP GLASS FORMULATION DATABASE, VSL-12R2470-1 REV 0

    SciTech Connect

    Kruger, Albert A.; Pasieka, Holly K.; Muller, Isabelle; Gilbo, Konstantin; Perez-Cardenas, Fernando; Joseph, Innocent; Pegg, Ian L.; Kot, Wing K.

    2012-12-13

    The present report describes the first steps in the development of a glass property-composition database for WTP LAW and HLW glasses that includes all of the data that were used in the development of the WTP baseline models and all of the data collected subsequently as part of WTP enhancement studies performed for ORP. The data were reviewed to identify some of the more significant gaps in the composition space that will need to be filled to support waste processing at Hanford. The WTP baseline models have been evaluated against the new data in terms of range of validity and prediction performance.

  18. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1997-05-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  19. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-02-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  20. Solar energy utilization and microcomputer control in the greenhouse bulk curing and drying solar system

    SciTech Connect

    Nassar, A.N.H.

    1987-01-01

    Three agricultural applications in a specially designed greenhouse solar system functioning as a multi-purpose solar air collector for crop production and curing/drying processes are examined. An automated hydroponic crop production system is proposed for the greenhouse solar system. Design criteria of the proposed system and its utilization of solar energy for root-zone warming are presented and discussed. Based upon limited testing of the hydroponic system considered, hydroponic production of greenhouse crops is believed reasonable to complement the year-round use of the greenhouse solar system. The hardware/software design features of a microcomputer-based control system applied in the greenhouse solar barn are presented and discussed. On-line management and utilization of incident solar energy by the microcomputer system are investigated for both the greenhouse and tobacco curing/drying modes of operation. The design approach considered for the microcomputer control system is believed suitable for regulating solar energy collection and utilization for crop production applications in greenhouse systems.
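    The core control decision the abstract describes, collecting solar heat only when it is useful, reduces to a hysteresis (bang-bang) rule of the kind era microcomputers commonly ran. A minimal sketch with illustrative thresholds, not taken from the thesis:

```python
def fan_command(collector_temp, barn_temp, running=False,
                on_delta=5.0, off_delta=2.0):
    """Decide whether the collection fan should run this control cycle.

    Hysteresis control: start the fan only when the solar collector is
    usefully warmer than the curing barn (on_delta), and keep it running
    until the margin shrinks below off_delta, avoiding rapid cycling.
    Threshold values are illustrative assumptions.
    """
    delta = collector_temp - barn_temp
    if running:
        return delta > off_delta
    return delta > on_delta
```

    Calling this once per sampling interval with the previous command fed back as `running` gives the stable on/off behavior a relay-driven fan needs.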

  1. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Management Division, Hartford, CT; Notice of Negative Determination Regarding Application for Reconsideration... negative determination regarding workers' eligibility to apply for Trade Adjustment Assistance (TAA...). The negative determination was issued on August 19, 2011. The Department's Notice of determination...

  2. A real-time, portable, microcomputer-based jet engine simulator

    NASA Technical Reports Server (NTRS)

    Blech, R. A.; Soeder, J. F.; Mihaloew, J. R.

    1984-01-01

    Modern piloted flight simulators require detailed models of many aircraft components, such as the airframe, propulsion system, flight deck controls and instrumentation, as well as motion drive and visual display systems. The amount of computing power necessary to implement these systems can exceed that offered by dedicated mainframe computers. One approach to this problem is through the use of distributed computing, where parts of the simulation are assigned to computing subsystems, such as microcomputers. One such subsystem, a real-time, portable, microcomputer-based jet engine simulator, is described in this paper. The simulator will be used at the NASA Ames Vertical Motion Simulator facility to perform calculations previously done on the facility's mainframe computer. The mainframe will continue to do all other system calculations and will interface to the engine simulator through analog I/O. The engine simulator hardware includes a 16-bit microcomputer and floating-point coprocessor. There is an 8-channel analog input board and an 8-channel analog output board. A model of a small turboshaft engine/control is coded in floating-point FORTRAN. The FORTRAN code and a data monitoring program run under the control of an assembly language real-time executive. The monitoring program allows the user to display and/or modify simulator variables on-line through a data terminal. A dual disk drive system is used for mass storage of programs and data. The CP/M-86 operating system provides file management and overall system control. The frame time for the simulator is 30 milliseconds, which includes all analog I/O operations.
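    The 30 ms frame, covering model update plus analog I/O, is the classic fixed-frame real-time loop. A hedged sketch of that loop shape in Python; the engine model and I/O stubs are stand-ins, and the original ran in FORTRAN under an assembly-language executive:

```python
import time

FRAME = 0.030  # 30 ms frame time, as stated in the abstract


def read_inputs():
    return 0.5          # stand-in for the 8-channel analog input read


def update_engine(state, throttle, dt):
    # First-order lag toward the commanded throttle: a stand-in for the
    # turboshaft engine/control model, not the actual simulator equations.
    tau = 0.5
    return state + (throttle - state) * dt / tau


def write_outputs(state):
    pass                # stand-in for the 8-channel analog output write


state = 0.0
next_deadline = time.monotonic()
for _ in range(10):     # ten frames here; a real executive loops indefinitely
    throttle = read_inputs()
    state = update_engine(state, throttle, FRAME)
    write_outputs(state)
    next_deadline += FRAME
    # Sleep until the next frame boundary; advancing the deadline rather
    # than sleeping a fixed amount prevents timing drift.
    time.sleep(max(0.0, next_deadline - time.monotonic()))
```

    Advancing `next_deadline` by a fixed increment keeps the average frame rate exact even when an individual frame's work runs long or short.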

  3. Wetlands legislation and management. (Latest citations from the Selected Water Resources Abstracts database). Published Search

    SciTech Connect

    Not Available

    1994-02-01

    The bibliography contains citations concerning federal and state legislation governing coastal and fresh water wetlands. Studies of regional regulations and management of specific sites are included. Topics such as reconciling environmental considerations with economic pressures and landowners' rights are covered. Wetlands restoration projects, conservation projects, and development plans are also presented. Many citations discuss wetlands management in relation to the Clean Water Act. (Contains 250 citations and includes a subject term index and title list.)

  4. Investigating Electromagnetic Induction through a Microcomputer-Based Laboratory.

    ERIC Educational Resources Information Center

    Trumper, Ricardo; Gelbman, Moshe

    2000-01-01

    Describes a microcomputer-based laboratory experiment designed for high school students that very accurately analyzes Faraday's law of electromagnetic induction, addressing each variable separately while the others are kept constant. (Author/CCM)

  5. Teaching WP and DP with CP/M-Based Microcomputers.

    ERIC Educational Resources Information Center

    Bartholome, Lloyd W.

    1982-01-01

    The use of CP/M (Control Program Monitor)-based microcomputers in teaching word processing and data processing is explored. The system's advantages, variations, dictionary software, and future are all discussed. (CT)

  6. Applications of Microcomputers in the Teaching of Physics 6502 Software.

    ERIC Educational Resources Information Center

    Marsh, David P.

    1980-01-01

    Described is a variety of uses of the microcomputer when coupled with software available for systems using 6502 microprocessors. Included are several computer programs which exhibit some of the possibilities for programing the 6502 microprocessors. (DS)

  7. A Microcomputer-Assisted Presentation of Atomic Orbitals.

    ERIC Educational Resources Information Center

    Petrich, James A.

    1981-01-01

    A program written for the Apple II microcomputer that plots the "one-S" orbital of an atom is described. The material is used to move from the Bohr model of the atom to the quantum mechanical description. (MP)

  8. A Laboratory Data Collection Microcomputer for Handicapped Science Students.

    ERIC Educational Resources Information Center

    Lunney, David; Morrison, Robert C.

    1982-01-01

    A microcomputer-based Universal Laboratory Training and Research Aid (ULTRA) provides meaningful laboratory access to blind students and students with upper limb disabilities. Using ULTRA, blind students can perform chemical experiments independently. (CL)

  9. Computer Center: Setting Up a Microcomputer Center--1 Person's Perspective.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Collins, Michael, A. J., Ed.

    1988-01-01

    Considers eight components in setting up a microcomputer center for use with college classes. Discussions include hardware, software, physical facility, furniture, technical support, personnel, continuing financial expenditures, and security. (CW)

  10. Learning Scientific Reasoning Skills in Microcomputer-Based Laboratories.

    ERIC Educational Resources Information Center

    Friedler, Yael; And Others

    1990-01-01

    The impact of enhanced observation or enhanced prediction on scientific reasoning about heat energy and temperature problems was investigated. Instruction was successful in improving prediction and observation skills in the microcomputer-based laboratory environment. (CW)

  11. Evaluating Microcomputer Access Technology for Use by Visually Impaired Students.

    ERIC Educational Resources Information Center

    Ruconich, Sandra

    1984-01-01

    The article outlines advantages and limitations of five types of access to microcomputer technology for visually impaired students: electronic braille, paper braille, Optacon, synthetic speech, and enlarged print. Additional considerations in access decisions are noted. (CL)

  12. Modeling Steady-State Groundwater Flow Using Microcomputer Spreadsheets.

    ERIC Educational Resources Information Center

    Ousey, John Russell, Jr.

    1986-01-01

    Describes how microcomputer spreadsheets are easily adapted for use in groundwater modeling. Presents spreadsheet set-ups and the results of five groundwater models. Suggests that this approach can provide a basis for demonstrations, laboratory exercises, and student projects. (ML)
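    Spreadsheet groundwater models of this kind work because steady-state flow in a uniform aquifer obeys Laplace's equation, so each interior cell's head relaxes toward the average of its four neighbors, exactly what a grid of spreadsheet formulas computes on recalculation. The same relaxation in Python, with illustrative grid size and boundary heads (not values from the article):

```python
import numpy as np

n = 20
h = np.zeros((n, n))     # hydraulic head grid (m)
h[:, 0] = 100.0          # fixed-head upstream boundary
h[:, -1] = 90.0          # fixed-head downstream boundary

for _ in range(20000):
    h[0, :] = h[1, :]    # no-flow (zero-gradient) top boundary
    h[-1, :] = h[-2, :]  # no-flow bottom boundary
    # Jacobi step: each interior cell becomes the mean of its 4 neighbors,
    # the same formula a spreadsheet cell would hold.
    new = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] + h[1:-1, :-2] + h[1:-1, 2:])
    if np.max(np.abs(new - h[1:-1, 1:-1])) < 1e-8:
        h[1:-1, 1:-1] = new
        break
    h[1:-1, 1:-1] = new
```

    With these boundaries the solution converges to a linear head drop from 100 m to 90 m across the grid, which is a convenient check on any spreadsheet implementation.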

  13. Microcomputer-controlled world time display for public area viewing

    NASA Astrophysics Data System (ADS)

    Yep, S.; Rashidian, M.

    1982-05-01

    The design, development, and implementation of a microcomputer-controlled world clock is discussed. The system, designated the International Time Display System (ITDS), integrates a Geochron Calendar Map and a microcomputer-based digital display to automatically compensate for daylight saving time, leap years, and time zone differences. An in-depth technical description of the design and development of the electronic hardware, firmware, and software systems is provided. Reference material on the time zones, fabrication techniques, and electronic subsystems is also provided.
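    The zone, daylight-time, and leap-year compensation the ITDS implemented in 1982 firmware is now handled by the IANA tz database. A sketch of the same multi-zone display in modern Python (the zone list and formatting are illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Zones to display; the ITDS drove a wall map, here we just format text.
ZONES = ["UTC", "America/New_York", "Europe/Paris", "Asia/Tokyo"]


def world_clock(now_utc=None):
    """Return the current wall-clock time in each display zone.

    DST rules and zone offsets come from the tz database via zoneinfo,
    so no manual daylight-time or leap-year logic is needed.
    """
    now_utc = now_utc or datetime.now(timezone.utc)
    return {z: now_utc.astimezone(ZoneInfo(z)).strftime("%Y-%m-%d %H:%M")
            for z in ZONES}
```

    Passing a fixed UTC instant makes the conversion testable; with no argument it displays the current moment.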

  14. Analysis, Repair, and Management of the Total Ozone Mapping Spectrometer Database

    NASA Technical Reports Server (NTRS)

    Sirovich, Lawrence

    1997-01-01

    In the ensuing period we were able to demonstrate that the origin of these filamentous patterns resulted from the action of the synoptic-scale vortical velocity field on the global-scale background gradient of ozone concentration in the meridional direction. Hyperbolic flow patterns between long-lived atmospheric vortices bring together air parcels from different latitudes, thus creating large gradients along the separatrices leaving the hyperbolic (stagnation) point. This result is further confirmed by the KL analysis of the ozone field in the equatorial region, where the background concentration gradient vanishes. The spectral slope in this region has been found to lie close to -1, in agreement with Batchelor's prediction. Another outcome of this result is that it at least provides indirect evidence about the kinetic energy spectrum of the atmospheric turbulence in the range of scales approximately 200 to 2000 km. Namely, Batchelor's analysis is based on the assumption that the velocity field is large-scale, that is the kinetic energy spectrum decays as O(k(sup -3)) or steeper. Since the scalar spectrum is confirmed, this also supports this form of the kinetic energy spectrum. The study of equatorial regions of TOMS data revealed the efficiency of the KL method in detecting and separating a wave-like measurement artifact inherently present in the dataset due to the imperfect correction for cross-track bias. Just two to three eigenfunctions represent the error, which makes it possible to enhance the data by reconstituting it with the subspace of artifactual eigenfunctions eliminated. This represents a highly efficient means for achieving an improved rendering of the data. This has been implemented on the database. A wide range of techniques and algorithms have been developed for the repair and extension of the TOMS database.
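    The artifact-removal step described, reconstructing the data with a few artifactual eigenfunctions excluded, is standard KL (principal-component) filtering. A sketch on synthetic data; the wave-like artifact, dimensions, and single dropped mode are illustrative, not the actual TOMS processing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "scan" data: rows are scans, columns are cross-track positions.
# A coherent wave-like artifact contaminates every scan, loosely analogous
# to the TOMS cross-track bias residual.
t = np.linspace(0, 2 * np.pi, 64)
signal = rng.normal(0.0, 1.0, (200, 64))
artifact = np.outer(rng.normal(5.0, 0.5, 200), np.sin(3 * t))
data = signal + artifact

# KL (principal-component) decomposition via SVD of the mean-centered data.
mean = data.mean(axis=0)
u, s, vt = np.linalg.svd(data - mean, full_matrices=False)

# The coherent artifact dominates the leading mode; reconstruct the data
# with that subspace eliminated, as the abstract describes.
k = 1  # number of artifactual eigenfunctions to drop (assumed here)
cleaned = mean + (u[:, k:] * s[k:]) @ vt[k:]
```

    In practice the artifactual modes are identified by inspecting the leading eigenfunctions for the known wave pattern before deciding how many to drop.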

  15. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than the type of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC data base - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (use with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (help facility which informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, personnel register, and plan/actual cost.

  16. Pacific Missile Test Center Information Resources Management Organization (code 0300): The ORACLE client-server and distributed processing architecture

    SciTech Connect

    Beckwith, A. L.; Phillips, J. T.

    1990-06-10

    Computing architectures using distributed processing and distributed databases are increasingly becoming considered acceptable solutions for advanced data processing systems. This is occurring even though there is still considerable professional debate as to what "truly" distributed computing actually is, and despite the relative lack of advanced relational database management software (RDBMS) capable of meeting database and system integrity requirements for developing reliable integrated systems. This study investigates the functionality of ORACLE database management software that performs distributed processing between a MicroVAX/VMS minicomputer and three MS-DOS-based microcomputers. The ORACLE database resides on the MicroVAX and is accessed from the microcomputers with ORACLE SQL*NET, DECnet, and ORACLE PC TOOL PACKS. Data gathered during the study reveals that there is a demonstrable decrease in CPU demand on the MicroVAX, due to "distributed processing," when the ORACLE PC Tools are used to access the database, as opposed to database access from "dumb" terminals. Also discovered were several hardware/software constraints that must be considered in implementing various software modules. The results of the study indicate that this distributed data processing architecture is becoming sufficiently mature and reliable, and should be considered for developing applications that reduce processing on central hosts. 33 refs., 2 figs.

  17. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  18. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible.

  19. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness

    PubMed Central

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D.; Hockings, Marc

    2015-01-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  20. Railroad management planning. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1993-01-01

    The bibliography contains citations concerning railroad management techniques and their impact on operations. Topics include freight statistics, impacts on communities, and yard operations. Forecasts of future trends and government policies regarding railroad operations are also discussed. (Contains a minimum of 76 citations and includes a subject term index and title list.)

  1. Content-Based Management of Image Databases in the Internet Age

    ERIC Educational Resources Information Center

    Kleban, James Theodore

    2010-01-01

    The Internet Age has seen the emergence of richly annotated image data collections numbering in the billions of items. This work makes contributions in three primary areas which aid the management of this data: image representation, efficient retrieval, and annotation based on content and metadata. The contributions are as follows. First,…

  2. Inland wetlands legislation and management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-03-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  3. Inland wetlands legislation and management. (Latest citations from the NTIS Bibliographic database). Published Search

    SciTech Connect

    Not Available

    1993-11-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 250 citations and includes a subject term index and title list.)

  4. Rainforests: Conservation and resource management. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1994-12-01

    The bibliography contains citations concerning conservation of rainforest ecology and management of natural resources. Topics include plant community structure and development, nutrient dynamics, rainfall characteristics and water budgets, and forest dynamics. Studies performed in specific forest areas are included. Effects of human activities are also considered. (Contains a minimum of 154 citations and includes a subject term index and title list.)

  5. Unterstützung der IT-Service-Management-Prozesse an der Technischen Universität München durch eine Configuration-Management-Database

    NASA Astrophysics Data System (ADS)

    Knittl, Silvia

    University processes in teaching and administration, through increasing integration and IT support, require what is known as business alignment of IT, and with it a more professional IT service management (ITSM). The IT Infrastructure Library (ITIL), with its description of processes proven in practice, has established itself as the de facto standard in ITSM. One such process is configuration management. It maps the IT infrastructure as configuration items and their relationships in a tool called a Configuration Management Database (CMDB), and thereby supports ITSM. This report describes experiences with the prototype introduction of a CMDB at the Technische Universität München.

  6. Avibase – a database system for managing and organizing taxonomic concepts

    PubMed Central

    Lepage, Denis; Vaidya, Gaurav; Guralnick, Robert

    2014-01-01

    Abstract Scientific names of biological entities offer an imperfect resolution of the concepts that they are intended to represent. Often they are labels applied to entities ranging from entire populations to individual specimens representing those populations, even though such names only unambiguously identify the type specimen to which they were originally attached. Thus the real-life referents of names are constantly changing as biological circumscriptions are redefined and thereby alter the sets of individuals bearing those names. This problem is compounded by other characteristics of names that make them ambiguous identifiers of biological concepts, including emendations, homonymy and synonymy. Taxonomic concepts have been proposed as a way to address issues related to scientific names, but they have yet to receive broad recognition or implementation. Some efforts have been made towards building systems that address these issues by cataloguing and organizing taxonomic concepts, but most are still in conceptual or proof-of-concept stage. We present the on-line database Avibase as one possible approach to organizing taxonomic concepts. Avibase has been successfully used to describe and organize 844,000 species-level and 705,000 subspecies-level taxonomic concepts across every major bird taxonomic checklist of the last 125 years. The use of taxonomic concepts in place of scientific names, coupled with efficient resolution services, is a major step toward addressing some of the main deficiencies in the current practices of scientific name dissemination and use. PMID:25061375

  7. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    SciTech Connect

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to the distribution of that total by agency, and perhaps by state. However, further characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided, including the geographical distribution of sites, building counts and percentage of total by agency, distribution of site and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and a rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock of the Department of Energy changed from 2000 through 2008.

  8. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  9. Management of Reclaimed Produced Water in California Enhanced with the Expanded U.S. Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Kharaka, Y. K.; Reidy, M. E.; Conaway, C. H.; Thordsen, J. J.; Rowan, E. L.; Engle, M.

    2015-12-01

    In California, in 2014, every barrel of oil produced also produced 16 barrels of water. Approximately 3.2 billion barrels of water were co-produced with California oil in 2014. Half of California's produced water is generally used for steam and water injection for enhanced oil recovery. The other half (~215,000 acre-feet of water) is available for potential reuse. Concerns about the severe drought, groundwater depletion, and contamination have prompted petroleum operators and water districts to examine the recycling of produced water. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water reuse. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). Since a great proportion of California petroleum wells have produced water with relatively low salinity (generally 10,000-40,000 mg/L TDS), reclaiming produced water could be important as a drought mitigation strategy, especially in the parched southern San Joaquin Valley with many oil fields. The USGS Produced Waters Geochemical Database, available at http://eerscmap.usgs.gov/pwapp, will facilitate studies on the management of produced water for reclamation in California. Expanding on the USGS 2002 database, we have more accurately located California wells. We have added new data for 300 wells in the Sacramento Valley, San Joaquin Valley and the Los Angeles Basin for a total of ~ 1100 wells in California. In addition to the existing (2002) geochemical analyses of major ions and total dissolved solids, the new data also include geochemical analyses of minor ions and stable isotopes. We have added an interactive web map application which allows the user to filter data on chosen fields (e.g. TDS < 35,000 mg/L). Using the web map application as well as more in-depth investigation on the full data set can provide critical insight for better management of produced waters in water

  10. Knowledge Management in Cardiac Surgery: The Second Tehran Heart Center Adult Cardiac Surgery Database Report

    PubMed Central

    Abbasi, Kyomars; Karimi, Abbasali; Abbasi, Seyed Hesameddin; Ahmadi, Seyed Hossein; Davoodi, Saeed; Babamahmoodi, Abdolreza; Movahedi, Namdar; Salehiomran, Abbas; Shirzad, Mahmood; Bina, Peyvand

    2012-01-01

    Background: The Adult Cardiac Surgery Databank (ACSD) of Tehran Heart Center was established in 2002 with a view to providing clinical prediction rules for outcomes of cardiac procedures, developing risk score systems, and devising clinical guidelines. This is a general analysis of the collected data. Methods: All the patients referred to Tehran Heart Center for any kind of heart surgery between 2002 and 2008 were included, and their demographic, medical, clinical, operative, and postoperative data were gathered. This report presents general information as well as in-hospital mortality rates regarding all the cardiac procedures performed in the above time period. Results: There were 24959 procedures performed: 19663 (78.8%) isolated coronary artery bypass grafting surgeries (CABGs); 1492 (6.0%) isolated valve surgeries; 1437 (5.8%) CABGs concomitant with other procedures; 832 (3.3%) CABGs combined with valve surgeries; 722 (2.9%) valve surgeries concomitant with other procedures; 545 (2.2%) surgeries other than CABG or valve surgery; and 267 (1.1%) CABGs concomitant with valve and other types of surgery. The overall mortality was 205 (1.04%), with the lowest mortality rate (0.47%) in the isolated CABGs and the highest (4.49%) in the CABGs concomitant with valve surgeries and other types of surgery. Meanwhile, the overall mortality rate was higher in the female patients than in the males (1.90% vs. 0.74%, respectively). Conclusion: Isolated CABG was the most prevalent procedure at our center with the lowest mortality rate. However, the overall mortality was more prevalent in our female patients. This database can serve as a platform for the participation of the other countries in the region in the creation of a regional ACSD. PMID:23304179

  11. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    ADMS+/- is an advanced database management system whose architecture integrates the ADMS+ mainframe database system with a large number of workstation database systems, designated ADMS-; no communications exist between these workstations. The use of this system radically decreases the response time of locally processed queries, since the workstation runs in single-user mode and no dynamic security checking is required for the downloaded portion of the database. The deferred update strategy reduces the overhead of update synchronization in message traffic.

  12. Databases save time and improve the quality of the design, management and processing of ecopathological surveys.

    PubMed

    Sulpice, P; Bugnard, F; Calavas, D

    1994-01-01

    The example of an ecopathological survey of nursing-ewe mastitis shows that databases serve four complementary functions: assistance during survey design; survey follow-up; management and quality control of data; and organization of data for statistical analysis. This is made possible by designing the database and the survey simultaneously, and by integrating computer science into the work of the task group that conducts the survey. This methodology helps save time and improves the quality of data in ecopathological surveys.

  13. Managing attribute--value clinical trials data using the ACT/DB client-server database system.

    PubMed

    Nadkarni, P M; Brandt, C; Frawley, S; Sayward, F G; Einbinder, R; Zelterman, D; Schacter, L; Miller, P L

    1998-01-01

    ACT/DB is a client-server database application for storing clinical trials and outcomes data, which is currently undergoing initial pilot use. It stores most of its data in entity-attribute-value form. Such data are segregated according to data type to allow indexing by value when possible, and binary large object data are managed in the same way as other data. ACT/DB lets an investigator design a study rapidly by defining the parameters (or attributes) that are to be gathered, as well as their logical grouping for purposes of display and data entry. ACT/DB generates customizable data entry. The data can be viewed through several standard reports as well as exported as text to external analysis programs. ACT/DB is designed to encourage reuse of parameters across multiple studies and has facilities for dictionary search and maintenance. It uses a Microsoft Access client running on Windows 95 machines, which communicates with an Oracle server running on a UNIX platform. ACT/DB is being used to manage the data for seven studies in its initial deployment.
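The entity-attribute-value layout with type-segregated value tables, as described in this abstract, can be sketched as follows. This is a minimal illustration only; the table and column names are hypothetical and do not reproduce ACT/DB's actual Access/Oracle schema.

```python
import sqlite3

# Hypothetical EAV schema: one metadata table of attributes, and one value
# table per data type so values can be indexed (as the abstract describes).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE attribute (attr_id INTEGER PRIMARY KEY, name TEXT, dtype TEXT);
CREATE TABLE eav_num (entity_id INT, attr_id INT, value REAL);
CREATE TABLE eav_str (entity_id INT, attr_id INT, value TEXT);
CREATE INDEX idx_num ON eav_num (attr_id, value);
CREATE INDEX idx_str ON eav_str (attr_id, value);
""")

def define_attr(attr_id, name, dtype):
    conn.execute("INSERT INTO attribute VALUES (?, ?, ?)", (attr_id, name, dtype))

def store(entity_id, attr_id, value):
    # Route each fact to the value table matching its declared data type.
    dtype = conn.execute("SELECT dtype FROM attribute WHERE attr_id = ?",
                         (attr_id,)).fetchone()[0]
    table = "eav_num" if dtype == "num" else "eav_str"
    conn.execute(f"INSERT INTO {table} VALUES (?, ?, ?)",
                 (entity_id, attr_id, value))

define_attr(1, "systolic_bp", "num")
define_attr(2, "adverse_event", "str")
store(101, 1, 128.0)
store(101, 2, "headache")

row = conn.execute("""
SELECT e.value FROM eav_num e JOIN attribute a USING (attr_id)
WHERE e.entity_id = 101 AND a.name = 'systolic_bp'""").fetchone()
print(row[0])  # 128.0
```

Because attributes are rows rather than columns, a new study parameter is a metadata insert, not a schema change, which is what makes rapid study design and parameter reuse possible in this kind of system.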

  14. The web-enabled database of JRC-EC, a useful tool for managing European Gen IV materials data

    NASA Astrophysics Data System (ADS)

    Over, H. H.; Dietz, W.

    2008-06-01

    Materials and document databases are important tools for conserving knowledge and experimental materials data from European R&D projects. A web-enabled application guarantees fast access to these data. In combination with analysis tools, the experimental data are used for, e.g., mechanical design, construction, and lifetime prediction of complex components. The effective and efficient handling of large amounts of generic and detailed materials data, with regard to properties related to, e.g., fabrication processes, joining techniques, irradiation, or aging, is one of the basic elements of data management within ongoing nuclear safety and design related European research projects and networks. The paper describes the structure and functionality of Mat-DB and gives examples of how these tools can be used for the management and evaluation of materials data from European (national or multi-national) R&D activities or future reactor types, such as the EURATOM FP7 Generation IV reactor types or heavy-liquid-metal-cooled reactors.

  15. BACTID: a microcomputer implementation of a PASCAL program for bacterial identification based on Bayesean probability.

    PubMed

    Jilly, B J

    1988-01-01

    A computer program (BACTID) is described which enables the identification of bacteria based on a priori data and Bayesean probability testing. The program is not limited to a specific format, has a short execution time, can be easily applied to a variety of situations, and can be run on almost any microcomputer system operating under either 8-bit CP/M or 16-bit MS-DOS or PC-DOS. Additionally, BACTID is not limited to one type of computer (hardware independent); is not limited by size of the computer's random access memory (RAM independent); can recognize various database matrices (format independent); is able to compensate for missing data; and allows for various methods of data entry. The efficacy of the program was checked against a commercially available test system and a 99.34% agreement was obtained. Also, the execution time for a 46 x 21 data matrix was as little as 3.5 seconds. These results show that microcomputer identification programs not only are viable alternatives to code-book registers, but also offer flexibility which is not found in commercial systems. PMID:3282771
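The a-priori-matrix identification that BACTID performs can be sketched as below. The taxa, tests, and probabilities are invented for illustration (BACTID's actual matrices are not reproduced here); entries give the probability that a taxon tests positive, missing results are skipped (the "compensate for missing data" behavior), and a uniform prior is assumed.

```python
# Hypothetical a-priori matrix: P(test positive | taxon).
MATRIX = {
    "E. coli":       {"indole": 0.99, "oxidase": 0.01, "lactose": 0.95},
    "P. aeruginosa": {"indole": 0.01, "oxidase": 0.99, "lactose": 0.02},
    "K. pneumoniae": {"indole": 0.05, "oxidase": 0.01, "lactose": 0.98},
}

def identify(results, matrix=MATRIX):
    """results: dict test -> True/False/None (None = missing datum)."""
    likelihoods = {}
    for taxon, probs in matrix.items():
        l = 1.0
        for test, outcome in results.items():
            if outcome is None:          # missing datum: contributes nothing
                continue
            p = probs[test]
            l *= p if outcome else (1.0 - p)
        likelihoods[taxon] = l
    total = sum(likelihoods.values())    # normalize: uniform prior over taxa
    return {t: l / total for t, l in likelihoods.items()}

post = identify({"indole": True, "oxidase": False, "lactose": None})
best = max(post, key=post.get)
print(best)  # E. coli
```

The posterior is a product of per-test likelihoods renormalized over the candidate taxa, which is why such programs run in seconds even on the 8-bit machines the abstract mentions.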

  16. Modified Laser and Thermos cell calculations on microcomputers

    SciTech Connect

    Shapiro, A.; Huria, H.C.

    1987-01-01

    In the course of designing and operating nuclear reactors, many fuel pin cell calculations are required to obtain homogenized cell cross sections as a function of burnup. In the interest of convenience and cost, it would be very desirable to be able to make such calculations on microcomputers. In addition, such a microcomputer code would be very helpful for educational course work in reactor computations. To establish the feasibility of making detailed cell calculations on a microcomputer, a mainframe cell code was compiled and run on a microcomputer. The computer code Laser, originally written in Fortran IV for the IBM-7090 class of mainframe computers, is a cylindrical, one-dimensional, multigroup lattice cell program that includes burnup. It is based on the MUFT code for epithermal and fast group calculations, and on Thermos for the thermal calculations. There are 50 fast and epithermal groups and 35 thermal groups. Resonances are calculated assuming a homogeneous system and then corrected for self-shielding, Dancoff, and Doppler effects by self-shielding factors. The Laser code was converted to run on a microcomputer. In addition, the Thermos portion of Laser was extracted and compiled separately to provide a stand-alone thermal code.

  17. Data storage management in a distributed database with deterministic limited communications windows between data storage nodes

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-05-01

    An orbital service model allows data to be collected, stored and used on different nodes comprising an ad-hoc system where provider craft supply services to consumer craft. Ad-hoc networks and provider-consumer relationships are commonly used in various applications on Earth. The deterministic movement of spacecraft, however, allows the ad-hoc network and service providing model to operate in a different way than would be typical in most terrestrial ad-hoc networks. While long periods of no direct node-to-node connectivity may exist, the periods of connectivity are pre-known based on orbital parameters. Additionally, paths for indirect connectivity can be identified and evaluated for cost effectiveness. This paper presents a data management approach for an orbital computing ad-hoc system. Algorithms for determining where data should be stored (identification of the most useful point of storage, and whether multiple copies are justified) and how movement should be effected (transfer scheduling, replication, etc.) are presented and evaluated.
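The key property the abstract relies on, that contact windows are pre-known from orbital parameters, means earliest-delivery paths can be computed in advance. A simplified sketch of such a search follows; the window tuples and node names are invented, and transfer within a window is treated as instantaneous, an assumption made purely for illustration.

```python
import heapq

def earliest_delivery(windows, src, dst, t0=0.0):
    """Earliest time data at src (from t0) can reach dst, given pre-known
    contact windows (sender, receiver, open_time, close_time).
    A Dijkstra-style search over time rather than distance."""
    best = {src: t0}
    pq = [(t0, src)]
    while pq:
        t, node = heapq.heappop(pq)
        if node == dst:
            return t
        if t > best.get(node, float("inf")):
            continue                      # stale queue entry
        for a, b, t_open, t_close in windows:
            if a != node or t > t_close:
                continue                  # not our link, or window closed
            arrive = max(t, t_open)       # wait for the window to open
            if arrive < best.get(b, float("inf")):
                best[b] = arrive
                heapq.heappush(pq, (arrive, b))
    return None                           # dst unreachable

# Illustrative windows: a relayed path beats a much later direct contact.
windows = [
    ("sat1", "relay", 10.0, 20.0),
    ("relay", "ground", 30.0, 40.0),
    ("sat1", "ground", 100.0, 110.0),
]
print(earliest_delivery(windows, "sat1", "ground"))  # 30.0
```

Running the same search from every candidate storage node is one way to score "most useful point of storage": the node minimizing worst-case delivery time to likely consumers.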

  18. Spatial database for the management of "urban geology" geothematic information: the case of Drama City, Greece

    NASA Astrophysics Data System (ADS)

    Pantelias, Eustathios; Zervakou, Alexandra D.; Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.

    2008-10-01

    The aggregation of population in big cities leads to the concentration of human activities, economic wealth, over-consumption of natural resources, and urban growth without planning and sustainable management. As a result, urban societies are exposed to various dangers and threats with economic, social, and ecological-environmental impacts on the urban surroundings. Problems associated with urban development are related to the geological conditions of cities and their surroundings, e.g. flooding, land subsidence, groundwater pollution, soil contamination, earthquakes, landslides, etc. For these reasons, no sustainable urban planning can be done without geological information support. The first systematic recording, codification and documentation of "urban geology" geothematic information in Greece is implemented by the Institute of Geological and Mineral Exploration (I.G.M.E.) in the frame of the project "Collection, codification and documentation of geothematic information for urban and suburban areas in Greece - pilot applications". Through the implementation of this project, all geothematic information derived from geological mapping and from geotechnical, geochemical and geophysical research and measurements in four pilot areas of Greece, Drama (North Greece), Nafplio and Sparti (Peloponnesus) and Thrakomakedones (Attica), is stored and processed in specially designed geodatabases in a GIS environment containing vector and raster data. For this GIS application an ArcGIS Personal Geodatabase is used. Data is classified in geothematic layers, grouped in geothematic datasets (e.g. Topography, Geology - Tectonics, Submarine Geology, Technical Geology, Hydrogeology, Soils, Radioactive elements, etc.) and processed in order to produce multifunctional geothematic maps. All compiled data constitute the essential base for land use planning and environmental protection in specific urban areas. With the termination of the project the produced geodatabase and other digital data

  19. Electronic Fetal Monitoring by Microcomputer: A Clinical Application

    PubMed Central

    Lichten, Edward M.

    1981-01-01

    Using recent advances in microcomputer technology, a system for the continuous, direct processing of fetal monitor information is installed at Sinai Hospital. The basic system consists of an Apple II microcomputer with accessories readily available at local computer stores. Eight fetal monitors are connected to the computer system by cable. Monitor tracings, similar in quality to the original, are displayed at the central unit and at remote locations throughout the labor and delivery areas. This information can also be transmitted to physicians' homes and reproduced on a multi-copy graphic printer. Two benefits of this system are noted. First, this application of microcomputer technology promotes the rapid dissemination of information to physicians and staff. Second, medical record storage can be improved by the graphic printer's copies.

  20. A microcomputer system designed for psychological and behavioural experiments.

    PubMed

    Popplewell, D A; Burton, M J

    1985-05-01

    This paper describes a relatively cheap MC6809-based microcomputer designed to run experiments in real-time, and to use the hardware and software facilities of a larger (HOST) computer. Each microcomputer is capable of controlling a wide range of psychological and behavioural experiments, and includes 32K RAM, 4K EPROM, 32 digital input lines, 32 digital output lines, analogue/digital converters, and programmable timers. Any programming language may be used, providing a cross-compiler generating MC6809 executable code exists for the HOST. Following over a year of use we can confirm that this system provides an effective method of running psychological and behavioural experiments.

  1. Microcomputers and Teacher Education. OATE-OACTE Monograph Series No. 10.

    ERIC Educational Resources Information Center

    Murphy, Linda B., Ed.; Warger, Cynthia L., Ed.

    The six articles presented in this monograph discuss the role of microcomputers in teacher education: (1) "What Teachers and Their Students Should Know about Microcomputers" (Michael T. Battista); (2) "Promoting 'Computing Literacy' for Teacher Educators" (Keith E. Bernhard); (3) "Microcomputers in Education: A Review of Research" (Lynn Lehner and…

  2. Casks (computer analysis of storage casks): A microcomputer based analysis system for storage cask review

    SciTech Connect

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1995-08-01

    CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background of the impact analysis module and the yield-surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage-cask drop tests performed by SNL and by BNFL at Winfrith serves as validation of the CASKS impact analysis module.

  3. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subject to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have since 1988 developed a database built by the field teams, which specifically identifies all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, those of the emergency services and of the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two databases is not negligible and is a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolation by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration, and height of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, floods observed in the database are compared spatially, using a GIS, with flooding from sewer-network modeling (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives for better understanding floods and learning to cope with flooding risk.
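The per-episode rainfall interpolation mentioned above (ordinary kriging over a gauge network) can be sketched in a few lines. The exponential variogram parameters and the gauge layout below are illustrative assumptions, not values fitted to the Greater Lyon network.

```python
import numpy as np

def variogram(h, sill=1.0, rng=5000.0):
    # Assumed exponential variogram model (no nugget), h in metres.
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, target):
    """Ordinary-kriging estimate of z at `target` from gauges at `xy`."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system with a Lagrange multiplier enforcing sum(weights) == 1.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)

# Four hypothetical gauges on a 4 km square, rainfall totals in mm.
gauges = np.array([[0.0, 0.0], [4000.0, 0.0], [0.0, 4000.0], [4000.0, 4000.0]])
rain_mm = np.array([12.0, 20.0, 16.0, 30.0])
print(ordinary_kriging(gauges, rain_mm, np.array([2000.0, 2000.0])))  # ≈19.5
```

With no nugget, kriging is an exact interpolator: estimating at a gauge location returns that gauge's measurement, which is one sanity check used when validating an interpolated rainfall surface against the network.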

  4. National Radon Database. Volume 4. The EPA/state residential radon surveys: CA, HI, ID, LA, NE, NV, NC, OK, SC, the Navajo Nation, and the Billings, MT IHS Area 1989-1990 (5 1/4 inch, 1. 2mb) (for microcomputers). Data file

    SciTech Connect

    Not Available

    1990-01-01

    The National Radon Database (NRDB) was developed by the United States Environmental Protection Agency (USEPA) to distribute information from two recent radon surveys: the EPA/State Residential Radon Surveys and the National Residential Radon Survey. The National Residential Radon Survey collected annual average radon measurements on all levels of approximately 5,700 homes nationwide. Information collected during the survey includes a detailed questionnaire on house characteristics as well as radon measurements. The radon survey data for this volume are contained on two diskettes. The data diskettes are accompanied by comprehensive documentation on the design and implementation of the survey, the development and use of sampling weights, a summary of survey results, and information concerning the household questionnaire.

  5. National Radon Database. Volume 4. The EPA/state residential radon survey: CA, HI, ID, LA, NE, NV, NC, OK, SC, the Navajo Nation, and the Billings, MT IHS Area 1989-1990 (3 1/2 inch, 1. 44mb) (for microcomputers). Data file

    SciTech Connect

    Not Available

    1990-01-01

    The National Radon Database (NRDB) was developed by the United States Environmental Protection Agency (USEPA) to distribute information from two recent radon surveys: the EPA/State Residential Radon Surveys and the National Residential Radon Survey. The National Residential Radon Survey collected annual average radon measurements on all levels of approximately 5,700 homes nationwide. Information collected during the survey includes a detailed questionnaire on house characteristics as well as radon measurements. The radon survey data for this volume are contained on two diskettes. The data diskettes are accompanied by comprehensive documentation on the design and implementation of the survey, the development and use of sampling weights, a summary of survey results, and information concerning the household questionnaire.

  6. VIEWCACHE: An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Sellis, Timoleon

    1991-01-01

    The objective is to illustrate the concept of incremental access to distributed databases. An experimental database management system, ADMS, which has been developed at the University of Maryland, in College Park, uses VIEWCACHE, a database access method based on incremental search. VIEWCACHE is a pointer-based access method that provides a uniform interface for accessing distributed databases and catalogues. The compactness of the pointer structures formed during database browsing and the incremental access method allow the user to search and do inter-database cross-referencing with no actual data movement between database sites. Once the search is complete, the set of collected pointers to the desired data is dereferenced.
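The pointer-based idea described above can be sketched as follows: browsing accumulates compact (site, row-id) pointers with no data movement, and rows are fetched only when the final pointer set is dereferenced. The two "sites" below are plain dicts standing in for remote databases; names and data are invented, and this does not reproduce VIEWCACHE's actual structures.

```python
# Hypothetical remote "sites": row-id -> record.
site_a = {1: {"star": "Vega", "type": "A0V"}, 2: {"star": "Sirius", "type": "A1V"}}
site_b = {7: {"star": "Vega", "parallax_mas": 130.23}}

class ViewCache:
    def __init__(self):
        self.pointers = []            # compact (site_name, row_id) pairs

    def search(self, site_name, site, pred):
        # Browsing collects pointers only; no row data is copied yet.
        for row_id, row in site.items():
            if pred(row):
                self.pointers.append((site_name, row_id))
        return self                   # allows chained, incremental refinement

    def dereference(self, sites):
        # Only now is actual data moved between "sites".
        return [sites[s][rid] for s, rid in self.pointers]

sites = {"a": site_a, "b": site_b}
vc = ViewCache()
vc.search("a", site_a, lambda r: r["star"] == "Vega")
vc.search("b", site_b, lambda r: r["star"] == "Vega")  # cross-reference
rows = vc.dereference(sites)
print(len(rows))  # 2
```

Deferring the dereference step is what makes cross-site joins cheap during exploration: until the user commits to a result set, only pointer lists travel between sites.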

  7. Microcomputers in Vocational Home Economics Classrooms in USD #512.

    ERIC Educational Resources Information Center

    Shawnee Mission Public Schools, KS.

    A project was conducted to identify software suitable for use in home economics classes and to train home economics teachers to use that software with an Apple II Plus microcomputer. During the project, home economics software was identified, evaluated, and catalogued. Teaching strategies were adapted to include using the computer in the…

  8. Software Development Group. Software Review Center. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Perkey, Nadine; Smith, Shirley C.

    Two papers describe the roles of the Software Development Group (SDG) and the Software Review Center (SRC) at Drexel University. The first paper covers the primary role of the SDG, which is designed to assist Drexel faculty with the technical design and programming of courseware for the Apple Macintosh microcomputer; the relationship of the SDG…

  9. A Bibliography of Microcomputer Software for Working Draft.

    ERIC Educational Resources Information Center

    Hodges, James O.

    Over 160 microcomputer software programs for elementary and secondary school social studies programs are included in this annotated bibliography. Listed in alphabetical order by the name of the program, the annotation contains a description of the program, appropriate grade level, the name of the system and whether it is available on disk or…

  10. TI 99/4A Microcomputers As Science Laboratory Instruments.

    ERIC Educational Resources Information Center

    Thomas, Frederick J.

    1983-01-01

    Emphasis on interfacing -- the various processes for passing information into or out of computers -- is suggested. The suitability of the Texas Instruments microcomputer for this purpose is noted, followed by specific interfacing suggestions, including programs for a timer and an assembly language subroutine. (MNS)

  11. Instructional Microcomputer Applications by Business Teachers in Minnesota.

    ERIC Educational Resources Information Center

    Lambrecht, Judith J.

    Data were collected from Minnesota secondary and postsecondary business teachers regarding their instructional microcomputer applications and their attitudes about several instructional computing issues. Usable surveys were returned by 342 teachers in 236 schools. The predominant brand of computer at the secondary level was the Apple II; most…

  12. Impact of the Microcomputer on the Physical Environment for Learning.

    ERIC Educational Resources Information Center

    Parker, James A.

    This paper provides an exploration of the impact of the increasing use of microcomputers in education upon curriculum development and the physical space needs in the learning environment. A secondary topic is the present and future directions of schools in the design of computer environments. (Author/MD)

  13. Planning for the Administrative Microcomputer Local Area Network.

    ERIC Educational Resources Information Center

    Rohr, Theodore; Devoti, Bart

    An overview is provided of the methods used by the Forest Park campus of St. Louis Community College (SLCC) to plan and develop a local area network (LAN) for administrative microcomputers. The first three sections provide brief descriptions of the SLCC District, SLCC, and the Forest Park campus. Section IV looks at the organization of…

  14. The Computer Connection: Four Approaches to Microcomputer Laboratory Interfacing.

    ERIC Educational Resources Information Center

    Graef, Jean L.

    1983-01-01

    Four ways in which microcomputers can be turned into laboratory instruments are discussed. These include adding an analog/digital (A/D) converter on a printed circuit board, adding an external A/D converter using the computer's serial port, attaching transducers to the game paddle ports, or connecting an instrument to the computer. (JN)

  15. A Microcomputer Exercise on Genetic Transcription and Translation.

    ERIC Educational Resources Information Center

    Meisenheimer, John L.

    1985-01-01

    Describes a microcomputer program (written for the Apple II+) which can serve as a lecture demonstration aid in explaining genetic transcription and translation. The program provides unemotional information on student errors, thus serving as a review drill to supplement the classroom. Student participation and instructor options are discussed. (DH)

  16. Using a Microcomputer in the Classroom. Third Edition.

    ERIC Educational Resources Information Center

    Bitter, Gary G.; And Others

    This book was written to help classroom teachers, lay persons, and school personnel understand the role of microcomputers in education. It has been especially designed for undergraduate and graduate technology-based education programs. Specific education examples and applications are provided throughout the book and exercises have been designed…

  17. An accurate demand feeder for fish, suitable for microcomputer control.

    PubMed

    Beach, M A; Baker, G E; Roberts, M G

    1986-01-01

    This paper describes an easily constructed and inexpensive demand feeder. The feeder is driven by an AC synchronous motor and gearbox and is suitable for microcomputer control. It will operate with inexpensive commercially available pelleted fish food, and will consistently deliver a single pellet for each operation of the motor. The components and materials can be purchased for approximately 23 pounds.

  18. The Use of a Microcomputer as an EKG Monitor.

    ERIC Educational Resources Information Center

    Walters, R. A.; Reynolds, R. F.

    1983-01-01

    Discusses the design and operation of a microcomputer system which obtains and displays an individual's electrocardiogram (EKG). The EKG information, in digital form, can be stored on a floppy disk and transmitted over telephone lines by use of a modem. (JN)

  19. ASCAL: A Microcomputer Program for Estimating Logistic IRT Item Parameters.

    ERIC Educational Resources Information Center

    Vale, C. David; Gialluca, Kathleen A.

    ASCAL is a microcomputer-based program for calibrating items according to the three-parameter logistic model of item response theory. It uses a modified multivariate Newton-Raphson procedure for estimating item parameters. This study evaluated this procedure using Monte Carlo Simulation Techniques. The current version of ASCAL was then compared to…

  20. Microcomputers and the Media Specialist: An Annotated Bibliography.

    ERIC Educational Resources Information Center

    Miller, Inabeth

    An overview of the literature reflecting the rapid development of interest in microcomputer use in education since 1978 is followed by an annotated bibliography which lists books, articles, and ERIC documents in nine categories. The first section includes materials of general interest--historical background, guides to using computers in the…

  1. A Microcomputer-Controlled T-60 NMR Emulator.

    ERIC Educational Resources Information Center

    Howard, Gary D.; DuBois, Thomas D.

    1986-01-01

    Discusses the problems created by the insufficient number of major data collection and analysis instruments in most college chemistry courses. Suggests the use of less expensive microcomputer-controlled instrument emulators, which resemble the actual instruments. Discusses the application of one such emulator in an instrumental analysis course.…

  2. Development of a microcomputer-based teleconference system

    SciTech Connect

    Allen, R.W.

    1980-09-12

    A computer based teleconference system was developed at the Lawrence Livermore National Laboratory (LLNL) from FY 1978 through FY 1980. The system is unique because it is based on a single low-cost microcomputer. The implementation of the system, with emphasis on the practical software issues addressed during development, is discussed.

  3. Foreign Language Teaching Programs for Microcomputers: A Volume of Reviews.

    ERIC Educational Resources Information Center

    Culley, Gerald R., Ed.; Mulford, George W., Ed.

    Teachers and supervisors of foreign language programs from 29 high schools in six states provide reviews of foreign language microcomputer courseware. Evaluations of the 25 programs for French, German, Italian, Russian and Spanish are based on: (1) quality of content; (2) relevance to subject area; (3) suitability to computer medium; (4)…

  4. Using Microcomputers in School Administration. Fastback No. 248.

    ERIC Educational Resources Information Center

    Connors, Eugene T.; Valesky, Thomas C.

    This "fastback" outlines the steps to take in computerizing school administration. After an introduction that lists the potential benefits of microcomputers in administrative offices, the booklet begins by delineating a three-step process for establishing an administrative computer system: (1) creating a district-level committee of administrators,…

  5. Microcomputers as Interfaces to Bibliographic Utilities (OCLC, RLIN, etc.).

    ERIC Educational Resources Information Center

    Genaway, David C.

    1983-01-01

    Discusses microcomputer interfacing systems that accept a full MARC record from a bibliographic utility and reformat and transform it into the local library computer system's format; illustrates how the TPS-400 interface works; outlines criteria for interface selection; indicates installation decisions; and notes limitations, problems, and benefits. Four…

  6. Microcomputers in Scottish Schools--A National Plan.

    ERIC Educational Resources Information Center

    Paton, George

    1985-01-01

    Describes the general elements of the National Plan for Microcomputers in Scottish schools, a plan devised by the Scottish Microelectronic Development Programme, which stresses national standardization of computer systems for compatibility of software and communication links and development of national software application packages. (MBR)

  7. Integrating Microcomputers and Microelectronics into the Physics Curriculum.

    ERIC Educational Resources Information Center

    Gale, Douglas S.

    1980-01-01

    Describes an interdisciplinary microcomputer and microelectronics program offered jointly by the Physics and Computer Science Departments of East Texas State University. The program operates on both the graduate and undergraduate level. Content as well as structure of the program are discussed. (Author/DS)

  8. Library Software: Directory of Microcomputer Software for Libraries.

    ERIC Educational Resources Information Center

    Walton, Robert A.

    The availability of appropriate software for library applications is a continuing problem, and this directory is designed to reduce the frustration of librarians in their search for library software for a microcomputer by providing profiles of software packages designed specifically for libraries. Each profile describes the purpose of the program,…

  9. Ideas for Integrating the Microcomputer with Special Education.

    ERIC Educational Resources Information Center

    Pollard, Jim, Ed.

    This publication contains six presentations on using microcomputers in special education, submitted by special education teachers at informal information sharing sessions. The first is a lesson plan involving pre-computer activities that prepare preschool developmentally delayed children for using a computer keyboard. Finger isolation and…

  10. Matrix algebra routines for the Acorn Archimedes microcomputer: example applications.

    PubMed

    Fielding, A

    1988-08-01

    A set of matrix algebra routines has been written, as BASICV procedures, for the Acorn Archimedes microcomputer. It is shown that these procedures execute quickly enough that programs requiring matrix algebra computations can be written in interpreted BASIC. Two example applications, reciprocal averaging and principal components analysis, are demonstrated.
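The BASICV routines themselves are not reproduced in the abstract. As a hedged illustration of one of the cited applications, here is a minimal principal components analysis done the classic way, by eigendecomposition of the covariance matrix of mean-centred data (a NumPy sketch with invented toy data):

```python
# Minimal PCA sketch: eigendecomposition of the sample covariance
# matrix. This is a generic textbook illustration, not the Archimedes
# BASICV code; the data are invented.
import numpy as np

def pca(X):
    """Return eigenvalues (descending) and matching eigenvectors of the
    covariance matrix of the rows of X."""
    Xc = X - X.mean(axis=0)                 # mean-centre each column
    cov = (Xc.T @ Xc) / (len(X) - 1)        # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # eigh: symmetric matrices
    order = np.argsort(vals)[::-1]          # sort descending by variance
    return vals[order], vecs[:, order]

# Strongly correlated toy data: the first component should carry
# almost all of the variance.
X = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.9]])
eigvals, eigvecs = pca(X)
var_explained = eigvals[0] / eigvals.sum()
```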

  11. Model Experiment of Two-Dimensional Brownian Motion by Microcomputer.

    ERIC Educational Resources Information Center

    Mishima, Nobuhiko; And Others

    1980-01-01

    Describes the use of a microcomputer in studying a model experiment (Brownian particles colliding with thermal particles). A flow chart and program for the experiment are provided. Suggests that this experiment may foster a deepened understanding through mutual dialog between the student and computer. (SK)

  12. Ideas for Integrating the Microcomputer into Science Instruction.

    ERIC Educational Resources Information Center

    Pollard, Jim, Ed.

    Much of the innovation in the use of microcomputers in education has come from classroom teachers who are using computers with students. In October, 1987, forums were held for secondary school science teachers who were using computers in their science classes. Within this document are some of the lesson plans that the participating teachers…

  13. An Observational Study of Social Processes in Microcomputer Classrooms.

    ERIC Educational Resources Information Center

    Feldmann, Shirley C.; And Others

    This observational study examined student and teacher verbal and nonverbal behaviors in microcomputer classrooms in a high school where most of the students are Black, Hispanic, or Asian, and almost half of them are classified as economically disadvantaged. A total of 125 students in grades 9 to 12 were observed, with 47 students in marketing, 18…

  14. Decisions, Decisions, Decisions: Help in Choosing Microcomputer Software and Hardware.

    ERIC Educational Resources Information Center

    Pugh, W. Jean; Fredenburg, Anne M.

    1985-01-01

    This bibliography, prepared with the information specialist, end-user, and administrator in mind, presents citations to 167 journal articles that provide concrete comparisons of commercially-available microcomputer software packages and hardware equipment. An index divided into software and hardware sections with references to type of comparison…

  15. Teaching Economics: Research Findings from a Microcomputer/Videodisc Project.

    ERIC Educational Resources Information Center

    Glenn, Allen D.; And Others

    1984-01-01

    Describes field test findings of a project funded by the Minnesota Educational Computing Consortium and the Rockefeller Family Fund to demonstrate that microcomputers and home videodisc players can deliver instruction to students. Basic research questions and field testing procedures for a high school economics course are provided. (MBR)

  16. Validity of the Microcomputer Evaluation Screening and Assessment Aptitude Scores.

    ERIC Educational Resources Information Center

    Janikowski, Timothy P.; And Others

    1991-01-01

    Examined validity of Microcomputer Evaluation Screening and Assessment (MESA) aptitude scores relative to General Aptitude Test Battery (GATB) using multitrait-multimethod correlational analyses. Findings from 54 rehabilitation clients and 29 displaced workers revealed no evidence to support the construct validity of the MESA. (Author/NB)

  17. Client Perceptions of the Microcomputer Evaluation and Screening Assessment.

    ERIC Educational Resources Information Center

    Bordieri, James E.; Musgrave, Jack

    1989-01-01

    Explored rehabilitation clients' (N=75) perceptions of Microcomputer Evaluation and Screening Assessment (MESA). Results showed clients experienced greater enjoyment, but more difficulty, learning how to complete computer exercises than hardware exercises but viewed computer exercises instructions as easier to understand. Observed differences in…

  18. Utilization of the Microcomputer in the Mathematics Classroom.

    ERIC Educational Resources Information Center

    Pruett, Poppy L.; And Others

    1993-01-01

    Reports a study investigating the instructional use of microcomputers by secondary mathematics teachers, and discusses results from a sample of 128 completed questionnaires showing that computer utilization is hampered by inadequate access to equipment, lack of software appropriate to the mathematics curricula, and a lack of guidance for…

  19. Microcomputer Usage in Secondary Marketing Education. A National Study.

    ERIC Educational Resources Information Center

    Searle, A. Gary

    A study was conducted to determine microcomputer hardware, software, and inservice components of secondary marketing education programs. A questionnaire was developed and sent to 420 teacher-coordinators in 42 states. A total of 225 (54 percent) usable returns were tabulated at the University of Wisconsin-Stout Computer Center. Results of the…

  20. Evaluation of Three Microcomputer Teaching Modules. SUMIT Courseware Development Project.

    ERIC Educational Resources Information Center

    Soldan, Ted

    The purpose of this series of experiments was to examine two questions related to the effectiveness of computer assisted instruction (CAI). Can microcomputer modules teach effectively, and do they enhance learning when used as a supplement to traditional teaching methods? Part 1 of this report addresses the former question and part 2 addresses the…

  1. A Microcomputer-Based Interactive Presentation Development System.

    ERIC Educational Resources Information Center

    Moreau, Dennis R.; Dominick, Wayne D.

    1988-01-01

    Reviews research and development projects sponsored by the National Aeronautics and Space Administration (NASA) that address microcomputer-based support for instructional activities at the University of Southwestern Louisiana. Highlights include a graphics project, local area networks, and the Interactive Presentation Development System, which is…

  2. Microcomputer Simulation of Nonlinear Systems: From Oscillations to Chaos.

    ERIC Educational Resources Information Center

    Raw, Cecil J. G.; Stacey, Larry M.

    1989-01-01

    Presents two short microcomputer programs which illustrate features of nonlinear dynamics, including steady states, periodic oscillations, period doubling, and chaos. Logistic maps are explained, inclusion in undergraduate chemistry and physics courses to teach nonlinear equations is discussed, and applications in social and biological sciences…

  3. Planning the Use of Microcomputers in Higher Education Administration.

    ERIC Educational Resources Information Center

    Slovacek, Simeon P.; Dolence, Michael G.

    The process of planning the role of the microcomputer in higher education administration is investigated through a survey of a sample of universities and colleges in California engaged in such efforts, and through a review of literature in education as well as computing. A major objective of the study was to systematically investigate the…

  4. Executive Decision Making: Using Microcomputers in Budget Planning.

    ERIC Educational Resources Information Center

    Hoffman, Roslyn; Robinson, Lucinda

    The successful integration of microcomputer support to help prepare for an anticipated budget crisis at the University of Illinois at Chicago is described. The IBM Personal Computer and VisiCalc software were key tools in the decision support system. When campus executives were instructed to cut budgets and reallocate funds to produce a "Doomsday…

  5. Microcomputer Activities Which Encourage the Reading-Writing Connection.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    Many reading teachers, cognizant of the creative opportunities for skill development allowed by new reading-writing software, are choosing to use microcomputers in their classrooms full-time. Adventure story creation programs capitalize on reading-writing integration by allowing children, with appropriate assistance, to create their own…

  6. Logo Burn-In. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Drexel Univ., Philadelphia, PA. Microcomputing Program.

    This paper describes a hot-stamping operation undertaken at Drexel University in an attempt to prevent computer theft on campus. The program was initiated in response to the University's anticipated receipt of up to 3,000 Macintosh microcomputers per year and the consequent publicity the university was receiving. All clusters of computers (e.g.,…

  7. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  8. The Microcomputer as an Aid in Teaching Dynamic Systems Analysis.

    ERIC Educational Resources Information Center

    Young, Peter

    1985-01-01

    Describes the major aspects of a 20-lecture introductory course for environmental science students, showing how microcomputers can help teach dynamic systems concepts. Also discusses the continuous stirred tank reactor as a simple dynamic system, higher order systems, frequency response, nonlinear systems, and feedback control. (JN)

  9. Development of an In-Plant Microcomputer Literacy Lab.

    ERIC Educational Resources Information Center

    Gacka, Richard C.

    General Electric Company (GE) of Erie, Pennsylvania, hosts onsite literacy classes in conjunction with the Northwest Tri-County Unit. The development of the inplant microcomputer literacy lab expands the offerings available to the participating adult basic education and General Educational Development program students by supplying software and…

  10. Evaluating Microcomputer Courses in a Non-Traditional Setting.

    ERIC Educational Resources Information Center

    Thompson, Gary E.

    Summer Tech '84 was the product of a joint venture of the College of Education of Ohio State University and the Columbus Ohio Public Schools to provide instruction on the use of microcomputers to citizens of the Columbus metropolitan area. For four weeks, 80 different 10-hour classes were offered in eight areas: introductory computer literacy,…

  11. A BASIC Microcomputer Program for Estimating Test Reliability.

    ERIC Educational Resources Information Center

    Cobern, William W.

    This computer program, written in BASIC, performs three different calculations of test reliability: (1) the Kuder-Richardson method; (2) the "common split-half" method; and (3) the Rulon-Guttman split-half method. The program reads sequential-access data files for microcomputers that have been set up by statistical packages such as STATPAC. The…

  12. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves on the efficiency and analysis capabilities of existing database software, with greater flexibility and better documentation. It offers flexibility in the type of data that can be stored, and retrieval is efficient across either the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as TOPEX and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment -- GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
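The dual-domain retrieval described above can be illustrated with a small sketch. In repeat-track altimetry, each measurement belongs to a ground-track "pass" (a spatial bin) and a repeat "cycle" (a time bin), so indexing both keys supports efficient retrieval along either domain. This is only an illustration of the idea, not the actual package's interface; the class, method names, and data are invented:

```python
# Hedged sketch of dual spatial/temporal indexing for repeat-track
# altimetry data. Not the actual Stackfile Database code.
from collections import defaultdict

class StackStore:
    def __init__(self):
        self.by_pass = defaultdict(list)    # spatial-domain index
        self.by_cycle = defaultdict(list)   # time-domain index

    def add(self, cycle, pass_id, value):
        rec = (cycle, pass_id, value)
        self.by_pass[pass_id].append(rec)   # same record, both indexes
        self.by_cycle[cycle].append(rec)

    def along_track(self, pass_id):
        """All repeat cycles for one ground track (a 'stack')."""
        return sorted(self.by_pass[pass_id])

    def snapshot(self, cycle):
        """All ground tracks observed during one repeat cycle."""
        return sorted(self.by_cycle[cycle])

store = StackStore()
for cycle in (1, 2):
    for pass_id in (10, 11):
        store.add(cycle, pass_id, 0.1 * cycle + pass_id)
```

Either query then touches only one index rather than scanning the whole archive.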

  13. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles, and templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user, and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  14. Proposal and Evaluation of Management Method for College Mechatronics Education Applying the Project Management

    NASA Astrophysics Data System (ADS)

    Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto

    In this research, we proposed and evaluated a management method for college mechatronics education that applies project management techniques. We applied our method to the seminar “Microcomputer Seminar” for 3rd-grade students in the Department of Electrical Engineering, Shibaura Institute of Technology, and succeeded in managing the Microcomputer Seminar in 2006. A questionnaire yielded a favorable evaluation of our management method.

  15. Microcomputer based controller for the Langley 0.3-meter Transonic Cryogenic Tunnel

    NASA Technical Reports Server (NTRS)

    Balakrishna, S.; Kilgore, W. Allen

    1989-01-01

    Flow control of the Langley 0.3-meter Transonic Cryogenic Tunnel (TCT) is a multivariable nonlinear control problem. Globally stable control laws were generated to hold tunnel conditions in the presence of geometrical disturbances in the test section and to precisely control the tunnel states for small and large set point changes. The control laws are mechanized as four inner control loops for tunnel pressure, temperature, fan speed, and liquid nitrogen supply pressure, and two outer loops for Mach number and Reynolds number. These integrated control laws have been mechanized on a 16-bit microcomputer running DOS. This document details the model of the 0.3-m TCT, the control laws, the microcomputer realization, and its performance. The tunnel closed-loop responses to small and large set point changes are presented. The controller incorporates safe thermal management of the tunnel cooldown based on thermal restrictions. During aerodynamic data acquisition, in the presence of intrusive geometrical changes such as flexwall movement, angle-of-attack changes, and drag rake traverses, the controller was shown to hold temperature to within +/- 0.2 K, pressure to within +/- 0.07 psia, and Mach number to within +/- 0.002 of a given set point. The controller also provides a new feature of Reynolds number control, and it delivers safe, reliable, and economical control of the 0.3-m TCT.

  16. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low-cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system comprises real-time video digitising hardware that interfaces directly to the Archimedes memory, and software that provides an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line, with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood, and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen, and program control is directed mostly by pop-up menus.

  17. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    NASA Technical Reports Server (NTRS)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  18. Biofuel Database

    National Institute of Standards and Technology Data Gateway

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  19. Microcomputer-Based Curriculum Mapping: A Data Management Approach.

    ERIC Educational Resources Information Center

    Eisenberg, Michael

    Curriculum administration and evaluation require specific information on such curriculum components as content, time, teaching methodology, materials, evaluation, and scheduling. Without such information, planning, coordination, resources allocation, and other decision-making activities are severely handicapped. Furthermore, evaluation of the…

  20. Microcomputers in Education: Uses for the '80s. Proceedings of the Annual Microcomputer Conference (2nd, Tempe, Arizona, January 15-16, 1982). Publication No. 3.

    ERIC Educational Resources Information Center

    Watson, Nancy Ralph, Ed.

    The 30 conference papers in this collection are presented in 6 categories. Five overviews discuss innovative uses of computers in education (Dorothy K. Deringer); microcomputers in instructional research (Alan M. Lesgold); microcomputers in the schools (Mitchell Batoff, Gary G. Bitter); and the courseware crisis (Barbara R. Sadowski). Research and…