Sample records for data-base management system

  1. Computer-assisted engineering data base

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Johnson, H. R.

    1983-01-01

    General capabilities of data base management technology are described. Information requirements posed by the space station life cycle are discussed, and it is asserted that data base management technology supporting engineering/manufacturing in a heterogeneous hardware/data base management system environment should be applied to meeting these requirements. Today's commercial systems do not satisfy all of these requirements. The features of an R&D data base management system being developed to investigate data base management in the engineering/manufacturing environment are discussed. Features of this system represent only a partial solution to space station requirements. Areas where this system should be extended to meet full space station information management requirements are discussed.

  2. Data base management system analysis and performance testing with respect to NASA requirements

    NASA Technical Reports Server (NTRS)

    Martin, E. A.; Sylto, R. V.; Gough, T. L.; Huston, H. A.; Morone, J. J.

    1981-01-01

    Several candidate data base management systems (DBMSs) that could support the NASA End-to-End Data System's Integrated Data Base Management System (IDBMS) Project, later rescoped and renamed the Packet Management System (PMS), were evaluated. The candidate systems, which had to run on the Digital Equipment Corporation VAX 11/780, were ORACLE, SEED, and RIM. ORACLE and RIM are both based on the relational data base model, while SEED employs a CODASYL network approach. A single data base application, which managed stratospheric temperature profiles, was studied. The primary reasons for using this application were an insufficient volume of available PMS-like data, a mandate to use actual rather than simulated data, and the abundance of available temperature profile data.
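
An aside to make the relational-versus-network contrast above concrete: the sketch below (all table names, columns, and sample values are invented for illustration) stores a stratospheric temperature profile relationally using Python's built-in sqlite3, where a single declarative join replaces the record-at-a-time owner-to-member navigation a CODASYL network system such as SEED would perform.

```python
import sqlite3

# Illustrative only: a minimal relational layout for temperature-profile data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE profile (
    profile_id INTEGER PRIMARY KEY,
    station    TEXT,
    obs_date   TEXT)""")
conn.execute("""CREATE TABLE level (
    profile_id   INTEGER REFERENCES profile(profile_id),
    pressure_hpa REAL,
    temp_k       REAL)""")
conn.execute("INSERT INTO profile VALUES (1, 'WALLOPS', '1980-07-01')")
conn.executemany("INSERT INTO level VALUES (?, ?, ?)",
                 [(1, 100.0, 228.5), (1, 50.0, 217.1), (1, 10.0, 231.9)])

# One declarative join; no explicit navigation from owner to member records.
rows = conn.execute("""SELECT p.station, l.pressure_hpa, l.temp_k
                       FROM profile p JOIN level l USING (profile_id)
                       ORDER BY l.pressure_hpa DESC""").fetchall()
print(rows)
```

A network DBMS would instead have the program locate the owner (profile) record and loop over its member set explicitly, one record at a time.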

  3. Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.

    DTIC Science & Technology

    1993-05-01

    Data Management Systems: components used to store, manage, and retrieve data. Data management includes knowledge bases, database management … Application Development Tools and Methods: X/Open and POSIX APIs, Integrated Design Support System (IDS), Knowledge-Based Systems (KBSs) … IDEF1x, Yourdon, Jackson System Design (JSD), Structured Systems Development (SSD), Semantic Unification Meta-Model

  4. Management Principles to be Considered for Implementing a Data Base Management System Aboard U.S. (United States) Naval Ships under the Shipboard Non-Tactical ADP (Automated Data Processing) Program.

    DTIC Science & Technology

    1982-12-01

    Master's thesis: Management Principles to be Considered for Implementing a Data Base Management System Aboard U.S. Naval Ships Under the Shipboard Non-tactical ADP Program, by Robert Harrison Dixon, December 1982. Keywords: Data Base Management System, DBMS, SNAP, SNAP I, SNAP II, Information …

  5. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  6. Data communication between data terminal equipment and the JPL administrative data base management system

    NASA Technical Reports Server (NTRS)

    Iverson, R. W.

    1984-01-01

    Approaches to enabling an installed base of mixed data terminal equipment to access a data base management system designed to work with a specific terminal are discussed. The approach taken by the Jet Propulsion Laboratory (JPL) is described. Background information on JPL and its organization, along with a description of the Administrative Data Base Management System, is included.

  7. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    PubMed Central

    Rusnak, James E.

    1987-01-01

    Due to previous systems selections, many hospitals (health care facilities) are faced with the problem of fragmented data bases containing clinical, demographic, and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated data base. The number of CMMS candidate systems is often restricted due to data base and system interface issues. The hospital must ensure that the CMMS project provides a means to implement an integrated on-line hospital information data base for use by departments operating under a DRG-based Prospective Payment System. This paper presents guidelines for use in selecting a Case Mix Management System to meet the hospital's financial and operations planning, budgeting, marketing, and other management needs, while considering the data base implications of the implementation.

  8. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

    Strategies for managing computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on Integrated Programs for Aerospace-Vehicle Design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer-based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  9. Data management system for USGS/USEPA urban hydrology studies program

    USGS Publications Warehouse

    Doyle, W.H.; Lorens, J.A.

    1982-01-01

    A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS), which was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental-practices inventory data. Storm-event water-quality characteristics are also stored in the data base. The advantages of using SAS to create and manage a data base are many: it is simple and easy to use, contains a comprehensive statistical package, and allows files to be modified very easily. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advances made in computer technology during this era. Urban stormwater data is, however, just one application for which the system can be used. (USGS)

  10. Information Systems for University Planning.

    ERIC Educational Resources Information Center

    Robinson, Robert J.

    This paper proposes construction of a separate data base environment for university planning information, distinct from data bases and systems supporting operational functioning and management. The data base would receive some of its input from the management information systems (MIS)/transactional data bases and systems through a process of…

  11. Data base management study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  12. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
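
The idealized quarter-quarter subdivision mentioned above is easy to sketch. The function below is a hypothetical illustration (the function name and label format are invented, though labeling a cell as a quarter of a quarter, e.g. 'NWNE', follows common Public Land Survey System usage); it splits one survey section into its 4x4 grid of sixteen cells.

```python
def quarter_quarters():
    """Label the 16 idealized quarter-quarter cells of one PLSS section."""
    halves_ns = {0: "N", 1: "S"}   # top half of the grid is north
    halves_ew = {0: "W", 1: "E"}   # left half of the grid is west
    cells = []
    for row in range(4):
        for col in range(4):
            quarter = halves_ns[row // 2] + halves_ew[col // 2]  # which quarter
            sub = halves_ns[row % 2] + halves_ew[col % 2]        # which sub-cell
            cells.append(sub + quarter)  # e.g. 'NW' of the 'NE' quarter -> 'NWNE'
    return cells

cells = quarter_quarters()
print(len(cells), cells[0], cells[-1])
```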

  13. Vibroacoustic payload environment prediction system (VAPEPS): Data base management center remote access guide

    NASA Technical Reports Server (NTRS)

    Thomas, V. C.

    1986-01-01

    A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.

  14. Engineering data management: Experience and projections

    NASA Technical Reports Server (NTRS)

    Jefferson, D. K.; Thomson, B.

    1978-01-01

    Experiences in developing a large engineering data management system are described. Problems which were encountered are presented and projected to future systems. Business applications involving similar types of data bases are described. A data base management system architecture proposed by the business community is described and its applicability to engineering data management is discussed. It is concluded that the most difficult problems faced in engineering and business data management can best be solved by cooperative efforts.

  15. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  16. The ABC's required for establishing a practical computerized plant engineering management data base system

    NASA Technical Reports Server (NTRS)

    Maiocco, F. R.; Hume, J. P.

    1976-01-01

    A systems approach is outlined in the paper to assist facility and plant engineers in improving their organization's data management system. The six basic steps identified may appear somewhat simple; however, adequate planning, proper resources, and the involvement of management will determine the success of a computerized facility management data base. Helpful suggestions are noted throughout the paper to ensure the development of a practical computerized data management system.

  17. RIM as the data base management system for a material properties data base

    NASA Technical Reports Server (NTRS)

    Karr, P. H.; Wilson, D. J.

    1984-01-01

    Relational Information Management (RIM) was selected as the data base management system for a prototype engineering materials data base. The data base provides a central repository for engineering material properties data, which facilitates their control. Numerous RIM capabilities are exploited to satisfy prototype data base requirements. Numerical, text, tabular, and graphical data and references are being stored for five material types. Data retrieval will be accomplished both interactively and through a FORTRAN interface. The experience gained in creating and exercising the prototype will be used in specifying requirements for a production system.

  18. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  19. Reference manual for data base on Nevada water-rights permits

    USGS Publications Warehouse

    Cartier, K.D.; Bauer, E.M.; Farnham, J.L.

    1995-01-01

    The U.S. Geological Survey and Nevada Division of Water Resources have cooperatively developed and implemented a data-base system for managing water-rights permit information for the State of Nevada. The Water-Rights Permit data base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage and access to water information from the State Engineer's office. The data base contains a main table, three ancillary tables, and five lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of data tables and data-entry screens.

  20. User requirements for NASA data base management systems. Part 1: Oceanographic discipline

    NASA Technical Reports Server (NTRS)

    Fujimoto, B.

    1981-01-01

    Generic oceanographic user requirements were collected and analyzed for use in developing a general multipurpose data base management system for future missions of the Office of Space and Terrestrial Applications (OSTA) of NASA. The collection of user requirements involved: studying the state of the art in data base management systems; analyzing the results of related studies; formulating a viable and diverse list of scientists to be interviewed; developing a presentation format and materials; and interviewing oceanographic data users. More effective data management systems are needed to handle the increasing influx of data.

  21. CAD/CAM data management needs, requirements and options

    NASA Technical Reports Server (NTRS)

    Lopatka, R. S.; Johnson, T. G.

    1978-01-01

    The requirements for a data management system in support of technical or scientific applications and possible courses of action were reviewed. Specific requirements were evolved while working towards higher level integration impacting all phases of the current design process and through examination of commercially marketed systems and related data base research. Arguments are proposed for varied approaches in implementing data base systems ranging from no action necessary to immediate procurement of an existing data base management system.

  22. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  23. Development of an interactive data base management system for capturing large volumes of data.

    PubMed

    Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L

    1995-10-01

    Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.

  24. Web-based data acquisition and management system for GOSAT validation Lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra N.; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2012-11-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). The DAS, written in Perl, acquires AMeDAS ground-level meteorological data, rawinsonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, sky-view camera images, meteorological satellite IR image data, and GOSAT validation lidar data. The DMS, written in PHP, presents satellite-pass dates and all acquired data.

  25. A distributed data base management system. [for Deep Space Network]

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1975-01-01

    Major system design features of a distributed data management system for the NASA Deep Space Network (DSN) designed for continuous two-way deep space communications are described. The reasons for which the distributed data base utilizing third-generation minicomputers is selected as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.

  26. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  27. A Data Management System for Multi-Phase Case-Control Studies

    PubMed Central

    Gibeau, Joanne M.; Steinfeldt, Lois C.; Stine, Mark J.; Tullis, Katherine V.; Lynch, H. Keith

    1983-01-01

    The design of a computerized system for the management of data in multi-phase epidemiologic case-control studies is described. Typical study phases include case-control selection, abstracting of data from medical records, and interview of study subjects or next of kin. In consultation with project personnel, requirements for the system were established: integration of data from all study phases into one data base, accurate follow-up of subjects through the study, sophisticated data editing capabilities, ready accessibility of specified programs to project personnel, and generation of current status and exception reports for project management. SIR (Scientific Information Retrieval), a commercially available data base management system, was selected as the foundation of this system. The system forms a comprehensive data management system applicable to many types of public health research studies.

  28. An event-version-based spatio-temporal modeling approach and its application in the cadastral management

    NASA Astrophysics Data System (ADS)

    Li, Yangdong; Han, Zhen; Liao, Zhongping

    2009-10-01

    Cadastral information is characterized by spatiality, temporality, legality, accuracy, and continuality, and cadastral management demands that cadastral data be accurate, integrated, and updated in a timely fashion. A well-designed GIS management system is therefore a natural fit for cadastral data, which are inherently spatial and temporal. Because no sound spatio-temporal data model has been adopted, however, the spatio-temporal characteristics of cadastral data are not well expressed in existing cadastral management systems. An event-version-based spatio-temporal modeling approach is first proposed from the perspectives of event and version. With its help, an event-version-based spatio-temporal cadastral data model is then built to represent spatio-temporal cadastral data. Finally, this model is used in the design and implementation of a spatio-temporal cadastral management system. Application of the system shows that the event-version-based spatio-temporal data model is well suited to the representation and organization of cadastral data.

  29. Environmental Files and Data Bases. Part A. Introduction and Oceanographic Management Information System.

    DTIC Science & Technology

    1981-09-01

    Keywords: Management Information System, Naval Oceanography Program, Naval Oceanographic Requirements, Acoustic Reference Service, Research Vehicle … Contents include chapters on the Oceanographic Management Information System, acoustic data, and geological and geophysical data.

  30. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office (CISO) is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative data base management systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Previous administrative data base systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few project and management offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  31. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data-knowledge base system is described. The relational database is well suited for distribution due to its properties of allowing data fragmentation and fragmentation transparency. An example of a simple relational data-knowledge base is formulated which may be generalized for use in developing a relational distributed data-knowledge base system. The efficiency and ease of application of such a data-knowledge base management system are briefly discussed. Also discussed are the potential of the developed model for sharing the data-knowledge base and the possible areas of difficulty in implementing the relational data-knowledge base management system.
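
A toy sketch of the fragmentation property the abstract relies on (all names and data invented): rows of one logical relation are split horizontally across "sites", and a query routine reunites them, so callers never see the split; this is fragmentation transparency.

```python
# Horizontal fragments of one logical relation, keyed by the site holding them.
FRAGMENTS = {
    "site_a": [{"id": 1, "region": "east", "value": 10}],
    "site_b": [{"id": 2, "region": "west", "value": 20},
               {"id": 3, "region": "west", "value": 30}],
}

def select_all(predicate=lambda row: True):
    """Query the logical relation; the caller is unaware of the fragments."""
    return sorted((r for frag in FRAGMENTS.values() for r in frag if predicate(r)),
                  key=lambda r: r["id"])

rows = select_all(lambda r: r["value"] >= 20)
print([r["id"] for r in rows])
```

The caller writes its predicate against the logical relation only; which site stores which rows is an implementation detail of `select_all`.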

  32. SUPERFUND SOILS DATA MANAGEMENT SYSTEM

    EPA Science Inventory

    This paper describes the Superfund Soil Data Management System (DMS), a PC-based data system being developed by the U.S. Environmental Protection Agency (EPA) in its effort to manage and evaluate treatment and performance data for contaminated soil, sludge, and debris. This system…

  33. A conceptual design for an integrated data base management system for remote sensing data. [user requirements and data processing]

    NASA Technical Reports Server (NTRS)

    Maresca, P. A.; Lefler, R. M.

    1978-01-01

    The requirements of potential users were considered in the design of an integrated data base management system, developed to be independent of any specific computer or operating system, and to be used to support investigations in weather and climate. Ultimately, the system would expand to include data from the agriculture, hydrology, and related Earth resources disciplines. An overview of the system and its capabilities is presented. Aspects discussed cover the proposed interactive command language; the application program command language; storage and tabular data maintained by the regional data base management system; the handling of data files and the use of system standard formats; various control structures required to support the internal architecture of the system; and the actual system architecture with the various modules needed to implement the system. The concepts on which the relational data model is based; data integrity, consistency, and quality; and provisions for supporting concurrent access to data within the system are covered in the appendices.

  34. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

    Characteristics required for NASA scientific data base management applications are listed, as are performance-testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying data base designs and various data base management system parameters are valuable to applications for choosing packages and are critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  35. Management Data Base Development.

    ERIC Educational Resources Information Center

    Dan, Robert L.

    1975-01-01

    A management data base is seen as essential for a management information system, program budgeting, program costing, management by objectives, program evaluation, productivity measures, and accountability in institutions of higher education. The necessity of a management data base is addressed, along with the benefits and limitations it may have…

  36. Design and realization of confidential data management system RFID-based

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Zhong; Wang, Xin

    2017-03-01

    This paper introduces the composition of the RFID system, then analyzes the hardware and software design of the system, and finally summarizes the realization and application of the RFID-based confidential data management system.

  37. Pan Air Geometry Management System (PAGMS): A data-base management system for PAN AIR geometry data

    NASA Technical Reports Server (NTRS)

    Hall, J. F.

    1981-01-01

    A data-base management system called PAGMS was developed to facilitate the data transfer in applications computer programs that create, modify, plot or otherwise manipulate PAN AIR type geometry data in preparation for input to the PAN AIR system of computer programs. PAGMS is composed of a series of FORTRAN callable subroutines which can be accessed directly from applications programs. Currently only a NOS version of PAGMS has been developed.

  38. Fault management for data systems

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann

    1993-01-01

    Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
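
As a minimal illustration of the graph-based idea (not the authors' actual models; the dependency graph and component names are invented), fault diagnosis can walk backward over a directed dependency graph from an observed symptom to candidate root causes:

```python
# component -> components it depends on (failure propagates along these edges)
DEPENDS_ON = {
    "telescope_drive": ["motor_ctrl", "power_bus"],
    "motor_ctrl": ["power_bus"],
    "camera": ["power_bus"],
}

def candidate_causes(symptom):
    """All components whose failure could explain the observed symptom."""
    causes, stack = set(), [symptom]
    while stack:
        comp = stack.pop()
        for dep in DEPENDS_ON.get(comp, []):
            if dep not in causes:
                causes.add(dep)
                stack.append(dep)   # a cause's own dependencies are also suspect
    return sorted(causes)

print(candidate_causes("telescope_drive"))
```

A response stage would then rank or test these candidates; the traversal above only narrows the search.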

  19. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    'ADMS+/-' is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of workstation data base systems, designated ADMS-; no communication takes place between these workstations. The use of this system radically decreases the response time of locally processed queries, since each workstation runs in single-user mode and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy reduces update-synchronization overhead and message traffic.

  20. A means to an end: a web-based client management system in palliative care.

    PubMed

    O'Connor, Margaret; Erwin, Trudy; Dawson, Linda

    2009-03-01

    Home-based palliative care (hospice) services require comprehensive and fully integrated information systems to develop and manage the various aspects of their business, incorporating client data and management information. These systems assist in maintaining the quality of client care as well as improved management efficiencies. This article reports on a large not-for-profit home-based palliative care service in Australia, which embarked on a project to develop an electronic data management system specifically designed to meet the needs of the palliative care sector. This web-based client information management system represents a joint venture between the organization and a commercial company and has been a very successful project.

  1. The data base management system alternative for computing in the human services.

    PubMed

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

  2. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  3. Development and Demonstration of a Statistical Data Base System for Library and Network Planning and Evaluation. Fourth Quarterly Report.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has completed the development and demonstration of a library statistical data base. The data base, or management information system, was developed for administrators of public and academic libraries. The system provides administrators with a framework of information and…

  4. Wiki-based Data Management System for Toxicogenomics

    EPA Science Inventory

    We are developing a data management system to enable systems-based toxicology at the US EPA. This is built upon the WikiLIMS platform and is capable of housing not just genomics data but also a wide variety of toxicology data and associated experimental design information. Thi...

  5. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  6. Solid Waste Information Management System (SWIMS). Data summary, fiscal year 1980

    NASA Astrophysics Data System (ADS)

    Batchelder, H. M.

    1981-05-01

    The solid waste information management system (SWIMS) maintains computerized records on a master data base. It provides a comprehensive system for cataloging and assembling data into output reports. The SWIMS data base contains information on the transuranic (TRU) and low level waste (LLW) generated, buried, or stored.

  7. A Data-Based Financial Management Information System (FMIS) for Administrative Sciences Department

    DTIC Science & Technology

    1990-12-01

    Financial Management Information System that would result in improved management of financial assets, better use of clerical skills, and more detailed...develops and implements a personal computer-based Management Information System for the management of the many funding accounts controlled by the...different software programs, into a single all-encompassing Management Information System. The system was written using dBASE IV and is currently operational.

  8. The research and development of water resources management information system based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai

    Given the large amount of data and the complexity of data types and formats in water resources management, we built a water resources calculation model and established a water resources management information system based on the ArcGIS and Visual Studio .NET development platforms. The system can integrate spatial data and attribute data organically and manage them uniformly. It can analyze spatial data, support bidirectional queries between maps and data, automatically produce various charts and report forms, link multimedia information, manage the database, and so on. It can therefore provide spatial and static synthetic information services for the study, management, and decision-making of water resources, regional geology, eco-environment, etc.

  9. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

    Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic, and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data, is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)

  10. Technology for organization of the onboard system for processing and storage of ERS data for ultrasmall spacecraft

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.

    2017-10-01

    Processing and analyzing Earth remote sensing data on board an ultra-small spacecraft is a pressing task, given the significant energy expenditure of data transfer and the low performance of onboard computers. This raises the issue of effective and reliable storage of the overall information flow obtained from onboard data-collection systems, including Earth remote sensing data, in a specialized data base. The paper considers the peculiarities of database management system operation with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the data base and contains the parameters required for loading information. This structure reduces the memory occupied by the data base, because key values need not be stored separately. The paper presents the architecture of a relational database management system designed for embedding in the onboard software of an ultra-small spacecraft. Such a database management system can be used to build data bases for storing different kinds of information, including Earth remote sensing data, for subsequent processing. The suggested architecture places low demands on the computing power and memory resources available on board an ultra-small spacecraft. Data integrity is ensured during input and modification of the structured information.
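    The space saving described above (key values not stored separately) can be sketched as a toy record store in which a record's key is implicit in its slot position within a packed block; the record layout and field names here are hypothetical illustrations, not taken from the paper.

```python
import struct

# Hypothetical fixed-size record: sensor id (uint16), latitude and longitude (float32).
RECORD_FMT = "<Hff"
RECORD_SIZE = struct.calcsize(RECORD_FMT)  # 10 bytes per record

def write_block(records):
    """Pack records back to back; a record's key is simply its slot index."""
    return b"".join(struct.pack(RECORD_FMT, *r) for r in records)

def read_record(block, key):
    """Recover a record by key with no stored key column: the key maps to an offset."""
    return struct.unpack_from(RECORD_FMT, block, key * RECORD_SIZE)

block = write_block([(1, 55.75, 37.61), (2, 59.94, 30.31)])
print(read_record(block, 1))
```

    Because no key column is materialized, the block costs exactly `RECORD_SIZE` bytes per record; the trade-off is that records must be fixed-size and addressed positionally.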

  11. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  12. The Blood Stocks Management Scheme, a partnership venture between the National Blood Service of England and North Wales and participating hospitals for maximizing blood supply chain management.

    PubMed

    Chapman, J F; Cook, R

    2002-10-01

    The Blood Stocks Management Scheme (BSMS) has been established as a joint venture between the National Blood Service (NBS) in England and North Wales and participating hospitals to monitor the blood supply chain. Stock and wastage data are submitted to a web-based data-management system, facilitating continuous and complete red cell data collection and 'real time' data extraction. The data-management system enables peer review of performance in respect of stock holding levels and red cell wastage. The BSMS has developed an innovative web-based data-management system that enables data collection and benchmarking of practice, which should drive changes in stock management practice, therefore optimizing the use of donated blood.

  13. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system including data, logic and presentation tier. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local areas. While the spatial database hinders processing raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML structured information of the SLD and metadata are stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. 
The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
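    The retrieval pattern the abstract describes (tagging each dataset with spatial, thematic, and temporal reference objects, then finding all data in a given administrative unit for a given theme) can be sketched as follows; the dataset names and tag values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dataset:
    name: str
    region: str   # spatial reference object, e.g. an administrative unit
    theme: str    # thematic reference object
    year: int     # temporal reference object

CATALOG = [
    Dataset("census_2009_cantho", "Can Tho", "demographics", 2009),
    Dataset("water_quality_2010", "Can Tho", "water management", 2010),
    Dataset("landuse_2010_delta", "Mekong Delta", "environment", 2010),
]

def find(region=None, theme=None, year=None):
    """Retrieve all datasets whose semantic tags match every given criterion."""
    return [d for d in CATALOG
            if (region is None or d.region == region)
            and (theme is None or d.theme == theme)
            and (year is None or d.year == year)]

print([d.name for d in find(region="Can Tho")])
```

    In the real system these tags live in a spatial database rather than in memory, but the query shape is the same: intersect the spatial, thematic, and temporal reference groups attached to each dataset.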

  14. Census Data System of the Management Information System for Occupational Education: Guidelines and Instructions for Reporting. CDS Document No. 1.

    ERIC Educational Resources Information Center

    Management and Information System for Occupational Education, Winchester, MA.

    The MISOE Census Data System (CDS) is one of two major subsystems of an integrated management information system (MISOE), which was developed to provide occupational education managers with comprehensive data on which to base rational management decisions. Essentially, CDS contains descriptive information systematically structured in a manner…

  15. Reference manual for data base on Nevada well logs

    USGS Publications Warehouse

    Bauer, E.M.; Cartier, K.D.

    1995-01-01

    The U.S. Geological Survey and Nevada Division of Water Resources are cooperatively using a data base for managing well-log information for the State of Nevada. The Well-Log Data Base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage and access to water information from the State Engineer's office. The data base contains a main table, two ancillary tables, and nine lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of data tables and data-entry screens.

  16. Using expert systems to implement a semantic data model of a large mass storage system

    NASA Technical Reports Server (NTRS)

    Roelofs, Larry H.; Campbell, William J.

    1990-01-01

    The successful development of large volume data storage systems will depend not only on the ability of the designers to store data, but on the ability to manage such data once it is in the system. The hypothesis is that mass storage data management can only be implemented successfully based on highly intelligent meta data management services. There now exists a mass store system standard proposed by the IEEE that addresses many of the issues related to the storage of large volumes of data; the model does not, however, consider a major technical issue, namely the high-level management of stored data. If the model were expanded to include the semantics and pragmatics of the data domain using a Semantic Data Model (SDM) concept, the result would be data that is expressive of the Intelligent Information Fusion (IIF) concept and also organized and classified in context to its use and purpose. Results are presented from a demonstration prototype SDM implemented using the expert system development tool NEXPERT OBJECT. In the prototype, a simple instance of an SDM was created to support a hypothetical application for the Earth Observing System Data Information System (EOSDIS). The massive amounts of data that EOSDIS will manage require the definition and design of a powerful information management system in order to support even the most basic needs of the project. The application domain is characterized by a semantic-like network that represents the data content and the relationships between the data, based on user views and the more generalized domain architectural view of the information world. The data in the domain are represented by objects that define classes, types, and instances of the data. In addition, data properties are selectively inherited between parent and daughter relationships in the domain.
Based on the SDM, a simple information system design is developed, from the low-level data storage media through record management and meta data management to the user interface.

  17. Wiki-based Data Management to Support Systems Toxicology*

    EPA Science Inventory

    As the field of toxicology relies more heavily on systems approaches for mode of action discovery, evaluation, and modeling, the need for integrated data management is greater than ever. To meet these needs, we developed a flexible data management system that assists scientists ...

  18. Recommendation System Based On Association Rules For Distributed E-Learning Management Systems

    NASA Astrophysics Data System (ADS)

    Mihai, Gabroveanu

    2015-09-01

    Traditional Learning Management Systems are installed on a single server, where learning materials and user data are kept. To increase performance, the Learning Management System can be installed on multiple servers, with learning materials and user data distributed across them, yielding a Distributed Learning Management System. This paper proposes the prototype of a recommendation system based on association rules for a Distributed Learning Management System. Information from the LMS databases is analyzed using distributed data mining algorithms in order to extract association rules, which are then used as inference rules to provide personalized recommendations. The quality of the recommendations is improved because the rules used for inference are more accurate, since they aggregate knowledge from all e-Learning systems included in the Distributed Learning Management System.

  19. Forest management applications of Landsat data in a geographic information system

    NASA Technical Reports Server (NTRS)

    Maw, K. D.; Brass, J. A.

    1982-01-01

    The utility of land-cover data resulting from Landsat MSS classification can be greatly enhanced by use in combination with ancillary data. A demonstration forest management applications data base was constructed for Santa Cruz County, California, to demonstrate geographic information system applications of classified Landsat data. The data base contained detailed soils, digital terrain, land ownership, jurisdictional boundaries, fire events, and generalized land-use data, all registered to a UTM grid base. Applications models were developed from problems typical of fire management and reforestation planning.

  20. Five Years Experience with the CLINFO Data Base Management and Analysis System

    PubMed Central

    Johnston, Howard B.; Higgins, Stanley B.; Harris, Thomas R.; Lacy, William W.

    1982-01-01

    The CLINFO data base management and analysis system is the result of a project sponsored by the National Institutes of Health (NIH) to identify data management and data analysis activities that are critical to clinical investigation. In February of 1977, one of the three prototype CLINFO systems developed by the RAND Corporation was installed in the Clinical Research Center (CRC) at Vanderbilt University Medical Center. The Vanderbilt experience with this CLINFO system over the past five years is described. Its impact on the way clinical research data has been managed and analyzed is discussed in terms of utilization by more than 100 clinical investigators and their staff. The Vanderbilt evaluation of the system and additional information on its usage since the original evaluation is presented. Factors in the design philosophy of CLINFO which create an environment that enhances the clinical investigator's capabilities to perform computer data management and analysis of his data are discussed.

  1. A Management Information System for Bare Base Civil Engineering Commanders

    DTIC Science & Technology

    1988-09-01

    initial beddown stage. The purpose of this research was to determine the feasibility of developing a microcomputer based management information system (MIS...the software best suited to synthesize four of the categories into a prototype field MIS. Keywords: Management information system, Bare bases, Civil engineering, Data bases, Information retrieval.

  2. The Emerging Role of the Data Base Manager. Report No. R-1253-PR.

    ERIC Educational Resources Information Center

    Sawtelle, Thomas K.

    The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…

  3. Langley experience with ADABAS/NATURAL

    NASA Technical Reports Server (NTRS)

    Swanson, A.

    1984-01-01

    The use of the data base management system ADABAS and the companion software NATURAL and COM-PLETE at the Langley Research Center is evaluated. A brief overview of data base management system technology is provided as well as system upgrading, user requirements, and use of the system for administrative support.

  4. Identifying and Validating Requirements of a Mobile-Based Self-Management System for People Living with HIV.

    PubMed

    Mehraeen, Esmaeil; Safdari, Reza; Seyedalinaghi, Seyed Ahmad; Mohammadzadeh, Niloofar; Arji, Goli

    2018-01-01

    Due to the widespread use of mobile technology and its low cost, implementing a mobile-based self-management system can lead to adherence to medication regimens and promote the health of people living with HIV (PLWH). We aimed to identify the requirements of a mobile-based self-management system and validate them from the perspective of infectious diseases specialists. This is a mixed-methods study that was carried out in two main phases. In the first phase, we identified requirements of a mobile-based self-management system for PLWH. In the second phase, the identified requirements were validated using a researcher-made questionnaire. The statistical population was infectious diseases specialists affiliated with Tehran University of Medical Sciences. The collected data were analyzed using SPSS statistical software (version 19) and descriptive statistics. Through a full-text review of selected studies, we determined the requirements of a mobile-based self-management system in four categories: demographic, clinical, strategic, and technical capabilities. According to the findings, 6 data elements were selected for the demographic category, 11 data elements for the clinical category, 10 items for self-management strategies, and 11 features for technical capabilities. Using the identified preferences, it is possible to design and implement a mobile-based self-management system for HIV-positive people. Developing such a system is expected to improve the self-management skills of PLWH, improve medication regimen adherence, and facilitate communication with healthcare providers.

  5. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  6. AOIPS data base management systems support for GARP data sets

    NASA Technical Reports Server (NTRS)

    Gary, J. P.

    1977-01-01

    A data base management system is identified, developed to provide flexible access to data sets produced by GARP during its data systems tests. The content and coverage of the data base are defined and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user specified data subsets, is described. The computer programs developed to provide the capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user interactive but menu guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user selected window defined by time and latitude/longitude boundaries. The DRS permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.

  7. Case Studies of Ecological Integrative Information Systems: The Luquillo and Sevilleta Information Management Systems

    NASA Astrophysics Data System (ADS)

    San Gil, Inigo; White, Marshall; Melendez, Eda; Vanderbilt, Kristin

    The thirty-year-old United States Long Term Ecological Research Network has developed extensive metadata to document their scientific data. Standard and interoperable metadata is a core component of the data-driven analytical solutions developed by this research network. Content management systems offer an affordable solution for rapid deployment of metadata-centered information management systems. We developed a customized integrative metadata management system based on the Drupal content management system technology. Building on knowledge and experience with the Sevilleta and Luquillo Long Term Ecological Research sites, we successfully deployed the first two medium-scale customized prototypes. In this paper, we describe the vision behind our Drupal-based information management instances and list the features offered through these Drupal-based systems. We also outline the plans to expand the information services offered through these metadata-centered management systems. We conclude with the growing list of participants deploying similar instances.

  8. Data management in engineering

    NASA Technical Reports Server (NTRS)

    Browne, J. C.

    1976-01-01

    An introduction to computer based data management is presented with an orientation toward the needs of engineering application. The characteristics and structure of data management systems are discussed. A link to familiar engineering applications of computing is established through a discussion of data structure and data access procedures. An example data management system for a hypothetical engineering application is presented.

  9. Data Base Management Systems Panel Workshop: Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Data base management systems (DBMS) for space acquired and associated data are discussed. The full range of DBMS needs is covered including acquiring, managing, storing, archiving, accessing and dissemination of data for an application. Existing bottlenecks in DBMS operations, expected developments in the field of remote sensing, communications, and computer science are discussed, and an overview of existing conditions and expected problems is presented. The requirements for a proposed spatial information system and characteristics of a comprehensive browse facility for earth observations applications are included.

  10. Feature Analysis of Generalized Data Base Management Systems.

    ERIC Educational Resources Information Center

    Conference on Data Systems Languages, Monroeville, PA. Systems Committee.

    A more complete definition of the features offered in present day generalized data base management systems is provided by this second technical report of the CODASYL Systems Committee. In a tutorial format, each feature description is followed by either narrative information covering ten systems or by a table for all systems. The ten systems…

  11. Study on parallel and distributed management of RS data based on spatial data base

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Liu, Shijin

    2006-12-01

    With the rapid development of earth-observing technology, RS image data storage, management, and information publication have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, in a distributed environment the background server can hardly handle the heavy processing of the great volume of RS data stored at different nodes, so a heavy burden falls on the background server. Second, there is no unique, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data. The system aims at an RS data information system based on a parallel background server and a distributed data management system. Toward these two goals, this paper studies the following key techniques and draws some instructive conclusions. The paper puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For data storage, RS data is not divided into binary large objects stored in a conventional relational database system; instead, it is reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. In system architecture, this paper sets up a framework based on a parallel server composed of several common computers. Under this framework, the background process is divided into two parts: the common web process and the parallel process.
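    The "Pyramid, Block, Layer, Epoch" solid index can be illustrated with a toy in-memory store keyed by a four-part composite key; the class and field values below are hypothetical, not taken from the paper.

```python
from collections import namedtuple

# Hypothetical composite index: resolution level (pyramid), spatial tile (block),
# spectral band (layer), and acquisition period (epoch).
TileKey = namedtuple("TileKey", "pyramid block layer epoch")

class ImageStore:
    """Toy logical image database keyed by the four-part solid index."""
    def __init__(self):
        self._tiles = {}

    def put(self, key, data):
        self._tiles[key] = data

    def query(self, **criteria):
        """Return all keys matching every given index dimension."""
        return [k for k in self._tiles
                if all(getattr(k, dim) == val for dim, val in criteria.items())]

store = ImageStore()
store.put(TileKey(pyramid=2, block=(10, 4), layer="NIR", epoch="2006-06"), b"...")
store.put(TileKey(pyramid=2, block=(10, 4), layer="RED", epoch="2006-06"), b"...")
print(store.query(pyramid=2, layer="NIR"))
```

    Because every tile of every sensor is addressed by the same four dimensions, queries such as "all NIR tiles at resolution level 2" reduce to matching along the index, regardless of which sensor produced the data.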

  12. Managing data from remeasured plots: An evaluation of existing systems

    Treesearch

    John C. Byrne; Michael D. Sweet

    1992-01-01

    Proper management of the valuable data from remeasured (or permanent) forest growth plots with data base management systems (DBMS) can greatly add to their utility. Twelve desired features for such a system (activities that facilitate the storage, accuracy, and use of the data for analysis) are described and used to evaluate the 36 systems found by a survey conducted...

  13. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operating on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general enough to handle other acceptable user-definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resulting MINI System provides all of these capabilities and several other features that complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on a third. MINIS is operational on four different data bases.

  14. Technology-based management of environmental organizations using an Environmental Management Information System (EMIS): Design and development

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-01-01

    With the rapid growth of environmental information, the adoption of Information and Communication Technologies (ICT) in environmental management has become a significant need. This paper presents a prototype Environmental Management Information System (EMIS) developed to provide a systematic way of managing the environmental data and human resources of an environmental organization. The system was designed using programming languages, a Database Management System (DBMS), and other technologies and programming tools, and it combines information from the relational database to achieve the principal goals of the environmental organization. The developed application can be used to store and elaborate information regarding human resources data, environmental projects, observations, reports, data about protected species, environmental measurements of pollutant factors or other kinds of analytical measurements, and the financial data of the organization. Furthermore, the system supports the visualization of spatial data structures by using geographic information systems (GIS) and web mapping technologies. This paper describes this prototype software application, its structure, its functions, and how it can be utilized to facilitate technology-based environmental management and the decision-making process.

  15. Vibroacoustic Payload Environment Prediction System (VAPEPS): VAPEPS management center remote access guide

    NASA Technical Reports Server (NTRS)

    Fernandez, J. P.; Mills, D.

    1991-01-01

    A Vibroacoustic Payload Environment Prediction System (VAPEPS) Management Center was established at JPL. The center uses the VAPEPS software package to manage a data base of Space Shuttle and expendable launch vehicle payload flight and ground test data. Remote terminal access over telephone lines to the computer system where the program resides was established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the VAPEPS Management Center and contains instructions for using the resources of the center.

  16. Cargo Data Management Demonstration System

    DOT National Transportation Integrated Search

    1974-02-01

    Delays in receipt and creation of cargo documents are a problem in international trade. The work described demonstrates some of the advantages and capabilities of a computer-based cargo data management system. A demonstration system for data manageme...

  17. Configuration and Data Management Process and the System Safety Professional

    NASA Technical Reports Server (NTRS)

    Shivers, Charles Herbert; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    This article presents a discussion of the configuration management (CM) and the Data Management (DM) functions and provides a perspective of the importance of configuration and data management processes to the success of system safety activities. The article addresses the basic requirements of configuration and data management generally based on NASA configuration and data management policies and practices, although the concepts are likely to represent processes of any public or private organization's well-designed configuration and data management program.

  18. Shuttle Program Information Management System (SPIMS) data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Shuttle Program Information Management System (SPIMS) is a computerized data base operations system. The central computer is the CDC 170-730 located at Johnson Space Center (JSC), Houston, Texas. There are several applications which have been developed and supported by SPIMS. A brief description is given.

  19. Application of data mining in science and technology management information system based on WebGIS

    NASA Astrophysics Data System (ADS)

    Wu, Xiaofang; Xu, Zhiyong; Bao, Shitai; Chen, Feixiang

    2009-10-01

    With the rapid development of science and technology and the quick growth of information, a great deal of data accumulates in science and technology management departments. Much knowledge and many rules are contained, and often concealed, in this data, so fully excavating and using that knowledge is very important for science and technology management: it helps to examine and approve science and technology projects more scientifically and makes it easier to transform achievements into real productive forces. This paper therefore researches data mining technology and applies it to a science and technology management information system to discover such knowledge. After analyzing the disadvantages of traditional science and technology management information systems, database technology, data mining, and web geographic information system (WebGIS) technology are introduced to develop and construct a science and technology management information system based on WebGIS. Key problems such as data mining and statistical analysis are researched in detail. Moreover, a prototype system is developed and validated using project data from the National Natural Science Foundation Committee. Spatial data mining is performed along the axes of time, space, and other factors, and a variety of knowledge and rules are excavated using data mining technology, providing effective support for decision-making.

  20. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The overall tasks performed by the software, data base management, and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.

  1. Program Manager: The Journal of the Defense Systems Management College. Volume 15, Number 4, July-August 1986.

    DTIC Science & Technology

    1986-08-01

    Architect, troi systems, CAD CAM, and com- functional analysis , synthesis, and National Bureau of Standards, mon engineering data bases will be the trade...Recurrent analysis of a management these s h e m evolutionary chain of data processing problem combining real data and ponents of defense support system...at the Defense first constructed his support simulator Systems Management College, the by assembling appropriate analysis Data Storage and Retrieval

  2. A web-based biosignal data management system for U-health data integration.

    PubMed

    Ro, Dongwoo; Yoo, Sooyoung; Choi, Jinwook

    2008-11-06

    In the ubiquitous healthcare environment, biosignal data should be easily accessible and properly maintained. This paper describes a web-based data management system consisting of a device interface, a data upload control, a central repository, and a web server. For user-specific web services, an MFER Upload ActiveX Control was developed.

  3. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  4. Neonatal Information System Using an Interactive Microcomputer Data Base Management Program

    PubMed Central

    Engelke, Stephen C.; Paulette, Ed W.; Kopelman, Arthur E.

    1981-01-01

    A low cost, interactive microcomputer data base management system is presented which is being used in a neonatal follow-up program at the East Carolina University School of Medicine. The features and flexibility of the system could be applied to a variety of medical care settings.

  5. Data base management system configuration specification. [computer storage devices

    NASA Technical Reports Server (NTRS)

    Neiers, J. W.

    1979-01-01

    The functional requirements and the configuration of the data base management system are described. Techniques and technology which will enable more efficient and timely transfer of useful data from the sensor to the user, extraction of information by the user, and exchange of information among the users are demonstrated.

  6. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  7. Creation of a Book Order Management System Using a Microcomputer and a DBMS.

    ERIC Educational Resources Information Center

    Neill, Charlotte; And Others

    1985-01-01

    Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…

  8. A Data Base Management System for Clinical and Epidemiologic Studies In Systemic Lupus Erythematosus: Design and Maintenance

    PubMed Central

    Kosmides, Victoria S.; Hochberg, Marc C.

    1984-01-01

    This report describes the development, design specifications, features and implementation of a data base management system (DBMS) for clinical and epidemiologic studies in SLE. The DBMS is multidimensional with arrays formulated across patients, studies and variables. The major impact of this DBMS has been to increase the efficiency of managing and analyzing vast amounts of clinical and laboratory data and, as a result, to allow for continued growth in research productivity in areas related to SLE.

  9. Personal diabetes management system based on ubiquitous computing technology.

    PubMed

    Park, Kyung-Soon; Kim, Nam-Jin; Hong, Joo-Hyun; Park, Mi-Sook; Cha, Eun-Jung; Lee, Tae-Soo

    2006-01-01

    Assisting diabetes patients in self-managing blood glucose testing and insulin injection is of great importance for their healthcare. This study presents a PDA-based system that manages personal glucose level data, interfaced with a small glucometer through a serial port. The data stored in the PDA can be transmitted by cradle or wireless communication to a remote web server, where further medical analysis and service are provided. This system enables more efficient and systematic management of diabetes patients through self-management and remote medical practice.

  10. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    PubMed

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it is already adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits efficiency in managing huge amounts of patients' clinical data, especially in emerging cloud-based environments, which are distributed. In this paper, we present an OAIS healthcare architecture for managing a huge number of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Data Base Management System deployed in the cloud, so as to benefit from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
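
    As a rough illustration of why a column-oriented layout suits this workload, the sketch below models a wide row per patient whose columns are (document type, timestamp) pairs, so one row scan returns a patient's whole document history. This is a plain-Python stand-in, not the authors' schema; all identifiers are hypothetical.

```python
from collections import defaultdict

# Wide-row layout: one row per patient, one column per (doc type, timestamp).
table = defaultdict(dict)

def put_document(patient_id, doc_type, timestamp, hl7_payload):
    table[patient_id][(doc_type, timestamp)] = hl7_payload

def get_documents(patient_id, doc_type):
    """Scan one patient's row: all documents of a type, oldest first."""
    row = table[patient_id]
    return [v for (t, ts), v in sorted(row.items()) if t == doc_type]

put_document("P001", "CDA", "2018-01-03", "<ClinicalDocument>...</ClinicalDocument>")
put_document("P001", "CDA", "2018-02-11", "<ClinicalDocument>...</ClinicalDocument>")
```

    In a real column-oriented store the same shape lets the row grow to millions of columns without schema changes, which is the scalability property the paper relies on.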

  11. Securely and Flexibly Sharing a Biomedical Data Management System

    PubMed Central

    Wang, Fusheng; Hussels, Phillip; Liu, Peiya

    2011-01-01

    Biomedical database systems need not only to address the issues of managing complex data, but also to provide data security and access control. These include not only system-level security, but also instance-level access control, such as access to documents, schemas, or aggregations of information. The latter is becoming more important as multiple users can share a single scientific data management system to conduct their research, while data have to be protected before they are published or IP-protected. This problem is challenging because users’ needs for data security vary dramatically from one application to another, in terms of whom to share with, what resources are to be shared, and at what access level. We develop a comprehensive data access framework for the biomedical data management system SciPort. SciPort provides fine-grained, multi-level, space-based access control of resources not only at the object level (documents and schemas), but also at the space level (resource sets aggregated in a hierarchical way). Furthermore, to simplify the management of users and privileges, a customizable role-based user model is developed. The access control is implemented efficiently by integrating access privileges into the backend XML database, so that efficient queries are supported. The secure access approach we take makes it possible for multiple users to share the same biomedical data management system with flexible access management and high data security. PMID:21625285
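
    The space-level inheritance idea can be sketched in a few lines: a privilege granted on a space applies to everything nested beneath it, while object-level grants stay local. This is our own toy reading of the model, not SciPort code; users, paths, and privilege names are hypothetical.

```python
# Grants keyed by (user, path); a path may denote a space or a single object.
grants = {
    ("alice", "/lab1"): "write",           # space-level grant
    ("bob", "/lab1/study3/doc42"): "read", # object-level grant
}

def can_access(user, path, needed):
    """Walk from the object up through its enclosing spaces, honoring
    the strongest applicable grant ("write" implies "read")."""
    order = {"read": 1, "write": 2}
    parts = path.strip("/").split("/")
    for i in range(len(parts), 0, -1):
        prefix = "/" + "/".join(parts[:i])
        level = grants.get((user, prefix))
        if level and order[level] >= order[needed]:
            return True
    return False
```

    The same walk-up-the-hierarchy check is what makes space-level sharing cheap: one grant covers an arbitrarily deep aggregation of resources.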

  12. The NEEDS Data Base Management and Archival Mass Memory System

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.; Bryant, S. B.; Thomas, D. T.; Wagnon, F. W.

    1980-01-01

    A Data Base Management System and an Archival Mass Memory System are being developed that will have a 10^12-bit on-line and a 10^13-bit off-line storage capacity. The integrated system will accept packetized data from the data staging area at 50 Mbps, create a comprehensive directory, provide for file management, record the data, perform error detection and correction, accept user requests, retrieve the requested data files, and provide the data to multiple users at a combined rate of 50 Mbps. Stored and replicated data files will have a bit error rate of less than 10^-9 even after ten years of storage. The integrated system will be demonstrated to prove the technology late in 1981.

  13. Learning Agents for Autonomous Space Asset Management (LAASAM)

    NASA Astrophysics Data System (ADS)

    Scally, L.; Bonato, M.; Crowder, J.

    2011-09-01

    Current and future space systems will continue to grow in complexity and capabilities, creating a formidable challenge to monitor, maintain, and utilize these systems and manage their growing network of space and related ground-based assets. Integrated System Health Management (ISHM), and in particular, Condition-Based System Health Management (CBHM), is the ability to manage and maintain a system using dynamic real-time data to prioritize, optimize, maintain, and allocate resources. CBHM entails the maintenance of systems and equipment based on an assessment of current and projected conditions (situational and health related conditions). A complete, modern CBHM system comprises a number of functional capabilities: sensing and data acquisition; signal processing; conditioning and health assessment; diagnostics and prognostics; and decision reasoning. In addition, an intelligent Human System Interface (HSI) is required to provide the user/analyst with relevant context-sensitive information, the system condition, and its effect on overall situational awareness of space (and related) assets. Colorado Engineering, Inc. (CEI) and Raytheon are investigating and designing an Intelligent Information Agent Architecture that will provide a complete range of CBHM and HSI functionality from data collection through recommendations for specific actions. The research leverages CEI’s expertise with provisioning management network architectures and Raytheon’s extensive experience with learning agents to define a system to autonomously manage a complex network of current and future space-based assets to optimize their utilization.

  14. Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-Based Systems

    ERIC Educational Resources Information Center

    An, Ho

    2012-01-01

    In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…

  15. Intelligent data management

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1985-01-01

    Intelligent data management is the concept of interfacing a user to a database management system with a value-added service that allows a full range of data management operations at a high level of abstraction using written human language. The development of such a system will be based on expert systems and related artificial intelligence technologies, and will allow the capture of procedural and relational knowledge about data management operations and the support of a user with such knowledge in an on-line, interactive manner. Such a system will have the following capabilities: (1) the ability to construct a model of the user's view of the database, based on the query syntax; (2) the ability to transform English queries and commands into database instructions and processes; (3) the ability to use heuristic knowledge to rapidly prune the data space in search processes; and (4) the ability to use an on-line explanation system to allow the user to understand what the system is doing and why. Additional information is given in outline form.

  16. ClinData Express – A Metadata Driven Clinical Research Data Management System for Secondary Use of Clinical Data

    PubMed Central

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    To ease the secondary use of clinical data in clinical research, we introduce a metadata-driven, web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, standalone software for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all researchers need to do is define the metadata and data model in the m-designer; the web interface for data collection and the specific database for data storage are then generated automatically. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease-data collections in Chinese and one form from dbGaP. The system's flexibility gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress. PMID:23304327
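
    The metadata-driven step can be sketched as follows: from a field list such as the one m-designer might emit, a storage table is generated automatically instead of being hand-written. The metadata below is hypothetical; only the general idea is taken from the abstract.

```python
import sqlite3

# Hypothetical form metadata, in the spirit of an m-designer export.
form_metadata = {
    "table": "lab_result",
    "fields": [
        {"name": "patient_id", "type": "TEXT"},
        {"name": "test_name",  "type": "TEXT"},
        {"name": "value",      "type": "REAL"},
    ],
}

def create_table_from_metadata(conn, meta):
    """Generate the storage table directly from the metadata definition."""
    cols = ", ".join(f'{f["name"]} {f["type"]}' for f in meta["fields"])
    conn.execute(f'CREATE TABLE {meta["table"]} ({cols})')

conn = sqlite3.connect(":memory:")
create_table_from_metadata(conn, form_metadata)
conn.execute("INSERT INTO lab_result VALUES (?, ?, ?)", ("P1", "HbA1c", 6.5))
```

    Changing the metadata, not the code, is what lets one system serve many disease-specific collections.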

  17. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiersma, R; Grelewicz, Z; Belcher, A

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed; (2) data is difficult to access, as one must physically hunt down records; (3) there are poor or no means of historical data analysis; and (4) there is no remote, off-site monitoring of machine performance. To address these issues, a cloud-based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/javascript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python-based resource management system was used to distribute and manage CPU-intensive tasks, such as QA phantom image analysis or LaTeX-to-PDF QA report generation, to independent process threads or different servers so that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information succeeded in proactively identifying a LINAC CBCT scanner's performance degradation. Conclusion: A fully comprehensive cloud-based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would otherwise have been missed by a paper- or spreadsheet-based QA system.
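
    Proactive identification of degradation, as in the CBCT example above, amounts to trend-watching over the tracked parameters. A minimal sketch, with thresholds and readings invented for illustration:

```python
def flag_drift(history, baseline_n=5, tolerance=0.02):
    """Flag a QA parameter whose recent mean drifts by more than
    `tolerance` (fractional) from the mean of the first `baseline_n`
    readings. Thresholds here are illustrative, not clinical limits."""
    baseline = sum(history[:baseline_n]) / baseline_n
    recent = sum(history[-baseline_n:]) / baseline_n
    return abs(recent - baseline) / abs(baseline) > tolerance

# Monthly CBCT uniformity readings (made-up numbers): slow degradation.
readings = [100.0, 100.2, 99.8, 100.1, 99.9, 99.0, 97.5, 96.0, 94.8, 93.9]
```

    Running such a check automatically over all ~200 tracked parameters is exactly what a paper or spreadsheet workflow cannot do.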

  18. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  19. NASA Administrative Data Base Management Systems, 1984

    NASA Technical Reports Server (NTRS)

    Radosevich, J. D. (Editor)

    1984-01-01

    Strategies for converting to a data base management system (DBMS) and the implementation of the software packages necessary are discussed. Experiences with DBMS at various NASA centers are related, including Langley's ADABAS/NATURAL and the NEMS subsystem of the NASA metrology information system. The value of the integrated workstation with a personal computer is explored.

  20. A case-based reasoning tool for breast cancer knowledge management with data mining concepts and techniques

    NASA Astrophysics Data System (ADS)

    Demigha, Souâd.

    2016-03-01

    The paper presents a case-based reasoning tool for breast cancer knowledge management intended to improve breast cancer screening. To develop this tool, we combine concepts and techniques from both Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) with clinical cases. Case-based reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, and it is well suited to medical use. On the other hand, existing traditional hospital information systems (HIS), radiological information systems (RIS), and picture archiving and communication systems (PACS) do not allow efficient management of medical information because of its complexity and heterogeneity. Data mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with data mining techniques will facilitate the diagnosis and decision-making of medical experts.

  1. Residual acceleration data on IML-1: Development of a data reduction and dissemination plan

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Wolf, Randy

    1992-01-01

    The main thrust of our work in the third year of contract NAG8-759 was the development and analysis of various data processing techniques that may be applicable to residual acceleration data. Our goal is the development of a data processing guide that low gravity principal investigators can use to assess their need for accelerometer data and then formulate an acceleration data analysis strategy. The work focused on the flight of the first International Microgravity Laboratory (IML-1) mission. We are also developing a data base management system to handle large quantities of residual acceleration data. This type of system should be an integral tool in the detailed analysis of accelerometer data. The system will manage a large graphics data base in the support of supervised and unsupervised pattern recognition. The goal of the pattern recognition phase is to identify specific classes of accelerations so that these classes can be easily recognized in any data base. The data base management system is being tested on the Spacelab 3 (SL3) residual acceleration data.

  2. Management and display of four-dimensional environmental data sets using McIDAS

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Santek, David; Suomi, Verner E.

    1990-01-01

    Over the past four years, great strides have been made in the areas of data management and display of 4-D meteorological data sets. A survey was conducted of available and planned 4-D meteorological data sources. The data types were evaluated for their impact on the data management and display system. The requirements were analyzed for data base management generated by the 4-D data display system. The suitability of the existing data base management procedures and file structure were evaluated in light of the new requirements. Where needed, new data base management tools and file procedures were designed and implemented. The quality of the basic 4-D data sets was assured. The interpolation and extrapolation techniques of the 4-D data were investigated. The 4-D data from various sources were combined to make a uniform and consistent data set for display purposes. Data display software was designed to create abstract line graphic 3-D displays. Realistic shaded 3-D displays were created. Animation routines for these displays were developed in order to produce a dynamic 4-D presentation. A prototype dynamic color stereo workstation was implemented. A computer functional design specification was produced based on interactive studies and user feedback.
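
    The interpolation work mentioned above can be illustrated with the simplest case: linear blending of two gridded analyses at bracketing times to estimate the field at an intermediate time. This is a generic sketch, not the McIDAS implementation; grids are plain nested lists.

```python
def interpolate_in_time(grid_t0, grid_t1, t0, t1, t):
    """Linearly interpolate between two analysis grids (same shape)
    valid at times t0 and t1 to estimate the field at time t."""
    w = (t - t0) / (t1 - t0)
    def blend(a, b):
        if isinstance(a, list):
            return [blend(x, y) for x, y in zip(a, b)]
        return (1 - w) * a + w * b
    return blend(grid_t0, grid_t1)
```

    The same weighting generalizes to the 4-D case by first interpolating each source grid onto a common spatial mesh and then blending in time.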

  3. Using a data base management system for modelling SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings of the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS including a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-specific evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.
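
    A relational model of test-history data of the kind evaluated here can be sketched with an in-memory SQLite database. The schema and the sample query (firings of one engine that logged an anomaly) are hypothetical, chosen only to show the relationally complete style of query the study calls for.

```python
import sqlite3

# Toy relational model of static-firing test history (schema hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE test_firing (test_id TEXT PRIMARY KEY, engine TEXT, duration_s REAL);
CREATE TABLE anomaly (test_id TEXT, subsystem TEXT, note TEXT);
""")
conn.executemany("INSERT INTO test_firing VALUES (?, ?, ?)",
                 [("901-001", "E2001", 520.0), ("901-002", "E2001", 300.0)])
conn.execute("INSERT INTO anomaly VALUES (?, ?, ?)",
             ("901-002", "turbopump", "vibration above limit"))

# Representative query: all firings of an engine that logged an anomaly.
rows = conn.execute("""
SELECT t.test_id, t.duration_s, a.subsystem
FROM test_firing t JOIN anomaly a ON a.test_id = t.test_id
WHERE t.engine = 'E2001'
""").fetchall()
```

    Queries like this join are what a hierarchical or network DBMS can only answer along predefined access paths, which is the heart of the comparison the study makes.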

  4. [Medication error management climate and perception for system use according to construction of medication error prevention system].

    PubMed

    Kim, Myoung Soo

    2012-08-01

    The purpose of this cross-sectional study was to examine the current status of IT-based medication error prevention system construction and the relationships among system construction, medication error management climate, and perception of system use. The participants were 124 patient safety chief managers working at 124 hospitals with over 300 beds in Korea. The characteristics of the participants, the construction status and perception of the systems (electronic pharmacopoeia, electronic drug dosage calculation system, computer-based patient safety reporting, and bar-code system), and the medication error management climate were measured. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation, and MANCOVA were used for data analysis. Electronic pharmacopoeias had been constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems in 50.8%, and electronic drug dosage calculation systems in 32.3%. Bar-code systems showed the lowest construction rate, at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems were associated with a more positive error management climate. Supportive strategies for improving the perception of IT-based system use would encourage further system construction and more readily promote a positive error management climate.
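
    For reference, the Pearson correlation underlying the partial-correlation analysis reduces to the familiar formula below; the study's partial correlation additionally controls for covariates, a step this sketch omits.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length
    samples (the partial correlation used in the study further removes
    the influence of control variables)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    Values near +1 indicate that hospitals with more systems constructed also report a more positive climate, which is the direction of association the study found.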

  5. Library Statistical Data Base Formats and Definitions.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    Represented are the detailed set of data structures relevant to the categorization of information, terminology, and definitions employed in the design of the library statistical data base. The data base, or management information system, provides administrators with a framework of information and standardized data for library management, planning,…

  6. NASA Cloud-Based Climate Data Services

    NASA Astrophysics Data System (ADS)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archive Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in User Space), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figures: (A) vCDS/ESG system stack. (B) Conceptual architecture for NASA cloud-based data services.]
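
    The policy-based control over collection-building described above can be illustrated conceptually; the sketch below is a generic stand-in for the idea of rules gating ingest into a managed collection, not the iRODS rule engine or its API, and all names are invented.

    ```python
    # Generic illustration of policy-enforced collection-building: policies
    # registered on a collection run at ingest time and may reject objects,
    # analogous in spirit to iRODS rules/microservices (not the actual API).
    class Collection:
        def __init__(self, name):
            self.name, self.objects, self.policies = name, {}, []

        def add_policy(self, fn):
            self.policies.append(fn)

        def ingest(self, obj_id, metadata):
            for policy in self.policies:
                policy(obj_id, metadata)  # a policy raises to reject the object
            self.objects[obj_id] = metadata

    ipcc = Collection("ipcc_subset")  # hypothetical collection name

    def require_checksum(obj_id, md):
        if "checksum" not in md:
            raise ValueError(f"{obj_id}: checksum required by collection policy")

    ipcc.add_policy(require_checksum)
    ipcc.ingest("tas_Amon_r1i1p1.nc", {"checksum": "9f2c..."})
    ```

    The design point is that preservation rules live with the collection, not with each client, so every ingest path is subject to the same checks.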

  7. Data base management system and display software for the National Geophysical Data Center geomagnetic CD-ROM's

    NASA Technical Reports Server (NTRS)

    Papitashvili, N. E.; Papitashvili, V. O.; Allen, J. H.; Morris, L. D.

    1995-01-01

    The National Geophysical Data Center has the largest collection of geomagnetic data from the worldwide network of magnetic observatories. The data base management system and retrieval/display software have been developed for the archived geomagnetic data (annual means, monthly, daily, hourly, and 1-minute values) and placed on the center's CD-ROM's to provide users with 'user-oriented' and 'user-friendly' support. This system is described in this paper with a brief outline of provided options.

  8. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts. Final Report, Year 2.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    This document examines the design and structure of PMIS (Planning and Management Information System), an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time,…

  9. Evolution of Information Management at the GSFC Earth Sciences (GES) Data and Information Services Center (DISC): 2006-2007

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Lynnes, Christopher; Vollmer, Bruce; Alcott, Gary; Berrick, Stephen

    2009-01-01

    Increasingly sophisticated National Aeronautics and Space Administration (NASA) Earth science missions have driven their associated data and data management systems from providing simple point-to-point archiving and retrieval to performing user-responsive distributed multisensor information extraction. To fully maximize the use of remote-sensor-generated Earth science data, NASA recognized the need for data systems that provide data access and manipulation capabilities responsive to research brought forth by advancing scientific analysis and the need to maximize the use and usability of the data. The decision by NASA to purposely evolve the Earth Observing System Data and Information System (EOSDIS) at the Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC) and other information management facilities was timely and appropriate. The GES DISC evolution was focused on replacing the EOSDIS Core System (ECS) by reusing the in-house developed, disk-based Simple, Scalable, Script-based Science Product Archive (S4PA) data management system and migrating data to the disk archives. Transition was completed in December 2007.

  10. Evidence-based human resource management: a study of nurse leaders' resource allocation.

    PubMed

    Fagerström, Lisbeth

    2009-05-01

    The aim was to illustrate how the RAFAELA system can be used to facilitate evidence-based human resource management. The theoretical framework of the RAFAELA system is based on a holistic view of humankind and a view of leadership founded on human resource management. Nine wards from three central hospitals in Finland participated in the study. The data, stemming from 2006-2007, were taken from the critical indicators (ward-related and nursing intensity information) for national benchmarking used in the RAFAELA system. The data were analysed descriptively. The daily nursing resources per classified patient ratio is a more specific method of measurement than the nurse-to-patient ratio. For four wards, the nursing intensity per nurse surpassed the optimal level on 34% to 62.2% of days. Resource allocation was clearly improved in that a better balance between patients' care needs and available nursing resources was maintained. The RAFAELA system provides a rational, systematic and objective foundation for evidence-based human resource management. Data from systematic use of the RAFAELA system offer objective facts and motives for evidence-based decision making in human resource management, and will therefore enhance nurse leaders' evidence- and science-based way of working.
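
    The "nursing intensity per nurse" measure and the share of days above an optimal level can be sketched as a small computation; the daily figures and the optimal level below are invented for illustration, not RAFAELA's actual classification weights.

    ```python
    # Illustrative computation of nursing intensity per nurse and the share
    # of days the ward exceeds its optimal level; all numbers are invented.
    def intensity_per_nurse(intensity_points, nurses_on_duty):
        return intensity_points / nurses_on_duty

    days = [(92.0, 5), (120.0, 5), (88.0, 4), (150.0, 5)]  # (points, nurses)
    OPTIMAL = 22.0  # hypothetical optimal intensity per nurse for this ward

    over = [intensity_per_nurse(p, n) > OPTIMAL for p, n in days]
    share_over = 100 * sum(over) / len(days)
    print(f"{share_over:.1f}% of days above optimal level")  # → 50.0%
    ```

    Because the numerator is classified patient-care intensity rather than a raw headcount, the ratio reflects workload per nurse more specifically than a nurse-to-patient ratio does.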

  11. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  12. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of current Earth-observing technology, RS image data storage, management, and information publication have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing load of the large volumes of RS data stored at different nodes in a distributed environment, which places a heavy burden on the background server. Second, there is no unique, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, this paper puts forward a framework for a parallel and distributed RS image data management and storage system, aimed at an RS data information system built on a parallel background server and a distributed data management system. Toward these two goals, the paper studies the following key techniques and draws some instructive conclusions. It puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For storage, RS data is not divided into binary large objects stored in a conventional relational database system; instead, it is reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. For system architecture, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common web process and the parallel process.
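
    The "Pyramid, Block, Layer, Epoch" solid index can be sketched as a composite tile key; the field meanings follow the abstract (resolution level, spatial tile, spectral band, acquisition period), but the concrete key layout and values below are assumptions.

    ```python
    # Sketch of the "Pyramid, Block, Layer, Epoch" solid index: each image
    # tile is addressed by resolution level, spatial block, band, and epoch.
    # The layout and sample values are illustrative assumptions.
    from collections import namedtuple

    TileKey = namedtuple("TileKey", "pyramid block layer epoch")

    def make_key(level, row, col, band, period):
        """Pyramid = resolution level, Block = spatial tile (row, col),
        Layer = spectral band, Epoch = acquisition period."""
        return TileKey(level, (row, col), band, period)

    index = {}  # key -> location of the tile in the logical image database
    index[make_key(3, 10, 7, "B4", "2008-07")] = "tile_000123.bin"
    index[make_key(3, 10, 7, "B3", "2008-07")] = "tile_000124.bin"

    # Retrieve every band stored for one block at one epoch:
    hits = {k: v for k, v in index.items()
            if k.pyramid == 3 and k.block == (10, 7) and k.epoch == "2008-07"}
    ```

    Keys of this form let tiles from different sensors, resolutions, and periods live in one uniform organization instead of being opaque binary large objects.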

  13. CIMS: The Cartographic Information Management System,

    DTIC Science & Technology

    1981-01-01

    information, composites of overlays to demonstrate the decision-making possibilities and slides of the cadastral sheet. System Use After data base ...create a national soils data base that can be used in managing the soil (Johnson, 1979). Small-scale information systems can be used in planning the...maps/charts over the base map, etc.). An example of the manual phase to be found in the literature is the Overlay Information System used in Prince

  14. Research in Functionally Distributed Computer Systems Development. Volume XII. Design Considerations in Distributed Data Base Management Systems.

    DTIC Science & Technology

    1977-04-01

    task of data organization, management, and storage has been given to a select group of specialists. These specialists (the Data Base Administrators...report writers, etc.)...distributed DBMS involves first identifying a set of two or more tasks blocking each other over a collection of shared records. Once the set of

  15. Contracts and management services site support program plan WBS 6.10.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knoll, J.M. Jr.

    1994-09-01

    Contracts and Management Services is recognized as the central focal point for programs having company or sitewide application in pursuit of the Hanford Mission's financial and operational objectives. Contracts and Management Services actively pursues cost savings and operational efficiencies through: Management Standards by ensuring all employees have an accessible, integrated system of clear, complete, accurate, timely, and useful management control policies and procedures; Contract Reform by restructuring the contract, organization, and cost accounting systems to refocus Hanford contract activities on output products; Systems and Operations Evaluation by directing the Cost Reduction program, Great Ideas, and Span of Management activities; Program Administration by enforcing conditions of Accountability (whether DEAR-based or FAR-based) for WHC, BCSR, ICF KH, and BHI; Contract Performance activities; chairing the WHC Cost Reduction Review Board; and analyzing companywide Performance Measures; Data Standards and Administration by establishing and directing the company data management program; giving direction to the major RL programs and mission areas for implementation of cost-effective and efficient data management practices; directing all operations, application, and interfaces contained within the Hanford PeopleCore System; directing accomplishment and delivery of TPA data management milestones; and directing the sitewide data management processes for Data Standards and the Data Directory.

  16. Data management in pattern recognition and image processing systems

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1976-01-01

    Data management considerations are important to any system which handles large volumes of data or where the manipulation of data is technically sophisticated. A particular problem is the introduction of image-formatted files into the mainstream of data processing applications. This report describes a comprehensive system for the manipulation of image, tabular, and graphical data sets which involves conversions between the various data types. A key characteristic is the use of image processing technology to accomplish data management tasks. Because of this, the term 'image-based information system' has been adopted.

  17. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and higher data usability. DAS now automatically calculates molecular number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. standard atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.
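
    The molecular number density calculation mentioned above follows from the ideal gas law, n = P / (k_B T), applied to each radiosonde pressure/temperature level; the sample sounding values below are illustrative, not actual GPS radiosonde data.

    ```python
    # Number density profile from pressure/temperature levels via the ideal
    # gas law, n = P / (k_B * T). Sounding values below are illustrative.
    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def number_density(pressure_pa, temperature_k):
        """Molecules per cubic metre at the given pressure and temperature."""
        return pressure_pa / (K_B * temperature_k)

    # (pressure [Pa], temperature [K]) at three sample levels:
    sounding = [(101325.0, 288.15), (54020.0, 255.65), (26436.0, 223.15)]
    profile = [number_density(p, t) for p, t in sounding]
    print(f"surface: {profile[0]:.3e} molecules/m^3")
    ```

    At standard surface conditions this gives roughly 2.5e25 molecules per cubic metre, decreasing with altitude as pressure falls.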

  18. Research on spatio-temporal database techniques for spatial information service

    NASA Astrophysics Data System (ADS)

    Zhao, Rong; Wang, Liang; Li, Yuxiang; Fan, Rongshuang; Liu, Ping; Li, Qingyuan

    2007-06-01

    Geographic data should be described by spatial, temporal, and attribute components, but spatio-temporal queries are difficult to answer within current GIS. This paper describes research into the development and application of a spatio-temporal data management system based upon the GeoWindows GIS software platform developed by the Chinese Academy of Surveying and Mapping (CASM). Facing the current practical requirements of spatial information applications, and building on the existing GIS platform, a spatio-temporal data model which integrates vector and grid data was established first. Second, we solved the key technique of building temporal data topology and successfully developed a spatio-temporal database management system using object-oriented methods. The system provides temporal data collection, data storage, data management, and data display and query functions. Finally, as a case study, we explored the application of the spatio-temporal data management system with the administrative region data of multiple historical periods of China as the basic data. With all the efforts above, the GIS capacity for managing and manipulating the temporal and attribute aspects of geographic data has been enhanced, and a technical reference has been provided for the further development of temporal geographic information systems (TGIS).
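
    A valid-time query of the kind the system supports (e.g., over administrative region data of multiple historical periods) can be sketched as follows; the record schema, region IDs, and dates are hypothetical.

    ```python
    # Sketch of a temporal query: each record carries a valid-time interval,
    # and a query returns the version of a feature in force at a given date.
    # Schema and sample values are invented for illustration.
    from datetime import date

    records = [
        {"region": "R1", "name": "Prefecture A",
         "valid_from": date(1950, 1, 1), "valid_to": date(1983, 6, 30)},
        {"region": "R1", "name": "Prefecture A (merged)",
         "valid_from": date(1983, 7, 1), "valid_to": date.max},
    ]

    def as_of(recs, region, when):
        """Return the attribute version of `region` valid on date `when`."""
        for r in recs:
            if r["region"] == region and r["valid_from"] <= when <= r["valid_to"]:
                return r["name"]
        return None

    print(as_of(records, "R1", date(1975, 5, 1)))  # → Prefecture A
    ```

    The temporal topology the paper mentions amounts to keeping these intervals consistent (non-overlapping, gap-free) so that every "as of" query has exactly one answer.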

  19. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.

  20. A data management system for engineering and scientific computing

    NASA Technical Reports Server (NTRS)

    Elliot, L.; Kunii, H. S.; Browne, J. C.

    1978-01-01

    Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.
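
    The system-defined array classes described above can be sketched as follows; the storage layouts (a coordinate map for sparse arrays, a row-packed vector for triangular arrays) are illustrative assumptions, and the original system exposed such types through FORTRAN extensions rather than Python.

    ```python
    # Illustrative storage schemes for the special array classes mentioned
    # above; layouts are assumptions, not the system's actual formats.
    class SparseArray:
        """Sparse array as a coordinate -> value map; absent cells read as 0.0."""
        def __init__(self, shape):
            self.shape, self.data = shape, {}
        def __setitem__(self, ij, value):
            if value != 0.0:
                self.data[ij] = value
        def __getitem__(self, ij):
            return self.data.get(ij, 0.0)

    class LowerTriangular:
        """n x n lower-triangular array packed row-wise into n*(n+1)//2 slots."""
        def __init__(self, n):
            self.n, self.data = n, [0.0] * (n * (n + 1) // 2)
        def _idx(self, i, j):
            return i * (i + 1) // 2 + j
        def __setitem__(self, ij, value):
            i, j = ij
            self.data[self._idx(i, j)] = value
        def __getitem__(self, ij):
            i, j = ij
            return 0.0 if j > i else self.data[self._idx(i, j)]

    stiffness = SparseArray((10000, 10000))  # hypothetical engineering matrix
    stiffness[3, 7] = 2.5
    lower = LowerTriangular(4)
    lower[2, 1] = 9.0
    ```

    Storing only the nonzero or lower-triangle entries is what makes such arrays practical as first-class data elements in an engineering data base.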

  1. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming at the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed which takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, and effective control and management of quality information for the complex product assembly process is realized.

  2. Results of phase one of land use information Delphi study

    NASA Technical Reports Server (NTRS)

    Paul, C. K.; Landini, A. J.

    1975-01-01

    The Land Use Management Information System (LUMIS) is being developed for the city portion of the Santa Monica mountains. LUMIS incorporates data developed from maps and aerial photos as well as traditional land-based data associated with routine city and county record keeping activities and traditional census data. To achieve the merging of natural resource data with governmental data, LUMIS is being designed in accordance with restrictions associated with two other land use information systems currently being constructed by Los Angeles city staff. The two city systems are LUPAMS (Land Use Planning and Management System), which is based on data recorded by the County Assessor's office for each individual parcel of land in the city, and Geo-BEDS, a geographically based environmental data system.

  3. An integrated GIS/remote sensing data base in North Cache soil conservation district, Utah: A pilot project for the Utah Department of Agriculture's RIMS (Resource Inventory and Monitoring System)

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.; Merola, J. A.

    1984-01-01

    A basic geographic information system (GIS) for the North Cache Soil Conservation District (SCD) was sought for selected resource problems. Since the resource management issues in the North Cache SCD are very complex, it is not feasible in the initial phase to generate all the physical, socioeconomic, and political baseline data needed for resolving all management issues. A selection of critical variables becomes essential. Thus, there are four specific objectives: (1) assess resource management needs and determine which resource factors are most fundamental for building a beginning data base; (2) evaluate the variety of data gathering and analysis techniques for the resource factors selected; (3) incorporate the resulting data into a useful and efficient digital data base; and (4) demonstrate the application of the data base to selected real-world resource management issues.

  4. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and facilitate data-sharing. Software that enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  5. A framework for a diabetes mellitus disease management system in southern Israel.

    PubMed

    Fox, Matthew A; Harman-Boehm, Ilana; Weitzman, Shimon; Zelingher, Julian

    2002-01-01

    Chronic diseases are a significant burden on western healthcare systems and national economies. It has been suggested that automated disease management for chronic disease, like diabetes mellitus (DM), improves the quality of care and reduces inappropriate utilization of diagnostic and therapeutic measures. We have designed a comprehensive DM Disease Management system for the Negev region in southern Israel. This system takes advantage of currently used clinical and administrative information systems. Algorithms for DM disease management have been created based on existing and accepted Israeli guidelines. All data fields and tables in the source information systems have been analyzed, and interfaces for periodic data loads from these systems have been specified. Based on this data, four subsets of decision support algorithms have been developed. The system generates alerts in these domains to multiple end users. We plan to use the products of this information system analysis and disease management specification in the actual development process of such a system shortly.

  6. An image based information system - Architecture for correlating satellite and topological data bases

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1978-01-01

    The paper describes the development of an image based information system and its use to process a Landsat thematic map showing land use or land cover in conjunction with a census tract polygon file to produce a tabulation of land use acreages per census tract. The system permits the efficient cross-tabulation of two or more geo-coded data sets, thereby setting the stage for the practical implementation of models of diffusion processes or cellular transformation. Characteristics of geographic information systems are considered, and functional requirements, such as data management, geocoding, image data management, and data analysis are discussed. The system is described, and the potentialities of its use are examined.
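
    The cross-tabulation of two geo-coded data sets described above can be sketched as follows; the two registered grids and the acres-per-cell factor are invented for illustration.

    ```python
    # Cross-tabulating two registered geo-coded layers: for each cell, one
    # grid holds a land-cover class and the other a census-tract ID; counting
    # co-occurrences yields per-tract acreage. All values are invented.
    from collections import Counter

    landcover = ["urban", "urban", "crop", "crop",
                 "crop",  "urban", "crop", "water"]
    tract     = ["T1",    "T1",    "T1",   "T2",
                 "T2",    "T2",    "T2",   "T2"]
    ACRES_PER_CELL = 1.1  # hypothetical ground area of one cell

    tab = Counter(zip(tract, landcover))          # (tract, class) -> cell count
    acreage = {key: n * ACRES_PER_CELL for key, n in tab.items()}
    print(acreage[("T1", "urban")])  # → 2.2
    ```

    Because both layers share the same cell grid, the tabulation is a single pass over the cells, which is what makes the image-based approach efficient for large data sets.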

  7. Optical mass memory system (AMM-13). AMM/DBMS interface control document

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.

    1980-01-01

    The baseline for external interfaces of a 10 to the 13th power bit, optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; AMM-13, Data Base Management System, NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interface.

  8. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    NASA Astrophysics Data System (ADS)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.
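
    The kind of inventory pass a tool like AIMS performs over a file-based archive can be sketched as a walk that gathers per-file size and checksum metrics; AIMS's actual metrics, scheduling, and storage back end are not specified here, and the demonstration archive below is created on the fly.

    ```python
    # Sketch of a file-archive inventory pass: walk the tree and record
    # (path, size, md5) per file. A generic illustration, not AIMS itself.
    import hashlib
    import os
    import tempfile

    def inventory(root):
        """Walk an archive tree, returning (path, size, md5) for every file."""
        stats = []
        for dirpath, _, files in os.walk(root):
            for name in sorted(files):
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digest = hashlib.md5(f.read()).hexdigest()
                stats.append((path, os.path.getsize(path), digest))
        return stats

    # Tiny demonstration archive with one data granule:
    root = tempfile.mkdtemp()
    with open(os.path.join(root, "granule_001.bin"), "wb") as f:
        f.write(b"hello")
    stats = inventory(root)
    print(stats[0][1:])  # size and md5 of the one file
    ```

    Comparing successive inventories is one simple way to detect corruption or unexpected change in a large archive on a continuous basis.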

  9. A CERIF-Compatible Research Management System Based on the MARC 21 Format

    ERIC Educational Resources Information Center

    Ivanovic, Dragan; Milosavljevic, Gordana; Milosavljevic, Branko; Surla, Dusan

    2010-01-01

    Purpose: Entering data about published research results should be implemented as a web application that enables authors to input their own data without the knowledge of the bibliographic standard. The aim of this research is to develop a research management system based on a bibliographic standard and to provide data exchange with other research…

  10. San Juan National Forest Land Management Planning Support System (LMPSS) requirements definition

    NASA Technical Reports Server (NTRS)

    Werth, L. F. (Principal Investigator)

    1981-01-01

    The role of remote sensing data as it relates to a three-component land management planning system (geographic information, data base management, and planning model) can be understood only when user requirements are known. Personnel at the San Juan National Forest in southwestern Colorado were interviewed to determine data needs for managing and monitoring timber, rangelands, wildlife, fisheries, soils, water, geology and recreation facilities. While all the information required for land management planning cannot be obtained using remote sensing techniques, valuable information can be provided for the geographic information system. A wide range of sensors such as small and large format cameras, synthetic aperture radar, and LANDSAT data should be utilized. Because of the detail and accuracy required, high altitude color infrared photography should serve as the baseline data base and be supplemented and updated with data from the other sensors.

  11. Oak Ridge Environmental Information System (OREIS) functional system design document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birchfield, T.E.; Brown, M.O.; Coleman, P.R.

    1994-03-01

    The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1-System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the data base structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.

  12. Polio Eradication Initiative contribution in strengthening immunization and integrated disease surveillance data management in WHO African region, 2014.

    PubMed

    Poy, Alain; Minkoulou, Etienne; Shaba, Keith; Yahaya, Ali; Gaturuku, Peter; Dadja, Landoh; Okeibunor, Joseph; Mihigo, Richard; Mkanda, Pascal

    2016-10-10

    The PEI programme in the WHO African Region has, since the revamp of the Polio Eradication Initiative in 1997, invested in the recruitment of qualified data management staff and in the development of data management systems and standard operating procedures to cater for data management support needs in the Region. This support went beyond polio and was expanded to routine immunization and integrated surveillance of priority diseases. But the impact of the polio data management support on other programmes such as routine immunization and disease surveillance has not yet been fully documented, and this is what this article seeks to demonstrate. We reviewed how the polio data management area of work evolved progressively along with the expansion of the data management team's capacity and the evolution of the data management systems, from the initiation of AFP case-based surveillance to routine immunization, other case-based disease surveillance, and supplementary immunization activities (SIAs). IDSR has improved data availability with support from IST polio-funded data managers who collected the data from countries. The data management system developed by the polio team was used by countries to record information related not only to polio SIAs but also to other interventions. From the time routine immunization data became part of the polio data management team's responsibility, the number of reports received went from around 4,000 in the first year (2005) to more than 30,000 in the second year and more than 47,000 in 2014. Polio data management has helped to improve overall VPD, IDSR, and routine data management, as well as emergency response, in the Region. As we approach the polio endgame, the African Region would benefit from using the already established infrastructure for other public health initiatives in the Region. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. A web-based clinical trial management system for a sham-controlled multicenter clinical trial in depression.

    PubMed

Durkalski, Valerie; Zhao, Wenle; Dillon, Catherine; Kim, Jaemyung

    2010-04-01

Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members, including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need; what distinguishes one system from another are user needs/requirements and cost. This article illustrates the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation versus sham for the treatment of patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe their approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient for addressing project-specific issues than creating a generic code application; as a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice between a commercially available system and an internally developed system is determined by the requirements of the study and its users; pros and cons of both approaches are discussed. If the intention is to use the system for various trials (single- and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.

  14. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

A prototype computer-based Natural Resource Information System was designed to store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, uses remote sensing as a data source, and is applicable at multiple management levels. A survey established current decision-making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and the processing requirements were established. Processing software was constructed, and a data base was established using high-altitude imagery and map coverage of selected areas of southeastern Arizona. Finally, a demonstration of system processing functions was conducted using material from the data base.

  15. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can greatly enhance the capabilities of a GIS, particularly in handling the very large, diverse data bases involved in the earth sciences. A KBGIS incorporating AI techniques such as learning, expert systems, and new data representations has been developed by the U.S. Geological Survey. The system, which will be developed further and applied, is a prototype of the next generation of GISs, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.
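The abstract's core notion of a data base "in which most of the data are spatially indexed" can be sketched with a minimal uniform grid index; this is illustrative only, and far simpler than the AI-based representations KBGIS itself used:

```python
# Minimal sketch of a spatially indexed store using a uniform grid index.
# Points are bucketed into square cells so that a rectangular region query
# only inspects the cells that overlap the query window.
from collections import defaultdict

class GridIndex:
    """Bucket point features into square grid cells for fast region queries."""
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)  # (cx, cy) -> [(x, y, attributes)]

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, x, y, attributes):
        self.cells[self._cell(x, y)].append((x, y, attributes))

    def query(self, xmin, ymin, xmax, ymax):
        """Return attributes of features inside the query rectangle."""
        cx0, cy0 = self._cell(xmin, ymin)
        cx1, cy1 = self._cell(xmax, ymax)
        hits = []
        for cx in range(cx0, cx1 + 1):
            for cy in range(cy0, cy1 + 1):
                for x, y, attrs in self.cells[(cx, cy)]:
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(attrs)
        return hits

index = GridIndex(cell_size=10.0)
index.insert(3.0, 4.0, {"name": "well A"})
index.insert(55.0, 60.0, {"name": "well B"})
print(index.query(0, 0, 10, 10))  # only "well A" falls in this window
```

The "sets of procedures operating to answer queries" would sit on top of such an index; a production GIS would use quadtrees or R-trees rather than a fixed grid.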

  16. [Study of sharing platform of web-based enhanced extracorporeal counterpulsation hemodynamic waveform data].

    PubMed

    Huang, Mingbo; Hu, Ding; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian

    2011-12-01

Enhanced extracorporeal counterpulsation (EECP) information consists of both text and hemodynamic waveform data. At present, EECP text information is successfully managed through the Web browser, while the management and sharing of hemodynamic waveform data over the Internet remains unsolved. In order to manage EECP information completely, and based on an in-depth analysis of the EECP hemodynamic waveform file in digital imaging and communications in medicine (DICOM) format and its disadvantages for Internet sharing, we proposed using the extensible markup language (XML), currently the popular data exchange standard on the Internet, as the storage specification for sharing EECP waveform data. We then designed a web-based sharing system for EECP hemodynamic waveform data on the ASP.NET 2.0 platform. We specifically introduce the four main system function modules and their implementation methods: the DICOM-to-XML conversion module, the EECP waveform data management module, the EECP waveform retrieval and display module, and the security mechanism of the system.
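The DICOM-to-XML conversion module described above can be approximated as follows; the element names (`eecp_record`, `waveform`, `sample`) are illustrative assumptions, not the schema the authors used:

```python
# Hedged sketch: wrap raw hemodynamic samples in an XML document so they
# can be shared over HTTP. A real converter would first parse the DICOM
# waveform file; here the samples are passed in directly.
import xml.etree.ElementTree as ET

def waveform_to_xml(patient_id, sampling_hz, samples):
    """Serialize a hemodynamic waveform to an XML string."""
    root = ET.Element("eecp_record", patient_id=patient_id)
    wf = ET.SubElement(root, "waveform", sampling_hz=str(sampling_hz))
    for i, value in enumerate(samples):
        ET.SubElement(wf, "sample", index=str(i)).text = f"{value:.2f}"
    return ET.tostring(root, encoding="unicode")

xml_text = waveform_to_xml("P001", 250, [80.0, 81.5, 83.2])
print(xml_text)
```

Text-based XML of this kind is much larger than binary DICOM, which is the usual trade-off when choosing it for cross-platform web sharing.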

  17. e-Leadership of School Principals: Increasing School Effectiveness by a School Data Management System

    ERIC Educational Resources Information Center

    Blau, Ina; Presser, Ofer

    2013-01-01

    In recent years, school management systems have become an important tool for effective e-leadership and data-based decision making. School management systems emphasize information flow and e-communication between teachers, students and parents. This study examines e-leadership by secondary-school principals through the Mashov school management…

  18. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

Some characteristics found to differ between scientific and administrative data bases are identified, and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are those that are especially stringent for either scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements of the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  19. Generalized Data Management Systems--Some Perspectives.

    ERIC Educational Resources Information Center

    Minker, Jack

    A Generalized Data Management System (GDMS) is a software environment provided as a tool for analysts, administrators, and programmers who are responsible for the maintenance, query and analysis of a data base to permit the manipulation of newly defined files and data with the existing programs and system. Because the GDMS technology is believed…

  20. Office automation: The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1985-01-01

In parallel to the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements, and most large DBMS development organizations carry three- to five-year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system, manned by a team of facilitators seeking opportunities to serve end users, could go a long way toward defining a DBMS that serves management. This paper briefly discusses the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the manager's Management Information System.

  1. Analysis and preliminary design of Kunming land use and planning management information system

    NASA Astrophysics Data System (ADS)

    Li, Li; Chen, Zhenjie

    2007-06-01

This article analyzes the Kunming land use planning and management information system in terms of system building objectives and requirements, and pins down the system's users, functional requirements, and construction requirements. On this basis, a three-tier architecture combining C/S and B/S modes is defined: the user interface layer, the business logic layer, and the data services layer. According to the requirements for the construction of a land use planning and management information database, derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, the paper divides the system databases into a planning document database, a planning implementation database, a working map database, and a system maintenance database. In the design of the system interfaces, various methods and data formats are used for data transmission and sharing between upper and lower administrative levels. Based on the system analysis results, the main modules of the system are designed as follows: planning data management; planning and annual plan preparation and control; day-to-day planning management; planning revision management; decision-making support; thematic query statistics; and planning public participation. In addition, system realization technologies are discussed in terms of the system operation mode, the development platform, and other aspects.

  2. The research and implementation of PDM systems based on the .NET platform

    NASA Astrophysics Data System (ADS)

    Gao, Hong-li; Jia, Ying-lian; Yang, Ji-long; Jiang, Wei

    2005-12-01

A new PDM system scheme based on the .NET platform, designed to solve application problems of current PDM systems in enterprises, is described. The key technologies of this system, such as .NET, data access, information processing, and the Web, are discussed. A three-tier architecture for a PDM system based on a mixed C/S and B/S mode is presented. In this system, all users share the same database server in order to ensure the coherence and safety of client data. ADO.NET leverages the power of XML to provide disconnected access to data, which frees the connection to be used by other clients; using this approach, system performance was improved. Moreover, the important function modules of a PDM system, such as project management, product structure management, and document management, were developed and realized.
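The disconnected-access pattern the abstract attributes to ADO.NET can be re-expressed with Python's standard library as a rough analogy: rows are fetched once, snapshotted as XML, and the connection is released for other clients. The `parts` table is an invented example, not from the paper:

```python
# Sketch of disconnected data access: query once, serialize the result
# set to an in-memory XML "dataset", then close the connection so it is
# free for other clients (the role ADO.NET's DataSet plays).
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO parts VALUES (?, ?)",
                 [(1, "bolt"), (2, "bracket")])

# Snapshot the result set, then release the connection immediately.
rows = conn.execute("SELECT id, name FROM parts ORDER BY id").fetchall()
conn.close()

dataset = ET.Element("parts")
for part_id, name in rows:
    ET.SubElement(dataset, "part", id=str(part_id)).text = name

# Clients now work against the in-memory XML, not the live database.
print(ET.tostring(dataset, encoding="unicode"))
```

The design choice is the one the abstract claims: holding data as a serializable snapshot shortens connection lifetimes, which is what improves throughput when many clients share one database server.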

  3. NASIS data base management system - IBM 360/370 OS MVT implementation. 7: Data base administrator user's guide

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Data Base Administrator User's Guide for the NASA Aerospace Safety information system is presented. The subjects discussed are: (1) multi-terminal tasking, (2) data base executive, (3) utilities, (4) maintenance, and (5) update mode functions.

  4. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    PubMed

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  5. Developing a Web-Based Nursing Practice and Research Information Management System: A Pilot Study.

    PubMed

    Choi, Jeeyae; Lapp, Cathi; Hagle, Mary E

    2015-09-01

Many hospital information systems have been developed and implemented to collect clinical data from the bedside and have used the information to improve patient care. Because of a growing awareness that the use of clinical information improves quality of care and patient outcomes, measuring tools (electronic and paper based) have been developed, but most of them require multiple steps of data collection and analysis. This necessitated the development of a Web-based Nursing Practice and Research Information Management System that processes clinical nursing data to measure nurses' delivery of care and its impact on patient outcomes and provides useful information to clinicians, administrators, researchers, and policy makers at the point of care. This pilot study developed a computer algorithm based on a falls prevention protocol and programmed the prototype Web-based Nursing Practice and Research Information Management System. It successfully measured the performance of nursing care delivered and its impact on patient outcomes using clinical nursing data from the study site. Although the Nursing Practice and Research Information Management System was tested with small data sets, the results of the study revealed that it has the potential to measure nurses' delivery of care and its impact on patient outcomes while pinpointing components of the nursing process in need of improvement.
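A falls-prevention compliance measure of the kind the pilot computed might look like the following; the field names and the three protocol checks are hypothetical illustrations, not the study's actual algorithm:

```python
# Illustrative sketch: compute the fraction of patient records in which
# every step of a falls-prevention protocol was charted as complete.
def falls_protocol_compliance(records):
    """Return compliance rate in [0, 1] over a list of record dicts."""
    required = ("risk_assessed", "bed_alarm_on", "rounding_done")
    if not records:
        return 0.0
    compliant = sum(1 for r in records if all(r.get(k) for k in required))
    return compliant / len(records)

records = [
    {"risk_assessed": True, "bed_alarm_on": True, "rounding_done": True},
    {"risk_assessed": True, "bed_alarm_on": False, "rounding_done": True},
]
print(falls_protocol_compliance(records))  # 0.5
```

Per-check breakdowns of the non-compliant records are what would let such a system "pinpoint components of the nursing process in need of improvement."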

  6. The Research of Spatial-Temporal Analysis and Decision-Making Assistant System for Disabled Person Affairs Based on Mapworld

    NASA Astrophysics Data System (ADS)

    Zhang, J. H.; Yang, J.; Sun, Y. S.

    2015-06-01

This system combines the Mapworld platform with the informationization of disabled persons' affairs, using basic information on disabled persons as its central frame. Based on the disabled person population database, the affairs management system, and the statistical account system, the data were effectively integrated and a unified information resource database was built. Through data analysis and mining, the system provides powerful data support for decision making, affairs management, and public service. It finally realizes the rationalization, normalization, and scientization of disabled person affairs management, and makes a significant contribution to the great-leap-forward development of the informationization of the China Disabled Persons' Federation.

  7. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  8. Data bases and data base systems related to NASA's Aerospace Program: A bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 641 reports, articles, and other documents introduced into the NASA scientific and technical information system during the period January 1, 1981 through June 30, 1982. The directory was compiled to assist in the location of numerical and factual data bases and data base handling and management systems.

  9. Associative programming language and virtual associative access manager

    NASA Technical Reports Server (NTRS)

    Price, C.

    1978-01-01

    APL provides convenient associative data manipulation functions in a high level language. Six statements were added to PL/1 via a preprocessor: CREATE, INSERT, FIND, FOR EACH, REMOVE, and DELETE. They allow complete control of all data base operations. During execution, data base management programs perform the functions required to support the APL language. VAAM is the data base management system designed to support the APL language. APL/VAAM is used by CADANCE, an interactive graphic computer system. VAAM is designed to support heavily referenced files. Virtual memory files, which utilize the paging mechanism of the operating system, are used. VAAM supports a full network data structure. The two basic blocks in a VAAM file are entities and sets. Entities are the basic information element and correspond to PL/1 based structures defined by the user. Sets contain the relationship information and are implemented as arrays.
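The entity/set structure described above can be sketched in miniature: entities hold attribute data, sets hold relationship membership, mirroring APL's CREATE, INSERT, and FIND verbs. The class and method names here are illustrative, not VAAM's actual API:

```python
# Rough sketch of a VAAM-style network data structure: entities are the
# basic information elements; sets record which entities belong to a
# relationship, as the abstract describes.
class Network:
    def __init__(self):
        self.entities = []   # CREATE target: basic information elements
        self.sets = {}       # set name -> list of entity indices

    def create(self, **attributes):
        """CREATE: add an entity, return its id."""
        self.entities.append(attributes)
        return len(self.entities) - 1

    def insert(self, set_name, entity_id):
        """INSERT: make an entity a member of a set."""
        self.sets.setdefault(set_name, []).append(entity_id)

    def find(self, set_name, **criteria):
        """FIND / FOR EACH: members of the set matching all criteria."""
        return [self.entities[i] for i in self.sets.get(set_name, [])
                if all(self.entities[i].get(k) == v
                       for k, v in criteria.items())]

net = Network()
line = net.create(kind="line", length=4.0)
arc = net.create(kind="arc", radius=2.0)
net.insert("drawing_1", line)
net.insert("drawing_1", arc)
print(net.find("drawing_1", kind="arc"))  # [{'kind': 'arc', 'radius': 2.0}]
```

In VAAM the sets were arrays in virtual-memory files paged by the operating system; the in-memory lists above stand in for that machinery.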

  10. NASIS data base management system: IBM 360 TSS implementation. Volume 8: Data base administrator user's guide

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Data Base Administrator User's Guide for the NASA Aerospace Safety Information System is presented. The subjects discussed are: (1) multi-terminal tasking, (2) data base executive, (3) utilities, (4) maintenance, (5) terminal support, and (6) retrieval subsystem.

  11. Research of the small satellite data management system

    NASA Astrophysics Data System (ADS)

    Yu, Xiaozhou; Zhou, Fengqi; Zhou, Jun

    2007-11-01

Small satellites combine light weight, small volume, and low launch cost, and are a promising approach to realizing future space missions. A detailed study of the data management system was carried out using a new reconfiguration method based on System On Programmable Chip (SOPC). Compared with the common satellite structure, the Central Terminal Unit (CTU), the Remote Terminal Unit (RTU), and the Serial Data Bus (SDB) of the data management system are all integrated on a single chip, greatly improving the reliability of the satellite. At the same time, the data management system has powerful performance owing to the processing ability of modern FPGAs.

  12. Commentary to Library Statistical Data Base.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has developed a library statistical data base which concentrates on the management information needs of administrators of public and academic libraries. This document provides an overview of the framework and conceptual approach employed in the design of the data base. The data…

  13. An Advanced IoT-based System for Intelligent Energy Management in Buildings.

    PubMed

    Marinakis, Vangelis; Doukas, Haris

    2018-02-16

The energy sector is closely interconnected with the building sector, and integrated Information and Communication Technologies (ICT) solutions for effective energy management, supporting decision-making at building, district and city level, are fundamental elements for making a city Smart. The available systems are designed and intended exclusively for a predefined number of cases and systems, without allowing for expansion and interoperability with other applications, which is partially due to the lack of semantics. This paper presents an advanced Internet of Things (IoT) based system for intelligent energy management in buildings. A semantic framework is introduced, aiming at the unified and standardised modelling of the entities that constitute the building environment. Suitable rules are formed, aiming at intelligent energy management and the general modus operandi of the Smart Building. In this context, an IoT-based system was implemented, which enhances the interactivity of buildings' energy management systems. The results from its pilot application are presented and discussed. The proposed system extends existing approaches and integrates cross-domain data, such as the building's data (e.g., energy management systems), energy production, energy prices, weather data and end-users' behaviour, in order to produce daily and weekly action plans for the energy end-users with actionable personalised information.
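The kind of rule the paper describes, folding cross-domain readings into a daily action plan, can be sketched as follows; the thresholds, field names, and the two rules are invented for illustration:

```python
# Minimal sketch of rule-based action planning over cross-domain
# building data (occupancy, HVAC state, grid price, PV output).
def daily_action_plan(readings):
    """Map hourly cross-domain readings to simple energy actions."""
    actions = []
    for r in readings:
        if r["occupancy"] == 0 and r["hvac_on"]:
            actions.append((r["hour"], "switch HVAC off: room unoccupied"))
        elif r["price_eur_kwh"] > 0.30 and r["pv_output_kw"] < 1.0:
            actions.append((r["hour"],
                            "defer flexible loads: expensive grid power"))
    return actions

readings = [
    {"hour": 3, "occupancy": 0, "hvac_on": True,
     "price_eur_kwh": 0.12, "pv_output_kw": 0.0},
    {"hour": 19, "occupancy": 5, "hvac_on": True,
     "price_eur_kwh": 0.35, "pv_output_kw": 0.2},
]
for hour, action in daily_action_plan(readings):
    print(f"{hour:02d}:00 {action}")
```

In the paper's architecture the rules operate over a semantic (ontology-based) model of the building rather than raw dicts, which is what makes them portable across buildings and applications.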

  14. An Advanced IoT-based System for Intelligent Energy Management in Buildings

    PubMed Central

Marinakis, Vangelis; Doukas, Haris

    2018-01-01

The energy sector is closely interconnected with the building sector, and integrated Information and Communication Technologies (ICT) solutions for effective energy management, supporting decision-making at building, district and city level, are fundamental elements for making a city Smart. The available systems are designed and intended exclusively for a predefined number of cases and systems, without allowing for expansion and interoperability with other applications, which is partially due to the lack of semantics. This paper presents an advanced Internet of Things (IoT) based system for intelligent energy management in buildings. A semantic framework is introduced, aiming at the unified and standardised modelling of the entities that constitute the building environment. Suitable rules are formed, aiming at intelligent energy management and the general modus operandi of the Smart Building. In this context, an IoT-based system was implemented, which enhances the interactivity of buildings' energy management systems. The results from its pilot application are presented and discussed. The proposed system extends existing approaches and integrates cross-domain data, such as the building's data (e.g., energy management systems), energy production, energy prices, weather data and end-users' behaviour, in order to produce daily and weekly action plans for the energy end-users with actionable personalised information. PMID:29462957

  15. Using the web for recruitment, screen, tracking, data management, and quality control in a dietary assessment clinical validation trial.

    PubMed

    Arab, Lenore; Hahn, Harry; Henry, Judith; Chacko, Sara; Winter, Ashley; Cambou, Mary C

    2010-03-01

Screening and tracking subjects and data management in clinical trials require significant investments in manpower that can be reduced through the use of web-based systems. To support a validation trial of various dietary assessment tools that required multiple clinic visits and eight repeats of online assessments, we developed an interactive web-based system to automate all levels of management of a biomarker-based clinical trial. The "Energetics System" was developed to support 1) the work of the study coordinator in recruiting, screening and tracking subject flow, 2) the need of the principal investigator to review study progress, and 3) continuous data analysis. The system was designed to automate web-based self-screening into the trial. It supported scheduling tasks and triggered tailored messaging for late and non-responders. For the investigators, it provided real-time status overviews on all subjects, created electronic case reports, supported data queries and prepared analytic data files. Encryption and multi-level password protection were used to ensure data privacy. The system was programmed iteratively and required six months of a web programmer's time along with active team engagement. In this study the enhancement in speed and efficiency of recruitment and quality of data collection as a result of this system outweighed the initial investment. Web-based systems have the potential to streamline the process of recruitment and day-to-day management of clinical trials in addition to improving efficiency and quality. Because of their added value they should be considered for trials of moderate size or complexity. Copyright 2009 Elsevier Inc. All rights reserved.
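Automated self-screening of the kind described reduces coordinator workload by rejecting ineligible applicants before any human review. A minimal sketch, with eligibility criteria invented for illustration (not the trial's actual criteria):

```python
# Hedged sketch of web-based self-screening: evaluate an online form
# against eligibility rules and report the reasons for any rejection.
def screen(applicant):
    """Return (eligible, reasons) for a submitted screening form."""
    reasons = []
    if not (18 <= applicant["age"] <= 70):
        reasons.append("age outside 18-70")
    if applicant["on_excluded_medication"]:
        reasons.append("excluded medication")
    if not applicant["internet_access"]:
        reasons.append("cannot complete online assessments")
    return (len(reasons) == 0, reasons)

eligible, reasons = screen(
    {"age": 45, "on_excluded_medication": False, "internet_access": True})
print(eligible, reasons)  # True []
```

Returning explicit reasons, rather than a bare yes/no, is what lets a coordinator audit borderline cases and lets the system send tailored follow-up messages.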

  16. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

Existing NASA supported scientific data bases are usually developed, managed and populated in a tedious, error prone and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation Earth remote sensing platforms, i.e., the Earth Observation System (EOS), will be capable of generating data at a rate of over 300 megabits per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented data base where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.
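The train-then-label shape of the neural net characterization step can be shown with a toy single perceptron classifying two-band pixel vectors as "water" versus "land"; the bands, labels, and training pixels are invented, and the real system used a far richer network and feature set:

```python
# Toy sketch: a single perceptron trained on two-band pixel intensities.
def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (features, label) pairs with label in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def label(w, b, x):
    """Map a pixel feature vector to a high-level object class."""
    return "water" if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else "land"

# Hypothetical training pixels: water is dark in both bands.
samples = [([0.1, 0.2], 1), ([0.2, 0.1], 1),
           ([0.8, 0.9], 0), ([0.9, 0.7], 0)]
w, b = train_perceptron(samples)
print(label(w, b, [0.15, 0.15]), label(w, b, [0.85, 0.8]))
```

In the described pipeline, each labeled region would then be stored as an object (with its class and attributes) in the object-oriented data base rather than as raw pixels.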

  17. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

Existing NASA supported scientific data bases are usually developed, managed and populated in a tedious, error prone and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation Earth remote sensing platforms (i.e., the Earth Observation System (EOS)) will be capable of generating data at a rate of over 300 megabits per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented data base where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  18. Software Framework for Peer Data-Management Services

    NASA Technical Reports Server (NTRS)

    Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  19. Reusing Information Management Services for Recommended Decadal Study Missions to Facilitate Aerosol and Cloud Studies

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve

    2008-01-01

    NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.

  20. Generalized File Management System or Proto-DBMS?

    ERIC Educational Resources Information Center

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  1. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zubair, M.; Ziebartt, John (Technical Monitor)

    2001-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
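
    The abstract does not reproduce XDMF's interfaces; as a rough sketch of the kind of XML restructuring such a facility performs, the example below uses Python's standard xml.etree (rather than Java and Xalan, as in the actual prototype) and an invented element vocabulary:

```python
import xml.etree.ElementTree as ET

# A minimal, invented scientific-dataset description (not the real XDMF schema).
SOURCE = """
<dataset name="wind_tunnel_run_42">
  <variable name="pressure" units="Pa"/>
  <variable name="velocity" units="m/s"/>
</dataset>
"""

def transform(xml_text: str) -> str:
    """Restructure the dataset description into a flat <catalog> record,
    the kind of transformation an XSLT stylesheet would express declaratively."""
    src = ET.fromstring(xml_text)
    catalog = ET.Element("catalog")
    entry = ET.SubElement(catalog, "entry", id=src.get("name"))
    for var in src.findall("variable"):
        ET.SubElement(entry, "field").text = f'{var.get("name")} [{var.get("units")}]'
    return ET.tostring(catalog, encoding="unicode")

result = transform(SOURCE)
```

    In the real facility the transformation functions themselves are discoverable resources; here the function is simply hard-coded.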

  2. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.

  3. The Design of Data Disaster Recovery of National Fundamental Geographic Information System

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Chen, J.; Liu, L.; Liu, J.

    2014-04-01

    With the development of information technology, the data security of information systems faces more and more challenges. Surveying and mapping geographic information is a fundamental and strategic resource, applied in all areas of national economic, defence, and social development, and it is especially vital to national and social interests when such classified geographic information directly concerns Chinese sovereignty. Several urgent problems for surveying and mapping are how to handle mass data storage and backup, how to establish and improve a disaster backup system (especially after a sudden natural calamity or accident), and how to ensure that all parts of the information system are rapidly restored to correct operation. To overcome various disaster risks, protect the security of the data, and reduce the impact of disasters, the effective way is to analyse the features of data storage and management and the security requirements, and to ensure that the design of the data disaster recovery system suits surveying and mapping. This article analyses the features of fundamental geographic information data and the requirements of storage management, and presents a three-site disaster recovery plan for the DBMS based on widely used network, storage and backup, data replication, and remote application-switching technologies: within the LAN, synchronous replication runs between the database management servers and the local backup storage management systems, while remote asynchronous data replication runs between the local backup storage management systems and the remote database management servers. The core of the system is resolving a local disaster at the remote site, ensuring the data security and business continuity of the local site. The article covers the background, the necessity of a disaster recovery system, the analysis of the data holdings, and the data disaster recovery plan. Features of this design are hardware-based hot backup of data and remote online disaster recovery support for the Oracle database system. The contribution of this paper is to summarize and analyse the common disaster recovery requirements of surveying and mapping business systems and, based on the actual situation of the industry, to design a basic GIS disaster recovery solution; conclusions about the key technologies of RTO (recovery time objective) and RPO (recovery point objective) are also given.
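
    The two replication tiers described above (synchronous within the LAN, asynchronous to the remote site) can be sketched conceptually; everything here is illustrative, since the actual design relies on Oracle and hardware-based replication:

```python
from collections import deque

class Primary:
    """Conceptual model of the plan's two replication tiers: writes reach
    local backup storage synchronously, while a replication log feeds the
    remote disaster-recovery site asynchronously. (Illustrative only.)"""

    def __init__(self):
        self.data = {}           # primary database
        self.local_backup = {}   # LAN backup, updated synchronously
        self.remote = {}         # remote DR site, updated asynchronously
        self._pending = deque()  # replication log for the remote site

    def write(self, key, value):
        self.data[key] = value
        self.local_backup[key] = value      # synchronous: done before returning
        self._pending.append((key, value))  # asynchronous: shipped later

    def ship_to_remote(self):
        """Drain the replication log, e.g. on a timer or when the WAN is up."""
        while self._pending:
            key, value = self._pending.popleft()
            self.remote[key] = value

p = Primary()
p.write("sheet_001", "terrain model v1")
p.ship_to_remote()  # local backup was already consistent; remote catches up now
```

    The gap between a write and the next shipment is exactly what the RPO of the remote site measures.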

  4. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  5. Research and design on system of asset management based on RFID

    NASA Astrophysics Data System (ADS)

    Guan, Peng; Du, HuaiChang; Jing, Hua; Zhang, MengYue; Zhang, Meng; Xu, GuiXian

    2011-10-01

    By analyzing the problems in current asset management, this paper proposes applying RFID technology to asset management in order to improve the level of automation and informatization. The paper presents an equipment-identification design based on a 433 MHz RFID tag and reader, studied in depth at the level of the tag and reader circuits, and also illustrates the asset management system as a whole. An RS232-to-Ethernet converter is used to transfer data to the PC monitoring software, and the asset management system is implemented with Web techniques (PHP and MySQL).

  6. Research and Design of Embedded Wireless Meal Ordering System Based on SQLite

    NASA Astrophysics Data System (ADS)

    Zhang, Jihong; Chen, Xiaoquan

    The paper describes the features, internal architecture, and development method of SQLite, and then presents the design and implementation of a meal ordering system. The system realizes information interaction between users and embedded devices with SQLite as the database system. The embedded SQLite database manages the data, and wireless communication is achieved using Bluetooth. A system program based on Qt/Embedded and Linux drivers realizes the local management of environmental data.
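
    The paper's schema is not given; a minimal sketch of how an embedded ordering system might use SQLite (table and column names invented, with an in-memory database standing in for the on-device file):

```python
import sqlite3

# In-memory database stands in for the on-device SQLite file.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        id       INTEGER PRIMARY KEY,
        table_no INTEGER NOT NULL,
        dish     TEXT    NOT NULL,
        quantity INTEGER NOT NULL DEFAULT 1
    )
""")

def place_order(table_no: int, dish: str, quantity: int = 1) -> None:
    """Record one ordered dish; would be called when a wireless terminal
    delivers an order over Bluetooth."""
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "INSERT INTO orders (table_no, dish, quantity) VALUES (?, ?, ?)",
            (table_no, dish, quantity),
        )

def orders_for_table(table_no: int):
    """Return (dish, quantity) pairs for the kitchen display."""
    cur = conn.execute(
        "SELECT dish, quantity FROM orders WHERE table_no = ? ORDER BY id",
        (table_no,),
    )
    return cur.fetchall()

place_order(5, "noodles")
place_order(5, "tea", 2)
```

    SQLite's single-file, zero-configuration design is what makes it suitable for an embedded terminal like this.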

  7. A method of distributed avionics data processing based on SVM classifier

    NASA Astrophysics Data System (ADS)

    Guo, Hangyu; Wang, Jinyan; Kang, Minyang; Xu, Guojing

    2018-03-01

    In a system-of-systems combat environment, to solve the problem of managing and analyzing the massive heterogeneous data of multi-platform avionics systems, this paper proposes a management solution called the avionics "resource cloud", based on big data technology, and designs a decision-aiding classifier based on the SVM algorithm. We designed an experiment with an STK simulation; the result shows that this method has high accuracy and broad application prospects.
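
    The paper does not publish its classifier, so the sketch below only illustrates the underlying SVM idea: a minimal linear SVM trained with hinge-loss sub-gradient descent on toy two-class data (all names and data invented):

```python
# Toy two-class data standing in for avionics data records (invented).
X = [(2.0, 2.0), (3.0, 2.0), (2.0, 3.0),
     (-2.0, -2.0), (-3.0, -2.0), (-2.0, -3.0)]
y = [1, 1, 1, -1, -1, -1]

def train_linear_svm(samples, labels, lam=0.001, lr=0.1, epochs=200):
    """Train a linear SVM by sub-gradient descent on the regularized
    hinge loss; a bias term is folded in as a constant feature."""
    dim = len(samples[0]) + 1
    w = [0.0] * dim
    data = [(x + (1.0,), label) for x, label in zip(samples, labels)]
    for _ in range(epochs):
        for x, label in data:
            margin = label * sum(wi * xi for wi, xi in zip(w, x))
            if margin < 1:  # point inside the margin: hinge term is active
                w = [wi + lr * (label * xi - lam * wi) for wi, xi in zip(w, x)]
            else:           # only the regularizer contributes
                w = [wi * (1 - lr * lam) for wi in w]
    return w

def predict(w, x):
    score = sum(wi * xi for wi, xi in zip(w, x + (1.0,)))
    return 1 if score >= 0 else -1

w = train_linear_svm(X, y)
```

    A production system would of course use kernels and a proper solver; this only shows what an SVM decision boundary optimizes.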

  8. IMS Version 3 Student Data Base Maintenance Program.

    ERIC Educational Resources Information Center

    Brown, John R.

    Computer routines that update the Instructional Management System (IMS) Version 3 student data base which supports the Southwest Regional Laboratory's (SWRL) student monitoring system are described. Written in IBM System 360 FORTRAN IV, the program updates the data base by adding, changing and deleting records, as well as adding and deleting…

  9. Non-Procedural Languages for Information Resource Management.

    ERIC Educational Resources Information Center

    Bearley, William L.

    The future of information resources management requires new approaches to implementing systems which will include a type of data base management that frees users to solve data processing problems logically by telling the system what they want, together with powerful non-procedural languages that will permit communication in simple, concise…

  10. 75 FR 1547 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ...: Notice of Determination. SUMMARY: Using data from Management Information System annual reports, FRA has... taken from FRA's Management Information System. Based on this data, the Administrator publishes a... effective upon publication. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  11. 75 FR 79308 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... from Management Information System annual reports, FRA has determined that the 2009 rail industry... program data taken from FRA's Management Information System. Based on this data, the Administrator... effective December 20, 2010. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  12. Web-GIS based information management system for the Bureau of Law Enforcement for Urban Management

    NASA Astrophysics Data System (ADS)

    Sun, Hai; Wang, Cheng; Ren, Bo

    2007-06-01

    The daily work of a Law Enforcement Bureau is crucial in urban management. However, with the development of the city, the information and data related to the Law Enforcement Bureau's daily work keep increasing and changing, which makes some traditional working methods limited and inefficient. Analyzing the demands and obstacles of the Law Enforcement Bureau, the paper proposes a new method to solve these problems: a Web-GIS based information management system produced for the Bureau of Law Enforcement for Urban Management of Foshan. The first part of the paper provides an overview of the system. The second part introduces the architecture of the system and its data organization. In the third part, the paper describes the design and implementation of the functional modules in detail. In the end, the paper concludes and proposes some strategic recommendations for the further development of the system. The paper focuses on the architecture and implementation of the system, solves the development issues based on ArcServer, and introduces a new concept to the local government to solve the current problems. Practical application of this system showed that it plays a very important role in the Law Enforcement Bureau's work.

  13. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents

    DTIC Science & Technology

    2009-06-01

    this information was not migrated to the new data base. The responsible offices were told to destroy the old cards, and thus, vast amounts of...then necessary to examine the online service-specific records management systems, namely the Army Records Information Management System (ARIMS), Air...Force Records Information Management System (AFRIMS), and the Navy Records Management System.3 Each system

  14. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools, and laborious activities, including, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs, and visual inspection. Very often, these monitoring systems, tools, and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and thereby enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of bridges located along the I-275 corridor in the state of Michigan.

  15. Compliance program data management system for The Idaho National Engineering Laboratory/Environmental Protection Agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzler, C.L.; Poloski, J.P.; Bates, R.A.

    1988-01-01

    The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs and dBase III Plus in an interactive mode to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation are performed using SAS, a statistical analysis software package. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste-site investigation sampling results, is desired or requested. 3 refs.

  16. DataHub: Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  17. DataHub - Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  18. Development of the prototype data management system of the solar H-alpha full disk observation

    NASA Astrophysics Data System (ADS)

    Wei, Ka-Ning; Zhao, Shi-Qing; Li, Qiong-Ying; Chen, Dong

    2004-06-01

    The Solar Chromospheric Telescope at Yunnan Observatory generates about 2 GB of FITS-format data per day, and such a volume of data is inconvenient to use directly; hence, data searching and sharing are important at present. Data searching, on-line browsing, remote access, and download are provided by a prototype data management system for the solar H-alpha full-disk observations, improved with workflow technology. Based on the Windows XP operating system and the MySQL database management system, a prototype system of the browser/server model was developed with Java and JSP. Data compression, searching, browsing, deletion (requiring authorization), and real-time download have been achieved.

  19. Wiki-Based Data Management to Support Systems Toxicology

    EPA Science Inventory

    As the field of toxicology relies more heavily on systems approaches for mode of action discovery, evaluation, and modeling, the need for integrated data management is greater than ever. To meet these needs, we have developed a flexible system that assists individual or multiple...

  20. Substance abuse treatment management information systems: balancing federal, state, and service provider needs.

    PubMed

    Camp, J M; Krakow, M; McCarty, D; Argeriou, M

    1992-01-01

    There is increased interest in documenting the characteristics and treatment outcomes of clients served with Alcohol, Drug Abuse, and Mental Health Block Grant funds. The evolution of federal client-based management systems for substance abuse treatment services demonstrates that data collection systems are important but require continued support. A review of the Massachusetts substance abuse management information system illustrates the utility of a client-based data set. The development and implementation of a comprehensive information system require overcoming organizational barriers and project delays, fostering collaborative efforts among staff from diverse agencies, and employing considerable resources. In addition, the need to develop mechanisms for increasing the reliability of the data and ongoing training for the users is presented. Finally, three applications of the management information system's role in shaping policy are reviewed: developing services for special populations (communities of color, women, and pregnant substance abusers, and injection drug users), utilizing MIS data for evaluation purposes, and determining funding allocations.

  1. Collaborative data model and data base development for paleoenvironmental and archaeological domain using Semantic MediaWiki

    NASA Astrophysics Data System (ADS)

    Willmes, C.

    2017-12-01

    In the frame of the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that needs to manage data, information, and knowledge from heterogeneous domains such as archaeology, the cultural sciences, and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open-source MediaWiki software, well known as the software behind Wikipedia, chosen for its facilitation of a web-based collaborative knowledge and information management platform. This software is enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the wiki platform and which provides complex query and API interfaces to the structured data stored in the SMW database. Using an additional open-source software called mobo, it is possible to improve the data model development process, as well as to automate data imports, from small spreadsheets to large relational databases. Mobo is a command-line tool that helps build and deploy SMW structures in an agile, schema-driven development way, and it allows the data model formalizations, written in JSON-Schema format, to be managed and collaboratively developed using version control systems like git. The combination of a well-equipped collaborative web platform facilitated by MediaWiki, the possibility to store and query structured data in this collaborative database provided by SMW, and the automated data import and data model development enabled by mobo results in a powerful but flexible collaborative knowledge base system. Furthermore, SMW allows the application of Semantic Web technology: the structured data can be exported to RDF, so it is possible to set up a triple store, including a SPARQL endpoint, on top of the database. The JSON-Schema based data models can be enhanced into JSON-LD to facilitate and profit from the possibilities of Linked Data technology.
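
    As a toy illustration of the JSON-Schema-based data models that mobo manages in version control (field names invented, and with a tiny hand-rolled checker instead of a full validator):

```python
# A toy JSON-Schema-style model for a site record, of the kind mobo keeps
# under version control (field names are invented for illustration).
SITE_SCHEMA = {
    "type": "object",
    "required": ["name", "latitude", "longitude"],
    "properties": {
        "name": {"type": "string"},
        "latitude": {"type": "number"},
        "longitude": {"type": "number"},
        "period": {"type": "string"},
    },
}

TYPES = {"string": str, "number": (int, float), "object": dict}

def validate(instance, schema):
    """Very small subset of JSON-Schema validation: required keys and
    property types only (a real deployment would use a full validator)."""
    if not isinstance(instance, TYPES[schema["type"]]):
        return False
    if any(key not in instance for key in schema.get("required", [])):
        return False
    return all(
        isinstance(instance[key], TYPES[sub["type"]])
        for key, sub in schema.get("properties", {}).items()
        if key in instance
    )

record = {"name": "Site A", "latitude": 36.9, "longitude": -4.8}
```

    Keeping such schemas in git is what lets the data model evolve collaboratively alongside the wiki content.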

  2. Recent Experience of the Application of a Commercial Data Base Management System (ADABAS) to a Scientific Data Bank (ECDIN).

    ERIC Educational Resources Information Center

    Town, William G.; And Others

    1980-01-01

    Discusses the problems encountered and solutions adopted in application of the ADABAS database management system to the ECDIN (Environmental Chemicals Data and Information Network) data bank. SIMAS, the pilot system, and ADABAS are compared, and ECDIN ADABAS design features are described. Appendices provide additional facts about ADABAS and SIMAS.…

  3. Data-Centric Situational Awareness and Management in Intelligent Power Systems

    NASA Astrophysics Data System (ADS)

    Dai, Xiaoxiao

    The rapid development of technology and society has made the current power system a much more complicated system than ever. The request for big-data-based situation awareness and management is urgent today. In this dissertation, to respond to this grand challenge, two data-centric power system situation awareness and management approaches are proposed, addressing the security problems in the transmission/distribution grids and the augmentation of social benefits at the distribution-customer level, respectively. To address the security problem in the transmission/distribution grids utilizing big data, the first approach provides a fault analysis solution based on characterization and analytics of synchrophasor measurements. Specifically, the optimal synchrophasor measurement device selection algorithm (OSMDSA) and the matching pursuit decomposition (MPD) based spatial-temporal synchrophasor data characterization method were developed to reduce data volume while preserving comprehensive information for the big data analyses, and the weighted Granger causality (WGC) method was investigated to conduct fault-impact causal analysis during system disturbance for fault localization. Numerical results and comparison with other methods demonstrate the effectiveness and robustness of this analytic approach. As more social effects become important considerations in power system management, the goal of situation awareness should be expanded to also include achievements in social benefits. The second approach investigates the concept and application of social energy upon the University of Denver campus grid to provide management improvement solutions for optimizing social cost. The social element (human working productivity cost) and the economic element (electricity consumption cost) are both considered in the evaluation of overall social cost. Moreover, power system simulation, numerical experiments for smart building modeling, distribution-level real-time pricing, and social response to the pricing signals are studied for implementing the interactive artificial-physical management scheme.

  4. Virtual Network Configuration Management System for Data Center Operations and Management

    NASA Astrophysics Data System (ADS)

    Okita, Hideki; Yoshizawa, Masahiro; Uehara, Keitaro; Mizuno, Kazuhiko; Tarui, Toshiaki; Naono, Ken

    Virtualization technologies are widely deployed in data centers to improve system utilization. However, they increase the workload for operators, who have to manage the structure of virtual networks in data centers. A virtual-network management system which automates the integration of the configurations of the virtual networks is provided. The proposed system collects the configurations from server virtualization platforms and VLAN-supported switches, and integrates these configurations according to a newly developed XML-based management information model for virtual-network configurations. Preliminary evaluations show that the proposed system helps operators by reducing, by about 40 percent, the time needed to acquire the configurations from devices and to correct inconsistencies in the operators' configuration management database. Further, they also show that the proposed system has excellent scalability: the system takes less than 20 minutes to acquire the virtual-network configurations from a large-scale network that includes 300 virtual machines. These results imply that the proposed system is effective for improving the configuration management process for virtual networks in data centers.
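
    The paper's XML-based information model is not reproduced here; the sketch below illustrates only the integration step, merging invented per-source views of a virtual network and flagging VLAN inconsistencies for the operator to correct:

```python
# Views of the same virtual network collected from two kinds of sources
# (invented layout): server virtualization platforms report which VLAN each
# VM should be on; switches report which VLAN each port actually carries.
server_view = {"vm01": {"port": "sw1/7", "vlan": 100},
               "vm02": {"port": "sw1/8", "vlan": 200}}
switch_view = {"sw1/7": 100, "sw1/8": 300}

def integrate(server_view, switch_view):
    """Merge both views into one configuration record per VM and flag
    VLAN mismatches between the intended and actual configuration."""
    merged, inconsistencies = {}, []
    for vm, cfg in server_view.items():
        actual = switch_view.get(cfg["port"])
        merged[vm] = {"port": cfg["port"],
                      "expected_vlan": cfg["vlan"],
                      "actual_vlan": actual}
        if actual != cfg["vlan"]:
            inconsistencies.append(vm)
    return merged, inconsistencies

merged, bad = integrate(server_view, switch_view)
```

    Automating exactly this cross-source comparison is what lets such a system correct the configuration management database without manual audits.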

  5. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and the Home Energy Management System (HEMS) plays an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern-matching algorithm for IF-THEN rules. We have proposed a rule-based HEMS using the Rete algorithm, in which the rules for managing energy are processed by smart taps in the network, so that the loads of processing rules and collecting data are distributed among the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing the rules with the Rete algorithm. In this paper, we evaluated the proposed system by simulation; in the simulation environment, each rule is processed by the smart tap that relates to the action part of that rule. We also implemented the proposed system as a HEMS using smart taps.
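
    A minimal sketch of IF-THEN rule evaluation over smart-tap facts (fact and rule names invented). This is not the Rete algorithm itself; it only mimics one of Rete's ideas, evaluating each shared condition once and reusing the result across all rules that reference it:

```python
# Minimal IF-THEN rule evaluation over smart-tap facts (names invented).
# NOT a Rete network: a real Rete compiles the rules into a dataflow graph
# so that only conditions affected by a fact change are re-evaluated.
facts = {"room_occupied": False, "lamp_on": True, "ac_on": True}

conditions = {
    "room_empty": lambda f: not f["room_occupied"],
    "lamp_on":    lambda f: f["lamp_on"],
    "ac_on":      lambda f: f["ac_on"],
}

rules = [
    (("room_empty", "lamp_on"), "turn_off_lamp"),
    (("room_empty", "ac_on"),   "turn_off_ac"),
]

def fire(facts):
    """Evaluate each condition once, then fire every rule whose conditions
    all hold; returns the list of actions to execute."""
    results = {name: cond(facts) for name, cond in conditions.items()}
    return [action for conds, action in rules
            if all(results[c] for c in conds)]

actions = fire(facts)
```

    Note that the shared "room_empty" condition is checked once even though two rules use it; Rete generalizes this sharing across the whole rule set.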

  6. Foundations of Constructing a Marketing Data Base; Profitable Applications of the Computer to Marketing Management.

    ERIC Educational Resources Information Center

    Podell, Harold J.

    An introduction into the foundations of constructing a marketing data base is presented for the systems and marketing executives who are familiar with basic computer technology methods. The techniques and concepts presented are now being implemented by major organizations in the development of Management Information Systems (MIS). A marketing data…

  7. Analysis of DISMS (Defense Integrated Subsistence Management System) Increment 4

    DTIC Science & Technology

    1988-12-01

    response data entry; and rationale supporting an on-line system based on real time management information needs. Keywords: Automated systems; Subsistence; Workload capacity; Bid response; Contract administration; Computer systems.

  8. Integrating modal-based NDE techniques and bridge management systems using quality management

    NASA Astrophysics Data System (ADS)

    Sikorsky, Charles S.

    1997-05-01

    The intent of bridge management systems is to help engineers and managers determine when and where to spend bridge funds such that the needs of commerce and the motoring public are satisfied. A major shortcoming states are experiencing is that the available NBIS data are insufficient to perform certain functions required by new bridge management systems, such as modeling bridge deterioration and predicting costs. This paper investigates how modal-based nondestructive damage evaluation techniques can be integrated into bridge management using quality management principles. First, quality from the manufacturing perspective is summarized. Next, the implementation of quality management in design and construction is reinterpreted for bridge management. Based on this, a theory of approach is formulated to improve the productivity of a highway transportation system.

  9. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  10. Data management for Computer-Aided Engineering (CAE)

    NASA Technical Reports Server (NTRS)

    Bryant, W. A.; Smith, M. R.

    1984-01-01

    Analysis of data flow through the design and manufacturing processes has established specific information management requirements and identified unique problems. The application of data management technology to the engineering/manufacturing environment addresses these problems. An overview of the IPAD prototype data base management system, representing a partial solution to these problems, is presented here.

  11. iRODS: A Distributed Data Management Cyberinfrastructure for Observatories

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; Vernon, F.

    2007-12-01

    Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long periods of time (preservation). The system needs to manage data stored on multiple types of storage systems, including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount.
The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed not only to describe management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or that automatically extracts its metadata and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm, executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub-collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and their application to long-term, large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
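    The event-condition-action pattern the abstract describes can be illustrated with a minimal sketch. This is not iRODS code; the `Rule`, `RuleEngine`, and the `replicate`/`extract_metadata` micro-services are invented for illustration, and real iRODS rules are written in its own rule language.

```python
# Minimal event-condition-action (ECA) rule-engine sketch of the kind iRODS
# uses for policy virtualization. All names here are illustrative, not iRODS APIs.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    event: str                          # e.g. "file_added"
    condition: Callable[[dict], bool]   # predicate over the event context
    actions: list                       # chain of micro-services (callables)

class RuleEngine:
    def __init__(self):
        self.rules: list[Rule] = []
        self.state: list[dict] = []     # persistent state info (audit trail)

    def register(self, rule: Rule):
        self.rules.append(rule)

    def fire(self, event: str, ctx: dict):
        """On an event, run every matching rule's micro-service chain."""
        for rule in self.rules:
            if rule.event == event and rule.condition(ctx):
                for action in rule.actions:
                    result = action(ctx)
                    # record which policy was applied and its execution result
                    self.state.append({"event": event,
                                       "action": action.__name__,
                                       "result": result})

# Two toy micro-services
def replicate(ctx):
    return f"replica of {ctx['file']} at {ctx.get('target', 'site2')}"

def extract_metadata(ctx):
    return {"file": ctx["file"], "size": ctx["size"]}

engine = RuleEngine()
engine.register(Rule(event="file_added",
                     condition=lambda c: c["collection"] == "observations",
                     actions=[replicate, extract_metadata]))
engine.fire("file_added",
            {"file": "t1.dat", "size": 1024, "collection": "observations"})
```

    The recorded `state` entries play the role of the persistent state information the abstract mentions: they let an administrator verify, after the fact, which policies were applied to which files.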

  12. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to efficiently use the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation search interface, full-resolution imagery overlaid on the map, and the exclusive use of Open Source Software (OSS) throughout the platform. The functions of the platform include browsing imagery on the map-based navigation interface, ordering and downloading data online, image dataset and user management, etc. At present, the system is under testing at RADI and will enter regular operation soon.

  13. Development of a statewide Landsat digital data base for forest insect damage assessment

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Dottavio, C. L.; Nelson, R. F.

    1983-01-01

    A Joint Research Project (JRP) involving NASA/Goddard Space Flight Center and the Pennsylvania Bureau of Forestry/Division of Forest Pest Management demonstrates the utility of Landsat data for assessing forest insect damage. A major effort within the project has been the creation of a map-registered, statewide Landsat digital data base for Pennsylvania. The data base, developed and stored on computers at the Pennsylvania State University Computation Center, contains Landsat imagery, a Landsat-derived forest resource map, and digitized data layers depicting Forest Pest Management District boundaries and county boundaries. A data management front-end system was also developed to provide an interface between the various layers of information within the data base and image analysis software. This front-end system ensures that an automated assessment of defoliation damage can be conducted and summarized by geographic area or jurisdiction of interest.

  14. Implementation of relational data base management systems on micro-computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, C.L.

    1982-01-01

    This dissertation describes an implementation of a Relational Data Base Management System on a microcomputer. A specific floppy-disk-based hardware platform called TERAK is used, and a high-level query interface similar to a subset of the SEQUEL language is provided. The system contains sub-systems for I/O, file management, virtual memory management, the query system, B-tree management, the scanner, the command interpreter, the expression compiler, garbage collection, linked-list manipulation, disk space management, etc. The software was implemented to fulfill the following goals: (1) It is highly modularized. (2) The system is physically segmented into 16 logically independent, overlayable segments, such that a minimal amount of memory is needed at execution time. (3) A virtual memory system is simulated that provides the system with seemingly unlimited memory space. (4) A language translator recognizes user requests in the query language; its code generator produces compact code for the execution of UPDATE, DELETE, and QUERY commands. (5) A complete set of basic functions needed for on-line data base manipulation is provided through a friendly query interface. (6) Dependency on the environment (both software and hardware) is minimized as much as possible, so that the system can easily be ported to other computers. (7) Each relation is simulated as a sequential file. The system is intended to be a highly efficient, single-user system suited for use by small or medium-sized organizations for, say, administrative purposes. Experiments show that quite satisfactory results have indeed been achieved.
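    The "relation as a sequential file" design in goal (7) can be sketched in a few lines. This is not the dissertation's code (which targeted a TERAK floppy-disk system); the file layout, delimiter, and relation contents below are assumptions for illustration.

```python
# Sketch: each relation is a flat sequential file of delimited records,
# scanned front to back to answer a SEQUEL-like selection query.
import os
import tempfile

def store_relation(path, header, rows):
    """Write a relation as a sequential file: header line, then one record per line."""
    with open(path, "w") as f:
        f.write("|".join(header) + "\n")
        for row in rows:
            f.write("|".join(str(v) for v in row) + "\n")

def query(path, where):
    """Sequential scan: yield the records (as dicts) satisfying the predicate."""
    with open(path) as f:
        header = f.readline().rstrip("\n").split("|")
        for line in f:
            rec = dict(zip(header, line.rstrip("\n").split("|")))
            if where(rec):
                yield rec

path = os.path.join(tempfile.mkdtemp(), "employee.rel")
store_relation(path, ["name", "dept"], [("lee", "sales"), ("kim", "eng")])
hits = list(query(path, lambda r: r["dept"] == "eng"))
```

    A real system of this kind would add B-tree indexes (as the dissertation's B-tree management sub-system does) to avoid scanning the whole file for selective queries.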

  15. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  16. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  17. A general scientific information system to support the study of climate-related data

    NASA Technical Reports Server (NTRS)

    Treinish, L. A.

    1984-01-01

    The development and use of NASA's Pilot Climate Data System (PCDS) are discussed. The PCDS is used as a focal point for managing and providing access to a large collection of actively used data for the Earth, ocean and atmospheric sciences. The PCDS provides uniform data catalogs, inventories, and access methods for selected NASA and non-NASA data sets. Scientific users can preview the data sets using graphical and statistical methods. The system has evolved from its original purpose as a climate data base management system in response to a national climate program, into an extensive package of capabilities to support many types of data sets from both spaceborne and surface based measurements with flexible data selection and analysis functions.

  18. A parallel data management system for large-scale NASA datasets

    NASA Technical Reports Server (NTRS)

    Srivastava, Jaideep

    1993-01-01

    The past decade has experienced a phenomenal growth in the amount of data and resultant information generated by NASA's operations and research projects. A key application is the reprocessing problem which has been identified to require data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project which has similar requirements. Deriving our understanding of NASA's future data management needs based on the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system to address the needs. Specifically, we propose to investigate issues in low-level record organizations and management, complex query processing, and query compilation and scheduling.

  19. A 3-D terrain visualization database for highway information management

    DOT National Transportation Integrated Search

    1999-07-26

    A Multimedia based Highway Information System (MMHIS) is described in the paper to improve the existing photologging system for various operation and management needs. The full digital, computer based MMHIS uses technologies of video, multimedia data...

  20. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    PubMed

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However, it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  1. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all kinds of bedside monitoring equipment are connected into the system based on information-integrating technology; after statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically based on a risk prediction standard and a decision support system, so that the anesthetist can perform reasonable and safe clinical processes; with clinical processes electronically recorded, standard record tables can be generated and the clinical workflow optimized as well. With the system, various kinds of patient data can be collected, stored, analyzed and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.

  2. A design for a ground-based data management system

    NASA Technical Reports Server (NTRS)

    Lambird, Barbara A.; Lavine, David

    1988-01-01

    An initial design for a ground-based data management system which includes intelligent data abstraction and cataloging is described. The large quantity of data on some current and future NASA missions leads to significant problems in providing scientists with quick access to relevant data. Human screening of data for potential relevance to a particular study is time-consuming and costly. Intelligent databases can provide automatic screening when given relevant scientific parameters and constraints. The data management system would provide, at a minimum, information on the availability of the range of data, the types available, the specific time periods covered together with data quality information, and related sources of data. The system would inform the user about the primary types of screening, analysis, and presentation methods available. The system would then aid the user in performing the desired tasks, in such a way that the user need only specify the scientific parameters and objectives, and not worry about specific details for running a particular program. The design contains modules for data abstraction, catalog plan abstraction, a user-friendly interface, and expert systems for data handling, data evaluation, and application analysis. The emphasis is on developing general facilities for data representation, description, analysis, and presentation that will be easily used by scientists directly, thus bypassing the knowledge acquisition bottleneck. Expert system technology is used for many different aspects of the data management system, including the direct user interface, the interface to the data analysis routines, and the analysis of instrument status.

  3. Energy management system turns data into market info

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Traynor, P.J.; Ackerman, W.J.

    1996-09-01

    The designers claim that Wisconsin Power & Light Co's new energy management system is the first system of its type in the world in terms of the comprehensiveness and scope of its stored and retrievable data. Furthermore, the system's link to the utility's generating assets enables powerplant management to dispatch generation resources based on up-to-date unit characteristics. That means that the new system gives WP&L a competitive tool to optimize operations as well as fine-tune its EMS based on timely load and unit response information. Additionally, the EMS gives WP&L insight into the complex issues related to the unbundling of generation resources.

  4. WebBio, a web-based management and analysis system for patient data of biological products in hospital.

    PubMed

    Lu, Ying-Hao; Kuo, Chen-Chun; Huang, Yaw-Bin

    2011-08-01

    We selected HTML, PHP and JavaScript as the programming languages to build "WebBio", a web-based system for patient data of biological products, and used MySQL as the database. WebBio is based on the PHP-MySQL suite and is run by an Apache server on a Linux machine. WebBio provides data management, searching and data analysis functions for 20 kinds of biological products (plasma expanders, human immunoglobulin and hematological products). There are two particular features in WebBio: (1) pharmacists can rapidly find out which patients used contaminated products, for medication safety, and (2) the statistics charts for a specific product can be automatically generated to reduce the pharmacist's workload. WebBio has successfully turned traditional paper work into web-based data management.
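    Feature (1) above is essentially a lot-traceability query. WebBio itself is PHP/MySQL; the sketch below uses Python's sqlite3 as a stand-in, and the `dispensing` table and its columns are assumptions for illustration, not WebBio's actual schema.

```python
# Hedged sketch of a contaminated-lot lookup: given a product and lot number,
# list every patient who received that lot.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dispensing (patient_id TEXT, product TEXT, lot TEXT);
INSERT INTO dispensing VALUES
  ('P1', 'human immunoglobulin', 'L100'),
  ('P2', 'plasma expander',      'L200'),
  ('P3', 'human immunoglobulin', 'L100');
""")

def patients_given_lot(conn, product, lot):
    """Find every patient who received a specific (possibly contaminated) lot."""
    rows = conn.execute(
        "SELECT DISTINCT patient_id FROM dispensing WHERE product=? AND lot=?",
        (product, lot))
    return sorted(r[0] for r in rows)

affected = patients_given_lot(conn, 'human immunoglobulin', 'L100')
```

    Keeping the dispensing record per patient and per lot is what makes this query answerable at all; the paper-based workflow WebBio replaced would require a manual search of dispensing logs.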

  5. Data Element Dictionary: Staff. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those staff-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  6. Data Element Dictionary: Facilities. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those facilities-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  7. Data Element Dictionary: Course. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those course-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  8. Data Element Dictionary: Finance. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those finance-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  9. SigmaCLIPSE = presentation management + NASA CLIPS + SQL

    NASA Technical Reports Server (NTRS)

    Weiss, Bernard P., Jr.

    1990-01-01

    SigmaCLIPSE provides an expert systems and 'intelligent' data base development program for diverse systems integration environments that require support for automated reasoning and expert systems technology, presentation management, and access to 'intelligent' SQL data bases. The SigmaCLIPSE technology and its integrated ability to access 4th generation application development and decision support tools through a portable SQL interface comprise a sophisticated software development environment for solving knowledge engineering and expert systems development problems in information-intensive commercial environments -- financial services, health care, and distributed process control -- where the expert system must be extendable -- a major architectural advantage of NASA CLIPS. SigmaCLIPSE is a research effort intended to test the viability of merging SQL data bases with expert systems technology.

  10. Implementing a genomic data management system using iRODS in the Wellcome Trust Sanger Institute

    PubMed Central

    2011-01-01

    Background Increasingly large amounts of DNA sequencing data are being generated within the Wellcome Trust Sanger Institute (WTSI). The traditional file system struggles to handle these increasing amounts of sequence data. A good data management system therefore needs to be implemented and integrated into the current WTSI infrastructure. Such a system enables good management of the IT infrastructure of the sequencing pipeline and allows biologists to track their data. Results We have chosen a data grid system, iRODS (Rule-Oriented Data management systems), to act as the data management system for the WTSI. iRODS provides a rule-based system management approach which makes data replication much easier and provides extra data protection. Unlike the metadata provided by traditional file systems, the metadata system of iRODS is comprehensive and allows users to customize their own application level metadata. Users and IT experts in the WTSI can then query the metadata to find and track data. The aim of this paper is to describe how we designed and used (from both system and user viewpoints) iRODS as a data management system. Details are given about the problems faced and the solutions found when iRODS was implemented. A simple use case describing how users within the WTSI use iRODS is also introduced. Conclusions iRODS has been implemented and works as the production system for the sequencing pipeline of the WTSI. Both biologists and IT experts can now track and manage data, which could not previously be achieved. This novel approach allows biologists to define their own metadata and query the genomic data using those metadata. PMID:21906284

  11. Development of the interconnectivity and enhancement (ICE) module in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2007-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was implemented at the Virginia Department of Transportation (VDOT) in 2002 to manage geotechnical data using a distributed Geographical Information System (G...

  12. Site-Based Management versus Systems-Based Thinking: The Impact of Data-Driven Accountability and Reform

    ERIC Educational Resources Information Center

    Mette, Ian M.; Bengtson, Ed

    2015-01-01

    This case was written to help prepare building-level and central office administrators who are expected to effectively lead schools and systems in an often tumultuous world of educational accountability and reform. The intent of this case study is to allow educators to examine the impact data management has on the types of thinking required when…

  13. An on-line image data base system: Managing image collections

    Treesearch

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    Many researchers and land management personnel want photographic records of the phases of their studies or projects. Depending on the personnel and the type of project, a study can result in a few or hundreds of photographic images. A data base system allows users to query using various parameters, such as key words, dates, and project locations, and to view images...

  14. Land management: data availability and process understanding for global change studies.

    PubMed

    Erb, Karl-Heinz; Luyssaert, Sebastiaan; Meyfroidt, Patrick; Pongratz, Julia; Don, Axel; Kloster, Silvia; Kuemmerle, Tobias; Fetzel, Tamara; Fuchs, Richard; Herold, Martin; Haberl, Helmut; Jones, Chris D; Marín-Spiotta, Erika; McCallum, Ian; Robertson, Eddy; Seufert, Verena; Fritz, Steffen; Valade, Aude; Wiltshire, Andrew; Dolman, Albertus J

    2017-02-01

    In the light of daunting global sustainability challenges such as climate change, biodiversity loss and food security, improving our understanding of the complex dynamics of the Earth system is crucial. However, large knowledge gaps related to the effects of land management persist, in particular those human-induced changes in terrestrial ecosystems that do not result in land-cover conversions. Here, we review the current state of knowledge of ten common land management activities for their biogeochemical and biophysical impacts, the level of process understanding and data availability. Our review shows that ca. one-tenth of the ice-free land surface is under intense human management, half under medium and one-fifth under extensive management. Based on our review, we cluster these ten management activities into three groups: (i) management activities for which data sets are available, and for which a good knowledge base exists (cropland harvest and irrigation); (ii) management activities for which sufficient knowledge on biogeochemical and biophysical effects exists but robust global data sets are lacking (forest harvest, tree species selection, grazing and mowing harvest, N fertilization); and (iii) land management practices with severe data gaps concomitant with an unsatisfactory level of process understanding (crop species selection, artificial wetland drainage, tillage and fire management and crop residue management, an element of crop harvest). Although we identify multiple impediments to progress, we conclude that the current status of process understanding and data availability is sufficient to advance with incorporating management in, for example, Earth system or dynamic vegetation models in order to provide a systematic assessment of their role in the Earth system. This review contributes to a strategic prioritization of research efforts across multiple disciplines, including land system research, ecological research and Earth system modelling. 
© 2016 John Wiley & Sons Ltd.

  15. Design, Development and Utilization Perspectives on Database Management Systems

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1977-01-01

    This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)

  16. Policy-based Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Moore, R. W.

    2009-12-01

    The analysis and understanding of climate variability and change builds upon access to massive collections of observational and simulation data. The analyses involve distributed computing, both at the storage systems (which support data subsetting) and at compute engines (for assimilation of observational data into simulations). The integrated Rule Oriented Data System (iRODS) organizes the distributed data into collections to facilitate enforcement of management policies, support remote data processing, and enable development of reference collections. Currently at RENCI, the iRODS data grid is being used to manage ortho-photos and lidar data for the State of North Carolina, provide a unifying storage environment for engagement centers across the state, support distributed access to visualizations of weather data, and is being explored to manage and disseminate collections of ensembles of meteorological and hydrological model results. In collaboration with the National Climatic Data Center, an iRODS data grid is being established to support data transmission from NCDC to ORNL, and to integrate NCDC archives with ORNL compute services. To manage the massive data transfers, parallel I/O streams are used between High Performance Storage System tape archives and the supercomputers at ORNL. Further, we are exploring the movement and management of large RADAR and in situ datasets to be used for data mining between RENCI and NCDC, and for the distributed creation of decision support and climate analysis tools. The iRODS data grid supports all phases of the scientific data life cycle, from management of data products for a project, to sharing of data between research institutions, to publication of data in a digital library, to preservation of data for use in future research projects. Each phase is characterized by a broader user community, with higher expectations for more detailed descriptions and analysis mechanisms for manipulating the data. 
The higher usage requirements are enforced by management policies that define the required metadata, the required data formats, and the required analysis tools. The iRODS policy based data management system automates the creation of the community chosen data products, validates integrity and authenticity assessment criteria, and enforces management policies across all accesses of the system.

  17. Implementation of system intelligence in a 3-tier telemedicine/PACS hierarchical storage management system

    NASA Astrophysics Data System (ADS)

    Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.

    1995-05-01

    Our tele-medicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, an optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PCs) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as tele-medicine, tele-radiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte, with 10+ years of storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) or those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape media. Clustering patient data on the same tape eliminates multiple tape loads and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drives' high-performance data streaming capabilities, thereby reducing the data retrieval delays typical of streaming tape devices.
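    The clustering idea above can be sketched as a small packing routine: when migrating studies from optical to tape, keep all of a patient's studies contiguous on one tape so a later retrieval needs a single tape load. The data model and tape capacity below are invented for the example, not taken from the described system.

```python
# Sketch: cluster patient studies onto tapes during optical-to-tape migration,
# keeping each patient's studies together on one tape.
from collections import defaultdict

def cluster_onto_tapes(studies, tape_capacity_gb):
    """studies: list of (patient_id, size_gb) tuples.
    Returns a list of tapes, each a list of the studies written to it."""
    by_patient = defaultdict(list)
    for pid, size in studies:
        by_patient[pid].append((pid, size))
    tapes, current, used = [], [], 0.0
    for pid in sorted(by_patient):              # keep each patient contiguous
        group = by_patient[pid]
        group_size = sum(s for _, s in group)
        # start a new tape rather than splitting a patient across tapes
        if current and used + group_size > tape_capacity_gb:
            tapes.append(current)
            current, used = [], 0.0
        current.extend(group)
        used += group_size
    if current:
        tapes.append(current)
    return tapes

tapes = cluster_onto_tapes(
    [("patA", 2.0), ("patB", 3.0), ("patA", 1.5), ("patC", 2.5)],
    tape_capacity_gb=5.0)
```

    A production HSM would also handle patients whose studies exceed one tape and would combine this with the HIS/RIS-driven prefetch rules the abstract mentions; this sketch shows only the clustering step.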

  18. Towards building high performance medical image management system for clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel

    2011-03-01

    Medical image based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large scale image assessment is often performed by a large group of experts by retrieving images from a centralized image repository to workstations to markup and annotate images. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of versioning of data for audit trails. We study the major bottlenecks for such a system, and propose and evaluate a solution using a hybrid image storage with solid state drives and hard disk drives, RESTful Web Services based protocols for exchanging image data, and a database-based versioning scheme for efficient archival of image revision history. Our experiments show promising results for our methods, and our work provides a guideline for building enterprise-level high performance medical image management systems.
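    The database-based versioning scheme for audit trails can be illustrated with an append-only table: every markup revision is inserted as a new version, never updated in place, so the full revision history survives. The schema and field names below are assumptions for the example, not the paper's actual design.

```python
# Minimal append-only versioning sketch for image annotation revisions.
import datetime
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE image_version (
    image_id TEXT, version INTEGER, annotation TEXT, modified_at TEXT,
    PRIMARY KEY (image_id, version))""")

def save_revision(conn, image_id, annotation):
    """Append a new version rather than overwriting the previous one."""
    (latest,) = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM image_version WHERE image_id=?",
        (image_id,)).fetchone()
    conn.execute("INSERT INTO image_version VALUES (?,?,?,?)",
                 (image_id, latest + 1, annotation,
                  datetime.datetime.now(datetime.timezone.utc).isoformat()))
    return latest + 1

save_revision(conn, "scan-001", "baseline markup")
v = save_revision(conn, "scan-001", "lesion boundary corrected")
```

    Because old rows are never modified, an auditor can reconstruct exactly what each expert saw and changed, which is the point of keeping revision history for trial audit trails.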

  19. Badger Army Ammunition Plant groundwater data management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, J.P.

    1994-12-31

    At the Badger Army Ammunition Plant (Badger), there are currently over 200 wells that are monitored on a quarterly basis. Badger has had three active production periods since its construction in 1942. During these periods, various nitrocellulose-based propellants were produced, including single-base artillery propellant, double-base rocket propellant, and BALL POWDER® propellant. Intermediate materials used in the manufacture of these propellants were also produced, including nitroglycerine and sulfuric and nitric acids. To meet the challenge of managing the data in-house, a groundwater data management system (GDMS) was developed. Although such systems are commercially available, they were not able to provide the specific capabilities necessary for data management and reporting at Badger. The GDMS not only provides the routine database capabilities of data sorts and queries, but also provides an automated data reporting system. The reporting function alone has significantly reduced the time and effort normally associated with this task. Since the GDMS was developed at Badger, the program can be continually adapted to site-specific needs. Future planned modifications include automated reconciliation, improved transfer of data to graphics software, and statistical analysis and interpretation of the data.

  20. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and ensure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  1. Knowledge Based System Applications for Guidance and Control (Application des Systemes a Base de Connaissances au Guidage-Pilotage)

    DTIC Science & Technology

    1991-01-01

    techniques and integration concepts. Recent advances in digital computation techniques, including data base management, represent the core enabling... tactical information management and effective pilot interaction are essential. Pilot decision aiding, combat automation, sensor fusion and on-board... tactical battle management concepts offer the opportunity for substantial mission effectiveness improvements. Although real-time tactical military

  2. A Distributed Data Base Version of INGRES.

    ERIC Educational Resources Information Center

    Stonebraker, Michael; Neuhold, Eric

    Extensions are required to the currently operational INGRES data base system for it to manage a data base distributed over multiple machines in a computer network running the UNIX operating system. Three possible user views include: (1) each relation in a unique machine, (2) a user interaction with the data base which can only span relations at a…

  3. Operable Data Management for Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chavez, F. P.; Graybeal, J. B.; Godin, M. A.

    2004-12-01

    As oceanographic observing systems become more numerous and complex, data management solutions must follow. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, with complexity (high) and maturity and adoption rates (low) to match. Each approach has strengths and weaknesses; no approach yet fully addresses, nor takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems, of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions in metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION, can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned, from development and operations of MBARI ocean observing systems, are used to illustrate key requirements, choices, and challenges. 
Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).

  4. Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 3. Subsystem Functional Description.

    DOT National Transportation Integrated Search

    1974-02-01

    The volume presents a detailed description of the subsystems that comprise the Satellite-Based Advanced Air Traffic Management System. Described in detail are the surveillance, navigation, communications, data processing, and airport subsystems. The ...

  5. Application of a Mobile Platform-Based System for the Management of Fundus Disease in Outpatient Settings.

    PubMed

    Dend, Xun; Li, Hong-Yan; Yin, Hong; Liang, Jian-Hong; Chen, Yi; Li, Xiao-Xin; Zhao, Ming-Wei

    2016-08-01

    Objective To evaluate the application of a mobile platform-based system in the management of fundus disease in outpatient settings. Methods In the outpatient departments of fundus disease, premature babies requiring eye examination under general anesthesia and adults requiring intraocular surgery were enrolled as the subjects. According to the existing clinical practices, we developed a system that met the requirements of clinical practices and optimized the clinical management. Based on the FileMaker database, tablet computers were used as the mobile platform, and the system could also be run on iPad and PC terminals. Results Since 2013, the system has recorded 7500 cases of special examinations. Since July 2015, 4100 cases of intravitreal drug injection have also been recorded in the system. A multiple-point, real-time reservation pattern increased efficiency and optimized the clinical management. All the clinical data were digitalized. Conclusion The mobile platform-based system can increase the efficiency of examinations and other clinical processes and standardize data collection; thus, it is feasible for clinical practice in outpatient departments of ophthalmology.

  6. A data base and analysis program for shuttle main engine dynamic pressure measurements

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.
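    The SVD-based spectrum interpolation mentioned above can be sketched as follows; the report's exact formulation is not reproduced here, so treat this as one plausible reading: stack the measured spectra by power level, keep the leading singular modes, and interpolate each mode's per-level weight to the target power level.

    ```python
    import numpy as np

    def interpolate_spectrum(power_levels, spectra, target_power, rank=2):
        """Estimate the spectrum at an unmeasured power level.
        Rows of `spectra` are spectra measured at `power_levels`
        (assumed increasing); `rank` truncates the SVD to the
        dominant modes."""
        A = np.asarray(spectra, dtype=float)   # rows: power levels, cols: freq bins
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        coeffs = U[:, :rank] * s[:rank]        # per-level weight of each mode
        # linearly interpolate each mode's weight to the target power level
        w = np.array([np.interp(target_power, power_levels, coeffs[:, k])
                      for k in range(rank)])
        return w @ Vt[:rank]
    ```

    Interpolating in the low-rank coefficient space, rather than bin by bin, preserves the dominant spectral shapes while tracking their amplitude trend with power level.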

  7. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockwood, Jr., Neil; McLellan, Jason G; Crossley, Brian

    The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, commonly known as the Joint Stock Assessment Project (JSAP), is a management tool using ecosystem principles to manage artificial fish assemblages and native fish in the altered environments existing in the Columbia River System above Chief Joseph and Grand Coulee Dams (blocked area). The three-phase approach of this project will enhance the fisheries resources of the blocked area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. Housing the blocked area fisheries information in a central location will allow managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP (NWPPC program measure 10.8B.26) is designed and guided jointly by fisheries managers in the blocked area and the Columbia Basin blocked area management plan (1998). The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of blocked area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the blocked area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. To ensure that any additional information collected throughout the life of this project can be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies among the JSAP fisheries managers. The use of common collection and analytical tools is essential to the process of streamlining joint management decisions.
In 1999 and 2000 the project began to address some of the identified data gaps throughout the blocked area with a variety of newly developed sampling projects, as well as continuing ongoing data collection for established projects.

  9. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports on developments in programs managed by the JPL Office of Telecommunications and Data Acquisition (TDA) are provided. Topics covered include: DSN advanced systems (tracking and ground-based navigation; communications, spacecraft-ground; and station control and system technology) and DSN systems implementation (capabilities for existing projects; capabilities for new projects; TDA program management and analysis; and Goldstone solar system radar).

  10. The Data Base and Decision Making in Public Schools.

    ERIC Educational Resources Information Center

    Hedges, William D.

    1984-01-01

    Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…

  11. Generic functional requirements for a NASA general-purpose data base management system

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.

    1981-01-01

    Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user friendliness; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high level nonprocedural interactive query and data manipulation language; (10) data base maintenance utilities; (11) high rate input/output and large data volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  12. Web-Based Search and Plot System for Nuclear Reaction Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otuka, N.; Nakagawa, T.; Fukahori, T.

    2005-05-24

    A web-based search and plot system for nuclear reaction data has been developed, covering experimental data in EXFOR format and evaluated data in ENDF format. The system is implemented for Linux OS, with Perl and MySQL used for CGI scripts and the database manager, respectively. Two prototypes for experimental and evaluated data are presented.
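    The search side of such a system amounts to translating web-form fields into a parameterized query. In this sketch SQLite stands in for MySQL, and the table layout and sample rows are illustrative assumptions, not the system's actual schema:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE reactions (
        target TEXT, reaction TEXT, energy_mev REAL, cross_section_b REAL)""")
    # illustrative sample rows only
    conn.executemany("INSERT INTO reactions VALUES (?, ?, ?, ?)", [
        ("Fe-56", "(n,g)", 2.5e-8, 2.59),
        ("Fe-56", "(n,p)", 14.0, 0.112),
        ("U-235", "(n,f)", 2.5e-8, 585.0),
    ])

    def search(target=None, reaction=None):
        """Build the WHERE clause from whichever form fields were filled in,
        binding values as parameters (never interpolating user input)."""
        sql, args = "SELECT * FROM reactions WHERE 1=1", []
        if target:
            sql += " AND target = ?"
            args.append(target)
        if reaction:
            sql += " AND reaction = ?"
            args.append(reaction)
        return conn.execute(sql, args).fetchall()
    ```

    A CGI handler would pass the matched rows on to the plotting layer rather than returning them directly.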

  13. Using the Web for Recruitment, Screening, Tracking, Data Management, and Quality Control in a Dietary Assessment Clinical Validation Trial

    PubMed Central

    Hahn, Harry; Henry, Judith; Chacko, Sara; Winter, Ashley; Cambou, Mary C

    2010-01-01

    Screening and tracking subjects and data management in clinical trials require significant investments in manpower that can be reduced through the use of web-based systems. To support a validation trial of various dietary assessment tools that required multiple clinic visits and eight repeats of online assessments, we developed an interactive web-based system to automate all levels of management of a biomarker-based clinical trial. The “Energetics System” was developed to support 1) the work of the study coordinator in recruiting, screening and tracking subject flow, 2) the need of the principal investigator to review study progress, and 3) continuous data analysis. The system was designed to automate web-based self-screening into the trial. It supported scheduling tasks and triggered tailored messaging for late and non-responders. For the investigators, it provided real time status overviews on all subjects, created electronic case reports, supported data queries and prepared analytic data files. Encryption and multi-level password protection were used to ensure data privacy. The system was programmed iteratively and required six months of a web programmer's time along with active team engagement. In this study the enhancement in speed and efficiency of recruitment and quality of data collection as a result of this system outweighed the initial investment. Web-based systems have the potential to streamline the process of recruitment and day-to-day management of clinical trials in addition to improving efficiency and quality. Because of their added value they should be considered for trials of moderate size or complexity. Grant support: NIH funded R01CA105048. PMID:19925884

  14. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  15. A Web-based system for the intelligent management of diabetic patients.

    PubMed

    Riva, A; Bellazzi, R; Stefanelli, M

    1997-01-01

    We describe the design and implementation of a distributed computer-based system for the management of insulin-dependent diabetes mellitus. The goal of the system is to support the normal activities of the physicians and patients involved in the care of diabetes by providing them with a set of automated services ranging from data collection and transmission to data analysis and decision support. The system is highly integrated with current practices in the management of diabetes, and it uses Internet technology to achieve high availability and ease of use. In particular, the user interaction takes place through dynamically generated World Wide Web pages, so that all the system's functions share an intuitive graphic user interface.

  16. Information Interaction Study for DER and DMS Interoperability

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui

    The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements of DER operation and management for DMS advanced applications. DER modeling was studied from a system point of view, and the article initially proposes a CIM-extended information model. By analyzing the basic structure of message interaction between DMS and DER, a bidirectional message-mapping method based on data exchange is proposed.
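    A bidirectional message mapping of the kind described can be sketched as a single field map applied in both directions, so the DMS-side object and the exchange payload stay in sync. The attribute and message-key names below are invented for illustration; they are not taken from the CIM standard or the paper:

    ```python
    # CIM-extension attribute -> exchange-message key (hypothetical names)
    DER_FIELD_MAP = {
        "mRID": "der_id",
        "ratedPowerkW": "p_rated",
        "activePowerkW": "p_actual",
    }

    def to_message(der_obj):
        """Map a DER model object (dict) to an exchange payload."""
        return {msg_key: der_obj[attr] for attr, msg_key in DER_FIELD_MAP.items()}

    def from_message(msg):
        """Inverse mapping: rebuild the model object from a payload."""
        return {attr: msg[msg_key] for attr, msg_key in DER_FIELD_MAP.items()}
    ```

    Keeping both directions driven by one map is what makes the exchange bidirectional: a round trip through `to_message` and `from_message` returns the original object.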

  17. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than the type of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC data base - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (use with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (help facility which informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, personnel register, and plan/actual cost.

  18. Data Base Management Systems Panel. Third workshop summary

    NASA Technical Reports Server (NTRS)

    Urena, J. L. (Editor)

    1981-01-01

    The discussions and results of a review by a panel of data base management system (DBMS) experts of various aspects of the use of DBMSs within NASA/Office of Space and Terrestrial Applications (OSTA) and related organizations are summarized. The topics discussed included the present status of the use of DBMS technology and of the various ongoing DBMS-related efforts within NASA. The report drafts of a study that seeks to determine the functional requirements for a generalized DBMS for the NASA/OSTA and related data bases are examined. Future problems and possibilities with the use of DBMS technology are also considered. A list of recommendations for NASA/OSTA data systems is included.

  19. Tracking-Data-Conversion Tool

    NASA Technical Reports Server (NTRS)

    Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  20. Design and development of a web-based application for diabetes patient data management.

    PubMed

    Deo, S S; Deobagkar, D N; Deobagkar, Deepti D

    2005-01-01

    A web-based database management system developed for collecting, managing and analysing information of diabetes patients is described here. It is a searchable, client-server, relational database application, developed on the Windows platform using Oracle, Active Server Pages (ASP), Visual Basic Script (VB Script) and Java Script. The software is menu-driven and allows authorized healthcare providers to access, enter, update and analyse patient information. Graphical representation of data can be generated by the system using bar charts and pie charts. An interactive web interface allows users to query the database and generate reports. Alpha- and beta-testing of the system was carried out and the system at present holds records of 500 diabetes patients and is found useful in diagnosis and treatment. In addition to providing patient data on a continuous basis in a simple format, the system is used in population and comparative analysis. It has proved to be of significant advantage to the healthcare provider as compared to the paper-based system.

  1. SDMS: A scientific data management system

    NASA Technical Reports Server (NTRS)

    Massena, W. A.

    1978-01-01

    SDMS is a data base management system developed specifically to support scientific programming applications. It consists of a data definition program to define the forms of data bases, and FORTRAN-compatible subroutine calls to create and access data within them. Each SDMS data base contains one or more data sets. A data set has the form of a relation. Each column of a data set is defined to be either a key or data element. Key elements must be scalar. Data elements may also be vectors or matrices. The data elements in each row of the relation form an element set. SDMS permits direct storage and retrieval of an element set by specifying the corresponding key element values. To support the scientific environment, SDMS allows the dynamic creation of data bases via subroutine calls. It also allows intermediate or scratch data to be stored in temporary data bases which vanish at job end.
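    The element-set storage and retrieval described above can be sketched as a keyed relation: each row is addressed by its scalar key values, while data elements may be scalars, vectors, or matrices. Class, column, and value names below are illustrative, not from the report:

    ```python
    class DataSet:
        """Toy model of an SDMS-style data set (relation)."""

        def __init__(self, key_names, data_names):
            self.key_names = tuple(key_names)    # key elements: scalars only
            self.data_names = tuple(data_names)  # data elements: any shape
            self.rows = {}

        def store(self, keys, data):
            """Store one element set: one scalar per key element,
            one value per data element."""
            self.rows[tuple(keys)] = dict(zip(self.data_names, data))

        def retrieve(self, keys):
            """Direct retrieval of an element set by its key values."""
            return self.rows[tuple(keys)]

    # hypothetical aerodynamic data set keyed by (case, station)
    wing = DataSet(["case", "station"], ["chord", "pressure_coeffs"])
    wing.store([1, 4], [2.7, [0.4, 0.1, -0.2]])
    ```

    Direct addressing by key tuple mirrors SDMS's direct storage and retrieval of an element set from its key element values, without scanning the relation.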

  2. Space station data system analysis/architecture study. Task 5: Program plan

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Cost estimates for both the on-board and ground segments of the Space Station Data System (SSDS) are presented along with summary program schedules. Advanced technology development recommendations are provided in the areas of distributed data base management, end-to-end protocols, command/resource management, and flight qualified artificial intelligence machines.

  3. Designing and Implementation of River Classification Assistant Management System

    NASA Astrophysics Data System (ADS)

    Zhao, Yinjun; Jiang, Wenyuan; Yang, Rujun; Yang, Nan; Liu, Haiyan

    2018-03-01

    In an earlier publication, we proposed a new Decision Classifier (DCF) for Chinese river classification based on river structure. To expand, enhance, and promote application of the DCF, we built a computer system to support river classification, named the River Classification Assistant Management System. Based on the ArcEngine and ArcServer platforms, the system implements functions such as data management, river-network extraction, river classification, and results publication, using a combined Client/Server and Browser/Server framework.

  4. Information security management system planning for CBRN facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lenaeu, Joseph D.; O'Neil, Lori Ross; Leitch, Rosalyn M.

    The focus of this document is to provide guidance for the development of information security management system planning documents at chemical, biological, radiological, or nuclear (CBRN) facilities. It describes a risk-based approach for planning information security programs based on the sensitivity of the data developed, processed, communicated, and stored on facility information systems.

  5. A highly scalable information system as extendable framework solution for medical R&D projects.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Stoll, Regina; Thurow, Kerstin

    2009-01-01

    For research projects in preventive medicine, a flexible information management is needed that offers free planning and documentation of project-specific examinations. The system should allow simple, preferably automated data acquisition from several distributed sources (e.g., mobile sensors, stationary diagnostic systems, questionnaires, manual inputs) as well as effective data management, data use and analysis. An information system fulfilling these requirements has been developed at the Center for Life Science Automation (celisca). This system combines data of multiple investigations and multiple devices and displays them on a single screen. The integration of mobile sensor systems for comfortable, location-independent capture of time-based physiological parameters, and the possibility of observing these measurements directly in the system, allow new scenarios. The web-based information system presented in this paper is configurable through user interfaces. It covers medical process descriptions, operative process data visualizations, user-friendly processing of process data, modern online interfaces (data bases, web services, XML), and comfortable support of extended data analysis with third-party applications.

  6. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges.

    PubMed

    Hannan, M A; Abdulla Al Mamun, Md; Hussain, Aini; Basri, Hassan; Begum, R A

    2015-09-01

    Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems, unfolding the issues and challenges of moving toward integrated technology-based systems. For planning, monitoring, collection and management of solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM, facilitating the planning and design of a sustainable new system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. An approach for management of geometry data

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Herron, G. J.; Schweitzer, J. E.; Warkentine, E. R.

    1980-01-01

    The strategies for managing Integrated Programs for Aerospace Design (IPAD) computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. IPAD's data base system makes this information available to all authorized departments in a company. A discussion of the data structures and algorithms required to support geometry in IPIP (IPAD's data base management system) is presented. Through the use of IPIP's data definition language, the structure of the geometry components is defined. The data manipulation language is the vehicle by which a user defines an instance of the geometry. The manipulation language also allows a user to edit, query, and manage the geometry. The selection of canonical forms is a very important part of the IPAD geometry. IPAD has a canonical form for each entity and provides transformations to alternate forms; in particular, IPAD will provide a transformation to the ANSI standard. The DBMS schemas required to support IPAD geometry are explained.

  8. Geospatial Based Information System Development in Public Administration for Sustainable Development and Planning in Urban Environment

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-09-01

    It is generally agreed that governmental authorities should actively encourage the development of an efficient framework of information and communication technology initiatives so as to advance and promote sustainable development and planning strategies. This paper presents a prototype information system for public administration designed to facilitate public management and decision making for sustainable development and planning. The system was developed using several programming languages and programming tools, together with a Database Management System (DBMS) for storing and managing urban data of many kinds. Furthermore, geographic information systems were incorporated into the system to make it possible for the authorities to deal with issues of a spatial nature, such as spatial planning. The developed system provides technology-based management of geospatial information and environmental and crime data of the urban environment, aiming to improve public decision making and to contribute to more efficient sustainable development and planning.

  9. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  10. The Design and Application of Data Storage System in Miyun Satellite Ground Station

    NASA Astrophysics Data System (ADS)

    Xue, Xiping; Su, Yan; Zhang, Hongbo; Liu, Bin; Yao, Meijuan; Zhao, Shu

    2015-04-01

    China launched the Chang'E-3 satellite in 2013, achieving the first soft landing on the Moon by a Chinese lunar probe. For the Chang'E-3 mission, the Miyun satellite ground station deployed, for the first time, a SAN storage network system based on the Stornext shared file system, and system performance fully meets the data storage requirements of the station. Stornext is a high-performance shared file system that allows multiple servers running different operating systems to access the same file system concurrently, and it supports access to data over a variety of topologies such as SAN and LAN. Stornext focuses on data protection and big data management; Quantum has announced that it has sold more than 70,000 Stornext licenses worldwide, and its growing customer base marks its leading position in big data management. The Miyun satellite ground station is responsible for the reception of Chang'E-3 downlink data and the management of local data storage. The station handles exploration mission management and the receiving and management of observation data, and provides comprehensive, centralized monitoring and control of the data receiving equipment. To receive and manage data reliably, the station adopted a SAN storage network system based on the Stornext shared software. The computer system at the Miyun ground station is composed of operational servers, application workstations and storage equipment, so the storage system requires a shared file system that supports heterogeneous operating systems. In practical applications, 10 nodes write data to the file system simultaneously through 16 channels, and the maximum data transfer rate of each channel is up to 15 MB/s, so the network throughput of the file system must be no less than 240 MB/s. At the same time, the maximum size of a single data file is up to 810 GB.
The planned storage system requires that 10 nodes write data simultaneously through 16 channels at 240 MB/s aggregate network throughput. As integrated, the sharing system can provide 1020 MB/s of simultaneous write bandwidth. When the master storage server fails, the backup storage server takes over normal service; client reads and writes are unaffected, with a switchover time of less than 5 s. The storage system as designed and integrated meets user requirements. Nevertheless, an all-fiber SAN is expensive, and SCSI hard-disk transfer rates may still be the bottleneck in the development of the entire storage system. Stornext provides users with efficient sharing, management and automatic archiving of large numbers of files, together with hardware solutions, and it occupies a leading position in big data management. It also has drawbacks: first, the software is expensive and licensed per site, so the purchase cost becomes very high when the network scale is large; second, configuring Stornext's parameters places high demands on the skills of technical staff, and problems can be difficult to diagnose.
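The throughput requirement quoted in this record is simple arithmetic and can be checked directly (all figures are taken from the abstract):

```python
# Quick sanity-check of the storage-throughput figures quoted in the abstract.
channels = 16                    # write channels shared by the 10 nodes
per_channel_mb_s = 15            # maximum data rate per channel, MB/s

required = channels * per_channel_mb_s
print(required)                  # 240 -> the stated minimum throughput, MB/s

provided = 1020                  # MB/s simultaneous write speed, as integrated
print(round(provided / required, 2))   # 4.25 -> margin over the requirement
```

The integrated system thus delivers a little over a 4x margin above the stated minimum, which is consistent with the abstract's claim that the design meets user requirements.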

  11. A data-management system for detailed areal interpretive data

    USGS Publications Warehouse

    Ferrigno, C.F.

    1986-01-01

    A data storage and retrieval system has been developed to organize and preserve areal interpretive data. This system can be used by any study where there is a need to store areal interpretive data that generally is presented in map form. This system provides the capability to grid areal interpretive data for input to groundwater flow models at any spacing and orientation. The data storage and retrieval system is designed to be used for studies that cover small areas such as counties. The system is built around a hierarchically structured data base consisting of related latitude-longitude blocks. The information in the data base can be stored at different levels of detail, with the finest detail being a block of 6 sec of latitude by 6 sec of longitude (approximately 0.01 sq mi). This system was implemented on a mainframe computer using a hierarchical data base management system. The computer programs are written in Fortran IV and PL/1. The design and capabilities of the data storage and retrieval system, and the computer programs that are used to implement the system are described. Supplemental sections contain the data dictionary, user documentation of the data-system software, changes that would need to be made to use this system for other studies, and information on the computer software tape. (Lantz-PTT)

  12. Novel, Web-based, information-exploration approach for improving operating room logistics and system processes.

    PubMed

    Nagy, Paul G; Konewko, Ramon; Warnock, Max; Bernstein, Wendy; Seagull, Jacob; Xiao, Yan; George, Ivan; Park, Adrian

    2008-03-01

    Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a seamless and proactive approach to streamlining operations and minimizing delays. The challenge lies in aggregating and displaying these data in an easily accessible format that provides useful, timely information on current operations. A Web-based, graphical dashboard is described in this study, which can be used to interpret clinical operational data, allow managers to see trends in data, and help identify inefficiencies that were not apparent with more traditional, paper-based approaches. The dashboard provides a visual decision support tool that assists managers in pinpointing areas for continuous quality improvement. The limitations of paper-based techniques, the development of the automated display system, and key performance indicators in analyzing aggregate delays, time, specialties, and teamwork are reviewed. Strengths, weaknesses, opportunities, and threats associated with implementing such a program in the perioperative environment are summarized.

  13. Information logistics: A production-line approach to information services

    NASA Technical Reports Server (NTRS)

    Adams, Dennis; Lee, Chee-Seng

    1991-01-01

    Logistics can be defined as the process of strategically managing the acquisition, movement, and storage of materials, parts, and finished inventory (and the related information flow) through the organization and its marketing channels in a cost effective manner. It is concerned with delivering the right product to the right customer in the right place at the right time. The logistics function is composed of inventory management, facilities management, communications, unitization, transportation, materials management, and production scheduling. The relationship between logistics and information systems is clear. Systems such as Electronic Data Interchange (EDI), Point of Sale (POS) systems, and Just in Time (JIT) inventory management systems are important elements in the management of product development and delivery. With improved access to market demand figures, logisticians can decrease inventory sizes and better service customer demand. However, without accurate, timely information, little, if any, of this would be feasible in today's global markets. Information systems specialists can learn from logisticians. In a manner similar to logistics management, information logistics is concerned with the delivery of the right data, to the right customer, at the right time. As such, information systems are integral components of the information logistics system charged with providing customers with accurate, timely, cost-effective, and useful information. 
Information logistics is a management style and is composed of elements similar to those associated with the traditional logistics activity: inventory management (data resource management), facilities management (distributed, centralized and decentralized information systems), communications (participative design and joint application development methodologies), unitization (input/output system design, i.e., packaging or formatting of the information), transportation (voice, data, image, and video communication systems), materials management (data acquisition, e.g., EDI, POS, external data bases, data entry) and production scheduling (job, staff, and project scheduling).

  14. A Split-Path Schema-Based RFID Data Storage Model in Supply Chain Management

    PubMed Central

    Fan, Hua; Wu, Quanyuan; Lin, Yisong; Zhang, Jianfeng

    2013-01-01

    In modern supply chain management systems, Radio Frequency IDentification (RFID) technology has become an indispensable sensor technology and massive RFID data sets are expected to become commonplace. More and more space and time are needed to store and process such huge amounts of RFID data, and there is an increasing realization that the existing approaches cannot satisfy the requirements of RFID data management. In this paper, we present a split-path schema-based RFID data storage model. With a data separation mechanism, the massive RFID data produced in supply chain management systems can be stored and processed more efficiently. Then a tree structure-based path splitting approach is proposed to intelligently and automatically split the movement paths of products. Furthermore, based on the proposed new storage model, we design the relational schema to store the path information and time information of tags, and some typical query templates and SQL statements are defined. Finally, we conduct various experiments to measure the effect and performance of our model and demonstrate that it performs significantly better than the baseline approach in both the data expression and path-oriented RFID data query performance. PMID:23645112
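The paper's actual relational schema is not reproduced in this abstract, but the idea of storing path information and time information in separate tables with path-oriented query templates can be sketched as follows (table, column, and tag names are illustrative, using SQLite for brevity):

```python
import sqlite3

# Hedged sketch of a split path/time relational schema for RFID data:
# movement paths in one table, per-node stay times in another.
# All names and sample values here are illustrative, not the paper's.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE path (
    path_id   INTEGER PRIMARY KEY,
    node_seq  TEXT            -- e.g. 'factory>warehouse>store'
);
CREATE TABLE tag_stay (
    tag_id    TEXT,
    path_id   INTEGER REFERENCES path(path_id),
    node      TEXT,
    t_in      TEXT,           -- arrival time at the node
    t_out     TEXT            -- departure time from the node
);
""")
con.execute("INSERT INTO path VALUES (1, 'factory>warehouse>store')")
con.executemany(
    "INSERT INTO tag_stay VALUES (?, ?, ?, ?, ?)",
    [("EPC-001", 1, "warehouse", "2013-01-02", "2013-01-05"),
     ("EPC-002", 1, "store",     "2013-01-06", "2013-01-07")])

# Path-oriented query template: which tags passed through the warehouse?
rows = con.execute("""
    SELECT tag_id FROM tag_stay
    JOIN path USING (path_id)
    WHERE node = 'warehouse'
""").fetchall()
print(rows)   # [('EPC-001',)]
```

Separating the shared path from per-tag time records means a path traversed by thousands of tags is stored once, which is the storage saving the split-path model targets.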

  15. Applying Service-Oriented Architecture on The Development of Groundwater Modeling Support System

    NASA Astrophysics Data System (ADS)

    Li, C. Y.; WANG, Y.; Chang, L. C.; Tsai, J. P.; Hsiao, C. T.

    2016-12-01

    Groundwater simulation has become an essential step in groundwater resources management and assessment. Many stand-alone pre- and post-processing software packages exist to lighten the model simulation workload, but stand-alone software neither supports centralized management of data and simulation results nor provides network sharing functions. Hence, it is difficult to share and reuse data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater modeling support system to assist model construction. The system is based on a service-oriented architecture and allows remote users to develop their modeling cases over the internet; the data and cases (knowledge) are thus easy to manage centrally. MODFLOW, the most widely used groundwater model in the world, is the modeling engine of the system. The system provides a data warehouse to store groundwater observations, a MODFLOW Support Service, a MODFLOW Input File & Shapefile Convert Service, a MODFLOW Service, and an Expert System Service to assist researchers in building models. Since the system architecture is service-oriented, it is scalable and flexible, and can easily be extended to include scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.

  16. TheHiveDB image data management and analysis framework.

    PubMed

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-06

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer Disease Neuroimaging Initiative.

  17. TheHiveDB image data management and analysis framework

    PubMed Central

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-01

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer Disease Neuroimaging Initiative. PMID:24432000

  18. X-PAT: a multiplatform patient referral data management system for small healthcare institution requirements.

    PubMed

    Masseroli, Marco; Marchente, Mario

    2008-07-01

    We present X-PAT, a platform-independent software prototype that is able to manage patient referral multimedia data in an intranet network scenario according to the specific control procedures of a healthcare institution. It is a self-developed storage framework based on a file system, implemented in eXtensible Markup Language (XML) and PHP Hypertext Preprocessor Language, and aimed at the requirements of limited-dimension healthcare entities (small hospitals, private medical centers, outpatient clinics, and laboratories). In X-PAT, healthcare data descriptions, stored in a novel Referral Base Management System (RBMS) according to the Health Level 7 Clinical Document Architecture Release 2 (CDA R2) standard, can be easily applied to the specific data and organizational procedures of a particular healthcare working environment, thanks also to the use of standard clinical terminology. Managed data, centralized on a server, are structured in the RBMS schema using a flexible patient record and CDA healthcare referral document structures based on XML technology. A novel search engine allows queries to be defined and performed on stored data, whose rapid execution is ensured by expandable RBMS indexing structures. Healthcare personnel can interface with the X-PAT system, in accordance with applied state-of-the-art privacy and security measures, through friendly and intuitive Web pages that facilitate user acceptance.

  19. The Biological and Chemical Oceanography Data Management Office

    NASA Astrophysics Data System (ADS)

    Allison, M. D.; Chandler, C. L.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Gegg, S. R.

    2011-12-01

    Oceanography and marine ecosystem research are inherently interdisciplinary fields of study that generate and require access to a wide variety of measurements. In late 2006 the Biological and Chemical Oceanography Sections of the National Science Foundation (NSF) Geosciences Directorate Division of Ocean Sciences (OCE) funded the Biological and Chemical Oceanography Data Management Office (BCO-DMO). In late 2010 additional funding was contributed to support management of research data from the NSF Office of Polar Programs Antarctic Organisms & Ecosystems Program. The BCO-DMO is recognized in the 2011 Division of Ocean Sciences Sample and Data Policy as one of several program specific data offices that support NSF OCE funded researchers. BCO-DMO staff members offer data management support throughout the project life cycle to investigators from large national programs and medium-sized collaborative research projects, as well as researchers from single investigator awards. The office manages and serves all types of oceanographic data and information generated during the research process and contributed by the originating investigators. BCO-DMO has built a data system that includes the legacy data from several large ocean research programs (e.g. United States Joint Global Ocean Flux Study and United States GLOBal Ocean ECosystems Dynamics), to which data have been contributed from recently granted NSF OCE and OPP awards. The BCO-DMO data system can accommodate many different types of data including: in situ and experimental biological, chemical, and physical measurements; modeling results and synthesis data products. The system enables reuse of oceanographic data for new research endeavors, supports synthesis and modeling activities, provides availability of "real data" for K-12 and college level use, and provides decision-support field data for policy-relevant investigations. 
We will present an overview of the data management system capabilities including: map-based and text-based data discovery and access systems; recent enhancements to data search tools; data export and download utilities; and strategic use of controlled vocabularies to facilitate data integration and to improve data system interoperability.

  20. NASA and the U.S. climate program - A problem in data management

    NASA Technical Reports Server (NTRS)

    Quann, J. J.

    1978-01-01

    NASA's contribution to the total data base for the National Climate Plan will be to produce climate data sets from its experimental space observing systems and to maximize the value of these data for climate analysis and prediction. Validated data sets will be provided to NOAA for inclusion in their overall diagnostic data base. NASA data management for the Climate Plan will involve: (1) cataloging and retrieval of large integrated and distributed data sets upon user demand, and (2) the storage equivalent of 100,000 digital data tapes. It will be the largest, most complex data system ever developed by NASA.

  1. Smart Water: Energy-Water Optimization in Drinking Water Systems

    EPA Science Inventory

    This project aims to develop and commercialize a Smart Water Platform – Sensor-based Data-driven Energy-Water Optimization technology in drinking water systems. The key technological advances rely on cross-platform data acquisition and management system, model-based real-time sys...

  2. PROMIS (Procurement Management Information System)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The PROcurement Management Information System (PROMIS) provides both detailed and summary level information on all procurement actions performed within NASA's procurement offices at Marshall Space Flight Center (MSFC). It provides not only on-line access, but also schedules procurement actions, monitors their progress, and updates Forecast Award Dates. Except for a few computational routines coded in FORTRAN, the majority of the system is coded in a high level language called NATURAL. A relational Data Base Management System called ADABAS is utilized. Certain fields, called descriptors, are set up on each file to allow the selection of records based on a specified value or range of values. The use of like descriptors on different files serves as the link between the files, thus producing a relational data base. Twenty related files are currently being maintained on PROMIS.

  3. RIMS: Resource Information Management System

    NASA Technical Reports Server (NTRS)

    Symes, J.

    1983-01-01

    An overview is given of the capabilities and functions of the Resource Information Management System (RIMS). It is a simple interactive DMS tool which allows users to build, modify, and maintain data management applications. RIMS minimizes the programmer support required to develop and maintain small data base applications. RIMS also assists in bringing the United Information Services (UIS) budget system work in-house. Information is also given on the relationship between RIMS and the user community.

  4. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support multi-centre clinical trials. Using a design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, an open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has successfully supported several completed and ongoing clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  5. TERMTrial--terminology-based documentation systems for cooperative clinical trials.

    PubMed

    Merzweiler, A; Weber, R; Garde, S; Haux, R; Knaup-Gregori, P

    2005-04-01

    Within cooperative groups of multi-center clinical trials a standardized documentation is a prerequisite for communication and sharing of data. Standardizing documentation systems means standardizing the underlying terminology. The management and consistent application of terminology systems is a difficult and fault-prone task, which should be supported by appropriate software tools. Today, documentation systems for clinical trials are often implemented as so-called Remote-Data-Entry-Systems (RDE-systems). Although there are many commercial systems, which support the development of RDE-systems there is none offering a comprehensive terminological support. Therefore, we developed the software system TERMTrial which consists of a component for the definition and management of terminology systems for cooperative groups of clinical trials and two components for the terminology-based automatic generation of trial databases and terminology-based interactive design of electronic case report forms (eCRFs). TERMTrial combines the advantages of remote data entry with a comprehensive terminological control.

  6. IEMIS (Integrated Emergency Management Information System) Floodplain Mapping Based on a Lidar Derived Data Set.

    DTIC Science & Technology

    1988-02-05

    AD-A193 971. IEMIS (Integrated Emergency Management Information System) Floodplain Mapping Based on a Lidar Derived Data Set. (U) Army Engineer Waterways Experiment Station, Vicksburg, MS. This report illustrates the application of the automated mapping capabilities of the Integrated Emergency Management Information System (IEMIS) to FISs. II. BACKGROUND: The concept of mounting laser ranging

  7. The Cheetah Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunz, P.F.; Word, G.B.

    1991-03-01

    Cheetah is a data management system based on the C programming language. The premise of Cheetah is that the 'banks' of FORTRAN-based systems should be 'structures' as defined by the C language. Cheetah is a system to manage these structures while preserving the use of the C language in its native form. For C structures managed by Cheetah, the user can use Cheetah utilities for reading and writing, in a machine-independent form, both binary and text files to disk or over a network. Files written by Cheetah also contain a dictionary describing in detail the data contained in the file. Such information is intended to be used by interactive programs for presenting the contents of the file. Cheetah has been ported to many different operating systems with no operating-system-dependent switches.

  8. Information-based management mode based on value network analysis for livestock enterprises

    NASA Astrophysics Data System (ADS)

    Liu, Haoqi; Lee, Changhoon; Han, Mingming; Su, Zhongbin; Padigala, Varshinee Anu; Shen, Weizheng

    2018-01-01

    With the development of computer and IT technologies, enterprise management has gradually become information-based. However, due to limited technical competence and non-uniform management, most breeding enterprises lack organisation in data collection and management, and low efficiency results in increasing production costs. This paper adopts 'struts2' to construct an information-based management system for standardised and normalised management of the production process in beef cattle breeding enterprises. We present a radio-frequency identification system built on a study of multiple-tag anti-collision via a dynamic grouping ALOHA algorithm. This algorithm builds on the existing ALOHA algorithm with an improved dynamic grouping scheme and is characterised by a high throughput rate: it can reach a throughput 42% higher than that of the general ALOHA algorithm, and the system throughput remains relatively stable as the number of tags changes.
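The baseline behaviour that dynamic grouping tries to exploit can be sketched with a short Monte Carlo estimate of frame-slotted ALOHA throughput; the paper's grouping algorithm itself is not reproduced here:

```python
import random

# Hedged sketch: estimate frame-slotted ALOHA throughput (the fraction of
# slots in which exactly one tag replies) by Monte Carlo simulation.
# This is the baseline scheme, not the paper's dynamic grouping variant.
def aloha_throughput(n_tags, n_slots, trials=2000, seed=42):
    rng = random.Random(seed)
    success = 0
    for _ in range(trials):
        counts = [0] * n_slots
        for _ in range(n_tags):
            counts[rng.randrange(n_slots)] += 1   # each tag picks a slot
        success += sum(1 for c in counts if c == 1)
    return success / (trials * n_slots)

# Throughput peaks near 1/e (~0.368) when the frame size matches the number
# of unread tags, which is what dynamic tag grouping tries to maintain.
print(round(aloha_throughput(64, 64), 3))
```

Running with mismatched values, e.g. `aloha_throughput(200, 64)`, shows the sharp throughput drop that motivates splitting the tag population into groups sized to the frame.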

  9. Development of a data management front-end for use with a LANDSAT-based information system

    NASA Technical Reports Server (NTRS)

    Turner, B. J.

    1982-01-01

    The development and implementation of a data management front-end system for use with a LANDSAT-based information system that facilitates the processing of both LANDSAT and ancillary data were examined. The final tasks, reported on here, involved: (1) the implementation of the VICAR image processing software system at Penn State and the development of a user-friendly front-end for this system; (2) the implementation of JPL-developed software, based on VICAR, for mosaicking LANDSAT scenes; (3) the creation and storage of a mosaic of 1981 summer LANDSAT data for the entire state of Pennsylvania; (4) demonstrations of the defoliation assessment procedure for Perry and Centre Counties, and presentation of the results at the 1982 National Gypsy Moth Review Meeting; and (5) the training of Pennsylvania Bureau of Forestry personnel in the use of the defoliation analysis system.

  10. Development of bilateral data transferability in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2006-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...

  11. Digital Earth system based river basin data integration

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Li, Wanqing; Lin, Chao

    2014-12-01

    Digital Earth is an integrated approach to building scientific infrastructure. Digital Earth systems provide a three-dimensional visualization and integration platform for river basin data, including management data, in situ observation data, remote sensing observation data, and model output data. This paper studies Digital Earth based river basin data integration technology. First, the construction of the Digital Earth based three-dimensional river basin data integration environment is discussed. Then the river basin management data integration technology is presented, realized through a general database access interface, web services, and an ActiveX control. Third, the integration of in situ data, stored as database table records, is realized by linking each record to a three-dimensional model of the corresponding observation apparatus displayed in the Digital Earth system through a shared ID code. In the next two parts, the remote sensing data and model output data integration technologies are discussed in detail. Application in the Digital Zhang River Basin System of China shows that the method can effectively improve the usage efficiency and visualization effect of the data.

  12. Research on rebuilding the data information environment for aeronautical manufacturing enterprise

    NASA Astrophysics Data System (ADS)

    Feng, Xilan; Jiang, Zhiqiang; Zong, Xuewen; Shi, Jinfa

    2005-12-01

    The data environment of an integrated information system and basic standards for information resource management are key to effective remote collaborative design and manufacturing of complex products. A study project on rebuilding the data information environment for the aeronautical manufacturing enterprise (Aero-ME) is put forward. Firstly, the data environment of the integrated information system, basic standards for information resource management, the enterprise's basic information infrastructure, the development of the integrated information system, and information education are discussed in depth, based on the practical information resource and technology requirements of the contemporary Aero-ME. Then, an approach to rebuilding the enterprise data environment based on I-CASE is put forward, together with an effective method and implementation path for manufacturing enterprise informatization. This provides a foundation and assurance for rebuilding the enterprise data environment and standardizing information resource management in the development of Aero-ME information engineering.

  13. The design and implementation of GML data management information system based on PostgreSQL

    NASA Astrophysics Data System (ADS)

    Zhang, Aiguo; Wu, Qunyong; Xu, Qifeng

    2008-10-01

    GML expresses geographic information as text and provides an extensible, standard way of encoding spatial information. At present, GML data are typically managed at the document level. Managed this way, querying and updating GML data is inefficient, and memory demands are high when the document is comparatively large. To address this, the paper puts forward a GML data management approach based on PostgreSQL. It designs four kinds of queries: queries of metadata, queries of geometry based on properties, queries of properties based on spatial information, and queries of spatial data based on location. It also designs and implements visualization of the queried WKT data.
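
    The core idea above — decomposing GML documents into relational rows so queries no longer re-parse a large document — can be sketched in a few lines. This uses Python's built-in sqlite3 in place of PostgreSQL, and the table, column, and feature names are illustrative, not from the paper:

```python
import sqlite3

# Store each GML feature as a row: properties as columns and the geometry
# as WKT text, so a query touches individual rows instead of a whole document.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE feature (
    id INTEGER PRIMARY KEY,
    name TEXT,        -- property (attribute) data
    geom_wkt TEXT     -- geometry encoded as WKT
)""")
conn.executemany("INSERT INTO feature (name, geom_wkt) VALUES (?, ?)", [
    ("road_a", "LINESTRING(0 0, 10 0)"),
    ("lake_b", "POLYGON((0 0, 4 0, 4 4, 0 4, 0 0))"),
])

# A "query of geometry based on property": fetch the WKT for a named feature,
# which would then be handed to the visualization step.
wkt = conn.execute("SELECT geom_wkt FROM feature WHERE name = ?",
                   ("lake_b",)).fetchone()[0]
print(wkt)
```

    In PostgreSQL proper, the geometry column would more likely be a PostGIS geometry type with spatial indexing, which is what makes the location-based queries efficient.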

  14. Secure web book to store structural genomics research data.

    PubMed

    Manjasetty, Babu A; Höppner, Klaus; Mueller, Uwe; Heinemann, Udo

    2003-01-01

    Recently established collaborative structural genomics programs aim at significantly accelerating the crystal structure analysis of proteins. These large-scale projects require efficient data management systems to ensure seamless collaboration between different groups of scientists working towards the same goal. Within the Berlin-based Protein Structure Factory, the synchrotron X-ray data collection and the subsequent crystal structure analysis tasks are located at BESSY, a third-generation synchrotron source. To organize file-based communication and data transfer at the BESSY site of the Protein Structure Factory, we have developed the web-based BCLIMS, the BESSY Crystallography Laboratory Information Management System. BCLIMS is a relational data management system which is powered by MySQL as the database engine and Apache HTTP as the web server. The database interface routines are written in the Python programming language. The software is freely available to academic users. Here we describe the storage, retrieval and manipulation of laboratory information, mainly pertaining to the synchrotron X-ray diffraction experiments and the subsequent protein structure analysis, using BCLIMS.

  15. VoCATS User Guide. [Draft].

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Vocational Education Services.

    This guide focuses on use of the North Carolina Vocational Competency Achievement Tracking System (VoCATS)-designated software in the instructional management process. (VoCATS is a competency-based, computer-based instructional management system that allows the collection of data on student performance achievement prior to, during, and following…

  16. A decision-support system for the analysis of clinical practice patterns.

    PubMed

    Balas, E A; Li, Z R; Mitchell, J A; Spencer, D C; Brent, E; Ewigman, B G

    1994-01-01

    Several studies documented substantial variation in medical practice patterns, but physicians often do not have adequate information on the cumulative clinical and financial effects of their decisions. The purpose of developing an expert system for the analysis of clinical practice patterns was to assist providers in analyzing and improving the process and outcome of patient care. The developed QFES (Quality Feedback Expert System) helps users in the definition and evaluation of measurable quality improvement objectives. Based on objectives and actual clinical data, several measures can be calculated (utilization of procedures, annualized cost effect of using a particular procedure, and expected utilization based on peer-comparison and case-mix adjustment). The quality management rules help to detect important discrepancies among members of the selected provider group and compare performance with objectives. The system incorporates a variety of data and knowledge bases: (i) clinical data on actual practice patterns, (ii) frames of quality parameters derived from clinical practice guidelines, and (iii) rules of quality management for data analysis. An analysis of practice patterns of 12 family physicians in the management of urinary tract infections illustrates the use of the system.
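
    The measures listed above (utilization of a procedure and peer-comparison expected utilization) are simple ratios; a sketch with invented physician names and visit counts, not data from the study:

```python
# Hypothetical visit counts per physician, and how often each ordered a
# particular procedure (e.g. a urine culture) for those visits.
visits = {"dr_a": 120, "dr_b": 80, "dr_c": 100}
procedures = {"dr_a": 72, "dr_b": 16, "dr_c": 30}

# Utilization of the procedure per physician.
utilization = {doc: procedures[doc] / visits[doc] for doc in visits}

# Expected utilization from peer comparison: apply the group-wide rate
# to each physician's own case volume.
group_rate = sum(procedures.values()) / sum(visits.values())
expected = {doc: group_rate * visits[doc] for doc in visits}

# Flag discrepancies more than 50% above the peer-based expectation
# (the threshold here is arbitrary, for illustration only).
flagged = [doc for doc in visits if procedures[doc] > 1.5 * expected[doc]]
print(flagged)
```

    A production rule base would add case-mix adjustment before the comparison, as the abstract notes.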

  17. Designing Extensible Data Management for Ocean Observatories, Platforms, and Devices

    NASA Astrophysics Data System (ADS)

    Graybeal, J.; Gomes, K.; McCann, M.; Schlining, B.; Schramm, R.; Wilkin, D.

    2002-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) has been collecting science data for 15 years from all kinds of oceanographic instruments and systems, and is building a next-generation observing system, the MBARI Ocean Observing System (MOOS). To meet the data management requirements of the MOOS, the Institute began developing a flexible, extensible data management solution, the Shore Side Data System (SSDS). This data management system must address a wide variety of oceanographic instruments and data sources, including instruments and platforms of the future. Our data management solution will address all elements of the data management challenge, from ingest (including suitable pre-definition of metadata) through to access and visualization. Key to its success will be ease of use, and automatic incorporation of new data streams and data sets. The data will be of many different forms, and come from many different types of instruments. Instruments will be designed for fixed locations (as with moorings), changing locations (drifters and AUVs), and cruise-based sampling. Data from airplanes, satellites, models, and external archives must also be considered. Providing an architecture which allows data from these varied sources to be automatically archived and processed, yet readily accessed, is only possible with the best practices in metadata definition, software design, and re-use of third-party components. The current status of SSDS development will be presented, including lessons learned from our science users and from previous data management designs.

  18. 41 CFR 101-30.703 - Program objectives.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30... the Federal catalog system data base; and (e) Phasing out of the Government supply system those items... management, and warehousing costs; then following through to eliminate the items from agency catalog systems...

  19. 41 CFR 101-30.703 - Program objectives.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30... the Federal catalog system data base; and (e) Phasing out of the Government supply system those items... management, and warehousing costs; then following through to eliminate the items from agency catalog systems...

  20. 41 CFR 101-30.703 - Program objectives.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30... the Federal catalog system data base; and (e) Phasing out of the Government supply system those items... management, and warehousing costs; then following through to eliminate the items from agency catalog systems...

  1. 41 CFR 101-30.703 - Program objectives.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30... the Federal catalog system data base; and (e) Phasing out of the Government supply system those items... management, and warehousing costs; then following through to eliminate the items from agency catalog systems...

  2. 41 CFR 101-30.703 - Program objectives.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30... the Federal catalog system data base; and (e) Phasing out of the Government supply system those items... management, and warehousing costs; then following through to eliminate the items from agency catalog systems...

  3. Teamwork for Oversight of Processes and Systems (TOPS). Implementation guide for TOPS version 2.0, 10 August 1992

    NASA Technical Reports Server (NTRS)

    Strand, Albert A.; Jackson, Darryl J.

    1992-01-01

    As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the Spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance based management system based on objective data to review critical TRW processes with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; ensuring that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful/duplicative reviews and audits; emphasis on teamwork--all efforts must be perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW system performance indicates either an increase or decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPA) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data. 
Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.

  5. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized mainly because their use typically requires significant domain knowledge and in many cases application users are even not aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  6. Functional design specification for the problem data system. [space shuttle

    NASA Technical Reports Server (NTRS)

    Boatman, T. W.

    1975-01-01

    The purpose of the Functional Design Specification is to outline the design for the Problem Data System. The Problem Data System is a computer-based data management system designed to track the status of problems and corrective actions pertinent to space shuttle hardware.

  7. National Aeronautics and Space Administration Manned Spacecraft Center data based requirements study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are summarized of a study to determine the requirements of a data management system to meet the needs of MSC in mission planning and program and resource management during the 1975 time frame. The study addresses overall system requirements, implementation considerations, and cost/benefit comparisons.

  8. Development of a novel imaging informatics-based system with an intelligent workflow engine (IWEIS) to support imaging-based clinical trials

    PubMed Central

    Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J

    2015-01-01

    Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, developing custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated by an imaging-based rehabilitation clinical trial. The evaluation shows that development cost can be much reduced compared to a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169

  9. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.
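
    Model-based data validation of the kind the abstract describes can be illustrated with a simple residual check: compare each sensor reading against a model prediction and qualify the value for the downstream health management system. A hypothetical sketch (signal names, values, and the tolerance are invented):

```python
def validate(readings, predictions, tolerance):
    """Qualify each reading: 'valid' if it is within tolerance of the
    model prediction, otherwise 'suspect' for downstream review."""
    status = {}
    for name, value in readings.items():
        residual = abs(value - predictions[name])
        status[name] = "valid" if residual <= tolerance else "suspect"
    return status

# Invented engine telemetry and model predictions, for illustration only.
readings = {"chamber_pressure": 101.8, "turbine_speed": 5230.0}
predictions = {"chamber_pressure": 100.0, "turbine_speed": 5000.0}
print(validate(readings, predictions, tolerance=50.0))
```

    A test-bed like DQV would then drive such a validator with known-good and known-faulty inputs and score how often it qualifies them correctly.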

  10. QAIT: a quality assurance issue tracking tool to facilitate the improvement of clinical data quality.

    PubMed

    Zhang, Yonghong; Sun, Weihong; Gutchell, Emily M; Kvecher, Leonid; Kohr, Joni; Bekhash, Anthony; Shriver, Craig D; Liebman, Michael N; Mural, Richard J; Hu, Hai

    2013-01-01

    In clinical and translational research as well as clinical trial projects, clinical data collection is prone to errors such as missing data, and misinterpretation or inconsistency of the data. A good quality assurance (QA) program can resolve many such errors though this requires efficient communications between the QA staff and data collectors. Managing such communications is critical to resolving QA problems but imposes a major challenge for a project involving multiple clinical and data processing sites. We have developed a QA issue tracking (QAIT) system to support clinical data QA in the Clinical Breast Care Project (CBCP). This web-based application provides centralized management of QA issues with role-based access privileges. It has greatly facilitated the QA process and enhanced the overall quality of the CBCP clinical data. As a stand-alone system, QAIT can supplement any other clinical data management systems and can be adapted to support other projects. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. Metadata registry and management system based on ISO 11179 for cancer clinical trials information system

    PubMed Central

    Park, Yu Rang; Kim, Ju Han

    2006-01-01

    Standardized management of data elements (DEs) for Case Report Forms (CRFs) is crucial in a Clinical Trials Information System (CTIS). Traditional CTISs utilize organization-specific definitions and storage methods for DEs and CRFs. We developed a metadata-based DE management system for clinical trials, the Clinical and Histopathological Metadata Registry (CHMR), using the international standard for metadata registries (ISO 11179) for the management of cancer clinical trials information. CHMR was evaluated in cancer clinical trials with 1625 DEs extracted from the College of American Pathologists Cancer Protocols for 20 major cancers. PMID:17238675
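
    The core of an ISO 11179-style registry is one authoritative entry per data element — an identifier, a definition, and a value domain — shared by every CRF that uses it. A much-simplified sketch (the fields and the example element are illustrative, not taken from CHMR or the standard's full metamodel):

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """Simplified ISO 11179-style data element: a named concept
    plus its definition and permissible-value domain."""
    identifier: str
    definition: str
    permissible_values: list = field(default_factory=list)

registry = {}

def register(de):
    # One authoritative entry per identifier; every CRF references
    # this entry instead of defining the element locally.
    registry[de.identifier] = de

register(DataElement("TUMOR_GRADE", "Histologic grade of the tumor",
                     ["G1", "G2", "G3", "GX"]))
print(registry["TUMOR_GRADE"].permissible_values)
```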

  12. The T.M.R. Data Dictionary: A Management Tool for Data Base Design

    PubMed Central

    Ostrowski, Maureen; Bernes, Marshall R.

    1984-01-01

    In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.

  13. UNIX-based data management system for the Mobile Satellite Propagation Experiment (PiFEx)

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.

    1987-01-01

    A new method is presented for handling data resulting from Mobile Satellite propagation experiments such as the Pilot Field Experiment (PiFEx) conducted by JPL. This method uses the UNIX operating system and C programming language. The data management system is implemented on a VAX minicomputer. The system automatically divides the large data file housing data from various experiments under a predetermined format into various individual files containing data from each experiment. The system also has a number of programs written in C and FORTRAN languages to allow the researcher to obtain meaningful quantities from the data at hand.
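
    The splitting step described above — dividing one large capture file into per-experiment files according to a predetermined format — can be sketched briefly. The record layout here is invented for illustration; the original system used C and FORTRAN under UNIX:

```python
from collections import defaultdict

# Assume each line of the capture file starts with an experiment tag,
# e.g. "EXP01 <measurement>". (This layout is hypothetical.)
raw = """EXP01 -3.2
EXP02 -7.9
EXP01 -2.8
EXP03 -11.4"""

per_experiment = defaultdict(list)
for line in raw.splitlines():
    tag, measurement = line.split(maxsplit=1)
    per_experiment[tag].append(measurement)

# Each key would then become its own file, e.g. open(f"{tag}.dat", "w"),
# ready for the analysis programs to consume.
print(sorted(per_experiment))
```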

  14. Highway Air Pollution Dispersion Modeling : Preliminary Evaluation of Thirteen Models

    DOT National Transportation Integrated Search

    1978-06-01

    Thirteen highway air pollution dispersion models have been tested, using a portion of the Airedale air quality data base. The Transportation Air Pollution Studies (TAPS) System, a data base management system specifically designed for evaluating dispe...

  15. Highway Air Pollution Dispersion Modeling : Preliminary Evaluation of Thirteen Models

    DOT National Transportation Integrated Search

    1977-01-01

    Thirteen highway air pollution dispersion models have been tested, using a portion of the Airedale air quality data base. The Transportation Air Pollution Studies (TAPS) System, a data base management system specifically designed for evaluating dispe...

  16. Design and specification of a centralized manufacturing data management and scheduling system

    NASA Technical Reports Server (NTRS)

    Farrington, Phillip A.

    1993-01-01

    As was revealed in a previous study, the Materials and Processes Laboratory's Productivity Enhancement Complex (PEC) has a number of automated production areas/cells that are not effectively integrated, limiting the ability of users to readily share data. The recent decision to utilize the PEC for the fabrication of flight hardware has focused new attention on the problem and brought to light the need for an integrated data management and scheduling system. This report addresses this need by developing preliminary design specifications for a centralized manufacturing data management and scheduling system for managing flight hardware fabrication in the PEC. This prototype system will be developed under the auspices of the Integrated Engineering Environment (IEE) Oversight Team and the IEE Committee. At their recommendation, the system specifications were based on the fabrication requirements of the AXAF-S Optical Bench.

  17. [A clinical, health, economic and satisfaction simulation model, CHESS: a tool for healthcare organization management].

    PubMed

    Meilik, Ahuva; Afek, Arnon; Rotstein, Zeev

    2009-03-01

    The management of medical organizations is based on a profound understanding of the essence of the organization, its vision and missions, as well as the methods the organization utilizes to gather and analyze information. In order to maintain a maximal function level in an ever-changing environment, all organization components must function in tandem. In a previous article the authors presented medical organizations as macro-systems composed of micro-systems, and discussed the challenges these organizations face today. Basing optimal system management on medical micro-systems allows organizations to make maximum use of the advantages of professionalism, in a flexible micro-system environment. In this article, the authors present an interactive solution for performing assessments and management in the medical arena: the CHESS model. This solution was developed at the Sheba Medical Center. The CHESS Simulator (Clinical, Health, Economic and Satisfaction Simulator) was formulated to function as a clinical organizational intelligence system, whose function is to supply quantitative, analyzed data regarding activity on the clinical production floor. The system is unique in that it takes a differential view of complex medical procedures, which are highly variable, and can also locate elements that share a common similarity. Data gathering is based on an online computerized medical record (EMR) system, which is a prerequisite for a functioning system. This solution allows medical organization (macro-system) managers and department (micro-system) directors to make informed decisions that will ensure that the organization's goals are achieved. This is defined as evolving from a reactive management pattern to a proactive management pattern, which is mandatory in the competitive atmosphere of the 21st century.

  18. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.

  19. A data management life-cycle

    USGS Publications Warehouse

    Ferderer, David A.

    2001-01-01

    Documented, reliable, and accessible data and information are essential building blocks supporting scientific research and applications that enhance society's knowledge base (fig. 1). The U.S. Geological Survey (USGS), a leading provider of science data, information, and knowledge, is uniquely positioned to integrate science and natural resource information to address societal needs. The USGS Central Energy Resources Team (USGS-CERT) provides critical information and knowledge on the quantity, quality, and distribution of the Nation's and the world's oil, gas, and coal resources. By using a life-cycle model, the USGS-CERT Data Management Project is developing an integrated data management system to (1) promote access to energy data and information, (2) increase data documentation, and (3) streamline product delivery to the public, scientists, and decision makers. The project incorporates web-based technology, data cataloging systems, data processing routines, and metadata documentation tools to improve data access, enhance data consistency, and increase office efficiency.

  20. A data management system to enable urgent natural disaster computing

    NASA Astrophysics Data System (ADS)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    Civil protection, in particular natural disaster management, is very important to most nations and civilians in the world. When disasters like flash floods, earthquakes and tsunamis are expected or have taken place, it is of utmost importance to make timely decisions for managing the affected areas and reduce casualties. Computer simulations can generate information and provide predictions to facilitate this decision making process. Getting the data to the required resources is a critical requirement to enable the timely computation of the predictions. An urgent data management system to support natural disaster computing is thus necessary to effectively carry out data activities within a stipulated deadline. Since the trigger of a natural disaster is usually unpredictable, it is not always possible to prepare required resources well in advance. As such, an urgent data management system for natural disaster computing has to be able to work with any type of resources. Additional requirements include the need to manage deadlines and huge volume of data, fault tolerance, reliable, flexibility to changes, ease of usage, etc. The proposed data management platform includes a service manager to provide a uniform and extensible interface for the supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform and a data manager to initiate and perform the data activities. These managers will enable the selection of the most appropriate resource, transfer protocol, etc. such that the hard deadline of an urgent computation can be met for a particular urgent activity, e.g. data staging or computation. We associated 2 types of deadlines [2] with an urgent computing system. 
Soft-firm deadline: missing a soft-firm deadline renders the computation less useful, resulting in a cost that can have severe consequences. Hard deadline: missing a hard deadline renders the computation useless and results in catastrophic consequences. A prototype of this system has a REST-based service manager. The REST-based implementation provides a uniform interface that is easy to use. New and upcoming file transfer protocols can easily be added and accessed via the service manager. The service manager interacts with the other four managers to coordinate the data activities so that the fundamental natural disaster urgent computing requirement, i.e. the deadline, can be fulfilled in a reliable manner. A data activity can include data staging, data archiving and data storing. Reliability is ensured by the choice of a network-of-managers organisation model [1], the configuration manager and the fault tolerance manager. With this proposed design, an easy-to-use, resource-independent data management system that can support and fulfill the computation of a natural disaster prediction within stipulated deadlines can thus be realised. References [1] H. G. Hegering, S. Abeck, and B. Neumair, Integrated management of networked systems: concepts, architectures, and their operational application, Morgan Kaufmann Publishers, San Francisco, CA, USA, 1999. [2] H. Kopetz, Real-time systems: design principles for distributed embedded applications, second edition, Springer, New York, NY, USA, 2011. [3] S. H. Leong, A. Frank, and D. Kranzlmüller, Leveraging e-infrastructures for urgent computing, Procedia Computer Science 18 (2013), 2177-2186, 2013 International Conference on Computational Science. [4] N. Trebon, Enabling urgent computing within the existing distributed computing infrastructure, Ph.D. thesis, University of Chicago, August 2011, http://people.cs.uchicago.edu/~ntrebon/docs/dissertation.pdf.
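    The deadline-aware resource selection that the scheduler manager described in this abstract performs can be sketched roughly as follows. This is an illustrative sketch only: the resource names, bandwidths, setup cost, and the simple transfer-time model are all assumptions, not details from the paper.

```python
def transfer_time(data_bits, bandwidth_bps, setup_s=5.0):
    """Estimated seconds to move data_bits over a link, including a
    fixed protocol/connection setup cost (an assumed cost model)."""
    return setup_s + data_bits / bandwidth_bps

def pick_resource(resources, data_bits, hard_deadline_s):
    """Return the fastest resource whose estimated transfer time meets
    the hard deadline, or None if the urgent activity is infeasible."""
    feasible = [(transfer_time(data_bits, bw), name)
                for name, bw in resources.items()
                if transfer_time(data_bits, bw) <= hard_deadline_s]
    return min(feasible)[1] if feasible else None

# Hypothetical resources with bandwidths in bits/sec
resources = {"gridftp-siteA": 1e9, "https-siteB": 1e8}
choice = pick_resource(resources, data_bits=8e10, hard_deadline_s=120)
```

    Under this model a missed hard deadline is simply reported as infeasible (None), matching the definition above that a late result is useless; a soft-firm deadline would instead be handled by attaching a cost to late completion rather than rejecting the activity outright.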

  1. Mesoscale and severe storms (MASS) data management and analysis system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.; Dickerson, M.

    1984-01-01

    Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric data base management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random access formats is implemented and integrated with the MASS AVE80 Series general purpose plotting and graphics display data analysis software package. An interactive analysis and display graphics software package (AVE80) to analyze large volumes of conventional and satellite derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing APPLE III computer systems within individual scientists' offices and integrating them with the MASS system, thus providing color video display, graphics, and character display of the four data types.

  2. Heterogeneity prevails: the state of clinical trial data management in Europe - results of a survey of ECRIN centres

    PubMed Central

    2010-01-01

    Background The use of Clinical Data Management Systems (CDMS) has become essential in clinical trials to handle the increasing amount of data that must be collected and analyzed. With a CDMS, trial data are captured at investigator sites with "electronic Case Report Forms". Although more and more of these electronic data management systems are used in academic research centres, an overview of CDMS products and of available data management and quality management resources for academic clinical trials in Europe is missing. Methods The ECRIN (European Clinical Research Infrastructure Network) data management working group conducted a two-part standardized survey on data management, software tools, and quality management for clinical trials. The questionnaires were answered by nearly 80 centres/units (with overall response rates of 47% and 43%) from 12 European countries and EORTC. Results Our survey shows that about 90% of centres have a CDMS in routine use. Of these CDMS nearly 50% are commercial systems; Open Source solutions do not play a major role. In general, solutions used for clinical data management are very heterogeneous: 20 different commercial CDMS products (7 Open Source solutions) in addition to 17/18 proprietary systems are in use. The most widely employed CDMS products are MACRO™ and Capture System™, followed by solutions that are used in at least 3 centres: eResearch Network™, CleanWeb™, GCP Base™ and SAS™. Although quality management systems for data management are in place in most centres/units, there exist some deficits in the area of system validation. Conclusions Because the considerable heterogeneity of data management software solutions may be a hindrance to cooperation based on trial data exchange, standards like CDISC (Clinical Data Interchange Standard Consortium) should be implemented more widely.
In a heterogeneous environment the use of data standards can simplify data exchange, increase the quality of data and prepare centres for new developments (e.g. the use of EHR for clinical research). Because data management and the use of electronic data capture systems in clinical trials are characterized by the impact of regulations and guidelines, ethical concerns are discussed. In this context quality management becomes an important part of compliant data management. To address these issues ECRIN will establish certified data centres to support electronic data management and associated compliance needs of clinical trial centres in Europe. PMID:20663165

  3. Description of the SSF PMAD DC testbed control system data acquisition function

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Mackin, Michael; Wright, Theodore

    1992-01-01

    The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced scale representation of the end to end, sources to loads, Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for the Space Station Freedom. A key capability of the testbed is its ability to be configured to address system level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground based systems is evolving. Initially, ground based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF Ground based Control Center Operation. The lower level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. 
Data requirements are dictated by the control system algorithms being implemented at each level. A functional description of the various levels of the testbed control system architecture, the data acquisition function, and the status of its implementation is presented.

  4. Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.

    PubMed

    2018-01-04

    An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater and reuse plants to improve management of sensors, short- and long-term maintenance plans, and asset and investment management plans. It is based on an integrated approach to capture data from different computer systems and files. It adds a layer of intelligence to the data. It serves as a repository of key current and future operations and maintenance conditions that a plant needs to have knowledge of. With this information, it is able to simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance and asset management, using the IViewOps (Intelligent View of Operations) model. Based on the optimization through model runs, it is able to create output files that can feed data to other systems and inform the staff regarding optimal solutions to the conditions experienced or anticipated in the future.

  5. A Web-Based Information System for Field Data Management

    NASA Astrophysics Data System (ADS)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, the Apache web server runs PHP scripts in the middle tier, and a MySQL server is used for the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and populate them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into XML format that is both human-readable and machine-readable, and thus ready for reuse.
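    The XML-export step described in this abstract can be sketched with the standard library alone. The record fields here (site, coordinates, lithology) are illustrative assumptions, not the system's actual schema.

```python
import xml.etree.ElementTree as ET

def record_to_xml(record):
    """Serialise one field record into human- and machine-readable XML,
    one child element per field."""
    root = ET.Element("field_record")
    for key, value in record.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_text = record_to_xml({"site": "Outcrop-7", "lat": 41.66,
                          "lon": -83.61, "lithology": "shale"})
```

    Because each field becomes a named element, the output can be parsed back by any XML-aware tool, which is what makes the exported data "ready for reuse".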

  6. Inventory Control System by Using Vendor Managed Inventory (VMI)

    NASA Astrophysics Data System (ADS)

    Sabila, Alzena Dona; Mustafid; Suryono

    2018-02-01

    The inventory control system has a strategic role for businesses in managing inventory operations. Conventional inventory management creates problems at the retail level, where stocks of goods often run into shortages or excesses. This study aims to build an inventory control system that can maintain the stability of goods availability at the retail level. The implementation of the Vendor Managed Inventory (VMI) method in the inventory control system provides transparency of sales data and inventory of goods at the retailer level to the supplier. Inventory control is performed by calculating the safety stock and reorder point of goods based on sales data received by the system. Rule-based reasoning is provided in the system to facilitate the monitoring of inventory status information, thereby helping the process of updating inventory appropriately. Utilization of SMS technology is also considered as a medium for collecting sales data in real time due to its ease of use. The results of this study indicate that inventory control using VMI ensures the availability of goods at ±70% and can reduce the accumulation of goods by ±30% at the retail level.
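    The safety-stock and reorder-point calculation at the core of the VMI control loop can be sketched with the textbook formulas SS = z·σ_d·√L and ROP = d̄·L + SS. The demand figures and service level below are assumed values for illustration, not data from the study.

```python
import math

def safety_stock(z, sigma_d, lead_time):
    """Safety stock for demand standard deviation sigma_d per period
    and a lead time measured in the same periods; z is the
    service-level factor (e.g. 1.65 for ~95%)."""
    return z * sigma_d * math.sqrt(lead_time)

def reorder_point(avg_demand, lead_time, ss):
    """Reorder when on-hand stock falls to the expected demand during
    the lead time plus the safety stock."""
    return avg_demand * lead_time + ss

ss = safety_stock(z=1.65, sigma_d=20, lead_time=4)
rop = reorder_point(avg_demand=100, lead_time=4, ss=ss)
```

    In a VMI setting the supplier, not the retailer, evaluates these quantities against the sales data reported by the system and initiates replenishment when stock reaches the reorder point.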

  7. An intelligent user interface for browsing satellite data catalogs

    NASA Technical Reports Server (NTRS)

    Cromp, Robert F.; Crook, Sharon

    1989-01-01

    A large scale domain-independent spatial data management expert system that serves as a front-end to databases containing spatial data is described. This system is unique for two reasons. First, it uses spatial search techniques to generate a list of all the primary keys that fall within a user's spatial constraints prior to invoking the database management system, thus substantially decreasing the amount of time required to answer a user's query. Second, a domain-independent query expert system uses a domain-specific rule base to preprocess the user's English query, effectively mapping a broad class of queries into a smaller subset that can be handled by a commercial natural language processing system. The methods used by the spatial search module and the query expert system are explained, and the system architecture for the spatial data management expert system is described. The system is applied to data from the International Ultraviolet Explorer (IUE) satellite, and results are given.
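    The spatial pre-filtering step described in this abstract, finding all primary keys inside the user's spatial constraint before the DBMS is ever invoked, can be sketched as a simple bounding-box scan. The key names and coordinates are illustrative assumptions.

```python
def keys_in_box(index, lon_min, lon_max, lat_min, lat_max):
    """Return primary keys whose (lon, lat) falls inside the
    user's bounding-box constraint."""
    return [key for key, (lon, lat) in index.items()
            if lon_min <= lon <= lon_max and lat_min <= lat <= lat_max]

# Hypothetical spatial index: primary key -> (longitude, latitude)
index = {"IUE-0001": (10.2, 41.5),
         "IUE-0002": (150.0, -30.1),
         "IUE-0003": (12.8, 44.0)}

hits = keys_in_box(index, 5.0, 20.0, 40.0, 45.0)
# the surviving keys would then constrain the DBMS query, e.g.
# ... WHERE id IN ('IUE-0001', 'IUE-0003')
```

    Handing the DBMS a short list of pre-qualified keys instead of a full spatial predicate is what yields the speed-up the abstract describes.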

  8. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
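    The negative-database idea, storing only what is *absent* and deriving presence as the complement, can be sketched minimally as below, with a plain set standing in for the key-value store. The class and the notion of a contiguous frame-ID range are illustrative assumptions, not MUSER's actual schema.

```python
class NegativeDB:
    """Sketch of a negative database over a dense range of record IDs:
    only the missing IDs are stored; everything else is derived."""

    def __init__(self, first_id, last_id):
        self.first_id, self.last_id = first_id, last_id
        self.absent = set()  # the only data actually persisted

    def mark_absent(self, record_id):
        self.absent.add(record_id)

    def exists(self, record_id):
        """Presence is derived, not stored."""
        return (self.first_id <= record_id <= self.last_id
                and record_id not in self.absent)

    def derive_present(self):
        """Reconstruct the full set of observed records (the complement
        of the stored absent set)."""
        return [i for i in range(self.first_id, self.last_id + 1)
                if i not in self.absent]

nd = NegativeDB(1, 10)
nd.mark_absent(4)
nd.mark_absent(7)
```

    The storage saving follows directly: when almost all expected records arrive, the absent set stays tiny regardless of how many millions of records per day are observed, whereas an RDBMS would store one row per record.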

  9. Use of an engineering data management system in the analysis of Space Shuttle Orbiter tiles

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Vallas, M.

    1981-01-01

    This paper demonstrates the use of an engineering data management system to facilitate the extensive stress analyses of the Space Shuttle Orbiter thermal protection system. Descriptions are given of the approach and methods used (1) to gather, organize, and store the data, (2) to query data interactively, (3) to generate graphic displays of the data, and (4) to access, transform, and prepare the data for input to a stress analysis program. The relational information management system was found to be well suited to the tile analysis problem because information related to many separate tiles could be accessed individually from a data base having a natural organization from an engineering viewpoint. The flexible user features of the system facilitated changes in data content and organization which occurred during the development and refinement of the tile analysis procedure. Additionally, the query language supported retrieval of data to satisfy a variety of user-specified conditions.

  10. An optical disk archive for a data base management system

    NASA Technical Reports Server (NTRS)

    Thomas, Douglas T.

    1985-01-01

    An overview is given of a data base management system that can catalog and archive data at rates up to 50M bits/sec. Emphasis is on the laser disk system that is used for the archive. All key components in the system (three VAX 11/780s, a SEL 32/2750, a high speed communication interface, and the optical disk) are interfaced to a 100M bits/sec 16-port fiber optic bus to achieve the high data rates. The basic data unit is an autonomous data packet. Each packet contains a primary and secondary header and can be up to a million bits in length. The data packets are recorded on the optical disk at the same time the packet headers are being used by the relational data base management software ORACLE to create a directory independent of the packet recording process. The user then interfaces to the VAX that contains the directory for a quick-look scan or retrieval of the packet(s). The total system functions are distributed between the VAX and the SEL. The optical disk unit records the data with an argon laser at 100M bits/sec from its buffer, which is interfaced to the fiber optic bus. The same laser is used in the read cycle by reducing the laser power. Additional information is given in the form of outlines, charts, and diagrams.
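    The autonomous-packet layout described in this abstract, a primary and secondary header prepended to the payload so that a directory can be built from headers alone, can be sketched as below. The field names and widths are assumptions for illustration; the actual packet format is not specified in the abstract.

```python
import struct

# Assumed header layouts (big-endian):
PRIMARY = struct.Struct(">IHH")   # packet id, packet type, secondary-header length
SECONDARY = struct.Struct(">Q")   # acquisition timestamp

def build_packet(packet_id, ptype, timestamp, payload):
    """Assemble one autonomous data packet: headers + payload."""
    sec = SECONDARY.pack(timestamp)
    pri = PRIMARY.pack(packet_id, ptype, len(sec))
    return pri + sec + payload

def read_headers(packet):
    """What directory-building software would extract without ever
    touching the payload on the optical disk."""
    packet_id, ptype, _sec_len = PRIMARY.unpack_from(packet, 0)
    (timestamp,) = SECONDARY.unpack_from(packet, PRIMARY.size)
    return packet_id, ptype, timestamp

pkt = build_packet(42, 3, 1_000_000, b"\x00" * 16)
```

    Because the headers are self-describing, the directory (here, what ORACLE builds) can be created in parallel with recording, exactly the decoupling the abstract describes.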

  11. Regional lists of plant species that occur in wetlands: data base user's guide

    USGS Publications Warehouse

    Reed, Porter B.; Auble, Gregor T.; Muhlenbruck, Jill E.; Manci, Karen M.

    1989-01-01

    The Data Base List of Plant Species that Occur in Wetlands (LIST) currently contains records for 6,728 plant species. Each record provides information on nomenclature, plant characteristics and lifeforms, distribution, and frequency of occurrence in wetlands. The List of Plant Species that Occur in Wetlands, developed to supplement the U.S. Fish and Wildlife Service's Classification of Wetlands and Deepwater Habitats of the United States (Cowardin et al. 1979), underwent an intensive review by field botanists across the country. This review was coordinated by national and regional interagency wetland plant list review panels composed of representatives from the U.S. Fish and Wildlife Service, U.S. Army Corps of Engineers, Soil Conservation Service, and the Environmental Protection Agency. Initial and updated versions of the Data Base List of Plant Species that Occur in Wetlands are available in hardcopy (Reed 1986, 1988). Regional lists are available as U.S. Fish and Wildlife Service Biological Report Series 88(26.1-26.13). State lists are available as National Ecology Research Center Report Series 88(18.01-18.50). The computerized data base tracks and documents indicator assignments made by regional interagency review panels and facilitates generation of reports. This user's guide describes the format and contents of the LIST Data Base. The Data Base is available on 5-1/4" floppy disks in ASCII format for use with a data base management system on an IBM PC/XT/AT compatible computer. The LIST Data Base was developed using the QUICKTEXT Data Base Management System (Osborn and Strong 1984). Use of QUICKTEXT with the LIST Data Base is strongly recommended. Instructions for loading LIST into QUICKTEXT are included in this user's guide. Other data base management systems capable of handling variable length fields can be used by individuals familiar with these software packages. LIST distribution disks are available for 13 regions (Table 1). QUICKTEXT (course QT100--Data Base Management Techniques) and regional subsets of the LIST Data Base (distributed as self-tutorial courses, Table 1) are available through the Office of Conference Services, Colorado State University.

  12. Medical image informatics infrastructure design and applications.

    PubMed

    Huang, H K; Wong, S T; Pietka, E

    1997-01-01

    A picture archiving and communication system (PACS) is an integration of multimodality images and health information systems designed to improve the operation of a radiology department. As it evolves, PACS becomes a hospital image document management system with a voluminous image and related data file repository. A medical image informatics infrastructure can be designed to take advantage of existing data, providing PACS with add-on value for health care service, research, and education. A medical image informatics infrastructure (MIII) consists of the following components: medical images and associated data (including the PACS database), image processing, data/knowledge base management, visualization, graphic user interface, communication networking, and application-oriented software. This paper describes these components and their logical connection, and illustrates some applications based on the concept of the MIII.

  13. Data Model Management for Space Information Systems

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, Chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time-consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example, the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search.
This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool. We will also describe the current effort to provide interoperability with the European Space Agency (ESA)/Planetary Science Archive (PSA) which is critically dependent on a common data model.

  14. Remote sensing and geographically based information systems

    NASA Technical Reports Server (NTRS)

    Cicone, R. C.

    1977-01-01

    A structure is proposed for a geographically-oriented computer-based information system applicable to the analysis of remote sensing digital data. The structure, intended to answer a wide variety of user needs, would permit multiple views of the data, provide independent management of data security, quality and integrity, and rely on automatic data filing. Problems in geographically-oriented data systems, including those related to line encoding and cell encoding, are considered.

  15. Monitoring is not enough: on the need for a model-based approach to migratory bird management

    USGS Publications Warehouse

    Nichols, J.D.; Bonney, Rick; Pashley, David N.; Cooper, Robert; Niles, Larry

    2000-01-01

    Informed management requires information about system state and about effects of potential management actions on system state. Population monitoring can provide the needed information about system state, as well as information that can be used to investigate effects of management actions. Three methods for investigating effects of management on bird populations are (1) retrospective analysis, (2) formal experimentation and constrained-design studies, and (3) adaptive management. Retrospective analyses provide weak inferences, regardless of the quality of the monitoring data. The active use of monitoring data in experimental or constrained-design studies or in adaptive management is recommended. Under both approaches, learning occurs via the comparison of estimates from the monitoring program with predictions from competing management models.

  16. [Infrastructure and contents of clinical data management plan].

    PubMed

    Shen, Tong; Xu, Lie-dong; Fu, Hai-jun; Liu, Yan; He, Jia; Chen, Ping-yan; Song, Yu-fei

    2015-11-01

    Establishment of a quality management system (QMS) plays a critical role in clinical data management (CDM). The objectives of CDM are to ensure the quality and integrity of the trial data. Thus, every stage or element that may impact the quality outcomes of clinical studies should be managed in a controlled manner across the full life cycle of CDM, which spans the collection, handling and statistical analysis of trial data. Based on the QMS, this paper provides consensus on how to develop a compliant clinical data management plan (CDMP). According to the essential requirements of CDM, the CDMP should encompass each process of data collection, data capture and cleaning, medical coding, data verification and reconciliation, database monitoring and management, external data transmission and integration, data documentation, data quality assurance, and so on. Creating and following up the data management plan at each designed data management step, while dynamically recording the systems used, actions taken, and parties involved, will build and confirm regulated data management processes, standard operational procedures and effective quality metrics across all data management activities. The CDMP is one of the most important data management documents and is the solid foundation for clinical data quality.

  17. Enviro-Net: From Networks of Ground-Based Sensor Systems to a Web Platform for Sensor Data Management

    PubMed Central

    Pastorello, Gilberto Z.; Sanchez-Azofeifa, G. Arturo; Nascimento, Mario A.

    2011-01-01

    Ecosystem monitoring is essential to properly understand ecosystem development and the effects of events, both climatological and anthropological in nature. The amount of data used in these assessments is increasing at very high rates. This is due to the increasing availability of sensing systems and the development of new techniques to analyze sensor data. The Enviro-Net Project encompasses several such sensor system deployments across five countries in the Americas. These deployments use a few different ground-based sensor systems, installed at different heights, monitoring the conditions in tropical dry forests over long periods of time. This paper presents our experience in deploying and maintaining these systems, retrieving and pre-processing the data, and describes the Web portal developed to help with data management, visualization and analysis. PMID:22163965

  18. Development and implementation of an Integrated Water Resources Management System (IWRMS)

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.; Busch, C.

    2011-04-01

    One of the innovative objectives in the EC project BRAHMATWINN was the development of a stakeholder-oriented Integrated Water Resources Management System (IWRMS). The toolset integrates the findings of the project and presents them in a user-friendly way for decision support in sustainable integrated water resources management (IWRM) in river basins. IWRMS is a framework which integrates different types of basin information and which supports the development of IWRM options for climate change mitigation. It is based on the River Basin Information System (RBIS) data models and delivers a graphical user interface for stakeholders. A special interface was developed for the integration of the enhanced DANUBIA model input and the NetSyMod model with its Mulino decision support system (mulino mDss) component. The web-based IWRMS contains and combines different types of data and methods to provide river basin data and information for decision support. IWRMS is based on a three-tier software framework which uses (i) HTML/JavaScript at the client tier, (ii) the PHP programming language to realize the application tier, and (iii) a PostgreSQL/PostGIS database tier to manage and store all data, except the DANUBIA modelling raw data, which are file based and registered in the database tier. All three tiers can reside on one or different computers and are adapted to the local hardware infrastructure. IWRMS as well as RBIS are based on Open Source Software (OSS) components, and flexible, time-saving access to the database is guaranteed by web-based interfaces for data visualization and retrieval. The IWRMS is accessible via the BRAHMATWINN homepage: http://www.brahmatwinn.uni-jena.de and a user manual for the RBIS is available for download as well.

  19. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages, that the client can extend and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  20. Assessment Systems and Data Management in Colleges of Education: An Examination of Systems and Infrastructure

    ERIC Educational Resources Information Center

    Haughton, Noela A.; Keil, Virginia L.

    2009-01-01

    The College of Education Assessment Infrastructure Survey was developed and administered to 1011 institutions over a twelve-month period ending April 2007. The survey examined the capacity of university-based teacher preparation programs to respond to the growing and increasingly complex data management requirements that accompanies assessment and…

  1. A Data Management System Integrating Web-Based Training and Randomized Trials

    ERIC Educational Resources Information Center

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D.

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance…

  2. CSHM: Web-based safety and health monitoring system for construction management.

    PubMed

    Cheung, Sai On; Cheung, Kevin K W; Suen, Henry C H

    2004-01-01

    This paper describes a web-based system for monitoring and assessing construction safety and health performance, entitled the Construction Safety and Health Monitoring (CSHM) system. The design and development of CSHM is an integration of internet and database systems, with the intent to create a total automated safety and health management tool. A list of safety and health performance parameters was devised for the management of safety and health in construction. A conceptual framework of the four key components of CSHM is presented: (a) Web-based Interface (templates); (b) Knowledge Base; (c) Output Data; and (d) Benchmark Group. The combined effect of these components results in a system that enables speedy performance assessment of safety and health activities on construction sites. With the CSHM's built-in functions, important management decisions can theoretically be made and corrective actions can be taken before potential hazards turn into fatal or injurious occupational accidents. As such, the CSHM system will accelerate the monitoring and assessment of safety and health management tasks.

  3. Data management applications

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Kennedy Space Center's primary institutional computer is a 4-megabyte IBM 4341 with 3.175 billion characters of IBM 3350 disc storage. This system utilizes the Software AG product known as ADABAS, with the on-line, user-oriented features of NATURAL and COMPLETE, as a Data Base Management System (DBMS). It is operational under OS/VS1 and is currently supporting batch/on-line applications such as Personnel, Training, Physical Space Management, Procurement, Office Equipment Maintenance, and Equipment Visibility. A third and by far the largest DBMS application is known as the Shuttle Inventory Management System (SIMS), which is operational on a dedicated Honeywell 6660 computer system utilizing Honeywell Integrated Data Storage I (IDSI) as the DBMS. The SIMS application is designed to provide central supply system acquisition, inventory control, receipt, storage, and issue of spares, supplies, and materials.

  4. Design and Construction for Community Health Service Precision Fund Appropriation System Based on Performance Management.

    PubMed

    Gao, Xing; He, Yao; Hu, Hongpu

    2017-01-01

    To allow for differences among community health service organizations in economic development, degree of informatization, characteristics of the population served, and so on, a community health service precision fund appropriation system based on performance management is designed, which can support the government in appropriating financial funds scientifically and rationally for primary care. The system is flexible and practical, and comprises five subsystems: data acquisition, parameter setting, fund appropriation, statistical analysis, and user management.

  5. Results of data base management system parameterized performance testing related to GSFC scientific applications

    NASA Technical Reports Server (NTRS)

    Carchedi, C. H.; Gough, T. L.; Huston, H. A.

    1983-01-01

    The results of a variety of tests designed to demonstrate and evaluate the performance of several commercially available data base management system (DBMS) products compatible with the Digital Equipment Corporation VAX 11/780 computer system are summarized. The tests were performed on the INGRES, ORACLE, and SEED DBMS products employing applications that were similar to scientific applications under development by NASA. The objectives of this testing included determining the strengths and weaknesses of the candidate systems, the performance trade-offs of various design alternatives, and the impact of some installation and environmental (computer-related) influences.

  6. Land cover mapping of the upper Kuskokwim Resource Management Area using LANDSAT and a digital data base approach

    USGS Publications Warehouse

    Markon, Carl J.

    1988-01-01

    Digital land cover and terrain data for the Upper Kuskokwim Resource Management Area (UKRMA) were produced by the U.S. Geological Survey, Earth Resources Observation Systems Field Office, Anchorage, Alaska for the Bureau of Land Management. These and other environmental data were incorporated into a digital data base to assist in the management and planning of the UKRMA. The digital data base includes land cover classifications, elevation, slope, and aspect data centering on the UKRMA boundaries. The data are stored on computer compatible tapes at a 50-m pixel size. Additional digital data in the data base include: (a) summer and winter Landsat multispectral scanner (MSS) data registered to a 50-m Universal Transverse Mercator grid; (b) elevation, slope, aspect, and solar illumination data; (c) soils and surficial geology; and (d) study area boundary. The classification of Landsat MSS data resulted in seven major classes and 24 subclasses. Major classes include: forest, shrubland, dwarf scrub, herbaceous, barren, water, and other. The final data base will be used by resource personnel for management and planning within the UKRMA.

  7. LUMIS: Land Use Management and Information Systems; coordinate oriented program documentation

    NASA Technical Reports Server (NTRS)

    1976-01-01

    An integrated geographic information system to assist program managers and planning groups in metropolitan regions is presented. The series of computer software programs and procedures involved in data base construction uses the census DIME file and point-in-polygon architectures. The system is described in two parts: (1) instructions to operators with regard to digitizing and editing procedures, and (2) application of data base construction algorithms to achieve map registration, assure the topological integrity of polygon files, and tabulate land use acreages within administrative districts.
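The point-in-polygon check mentioned above is the classic ray-casting test; a minimal version can be sketched in Python (this is an illustration of the general technique, not LUMIS code):

```python
# Minimal ray-casting point-in-polygon test: count how many polygon edges a
# horizontal ray extending right from the point crosses; odd count = inside.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon (list of vertex tuples)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the ray's y-coordinate?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True: inside the square
print(point_in_polygon(5, 2, square))  # False: outside
```

Checks like this are what allow a data-base construction pipeline to assign point records (e.g. DIME file coordinates) to administrative districts and to verify the topological integrity of polygon files.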

  8. Design and Implementation of CNEOST Image Database Based on NoSQL System

    NASA Astrophysics Data System (ADS)

    Wang, X.

    2013-07-01

    The China Near Earth Object Survey Telescope (CNEOST) is the largest Schmidt telescope in China, and it has acquired more than 3 TB of astronomical image data since it saw first light in 2006. After the upgrade of the CCD camera in 2013, over 10 TB of data will be obtained every year. The management of massive images is not only an indispensable part of the data processing pipeline but also the basis of data sharing. Based on an analysis of the requirements, an image management system is designed and implemented by employing a non-relational database.
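The schema-free, document-style storage that a non-relational database offers can be sketched with a toy in-memory store; the field names below are illustrative, not the actual CNEOST schema:

```python
# Toy document-style (non-relational) image-metadata store: each image is a
# free-form document, and queries match on whatever fields a document has.

class DocumentStore:
    """Minimal key-document store with field-based lookup."""
    def __init__(self):
        self.docs = {}

    def put(self, doc_id, document):
        self.docs[doc_id] = document

    def find(self, **criteria):
        """Return documents whose fields match all given criteria."""
        return [d for d in self.docs.values()
                if all(d.get(k) == v for k, v in criteria.items())]

store = DocumentStore()
store.put("img001", {"filter": "R", "exposure_s": 40, "night": "2013-05-01"})
store.put("img002", {"filter": "V", "exposure_s": 40, "night": "2013-05-01"})
print(len(store.find(filter="R")))  # 1
```

Because documents need not share a fixed schema, new instrument metadata (e.g. after a camera upgrade) can be added without migrating existing records, which is one common motivation for choosing a non-relational design.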

  9. Design and Implementation of CNEOST Image Database Based on NoSQL System

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2014-04-01

    The China Near Earth Object Survey Telescope is the largest Schmidt telescope in China, and it has acquired more than 3 TB of astronomical image data since it saw first light in 2006. After the upgrade of the CCD camera in 2013, over 10 TB of data will be obtained every year. The management of the massive images is not only an indispensable part of the data processing pipeline but also the basis of data sharing. Based on an analysis of the requirements, an image management system is designed and implemented by employing a non-relational database.

  10. On Building an Ontological Knowledge Base for Managing Patient Safety Events.

    PubMed

    Liang, Chen; Gong, Yang

    2015-01-01

    Over the past decade, improving healthcare quality and safety through patient safety event reporting systems has drawn much attention. Unfortunately, such systems suffer from low data quality, inefficient data entry, and ineffective information retrieval. To improve these systems, we developed a semantic web ontology based on the WHO International Classification for Patient Safety (ICPS) and the AHRQ Common Formats for patient safety event reporting. The ontology holds potential for enhancing knowledge management and information retrieval, as well as providing flexible data entry and case analysis for both reporters and reviewers of patient safety events. In this paper, we detail our efforts in data acquisition, transformation, implementation, and initial evaluation of the ontology.
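An ontology of the kind described stores knowledge as subject-predicate-object triples and answers queries by following class relations. A minimal sketch in plain Python (the class names are illustrative, not actual ICPS terms):

```python
# Minimal triple store sketching ontology-backed retrieval: querying for a
# superclass also returns instances of its subclasses.

triples = {
    ("MedicationError", "subClassOf", "PatientSafetyEvent"),
    ("FallIncident", "subClassOf", "PatientSafetyEvent"),
    ("event42", "type", "MedicationError"),
}

def subclasses_of(cls):
    """Direct subclasses of cls declared in the triple store."""
    return {s for (s, p, o) in triples if p == "subClassOf" and o == cls}

def instances_of(cls):
    """Instances of cls, including instances of its direct subclasses."""
    classes = {cls} | subclasses_of(cls)
    return {s for (s, p, o) in triples if p == "type" and o in classes}

print(instances_of("PatientSafetyEvent"))  # {'event42'}
```

This one-step subclass inference is the kind of retrieval improvement the abstract attributes to the ontology: a reviewer searching for "patient safety events" finds reports filed under more specific categories.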

  11. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacked real-time data retrieval and sharing/interoperation capability. With the development of information technology, a Geospatial Service Web method has been proposed that can be employed for environmental data management. The purpose of this study is to determine a method of realizing environmental data management under the Geospatial Service Web framework. To this end, a real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed. The real-time GIS data model manages real-time data; the Sensor Web service platform, built on Sensor Web technologies, is implemented to support the realization of the model. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases, real-time air quality monitoring and real-time soil moisture monitoring, based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total processing times in the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that integrating the real-time GIS data model with the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
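The core of a real-time GIS data model is keeping the latest observation per sensor and observed property as new readings stream in. A minimal sketch (field names are assumptions for illustration, not the paper's actual model):

```python
# Sketch of a real-time observation store: each (sensor, property) pair keeps
# only its most recent observation, so queries always see current values.

from collections import namedtuple

Observation = namedtuple("Observation", "sensor_id property value timestamp")

class RealTimeStore:
    def __init__(self):
        self.latest = {}  # (sensor_id, property) -> Observation

    def ingest(self, obs):
        key = (obs.sensor_id, obs.property)
        # Ignore out-of-order arrivals older than what we already hold.
        if key not in self.latest or obs.timestamp > self.latest[key].timestamp:
            self.latest[key] = obs

store = RealTimeStore()
store.ingest(Observation("s1", "pm25", 35.0, 100))
store.ingest(Observation("s1", "pm25", 42.0, 160))
print(store.latest[("s1", "pm25")].value)  # 42.0
```

In a full Sensor Web platform, ingestion would arrive through standardized service interfaces rather than direct method calls, but the latest-value semantics are the same.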

  12. Summaries of Minnehaha Creek Watershed District Plans/Studies/Reports

    DTIC Science & Technology

    2004-01-30

    Management of all wetland functional assessment data in a Microsoft Access database … development of a GIS wetland data management system … recommendations … General Task B: design GIS-based decision-making model (scenario-based model of land use hydro data monitoring, $125,000) … Task C: water quality … land use and land cover data … watershed GIS data layers … Flood Insurance Rate Maps … proposed project locations … stream miles, reaches, and conditions

  13. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques are among the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring, and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save time and reduce the cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach, which is one of the main obstacles in GIS to using the map products of photogrammetric workstations. Such integrated systems also make it possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated, and different levels of integration are described. Finally, the design, implementation, and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  14. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hannan, M.A., E-mail: hannan@eng.ukm.my; Abdulla Al Mamun, Md., E-mail: md.abdulla@siswa.ukm.edu.my; Hussain, Aini, E-mail: aini@eng.ukm.my

    Highlights: • Classification of available technologies for SWM systems in four core categories. • Organization of technology-based SWM systems in three main groups. • Summary of SWM systems with target application, methodology, and functional domain. • Issues and challenges are highlighted for further design of a sustainable system. - Abstract: Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems to unfold the issues and challenges of moving towards integrated technology-based systems. To plan, monitor, collect, and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies, and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the planning and design of a sustainable new system.

  15. Research on image evidence in land supervision and GIS management

    NASA Astrophysics Data System (ADS)

    Li, Qiu; Wu, Lixin

    2006-10-01

    Land resource development and utilization bring many problems. The number, scale, and volume of illegal land use cases are increasing. Since the territory is vast and land violations are concealed, effective land supervision and management is difficult. In this paper, the concepts of evidence and preservation of evidence are described first. The concepts of image evidence (IE), natural evidence (NE), natural preservation of evidence (NPE), and general preservation of evidence (GPE) are then proposed based on the characteristics of remote sensing imagery (RSI), which is objective, truthful, of high spatial resolution, and rich in information. Using MapObjects and Visual Basic 6.0, with Access managing the conjunction of the spatial vector database and the attribute data table, taking RSI as the data source and background layer, and combining the powerful spatial data management and visual analysis capabilities of a geographic information system (GIS), a land supervision and GIS management system was designed and implemented based on NPE. Practical use in Beijing shows that the system runs well and has solved some problems in land supervision and management.

  16. Medical informatics in medical research - the Severe Malaria in African Children (SMAC) Network's experience.

    PubMed

    Olola, C H O; Missinou, M A; Issifou, S; Anane-Sarpong, E; Abubakar, I; Gandi, J N; Chagomerana, M; Pinder, M; Agbenyega, T; Kremsner, P G; Newton, C R J C; Wypij, D; Taylor, T E

    2006-01-01

    Computers are widely used for data management in clinical trials in developed countries, unlike in developing countries. Dependable systems are vital for data management and medical decision making in clinical research, and monitoring and evaluation of data management are critical. In this paper we describe the database structures and procedures of the systems used to implement, coordinate, and sustain data management in Africa. We outline major lessons, challenges, and successes, and offer recommendations to improve the application of medical informatics in biomedical research in sub-Saharan Africa. A consortium of research units at five sites in Africa, experienced in studying children with disease, formed a new clinical trials network, Severe Malaria in African Children. In December 2000, the network introduced an observational study involving these hospital-based sites. After prototyping, relational database management systems were implemented for data entry and verification, data submission, and quality assurance monitoring. Between 2000 and 2005, 25,858 patients were enrolled. Failure to meet the data submission deadline and data entry errors correlated positively (correlation coefficient r = 0.82), with more errors occurring when data were submitted late. Data submission lateness correlated inversely with hospital admissions (r = -0.62). Developing and sustaining a dependable DBMS, with ongoing modifications to optimize data management, is crucial for clinical studies. Monitoring and communication systems are vital for good data management in multi-center networks. Data timeliness is associated with data quality and hospital admissions.
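The correlations reported above (r = 0.82, r = -0.62) are Pearson coefficients, which can be computed in plain Python; the data below are made up purely to illustrate the calculation:

```python
# Pearson correlation coefficient: covariance of the two series divided by the
# product of their standard deviations (n cancels, so sums suffice).

import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear data gives r = 1.0; inverted data gives r = -1.0.
print(round(pearson_r([1, 2, 3], [2, 4, 6]), 3))   # 1.0
print(round(pearson_r([1, 2, 3], [6, 4, 2]), 3))   # -1.0
```

A positive r near 0.82, as in the study, indicates that late submissions and entry errors tend to rise together; a negative r, as with admissions, indicates the opposite tendency.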

  17. Environmental Data Store: A Web-Based System Providing Management and Exploitation for Multi-Data-Type Environmental Data

    NASA Astrophysics Data System (ADS)

    Ji, P.; Piasecki, M.

    2012-12-01

    With the rapid growth in data volumes, data diversity, and data demands from multi-disciplinary research efforts, data management and exploitation increasingly pose significant challenges for the environmental scientific community. We describe the Environmental Data Store (EDS), a web-based, open-source system we are developing to manage and exploit multi-data-type environmental data. EDS provides repository services for six fundamental data types that meet the demands of multi-disciplinary environmental research: (a) time-series data, (b) geospatial data, (c) digital data, (d) ex-situ sampling data, (e) modeling data, and (f) raster data. Through its data portal, EDS allows efficient consumption of these six types of data placed in a data pool, which is made up of different data nodes corresponding to the different data types, including iRODS, ODM, THREDDS, ESSDB, and GeoServer. The EDS data portal offers a unified submission interface for the above data types; provides fully integrated, scalable search across content from the underlying data systems; and features mapping, analysis, exporting, and visualization through integration with other software. EDS builds on a number of existing systems, follows widely used data standards, and highlights thematic, semantic, and syntactic support for submission and search in order to advance multi-disciplinary environmental research. The system will be installed and developed at the CrossRoads initiative at the City College of New York.

  18. GIS-based automated management of highway surface crack inspection system

    NASA Astrophysics Data System (ADS)

    Chung, Hung-Chi; Shinozuka, Masanobu; Soeller, Tony; Girardello, Roberto

    2004-07-01

    An automated in-situ road surface distress surveying and management system, AMPIS, has been developed on the basis of video images within the framework of GIS software. Video image processing techniques are introduced to acquire, process, and analyze the road surface images obtained from a moving vehicle. The ArcGIS platform is used to integrate the routines of image processing and spatial analysis in handling full-scale metropolitan highway surface distress detection and data fusion/management. This makes it possible to present user-friendly interfaces in GIS and to provide efficient visualizations of surveyed results, not only for transportation engineers to manage road surveying documentation, data acquisition, analysis, and management, but also for financial officials to plan maintenance and repair programs and further evaluate the socio-economic impacts of highway degradation and deterioration. A review performed in this study of the fundamental principles of Pavement Management Systems (PMS) and their implementation indicates that the proposed approach of using GIS concepts and tools for PMS applications will reshape PMS into a new information technology-based system that can provide convenient and efficient pavement inspection and management.

  19. RDBMS Applications as Online Based Data Archive: A Case of Harbour Medical Center in Pekanbaru

    NASA Astrophysics Data System (ADS)

    Febriadi, Bayu; Zamsuri, Ahmad

    2017-12-01

    Kantor Kesehatan Pelabuhan Kelas II Pekanbaru is a government office concerned with health, especially environmental health. The office has problems saving electronic data and analyzing daily data, both internal and external. Although it has computers and other tools useful for saving electronic data, in practice the data are still kept in cupboards, which is inefficient for important data that must be analyzed more than once; in other words, it is unsuitable for data that need to be analyzed continuously. The Relational Database Management System (RDBMS) application is an online data-archiving application developed using the System Development Life Cycle (SDLC) method. It is hoped that the application will be very useful for the employees of Kantor Kesehatan Pelabuhan Pekanbaru in managing their work.

  20. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Integrated information processing requirements

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1979-01-01

    The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.

  1. Application of a data base management system to a finite element model

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1980-01-01

    In today's software market, much effort is being expended on the development of data base management systems (DBMS). Most commercially available DBMS were designed for business use. However, the need for such systems within the engineering and scientific communities is becoming apparent. A potential DBMS application that appears attractive is the handling of data for finite element engineering models. The application of a commercially available, business-oriented DBMS to a structural engineering finite element model is explored. The model, the DBMS, an approach to using the DBMS, and advantages and disadvantages are described. Plans for research on a scientific and engineering DBMS are discussed.

  2. Cloud-assisted mobile-access of health data with privacy and auditability.

    PubMed

    Tong, Yue; Sun, Jinyuan; Chow, Sherman S M; Li, Pan

    2014-03-01

    Motivated by the privacy issues curbing the adoption of electronic healthcare systems and by the wild success of cloud service models, we propose to build privacy into mobile healthcare systems with the help of a private cloud. Our system offers salient features including efficient key management, privacy-preserving data storage and retrieval (especially retrieval during emergencies), and auditability for misuse of health data. Specifically, we propose to integrate key management from a pseudorandom number generator for unlinkability, a secure indexing method for privacy-preserving keyword search that hides both search and access patterns based on redundancy, and attribute-based encryption combined with threshold signing to provide role-based access control with auditability that deters potential misbehavior, in both normal and emergency cases.
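Deriving per-session keys from a pseudorandom function over a master seed, so that individual keys cannot be linked to one another without the seed, can be sketched with the standard-library `hmac` module. This is a simplified illustration of the unlinkability idea, not the paper's actual scheme:

```python
# Sketch of unlinkable per-session key derivation: each counter value yields a
# key that looks unrelated to the others unless one holds the master seed.

import hmac
import hashlib

def session_key(master_seed: bytes, counter: int) -> bytes:
    """Derive the counter-th 256-bit session key from the master seed."""
    return hmac.new(master_seed, counter.to_bytes(8, "big"),
                    hashlib.sha256).digest()

seed = b"patient-master-seed"   # illustrative seed, not a real secret
k1 = session_key(seed, 1)
k2 = session_key(seed, 2)
print(k1 != k2, len(k1))  # True 32
```

The holder of the seed can regenerate any session key on demand (useful at emergencies), while an observer of the keys alone cannot link two sessions to the same patient.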

  3. The Research on Safety Management Information System of Railway Passenger Based on Risk Management Theory

    NASA Astrophysics Data System (ADS)

    Zhu, Wenmin; Jia, Yuanhua

    2018-01-01

    Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a security risk assessment team is proposed to manage risk by FTA with the Delphi method, from both qualitative and quantitative perspectives. A safety production committee is also established to carry out performance appraisal, which further ensures the correctness of risk management results, optimizes safety management business processes, and improves risk management capabilities. The basic framework and risk information database of the safety management information system for railway passenger transport are designed using Ajax, Web Services, and SQL technologies. The system realizes risk management, performance appraisal, and data management functions, and provides an efficient and convenient information management platform for railway passenger safety managers.

  4. Medical record management systems: criticisms and new perspectives.

    PubMed

    Frénot, S; Laforest, F

    1999-06-01

    The first generation of computerized medical records stored the data as text, but these records did not bring any improvement in information manipulation. The use of a relational database management system (DBMS) has largely solved this problem, as it allows data requests using SQL. However, this requires data structuring that is not very appropriate to medicine. Moreover, the use of templates and icon user interfaces has introduced a deviation from the (still existing) paper-based record. The arrival of hypertext user interfaces has proven to be of interest in filling the gap between the paper-based medical record and its electronic version. We think that further improvement can be accomplished by using a fully document-based system. We present the architecture, advantages, and disadvantages of classical DBMS-based and Web/DBMS-based solutions. We also present a document-based solution and explain its advantages, which include communication, security, flexibility, and genericity.
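The SQL-based retrieval that the abstract credits to relational DBMSs can be sketched with Python's built-in `sqlite3`; the table and field names here are illustrative, not from the paper:

```python
# Sketch of relational storage for medical records: once data are structured
# into tables, ad hoc requests become simple SQL queries.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE consultation (patient TEXT, date TEXT, note TEXT)")
conn.executemany("INSERT INTO consultation VALUES (?, ?, ?)",
                 [("p1", "1999-01-10", "follow-up"),
                  ("p1", "1999-02-03", "lab results"),
                  ("p2", "1999-01-12", "initial visit")])

# Retrieve all notes for one patient, ordered chronologically.
rows = conn.execute("SELECT note FROM consultation WHERE patient = ? "
                    "ORDER BY date", ("p1",)).fetchall()
print([r[0] for r in rows])  # ['follow-up', 'lab results']
```

The trade-off the abstract raises is visible here: the query is powerful precisely because the `note` field has been forced into a fixed table structure, which free-text clinical narrative resists.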

  5. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  7. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1 development activities.

  8. Advanced Software Techniques for Data Management Systems. Volume 2: Space Shuttle Flight Executive System: Functional Design

    NASA Technical Reports Server (NTRS)

    Pepe, J. T.

    1972-01-01

    A functional design of the software executive system for the space shuttle avionics computer is presented. Three primary functions of the executive are emphasized in the design: task management, I/O management, and configuration management. The executive system organization is based on the applications software and configuration requirements established during the Phase B definition of the Space Shuttle program. Although the primary features of the executive system architecture were derived from Phase B requirements, it was specified for implementation with the IBM 4 Pi EP aerospace computer and is expected to be incorporated into a breadboard data management computer system at the NASA Manned Spacecraft Center's Information Systems Division. The executive system was structured for internal operation on the IBM 4 Pi EP system, with its external configuration and applications software assumed to be characteristic of the centralized quad-redundant avionics systems defined in Phase B.

  9. Real-Time Management of Multimodal Streaming Data for Monitoring of Epileptic Patients.

    PubMed

    Triantafyllopoulos, Dimitrios; Korvesis, Panagiotis; Mporas, Iosif; Megalooikonomou, Vasileios

    2016-03-01

    A new generation of healthcare is represented by wearable health monitoring systems, which provide real-time monitoring of a patient's physiological parameters. It is expected that continuous ambulatory monitoring of vital signals will improve treatment of patients and enable proactive personal health management. In this paper, we present the implementation of a multimodal real-time system for epilepsy management. The proposed methodology is based on a data streaming architecture and efficient management of a big flow of physiological parameters. The performance of this architecture is examined for varying spatial resolutions of the recorded data.
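The core of such a streaming architecture is a per-channel buffer that fills at its own sampling rate and is analyzed once a full window is available. A minimal sketch of this idea follows; the class, channel names, and window sizes are illustrative and not taken from the paper:

```python
from collections import deque

class StreamWindow:
    """Fixed-length sliding window over one physiological channel."""
    def __init__(self, size):
        self.size = size
        self.buf = deque(maxlen=size)  # old samples drop off automatically

    def push(self, sample):
        """Add a sample; return True once a full window is ready for analysis."""
        self.buf.append(sample)
        return len(self.buf) == self.size

    def mean(self):
        """A stand-in for whatever per-window analysis the system runs."""
        return sum(self.buf) / len(self.buf)

# Channels sampled at different rates share the same interface.
eeg = StreamWindow(size=4)
hr = StreamWindow(size=2)

for sample in [1.0, 2.0, 3.0, 4.0]:
    ready = eeg.push(sample)
print(ready, eeg.mean())  # True 2.5
```

Each channel can then be drained independently, which is one simple way to cope with the varying resolutions the abstract mentions.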

  10. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  11. Laboratory Animal Management Assistant (LAMA): a LIMS for active research colonies.

    PubMed

    Milisavljevic, Marko; Hearty, Taryn; Wong, Tony Y T; Portales-Casamar, Elodie; Simpson, Elizabeth M; Wasserman, Wyeth W

    2010-06-01

    Laboratory Animal Management Assistant (LAMA) is an internet-based system for tracking large laboratory mouse colonies. It has a user-friendly interface with powerful search capabilities that ease day-to-day tasks such as tracking breeding cages and weaning litters. LAMA was originally developed to manage hundreds of new mouse strains generated by a large functional genomics program, the Pleiades Promoter Project ( http://www.pleiades.org ). The software system has proven to be highly flexible, suitable for diverse management approaches to mouse colonies. It allows custom tagging and grouping of animals, simplifying project-specific handling and access to data. Finally, LAMA was developed in close collaboration with mouse technicians to ease the transition from paper- or Excel-based management systems to computerized tracking, allowing data export in a popular spreadsheet format and automatic printing of cage cards. LAMA is an open-access software tool, freely available to the research community at http://launchpad.net/mousedb .

  12. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines rule-based expert system technology with traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to compare the simulation results against the flight data and to suggest methods for correcting the problem. Results are then presented in the form of graphs and tables.

  13. Life Sciences MIS

    NASA Technical Reports Server (NTRS)

    Dittman, R. A.; Marks, V.

    1983-01-01

    The Management Information System (MIS) provides the Life Sciences Projects Division at Johnson Space Center with an automated system for project management. MIS utilizes a Tektronix 4027 color graphics display terminal and its form-fillout capability. The user interfaces with the MIS data base through a series of forms.

  14. Space station data system analysis/architecture study. Task 2: Options development, DR-5. Volume 2: Design options

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The primary objective of Task 2 is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This includes: (1) the establishment of option categories that are most likely to influence Space Station Data System (SSDS) definition; (2) the identification of preferred options in each category; and (3) the characterization of these options with respect to performance attributes, constraints, cost and risk. This volume contains the options development for the design category. This category comprises alternative structures, configurations and techniques that can be used to develop designs that are responsive to the SSDS requirements. The specific areas discussed are software, including data base management and distributed operating systems; system architecture, including fault tolerance and system growth/automation/autonomy and system interfaces; time management; and system security/privacy. Also discussed are space communications and local area networking.

  15. Road landslide information management and forecasting system base on GIS.

    PubMed

    Wang, Wei Dong; Du, Xiang Gang; Xie, Cui Ming

    2009-09-01

    Taking account of the characteristics of road geological hazards and their supervision, it is very important to develop a Road Landslide Information Management and Forecasting System based on a Geographic Information System (GIS). The paper presents the system objectives, functions, component modules and key techniques in the procedure of system development. The system, based on the spatial and attribute information of road geological hazards, was developed and applied in Guizhou, a province of China where there are numerous and typical landslides. Using the system, a communications manager can visually query all road landslide information based on the regional road network or on the monitoring network of an individual landslide. Furthermore, the system, integrated with mathematical prediction models and GIS's strength in spatial analysis, can assess and predict landslide development according to field monitoring data. Thus, it can efficiently assist road construction and management units in making decisions to control landslides and reduce human vulnerability.

  16. System and method of self-properties for an autonomous and automatic computer environment

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments self health/urgency data and environment health/urgency data may be transmitted externally from an autonomic element. Other embodiments may include transmitting the self health/urgency data and environment health/urgency data together on a regular basis similar to the lub-dub of a heartbeat. Yet other embodiments may include a method for managing a system based on the functioning state and operating status of the system, wherein the method may include processing received signals from the system indicative of the functioning state and the operating status to obtain an analysis of the condition of the system, generating one or more stay-alive signals based on the functioning state and the operating status of the system, transmitting the stay-alive signal, transmitting self health/urgency data, and transmitting environment health/urgency data. Still other embodiments may include an autonomic element that includes a self monitor, a self adjuster, an environment monitor, and an autonomic manager.
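The heartbeat pattern the claims describe, in which self and environment health data are emitted together and a stay-alive signal is derived from the analyzed condition, can be sketched as follows. The scoring function and threshold are invented for illustration and are not part of the patent:

```python
def analyze_condition(self_health, env_health):
    """Combine self and environment health into one condition score.
    Taking the minimum is an illustrative choice: the element is only as
    healthy as its worst-off aspect."""
    return min(self_health, env_health)

def heartbeat(self_health, env_health, threshold=0.5):
    """One 'lub-dub': emit both health readings together, plus a
    stay-alive signal derived from the analyzed condition."""
    condition = analyze_condition(self_health, env_health)
    return {
        "self_health": self_health,
        "env_health": env_health,
        "stay_alive": condition >= threshold,
    }

beat = heartbeat(0.9, 0.7)
# beat["stay_alive"] is True; a degraded environment (e.g. 0.2) would drop it
```

An autonomic manager would consume these beats on a regular schedule and react when the stay-alive signal is withheld.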

  17. Central Data Processing System (CDPS) user's manual: Solar heating and cooling program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The software and data base management system required to assess the performance of solar heating and cooling systems installed at multiple sites are presented. The instrumentation data associated with these systems are collected, processed, and presented in a form which supports continuity of performance evaluation across all applications. The CDPS consists of three major elements: a communication interface computer, a central data processing computer, and a performance evaluation data base. Users of the performance data base are identified, and procedures for operation and guidelines for software maintenance are outlined. The manual also defines the output capabilities of the CDPS in support of external users of the system.

  18. Guidelines for the creation and management of geographic data bases within a GIS environment, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, R.C.; Land, M.L.; McCord, R.A.

    1994-07-01

    A Geographic Information System (GIS) provides the ability to manage and analyze all types of geographic and environmental information. It performs these functions by providing the tools necessary to capture, access, analyze, and display spatially referenced information in graphic and tabular form. Typical data elements that can be visualized in a map might include roads, buildings, topography, streams, waste areas, monitoring wells, groundwater measurements, soil sample results, landcover, and demography. The intent of this document is to provide data management and quality assurance (QA) guidelines that will aid implementors and users of GIS technology and data bases. These guidelines should be useful in all phases of GIS activities, including the following: (1) project planning, (2) data collection and generation, (3) data maintenance and management, (4) QA and standards, (5) project implementation, (6) spatial analysis and data interpretation, (7) data transformation and exchange, and (8) output and reporting. The daily use of desktop GIS technologies within Martin Marietta Energy Systems, Inc. (Energy Systems), is a relatively new phenomenon, but usage is increasing rapidly. Large volumes of GIS-related data are now being collected and analyzed for the U.S. Department of Energy (DOE) Oak Ridge Reservation (ORR) and its facilities. It is very important to establish and follow good data management practices for GIS. In the absence of such practices, data-related problems will overwhelm users for many years. In comparison with traditional data processing and software life-cycle management, there is limited information on GIS QA techniques, data standards and structures, configuration control, and documentation practices. This lack of information partially results from the newness of the technology and the complexity of spatial information and geographic analysis techniques as compared to typical tabular data management.

  19. Geoscience information integration and visualization research of Shandong Province, China based on ArcGIS engine

    NASA Astrophysics Data System (ADS)

    Xu, Mingzhu; Gao, Zhiqiang; Ning, Jicai

    2014-10-01

    To improve the access efficiency of geoscience data, efficient data models and storage solutions should be used. In existing storage solutions, geoscience data are usually classified by format or coordinate system; when the data volume is large, this is not conducive to searching for geographic features. In this study, a geographic information integration system for Shandong province, China was developed based on ArcGIS Engine, .NET, and SQL Server technology. It uses the Geodatabase spatial data model and ArcSDE to organize and store spatial and attribute data, and establishes a geoscience database of Shandong. Seven function modules were designed: map browse, database and subject management, layer control, map query, spatial analysis and map symbolization. Because data can be browsed and managed by geoscience subject, the system is convenient for geographic researchers and decision-making departments to use.

  20. An expert system prototype for aiding in the development of software functional requirements for NASA Goddard's command management system: A case study and lessons learned

    NASA Technical Reports Server (NTRS)

    Liebowitz, Jay

    1986-01-01

    At NASA Goddard, the role of the command management system (CMS) is to transform general requests for spacecraft operations into detailed operational plans to be uplinked to the spacecraft. The CMS is part of the NASA Data System, which entails the downlink of science and engineering data from NASA near-earth satellites to the user, and the uplink of command and control data to the spacecraft. Presently, it takes one to three years, with meetings once or twice a week, to determine functional requirements for CMS software design. As an alternative approach to the present technique of developing CMS software functional requirements, an expert system prototype was developed to aid in this function. Specifically, the knowledge base was formulated through interactions with domain experts, and was then linked to an existing expert system application generator called 'Knowledge Engineering System (Version 1.3).' Knowledge base development focused on four major steps: (1) develop the problem-oriented attribute hierarchy; (2) determine the knowledge management approach; (3) encode the knowledge base; and (4) validate, test, certify, and evaluate the knowledge base and the expert system prototype as a whole. Backcasting was used to validate and test the expert system prototype. Knowledge refinement, evaluation, and implementation procedures of the expert system prototype were then transacted.

  1. Northeast Artificial Intelligence Consortium (NAIC). Volume 12. Computer Architecture for Very Large Knowledge Bases

    DTIC Science & Technology

    1990-12-01

    data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.

  2. Study on Global GIS architecture and its key technologies

    NASA Astrophysics Data System (ADS)

    Cheng, Chengqi; Guan, Li; Lv, Xuefeng

    2009-09-01

    Global GIS (G2IS) is a system that supports huge data processing and direct global manipulation on a global grid based on a spheroid or ellipsoid surface. Based on the global subdivision grid (GSG), a Global GIS architecture is presented in this paper, taking advantage of computer cluster theory, space-time integration technology and virtual reality technology. The Global GIS system architecture is composed of five layers: a data storage layer, data representation layer, network and cluster layer, data management layer and data application layer. Within this architecture, a four-level protocol framework and a three-layer data management pattern are designed for the organization, management and publication of spatial information. Three core supporting technologies, namely computer cluster theory, space-time integration technology and virtual reality technology, and their application patterns in Global GIS are introduced in detail. The primary ideas of Global GIS presented in this paper represent an important development trend for GIS.

  4. Microvax-based data management and reduction system for the regional planetary image facilities

    NASA Technical Reports Server (NTRS)

    Arvidson, R.; Guinness, E.; Slavney, S.; Weiss, B.

    1987-01-01

    Presented is a progress report for the Regional Planetary Image Facilities (RPIF) prototype image data management and reduction system being jointly implemented by Washington University and the USGS, Flagstaff. The system will consist of a MicroVAX with a high capacity (approx 300 megabyte) disk drive, a compact disk player, an image display buffer, a videodisk player, USGS image processing software, and SYSTEM 1032 - a commercial relational database management package. The USGS, Flagstaff, will transfer their image processing software including radiometric and geometric calibration routines, to the MicroVAX environment. Washington University will have primary responsibility for developing the database management aspects of the system and for integrating the various aspects into a working system.

  5. Electronic data collection and management system for global adult tobacco survey.

    PubMed

    Pujari, Sameer J; Palipudi, Krishna M; Morton, Jeremy; Levinsohn, Jay; Litavecz, Steve; Green, Michael

    2012-01-01

    Portable handheld computers and electronic data management systems have been used for national surveys in many high-income countries; however, their use in developing countries has been challenging due to varying geographical, economic, climatic, political and cultural environments. In order to monitor and measure global adult tobacco use, the World Health Organization and the US Centers for Disease Control and Prevention initiated the Global Adult Tobacco Survey, a nationally representative household survey of adults, 15 years of age or older, using a standard core questionnaire, sample design, and data collection and management procedures. The Survey has been conducted in 14 low- and middle-income countries, using an electronic data collection and management system. This paper describes implementation of the electronic data collection system and associated findings. The Survey was based on a comprehensive data management protocol, to enable standardized, globally comparable, high quality data collection and management. It included adaptation to specific country needs, selection of appropriate handheld hardware devices, use of open-source software, building country capacity, and providing technical support. In its first phase, the Global Adult Tobacco Survey was successfully conducted between 2008 and 2010, using an electronic data collection and management system for interviews in 302,800 households in 14 countries. More than 2,644 handheld computers were fielded and over 2,634 fieldworkers, supervisors and monitors were trained to use them. Questionnaires were developed and programmed in 38 languages and scripts. The global hardware failure rate was < 1% and data loss was almost 0%.
Electronic data collection and management systems can be used effectively for conducting nationally representative surveys, particularly in low- and middle-income countries, irrespective of geographical, climatic, political and cultural environments, and capacity-building at the country level is an important vehicle for Health System Strengthening.

  6. A web platform for integrated surface water - groundwater modeling and data management

    NASA Astrophysics Data System (ADS)

    Fatkhutdinov, Aybulat; Stefan, Catalin; Junghanns, Ralf

    2016-04-01

    Model-based decision support systems are considered to be reliable and time-efficient tools for resource management in various hydrology-related fields. However, searching for and acquiring the required data, preparing the data sets for simulations, and post-processing, visualizing and publishing the simulation results often require significantly more work and time than performing the modeling itself. The purpose of the developed software is to combine data storage facilities, data processing instruments and modeling tools in a single platform, which can potentially reduce the time required for performing simulations and hence for decision making. The system is developed within the INOWAS (Innovative Web Based Decision Support System for Water Sustainability under a Changing Climate) project. The platform integrates spatially distributed catchment-scale rainfall-runoff, infiltration and groundwater flow models with data storage, processing and visualization tools. The concept is implemented as a web-GIS application built on free and open-source components, including the PostgreSQL database management system, the Python programming language for modeling, MapServer for visualizing and publishing the data, and OpenLayers for building the user interface. The configuration of the system allows data input, storage, pre- and post-processing and visualization to be performed in a single uninterrupted workflow. In addition, realizing the decision support system as a web service makes it easy to retrieve and share data sets as well as simulation results over the internet, which gives significant advantages for collaborative work on projects and can significantly increase the usability of the decision support system.
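As an illustration of the kind of catchment-scale rainfall-runoff model such a platform wraps, here is a generic single-linear-reservoir sketch. The recession coefficient `k` and the input series are invented for illustration; this is not the INOWAS model itself:

```python
def linear_reservoir(rainfall, k=0.3, storage0=0.0):
    """Step a single linear-reservoir rainfall-runoff model.

    At each time step the rainfall depth is added to storage, and the
    runoff released is a fixed fraction k of the updated storage.
    """
    storage, runoff = storage0, []
    for p in rainfall:
        storage += p          # rainfall fills the reservoir
        q = k * storage       # outflow proportional to storage
        storage -= q
        runoff.append(q)
    return runoff

q = linear_reservoir([10.0, 0.0, 0.0])
# A 10 mm pulse gives q[0] = 3.0, then a recession: roughly 2.1, then 1.47
```

In a platform like the one described, a routine of this shape would read its forcing series from the database and write the simulated hydrograph back for visualization.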

  7. A Management Information System Design for a General Museum. Museum Data Bank Research Report No. 12.

    ERIC Educational Resources Information Center

    Scholtz, Sandra

    A management information system (MIS) is applied to a medium sized general museum to reflect the actual curatorial/registration functions. The recordkeeping functions of loan and conservation activities are examined since they too can be effectively handled by computer and constitute a complementary data base to the accession/catalog information.…

  8. Implementing a low-cost web-based clinical trial management system for community studies: a case study.

    PubMed

    Geyer, John; Myers, Kathleen; Vander Stoep, Ann; McCarty, Carolyn; Palmer, Nancy; DeSalvo, Amy

    2011-10-01

    Clinical trials with multiple intervention locations and a single research coordinating center can be logistically difficult to implement. Increasingly, web-based systems are used to provide clinical trial support, with many commercial, open-source, and proprietary systems in use. New web-based tools are available that can be customized without programming expertise to deliver web-based clinical trial management and data collection functions. The goal was to demonstrate the feasibility of utilizing low-cost configurable applications to create a customized web-based data collection and study management system for a five-intervention-site randomized clinical trial establishing the efficacy of providing evidence-based treatment via teleconferencing to children with attention-deficit hyperactivity disorder. The sites are small communities that would not usually be included in traditional randomized trials. A major goal was to develop a database that participants could access from computers in their home communities for direct data entry. Discussed is the selection process leading to the identification and utilization of a cost-effective and user-friendly set of tools capable of customization for data collection and study management tasks. An online assessment collection application, a template-based web portal creation application, and a web-accessible Access 2007 database were selected and customized to provide the following features: scheduling appointments, administering and monitoring online secure assessments, issuing subject incentives, and securely transmitting electronic documents between sites. Each tool was configured by users with limited programming expertise. As of June 2011, the system has successfully been used by 125 participants in 5 communities (who have completed 536 sets of assessment questionnaires), 8 community therapists, and 11 research staff at the research coordinating center.
Total automation of processes is not possible with the current set of tools, as each is only loosely affiliated with the others, creating some inefficiency. This system is best suited to investigations with a single data source, e.g., psychosocial questionnaires. New web-based applications can be used by investigators with limited programming experience to implement user-friendly, efficient, and cost-effective tools for multi-site clinical trials with small, distant communities. Such systems allow the inclusion in research of populations that are not usually involved in clinical trials.

  9. [Information technology for the management of health care data: the EPIweb project].

    PubMed

    Vittorini, Pierpaolo; Necozione, Stefano; di Orio, Ferdinando

    2005-01-01

    In the US, the Centers for Disease Control and Prevention has increased the permeability of computer science technologies, in order to achieve better and more efficient management of health care data. In this context, the present paper discusses a web-based information system called EPIweb. This system allows researchers to select the centers for data entry, collect and elaborate health care data, produce technical reports and discuss results. The system aims to be easy to use, fully configurable and particularly suitable for the management of multicenter studies. The paper presents the EPIweb features, proposes a sample system run, and concludes with a discussion of both the advantages and the possible improvements and extensions.

  10. Monitoring and controlling ATLAS data management: The Rucio web user interface

    NASA Astrophysics Data System (ADS)

    Lassnig, M.; Beermann, T.; Vigne, R.; Barisits, M.; Garonne, V.; Serfon, C.

    2015-12-01

    The monitoring and controlling interfaces of the previous data management system, DQ2, followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2 and the increased volume of managed information. This interface encompasses both a monitoring and a controlling component, and allows easy integration of user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency; this includes the use of technologies like ActiveMQ or Nagios. Second, the analysis of the data into information is massively parallel due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human and programmatic access, making it easy to access selected parts of the information both from constrained frontends like web browsers and from remote services. This contribution details the reasons for these principles and the design choices taken. Additionally, the implementation, the interactions with external systems, and an evaluation of the system in production, from both a technological and a user perspective, conclude this contribution.
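The first design principle, asynchronous collection so that latency stays off the producers' path, can be sketched with a standard-library queue standing in for a message broker such as ActiveMQ. The names and event shape are illustrative, not Rucio code:

```python
import queue
import threading

events = queue.Queue()   # decouples producers from the slower store
store = []

def collector():
    """Consume monitoring events asynchronously, off the producers' path."""
    while True:
        event = events.get()
        if event is None:    # sentinel value signals shutdown
            break
        store.append(event)  # in a real system: write to the database

worker = threading.Thread(target=collector)
worker.start()

# Producers (e.g. transfer daemons) only pay the cost of a put(); any
# expensive storage work happens later on the collector thread.
for i in range(3):
    events.put({"source": "daemon", "seq": i})
events.put(None)
worker.join()
print(len(store))  # 3
```

The same decoupling is what lets a broker-based system absorb bursts from many producers without blocking them on storage latency.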

  11. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative data base, are discussed.

  12. Improved Discovery and Re-Use of Oceanographic Data through a Data Management Center

    NASA Astrophysics Data System (ADS)

    Rauch, S.; Allison, M. D.; Groman, R. C.; Chandler, C. L.; Galvarino, C.; Gegg, S. R.; Kinkade, D.; Shepherd, A.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

    Effective use and reuse of ecological data are not only contingent upon those data being well-organized and documented, but also upon data being easily discoverable and accessible by others. As funding agency and publisher policies begin placing more emphasis on, or even requiring, sharing of data, some researchers may feel overwhelmed in determining how best to manage and share their data. Other researchers may be frustrated by the inability to easily find data of interest, or they may be hesitant to use datasets that are poorly organized and lack complete documentation. In all of these scenarios, the data management and sharing process can be facilitated by data management centers, as demonstrated by the Biological and Chemical Oceanography Data Management Office (BCO-DMO). BCO-DMO was created in 2006 to work with investigators to manage data from research funded by the Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and the Division of Polar Programs (PLR) Antarctic Organisms and Ecosystems Program of the US National Science Foundation (NSF). BCO-DMO plays a role throughout the data lifecycle, from the early stages of offering support to researchers in developing data management plans to the final stages of depositing data in a permanent archive. An overarching BCO-DMO goal is to provide open access to data through a system that enhances data discovery and reuse. Features have been developed that allow users to find data of interest, assess fitness for purpose, and download the data for reuse. Features that enable discovery include both text-based and geospatial-based search interfaces, as well as a semantically-enabled faceted search [1]. BCO-DMO data managers work closely with the contributing investigators to develop robust metadata, an essential component to enable data reuse. 
The metadata, which describe data acquisition and processing methods, instrumentation, and parameters, are enhanced by the mapping of local vocabulary terms to community accepted controlled vocabularies. This use of controlled vocabularies allows for terms to be defined unambiguously, so users of the data know definitively what parameter was measured and/or analyzed and what instruments were used. Users can further assess fitness for use by visualizing data in the geospatial interface in various ways depending on the data type. Both the text- and geospatial-based interfaces provide easy access to view the datasets and download them in multiple formats. The BCO-DMO system, including the geospatial interface, relies largely on the use of open source software and tools. The data themselves are made available via the JGOFS/GLOBEC system [2], a distributed object-oriented data management system. Researchers contributing data to BCO-DMO benefit from the data management and sharing resources. Researchers looking for data can use BCO-DMO's system to find and use data of interest. This role of the data management center in facilitating discovery and reuse is one that can be extended to other research disciplines for the benefit of the science community. References: [1] Maffei, A. et al. 2011. Open Standards and Technologies in the S2S Framework. Abstract IN31A-1435 presented at AGU Fall Meeting, San Francisco, CA, 7 Dec 2011. [2] Flierl, G.R. et al. 2004. JGOFS Data System Overview, http://globec.whoi.edu/globec-dir/doc/datasys/jgsys.html.
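
    As a rough sketch of the vocabulary-mapping idea described above, the snippet below maps local parameter names to controlled-vocabulary terms; the table entries and names are illustrative, not BCO-DMO's actual vocabularies.

```python
# Illustrative mapping of local parameter names to controlled-vocabulary terms.
# The vocabulary entries below are examples, not BCO-DMO's actual term lists.
LOCAL_TO_CONTROLLED = {
    "temp":  "sea_water_temperature",
    "sal":   "sea_water_salinity",
    "chl_a": "mass_concentration_of_chlorophyll_a_in_sea_water",
}

def map_parameters(local_names):
    """Return (mapped, unmapped) for a dataset's local parameter names."""
    mapped, unmapped = {}, []
    for name in local_names:
        term = LOCAL_TO_CONTROLLED.get(name.lower())
        if term:
            mapped[name] = term
        else:
            unmapped.append(name)  # left for curation by a data manager
    return mapped, unmapped

mapped, unmapped = map_parameters(["Temp", "sal", "pH"])
```

    Terms that cannot be mapped automatically are returned separately, mirroring the role of the data manager in curating metadata with the contributing investigator.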

  13. Centralized Data Management in a Multicountry, Multisite Population-based Study.

    PubMed

    Rahman, Qazi Sadeq-ur; Islam, Mohammad Shahidul; Hossain, Belal; Hossain, Tanvir; Connor, Nicholas E; Jaman, Md Jahiduj; Rahman, Md Mahmudur; Ahmed, A S M Nawshad Uddin; Ahmed, Imran; Ali, Murtaza; Moin, Syed Mamun Ibne; Mullany, Luke; Saha, Samir K; El Arifeen, Shams

    2016-05-01

    A centralized data management system was developed for data collection and processing for the Aetiology of Neonatal Infection in South Asia (ANISA) study. ANISA is a longitudinal cohort study involving neonatal infection surveillance and etiology detection at multiple sites in South Asia. The primary goal of designing such a system was to collect and store data from different sites in a standardized way so the data could be pooled for analysis. We designed the data management system centrally and implemented it to enable data entry at individual sites. The system uses validation rules and audit checks that reduce errors. The study sites employ a dual data entry method to minimize keystroke errors. They upload collected data weekly to a central server via the internet to create a pooled central database. Any inconsistent data identified in the central database are flagged and corrected after discussion with the relevant site. The ANISA Data Coordination Centre in Dhaka provides technical support for operating, maintaining, and updating the data management system centrally. Password-protected login identifications and audit trails are maintained for the management system to ensure the integrity and safety of stored data. Centralized management of the ANISA database allows the use of common data capture forms (DCFs), adapted to site-specific contextual requirements. The DCFs and data entry interfaces allow on-site data entry. This reduces the workload, as DCFs do not need to be shipped to a single location for entry. It also improves data quality, as all data collected for ANISA go through the same quality check and cleaning process.
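
    The dual data entry step can be illustrated with a small sketch that compares two independent entry passes of the same record and flags mismatched fields for source-document review (the field names are hypothetical, not ANISA's actual DCF fields):

```python
def reconcile_dual_entry(first_pass, second_pass):
    """Compare two independent entries of one record; return mismatched fields."""
    discrepancies = {}
    for field in first_pass.keys() | second_pass.keys():
        a, b = first_pass.get(field), second_pass.get(field)
        if a != b:
            discrepancies[field] = (a, b)
    return discrepancies

entry_a = {"child_id": "AN-0012", "weight_kg": 2.8, "temp_c": 37.9}
entry_b = {"child_id": "AN-0012", "weight_kg": 2.8, "temp_c": 39.7}
flags = reconcile_dual_entry(entry_a, entry_b)
# mismatched fields would be sent back to the site for verification
```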

  14. Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan

    USGS Publications Warehouse

    Falconer, Allan; Cross, Matthew D.; Orr, Donald G.

    1990-01-01

    Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resources and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system (GIS), and global positioning system (GPS) procedures is used in this study to meet these requirements. Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resources maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket-sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.

  15. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
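
    As a much-simplified illustration of the relational-to-graph idea (not the paper's actual 3EG algorithm), foreign-key pairs from a toy visit table can be turned into edges, after which a "shared providers" query becomes a one-hop set intersection rather than a SQL join:

```python
from collections import defaultdict

# Toy "normalized" visit table: (patient_id, provider_id) foreign-key pairs.
visits = [("p1", "dr_a"), ("p1", "dr_b"), ("p2", "dr_a"), ("p3", "dr_c")]

# Build a bipartite graph by turning each foreign-key reference into an edge,
# loosely mirroring the idea of deriving a graph from relational structure.
graph = defaultdict(set)
for patient, provider in visits:
    graph[patient].add(provider)
    graph[provider].add(patient)

def shared_providers(p1, p2):
    """Providers seen by both patients: a join in SQL, one hop here."""
    return graph[p1] & graph[p2]

assert shared_providers("p1", "p2") == {"dr_a"}
```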

  16. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    NASA Astrophysics Data System (ADS)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and the geospatial industries in particular. Traditionally, the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  17. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was establishing standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  18. The Chandra X-ray Center data system: supporting the mission of the Chandra X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Cresitello-Dittmar, Mark; Doe, Stephen; Evans, Ian; Fabbiano, Giuseppina; Germain, Gregg; Glotfelty, Kenny; Hall, Diane; Plummer, David; Zografou, Panagoula

    2006-06-01

    The Chandra X-ray Center Data System provides end-to-end scientific software support for Chandra X-ray Observatory mission operations. The data system includes the following components: (1) observers' science proposal planning tools; (2) science mission planning tools; (3) science data processing, monitoring, and trending pipelines and tools; and (4) data archive and database management. A subset of the science data processing component is ported to multiple platforms and distributed to end-users as a portable data analysis package. Web-based user tools are also available for data archive search and retrieval. We describe the overall architecture of the data system and its component pieces, and consider the design choices and their impacts on maintainability. We discuss the many challenges involved in maintaining a large, mission-critical software system with limited resources. These challenges include managing continually changing software requirements and ensuring the integrity of the data system and resulting data products while being highly responsive to the needs of the project. We describe our use of COTS and OTS software at the subsystem and component levels, our methods for managing multiple release builds, and adapting a large code base to new hardware and software platforms. We review our experiences during the life of the mission so-far, and our approaches for keeping a small, but highly talented, development team engaged during the maintenance phase of a mission.

  19. A data and information system for processing, archival, and distribution of data for global change research

    NASA Technical Reports Server (NTRS)

    Graves, Sara J.

    1994-01-01

    Work on this project focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAACs from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS and has contributed to the definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders, were provided.
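
    The single-interface, many-inventories pattern of such a distributed IMS can be sketched as a fan-out query over several in-memory stand-ins for archive inventories; the site names, granule ids, and instrument names below are invented for illustration:

```python
# In-memory stand-ins for several archive inventories (names are illustrative).
INVENTORIES = {
    "daac_msfc": [
        {"granule": "g001", "instrument": "AMSU-A"},
        {"granule": "g002", "instrument": "SSM/I"},
    ],
    "daac_other": [
        {"granule": "g101", "instrument": "AMSU-A"},
    ],
}

def federated_search(instrument):
    """Fan one query out to every inventory and merge the hits,
    mimicking a single user interface over many archives."""
    hits = []
    for site, inventory in INVENTORIES.items():
        for record in inventory:
            if record["instrument"] == instrument:
                hits.append({"site": site, **record})
    return hits

hits = federated_search("AMSU-A")
```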

  20. A Method to Calculate and Analyze Residents' Evaluations by Using a Microcomputer Data-Base Management System.

    ERIC Educational Resources Information Center

    Mills, Myron L.

    1988-01-01

    A system developed for more efficient evaluation of graduate medical students' progress uses numerical scoring and a microcomputer database management system as an alternative to manual methods to produce accurate, objective, and meaningful summaries of resident evaluations. (Author/MSE)

  1. Patients’ Data Management System Protected by Identity-Based Authentication and Key Exchange

    PubMed Central

    Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti

    2017-01-01

    A secure and distributed framework for the management of patients’ information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients’ data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed. PMID:28362328

  2. Patients' Data Management System Protected by Identity-Based Authentication and Key Exchange.

    PubMed

    Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti

    2017-03-31

    A secure and distributed framework for the management of patients' information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients' data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed.
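
    The keyed-hash patient identification mentioned above can be sketched with a standard HMAC; the key and identifier below are placeholders, not the scheme's actual parameters, and a real deployment would manage keys through the private key generator described in the paper:

```python
import hashlib
import hmac

SITE_KEY = b"demo-key-not-for-production"  # placeholder shared secret

def patient_tag(patient_id: str) -> str:
    """Keyed-hash message authentication code over a patient identifier."""
    return hmac.new(SITE_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

def verify(patient_id: str, tag: str) -> bool:
    """Constant-time comparison, so the check itself leaks nothing."""
    return hmac.compare_digest(patient_tag(patient_id), tag)

tag = patient_tag("patient-0042")   # value stored on the NFC wristband
assert verify("patient-0042", tag)
assert not verify("patient-0043", tag)
```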

  3. Development of geotechnical analysis and design modules for the Virginia Department of Transportation's geotechnical database.

    DOT National Transportation Integrated Search

    2005-01-01

    In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...

  4. Developing an electronic system to manage and track emergency medications.

    PubMed

    Hamm, Mark W; Calabrese, Samuel V; Knoer, Scott J; Duty, Ashley M

    2018-03-01

    The development of a Web-based program to track and manage emergency medications with radio frequency identification (RFID) is described. At the Cleveland Clinic, medication kit restocking records and dispense locations were historically documented using a paper record-keeping system. The Cleveland Clinic investigated options to replace the paper-based tracking logs with a Web-based program that could track the real-time location and inventory of emergency medication kits. Vendor collaboration with a board of pharmacy (BOP) compliance inspector and pharmacy personnel resulted in the creation of a dual barcoding system using medication and pocket labels. The Web-based program was integrated with a Cleveland Clinic-developed asset tracking system using active RFID tags to give the real-time location of the medication kit. The Web-based program and the asset tracking system allowed identification of kits nearing expiration or containing recalled medications. Conversion from a paper-based system to a Web-based program began in October 2013. After 119 days, data were evaluated to assess the success of the conversion. Pharmacists spent an average of 27 minutes per day approving medication kits during the postimplementation period versus 102 minutes daily using the paper-based system, representing a 74% decrease in pharmacist time spent on this task. Prospective reports are generated monthly to allow the manager to assess the expected workload and adjust staffing for the next month. Implementation of a BOP-approved Web-based system for managing and tracking emergency medications with RFID integration decreased pharmacist review time, minimized compliance risk, and increased access to real-time data. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
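
    The expiration and recall checks that such a Web-based program performs can be sketched as a simple scan over kit records; the kit names, lot numbers, and dates below are invented:

```python
from datetime import date, timedelta

recalled_lots = {"LOT-889"}  # hypothetical recall list
kits = [
    {"kit": "K1", "lot": "LOT-123", "expires": date(2018, 4, 15)},
    {"kit": "K2", "lot": "LOT-889", "expires": date(2019, 1, 1)},
]

def kits_needing_attention(kits, today, window_days=30):
    """Flag kits containing recalled medications or expiring within the window."""
    flagged = []
    for k in kits:
        if k["lot"] in recalled_lots:
            flagged.append((k["kit"], "recalled lot"))
        elif k["expires"] <= today + timedelta(days=window_days):
            flagged.append((k["kit"], "expiring soon"))
    return flagged

flags = kits_needing_attention(kits, today=date(2018, 3, 20))
```

    In the deployed system the same idea is driven by the RFID location feed, so a flagged kit can also be found on the floor in real time.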

  5. Managing patients' wait time in specialist out-patient clinic using real-time data from existing queue management and ADT systems.

    PubMed

    Ju, John Chen; Gan, Soon Ann; Tan Siew Wee, Justine; Huang Yuchi, Peter; Mei Mei, Chan; Wong Mei Mei, Sharon; Fong, Kam Weng

    2013-01-01

    In major cancer centers, heavy patient loads and multiple registration stations can cause significant wait times and result in patient complaints. Real-time patient journey data and visual displays are useful tools in hospital patient queue management. This paper demonstrates how we capture patient queue data without deploying any tracking devices, and how we convert those data into useful patient journey information to understand where interventions are likely to be most effective. During system development, considerable effort was spent on resolving data discrepancies and balancing accuracy against system performance. A web-based dashboard to display real-time information and a framework for data analysis were also developed to facilitate our clinics' operation. Results show our system could eliminate more than 95% of data capturing errors and has improved patient wait time data accuracy since it was deployed.
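
    Deriving wait times from existing registration and queue-call timestamps, without any tracking devices, reduces to timestamp arithmetic over the event feeds; a minimal sketch with invented events:

```python
from datetime import datetime

def wait_minutes(events):
    """Wait time per patient from (id, registered, called) timestamp triples,
    a simplified stand-in for the queue-management and ADT event feeds."""
    waits = {}
    for pid, registered, called in events:
        waits[pid] = (called - registered).total_seconds() / 60
    return waits

events = [
    ("P1", datetime(2013, 5, 6, 9, 0), datetime(2013, 5, 6, 9, 40)),
    ("P2", datetime(2013, 5, 6, 9, 5), datetime(2013, 5, 6, 10, 0)),
]
waits = wait_minutes(events)  # {"P1": 40.0, "P2": 55.0}
```

    Aggregating these per-station values over a day is what feeds the dashboard view of where the queue actually backs up.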

  6. Total Quality Management of Information System for Quality Assessment of Pesantren Using Fuzzy-SERVQUAL

    NASA Astrophysics Data System (ADS)

    Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal

    2018-02-01

    This research proposed a model combining a Total Quality Management (TQM) approach with a fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM was implemented as quality management oriented toward customer satisfaction and involving all stakeholders. The SERVQUAL model was used to measure service quality along five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. Input data consisted of indicator data and quality assessment aspects, which were processed into service quality assessment questionnaires for the Pesantren using the fuzzy method to obtain a service quality score. The process consisted of the following steps: entering dimension and questionnaire data into the database system; filling in questionnaires through the system; having the system calculate fuzzification, defuzzification, and the gap between the quality expected and the quality received by service recipients; and computing each dimension's rating to show quality refinement priorities. The rating of each quality dimension was then displayed on the system dashboard so that users could see the information. From the system that was built, the tangible dimension had the highest gap, -0.399, and thus needs to be prioritized for evaluation and refinement.
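
    A minimal sketch of the fuzzification, defuzzification, and gap steps, assuming triangular fuzzy numbers on a 5-point Likert scale; the membership bounds below are illustrative, not the paper's actual parameters:

```python
# Triangular fuzzy numbers (a, b, c) for a 5-point Likert scale (illustrative).
TFN = {1: (0, 1, 2), 2: (1, 2, 3), 3: (2, 3, 4), 4: (3, 4, 5), 5: (4, 5, 5)}

def defuzzify(scores):
    """Fuzzify responses as TFNs, average them, take the triangle centroid."""
    n = len(scores)
    a = sum(TFN[s][0] for s in scores) / n
    b = sum(TFN[s][1] for s in scores) / n
    c = sum(TFN[s][2] for s in scores) / n
    return (a + b + c) / 3

def servqual_gap(perceived, expected):
    """Negative gap: perceptions fall short of expectations for the dimension."""
    return defuzzify(perceived) - defuzzify(expected)

gap = servqual_gap(perceived=[3, 4, 3], expected=[4, 5, 4])
```

    Computing this gap per dimension and ranking the results is what surfaces the refinement priority (the tangible dimension, in the study above).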

  7. Proceedings of the National Conference on Energy Resource Management. Volume 1: Techniques, Procedures and Data Bases

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O. (Editor); Schiffman, Y. M. (Editor)

    1982-01-01

    Topics dealing with the integration of remotely sensed data with geographic information system for application in energy resources management are discussed. Associated remote sensing and image analysis techniques are also addressed.

  8. Satellite control system nucleus for the Brazilian complete space mission

    NASA Astrophysics Data System (ADS)

    Yamaguti, Wilson; Decarvalhovieira, Anastacio Emanuel; Deoliveira, Julia Leocadia; Cardoso, Paulo Eduardo; Dacosta, Petronio Osorio

    1990-10-01

    The nucleus of the satellite control system for the Brazilian data collecting and remote sensing satellites is described. The system is based on Digital Equipment Corporation computers and the VAX/VMS operating system. The nucleus provides access control, system configuration, event management, history file management, time synchronization, wall display control, and X.25 data communication network access facilities. The architecture of the nucleus and its main implementation aspects are described, and the implementation experience acquired is considered.

  9. The design and implementation of hydrographical information management system (HIMS)

    NASA Astrophysics Data System (ADS)

    Sui, Haigang; Hua, Li; Wang, Qi; Zhang, Anming

    2005-10-01

    With the development of hydrographical work and information techniques, a large variety of hydrographical information, including electronic charts, documents, and other materials, is widely used, and the traditional management mode and techniques have become unsuitable for the development of the Chinese Marine Safety Administration Bureau (CMSAB). How to manage all kinds of hydrographical information has become an important and urgent problem. A number of advanced techniques, including GIS, RS, spatial database management, and VR, are introduced to solve these problems. Some design principles and key techniques of the HIMS are illustrated in detail, including a mixed mode based on B/S, C/S, and stand-alone computer architectures; multi-source and multi-scale data organization and management; multi-source data integration and diverse visualization of digital charts; and efficient security control strategies. Based on the above ideas and strategies, an integrated system named the Hydrographical Information Management System (HIMS) was developed. The HIMS has been applied in the Shanghai Marine Safety Administration Bureau and has received good evaluations.

  10. Smart Information Management in Health Big Data.

    PubMed

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in a big data setting and their extraction in order to provide needed real-time intelligence. The purpose of the present study is to highlight the design and implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files simulating a NoSQL database and, on the other hand, the extraction of information based on a lookup table and cache mechanism. In health big data, the SIMS aims at the identification of new therapies and approaches to delivering care.
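
    The flat-file-plus-lookup-table design can be sketched as a line-oriented store whose lookup table maps record ids to byte offsets, with a naive cache in front; the file layout and field names below are assumptions for illustration, not the SIMS's actual format:

```python
import json
import os
import tempfile

class FlatFileStore:
    """Flat file simulating a NoSQL store: one JSON document per line.
    `offsets` is the lookup table (id -> byte offset); `cache` holds reads."""

    def __init__(self, path):
        self.path = path
        self.offsets = {}
        self.cache = {}
        with open(path, "rb") as f:
            while True:
                pos = f.tell()
                line = f.readline()
                if not line:
                    break
                self.offsets[json.loads(line)["id"]] = pos

    def get(self, record_id):
        if record_id in self.cache:          # cache hit: no disk access
            return self.cache[record_id]
        with open(self.path, "rb") as f:     # cache miss: seek via lookup table
            f.seek(self.offsets[record_id])
            record = json.loads(f.readline())
        self.cache[record_id] = record
        return record

# Build a tiny store with two anonymous records (contents invented).
fd, path = tempfile.mkstemp(suffix=".jsonl")
with os.fdopen(fd, "w") as f:
    f.write(json.dumps({"id": "r1", "diagnosis": "A10"}) + "\n")
    f.write(json.dumps({"id": "r2", "diagnosis": "B20"}) + "\n")
store = FlatFileStore(path)
record = store.get("r2")   # first read comes from disk, then stays cached
```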

  11. [Discussion on developing a data management plan and its key factors in clinical study based on electronic data capture system].

    PubMed

    Li, Qing-na; Huang, Xiu-ling; Gao, Rui; Lu, Fang

    2012-08-01

    Data management has a significant impact on the quality control of clinical studies. Every clinical study should have a data management plan to provide overall work instructions and ensure that all of these tasks are completed according to the Good Clinical Data Management Practice (GCDMP). Meanwhile, the data management plan (DMP) is an auditable document requested by regulatory inspectors and must be written in a manner that is realistic and of high quality. The significance of the DMP, the minimum standards and best practices provided by the GCDMP, the main contents of a DMP based on electronic data capture (EDC), and some key factors of the DMP influencing the quality of a clinical study are elaborated in this paper. Specifically, a DMP generally consists of 15 parts: the approval page; the protocol summary; roles and training; timelines; database design, creation, maintenance, and security; data entry; data validation; quality control and quality assurance; the management of external data; serious adverse event data reconciliation; coding; database lock; data management reports; the communication plan; and abbreviated terms. Among them, the following three are regarded as the key factors: designing a standardized database for the clinical study, entering data in a timely manner, and cleansing data efficiently. In the last part of this article, the authors also analyze the problems in clinical research on traditional Chinese medicine using EDC systems and put forward suggestions for improvement.

  12. An Analytics-Based Approach to Managing Cognitive Load by Using Log Data of Learning Management Systems and Footprints of Social Media

    ERIC Educational Resources Information Center

    Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru

    2015-01-01

    Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…

  13. Management Information System for ESD Program Offices.

    DTIC Science & Technology

    1978-03-01

    Management Information System (MIS) functional requirements for the ESD Program Office are defined in terms of the Computer-Aided Design and Specification Tool. The development of the computer data base and a description of the MIS structure are included in the report. This report addresses management areas such as cost/budgeting, scheduling, tracking capabilities, and ECP

  14. A Laboratory-Based System for Managing and Distributing Publically Funded Geochemical Data in a Collaborative Environment

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Brown, A.; Liffers, M.

    2015-12-01

    Publicly funded laboratories have a responsibility to generate, archive and disseminate analytical data to the research community. Laboratory managers know, however, that a long tail of analytical effort never escapes researchers' thumb drives once they leave the lab. This work reports on a research data management project (Digital Mineralogy Library) where integrated hardware and software systems automatically archive and deliver analytical data and metadata to institutional and community data portals. The scientific objective of the DML project was to quantify the modal abundance of heavy minerals extracted from key lithological units in Western Australia. The selected analytical platform was a TESCAN Integrated Mineral Analyser (TIMA) that uses EDS-based mineral classification software to image and quantify mineral abundance and grain size at micron scale resolution. The analytical workflow used a bespoke laboratory information management system (LIMS) to orchestrate: (1) the preparation of grain mounts with embedded QR codes that serve as enduring links between physical samples and analytical data, (2) the assignment of an International Geo Sample Number (IGSN) and Digital Object Identifier (DOI) to each grain mount via the System for Earth Sample Registration (SESAR), (3) the assignment of a DOI to instrument metadata via Research Data Australia, (4) the delivery of TIMA analytical outputs, including spatially registered mineralogy images and mineral abundance data, to an institutionally-based data management server, and (5) the downstream delivery of a final data product via a Google Maps interface such as the AuScope Discovery Portal. The modular design of the system permits the networking of multiple instruments within a single site or multiple collaborating research institutions. 
Although sharing analytical data does provide new opportunities for the geochemistry community, the creation of an open data network requires: (1) adopting open data reporting standards and conventions, (2) requiring instrument manufacturers and software developers to deliver and process data in formats compatible with open standards, and (3) public funding agencies to incentivise researchers, laboratories and institutions to make their data open and accessible to consumers.
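
    The enduring link between a physical grain mount and its identifiers can be sketched as a small registry whose QR payload carries the local id and IGSN; all identifier values below are made up for illustration, and in the actual workflow the IGSN and DOI are minted by external registries:

```python
from dataclasses import dataclass

@dataclass
class GrainMount:
    local_id: str   # laboratory's own sample id
    igsn: str       # minted by a sample registry in the real workflow
    doi: str        # minted by a DOI service in the real workflow

REGISTRY = {}

def register(mount: GrainMount) -> str:
    """Record the mount and return the QR payload embedded in the epoxy."""
    REGISTRY[mount.local_id] = mount
    return f"{mount.local_id}|{mount.igsn}"

# All identifier values are invented placeholders.
payload = register(GrainMount("CU-0001", "IEXXX0001", "10.0000/example"))
mount = REGISTRY[payload.split("|")[0]]   # scanning the QR code recovers the record
```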

  15. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    NASA Astrophysics Data System (ADS)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.
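
    The versioning mechanism used for synchronization can be sketched as a change log with monotonically increasing versions, from which clients pull only the changes they have not yet seen; the tree attributes below are invented, and the real system works over GML-based models:

```python
class VersionedStore:
    """Central hub: every update gets a new version and is logged for sync."""

    def __init__(self):
        self.version = 0
        self.log = []        # (version, tree_id, changed attributes)
        self.current = {}    # latest state per tree

    def update(self, tree_id, **attrs):
        self.version += 1
        self.current.setdefault(tree_id, {}).update(attrs)
        self.log.append((self.version, tree_id, dict(attrs)))
        return self.version

    def changes_since(self, version):
        """What a client must apply to catch up from its last known version."""
        return [entry for entry in self.log if entry[0] > version]

store = VersionedStore()
store.update("t1", height_m=17.2, crown_m3=41.0)
v = store.update("t2", height_m=9.4)
store.update("t1", height_m=17.5)   # e.g. a growth-simulation step
delta = store.changes_since(v)      # only t1's later change
```

    Keeping the full log rather than just the latest state is what lets the "both directions" idea work: a client can reconstruct past states as well as fetch new ones.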

  16. Migration of the Japanese healthcare enterprise from a financial to integrated management: strategy and architecture.

    PubMed

    Akiyama, M

    2001-01-01

    The Hospital Information System (HIS) has been positioned as the hub of the healthcare information management architecture. In Japan, the billing system assigns an "insurance disease name" to performed exams based on the diagnosis type. Departmental systems provide localized, departmental services, such as order receipt and diagnostic reporting, but do not provide patient demographic information. This arrangement has many problems. The departmental systems' terminals and the HIS's terminals are not integrated. Duplicate data entry introduces errors and increases workloads. Order and exam data managed by the HIS can be sent to the billing system, but departmental data cannot usually be entered. Additionally, billing systems usually keep departmental data for only a short time before it is deleted. The billing system provides payment based on what is entered and is oriented towards diagnoses. Most importantly, the system is geared towards generating billing reports rather than providing high-quality patient care. The role of the application server is that of a mediator between system components. Data and events generated by system components are sent to the application server, which routes them to appropriate destinations. It also records all system events, including state changes to clinical data, access of clinical data and so on. Finally, the Resource Management System identifies all system resources available to the enterprise. The departmental systems are responsible for managing data and clinical processes at a departmental level. The client interacts with the system via the application server, which provides a general set of system-level functions. The system is implemented using current technologies, CORBA and HTTP. System data is collected by the application server and assembled into XML documents for delivery to clients.
Clients can access these URLs using standard HTTP clients, since each department provides an HTTP-compliant web server. We have implemented an integrated system communicating via CORBA middleware, consisting of an application server, an endoscopy departmental server, a pathology departmental server and a wrapped legacy HIS. We have found that this new approach solves the problems outlined earlier. It provides the services needed to ensure that data is never lost and is always available, that events that occur in the hospital are always captured, and that resources are managed and tracked effectively. Finally, it reduces costs, raises efficiency, increases the quality of patient care, and ultimately saves lives. We now plan to integrate all remaining hospital departments and, ultimately, all hospital functions.

  17. Sustainable management for the eastern Mediterranean coast of Turkey.

    PubMed

    Berberoglu, Süha

    2003-03-01

    The objective of this article is to propose a program of integrated coastal zone management to stimulate and guide sustainable development of the Mediterranean coastal zone of Turkey. Improved data collection, quality control, analysis, and data management will provide a firm basis for future scientific understanding of the East Mediterranean coast of Turkey and will support long-term management. Various innovative procedures were proposed for a promising ecosystem-based approach to managing coastal wetlands in the Mediterranean: remote data acquisition with new technologies; an environmental quality monitoring program that will provide a baseline for monitoring; linking a Geographic Information System (GIS) with natural resource management decision routines in the context of an operational wetlands, fisheries, and tourism management system; environmental sensitivity analysis to ensure that permitted developments are environmentally sustainable; and use of natural species to restore the wetlands and coastal dunes and sustain the system processes. The proposed management scheme will benefit the scientific community in the Mediterranean and the management/planning community in Eastern Turkey.

  18. Integration and management of massive remote-sensing data based on GeoSOT subdivision model

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Cheng, Chengqi; Chen, Bo; Meng, Li

    2016-07-01

    Owing to the rapid development of earth observation technology, the volume of spatial information is growing rapidly; improving query and retrieval speed over large, rich data sources is therefore an urgent need for remote-sensing data management systems. A global subdivision model, the geographic coordinate subdivision grid with one-dimension integer coding on 2n-tree (GeoSOT), which we propose as a solution, has been used in data management organizations. However, because a spatial object may cover several grids, considerable data redundancy occurs when such data are stored in relational databases. To solve this redundancy problem, we combined the subdivision model with a spatial array database containing an inverted index and propose an improved approach for integrating and managing massive remote-sensing data. By adding a spatial code column in array format to a database, spatial information in remote-sensing metadata can be stored and logically subdivided. We implemented our method in a Kingbase Enterprise Server database system and compared the results with the Oracle platform by simulating worldwide image data. Experimental results showed that our approach performed better than Oracle in terms of data integration and time and space efficiency. Our approach also offers an efficient storage management solution for existing storage centers and management systems.
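The core idea, storing the set of grid codes a scene covers as an array column and querying through an inverted index, can be sketched as follows. A toy one-degree grid stands in for the real GeoSOT 2n-tree codes, and the scene identifiers are invented:

```python
from collections import defaultdict

def grid_codes(lon_min, lat_min, lon_max, lat_max, cell=1.0):
    """Toy stand-in for GeoSOT coding: the 1-degree cells covering a
    bounding box (the real model uses hierarchical 2n-tree integer codes)."""
    codes = set()
    lon = int(lon_min // cell)
    while lon <= int(lon_max // cell):
        lat = int(lat_min // cell)
        while lat <= int(lat_max // cell):
            codes.add((lon, lat))
            lat += 1
        lon += 1
    return codes

index = defaultdict(set)  # inverted index: grid code -> scene ids

def ingest(scene_id, bbox):
    # One logical row per scene; its codes live in an array-valued column,
    # so no row duplication is needed even when a scene spans many cells.
    for c in grid_codes(*bbox):
        index[c].add(scene_id)

def query(bbox):
    """Scenes intersecting the query box = union over the box's grid cells."""
    hits = set()
    for c in grid_codes(*bbox):
        hits |= index[c]
    return hits

ingest("scene-A", (10.2, 45.1, 12.9, 46.8))
ingest("scene-B", (30.0, 10.0, 31.0, 11.0))
assert query((11.0, 45.0, 11.5, 45.5)) == {"scene-A"}
```

The inverted index turns a spatial intersection test into a handful of exact key lookups, which is what makes the array-column layout competitive with conventional spatial indexing.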

  19. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach.

    PubMed

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-06-01

    Traffic accidents are one of the more important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet, based on the inclusion criteria. Review of the literature continued until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police and emergency personnel, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 75%. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and the subsequent analysis of those data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data; providing statistical analyses in table, chart, and zoning formats; managing ill-structured problems; determining the cost-effectiveness of decisions; and prioritizing their implementation were the most important GIS capabilities that can be efficient in the management of traffic accident information.

  20. A Statewide Management Information System for the Control of Sexually Transmitted Diseases

    PubMed Central

    Fichtner, Ronald R.; Blount, Joseph H.; Spencer, Jack N.

    1983-01-01

    The persistent endemicity in the U.S. of infectious syphilis and gonorrhea, together with increasing diagnoses of gonococcal-related pelvic inflammatory disease in women and genital herpes infections, has intensified pressure on state and local VD control programs to measure, analyze, and interpret the distribution and transmission of these and other sexually transmitted diseases. In response, the Division of Venereal Disease Control (DVDC) of the Centers for Disease Control (CDC) is participating in the development of three state-wide, prototype sexually transmitted disease (STD) management information systems. A systems analysis of a typical state-wide STD control program indicated that timely, comprehensive informational support to public health managers and policy makers should be combined with rapid, direct support of program activities using an on-line, integrated data base computer system with telecommunications capability. This methodology uses a data base management system, a query facility for ad hoc inquiries, and custom design philosophies, but distinct hardware and software implementations.

  1. Data management plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of PassPort (PP) and PeopleSoft (PS) software, supports finance, supply and chemical management/Material Safety Data Sheet.

  2. Estuary Data Mapper: A coastal information system to propel emerging science and inform environmental management decisions

    EPA Science Inventory

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data aimed to promote research and aid in environmental management. The graphical user interface allows users to custom select and subset data based on their spatial and temporal interests giving them...

  3. Research on SaaS and Web Service Based Order Tracking

    NASA Astrophysics Data System (ADS)

    Jiang, Jianhua; Sheng, Buyun; Gong, Lixiong; Yang, Mingzhong

    To address order tracking across enterprises in a Dynamic Virtual Enterprise (DVE), a SaaS and web service based order tracking solution was designed by analyzing the order management process in a DVE. To realize the system, a SaaS-based architecture for managing data on the manufacturing states of order tasks was constructed, and a method for encapsulating application systems as web services was investigated. The order tracking process in the system was then described. Finally, the feasibility of this study was verified by the development of a prototype system.

  4. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or, more broadly, Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  5. Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.

    PubMed

    Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel

    2012-01-01

    Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. 
Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
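The view mechanism the authors relied on can be illustrated with a small in-memory emulation of a CouchDB-style map view over schema-less DICOM documents. The tag names and values below are invented examples; real CouchDB views are JavaScript functions executed by the server:

```python
# Each DICOM object becomes a schema-less document of tag -> value pairs;
# documents need not share the same set of tags.
docs = [
    {"_id": "1", "Modality": "MR", "StudyDate": "20110103", "SeriesDescription": "T1 axial"},
    {"_id": "2", "Modality": "MR", "StudyDate": "20110211", "SeriesDescription": "DWI"},
    {"_id": "3", "Modality": "CT", "StudyDate": "20110103"},  # heterogeneous: no SeriesDescription
]

def map_by_modality(doc):
    """Map step of a CouchDB-style view: emit (key, value) pairs per document."""
    yield (doc.get("Modality"), doc["_id"])

def run_view(docs, map_fn):
    """Build the view index: sorted (key, value) rows, queryable by key."""
    rows = [kv for d in docs for kv in map_fn(d)]
    return sorted(rows)

view = run_view(docs, map_by_modality)
mr_ids = [v for k, v in view if k == "MR"]
assert mr_ids == ["1", "2"]
```

Because the index is built from the emitted keys rather than from a fixed schema, any tag present in any document can be made queryable simply by emitting it, which is what makes this model a good fit for heterogeneous DICOM metadata.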

  6. An inventory of state natural resources information systems. [including Puerto Rico and the U.S. Virgin Islands

    NASA Technical Reports Server (NTRS)

    Martinko, E. A. (Principal Investigator); Caron, L. M.; Stewart, D. S.

    1984-01-01

    Data bases and information systems developed and maintained by state agencies to support planning and management of environmental and natural resources were inventoried for all 50 states, Puerto Rico, and the U.S. Virgin Islands. The information obtained is assembled into a computerized data base catalog which is thoroughly cross-referenced. Retrieval is possible by code, state, data base name, data base acronym, agency, computer, GIS capability, language, specialized software, data category name, geographic reference, data sources, and level of reliability. The 324 automated data bases identified are described.

  7. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  8. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  9. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  10. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  11. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  12. ARCADIA: a system for the integration of angiocardiographic data and images by an object-oriented DBMS.

    PubMed

    Pinciroli, F; Combi, C; Pozzi, G

    1995-02-01

    Use of data base techniques to store medical records has been going on for more than 40 years. Some aspects still remain unresolved, e.g., the management of textual data and image data within a single system. Object-orientation techniques applied to a database management system (DBMS) allow the definition of suitable data structures (e.g., to store digital images): some facilities allow the use of predefined structures when defining new ones. Currently available object-oriented DBMS, however, still need improvements both in the schema update and in the query facilities. This paper describes a prototype of a medical record that includes some multimedia features, managing both textual and image data. The prototype here described considers data from the medical records of patients subjected to percutaneous transluminal coronary artery angioplasty. We developed it on a Sun workstation with a Unix operating system and ONTOS as an object-oriented DBMS.

  13. Factual Approach in Decision Making - the Prerequisite of Success in Quality Management

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Škůrková Lestyánszka, Katarína

    2013-12-01

    In a quality management system, as in other managerial systems, effective decisions must always be based on data and information analysis, i.e. based on facts, in accordance with the factual approach principle in quality management. It is therefore necessary to measure and collect data and information about processes. The article presents the results of a survey focused on the application of the factual approach in decision making. It also offers suggestions for improving the application of this principle in business practice. This article was prepared using the research results of VEGA project No. 1/0229/08 "Perspectives of the quality management development in relation to the requirements of market in the Slovak Republic".

  14. The Cheetah data management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunz, P.F.; Word, G.B.

    1992-09-01

    Cheetah is a data management system based on the C programming language, with support for other languages. Its main goal is to transfer data between memory and I/O streams in a general way. The streams are either associated with disk files or are network data streams. Cheetah provides optional convenience functions to assist in the management of C structures. Cheetah streams are self-describing, so that general-purpose applications can fully understand an incoming stream. This information can be used to display the data in an incoming stream to the user of an interactive general application, complete with variable names and optional comments.
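The self-describing stream concept can be sketched in Python: each record carries a header describing its own field names and binary layout, so a generic reader needs no prior schema. The framing below is invented for illustration and is not Cheetah's actual wire format:

```python
import io
import json
import struct

def write_record(stream, record):
    """Self-describing framing: a length-prefixed JSON header carrying the
    field names and packing format precedes the packed values."""
    names = list(record)
    fmt = "<" + "d" * len(names)  # toy layout: every field is a float64
    header = json.dumps({"fields": names, "format": fmt}).encode()
    stream.write(struct.pack("<I", len(header)))
    stream.write(header)
    stream.write(struct.pack(fmt, *record.values()))

def read_record(stream):
    """A generic reader: the header alone tells it how to decode the payload."""
    (hlen,) = struct.unpack("<I", stream.read(4))
    desc = json.loads(stream.read(hlen))
    values = struct.unpack(desc["format"], stream.read(struct.calcsize(desc["format"])))
    return dict(zip(desc["fields"], values))

buf = io.BytesIO()
write_record(buf, {"energy": 12.5, "theta": 0.31})
buf.seek(0)
assert read_record(buf) == {"energy": 12.5, "theta": 0.31}
```

The reader never hard-codes a structure definition, which is the property that lets a general-purpose application display any incoming stream complete with its variable names.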

  15. Multimodal visualization interface for data management, self-learning and data presentation.

    PubMed

    Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M

    2006-10-01

    Multimodal visualization software, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in radiology are integrated into the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture; updates of the system for custom applications are therefore possible.

  16. Study on data model of large-scale urban and rural integrated cadastre

    NASA Astrophysics Data System (ADS)

    Peng, Liangyong; Huang, Quanyi; Gao, Dequan

    2008-10-01

    Urban and Rural Integrated Cadastre (URIC) has been the subject of great interest in modern cadastre management. It is highly desirable to develop a rational data model for establishing a URIC information system. In this paper, firstly, the old cadastral management mode in China was introduced, its limitations were analyzed, and the concept of URIC and its development course in China were described. Afterwards, based on the requirements of cadastre management in developed regions, the goal of URIC and two key ideas for realizing it were proposed. Then, a conceptual management mode was studied and a data model of URIC was designed. At last, based on the raw data of a land use survey at a scale of 1:1000 and an urban conventional cadastral survey at a scale of 1:500 in Jiangyin city, a well-defined URIC information system was established according to the data model, and uniform management of land use, use rights, and landownership in urban and rural areas was successfully realized. The model's feasibility and practicability were thus demonstrated.

  17. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers, allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house.
The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
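The local/master database decoupling described above (storage keeps working when the master is unreachable, and records are forwarded independently) can be sketched with two SQLite databases. This is a toy model, not the system's actual SQL schema:

```python
import sqlite3

# Two in-memory databases stand in for a storage server's local DB and the master.
local = sqlite3.connect(":memory:")
master = sqlite3.connect(":memory:")
for db in (local, master):
    db.execute("CREATE TABLE study (uid TEXT PRIMARY KEY, patient TEXT)")
local.execute("ALTER TABLE study ADD COLUMN synced INTEGER DEFAULT 0")

def store(uid, patient):
    """Storage always succeeds locally; forwarding to the master happens later."""
    local.execute("INSERT INTO study (uid, patient) VALUES (?, ?)", (uid, patient))

def sync(master_reachable=True):
    """Forward unsynced rows to the master; a harmless no-op if it is down."""
    if not master_reachable:
        return 0
    rows = local.execute("SELECT uid, patient FROM study WHERE synced = 0").fetchall()
    master.executemany("INSERT OR IGNORE INTO study VALUES (?, ?)", rows)
    local.execute("UPDATE study SET synced = 1 WHERE synced = 0")
    return len(rows)

store("1.2.3", "DOE^JOHN")
assert sync(master_reachable=False) == 0  # master down: local storage still works
assert sync() == 1                        # later, the record is forwarded
assert master.execute("SELECT COUNT(*) FROM study").fetchone()[0] == 1
```

Keeping a per-row sync flag on the local side is the simplest way to make forwarding idempotent and restartable, which is what the abstract's independence guarantee requires.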

  18. Upper Atmosphere Research Satellite (UARS) science data processing center implementation history

    NASA Technical Reports Server (NTRS)

    Herring, Ellen L.; Taylor, K. David

    1990-01-01

    NASA-Goddard is responsible for the development of a ground system for the Upper Atmosphere Research Satellite (UARS) observatory, whose launch is scheduled for 1991. This ground system encompasses a dedicated Central Data Handling Facility (CDHF); attention is presently given to the UARS organization's management of the software systems design and implementation phases for the CDHF. Also noted are integration and testing activities performed following software deliveries to the CDHF. The UARS project has an obvious requirement for a powerful and flexible data base management system; an off-the-shelf commercial system has been incorporated.

  19. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
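The idea of running complex queries as an integral part of the analysis can be illustrated with SQLite. The table layout and values below are invented; the paper's actual schema and DBMS may differ:

```python
import sqlite3

# In-memory stand-in for the relational store: one row per voxel time point.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE bold (
    subject TEXT, run INTEGER, voxel INTEGER, t INTEGER, signal REAL)""")
db.executemany(
    "INSERT INTO bold VALUES (?, ?, ?, ?, ?)",
    [("s01", 1, 7, t, 100.0 + t) for t in range(4)] +
    [("s02", 1, 7, t, 200.0 + t) for t in range(4)],
)

# A query as part of the analysis itself: per-subject mean signal for one
# voxel, something a binary- or text-file layout would need custom code for.
rows = db.execute("""
    SELECT subject, AVG(signal) FROM bold
    WHERE voxel = 7 GROUP BY subject ORDER BY subject
""").fetchall()
assert rows == [("s01", 101.5), ("s02", 201.5)]
```

Once the time series live in a relational store, sharing and parallel processing reduce to granting access and partitioning query ranges, rather than copying and re-parsing files.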

  20. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  1. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux and Python based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers.
The system provides online access, querying, visualization, and analysis of the hydrological data from several sources at one place. The study indicates that internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering and suggests that such a system can support advanced hydrological data access and analysis tools to improve utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS server.

  2. DataHub knowledge based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.

    1991-01-01

    Viewgraphs on DataHub knowledge based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.

  3. GIS model-based real-time hydrological forecasting and operation management system for the Lake Balaton and its watershed

    NASA Astrophysics Data System (ADS)

    Adolf Szabó, János; Zoltán Réti, Gábor; Tóth, Tünde

    2017-04-01

    Today, the most significant mission of decision makers on integrated water management issues is to carry out sustainable management, sharing resources between a variety of users and the environment under conditions of considerable uncertainty (such as climate, land-use and population change). In light of this increasing water management complexity, we consider that the most pressing need is to develop and implement up-to-date, GIS model-based, real-time hydrological forecasting and operation management systems to aid decision-making processes and improve water management. After years of research and development, HYDROInform Ltd. has developed an integrated, online IT system (DIWA-HFMS: DIstributed WAtershed - Hydrologic Forecasting & Modelling System) which is able to support a wide range of operational tasks in water resources management, such as forecasting, operation of lakes and reservoirs, water control and management, etc. Following a test period, the DIWA-HFMS has been implemented for Lake Balaton and its watershed (at 500 m resolution) at the Central-Transdanubian Water Directorate (KDTVIZIG). The significant pillars of the system are: - The DIWA (DIstributed WAtershed) hydrologic model, a 3D dynamic water-balance model that is distributed both in space and in its parameters, developed along combined principles but mostly based on physical foundations. DIWA integrates 3D soil, 2D surface, and 1D channel-hydraulic components. - A lakes-and-reservoirs operating component; - a radar-data integration module; - fully online data collection tools; - a scenario manager tool to create alternative scenarios; - an interactive, intuitive, highly graphical user interface. In Vienna, the main functions, operations and results management of the system will be presented.

  4. Natural resources information system.

    NASA Technical Reports Server (NTRS)

    Leachtenauer, J. C.; Woll, A. M.

    1972-01-01

    A computer-based Natural Resources Information System was developed for the Bureaus of Indian Affairs and Land Management. The system stores, processes and displays data useful to the land manager in the decision-making process. Emphasis is placed on the use of remote sensing as a data source. Data input consists of maps, imagery overlays, and on-site data. Maps and overlays are entered using a digitizer and stored as irregular polygons, lines and points. Processing functions include set intersection, union and difference, as well as area, length and value computations. Data output consists of computer tabulations and overlays prepared on a drum plotter.
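The overlay operations this record describes (set intersection, union and difference on map layers, plus area computation) can be sketched in a few lines. In this illustrative sketch each map layer is reduced to a set of grid cells; the layer names and cell size are hypothetical, not taken from the original system.

```python
# Map-overlay set operations, with each layer simplified to a set of
# grid-cell coordinates. Layer names and cell values are illustrative.
forest = {(0, 0), (0, 1), (1, 0), (1, 1)}
grazing = {(1, 1), (1, 2), (2, 2)}

intersection = forest & grazing   # cells present in both layers
union = forest | grazing          # cells present in either layer
difference = forest - grazing     # forest cells outside grazing land

# Area computation: cell count times the area of one cell
CELL_AREA_HA = 25.0               # hypothetical cell size in hectares
area_overlap_ha = len(intersection) * CELL_AREA_HA

print(sorted(intersection))       # [(1, 1)]
print(area_overlap_ha)            # 25.0
```

The original system stored irregular polygons rather than grid cells, but the set semantics of the overlay operations are the same.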

  5. A coastal information system to propel emerging science and inform environmental management decisions

    EPA Science Inventory

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data aimed to promote research and aid in environmental management. The graphical user interface allows users to custom select and subset data based on their spatial and temporal interests giving them...

  6. Medical Big Data Warehouse: Architecture and System Design, a Case Study: Improving Healthcare Resources Distribution.

    PubMed

    Sebaa, Abderrazak; Chikh, Fatima; Nouicer, Amina; Tari, AbdelKamel

    2018-02-19

    The huge increase in medical devices and clinical applications, which generate enormous amounts of data, has raised a big issue in managing, processing, and mining this massive amount of data. Indeed, traditional data warehousing frameworks cannot be effective when managing the volume, variety, and velocity of current medical applications. As a result, several data warehouses face many issues over medical data, and many challenges need to be addressed. New solutions have emerged, and Hadoop is one of the best examples; it can be used to process these streams of medical data. However, without an efficient system design and architecture, this performance will not be significant and valuable for medical managers. In this paper, we provide a short review of the literature on research issues of traditional data warehouses, and we present some important Hadoop-based data warehouses. In addition, a Hadoop-based architecture and a conceptual data model for designing a medical Big Data warehouse are given. In our case study, we provide implementation details of a Big Data warehouse based on the proposed architecture and data model on the Apache Hadoop platform to ensure an optimal allocation of health resources.

  7. Implementation and use of a microcomputer-based management information system to monitor dairy herd performance

    PubMed Central

    Lissemore, Kerry D.; Leslie, Ken E.; Menzies, Paula I.; Martin, S. Wayne; Meek, Alan H.; Etherington, Wayne G.

    1992-01-01

    A microcomputer-based herd management information system was implemented as part of the herd health program provided to 13 dairy clients by the Ontario Veterinary College, University of Guelph. The study was conducted over a two year period. Data were collected from on-farm event diaries, veterinary visit reports, and production testing information. Selected indices of reproduction, udder health, production, and heifer performance were reported. It was concluded that the implementation of a microcomputer-based information management system, operated as a bureau service, was feasible. However, limitations to the implementation in veterinary practice were identified. PMID:17423945

  8. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  9. Data warehousing: toward knowledge management.

    PubMed

    Shams, K; Farishta, M

    2001-02-01

    With rapid changes taking place in the practice and delivery of health care, decision support systems have assumed an increasingly important role. More and more health care institutions are deploying data warehouse applications as decision support tools for strategic decision making. By making the right information available at the right time to the right decision makers in the right manner, data warehouses empower employees to become knowledge workers with the ability to make the right decisions and solve problems, creating strategic leverage for the organization. Health care management must plan and implement a data warehousing strategy using a best-practice approach. Through the power of data warehousing, health care management can negotiate better managed-care contracts based on the ability to provide accurate data on case mix and resource utilization. Management can also save millions of dollars through the implementation of clinical pathways, better resource utilization, and changing physician behavior toward best practices based on evidence-based medicine.

  10. Design and implementation of information visualization system on science and technology industry based on GIS

    NASA Astrophysics Data System (ADS)

    Wu, Xiaofang; Jiang, Liushi

    2011-02-01

    In traditional science and technology information systems, usually only text and table forms are used to manage the data, and mathematical statistics methods are applied to analyze it; spatial analysis and management of the data are lacking. Therefore, GIS technology is introduced to visualize and analyze information data on the science and technology industry. Firstly, using Microsoft Visual Studio 2005 and ArcGIS Engine as the development platform, an information visualization system for the science and technology industry based on GIS is built, implementing various functions such as data storage and management, querying, statistics, chart analysis, and thematic map representation. It can show changes in science and technology information along the space and time axes intuitively. Then, science and technology data for Guangdong province are taken as experimental data and applied to the system. By considering factors such as humanities, geography, and economics, the situation and change tendency of science and technology information in different regions are analyzed and researched, and corresponding suggestions and methods are put forward to provide auxiliary support for the development of the science and technology industry in Guangdong province.

  11. The spatial data and knowledge gateways at the International Water Management Institute (IWMI)

    NASA Astrophysics Data System (ADS)

    Thenkabail, P. S.; Biradar, C. M.; Noojipady, P.; Islam, A.; Velpuri, M.; Vithanage, J.; Kulawardhana, W.; Li, Yuan Jie; Dheeravath, V.; Gunasinghe, S.; Alankara, R.

    2006-10-01

    In this paper we discuss spatial data and knowledge base (SDKB) gateway portals developed by the International Water Management Institute (IWMI). Our vision is to generate and/or facilitate easy and free access to state-of-the-art SDKB of excellence globally. Our mission is to make the SDKB accessible online, globally, for free. The IWMI data storehouse pathway (IWMIDSP; http://www.iwmidsp.org) is a pathfinder global public good (GPG) portal on remote sensing and GIS (RS/GIS) data and products with specific emphasis on river basin data, but also storing valuable data on nations, regions, and the world. A number of other specialty GPG portals have also been released. These include the Global map of irrigated area (http://www.iwmigiam.org), the Drought monitoring system for southwest Asia (http://dms.iwmi.org), the Tsunami satellite sensor data catalogue (http://tsdc.iwmi.org), and the Knowledge base system (KBS) for Sri Lanka (http://www.iwmikbs.org). The IWMIDSP has been the backbone of several other projects such as global irrigated area mapping, drought monitoring systems, wetlands, and knowledge base systems. A discussion of these pathfinder web portals follows.

  12. An Analysis of a Computerized System for Managing Curriculum Decisions and Tracking Student Progress in a Home-Based Pre-School Education Project.

    ERIC Educational Resources Information Center

    Lutz, John E.; And Others

    The degree of success of the computerized Child-Based Information System (CBIS) was analyzed in two areas--presenting, delivering, and managing a developmental curriculum; and recording, filing, and monitoring child tracking data, including requirements for Individualized Education Plans (IEP's). Preschool handicapped and high-risk children and…

  13. A Web-Based Multi-Database System Supporting Distributed Collaborative Management and Sharing of Microarray Experiment Information

    PubMed Central

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database, Web-based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all the multidisciplinary actors involved in spotted microarray experiments. PMID:17238488

  14. Operations system administration plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.E.

    The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. This includes systems that support finance, supply, chemical management, human resources and payroll activities on the Hanford Site. The Passport (PP) software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets (MSDS). The PeopleSoft (PS) software is an integrated application for General Ledger, Project Costing, Human Resources, Payroll, Benefits, and Training. The implementation of this set of products, as the first deliverable of the HANDI 2000 Project, is referred to as Business Management System (BMS) and MSDS.

  15. Improve wildlife species tracking—Implementing an enhanced global positioning system data management system for California condors

    USGS Publications Warehouse

    Waltermire, Robert G.; Emmerich, Christopher U.; Mendenhall, Laura C.; Bohrer, Gil; Weinzierl, Rolf P.; McGann, Andrew J.; Lineback, Pat K.; Kern, Tim J.; Douglas, David C.

    2016-05-03

    U.S. Fish and Wildlife Service (USFWS) staff in the Pacific Southwest Region and at the Hopper Mountain National Wildlife Refuge Complex requested technical assistance to improve their global positioning system (GPS) data acquisition, management, and archive in support of the California Condor Recovery Program. The USFWS deployed and maintained GPS units on individual Gymnogyps californianus (California condor) in support of long-term research and daily operational monitoring and management of California condors. The U.S. Geological Survey (USGS) obtained funding through the Science Support Program to provide coordination among project participants, provide GPS Global System for Mobile Communication (GSM) transmitters for testing, and compare GSM/GPS with existing Argos satellite GPS technology. The USFWS staff worked with private companies to design, develop, and fit condors with GSM/GPS transmitters. The Movebank organization, an online database of animal tracking data, coordinated with each of these companies to automatically stream their GPS data into Movebank servers and coordinated with USFWS to improve Movebank software for managing transmitter data, including proofing/error checking of incoming GPS data. The USGS arranged to pull raw GPS data from Movebank into the USGS California Condor Management and Analysis Portal (CCMAP) (https://my.usgs.gov/ccmap) for production and dissemination of a daily map of condor movements including various automated alerts. Further, the USGS developed an automatic archiving system for pulling raw and proofed Movebank data into USGS ScienceBase to comply with the Federal Information Security Management Act of 2002. This improved data management system requires minimal manual intervention resulting in more efficient data flow from GPS data capture to archive status. 
As a result of the project’s success, Pinnacles National Park and the Ventana Wildlife Society California condor programs became partners and adopted the same workflow, tracking, and data archive system. This GPS tracking data management model and workflow should be applicable and beneficial to other wildlife tracking programs.

  16. The Resource Manager of the ATLAS Trigger and Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Aleksandrov, I.; Avolio, G.; Lehmann Miotto, G.; Soloviev, I.

    2017-10-01

    The Resource Manager is one of the core components of the Data Acquisition system of the ATLAS experiment at the LHC. The Resource Manager marshals the right for applications to access resources which may exist in multiple but limited copies, in order to avoid conflicts due to program faults or operator errors. Access to resources is managed in a manner similar to what a lock manager would do in other software systems. All the available resources and their association to software processes are described in the Data Acquisition configuration database. The Resource Manager is queried about the availability of resources every time an application needs to be started. The Resource Manager’s design is based on a client-server model: it consists of the Resource Manager “server” application and the “client” shared library. The Resource Manager server implements all the needed functionality, while the Resource Manager client library provides remote access to the “server” (i.e., to allocate and free resources, and to query the status of resources). During the LHC’s Long Shutdown period, the Resource Manager’s requirements were reviewed in light of the experience gained during the LHC’s Run 1. As a consequence, the Resource Manager has undergone a full re-design and re-implementation cycle, reducing the code base by 40% with respect to the previous implementation. This contribution will focus on the way the design and the implementation of the Resource Manager leverage the new features available in the C++11 standard, and how the introduction of external libraries (like Boost multi-container) led to a more maintainable system.
Additionally, particular attention will be given to the technical solutions adopted to ensure the Resource Manager can sustain the typical request rates of the Data Acquisition system: about 30,000 requests in a time window of a few seconds, coming from more than 1000 clients.
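The lock-manager-style allocation the abstract describes (resources that exist in multiple but limited copies, allocated on request and freed on release) can be sketched as follows. This is an illustrative Python model under assumed semantics, not the ATLAS C++ implementation; the resource and client names are invented.

```python
import threading

class ResourceManager:
    """Toy lock-manager-style allocator: each resource exists in a
    limited number of copies, and an application must successfully
    allocate a copy before it may start. All names are illustrative."""

    def __init__(self, copies):
        # copies: mapping of resource name -> number of available copies
        self._free = dict(copies)
        self._held = {}            # (resource, client) -> count held
        self._lock = threading.Lock()

    def allocate(self, resource, client):
        with self._lock:
            if self._free.get(resource, 0) == 0:
                return False       # all copies in use: refuse the request
            self._free[resource] -= 1
            key = (resource, client)
            self._held[key] = self._held.get(key, 0) + 1
            return True

    def free(self, resource, client):
        with self._lock:
            key = (resource, client)
            if self._held.get(key, 0) == 0:
                raise ValueError("client does not hold this resource")
            self._held[key] -= 1
            self._free[resource] += 1

# Two copies of a hypothetical resource: a third requester is refused
# until one of the holders frees its copy.
rm = ResourceManager({"DAQ-link": 2})
assert rm.allocate("DAQ-link", "app-1")
assert rm.allocate("DAQ-link", "app-2")
assert not rm.allocate("DAQ-link", "app-3")   # both copies taken
rm.free("DAQ-link", "app-1")
assert rm.allocate("DAQ-link", "app-3")       # a copy is free again
```

Refusing over-allocation up front, rather than blocking, mirrors the conflict-avoidance role described in the abstract: a mis-started application simply fails to acquire the resource instead of corrupting it.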

  17. IPAD: Integrated Programs for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.

    1985-01-01

    Early work was performed to apply data base technology in support of the management of engineering data in the design and manufacturing environments. The principal objective of the IPAD project is to develop a computer software system for use in the design of aerospace vehicles. Two prototype systems are created for this purpose. Relational Information Manager (RIM) is a successful commercial product. The IPAD Information Processor (IPIP), a much more sophisticated system, is still under development.

  18. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This leads to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on postal codes is stable and easy to memorize, that two-dimensional coding based on direction and distance is easy to locate and memorize, and that the extended code enhances the extensibility and flexibility of the address geocoding.
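A minimal sketch of the three-part code structure described above (absolute position code, relative position code, extended code) might look like the following. The field widths, separators, and example values are invented for illustration; they are not the standard proposed in the paper.

```python
# Hypothetical three-part rural address code: an absolute position
# (postal code), a relative position (direction + distance from a
# reference point), and an extended code for sub-parcel features.
# The format "POSTAL-DNNNN-EE" is an invented illustration.

def encode_address(postal_code, direction, distance_m, extension="00"):
    """Compose a structured rural address code (illustrative format)."""
    return f"{postal_code}-{direction}{distance_m:04d}-{extension}"

def decode_address(code):
    """Split a structured address code back into its three parts."""
    absolute, relative, extended = code.split("-")
    return {
        "postal_code": absolute,       # absolute position code
        "direction": relative[0],      # relative position: direction
        "distance_m": int(relative[1:]),  # relative position: distance
        "extension": extended,         # extended code
    }

code = encode_address("100101", "N", 250, "07")
print(code)                  # 100101-N0250-07
print(decode_address(code))
```

Keeping the three parts in fixed-width, separable fields is what makes such codes both machine-parseable and easy for people to memorize, which is the property the paper's results emphasize.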

  19. [Development and application of information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province].

    PubMed

    Mao, Yuan-Hua; Li, Dong; Ning, An; Qiu, Ling; Xiong, Ji-Jie

    2011-04-01

    To develop an information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province. Based on Access 2003, the system was programmed in Visual Basic 6.0 and packaged with Setup Factory 8.0. In the system, advanced schistosomiasis data can be entered, printed, indexed, and statistically analyzed. The system can be operated and maintained easily and in a timely manner. The information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province was successfully developed.

  20. The Evidence-base for Using Ontologies and Semantic Integration Methodologies to Support Integrated Chronic Disease Management in Primary and Ambulatory Care: Realist Review. Contribution of the IMIA Primary Health Care Informatics WG.

    PubMed

    Liyanage, H; Liaw, S-T; Kuziemsky, C; Terry, A L; Jones, S; Soler, J K; de Lusignan, S

    2013-01-01

    Most chronic diseases are managed in primary and ambulatory care. The chronic care model (CCM) suggests that a wide range of community, technological, team and patient factors contribute to effective chronic disease management. Ontologies have the capability to enable formalised linkage of heterogeneous data sources as might be found across the elements of the CCM. To describe the evidence base for using ontologies and other semantic integration methods to support chronic disease management, we reviewed their use within and across the elements of the CCM. We report them using a realist review describing the context in which the mechanism was applied, and any outcome measures. Most evidence was descriptive, with an almost complete absence of empirical research and important gaps in the evidence base. We found some use of ontologies and semantic integration methods for community support of the medical home and for care in the community. Ubiquitous information technology (IT) and other IT tools were deployed to support self-management, use of shared registries, health behavioural models and knowledge discovery tools to improve delivery system design. Data quality issues restricted the use of clinical data; however, there was increased use of interoperable data and health system integration. Ontologies and semantic integration methods are emergent, with a limited evidence base for their implementation. However, they have the potential to integrate disparate community-wide data sources to provide the information necessary for effective chronic disease management.

  1. Networking and data management for health care monitoring of mobile patients.

    PubMed

    Amato, Giuseppe; Chessa, Stefano; Conforti, Fabrizio; Macerata, Alberto; Marchesi, Carlo

    2005-01-01

    The problem of medical devices and data integration in health care is discussed and a proposal for remote monitoring of patients based on recent developments in networking and data management is presented. In particular the paper discusses the benefits of the integration of personal medical devices into a Medical Information System and how wireless sensor networks and open protocols could be employed as building blocks of a patient monitoring system.

  2. Development of a Personal Digital Assistant (PDA) based client/server NICU patient data and charting system.

    PubMed

    Carroll, A E; Saluja, S; Tarczy-Hornoch, P

    2001-01-01

    Personal Digital Assistants (PDAs) offer clinicians the ability to enter and manage critical information at the point of care. Although PDAs have always been designed to be intuitive and easy to use, recent advances in technology have made them even more accessible. The ability to link data on a PDA (client) to a central database (server) allows for near-unlimited potential in developing point of care applications and systems for patient data management. Although many stand-alone systems exist for PDAs, none are designed to work in an integrated client/server environment. This paper describes the design, software and hardware selection, and preliminary testing of a PDA based patient data and charting system for use in the University of Washington Neonatal Intensive Care Unit (NICU). This system will be the subject of a subsequent study to determine its impact on patient outcomes and clinician efficiency.

  3. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE PAGES

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    2018-04-17

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps toward realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7X I/O performance improvement for scientific data.
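The idea of placing data objects in different physical organizations across a memory/storage hierarchy can be illustrated with a toy sketch. The tier names and the promotion/demotion policy below are assumptions for illustration only, not the placement strategy of the system described above.

```python
# Toy object abstraction over a multi-tier storage hierarchy: each data
# object records which tier it lives on, and a simple policy promotes
# frequently read objects toward faster tiers. Tier names, thresholds,
# and object names are hypothetical.

TIERS = ["burst-buffer", "ssd", "disk"]   # ordered fast -> slow

class DataObject:
    def __init__(self, name, size_mb, tier="disk"):
        self.name = name
        self.size_mb = size_mb
        self.tier = tier
        self.accesses = 0                  # reads since last placement pass

    def read(self):
        self.accesses += 1

def place(objects, hot_threshold=3):
    """Promote hot objects one tier up; demote untouched ones one down."""
    for obj in objects:
        idx = TIERS.index(obj.tier)
        if obj.accesses >= hot_threshold and idx > 0:
            obj.tier = TIERS[idx - 1]      # promote toward the fast tier
        elif obj.accesses == 0 and idx < len(TIERS) - 1:
            obj.tier = TIERS[idx + 1]      # demote toward the slow tier
        obj.accesses = 0                   # reset counter for next epoch

particles = DataObject("vpic/particles", 4096)   # hypothetical VPIC output
for _ in range(5):
    particles.read()
place([particles])
print(particles.tier)  # ssd
```

The point of the abstraction is that callers read and write named objects; where each object physically resides, and when it moves between tiers, is the storage layer's decision.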

  4. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps toward realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7X I/O performance improvement for scientific data.

  5. Workflow management in large distributed systems

    NASA Astrophysics Data System (ADS)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All the monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services, including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs, and automated management of remote services among a large set of grid facilities.

  6. AdiosStMan: Parallelizing Casacore Table Data System using Adaptive IO System

    NASA Astrophysics Data System (ADS)

    Wang, R.; Harris, C.; Wicenec, A.

    2016-07-01

    In this paper, we investigate the Casacore Table Data System (CTDS) used in the casacore and CASA libraries, and methods to parallelize it. CTDS provides a storage manager plugin mechanism for third-party developers to design and implement their own CTDS storage managers. With this in mind, we looked into various storage backend techniques that could enable parallel I/O for CTDS by implementing new storage managers. After carrying out benchmarks showing the excellent parallel I/O throughput of the Adaptive IO System (ADIOS), we implemented an ADIOS-based parallel CTDS storage manager. We then applied the CASA MSTransform frequency split task to verify the ADIOS storage manager. We also ran a series of performance tests to examine the I/O throughput in a massively parallel scenario.

  7. Hydrology of lakes in the Minneapolis-Saint Paul Metropolitan Area: A summary of available data

    USGS Publications Warehouse

    McBride, Mark S.

    1976-01-01

    SYSTEM 2000, a generalized computer data-base management system, was used to organize the data and prepare the tables. SYSTEM 2000 provides powerful capabilities for future retrieval and analyses of the data. The data base is available to potential users so that questions not implicitly anticipated in the preparation of the published tables can be answered readily, and the user can retrieve data in tabular or other forms to meet his particular needs.

  8. A practical approach for active camera coordination based on a fusion-driven multi-agent system

    NASA Astrophysics Data System (ADS)

    Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.

    2014-04-01

    In this paper, we propose a multi-agent system architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.

  9. A Computer-Managed Instruction Support System for Large Group Individualized Instruction.

    ERIC Educational Resources Information Center

    Countermine, Terry; Singh, Jane M.

    1977-01-01

    The Pennsylvania State University College of Education's Instruction Support System (ISS) was developed to manage the logistical operation of large group individualized competency-based instruction. Software and hardware charting, operational procedures, and data from student opinion questionnaires are cited. (RAO)

  10. Modeling and Simulation in Support of Testing and Evaluation

    DTIC Science & Technology

    1997-03-01

    contains standardized automated test methodology, synthetic stimuli and environments based on TECOM Ground Truth data and physics. The VPG is a distributed...Systems Acquisition Management (FSAM) coursebook, Defense Systems Management College, January 1994. Crocker, Charles M. “Application of the Simulation

  11. Data base systems in electronic design engineering

    NASA Technical Reports Server (NTRS)

    Williams, D.

    1980-01-01

    The concepts of an integrated design data base system (DBMS) as it might apply to an electronic design company are discussed. Data elements of documentation, project specifications, project tracking, firmware, software, electronic and mechanical design can be integrated and managed through a single DBMS. Combining the attributes of a DBMS data handler with specialized systems and functional data can provide users with maximum flexibility, reduced redundancy, and increased overall systems performance. Although some system overhead is lost due to redundancy in transitory data, it is believed the combination of the two data types is advisable rather than trying to do all data handling through a single DBMS.

  12. NOR-DMARD data management: implementation of data capture from electronic health records.

    PubMed

    Olsen, I C; Haavardsholm, E A; Moholt, E; Kvien, T K; Lie, E

    2014-01-01

    The use of electronic health records (EHR) is an essential part of modern health care, and electronic data capture (EDC) has become essential for managing clinical trials. Usually, these two entities are independent of each other, and transfer from one system to another is done manually. Our aim was to develop a method to capture data directly from the EHR system and transfer them into an EDC system for the NORwegian Disease-Modifying Anti-Rheumatic Drugs (NOR-DMARD) registry. All rheumatology departments contributing to NOR-DMARD had implemented a structured EHR system. Data are extracted locally and securely transferred to study data management once a month. Study data management then parses the data into a format readable by the EDC and imports them. Once the data are in the EDC, they are available to all authorized researchers and downloadable in a preferred format. From May 2012 to August 2014, almost 6400 visits in 3400 patients treated with biologics were successfully registered in the EDC system. Previously, NOR-DMARD used standard paper-based case report forms (CRFs), with a substantial cost for data entry. Setting up and maintaining the EDC system required some investment, but the amount saved by avoiding paper handling has made the shift to EDC profitable. In addition, gains have been made in administration and data quality. The transition from a paper-and-pencil format to a fully electronic data management system in NOR-DMARD has had obvious advantages regarding feasibility, cost, data quality and accessibility of the data.
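    The parse-and-import step described above amounts to mapping locally extracted EHR fields onto the schema the EDC expects. A minimal sketch, assuming hypothetical field names (the real NOR-DMARD extract format is not described in the abstract):

```python
def ehr_to_edc(ehr_rows, field_map):
    """Map extracted EHR records into the EDC's expected schema.
    field_map: {ehr_field: edc_field}; fields not in the map are dropped."""
    return [{edc: row[ehr] for ehr, edc in field_map.items()}
            for row in ehr_rows]

# Hypothetical field names on both sides, for illustration only.
ehr_extract = [{"pasient_id": "P001",
                "besok_dato": "2013-05-02",
                "das28": "3.1"}]
mapping = {"pasient_id": "subject_id",
           "besok_dato": "visit_date",
           "das28": "DAS28"}

edc_rows = ehr_to_edc(ehr_extract, mapping)
```

    In a monthly batch run, rows like `edc_rows` would then be written to the EDC's import format; keeping the mapping as data (not code) makes it easy to maintain as the EHR extract evolves.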

  13. Geophysical data base

    NASA Technical Reports Server (NTRS)

    Williamson, M. R.; Kirschner, L. R.

    1975-01-01

    A general data-management system that provides a random-access capability for large amounts of data is described. The system operates on a CDC 6400 computer using a combination of magnetic tape and disk storage. A FORTRAN subroutine package is provided to simplify the maintenance and use of the data.

  14. 78 FR 48430 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... Postsecondary Education Data System (IPEDS) 2013- 2016 AGENCY: Department of Education (ED), Institute of... Integrated Postsecondary Education Data System (IPEDS) is a web-based data collection system designed to... DEPARTMENT OF EDUCATION [Docket No.: ED-2013-ICCD-0029] Agency Information Collection Activities...

  15. Laboratory data base for isomer-specific determination of polychlorinated biphenyls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, T.R.; Campbell, R.D.; Stalling, D.L.

    1984-07-01

    A computer-assisted technique for quantitative determination of polychlorinated biphenyl isomers is described. PCB isomers were identified by use of a retention index system with n-alkyl trichloroacetates as retention index marker compounds. A laboratory data base system was developed to aid in editing and quantitation of data generated from capillary gas chromatographic data. Data base management was provided by computer programs written in DSM-11 (Digital Standard MUMPS) for the PDP-11 family of computers. 13 references, 4 figures, 2 tables.
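    The retention index system mentioned above is conventionally computed by linear interpolation between the two marker compounds that bracket a peak's retention time. A minimal sketch; the marker times and assigned index values below are illustrative, not the paper's data:

```python
import bisect

def retention_index(t, markers):
    """Retention index of a peak at retention time t, by linear
    interpolation between bracketing marker compounds.
    markers: list of (retention_time, assigned_index) pairs,
    sorted by retention time."""
    times = [m[0] for m in markers]
    i = bisect.bisect_right(times, t)
    if i == 0 or i == len(markers):
        raise ValueError("retention time outside marker range")
    (t0, ri0), (t1, ri1) = markers[i - 1], markers[i]
    return ri0 + (ri1 - ri0) * (t - t0) / (t1 - t0)

# Illustrative markers: two standards eluting at 5.0 and 10.0 min,
# assigned index values 800 and 900.
markers = [(5.0, 800.0), (10.0, 900.0)]
ri = retention_index(7.5, markers)
```

    Identifying an isomer then reduces to comparing its computed index against a library of reference indices, which is far more robust to run-to-run drift than raw retention times.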

  16. Solar heating and cooling technical data and systems analysis

    NASA Technical Reports Server (NTRS)

    Christensen, D. L.

    1976-01-01

    The acquisition and processing of selected parametric data for inclusion in a computerized Data Base using the Marshall Information Retrieval and Data System (MIRADS) developed by NASA-MSFC is discussed. This data base provides extensive technical and socioeconomic information related to solar energy heating and cooling on a national scale. A broadly based research approach was used to assist in the support of program management and the application of a cost-effective program for solar energy development and demonstration.

  17. GraphMeta: Managing HPC Rich Metadata in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Chen, Yong; Carns, Philip

    High-performance computing (HPC) systems face increasingly critical metadata management challenges, especially in the approaching exascale era. These challenges arise not only from exploding metadata volumes, but also from increasingly diverse metadata, which contains data provenance and arbitrary user-defined attributes in addition to traditional POSIX metadata. This ‘rich’ metadata is becoming critical to supporting advanced data management functionality such as data auditing and validation. In our prior work, we identified a graph-based model as a promising solution to uniformly manage HPC rich metadata due to its flexibility and generality. However, at the same time, graph-based HPC rich metadata management also introduces significant challenges to the underlying infrastructure. In this study, we first identify the challenges the underlying infrastructure must meet to support scalable, high-performance rich metadata management. Based on that, we introduce GraphMeta, a graph-based engine designed for this use case. It achieves performance scalability by introducing a new graph partitioning algorithm and a write-optimal storage engine. We evaluate GraphMeta under both synthetic and real HPC metadata workloads, compare it with other approaches, and demonstrate its advantages in terms of efficiency and usability for rich metadata management in HPC systems.
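    The graph model described here treats entities (files, jobs, users) as nodes carrying POSIX plus user-defined attributes, and relations such as provenance as edges, so queries like "what was this file derived from" become graph traversals. A toy sketch of that data model, not the GraphMeta engine itself (its partitioning and storage engine are the paper's contribution and are not reproduced here):

```python
from collections import defaultdict

class MetadataGraph:
    """Toy rich-metadata graph: nodes with arbitrary attributes,
    directed labeled edges for relations such as provenance."""
    def __init__(self):
        self.attrs = {}                    # node -> attribute dict
        self.edges = defaultdict(list)     # node -> [(relation, dst)]

    def add_node(self, node, **attributes):
        # Attributes may mix POSIX fields and user-defined ones.
        self.attrs[node] = attributes

    def add_edge(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def provenance(self, node, relation="produced_by"):
        """Follow provenance edges transitively from `node`."""
        out, stack = [], [node]
        while stack:
            for rel, dst in self.edges[stack.pop()]:
                if rel == relation:
                    out.append(dst)
                    stack.append(dst)
        return out

g = MetadataGraph()
g.add_node("out.dat", owner="alice", campaign="run-42")  # user-defined attr
g.add_edge("out.dat", "produced_by", "job1")
g.add_edge("job1", "produced_by", "in.dat")
```

    Auditing and validation then become traversals over this one uniform structure, instead of joins across separate provenance and attribute stores.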

  18. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    NASA Astrophysics Data System (ADS)

    Xin, Wang

    Based on an analysis of the shortcomings of existing systems, and after detailed investigation and research, an urban planning and management information system was designed as a three-tier structure, using a C/S (client/server) architecture over a LAN. The system's functions were designed according to the requirements of the architecture, together with the functional relationships between modules. The relevant interfaces were analyzed and designed, and a data storage solution was proposed. The design provides a viable building program for planning information systems in small and medium-sized cities.

  19. Mobile Location-Based Services for Trusted Information in Disaster Management

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Deriaz, Michel; Seigneur, Jean-Marc

    The goal of the present chapter is to provide location-based services for disaster management. The application involves services related to the safety of the people due to an unexpected event. The current prototype is implemented for a specific issue of disaster management which is road traffic control. The users can ask requests on cell phones or via Internet to the system and get an answer in a display or in textual form. The data are in a central database and every user can input data via virtual tags. The system is based on spatial messages which can be sent from any user to any other in a certain distance. In this way all the users and not a separate source provide the necessary information for a dangerous situation. To avoid any contamination problems we use trust security to check the input to the system and a trust engine model to provide information with a considerable reliability.

  20. A smartphone-based prototype system for incident/work zone management driven by crowd-sourced data.

    DOT National Transportation Integrated Search

    2015-02-01

    This project develops a smartphone-based prototype system that supplements the 511 system to improve its dynamic traffic routing service to state highway users under non-recurrent congestion. This system will save considerable time to provide cruci...

  1. A review on the application of deep learning in system health management

    NASA Astrophysics Data System (ADS)

    Khan, Samir; Yairi, Takehisa

    2018-07-01

    Given the advancements in modern technological capabilities, having an integrated health management and diagnostic strategy becomes an important part of a system's operational life-cycle. This is because it can be used to detect anomalies, analyse failures and predict the future state based on up-to-date information. By utilising condition data and on-site feedback, data models can be trained using machine learning and statistical concepts. Once trained, the logic for data processing can be embedded on on-board controllers whilst enabling real-time health assessment and analysis. However, this integration inevitably faces several difficulties and challenges for the community; indicating the need for novel approaches to address this vexing issue. Deep learning has gained increasing attention due to its potential advantages with data classification and feature extraction problems. It is an evolving research area with diverse application domains and hence its use for system health management applications must been researched if it can be used to increase overall system resilience or potential cost benefits for maintenance, repair, and overhaul activities. This article presents a systematic review of artificial intelligence based system health management with an emphasis on recent trends of deep learning within the field. Various architectures and related theories are discussed to clarify its potential. Based on the reviewed work, deep learning demonstrates plausible benefits for fault diagnosis and prognostics. However, there are a number of limitations that hinder its widespread adoption and require further development. Attention is paid to overcoming these challenges, with future opportunities being enumerated.

  2. A Cloud-based Infrastructure and Architecture for Environmental System Research

    NASA Astrophysics Data System (ADS)

    Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.

    2016-12-01

    The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provide a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community, providing unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data, and even data submission workflows in a straightforward fashion. The infrastructure minimizes large disruptions to current project-based data submission workflows, for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. It also eliminates the scalability problems of current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.

  3. Design and Development of Countylevel Information Management System for Diseases and Pests of Hebei Province on GIS

    NASA Astrophysics Data System (ADS)

    Cheng, Xiaoyan; Zhang, Xiaoli; Xie, Fangyi

    To make forestry field work more convenient, GIS and information-system technology are applied: GIS is used to build an information management system for forest diseases. To meet practical requirements, the system is implemented on a PDA, which supports data collection in the field. The system's major functions are the input and output of forest-disease data and the generation of standard reports, supported by the assistant functions of GIS. This article discusses the theory, process, and critical points of the information system. Beyond a general information management system, GIS and the PDA are introduced into the disease system, which combines map and attribute information and enables inventory data updates via PDA. The system is developed with VB and SuperMap Objects (SuperMap Company).

  4. Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication

    NASA Astrophysics Data System (ADS)

    Kadlec, J.

    2013-12-01

    The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is complicated set-up and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a webhosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture of a standards-compliant, lightweight and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet webhosting services. The architecture and design is validated by developing Hydroserver Lite: a PHP and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for Advancement of Hydrologic Sciences (CUAHSI) hydrologic information system. It is already being used for management of field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the Hydroserver Lite software has been installed on multiple different free and low-cost webhosting sites including Godaddy, Bluehost and 000webhost. The number of steps required to set-up the server is compared with the number of steps required to set-up other standards-compliant hydrologic data hosting systems including THREDDS, IstSOS and MapServer SOS.
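    At its core, the lightweight data hosting described above stores time series of observations keyed by site and variable. The sketch below uses Python's built-in sqlite3 standing in for MySQL, with a simplified table in the spirit of the CUAHSI Observations Data Model; the table and column names are assumptions for illustration, not the actual Hydroserver Lite schema:

```python
import sqlite3

# In-memory database stands in for the shared-host MySQL instance.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    site_code     TEXT,
    variable_code TEXT,
    datetime_utc  TEXT,
    value         REAL,
    units         TEXT)""")

def add_observation(site, variable, when, value, units):
    """Insert one observation (e.g. from a student field-data upload)."""
    conn.execute("INSERT INTO observations VALUES (?,?,?,?,?)",
                 (site, variable, when, value, units))

def get_series(site, variable):
    """Return the (time, value) series a GetValues-style request needs."""
    cur = conn.execute(
        "SELECT datetime_utc, value FROM observations "
        "WHERE site_code=? AND variable_code=? ORDER BY datetime_utc",
        (site, variable))
    return cur.fetchall()
```

    Because everything reduces to plain SQL over one small schema, the same design ports directly to the PHP/MySQL stack available on commodity shared webhosting.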

  5. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    PubMed Central

    2011-01-01

    Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. 
The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. Conclusions Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making. PMID:21214905

  6. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data.

    PubMed

    Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H

    2011-01-07

    No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. 
Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making.
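    The automatic scenario adaptation described in both records above amounts to filling a generic scenario template with cues drawn from the facility's own OR information system data. A minimal sketch, with an invented template and facility data (the actual scenarios and AIMS queries are not reproduced in the abstract):

```python
import random

def adapt_scenario(template, facility, room="OR 1"):
    """Fill a generic scenario with facility-specific cues, so that the
    listed procedure is one actually performed in the named room."""
    rng = random.Random(0)  # deterministic, so all reviewers see one version
    procedure = rng.choice(facility["procedures_by_room"][room])
    return template.format(room=room, procedure=procedure)

# Hypothetical template and facility data, for illustration only.
template = ("A {procedure} in {room} is running 1 hour late. "
            "Do you move the to-follow case to another room?")
facility = {"procedures_by_room":
            {"OR 1": ["laparoscopic appendectomy", "knee arthroscopy"]}}

scenario = adapt_scenario(template, facility)
```

    Because the cues come from the facility's own data, local managers can judge each scenario against their actual practice rather than a generic example.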

  7. PVDaCS - A prototype knowledge-based expert system for certification of spacecraft data

    NASA Technical Reports Server (NTRS)

    Wharton, Cathleen; Shiroma, Patricia J.; Simmons, Karen E.

    1989-01-01

    On-line data management techniques to certify spacecraft information are mandated by increasing telemetry rates. Knowledge-based expert systems offer the ability to certify data electronically without the need for time-consuming human interaction. Issues of automatic certification are explored by designing a knowledge-based expert system to certify data from a scientific instrument, the Orbiter Ultraviolet Spectrometer, on an operating NASA planetary spacecraft, Pioneer Venus. The resulting rule-based system, called PVDaCS (Pioneer Venus Data Certification System), is a functional prototype demonstrating the concepts of a larger system design. A key element of the system design is the representation of an expert's knowledge through the use of well-ordered sequences. PVDaCS produces a certification value derived from expert knowledge and an analysis of the instrument's operation. Results of system performance are presented.

  8. Exemplars in the use of technology for management of depression in primary care.

    PubMed

    Serrano, Neftali; Molander, Rachel; Monden, Kimberley; Grosshans, Ashley; Krahn, Dean D

    2012-06-01

    Depression care management as part of larger efforts to integrate behavioral health care into primary care has been shown to be effective in helping patients and primary care clinicians achieve improved outcomes within the primary care environment. Central to care management systems is the use of registries which enable effective clinic population management. The aim of this article is to detail the methods and utility of technology in depression care management processes while also highlighting the real-world variations and barriers that exist in different clinical environments, namely a federally qualified health center and a Veterans Administration clinic. We analyzed descriptive data from the registries of Access Community Health Centers and the William S. Middleton Veterans Administration clinics along with historical reviews of their respective care management processes. Both registry reviews showed trend data indicating improvement in scores of depression and provided baseline data on important system variables, such as the number of patients who are not making progress, the percentage of patients who are unreachable by phone, and the kind of actions needed to ensure evidence-based and efficient care. Both sites also highlighted systemic technical barriers to more complete implementation of care management processes. Care management processes are an effective and efficient part of population-based care for depression in primary care. Implementation depends on available resources including hardware, software, and clinical personnel. Additionally, care management processes and technology have evolved over time based on local needs and are part of an integrated method to support the work of primary care clinicians in providing care for patients with depression.

  9. Concepts of Management Information Systems.

    ERIC Educational Resources Information Center

    Emery, J.C.

    The paper attempts to provide a general framework for dealing with management information systems (MIS). An MIS is defined to have the following characteristics: (1) related to ongoing activities of an organization, (2) a man-machine system, (3) composed of a collection of subsystems, and (4) oriented around a large data base. An MIS places a…

  10. [Tumor Data Interacted System Design Based on Grid Platform].

    PubMed

    Liu, Ying; Cao, Jiaji; Zhang, Haowei; Zhang, Ke

    2016-06-01

    In order to satisfy the demands of processing massive, heterogeneous tumor clinical data and of multi-center collaborative diagnosis and treatment of tumor diseases, a Tumor Data Interacted System (TDIS) was established on a grid platform, realizing a virtualized platform for tumor diagnosis services that shares tumor information in real time under standardized management. The system adopts Globus Toolkit 4.0 to build an open grid service framework and encapsulates data resources based on the Web Services Resource Framework (WSRF). Middleware technology provides a unified access interface for heterogeneous data interaction, optimizing the interactive process with virtualized services so that tumor information resources can be queried and called flexibly. For massive amounts of heterogeneous tumor data, federated storage and a multiple-authorization mode are selected as the security mechanism, with real-time monitoring and load balancing. The system can cooperatively manage multi-center heterogeneous tumor data to support querying, sharing, and analysis of tumor patient data, and can compare and match resources against typical clinical databases or clinical information databases at other service nodes, thus assisting doctors in consulting similar cases and drawing up multidisciplinary treatment plans for tumors. Consequently, the system can improve the efficiency of tumor diagnosis and treatment, and promote the development of collaborative tumor diagnosis models.

  11. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

    In accordance with the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow is integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and are kept in electronic archives. The workflow of medical equipment support is analyzed, and all work processes are recorded in portable electronic documents. Using XML middleware technology and a SQL Server database, the system implements process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automatic, efficient, orderly, and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.

  12. Trust-Based Design of Human-Guided Algorithms

    DTIC Science & Technology

    2007-06-01

    Management Interdepartmental Program in Operations Research 17 May, 2007 Approved by: Laura Major Forest The Charles Stark Draper Laboratory...2. Information Analysis: predicting based on data, integrating and managing information, augmenting human operator perception and cognition. 3...allocation of automation by designers and managers. How an operator decides between manual and automatic control of a system is a necessary

  13. Autonomous self-organizing resource manager for multiple networked platforms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III

    2002-08-01

    A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real-time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, uncertainty measures for sensor output; intelligence reports; etc. Computational experiments show the defending networked platforms' ability to self-organize. The platforms' ability to self-organize is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
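    Fuzzy inference of the general kind such a resource manager builds on can be shown with a toy example: membership functions map crisp sensor values (range, uncertainty) onto degrees of truth, and a rule combines them. The membership parameters and the two-rule inference below are invented for illustration; they are not taken from the paper's fuzzy decision trees:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c], peaking at b (a < b < c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def attack_priority(range_km, bearing_uncertainty_deg):
    """Toy inference: a target that is 'close' OR whose bearing is
    'certain' gets high priority; OR is modeled by max (AND would
    analogously be min)."""
    close = tri(range_km, 0.0, 10.0, 50.0)              # degree 'close'
    certain = tri(bearing_uncertainty_deg, 0.0, 2.0, 10.0)  # degree 'certain'
    return max(close, certain)
```

    Stacking many such rules into trees, and tuning their membership parameters (e.g. with a genetic algorithm, as the paper does), is what turns this building block into a full resource manager.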

  14. A Hadoop-based Molecular Docking System

    NASA Astrophysics Data System (ADS)

    Dong, Yueli; Guo, Quan; Sun, Bin

    2017-10-01

    Molecular docking routinely faces the challenge of managing datasets of tens of terabytes, so it is necessary to improve the efficiency of both storage and docking. We propose a Hadoop-based molecular docking platform for virtual screening that provides preprocessing of ligand datasets and analysis of docking results. A molecular cloud database that supports mass data management is constructed. Through this platform, docking time is reduced, data storage is efficient, and management of the ligand datasets is convenient.
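    Docking on Hadoop fits the MapReduce pattern because each ligand is scored independently (the map step) and the results are then combined, e.g. into a best-hits list (the reduce step). A minimal sketch of that pattern in plain Python; the docking function is a stand-in (real scores come from a docking engine, not shown here):

```python
from heapq import nsmallest

def map_phase(ligands, dock):
    """Map step: dock each ligand independently. Embarrassingly parallel,
    so Hadoop can run these calls on separate cluster nodes."""
    return [(lig, dock(lig)) for lig in ligands]

def reduce_phase(scored, k):
    """Reduce step: keep the k best hits (lower score = stronger
    predicted binding, as in common docking conventions)."""
    return nsmallest(k, scored, key=lambda pair: pair[1])

# Stand-in scores for three hypothetical ligands; illustrative only.
scores = {"ligA": -7.2, "ligB": -9.1, "ligC": -5.0}
hits = reduce_phase(map_phase(scores, scores.get), k=2)
```

    The same split is why the platform scales: storage (HDFS) holds the ligand datasets near the compute nodes, and the reduce step is all that needs a global view.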

  15. Allocations for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of PassPort and PeopleSoft software, supports finance, supply and chemical management/Material Safety Data Sheets, and human resources. Allocations at Fluor Daniel Hanford are burdens added to base costs using a predetermined rate.

  16. Information systems: the key to evidence-based health practice.

    PubMed Central

    Rodrigues, R. J.

    2000-01-01

    Increasing prominence is being given to the use of best current evidence in clinical practice and health services and programme management decision-making. The role of information in evidence-based practice (EBP) is discussed, together with questions of how advanced information systems and technology (IS&T) can contribute to the establishment of a broader perspective for EBP. The author examines the development, validation and use of a variety of sources of evidence and knowledge that go beyond the well-established paradigm of research, clinical trials, and systematic literature review. Opportunities and challenges in the implementation and use of IS&T and knowledge management tools are examined for six application areas: reference databases, contextual data, clinical data repositories, administrative data repositories, decision support software, and Internet-based interactive health information and communication. Computerized and telecommunications applications that support EBP follow a hierarchy in which systems, tasks and complexity range from reference retrieval and the processing of relatively routine transactions, to complex "data mining" and rule-driven decision support systems. PMID:11143195

  17. System integration test plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    This document presents the system integration test plan for the Commercial-Off-The-Shelf, PassPort and PeopleSoft software, and custom software created to work with the COTS products. The PP software is an integrated application for AP, Contract Management, Inventory Management, Purchasing and Material Safety Data Sheet. The PS software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.

  18. Innovative and applied research on big data platforms of smart heritage

    NASA Astrophysics Data System (ADS)

    Qiu, J.; Li, J.; Sun, H.

    2015-08-01

    Big data has huge commercial value and potential. Against the background of big data, a heritage site faces questions and challenges such as how to accelerate industrial innovation, foster benign competition, and create new business value. Based on an analysis of service data from the national archaeological site and park Yuan Ming Yuan, this paper investigates common problems in site management operations, such as inappropriate cultural interpretation and insufficient consumer demand. To solve these operational problems, a new service system called the "one platform - three systems" is put forward. This system comprises the smart heritage platform and three management systems: the smart heritage management system, the 3-O (Online-Offline-Onsite) service system, and the digital explanation system. Combined with 3-O marketing operations, the platform enables bidirectional interaction between heritage site management units and tourists; it also benefits visitors by explaining the culture and history of the heritage site, generates more demand for cultural information, and expands the social and economic benefits.

  19. 32 CFR 185.5 - Responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... development of an MSCA data base and emergency reporting system, as described in paragraph (j) of this section... parameters of the DoD Resources Data Base (DODRDB) for MSCA, which is described in paragraph (n) of this section. Facilitate use of that data base to support decentralized management of MSCA in time of emergency...

  20. End-user interest in geotechnical data management systems.

    DOT National Transportation Integrated Search

    2008-12-01

    In conducting geotechnical site investigations, large volumes of subsurface information and associated test data are generated. The current practice relies on paper-based filing systems that are often difficult and cumbersome to access by users. ...

  1. Production roll out plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.E.

    The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of Passport (PP) and PeopleSoft (PS) software, supports finance, supply, human resources, and payroll activities under the current PHMC direction. The PP software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing and Material Safety Data Sheets (MSDS). The PS software is an integrated application for Projects, General Ledger, Human Resources Training, Payroll, and Base Benefits. This set of software constitutes the Business Management System (BMS) and MSDS, a subset of the HANDI 2000 suite of systems. The primary objective of the Production Roll Out Plan is to communicate the methods and schedules for implementation and roll out to end users of BMS.

  2. A PC-based telemetry system for acquiring and reducing data from multiple PCM streams

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1991-07-01

    The Solar Energy Research Institute's (SERI) Wind Research Program is using Pulse Code Modulation (PCM) Telemetry Data-Acquisition Systems to study horizontal-axis wind turbines. Many PCM systems are combined for use in test installations that require accurate measurements from a variety of different locations. SERI has found them ideal for data-acquisition from multiple wind turbines and meteorological towers in wind parks. A major problem has been in providing the capability to quickly combine and examine incoming data from multiple PCM sources in the field. To solve this problem, SERI has developed a low-cost PC-based PCM Telemetry Data-Reduction System (PC-PCM System) to facilitate quick, in-the-field multiple-channel data analysis. The PC-PCM System consists of two basic components. First, PC-compatible hardware boards are used to decode and combine multiple PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for use under DOS was developed to simplify data-acquisition control and management. The software, called the Quick-Look Data Management Program, provides a quick, easy-to-use interface between the PC and multiple PCM data streams. The Quick-Look Data Management Program is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. The paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in an experiment test environment to quickly examine and verify incoming data from multiple PCM streams. Also discussed are problems and techniques associated with PC-based telemetry data-acquisition, processing, and real-time display.
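
    The multi-stream combining step described above can be sketched in a few lines. This is a hypothetical illustration only: the frame layout, function names, and limit checks are invented, not the actual PC-PCM hardware or Quick-Look software interface.

```python
# Hypothetical sketch: merging decoded frames from several PCM streams into
# one channel record, respecting the stated PC-PCM limits (four boards per
# PC, up to 62 data channels per stream). All names here are illustrative.

MAX_STREAMS = 4    # up to four decoder boards in a single PC
MAX_CHANNELS = 62  # data channels per PCM stream

def merge_frames(frames):
    """Combine one decoded frame per stream into a single keyed record.

    `frames` maps a stream id (0-3) to that stream's list of channel values;
    the result maps (stream_id, channel) pairs to values.
    """
    record = {}
    for stream_id, channels in sorted(frames.items()):
        if stream_id >= MAX_STREAMS or len(channels) > MAX_CHANNELS:
            raise ValueError("exceeds PC-PCM hardware limits")
        for ch, value in enumerate(channels):
            record[(stream_id, ch)] = value
    return record

# Example: two streams with three channels each
merged = merge_frames({0: [1.0, 2.0, 3.0], 1: [4.0, 5.0, 6.0]})
```

    A real implementation would also carry frame timestamps so that samples from different streams can be time-aligned before analysis.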

  3. A New Data Management System for Biological and Chemical Oceanography

    NASA Astrophysics Data System (ADS)

    Groman, R. C.; Chandler, C.; Allison, D.; Glover, D. M.; Wiebe, P. H.

    2007-12-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created to serve PIs principally funded by NSF to conduct marine chemical and ecological research. The new office is dedicated to providing open access to data and information developed in the course of scientific research on short and intermediate time-frames. The data management system developed in support of U.S. JGOFS and U.S. GLOBEC programs is being modified to support the larger scope of the BCO-DMO effort, which includes ultimately providing a way to exchange data with other data systems. The open access system is based on a philosophy of data stewardship, support for existing and evolving data standards, and use of public domain software. The DMO staff work closely with originating PIs to manage data gathered as part of their individual programs. In the new BCO-DMO data system, project and data set metadata records designed to support re-use of the data are stored in a relational database (MySQL) and the data are stored in or made accessible by the JGOFS/GLOBEC object-oriented, relational data management system. Data access will be provided via any standard Web browser client user interface through a GIS application (Open Source, OGC-compliant MapServer), a directory listing from the data holdings catalog, or a custom search engine that facilitates data discovery. In an effort to maximize data system interoperability, data will also be available via Web Services; and data set descriptions will be generated to comply with a variety of metadata content standards. The office is located at the Woods Hole Oceanographic Institution and web access is via http://www.bco-dmo.org.
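
    A relational metadata catalog of this kind can be sketched with a few SQL statements. The table and column names below are invented for illustration (the office uses MySQL; sqlite3 stands in here so the sketch is self-contained):

```python
import sqlite3

# Illustrative-only metadata catalog: one table of data set records that a
# discovery search engine could query. Schema and contents are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dataset (
    id INTEGER PRIMARY KEY,
    project TEXT,
    title TEXT,
    parameter TEXT)""")
conn.executemany(
    "INSERT INTO dataset (project, title, parameter) VALUES (?, ?, ?)",
    [("U.S. JGOFS", "EqPac CTD profiles", "temperature"),
     ("U.S. GLOBEC", "Georges Bank zooplankton counts", "abundance")])

# The kind of selection a data-discovery search might issue
rows = conn.execute(
    "SELECT title FROM dataset WHERE project = ?", ("U.S. GLOBEC",)).fetchall()
```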

  4. OOI CyberInfrastructure - Next Generation Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.

    2008-12-01

    Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future collaborative scientific research, as they open up new ways of fusing data from different sources and across various domains, and enable analysis over wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component, addresses this challenge by providing broad access, from sensor networks for data acquisition up to computational grids for massive computations, together with a binding infrastructure that facilitates policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort: a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy, and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport, based on a messaging infrastructure over the AMQP protocol, and the preservation layer, based on a distributed file system through SDSC iRODS.

  5. Metadata Repository for Improved Data Sharing and Reuse Based on HL7 FHIR.

    PubMed

    Ulrich, Hannes; Kock, Ann-Kristin; Duhm-Harbeck, Petra; Habermann, Jens K; Ingenerf, Josef

    2016-01-01

    Unreconciled data structures and formats are a common obstacle to the urgently required sharing and reuse of data within healthcare and medical research. Within the North German Tumor Bank of Colorectal Cancer, clinical and sample data, based on a harmonized data set, is collected and can be pooled by using a hospital-integrated Research Data Management System supporting biobank and study management. Adding further partners who are not using the core data set requires manual adaptations and mapping of data elements. Facing this manual intervention and focusing the reuse of heterogeneous healthcare instance data (value level) and data elements (metadata level), a metadata repository has been developed. The metadata repository is an ISO 11179-3 conformant server application built for annotating and mediating data elements. The implemented architecture includes the translation of metadata information about data elements into the FHIR standard using the FHIR Data Element resource with the ISO 11179 Data Element Extensions. The FHIR-based processing allows exchange of data elements with clinical and research IT systems as well as with other metadata systems. With increasingly annotated and harmonized data elements, data quality and integration can be improved for successfully enabling data analytics and decision support.
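
    The ISO 11179-to-FHIR translation described above can be illustrated as a minimal mapping into a FHIR-like resource. This is a loose sketch: the field names follow FHIR's DataElement resource only approximately, and the element shown is invented, not one of the repository's actual data elements.

```python
# Illustrative sketch of rendering an ISO 11179-style data element as a
# FHIR-like "DataElement" resource (plain dict standing in for JSON).
# The real mapping uses the ISO 11179 Data Element Extensions.

def to_fhir_data_element(element_id, definition, data_type, permitted_values=None):
    resource = {
        "resourceType": "DataElement",
        "id": element_id,
        "element": [{
            "definition": definition,
            "type": [{"code": data_type}],
        }],
    }
    if permitted_values:
        # Bind the element to its permitted value list
        resource["element"][0]["binding"] = {
            "valueSet": {"concept": permitted_values}
        }
    return resource

# Hypothetical annotated element from a tumor documentation data set
de = to_fhir_data_element("tumor-grade", "Histopathological tumour grade",
                          "code", ["G1", "G2", "G3"])
```

    Serializing such a dict to JSON is what would actually travel between the metadata repository and clinical or research IT systems.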

  6. Program on State Agency Remote Sensing Data Management (SARSDM). [missouri

    NASA Technical Reports Server (NTRS)

    Eastwood, L. F., Jr.; Gotway, E. O.

    1978-01-01

    A planning study for developing a Missouri natural resources information system (NRIS) that combines satellite-derived data and other information to assist in carrying out key state tasks was conducted. Four focal applications -- dam safety, ground water supply monitoring, municipal water supply monitoring, and Missouri River basin modeling were identified. Major contributions of the study are: (1) a systematic choice and analysis of a high priority application (water resources) for a Missouri, LANDSAT-based information system; (2) a system design and implementation plan, based on Missouri, but useful for many other states; (3) an analysis of system costs, component and personnel requirements, and scheduling; and (4) an assessment of deterrents to successful technological innovation of this type in state government, and a system management plan, based on this assessment, for overcoming these obstacles in Missouri.

  7. Data Recording in Performance Management: Trouble With the Logics

    ERIC Educational Resources Information Center

    Groth Andersson, Signe; Denvall, Verner

    2017-01-01

    In recent years, performance management (PM) has become a buzzword in public sector organizations. Well-functioning PM systems rely on valid performance data, but critics point out that conflicting rationale or logic among professional staff in recording information can undermine the quality of the data. Based on a case study of social service…

  8. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  9. Application of Bayesian Classification to Content-Based Data Management

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
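
    A simple Bayesian per-pixel classification of the kind described can be sketched as follows. The class statistics, priors, and class names are invented numbers for illustration, not the GES DAAC's trained parameters; the sketch assumes Gaussian class-conditional densities per band.

```python
import math

# Minimal naive Bayes pixel classifier sketch: label a pixel "clear-ocean"
# or "cloud" from per-band radiances. Per-class (mean, std) pairs and the
# priors below are illustrative values, not trained MODIS statistics.

CLASS_STATS = {
    "clear-ocean": [(0.02, 0.01), (0.03, 0.01)],
    "cloud":       [(0.60, 0.15), (0.55, 0.15)],
}
PRIORS = {"clear-ocean": 0.7, "cloud": 0.3}

def log_gauss(x, mean, std):
    """Log of a Gaussian density (unnormalized constants kept for clarity)."""
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def classify(pixel):
    """Pick the class with the highest posterior log-score for this pixel."""
    scores = {}
    for cls, stats in CLASS_STATS.items():
        score = math.log(PRIORS[cls])
        for band_value, (mean, std) in zip(pixel, stats):
            score += log_gauss(band_value, mean, std)  # naive independence
        scores[cls] = score
    return max(scores, key=scores.get)

label = classify([0.02, 0.03])  # radiances near the clear-ocean means
```

    Because each pixel is scored independently with a handful of arithmetic operations, the scheme is cheap enough to run over whole scenes in near-real time, which is what makes content-based subsetting and subscriptions practical.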

  10. How do we know? An assessment of integrated community case management data quality in four districts of Malawi.

    PubMed

    Yourkavitch, Jennifer; Zalisk, Kirsten; Prosnitz, Debra; Luhanga, Misheck; Nsona, Humphreys

    2016-11-01

    The World Health Organization contracted annual data quality assessments of Rapid Access Expansion (RAcE) projects to review integrated community case management (iCCM) data quality and the monitoring and evaluation (M&E) system for iCCM, and to suggest ways to improve data quality. The first RAcE data quality assessment was conducted in Malawi in January 2014 and we present findings pertaining to data from the health management information system at the community, facility and other sub-national levels because RAcE grantees rely on that for most of their monitoring data. We randomly selected 10 health facilities (10% of eligible facilities) from the four RAcE project districts, and collected quantitative data with an adapted and comprehensive tool that included an assessment of Malawi's M&E system for iCCM data and a data verification exercise that traced selected indicators through the reporting system. We rated the iCCM M&E system across five function areas based on interviews and observations, and calculated verification ratios for each data reporting level. We also conducted key informant interviews with Health Surveillance Assistants and facility, district and central Ministry of Health staff. Scores show a high-functioning M&E system for iCCM with some deficiencies in data management processes. The system lacks quality controls, including data entry verification, a protocol for addressing errors, and written procedures for data collection, entry, analysis and management. Data availability was generally high except for supervision data. The data verification process identified gaps in completeness and consistency, particularly in Health Surveillance Assistants' record keeping. Staff at all levels would like more training in data management. This data quality assessment illuminates where an otherwise strong M&E system for iCCM fails to ensure some aspects of data quality. 
Prioritizing data management with documented protocols, additional training, and approaches that create efficient supervision practices may improve iCCM data quality.
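
    The verification exercise mentioned above rests on a simple calculation: recounted source values divided by reported values at each reporting level. The function and example figures below are illustrative, not taken from the Malawi assessment.

```python
# Sketch of the verification-ratio calculation used in data quality
# assessments: recount an indicator from source records and divide by the
# value reported upward. Figures in the example are invented.

def verification_ratio(recounted, reported):
    """A ratio near 1.0 indicates consistent reporting; values above 1
    suggest under-reporting, values below 1 suggest over-reporting."""
    if reported == 0:
        raise ValueError("no reported value to verify against")
    return recounted / reported

# e.g. HSA registers show 118 treated cases; the facility report says 120
ratio = verification_ratio(118, 120)
```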

  11. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transformation analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the data base are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
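
    The abstract does not detail Rocketdyne's leakage correction, so the sketch below shows a standard version of the idea: window the record, then divide the window's coherent gain back out of the FFT magnitudes to recover sinusoid amplitudes. The pure-Python DFT keeps the example dependency-free; a production system would use an FFT library.

```python
import cmath
import math

# Hedged sketch of sinusoid amplitude recovery after spectral leakage,
# using a Hann window whose coherent gain (exactly 0.5) is divided back
# out of the single-sided amplitude spectrum. Illustrative only.

def dft_amplitudes(samples):
    """Single-sided amplitude spectrum via a direct DFT."""
    n = len(samples)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(samples))) * 2.0 / n
            for k in range(n // 2)]

def hann_corrected_amplitudes(samples):
    n = len(samples)
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]
    coherent_gain = sum(window) / n  # 0.5 for the periodic Hann window
    windowed = [x * w for x, w in zip(samples, window)]
    return [a / coherent_gain for a in dft_amplitudes(windowed)]

# A 3-cycle sinusoid of amplitude 1.5 in 64 samples: bin 3 recovers ~1.5
signal = [1.5 * math.sin(2 * math.pi * 3 * i / 64) for i in range(64)]
amps = hann_corrected_amplitudes(signal)
```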

  12. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach

    PubMed Central

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-01-01

    Background Traffic accidents are one of the more important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. Objective The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. Methods This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet according to the inclusion criteria. Review of the literature continued until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency staff, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach’s alpha of 75%. Data were analyzed using the decision Delphi technique. Results GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing these data, were the most important capabilities of GIS in traffic accident information management. Conclusion Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart, and zoning formats; managing ill-structured problems; determining the cost-effectiveness of decisions; and prioritizing their implementation were the most important GIS capabilities that can aid the management of traffic accident information. PMID:28848627

  13. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 8: User's Mission and System Requirements Data (appendix A of Volume 3)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A computer printout is presented of the mission requirements for the TERSSE missions and their associated user tasks. The data included in the data base represents a broad-based attempt to define the amount, extent, and type of information needed for an earth resources management program in the era of the space shuttle. An effort was made to consider all aspects of remote sensing and resource management; because of its broad scope, it is not intended that the data be used without verification for in-depth studies of particular missions and/or users. The data base represents the quantitative structure necessary to define the TERSSE architecture and requirements, and to provide an overall integrated view of the earth resources technology requirements of the 1980's.

  14. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  15. Advanced Technologies for Future Spacecraft Cockpits and Space-based Control Centers

    NASA Technical Reports Server (NTRS)

    Garcia-Galan, Carlos; Uckun, Serdar; Gregory, William; Williams, Kerry

    2006-01-01

    The National Aeronautics and Space Administration (NASA) is embarking on a new era of Space Exploration, aimed at sending crewed spacecraft beyond Low Earth Orbit (LEO) on medium and long duration missions to the Lunar surface, Mars and beyond. The challenges of such missions are significant and will require new technologies and paradigms in vehicle design and mission operations. Current roles and responsibilities of spacecraft systems, crew and the flight control team, for example, may not be sustainable when real-time support is not assured due to distance-induced communication lags, radio blackouts, equipment failures, or other unexpected factors. Therefore, technologies and applications that enable greater Systems and Mission Management capabilities on board the space-based system will be necessary to reduce the dependency on real-time critical Earth-based support. The focus of this paper is on the technologies that will be required to bring advanced Systems and Mission Management capabilities to space-based environments where the crew must manage both systems performance and mission execution without dependence on the ground. We refer to this concept as autonomy. Environments that require high levels of autonomy include the cockpits of future spacecraft such as the Mars Exploration Vehicle, and space-based control centers such as a Lunar Base Command and Control Center. Furthermore, this paper will evaluate the requirements, available technology, and roadmap to enable full operational implementation of onboard System Health Management, Mission Planning/re-planning, Autonomous Task/Command Execution, and Human Computer Interface applications.
The technology topics covered by the paper include enabling technology to perform Intelligent Caution and Warning, where the system provides directly actionable data for human understanding of and response to failures, and task automation applications that automate nominal and off-nominal task execution based on human input or integrated health state-derived conditions. Shifting from Systems to Mission Management functions, we discuss the role of automated planning applications (tactical planning) on board, which receive data from the other cockpit automation systems and evaluate the mission plan against the dynamic systems and mission states and events, providing the crew with capabilities to understand, change, and manage the timeline of their mission. Lastly, we discuss the role of advanced human interface technologies that organize and present the system and mission information to the crew in ways that maximize their situational awareness and ability to provide oversight and control of all the automated data and functions.

  16. TH-E-209-03: Development of An In-House CT Dose Monitoring and Management System Based On Open-Source Software Resources -- Pearls and Pitfalls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, D.

    Radiation dose monitoring solutions have opened up new opportunities for medical physicists to be more involved in modern clinical radiology practices. In particular, with the help of comprehensive radiation dose data, data-driven protocol management and informed case follow-up are now feasible. Significant challenges remain, however, and the problems faced by medical physicists are highly heterogeneous. Imaging systems from multiple vendors and a wide range of vintages co-exist in the same department and employ data communication protocols that are not fully standardized or implemented, making harmonization complex. Many different solutions for radiation dose monitoring have been implemented by imaging facilities over the past few years. Such systems are based on commercial software, home-grown IT solutions, manual PACS data dumping, etc., and diverse pathways can be used to bring the data to bear on clinical practice. The speakers will share their experiences with creating or tailoring radiation dose monitoring/management systems and procedures over the past few years, which vary significantly in design and scope. Topics to cover: (1) fluoroscopic dose monitoring and high radiation event handling at a large academic hospital; (2) dose monitoring and protocol optimization in pediatric radiology; and (3) development of a home-grown IT solution and dose data analysis framework. Learning Objectives: Describe the scope and range of radiation dose monitoring and protocol management in a modern radiology practice. Review examples of the data available from a variety of systems and how they are managed and conveyed. Reflect on the role of the physicist in radiation dose awareness.

  17. Evolution of a Patient Information Management System in a Local Area Network Environment at Loyola University of Chicago Medical Center

    PubMed Central

    Price, Ronald N; Chandrasekhar, Arcot J; Tamirisa, Balaji

    1990-01-01

    The Department of Medicine at Loyola University Medical Center (LUMC) of Chicago has implemented a local area network (LAN) based Patient Information Management System (PIMS) as part of its integrated departmental database management system. PIMS consists of related database applications encompassing demographic information, current medications, problem lists, clinical data, prior events, and on-line procedure results. Integration into the existing departmental database system permits PIMS to capture and manipulate data in other departmental applications. Standardization of clinical data is accomplished through three data tables that verify diagnosis codes, procedure codes, and a standardized set of clinical data elements. The modularity of the system, coupled with standardized data formats, allowed the development of a Patient Information Protocol System (PIPS). PIPS, a user-definable protocol processor, provides physicians with individualized data entry or review screens customized for their specific research protocols or practice habits. Physician feedback indicates that the PIMS/PIPS combination enhances their ability to collect and review specific patient information by filtering large amounts of clinical data.
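
    The look-up-table validation described above can be sketched briefly. The tables and codes below are placeholders for illustration, not LUMC's actual standardization tables:

```python
# Sketch of code-table validation in the spirit of PIMS's standardization
# tables: entries whose codes are absent from the tables are rejected.
# The code values shown are illustrative placeholders.

DIAGNOSIS_CODES = {"428.0": "Congestive heart failure",
                   "250.00": "Diabetes mellitus"}
PROCEDURE_CODES = {"93000": "Electrocardiogram"}

def validate_entry(entry):
    """Return a list of validation errors for a clinical record (empty = OK)."""
    errors = []
    if entry.get("diagnosis") not in DIAGNOSIS_CODES:
        errors.append("unknown diagnosis code")
    if entry.get("procedure") not in PROCEDURE_CODES:
        errors.append("unknown procedure code")
    return errors

ok = validate_entry({"diagnosis": "428.0", "procedure": "93000"})
bad = validate_entry({"diagnosis": "999.9", "procedure": "93000"})
```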

  18. Namibian Flood Early Warning SensorWeb Pilot

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Policelli, Fritz; Frye, Stuart; Cappelare, Pat; Langenhove, Guido Van; Szarzynski, Joerg; Sohlberg, Rob

    2010-01-01

    The major goal of the Namibia SensorWeb Pilot Project is a scientifically sound, operational trans-boundary flood management decision support system for the Southern African region, providing useful flood and waterborne disease forecasting tools for local decision makers. The Pilot Project was established under the auspices of the Namibian Ministry of Agriculture, Water and Forestry (MAWF), Department of Water Affairs, and the Committee on Earth Observing Satellites (CEOS) Working Group on Information Systems and Services (WGISS), and is moderated by the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER). The effort consists of identifying and prototyping technology that enables the rapid gathering and dissemination of both space-based and ground sensor data and data products for the purpose of flood disaster management and water-borne disease management.

  19. Adding Data Management Services to Parallel File Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Scott

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit, and the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format, while retaining the inherent performance of file system data storage, via declarative queries and updates over views of underlying files.
Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file-based ecosystem; (3) common optimizations, e.g., indexing and caching, are readily supported across several file formats, avoiding effort duplication; and (4) performance improves significantly, as data processing is integrated more tightly with data storage. Our key contributions are: SciHadoop, which explores changes to MapReduce assumptions by taking advantage of the semantics of structured data while preserving MapReduce’s failure and resource management; DataMods, which extends common abstractions of parallel file systems so that they become programmable, can be extended to natively support a variety of data models, and can be hooked into emerging distributed runtimes such as Stanford’s Legion; and Miso, which combines Hadoop and relational data warehousing to minimize time to insight, taking into account the overhead of ingesting data into data warehousing.
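
    The central idea, declarative selection over a file kept in its native byte-stream format, can be illustrated with a toy view. The record layout, names, and query interface below are invented for illustration and are not Damasc's actual API:

```python
import struct

# Toy illustration of a queryable "view" over a raw byte-stream file, in the
# spirit of Damasc: the file stays in native binary form, while a view layer
# exposes it as typed records. Record layout and names are assumptions.

RECORD = struct.Struct("<if")  # e.g. (timestep: int32, temperature: float32)

def scan(raw):
    """Iterate typed records straight out of the byte stream (no import step)."""
    for offset in range(0, len(raw), RECORD.size):
        yield RECORD.unpack_from(raw, offset)

def query(raw, predicate):
    """Declarative-style selection over the file view."""
    return [rec for rec in scan(raw) if predicate(rec)]

# Build a tiny "file" of three records, then select the warm ones
raw = b"".join(RECORD.pack(t, temp)
               for t, temp in [(0, 1.5), (1, 9.0), (2, 7.25)])
warm = query(raw, lambda rec: rec[1] > 5.0)
```

    A real system would push such predicates down to the storage layer and maintain indexes over the views, rather than scanning the whole stream.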

  20. New Systems of Food Service Management for the Air Force

    DTIC Science & Technology

    1979-09-01

    Without good management and successful innovations to meet customer needs, institutional food systems either survive precariously, or on a...unlimited access to all base facilities as well as the innovative , profit oriented management of the contractor. The results of this analysis and its...of innovative , profit-oriented management on the part of the contractor. The results of this analysis, along with its impact on the data presented in
