Sample records for data base management

  1. Computer-assisted engineering data base

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Johnson, H. R.

    1983-01-01

    General capabilities of data base management technology are described. Information requirements posed by the space station life cycle are discussed, and it is asserted that data base management technology supporting engineering/manufacturing in a heterogeneous hardware/data base management system environment should be applied to meeting these requirements. Today's commercial systems do not satisfy all of these requirements. The features of an R&D data base management system being developed to investigate data base management in the engineering/manufacturing environment are discussed. Features of this system represent only a partial solution to space station requirements. Areas where this system should be extended to meet full space station information management requirements are discussed.

  2. CAD/CAM data management

    NASA Technical Reports Server (NTRS)

    Bray, O. H.

    1984-01-01

The role of data base management in CAD/CAM, particularly for geometric data, is described. First, long-term and short-term objectives for CAD/CAM data management are identified. Second, the benefits of the data base management approach are explained. Third, some of the additional work needed in the data base area is discussed.

  3. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

Data management for Mesosphere-Stratosphere-Troposphere (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and the nature of the data base management system.

  4. Management Data Base Development.

    ERIC Educational Resources Information Center

    Dan, Robert L.

    1975-01-01

    A management data base is seen as essential for a management information system, program budgeting, program costing, management by objectives, program evaluation, productivity measures, and accountability in institutions of higher education. The necessity of a management data base is addressed, along with the benefits and limitations it may have…

  5. Data base management system analysis and performance testing with respect to NASA requirements

    NASA Technical Reports Server (NTRS)

    Martin, E. A.; Sylto, R. V.; Gough, T. L.; Huston, H. A.; Morone, J. J.

    1981-01-01

Several candidate Data Base Management Systems (DBMSs) that could support the NASA End-to-End Data System's Integrated Data Base Management System (IDBMS) Project, later rescoped and renamed the Packet Management System (PMS), were evaluated. The candidate systems, which had to run on the Digital Equipment Corporation VAX 11/780 computer system, were ORACLE, SEED, and RIM. ORACLE and RIM are both based on the relational data base model, while SEED employs a CODASYL network approach. A single data base application, which managed stratospheric temperature profiles, was studied. The primary reasons for using this application were an insufficient volume of available PMS-like data, a mandate to use actual rather than simulated data, and the abundance of available temperature profile data.

  6. A distributed data base management facility for the CAD/CAM environment

    NASA Technical Reports Server (NTRS)

    Balza, R. M.; Beaudet, R. W.; Johnson, H. R.

    1984-01-01

Current IPAD research in the area of distributed data base management considers facilities for supporting CAD/CAM data management in a heterogeneous network of computers encompassing multiple data base managers supporting a variety of data models. These facilities include coordinated execution of multiple DBMSs to provide for administration of, and access to, data distributed across them.

  7. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

The strategies for managing computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on integrated programs for aerospace-vehicle design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer-based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  8. Management Principles to be Considered for Implementing a Data Base Management System Aboard U.S. (United States) Naval Ships under the Shipboard Non-Tactical ADP (Automated Data Processing) Program.

    DTIC Science & Technology

    1982-12-01

Master's thesis by Robert Harrison Dixon, December 1982. Thesis Advisor... Keywords: Data Base Management System, DBMS, SNAP, SNAP I, SNAP II, Information

  9. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  10. Data communication between data terminal equipment and the JPL administrative data base management system

    NASA Technical Reports Server (NTRS)

    Iverson, R. W.

    1984-01-01

Approaches to enabling an installed base of mixed data terminal equipment to access a data base management system designed to work with a specific terminal are discussed. The approach taken by the Jet Propulsion Laboratory (JPL) is described. Background information on JPL and its organization, and a description of the Administrative Data Base Management System, are included.

  11. Development of an interactive data base management system for capturing large volumes of data.

    PubMed

    Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L

    1995-10-01

    Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.

  12. Vibroacoustic payload environment prediction system (VAPEPS): Data base management center remote access guide

    NASA Technical Reports Server (NTRS)

    Thomas, V. C.

    1986-01-01

    A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.

  13. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

Characteristics required for NASA scientific data base management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying data base designs and various data base management system parameters are valuable to applications for choosing packages and critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.
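As a rough illustration of the load-rate and query-rate measurements such evaluations involve, here is a hypothetical sketch using Python's built-in sqlite3 (not one of the packages evaluated); the table, column names, and data volumes are illustrative only:

```python
import sqlite3
import time

# Hypothetical miniature load/query-rate test: bulk-load a batch of
# records, then time simple queries against them. Schema is illustrative.
def benchmark(n_rows: int = 10_000) -> tuple[float, float]:
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE profiles "
        "(id INTEGER PRIMARY KEY, altitude REAL, temperature REAL)"
    )

    # Load phase: measure rows inserted per second.
    start = time.perf_counter()
    conn.executemany(
        "INSERT INTO profiles (altitude, temperature) VALUES (?, ?)",
        ((i * 0.5, 200.0 + i % 80) for i in range(n_rows)),
    )
    conn.commit()
    load_rate = n_rows / (time.perf_counter() - start)

    # Query phase: measure simple selections per second.
    n_queries = 100
    start = time.perf_counter()
    for i in range(n_queries):
        conn.execute(
            "SELECT COUNT(*) FROM profiles WHERE temperature > ?",
            (200 + i % 80,),
        ).fetchone()
    query_rate = n_queries / (time.perf_counter() - start)

    conn.close()
    return load_rate, query_rate

if __name__ == "__main__":
    load_rate, query_rate = benchmark()
    print(f"load: {load_rate:.0f} rows/s, query: {query_rate:.0f} queries/s")
```

Varying the schema design and engine parameters between runs, as the study did across ORACLE, SEED, and INGRES, is what makes such figures comparable.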

  14. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  15. RIM as the data base management system for a material properties data base

    NASA Technical Reports Server (NTRS)

    Karr, P. H.; Wilson, D. J.

    1984-01-01

    Relational Information Management (RIM) was selected as the data base management system for a prototype engineering materials data base. The data base provides a central repository for engineering material properties data, which facilitates their control. Numerous RIM capabilities are exploited to satisfy prototype data base requirements. Numerical, text, tabular, and graphical data and references are being stored for five material types. Data retrieval will be accomplished both interactively and through a FORTRAN interface. The experience gained in creating and exercising the prototype will be used in specifying requirements for a production system.

16. A distributed data base management system [for Deep Space Network]

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1975-01-01

    Major system design features of a distributed data management system for the NASA Deep Space Network (DSN) designed for continuous two-way deep space communications are described. The reasons for which the distributed data base utilizing third-generation minicomputers is selected as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.

  17. Engineering data management: Experience and projections

    NASA Technical Reports Server (NTRS)

    Jefferson, D. K.; Thomson, B.

    1978-01-01

    Experiences in developing a large engineering data management system are described. Problems which were encountered are presented and projected to future systems. Business applications involving similar types of data bases are described. A data base management system architecture proposed by the business community is described and its applicability to engineering data management is discussed. It is concluded that the most difficult problems faced in engineering and business data management can best be solved by cooperative efforts.

  18. Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.

    DTIC Science & Technology

    1993-05-01

Data Management Systems: Components used to store, manage, and retrieve data. Data management includes knowledge bases, database management... Application Development Tools and Methods; X/Open and POSIX APIs; Integrated Design Support System (IDS); Knowledge-Based Systems (KBS) Application... IDEF1x; Yourdon; Jackson System Design (JSD); Knowledge-Based Systems (KBSs); Structured Systems Development (SSD); Semantic Unification Meta-Model

  19. Data base management study

    NASA Technical Reports Server (NTRS)

    1976-01-01

Data base management techniques and applicable equipment are described. Recommendations are presented to assist potential NASA data users in selecting and using appropriate data base management tools and techniques. Classes of currently available data processing equipment, ranging from basic terminals to large minicomputer systems, were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 are presented. A test of a typical data base management system is described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  20. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  1. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

To address the difficulty of managing, and the poor traceability of, quality data from discrete assembly processes, a data collection and management method is proposed that takes the assembly process and the bill of materials (BOM) as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability for the assembly process. Finally, an assembly-process quality data management system is developed, realizing effective control and management of quality information for complex product assembly processes.
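A minimal sketch of the BOM-centred traceability idea described above: each node of the product structure carries the quality records collected at its assembly step, so any record can be traced back through the structure. All names here are illustrative assumptions, not from the paper:

```python
from dataclasses import dataclass, field

# Hypothetical BOM node: product structure plus attached quality records.
@dataclass
class BomNode:
    part_number: str
    children: list["BomNode"] = field(default_factory=list)
    quality_records: list[dict] = field(default_factory=list)

    def trace(self, part_number: str, path=()):
        """Return the chain of part numbers from the root to a given part."""
        path = path + (self.part_number,)
        if self.part_number == part_number:
            return path
        for child in self.children:
            found = child.trace(part_number, path)
            if found:
                return found
        return None

# Toy product structure with one quality record attached.
root = BomNode("SAT-1", children=[
    BomNode("BUS", children=[BomNode("TANK")]),
    BomNode("PAYLOAD"),
])
root.children[0].children[0].quality_records.append(
    {"step": "weld inspection", "result": "pass"}
)
print(root.trace("TANK"))  # → ('SAT-1', 'BUS', 'TANK')
```

Anchoring records to BOM nodes in this way is what makes the traceability path recoverable after the fact.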

  2. Data management system for USGS/USEPA urban hydrology studies program

    USGS Publications Warehouse

    Doyle, W.H.; Lorens, J.A.

    1982-01-01

A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS) and was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental practices inventory data. Also, storm-event water-quality characteristics are stored in the data base. The advantages of using SAS to create and manage a data base are many; among them, it is simple and easy to use, contains a comprehensive statistical package, and can be used to modify files very easily. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advances made in computer technology during this era. Urban stormwater data is, however, just one application for which the system can be used. (USGS)

  3. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  4. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  5. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  6. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  7. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  8. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  9. An event-version-based spatio-temporal modeling approach and its application in the cadastral management

    NASA Astrophysics Data System (ADS)

    Li, Yangdong; Han, Zhen; Liao, Zhongping

    2009-10-01

Spatiality, temporality, legality, accuracy, and continuity are characteristic of cadastral information, and cadastral management demands that cadastral data be accurate, integrated, and updated in a timely manner. A GIS-based management system is a natural way to manage cadastral data, which are characterized by spatiality and temporality. Because no sound spatio-temporal data model has been adopted, however, the spatio-temporal characteristics of cadastral data are not well expressed in existing cadastral management systems. An event-version-based spatio-temporal modeling approach is first proposed from the perspectives of events and versions. With its help, an event-version-based spatio-temporal cadastral data model is then built to represent spatio-temporal cadastral data. Finally, the model is used in the design and implementation of a spatio-temporal cadastral management system. The application of the system shows that the event-version-based spatio-temporal data model is well suited to the representation and organization of cadastral data.
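The event-version idea described above can be sketched as follows: each change to a parcel is an event that closes the current version and opens a new one, so the state on any past date can be reconstructed. This is a minimal illustration under assumed names, not the paper's actual model:

```python
from dataclasses import dataclass

# Hypothetical parcel version: one row of the version history, stamped
# with the event that created it. ISO date strings compare correctly
# as plain strings, keeping the example dependency-free.
@dataclass
class ParcelVersion:
    parcel_id: str
    owner: str
    valid_from: str
    valid_to: str = "9999-12-31"
    event: str = "initial registration"

history = [ParcelVersion("P-001", "Li", "2001-03-10")]

def apply_event(history, date, event, **changes):
    """Close the current version and append a new one carrying the event."""
    current = history[-1]
    current.valid_to = date
    history.append(ParcelVersion(
        current.parcel_id,
        changes.get("owner", current.owner),
        valid_from=date,
        event=event,
    ))

def state_at(history, date):
    """Reconstruct the version that was valid on a given date."""
    for v in history:
        if v.valid_from <= date < v.valid_to:
            return v

apply_event(history, "2005-06-01", "ownership transfer", owner="Han")
print(state_at(history, "2003-01-01").owner)  # → Li
print(state_at(history, "2006-01-01").owner)  # → Han
```

Keeping the triggering event on each version, rather than only the version intervals, is what lets the history answer "why did this change" as well as "what was true then".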

  10. Management and display of four-dimensional environmental data sets using McIDAS

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Santek, David; Suomi, Verner E.

    1990-01-01

    Over the past four years, great strides have been made in the areas of data management and display of 4-D meteorological data sets. A survey was conducted of available and planned 4-D meteorological data sources. The data types were evaluated for their impact on the data management and display system. The requirements were analyzed for data base management generated by the 4-D data display system. The suitability of the existing data base management procedures and file structure were evaluated in light of the new requirements. Where needed, new data base management tools and file procedures were designed and implemented. The quality of the basic 4-D data sets was assured. The interpolation and extrapolation techniques of the 4-D data were investigated. The 4-D data from various sources were combined to make a uniform and consistent data set for display purposes. Data display software was designed to create abstract line graphic 3-D displays. Realistic shaded 3-D displays were created. Animation routines for these displays were developed in order to produce a dynamic 4-D presentation. A prototype dynamic color stereo workstation was implemented. A computer functional design specification was produced based on interactive studies and user feedback.

  11. The ABC's required for establishing a practical computerized plant engineering management data base system

    NASA Technical Reports Server (NTRS)

    Maiocco, F. R.; Hume, J. P.

    1976-01-01

A systems approach is outlined in the paper to assist facility and plant engineers in improving their organization's data management system. The six basic steps identified may appear somewhat simple; however, adequate planning, proper resources, and the involvement of management will determine the success of a computerized facility management data base. Helpful suggestions are noted throughout the paper to ensure the development of a practical computerized data management system.

  12. User requirements for NASA data base management systems. Part 1: Oceanographic discipline

    NASA Technical Reports Server (NTRS)

    Fujimoto, B.

    1981-01-01

Generic oceanographic user requirements were collected and analyzed for use in developing a general multipurpose data base management system for future missions of the Office of Space and Terrestrial Applications (OSTA) of NASA. The collection of user requirements involved studying the state-of-the-art technology in data base management systems; analyzing the results of related studies; formulating a viable and diverse list of scientists to be interviewed; developing a presentation format and materials; and interviewing oceanographic data users. More effective data management systems are needed to handle the increasing influx of data.

  13. Information Systems for University Planning.

    ERIC Educational Resources Information Center

    Robinson, Robert J.

    This paper proposes construction of a separate data base environment for university planning information, distinct from data bases and systems supporting operational functioning and management. The data base would receive some of its input from the management information systems (MIS)/transactional data bases and systems through a process of…

  14. Reference manual for data base on Nevada water-rights permits

    USGS Publications Warehouse

    Cartier, K.D.; Bauer, E.M.; Farnham, J.L.

    1995-01-01

The U.S. Geological Survey and Nevada Division of Water Resources have cooperatively developed and implemented a data-base system for managing water-rights permit information for the State of Nevada. The Water-Rights Permit data base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage and access to water information from the State Engineer's office. The data base contains a main table, three ancillary tables, and five lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of data tables and data-entry screens.

  15. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    PubMed Central

    Rusnak, James E.

    1987-01-01

Due to previous systems selections, many hospitals (health care facilities) are faced with the problem of fragmented data bases containing clinical, demographic, and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated data base. The number of CMMS candidate systems is often restricted due to data base and system interface issues. The hospital must ensure the CMMS project provides a means to implement an integrated on-line hospital information data base for use by departments in operating under a DRG-based Prospective Payment System. This paper presents guidelines for use in selecting a Case Mix Management System to meet the hospital's financial and operations planning, budgeting, marketing, and other management needs, while considering the data base implications of the implementation.

  16. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  17. CAD/CAM data management needs, requirements and options

    NASA Technical Reports Server (NTRS)

    Lopatka, R. S.; Johnson, T. G.

    1978-01-01

    The requirements for a data management system in support of technical or scientific applications and possible courses of action were reviewed. Specific requirements were evolved while working towards higher level integration impacting all phases of the current design process and through examination of commercially marketed systems and related data base research. Arguments are proposed for varied approaches in implementing data base systems ranging from no action necessary to immediate procurement of an existing data base management system.

  18. Library Statistical Data Base Formats and Definitions.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    Represented are the detailed set of data structures relevant to the categorization of information, terminology, and definitions employed in the design of the library statistical data base. The data base, or management information system, provides administrators with a framework of information and standardized data for library management, planning,…

  19. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic, and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data, is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)

  20. Forest management applications of Landsat data in a geographic information system

    NASA Technical Reports Server (NTRS)

    Maw, K. D.; Brass, J. A.

    1982-01-01

    The utility of land-cover data resulting from Landsat MSS classification can be greatly enhanced by use in combination with ancillary data. A demonstration forest management applications data base was constructed for Santa Cruz County, California, to demonstrate geographic information system applications of classified Landsat data. The data base contained detailed soils, digital terrain, land ownership, jurisdictional boundaries, fire events, and generalized land-use data, all registered to a UTM grid base. Applications models were developed from problems typical of fire management and reforestation planning.

  1. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  2. The Blood Stocks Management Scheme, a partnership venture between the National Blood Service of England and North Wales and participating hospitals for maximizing blood supply chain management.

    PubMed

    Chapman, J F; Cook, R

    2002-10-01

    The Blood Stocks Management Scheme (BSMS) has been established as a joint venture between the National Blood Service (NBS) in England and North Wales and participating hospitals to monitor the blood supply chain. Stock and wastage data are submitted to a web-based data-management system, facilitating continuous and complete red cell data collection and 'real time' data extraction. The data-management system enables peer review of performance in respect of stock holding levels and red cell wastage. The BSMS has developed an innovative web-based data-management system that enables data collection and benchmarking of practice, which should drive changes in stock management practice, therefore optimizing the use of donated blood.

  3. The research and development of water resources management information system based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai

    Because water resources management involves large amounts of data of diverse types and formats, we built a water resources calculation model and established a water resources management information system based on the ArcGIS and Visual Studio .NET development platforms. The system integrates spatial data and attribute data organically and manages them uniformly. It can analyze spatial data, support bidirectional inquiry between map and data, generate various charts and report forms automatically, link multimedia information, and manage the database. It thus provides spatial and static comprehensive information services for the study, management, and decision-making of water resources, regional geology, the eco-environment, and related fields.

  4. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
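
A minimal sketch (not the USGS ARC/INFO code; function and coordinate conventions are illustrative only) of the subdivision step described above: splitting a square PLSS survey section into sixteen idealized quarter-quarter cells for a display map grid.

```python
# Sketch: subdivide a PLSS section into a 4x4 grid of
# quarter-quarter cells. A section is nominally one mile
# square (5280 ft on a side), so each cell is 1320 ft.

def quarter_quarter_cells(x0, y0, side):
    """Return 16 (x, y, size) cells covering a section whose
    lower-left corner is (x0, y0) and whose side length is `side`."""
    qq = side / 4.0  # a quarter-quarter section is 1/4 of the side
    return [(x0 + col * qq, y0 + row * qq, qq)
            for row in range(4) for col in range(4)]

cells = quarter_quarter_cells(0.0, 0.0, 5280.0)
print(len(cells))   # 16 cells of 1320 ft each
```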

  5. Are we missing the boat? Current uses of long-term biological monitoring data in the evaluation and management of marine protected areas.

    PubMed

    Addison, P F E; Flander, L B; Cook, C N

    2015-02-01

    Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to help management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management.

  6. Land cover mapping of the upper Kuskokwim Resource Management Area using LANDSAT and a digital data base approach

    USGS Publications Warehouse

    Markon, Carl J.

    1988-01-01

    Digital land cover and terrain data for the Upper Kuskokwim Resource Management Area (UKRMA) were produced by the U.S. Geological Survey, Earth Resources Observation Systems Field Office, Anchorage, Alaska for the Bureau of Land Management. These and other environmental data were incorporated into a digital data base to assist in the management and planning of the UKRMA. The digital data base includes land cover classifications, elevation, slope, and aspect data centering on the UKRMA boundaries. The data are stored on computer compatible tapes at a 50-m pixel size. Additional digital data in the data base include: (a) summer and winter Landsat multispectral scanner (MSS) data registered to a 50-m Universal Transverse Mercator grid; (b) elevation, slope, aspect, and solar illumination data; (c) soils and surficial geology; and (d) study area boundary. The classification of Landsat MSS data resulted in seven major classes and 24 subclasses. Major classes include: forest, shrubland, dwarf scrub, herbaceous, barren, water, and other. The final data base will be used by resource personnel for management and planning within the UKRMA.

  7. The State of Cloud-Based Biospecimen and Biobank Data Management Tools.

    PubMed

    Paul, Shonali; Gade, Aditi; Mallipeddi, Sumani

    2017-04-01

    Biobanks are critical for collecting and managing high-quality biospecimens from donors with appropriate clinical annotation. The high-quality human biospecimens and associated data are required to better understand disease processes. Therefore, biobanks have become an important and essential resource for healthcare research and drug discovery. However, collecting and managing huge volumes of data (biospecimens and associated clinical data) necessitate that biobanks use appropriate data management solutions that can keep pace with the ever-changing requirements of research. To automate biobank data management, biobanks have been investing in traditional Laboratory Information Management Systems (LIMS). However, there are a myriad of challenges faced by biobanks in acquiring traditional LIMS. Traditional LIMS are cost-intensive and often lack the flexibility to accommodate changes in data sources and workflows. Cloud technology is emerging as an alternative that provides the opportunity to small and medium-sized biobanks to automate their operations in a cost-effective manner, even without IT personnel. Cloud-based solutions offer the advantage of heightened security, rapid scalability, dynamic allocation of services, and can facilitate collaboration between different research groups by using a shared environment on a "pay-as-you-go" basis. The benefits offered by cloud technology have resulted in the development of cloud-based data management solutions as an alternative to traditional on-premise software. After evaluating the advantages offered by cloud technology, several biobanks have started adopting cloud-based tools. Cloud-based tools provide biobanks with easy access to biospecimen data for real-time sharing with clinicians. Another major benefit realized by biobanks by implementing cloud-based applications is unlimited data storage on the cloud and automatic backups for protecting any data loss in the face of natural calamities.

  8. Web-based data acquisition and management system for GOSAT validation Lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra N.; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2012-11-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observation SATellite) validation lidar data analysis is developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS ground-level meteorological data, Rawinsonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data and GOSAT validation lidar data. DMS, written in PHP, presents satellite-pass dates and all acquired data.

  9. An integrated GIS/remote sensing data base in North Cache soil conservation district, Utah: A pilot project for the Utah Department of Agriculture's RIMS (Resource Inventory and Monitoring System)

    NASA Technical Reports Server (NTRS)

    Wheeler, D. J.; Ridd, M. K.; Merola, J. A.

    1984-01-01

    A basic geographic information system (GIS) for the North Cache Soil Conservation District (SCD) was sought for selected resource problems. Since the resource management issues in the North Cache SCD are very complex, it is not feasible in the initial phase to generate all the physical, socioeconomic, and political baseline data needed for resolving all management issues. A selection of critical variables becomes essential. Thus, there are four specific objectives: (1) assess resource management needs and determine which resource factors are most fundamental for building a beginning data base; (2) evaluate the variety of data gathering and analysis techniques for the resource factors selected; (3) incorporate the resulting data into a useful and efficient digital data base; and (4) demonstrate the application of the data base to selected real-world resource management issues.

  10. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  11. The design and implementation of GML data management information system based on PostgreSQL

    NASA Astrophysics Data System (ADS)

    Zhang, Aiguo; Wu, Qunyong; Xu, Qifeng

    2008-10-01

    GML expresses geographic information as text and provides an extensible, standard way of encoding spatial information. At present, GML data are typically managed at the level of whole documents. Managed this way, inquiry and update of GML data are inefficient, and memory demands are high when a document is comparatively large. To address this, the paper puts forward a GML data management approach based on PostgreSQL. It designs four kinds of inquiry: inquiry of metadata, inquiry of geometry based on property, inquiry of property based on spatial information, and inquiry of spatial data based on location. It also designs and implements visualization of the inquired WKT data.
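
The benefit of row-level storage over whole-document storage can be sketched as follows. This is an illustration only, not the paper's implementation: the paper uses PostgreSQL, while SQLite stands in here for a self-contained demo, and the table and column names are invented.

```python
# Sketch: store GML features as database rows so a property-based
# inquiry ("inquiry of geometry based on property") touches only
# the matching row, instead of re-parsing a large GML document.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gml_feature (
                    fid  INTEGER PRIMARY KEY,
                    name TEXT,   -- a feature property
                    wkt  TEXT)   -- geometry kept as WKT text""")
features = [(1, "river", "LINESTRING(0 0, 1 1)"),
            (2, "lake",  "POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))")]
conn.executemany("INSERT INTO gml_feature VALUES (?, ?, ?)", features)

# Fetch one feature's geometry by its property value.
wkt = conn.execute("SELECT wkt FROM gml_feature WHERE name = ?",
                   ("lake",)).fetchone()[0]
print(wkt)
```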

  12. Georgia resource assessment project: Institutionalizing LANDSAT and geographic data base techniques

    NASA Technical Reports Server (NTRS)

    Pierce, R. R.; Rado, B. Q.; Faust, N.

    1981-01-01

    Digital data from LANDSAT for each 1.1-acre cell in Georgia were processed and the land cover conditions were categorized. Several test cases were completed and an operational hardware and software processing capability was established at the Georgia Institute of Technology. The operational capability was developed to process the entire state (60,000 sq. miles and 14 LANDSAT scenes) in a cooperative project between eleven divisions and agencies at the regional, state, and federal levels. Products were developed for State agencies in both mapped and statistical formats. A computerized geographical data base was developed for management programs. To a large extent the applications of the data base evolved as users of LANDSAT information requested that other data (i.e., soils, slope, land use, etc.) be made compatible with LANDSAT for management programs. To date, geographic data bases incorporating LANDSAT and other spatial data deal with elements of the municipal solid waste management program, and reservoir management for the Corps of Engineers. LANDSAT data are also being used for applications in wetland, wildlife, and forestry management.

  13. A data management infrastructure for bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state-of-the-art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies that have been shown useful in handling high volume, time-series data and to flexibly deal with unstructured data schema. Specifically, Apache Cassandra and Mongo DB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
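
The wide-row pattern commonly used with Cassandra-style stores for high-volume sensor time series can be sketched as follows. This is an illustrative model only (an in-memory dict stands in for the database, and the sensor names are invented): readings are bucketed by (sensor_id, day) so each partition stays bounded and a day's readings can be fetched together.

```python
# Sketch: time-series bucketing for sensor readings, as commonly
# modeled in NoSQL stores. Partition key: (sensor_id, day).
from collections import defaultdict
from datetime import datetime

store = defaultdict(list)  # (sensor_id, day) -> [(timestamp, value), ...]

def insert(sensor_id, ts, value):
    store[(sensor_id, ts.date())].append((ts, value))

def query_day(sensor_id, day):
    """All readings for one sensor on one day, in time order."""
    return sorted(store[(sensor_id, day)])

insert("accel-01", datetime(2015, 4, 1, 10, 0), 0.12)
insert("accel-01", datetime(2015, 4, 1, 10, 1), 0.15)
insert("accel-01", datetime(2015, 4, 2, 10, 0), 0.09)

print(len(query_day("accel-01", datetime(2015, 4, 1).date())))  # 2
```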

  14. The Emerging Role of the Data Base Manager. Report No. R-1253-PR.

    ERIC Educational Resources Information Center

    Sawtelle, Thomas K.

    The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…

  15. The data base management system alternative for computing in the human services.

    PubMed

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

  16. Configuration and Data Management Process and the System Safety Professional

    NASA Technical Reports Server (NTRS)

    Shivers, Charles Herbert; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    This article presents a discussion of the configuration management (CM) and the Data Management (DM) functions and provides a perspective of the importance of configuration and data management processes to the success of system safety activities. The article addresses the basic requirements of configuration and data management generally based on NASA configuration and data management policies and practices, although the concepts are likely to represent processes of any public or private organization's well-designed configuration and data management program.

  17. NASA Cloud-Based Climate Data Services

    NASA Astrophysics Data System (ADS)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archive Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in User Space), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figures: (A) vCDS/ESG system stack; (B) conceptual architecture for NASA cloud-based data services.)

  18. Commentary to Library Statistical Data Base.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has developed a library statistical data base which concentrates on the management information needs of administrators of public and academic libraries. This document provides an overview of the framework and conceptual approach employed in the design of the data base. The data…

  19. US EPA Base Study Standard Operating Procedure for Data Processing and Data Management

    EPA Pesticide Factsheets

    The purpose of the Standard Operating Procedures (SOP) for data management and data processing is to facilitate consistent documentation and completion of data processing duties and management responsibilities in order to maintain a high standard of data quality.

  20. Data management in engineering

    NASA Technical Reports Server (NTRS)

    Browne, J. C.

    1976-01-01

    An introduction to computer based data management is presented with an orientation toward the needs of engineering application. The characteristics and structure of data management systems are discussed. A link to familiar engineering applications of computing is established through a discussion of data structure and data access procedures. An example data management system for a hypothetical engineering application is presented.

  1. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 Computer system for administrative computing and for the MASS functions. The current candidate administrative Data Base Management Systems required to support the MASS include ADABASE, Cullinane IDMS and TOTAL. Previous uses of administrative Data Base Systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited capacity data base systems have been installed in microprocessor based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences plus some other localized in house DBMS uses have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  2. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system including data, logic and presentation tier. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local areas. While the spatial database hinders processing raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML structured information of the SLD and metadata are stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. 
The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
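
A minimal sketch of the semantic-tagging idea described above (the field names and example values are invented, not WISDOM's actual schema): each dataset carries spatial, thematic, and temporal reference tags, so a retrieval such as "all data in a specific administrative unit belonging to a specific theme" reduces to a simple filter.

```python
# Sketch: datasets tagged with spatial, thematic and temporal
# reference objects, queried by tag combination.
datasets = [
    {"id": 1, "spatial": "Can Tho",  "theme": "water management", "year": 2008},
    {"id": 2, "spatial": "Can Tho",  "theme": "demographics",     "year": 2009},
    {"id": 3, "spatial": "An Giang", "theme": "water management", "year": 2009},
]

def find(spatial=None, theme=None):
    """Return datasets matching every tag that was given."""
    return [d for d in datasets
            if (spatial is None or d["spatial"] == spatial)
            and (theme is None or d["theme"] == theme)]

hits = find(spatial="Can Tho", theme="water management")
print([d["id"] for d in hits])  # [1]
```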

  3. A Data Management System for Multi-Phase Case-Control Studies

    PubMed Central

    Gibeau, Joanne M.; Steinfeldt, Lois C.; Stine, Mark J.; Tullis, Katherine V.; Lynch, H. Keith

    1983-01-01

    The design of a computerized system for the management of data in multi-phase epidemiologic case-control studies is described. Typical study phases include case-control selection, abstracting of data from medical records, and interview of study subjects or next of kin. In consultation with project personnel, requirements for the system were established: integration of data from all study phases into one data base, accurate follow-up of subjects through the study, sophisticated data editing capabilities, ready accessibility of specified programs to project personnel, and generation of current status and exception reports for project management. SIR (Scientific Information Retrieval), a commercially available data base management system, was selected as the foundation of this system. The system forms a comprehensive data management system applicable to many types of public health research studies.

  4. Pan Air Geometry Management System (PAGMS): A data-base management system for PAN AIR geometry data

    NASA Technical Reports Server (NTRS)

    Hall, J. F.

    1981-01-01

    A data-base management system called PAGMS was developed to facilitate the data transfer in applications computer programs that create, modify, plot or otherwise manipulate PAN AIR type geometry data in preparation for input to the PAN AIR system of computer programs. PAGMS is composed of a series of FORTRAN callable subroutines which can be accessed directly from applications programs. Currently only a NOS version of PAGMS has been developed.

  5. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.

  6. A means to an end: a web-based client management system in palliative care.

    PubMed

    O'Connor, Margaret; Erwin, Trudy; Dawson, Linda

    2009-03-01

    Home-based palliative care (hospice) services require comprehensive and fully integrated information systems to develop and manage the various aspects of their business, incorporating client data and management information. These systems assist in maintaining the quality of client care as well as improved management efficiencies. This article reports on a large not-for-profit home-based palliative care service in Australia, which embarked on a project to develop an electronic data management system specifically designed to meet the needs of the palliative care sector. This web-based client information management system represents a joint venture between the organization and a commercial company and has been a very successful project.

  7. Maintaining a permanent plot data base for growth and yield research: Solutions to some recurring problems

    Treesearch

    John C. Byrne

    1993-01-01

    Methods for solving some recurring problems of maintaining a permanent plot data base for growth and yield research are described. These methods include documenting data from diverse sampling designs, changing sampling designs, changing field procedures, and coordinating activities in the plots with the land management agency. Managing a permanent plot data base (...

  8. Wiki-based Data Management System for Toxicogenomics

    EPA Science Inventory

    We are developing a data management system to enable systems-based toxicology at the US EPA. This is built upon the WikiLIMS platform and is capable of housing not just genomics data but also a wide variety of toxicology data and associated experimental design information. Thi...

  9. CyBy(2): a structure-based data management tool for chemical and biological data.

    PubMed

    Höck, Stefan; Riedl, Rainer

    2012-01-01

    We report the development of a powerful data management tool for chemical and biological data: CyBy(2). CyBy(2) is a structure-based information management tool used to store and visualize structural data alongside additional information such as project assignment, physical information, spectroscopic data, biological activity, functional data and synthetic procedures. The application consists of a database, an application server, used to query and update the database, and a client application with a rich graphical user interface (GUI) used to interact with the server.

  10. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  11. Fault management for data systems

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann

    1993-01-01

    Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
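
One simple form of graph-based fault diagnosis can be sketched as follows. This is a hedged illustration, not the authors' method, and the component names are invented: component dependencies are modeled as a directed graph, and given an observed fault, the diagnoser walks upstream to collect candidate root causes.

```python
# Sketch: candidate root-cause search over a component
# dependency graph (component -> components it depends on).
depends_on = {
    "camera":     ["mount", "power"],
    "mount":      ["controller"],
    "controller": ["power"],
    "power":      [],
}

def candidate_causes(fault):
    """All components whose failure could explain `fault`."""
    seen, stack = set(), [fault]
    while stack:
        component = stack.pop()
        for dep in depends_on.get(component, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(candidate_causes("camera")))  # ['controller', 'mount', 'power']
```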

  12. [Application of the life sciences platform based on Oracle to biomedical information].

    PubMed

    Zhao, Zhi-Yun; Li, Tai-Huan; Yang, Hong-Qiao

    2008-03-01

    The life sciences platform based on Oracle database technology is introduced in this paper. By providing powerful data access, integrating a variety of data types, and managing vast quantities of data, the software presents a flexible, safe, and scalable management platform for biomedical data processing.

  13. Solid Waste Information Management System (SWIMS). Data summary, fiscal year 1980

    NASA Astrophysics Data System (ADS)

    Batchelder, H. M.

    1981-05-01

    The solid waste information management system (SWIMS) maintains computerized records on a master data base. It provides a comprehensive system for cataloging and assembling data into output reports. The SWIMS data base contains information on the transuranic (TRU) and low level waste (LLW) generated, buried, or stored.

  14. Environmental Files and Data Bases. Part A. Introduction and Oceanographic Management Information System.

    DTIC Science & Technology

    1981-09-01

    Contents include: the Oceanographic Management Information System; acoustic data; geological and geophysical data. Keywords: Management Information System; Naval Oceanography Program; Naval Oceanographic Requirements; Acoustic Reference Service; Research Vehicle.

  15. Data management for Computer-Aided Engineering (CAE)

    NASA Technical Reports Server (NTRS)

    Bryant, W. A.; Smith, M. R.

    1984-01-01

    Analysis of data flow through the design and manufacturing processes has established specific information management requirements and identified unique problems. The application of data management technology to the engineering/manufacturing environment addresses these problems. An overview of the IPAD prototype data base management system, representing a partial solution to these problems, is presented here.

  16. Reference manual for data base on Nevada well logs

    USGS Publications Warehouse

    Bauer, E.M.; Cartier, K.D.

    1995-01-01

    The U.S. Geological Survey and the Nevada Division of Water Resources are cooperatively using a data base for managing well-log information for the State of Nevada. The Well-Log Data Base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage of and access to water information from the State Engineer's office. The data base contains a main table, two ancillary tables, and nine lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of the data tables and data-entry screens.

  17. SUPERFUND SOILS DATA MANAGEMENT SYSTEM

    EPA Science Inventory

    This paper describes the Superfund Soil Data Management System (DMS), a PC-based data system being developed by the U.S. Environmental Protection Agency (EPA) in its effort to manage and evaluate treatment and performance data for contaminated soil, sludge, and debris. This system...

  18. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    'ADMS+/-' is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of workstation data base systems, designated ADMS-; no communication takes place between these workstations. The use of this system radically decreases the response time of locally processed queries, since each workstation runs in single-user mode and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy reduces the overhead of update synchronization in message traffic.

  19. Technology for organization of the onboard system for processing and storage of ERS data for ultrasmall spacecraft

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.

    2017-10-01

    Processing and analyzing Earth remote sensing (ERS) data on board an ultra-small spacecraft is a pressing task, given the significant energy cost of data transfer and the low performance of onboard computers. This raises the problem of effective and reliable storage of the overall information flow from onboard data-collection systems, including ERS data, in a specialized data base. The paper considers the operation of a database management system with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the data base and contains the parameters required for loading information. This structure reduces the memory occupied by the data base because key values need not be stored separately. The paper presents the architecture of a relational database management system intended for embedding into onboard ultra-small spacecraft software. Data bases for storing various information, including ERS data, can be built with this database management system for subsequent processing. The proposed architecture places low demands on the computing power and memory resources available on board an ultra-small spacecraft, and data integrity is maintained during input and modification of structured information.
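    The space saving described above, where keys are not stored separately because the physical structure implies them, can be illustrated with a minimal sketch. The record fields, format, and class name below are invented for illustration; they are not from the paper:

```python
import struct

# Invented example record: latitude, longitude, and a short payload tag.
RECORD_FMT = "<f f 16s"
RECORD_SIZE = struct.calcsize(RECORD_FMT)

class ImplicitKeyStore:
    """Fixed-layout store: a record's key is its slot index, derived from its
    byte offset, so keys are never written to storage (saving the space a
    stored key column would occupy)."""

    def __init__(self, capacity):
        self.buf = bytearray(capacity * RECORD_SIZE)
        self.capacity = capacity

    def put(self, key, lat, lon, tag):
        off = key * RECORD_SIZE  # key -> physical position, nothing stored
        self.buf[off:off + RECORD_SIZE] = struct.pack(
            RECORD_FMT, lat, lon, tag.ljust(16)[:16].encode())

    def get(self, key):
        off = key * RECORD_SIZE
        lat, lon, tag = struct.unpack_from(RECORD_FMT, self.buf, off)
        return lat, lon, tag.rstrip(b" ").decode()
```

    The same idea generalizes to any layout in which a record's address is a function of its key, which suits memory-constrained onboard computers.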

  20. An adaptable XML based approach for scientific data management and integration

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-03-01

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, with which researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to modeling complex scientific data by representing them as XML documents. The documents capture not only hierarchically structured data but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make quick browsing convenient. For generality, schemas and hierarchies are customizable with XML-based definitions, so the system can be quickly adapted to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access, and customization with XML, SciPort offers a flexible and powerful platform for sharing scientific data in research communities, and it has been successfully used in both biomedical research and clinical trials.
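    The document model described, hierarchical metadata in XML with raw data held by reference, can be sketched in a few lines. The element names and the example record below are invented, not SciPort's actual schema, and the query uses Python's stdlib XPath subset rather than XQuery:

```python
import xml.etree.ElementTree as ET

# An invented SciPort-style document: hierarchical metadata plus a
# reference to an image file rather than the raw bytes themselves.
doc = ET.fromstring("""
<experiment id="exp-001">
  <meta><project>TrialX</project><site>LabA</site></meta>
  <image ref="scans/slice_042.dcm"/>
  <result name="cell_count" value="1289"/>
</experiment>
""")

# Queries in the spirit of the XQuery access the abstract describes:
refs = [img.get("ref") for img in doc.findall(".//image")]
count = int(doc.find(".//result[@name='cell_count']").get("value"))
```

    Keeping images and raw data as references keeps documents small and lets the native XML database index and query only the structured part.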

  1. An Adaptable XML Based Approach for Scientific Data Management and Integration.

    PubMed

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-02-20

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, with which researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to modeling complex scientific data by representing them as XML documents. The documents capture not only hierarchically structured data but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make quick browsing convenient. For generality, schemas and hierarchies are customizable with XML-based definitions, so the system can be quickly adapted to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access, and customization with XML, SciPort offers a flexible and powerful platform for sharing scientific data in research communities, and it has been successfully used in both biomedical research and clinical trials.

  2. Development and Demonstration of a Statistical Data Base System for Library and Network Planning and Evaluation. Fourth Quarterly Report.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has completed the development and demonstration of a library statistical data base. The data base, or management information system, was developed for administrators of public and academic libraries. The system provides administrators with a framework of information and…

  3. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zubair, M.; Ziebartt, John (Technical Monitor)

    2001-01-01

    The World Wide Web Consortium has developed the Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.

  4. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web Consortium has developed the Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
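    The discover-then-transform idea at the heart of XDMF can be sketched as a registry of transformation functions keyed by source and target format, looked up at run time. This is a simplified illustration, not XDMF's Java/Xalan implementation, and all names and formats below are invented:

```python
# Registry mapping (source format, target format) -> transformation function.
TRANSFORMS = {}

def register(src, dst):
    """Decorator that publishes a transformation so it can be discovered."""
    def deco(fn):
        TRANSFORMS[(src, dst)] = fn
        return fn
    return deco

@register("csv", "records")
def csv_to_records(text):
    # Turn CSV text into a list of dicts keyed by the header row.
    header, *rows = [line.split(",") for line in text.strip().splitlines()]
    return [dict(zip(header, row)) for row in rows]

def transform(src, dst, data):
    """Discover the registered transformation and apply it."""
    try:
        return TRANSFORMS[(src, dst)](data)
    except KeyError:
        raise LookupError(f"no transformation registered for {src} -> {dst}")
```

    In XDMF itself the registry role is played by XSLT stylesheets discovered alongside the data, but the lookup-and-apply flow is the same.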

  5. A web-based biosignal data management system for U-health data integration.

    PubMed

    Ro, Dongwoo; Yoo, Sooyoung; Choi, Jinwook

    2008-11-06

    In the ubiquitous healthcare environment, the biosignal data should be easily accessed and properly maintained. This paper describes a web-based data management system. It consists of a device interface, a data upload control, a central repository, and a web server. For the user-specific web services, a MFER Upload ActiveX Control was developed.

  6. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

    We present Insight, an integrated database and analysis platform for epilepsy self-management research, developed as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies, with several new data management features and user-friendly functionalities. The features of Insight include: (1) use of Common Data Elements defined by members of the research community, together with an epilepsy domain ontology, for data integration and querying; (2) visualization tools to support real-time exploration of data distribution across research studies; and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information; the data represent over 400 participants and 7552 data points. The data exploration and cohort identification query interface has been developed using Ruby on Rails and the open-source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The platform features a role-based access control module to authenticate users and manage their access to different research studies; access is managed by the Managing Epilepsy Well Network database steering committee, consisting of representatives of all current collaborating centers of the network. New research studies are continuously being added to the Insight database, and the size and unique coverage of the dataset allow investigators to conduct aggregate data analyses that will inform the next generation of epilepsy self-management studies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
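    The role-based access control pattern the abstract mentions, where access to individual studies is granted by role rather than per user, can be sketched as follows. The role names, study identifiers, and functions are invented for illustration; Insight's actual module is part of its Ruby on Rails application:

```python
# Invented role -> study grants; in a real system these live in a database.
ROLE_STUDIES = {
    "network_admin": {"study_1", "study_2", "study_3"},
    "site_member":   {"study_2"},
}

def accessible_studies(user_roles):
    """Union of all studies granted by the user's roles."""
    out = set()
    for role in user_roles:
        out |= ROLE_STUDIES.get(role, set())
    return out

def can_access(user_roles, study):
    return study in accessible_studies(user_roles)
```

    Centralizing grants on roles lets a steering committee change one role's study set instead of touching every user account.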

  7. IPAD 2: Advances in Distributed Data Base Management for CAD/CAM

    NASA Technical Reports Server (NTRS)

    Bostic, S. W. (Compiler)

    1984-01-01

    The Integrated Programs for Aerospace-Vehicle Design (IPAD) Project objective is to improve engineering productivity through better use of computer-aided design and manufacturing (CAD/CAM) technology. The focus is on development of technology and associated software for integrated company-wide management of engineering information. The objectives of this conference are as follows: to provide a greater awareness of the critical need by U.S. industry for advancements in distributed CAD/CAM data management capability; to present industry experiences and current and planned research in distributed data base management; and to summarize IPAD data management contributions and their impact on U.S. industry and computer hardware and software vendors.

  8. A practical framework for data management processes and their evaluation in population-based medical registries.

    PubMed

    Sariyar, M; Borg, A; Heidinger, O; Pommerening, K

    2013-03-01

    We present a framework for data management processes in population-based medical registries. Existing guidelines lack the concreteness we deem necessary for them to be of practical use, especially concerning the establishment of new registries. Therefore, we propose adjustments and concretisations with regard to data quality, data privacy, data security and registry purposes. First, we separately elaborate on the issues to be included into the framework and present proposals for their improvements. Thereafter, we provide a framework for medical registries based on quasi-standard-operation procedures. The main result is a concise and scientifically based framework that tries to be both broad and concrete. Within that framework, we distinguish between data acquisition, data storage and data presentation as sub-headings. We use the framework to categorise and evaluate the data management processes of a German cancer registry. The standardisation of data management processes in medical registries is important to guarantee high quality of the registered data, to enhance the realisation of purposes, to increase efficiency and to enable comparisons between registries. Our framework is destined to show how one central impediment for such standardisations - lack of practicality - can be addressed on scientific grounds.

  9. Polio Eradication Initiative contribution in strengthening immunization and integrated disease surveillance data management in WHO African region, 2014.

    PubMed

    Poy, Alain; Minkoulou, Etienne; Shaba, Keith; Yahaya, Ali; Gaturuku, Peter; Dadja, Landoh; Okeibunor, Joseph; Mihigo, Richard; Mkanda, Pascal

    2016-10-10

    Since the revamp of the Polio Eradication Initiative (PEI) in 1997, the PEI programme in the WHO African Region has invested in recruiting qualified data management staff and in developing data management systems and standard operating procedures to meet the Region's data management support needs. This support went beyond polio and was expanded to routine immunization and integrated surveillance of priority diseases, but the impact of polio data management support on other programmes, such as routine immunization and disease surveillance, has not yet been fully documented; that is what this article seeks to demonstrate. We reviewed how the polio data management area of work evolved progressively, along with the expansion of the data management team's capacity and the evolution of the data management systems, from the initial AFP case-based surveillance to routine immunization, other case-based disease surveillance, and supplementary immunization activities (SIAs). IDSR data availability improved with support from IST polio-funded data managers who collected the data from countries. The data management system developed by the polio team was used by countries to record information related not only to polio SIAs but also to other interventions. From the time routine immunization data became part of the polio data management team's responsibility, the number of reports received went from around 4000 in the first year (2005) to more than 30,000 in the second year and more than 47,000 in 2014. Polio data management has helped to improve overall VPD, IDSR, and routine data management, as well as emergency response, in the Region. As we approach the polio endgame, the African Region would benefit from using this existing infrastructure for other public health initiatives in the Region. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Design and realization of confidential data management system RFID-based

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Zhong; Wang, Xin

    2017-03-01

    This paper introduces the composition of an RFID system, then analyzes the system's hardware and software design, and finally summarizes the realization and application of the RFID-based confidential data management system.

  11. A case-based reasoning tool for breast cancer knowledge management with data mining concepts and techniques

    NASA Astrophysics Data System (ADS)

    Demigha, Souâd.

    2016-03-01

    The paper presents a case-based reasoning tool for breast cancer knowledge management, intended to improve breast cancer screening. To develop this tool, we combine concepts and techniques from both Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) with clinical cases. Case-based reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, and it is well suited to medical use. On the other hand, existing hospital information systems (HIS), radiological information systems (RIS), and picture archiving and communication systems (PACS) do not allow efficient management of medical information because of its complexity and heterogeneity. Data mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with data mining techniques will facilitate the diagnosis and decision-making of medical experts.
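    The retrieve step of CBR, matching a new problem to the most similar stored case and reusing its solution, can be sketched in a few lines. The feature vectors, labels, and distance measure below are invented toy values, not the paper's actual case base:

```python
# Invented toy case base: each case pairs a feature vector with a past solution.
cases = [
    {"features": (0.9, 0.1, 0.4), "solution": "benign"},
    {"features": (0.2, 0.8, 0.9), "solution": "follow-up biopsy"},
]

def retrieve(query):
    """Return the stored case closest to the query (squared Euclidean distance),
    whose solution is reused as the starting point for the new problem."""
    def dist(case):
        return sum((a - b) ** 2 for a, b in zip(case["features"], query))
    return min(cases, key=dist)
```

    Real CBR systems add the remaining steps of the cycle (reuse, revise, retain), and data mining techniques can supply better similarity measures and feature selection than this plain distance.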

  12. Data warehousing: toward knowledge management.

    PubMed

    Shams, K; Farishta, M

    2001-02-01

    With rapid changes taking place in the practice and delivery of health care, decision support systems have assumed an increasingly important role. More and more health care institutions are deploying data warehouse applications as decision support tools for strategic decision making. By making the right information available at the right time to the right decision makers in the right manner, data warehouses empower employees to become knowledge workers with the ability to make the right decisions and solve problems, creating strategic leverage for the organization. Health care management must plan and implement data warehousing strategy using a best-practice approach. Through the power of data warehousing, health care management can negotiate better managed-care contracts based on the ability to provide accurate data on case mix and resource utilization. Management can also save millions of dollars through the implementation of clinical pathways in better resource utilization and changing physician behavior to best practices based on evidence-based medicine.

  13. Data Base Management Systems Panel Workshop: Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Data base management systems (DBMS) for space acquired and associated data are discussed. The full range of DBMS needs is covered including acquiring, managing, storing, archiving, accessing and dissemination of data for an application. Existing bottlenecks in DBMS operations, expected developments in the field of remote sensing, communications, and computer science are discussed, and an overview of existing conditions and expected problems is presented. The requirements for a proposed spatial information system and characteristics of a comprehensive browse facility for earth observations applications are included.

  14. Research in Functionally Distributed Computer Systems Development. Volume XII. Design Considerations in Distributed Data Base Management Systems.

    DTIC Science & Technology

    1977-04-01

    The task of data organization, management, and storage has been given to a select group of specialists (the Data Base Administrators). ... In a distributed DBMS, this involves first identifying a set of two or more tasks blocking each other from a collection of shared records. Once the set of

  15. Vibroacoustic Payload Environment Prediction System (VAPEPS): VAPEPS management center remote access guide

    NASA Technical Reports Server (NTRS)

    Fernandez, J. P.; Mills, D.

    1991-01-01

    A Vibroacoustic Payload Environment Prediction System (VAPEPS) Management Center was established at the JPL. The center utilizes the VAPEPS software package to manage a data base of Space Shuttle and expendable launch vehicle payload flight and ground test data. Remote terminal access over telephone lines to the computer system, where the program resides, was established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the VAPEPS Management Center and contains instructions for utilizing the resources of the center.

  16. Identifying and Validating Requirements of a Mobile-Based Self-Management System for People Living with HIV.

    PubMed

    Mehraeen, Esmaeil; Safdari, Reza; Seyedalinaghi, Seyed Ahmad; Mohammadzadeh, Niloofar; Arji, Goli

    2018-01-01

    Because mobile technology is widespread and inexpensive, implementing a mobile-based self-management system can promote adherence to medication regimens and the health of people living with HIV (PLWH). We aimed to identify the requirements of a mobile-based self-management system and to validate them from the perspective of infectious disease specialists. This mixed-methods study was carried out in two main phases: in the first, we identified the requirements of a mobile-based self-management system for PLWH; in the second, the identified requirements were validated using a researcher-made questionnaire. The statistical population comprised infectious disease specialists affiliated with Tehran University of Medical Sciences. The collected data were analyzed with SPSS statistical software (version 19) using descriptive statistics. Through a full-text review of the selected studies, we determined the requirements of a mobile-based self-management system in four categories: demographic, clinical, strategic, and technical capabilities. According to the findings, 6 data elements were selected for the demographic category, 11 data elements for the clinical category, 10 items for self-management strategies, and 11 features for technical capabilities. Using the identified preferences, it is possible to design and implement a mobile-based self-management system for HIV-positive people. Developing such a system is expected to improve the self-management skills of PLWH, improve adherence to medication regimens, and facilitate communication with healthcare providers.

  17. Study on parallel and distributed management of RS data based on spatial data base

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Liu, Shijin

    2006-12-01

    With the rapid development of Earth-observing technology, the storage, management, and publication of remote sensing (RS) image data have become a bottleneck for its application and popularization. Two problems stand out in RS image data storage and management systems. First, a background server can hardly handle the heavy processing of large volumes of RS data stored at different nodes in a distributed environment, so a heavy burden is placed on the background server. Second, there is no unique, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. To address these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data, aimed at an RS data information system based on a parallel background server and a distributed data management system. Toward these goals, the paper studies the following key techniques and draws some instructive conclusions. It proposes a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For storage, RS data is not divided into binary large objects stored in a conventional relational database system; instead, it is reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common Web process and the parallel process.
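    The "Pyramid, Block, Layer, Epoch" solid index can be pictured as addressing every image tile by resolution level, spatial block, spectral band, and acquisition period. The sketch below is a guess at that shape from the abstract alone; the class, method names, and example values are invented:

```python
class SolidIndex:
    """Tiles addressed by (pyramid level, spatial block, band layer, epoch),
    mirroring the four axes of the solid index described in the abstract."""

    def __init__(self):
        self.tiles = {}

    def put(self, pyramid, block, layer, epoch, tile):
        self.tiles[(pyramid, block, layer, epoch)] = tile

    def get(self, pyramid, block, layer, epoch):
        return self.tiles.get((pyramid, block, layer, epoch))

    def epochs(self, pyramid, block, layer):
        """All acquisition periods available for one position and band."""
        return sorted(e for (p, b, l, e) in self.tiles
                      if (p, b, l) == (pyramid, block, layer))
```

    Addressing tiles this way lets a query pick a resolution and region first, then scan bands and time without touching unrelated data.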

  18. Data base management system configuration specification. [computer storage devices

    NASA Technical Reports Server (NTRS)

    Neiers, J. W.

    1979-01-01

    The functional requirements and the configuration of the data base management system are described. Techniques and technology which will enable more efficient and timely transfer of useful data from the sensor to the user, extraction of information by the user, and exchange of information among the users are demonstrated.

  19. Management Data for Selection Decisions in Building Library Collections.

    ERIC Educational Resources Information Center

    Hamaker, Charles A.

    1992-01-01

    Discusses the use of library management data, particularly circulation data, in making selection decisions for library collection development based on experiences at Louisiana State University. Development of a collection based on actual use rather than perceived research needs is considered, and the decision-making process for serials…

  20. Integrity-Based Budgeting

    ERIC Educational Resources Information Center

    Kaleba, Frank

    2008-01-01

    The central problem for the facility manager of large portfolios is not the accuracy of data, but rather data integrity. Data integrity means that it's (1) acceptable to the users; (2) based upon an objective source; (3) reproducible; and (4) internally consistent. Manns and Katsinas, in their January/February 2006 Facilities Manager article…

  1. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is important for human health. In the past, environmental data management meant developing a purpose-built environmental data management system, but such systems often lack real-time data retrieval and data sharing/interoperation capabilities. With the development of information technology, the Geospatial Service Web has been proposed and can be employed for environmental data management. The purpose of this study is to determine how to realize environmental data management under the Geospatial Service Web framework. We propose a real-time GIS (Geographic Information System) data model together with a Sensor Web service platform: the real-time GIS data model manages real-time data, and the Sensor Web service platform, built on Sensor Web technologies, supports the realization of that model. A Sensor Web service platform implementing the proposed model was developed and manages real-time environmental data such as meteorological, air quality, soil moisture, soil temperature, and landslide data. In addition, two use cases based on the real-time GIS data model, real-time air quality monitoring and real-time soil moisture monitoring, are realized and demonstrated on the platform; their total processing times were 3.7 s and 9.2 s, respectively. The experimental results show that integrating the real-time GIS data model with the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
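    The core of a real-time GIS data model, sensors appending timestamped observations that can be queried by latest value or time window, can be sketched as follows. The class and field names are invented for illustration and do not reproduce the paper's actual model:

```python
from datetime import datetime, timedelta

class ObservationStore:
    """Toy real-time observation store: each sensor accumulates timestamped
    readings; queries ask for the newest reading or a time window."""

    def __init__(self):
        self.series = {}  # sensor_id -> list of (timestamp, value)

    def insert(self, sensor_id, t, value):
        self.series.setdefault(sensor_id, []).append((t, value))

    def latest(self, sensor_id):
        # Newest reading by timestamp, regardless of arrival order.
        return max(self.series[sensor_id], key=lambda tv: tv[0])

    def window(self, sensor_id, start, end):
        return [(t, v) for t, v in self.series[sensor_id] if start <= t <= end]
```

    A Sensor Web service platform would wrap operations like these behind standardized web service interfaces so that heterogeneous sensors can feed the same model.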

  2. A conceptual design for an integrated data base management system for remote sensing data. [user requirements and data processing

    NASA Technical Reports Server (NTRS)

    Maresca, P. A.; Lefler, R. M.

    1978-01-01

    The requirements of potential users were considered in the design of an integrated data base management system, developed to be independent of any specific computer or operating system, and to be used to support investigations in weather and climate. Ultimately, the system would expand to include data from the agriculture, hydrology, and related Earth resources disciplines. An overview of the system and its capabilities is presented. Aspects discussed cover the proposed interactive command language; the application program command language; storage and tabular data maintained by the regional data base management system; the handling of data files and the use of system standard formats; various control structures required to support the internal architecture of the system; and the actual system architecture with the various modules needed to implement the system. The concepts on which the relational data model is based; data integrity, consistency, and quality; and provisions for supporting concurrent access to data within the system are covered in the appendices.

  3. Five Years Experience with the CLINFO Data Base Management and Analysis System

    PubMed Central

    Johnston, Howard B.; Higgins, Stanley B.; Harris, Thomas R.; Lacy, William W.

    1982-01-01

    The CLINFO data base management and analysis system is the result of a project sponsored by the National Institutes of Health (NIH) to identify data management and data analysis activities that are critical to clinical investigation. In February of 1977, one of the three prototype CLINFO systems developed by the RAND Corporation was installed in the Clinical Research Center (CRC) at Vanderbilt University Medical Center. The Vanderbilt experience with this CLINFO system over the past five years is described. Its impact on the way clinical research data has been managed and analyzed is discussed in terms of utilization by more than 100 clinical investigators and their staff. The Vanderbilt evaluation of the system and additional information on its usage since the original evaluation is presented. Factors in the design philosophy of CLINFO which create an environment that enhances the clinical investigator's capabilities to perform computer data management and analysis of his data are discussed.

  4. Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-Based Systems

    ERIC Educational Resources Information Center

    An, Ho

    2012-01-01

    In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…

  5. Development of a statewide Landsat digital data base for forest insect damage assessment

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Dottavio, C. L.; Nelson, R. F.

    1983-01-01

    A Joint Research Project (JRP) involving NASA/Goddard Space Flight Center and the Pennsylvania Bureau of Forestry/Division of Forest Pest Management demonstrates the utility of Landsat data for assessing forest insect damage. A major effort within the project has been the creation of a map-registered, statewide Landsat digital data base for Pennsylvania. The data base, developed and stored on computers at the Pennsylvania State University Computation Center, contains Landsat imagery, a Landsat-derived forest resource map, and digitized data layers depicting Forest Pest Management District boundaries and county boundaries. A data management front-end system was also developed to provide an interface between the various layers of information within the data base and image analysis software. This front-end system ensures that an automated assessment of defoliation damage can be conducted and summarized by geographic area or jurisdiction of interest.

  6. A Split-Path Schema-Based RFID Data Storage Model in Supply Chain Management

    PubMed Central

    Fan, Hua; Wu, Quanyuan; Lin, Yisong; Zhang, Jianfeng

    2013-01-01

    In modern supply chain management systems, Radio Frequency IDentification (RFID) technology has become an indispensable sensor technology and massive RFID data sets are expected to become commonplace. More and more space and time are needed to store and process such huge amounts of RFID data, and there is an increasing realization that the existing approaches cannot satisfy the requirements of RFID data management. In this paper, we present a split-path schema-based RFID data storage model. With a data separation mechanism, the massive RFID data produced in supply chain management systems can be stored and processed more efficiently. Then a tree structure-based path splitting approach is proposed to intelligently and automatically split the movement paths of products. Furthermore, based on the proposed new storage model, we design the relational schema to store the path information and time information of tags, and some typical query templates and SQL statements are defined. Finally, we conduct various experiments to measure the effect and performance of our model and demonstrate that it performs significantly better than the baseline approach in both the data expression and path-oriented RFID data query performance. PMID:23645112
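
    The path-splitting idea can be sketched with a prefix tree: shared movement-path prefixes are stored once, and product paths diverge at split nodes. A minimal illustration follows (the node structure and location names are invented for illustration, not the paper's relational schema):

```python
class PathNode:
    """Node in a prefix tree of RFID movement paths (e.g. factory -> warehouse -> store)."""
    def __init__(self, location):
        self.location = location
        self.children = {}  # next location -> PathNode

def insert_path(root, path):
    """Insert a movement path, reusing any already-stored prefix."""
    node = root
    for loc in path:
        node = node.children.setdefault(loc, PathNode(loc))
    return node

root = PathNode(None)
insert_path(root, ["factory", "warehouse", "store-A"])
insert_path(root, ["factory", "warehouse", "store-B"])
# The shared prefix factory -> warehouse is stored once; the paths split afterwards.
print(sorted(root.children["factory"].children["warehouse"].children))  # ['store-A', 'store-B']
```

    The paper's storage model maps such split paths into relational tables for tag path and time information; the tree above only illustrates the splitting step.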

  7. Using expert systems to implement a semantic data model of a large mass storage system

    NASA Technical Reports Server (NTRS)

    Roelofs, Larry H.; Campbell, William J.

    1990-01-01

    The successful development of large-volume data storage systems will depend not only on the designers' ability to store data, but on the ability to manage such data once it is in the system. The hypothesis is that mass storage data management can be implemented successfully only on the basis of highly intelligent metadata management services. A mass store system standard proposed by the IEEE addresses many of the issues related to storing large volumes of data; however, the model does not consider a major technical issue, namely the high-level management of stored data. If the model were expanded to include the semantics and pragmatics of the data domain using a Semantic Data Model (SDM) concept, the result would be data that express the Intelligent Information Fusion (IIF) concept and that are organized and classified in the context of their use and purpose. The results are presented of a demonstration prototype SDM implemented using the expert system development tool NEXPERT OBJECT. In the prototype, a simple instance of an SDM was created to support a hypothetical application for the Earth Observing System Data and Information System (EOSDIS). The massive amounts of data that EOSDIS will manage require the definition and design of a powerful information management system to support even the most basic needs of the project. The application domain is characterized by a semantic-like network that represents the data content and the relationships between the data, based on user views and the more generalized architectural view of the information world. The data in the domain are represented by objects that define classes, types, and instances of the data; in addition, data properties are selectively inherited through parent-daughter relationships in the domain. Based on the SDM, a simple information system design is developed, from the low-level data storage media, through record management and metadata management, to the user interface.
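
    The selective property inheritance that a semantic data model provides can be sketched as follows; this is a toy illustration with assumed class and attribute names, not the NEXPERT OBJECT prototype:

```python
class SDMClass:
    """Toy semantic-data-model node: attributes not set locally are inherited
    from the parent class, mimicking parent-daughter property inheritance."""
    def __init__(self, name, parent=None, **attrs):
        self.name, self.parent, self.attrs = name, parent, attrs

    def lookup(self, key):
        if key in self.attrs:
            return self.attrs[key]          # local attribute wins
        if self.parent is not None:
            return self.parent.lookup(key)  # else walk up the hierarchy
        raise KeyError(key)

dataset = SDMClass("Dataset", archive_medium="tape")
image = SDMClass("SatelliteImage", parent=dataset, sensor="MODIS")
print(image.lookup("sensor"), image.lookup("archive_medium"))  # MODIS tape
```

    A full SDM would also carry relationships between classes (the semantic network); the sketch shows only inheritance.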

  8. Contracts and management services site support program plan WBS 6.10.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knoll, J.M. Jr.

    1994-09-01

    Contracts and Management Services is recognized as the central focal point for programs having company- or sitewide application in pursuit of the Hanford Mission's financial and operational objectives. Contracts and Management Services actively pursues cost savings and operational efficiencies through: Management Standards, by ensuring all employees have an accessible, integrated system of clear, complete, accurate, timely, and useful management control policies and procedures; Contract Reform, by restructuring the contract, organization, and cost accounting systems to refocus Hanford contract activities on output products; Systems and Operations Evaluation, by directing the Cost Reduction program, Great Ideas, and Span of Management activities; Program Administration, by enforcing conditions of Accountability (whether DEAR-based or FAR-based) for WHC, BCSR, ICF KH, and BHI, directing Contract Performance activities, chairing the WHC Cost Reduction Review Board, and analyzing companywide Performance Measures; and Data Standards and Administration, by establishing and directing the company data management program, giving direction to the major RL programs and mission areas for implementation of cost-effective and efficient data management practices, directing all operations, applications, and interfaces contained within the Hanford PeopleCore System, directing accomplishment and delivery of TPA data management milestones, and directing the sitewide data management processes for Data Standards and the Data Directory.

  9. A Data-Based Financial Management Information System (FMIS) for Administrative Sciences Department

    DTIC Science & Technology

    1990-12-01

    Financial Management Information System that would result in improved management of financial assets, better use of clerical skills, and more detailed...develops and implements a personal computer-based Management Information System for the management of the many funding accounts controlled by the...different software programs, into a single all-encompassing Management Information System. The system was written using dBASE IV and is currently operational.

  10. Improving patient recruitment to multicentre clinical trials: the case for employing a data manager in a district general hospital-based oncology centre.

    PubMed

    Street, A; Strong, J; Karp, S

    2001-01-01

    One of the most frequently cited reasons for poor recruitment to multicentre randomized clinical trials is the additional workload placed on clinical staff. We report the effect on patient recruitment of employing a data manager to support clinical staff in an English district general hospital (DGH). In addition, we explore the effect data managers have on the quality of data collected, proxied by the number of queries arising with the trial organizers. We estimate that the cost of employing a data manager on a full-time basis is £502 per patient recruited but may amount to £326 if the appointment is part-time. Data quality is high when full responsibility lies with a data manager but falls when responsibility is shared. Whether the costs of employing a data manager to recruit patients from a DGH are worth incurring depends on the value placed on the speed at which multicentre trials can be completed, how important it is to broaden the research base beyond the traditional setting of teaching hospitals, and the amount of evaluative data required.

  11. Wiki-based Data Management to Support Systems Toxicology*

    EPA Science Inventory

    As the field of toxicology relies more heavily on systems approaches for mode of action discovery, evaluation, and modeling, the need for integrated data management is greater than ever. To meet these needs, we developed a flexible data management system that assists scientists ...

  12. A data base and analysis program for shuttle main engine dynamic pressure measurements

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.
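
    The SVD-based spectrum interpolation can be sketched as a low-rank decomposition whose per-power-level weights are interpolated to an unmeasured level; the numbers below are invented for illustration and are not SSME data:

```python
import numpy as np

# Rows: spectra measured at known power levels; columns: frequency bins.
levels = np.array([65.0, 100.0, 109.0])
spectra = np.array([[1.0, 2.0, 0.5, 0.1],
                    [2.0, 4.0, 1.0, 0.2],
                    [2.2, 4.4, 1.1, 0.22]])

U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
k = 1                       # keep the dominant singular component
coeffs = U[:, :k] * s[:k]   # per-level weights of the retained basis spectrum

# Interpolate the weights to an unmeasured power level, then rebuild the spectrum.
target = 104.0
w = np.array([np.interp(target, levels, coeffs[:, j]) for j in range(k)])
estimate = w @ Vt[:k]
print(np.round(estimate, 2))
```

    Because the toy rows are exact multiples of one basis spectrum, the rank-1 reconstruction is exact; real spectra would need more components and a judgment about where to truncate.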

  13. A Management Information System for Bare Base Civil Engineering Commanders

    DTIC Science & Technology

    1988-09-01

    initial beddown stage. The purpose of this research was to determine the feasibility of developing a microcomputer based management information system (MIS...the software best suited to synthesize four of the categories into a prototype field MIS. Keywords: Management information system, Bare bases, Civil engineering, Data bases, Information retrieval.

  14. [Study of sharing platform of web-based enhanced extracorporeal counterpulsation hemodynamic waveform data].

    PubMed

    Huang, Mingbo; Hu, Ding; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian

    2011-12-01

    Enhanced extracorporeal counterpulsation (EECP) information consists of both text and hemodynamic waveform data. At present, EECP text information is successfully managed through the Web browser, while the management and sharing of hemodynamic waveform data over the Internet has not yet been solved. In order to manage EECP information completely, and based on an in-depth analysis of the EECP hemodynamic waveform file in digital imaging and communications in medicine (DICOM) format and its disadvantages for Internet sharing, we proposed using the extensible markup language (XML), currently the popular data exchange standard on the Internet, as the storage specification for sharing EECP waveform data. We then designed a web-based sharing system for EECP hemodynamic waveform data on the ASP.NET 2.0 platform. We also describe the four main system function modules and their implementation methods: the DICOM-to-XML conversion module, the EECP waveform data management module, the EECP waveform retrieval and display module, and the system's security mechanism.
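
    The core of a DICOM-to-XML conversion step, serializing waveform samples into a shareable XML document, might look like the sketch below; the element and attribute names are an assumed schema, not the one in the paper:

```python
import xml.etree.ElementTree as ET

def waveform_to_xml(patient_id, sampling_hz, samples):
    """Serialize a hemodynamic waveform to XML for web sharing (hypothetical schema)."""
    root = ET.Element("EECPWaveform", patientID=patient_id, samplingRate=str(sampling_hz))
    data = ET.SubElement(root, "Samples")
    data.text = " ".join(f"{s:.3f}" for s in samples)  # space-separated sample values
    return ET.tostring(root, encoding="unicode")

xml_doc = waveform_to_xml("P001", 250, [0.12, 0.34, 0.56])
print(xml_doc)
```

    A production system would also carry the DICOM header fields (acquisition time, channel definitions) into the XML so that nothing is lost relative to the source file.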

  15. [Traditional Chinese Medicine data management policy in big data environment].

    PubMed

    Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le

    2018-02-01

    As the traditional data management model cannot effectively manage the massive data in traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes and the diversity and abstraction of data representations, a management strategy for TCM data based on big data technology is proposed. Grounded in the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes and the non-uniformity of data representation by exploiting the schema-free storage of objects in big data technology. A hybrid indexing mode resolves the conflicts brought by different storage modes during indexing, and efficient parallel MapReduce processing provides powerful query capabilities over massive data. The theoretical analysis provides the management framework and its key technology, and its performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results showed that this strategy can effectively solve the storage problem of TCM information, with good performance in query efficiency, completeness, and robustness. Copyright© by the Chinese Pharmaceutical Association.
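
    The combination of schema-free records with MapReduce-style aggregation can be sketched as a map phase emitting key-value pairs from records whose attributes vary, and a reduce phase aggregating them; the record contents below are invented for illustration:

```python
from collections import defaultdict

records = [  # schema-free records: attributes vary from record to record
    {"name": "liu wei di huang wan", "herb": "rehmannia", "dose_g": 9},
    {"name": "si jun zi tang", "herb": "ginseng", "dose_g": 6},
    {"herb": "rehmannia", "dose_g": 12},  # no "name" attribute at all
]

def map_phase(rec):
    # Emit (herb, dose) pairs; missing attributes default harmlessly.
    yield (rec["herb"], rec.get("dose_g", 0))

def reduce_phase(pairs):
    # Sum doses per herb, as a parallel reducer would per key.
    totals = defaultdict(int)
    for herb, dose in pairs:
        totals[herb] += dose
    return dict(totals)

mapped = [kv for rec in records for kv in map_phase(rec)]
print(reduce_phase(mapped))  # {'rehmannia': 21, 'ginseng': 6}
```

    On Hadoop the two phases would run in parallel across splits of the data; the single-process sketch only shows the dataflow.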

  16. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  17. Managing EEE part standardisation and procurement

    NASA Astrophysics Data System (ADS)

    Serieys, C.; Bensoussan, A.; Petitmangin, A.; Rigaud, M.; Barbaresco, P.; Lyan, C.

    2002-12-01

    This paper presents development activities in space component selection and procurement involving a new data base tool implemented at Alcatel Space, using the TransForm software configurator developed by Techform S.A. Based on TransForm, Access Ingenierie has developed a software product named OLG@DOS which facilitates part nomenclature analyses for new equipment design and manufacturing, in terms of an ACCESS data base implementation. Hi-Rel EEE part type technical, production, and quality information are collected and compiled using a production data base issued from the production tools implemented for equipment definition, description, and production, based on Manufacturing Resource Planning (MRP II Control Open) and Parametric Design Manager (PDM Work Manager). The nomenclature of any new equipment may be analyzed through this means for standardisation purposes, for cost containment programs, and for managing procurement activities, as well as for preparing component reviews such as Part Approval Document and Declared Part List validation.

  18. A distributed data base management capability for the deep space network

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1976-01-01

    This paper reports on the Configuration Control and Audit Assembly (CCA), which has been designed to provide a distributed data base management capability for the DSN. The CCA utilizes capabilities provided by the DSN standard minicomputer and the DSN standard non-real-time, high-level, management-oriented programming language, MBASIC. The characteristics of the CCA for the first phase of implementation are described.

  19. Census Data System of the Management Information System for Occupational Education: Guidelines and Instructions for Reporting. CDS Document No. 1.

    ERIC Educational Resources Information Center

    Management and Information System for Occupational Education, Winchester, MA.

    The MISOE Census Data System (CDS) is one of two major subsystems of an integrated management information system (MISOE), which was developed to provide occupational education managers with comprehensive data on which to base rational management decisions. Essentially, CDS contains descriptive information systematically structured in a manner…

  20. Organization and management of heterogeneous, dispersed data bases in nuclear engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastman, C.M.

    1986-01-01

    Large, complex, multiperson engineering projects in many areas (nuclear, aerospace, electronics, and manufacturing) have inherent needs for coordination, control, and management of the related engineering data. Taken in the abstract, the notion of an integrated engineering data base (IED) for such projects is attractive. The potential capabilities of an IED are that all data are managed in a coordinated way; are made accessible to all users who need them; allow relations between all parts of the data to be tracked and managed; provide backup, recovery, audit trails, security, and access control; and allow overall project status to be monitored and managed. Common data accessing schemes and user interfaces to applications are also part of an IED. This paper describes a new software product that allows incremental realization of many of the capabilities of an IED, without the massive disruption and risk.

  1. San Juan National Forest Land Management Planning Support System (LMPSS) requirements definition

    NASA Technical Reports Server (NTRS)

    Werth, L. F. (Principal Investigator)

    1981-01-01

    The role of remote sensing data as it relates to a three-component land management planning system (geographic information, data base management, and planning model) can be understood only when user requirements are known. Personnel at the San Juan National Forest in southwestern Colorado were interviewed to determine data needs for managing and monitoring timber, rangelands, wildlife, fisheries, soils, water, geology and recreation facilities. While all the information required for land management planning cannot be obtained using remote sensing techniques, valuable information can be provided for the geographic information system. A wide range of sensors such as small and large format cameras, synthetic aperture radar, and LANDSAT data should be utilized. Because of the detail and accuracy required, high altitude color infrared photography should serve as the baseline data base and be supplemented and updated with data from the other sensors.

  2. Residual acceleration data on IML-1: Development of a data reduction and dissemination plan

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Wolf, Randy

    1992-01-01

    The main thrust of our work in the third year of contract NAG8-759 was the development and analysis of various data processing techniques that may be applicable to residual acceleration data. Our goal is the development of a data processing guide that low gravity principal investigators can use to assess their need for accelerometer data and then formulate an acceleration data analysis strategy. The work focused on the flight of the first International Microgravity Laboratory (IML-1) mission. We are also developing a data base management system to handle large quantities of residual acceleration data. This type of system should be an integral tool in the detailed analysis of accelerometer data. The system will manage a large graphics data base in the support of supervised and unsupervised pattern recognition. The goal of the pattern recognition phase is to identify specific classes of accelerations so that these classes can be easily recognized in any data base. The data base management system is being tested on the Spacelab 3 (SL3) residual acceleration data.

  3. Application of data mining in science and technology management information system based on WebGIS

    NASA Astrophysics Data System (ADS)

    Wu, Xiaofang; Xu, Zhiyong; Bao, Shitai; Chen, Feixiang

    2009-10-01

    With the rapid development of science and technology and the quick increase of information, a great deal of data has accumulated in science and technology management departments. Much knowledge and many rules are contained, often concealed, in these data, so excavating and using that knowledge fully is very important in the management of science and technology: it helps projects to be examined and approved more scientifically and achievements to be transformed into realistic productive forces more easily. This paper therefore researches and applies data mining technology to the science and technology management information system in order to discover and excavate such knowledge. After analyzing the disadvantages of the traditional science and technology management information system, database technology, data mining, and web geographic information system (WebGIS) technology are introduced to develop and construct a science and technology management information system based on WebGIS. Key problems such as data mining and statistical analysis are researched in detail. Moreover, a prototype system is developed and validated on project data from the National Natural Science Foundation Committee. Spatial data mining is conducted along the axes of time, space, and other factors, and a variety of knowledge and rules are excavated using data mining technology, providing effective support for decision-making.

  4. WebBio, a web-based management and analysis system for patient data of biological products in hospital.

    PubMed

    Lu, Ying-Hao; Kuo, Chen-Chun; Huang, Yaw-Bin

    2011-08-01

    We selected HTML, PHP, and JavaScript as the programming languages to build "WebBio", a web-based system for patient data of biological products, with MySQL as the database. WebBio is based on the PHP-MySQL suite and runs under an Apache server on a Linux machine. WebBio provides data management, searching, and data analysis functions for 20 kinds of biological products (plasma expanders, human immunoglobulin, and hematological products). There are two particular features in WebBio: (1) pharmacists can rapidly find out which patients used contaminated products, for medication safety; and (2) statistics charts for a specific product can be generated automatically, reducing pharmacists' workload. WebBio has successfully turned traditional paper work into web-based data management.
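
    The lot-tracing query that lets pharmacists rapidly find patients who received a contaminated product can be sketched against a minimal relational schema; the schema and values are hypothetical, and SQLite stands in for the paper's MySQL:

```python
import sqlite3

# Hypothetical minimal schema: which patient received which lot of which product.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE dispensed (patient_id TEXT, product TEXT, lot TEXT)")
db.executemany("INSERT INTO dispensed VALUES (?, ?, ?)", [
    ("P01", "IVIG", "LOT-A1"),
    ("P02", "albumin", "LOT-B7"),
    ("P03", "IVIG", "LOT-A1"),
])

# Trace every patient who received the contaminated lot.
rows = db.execute(
    "SELECT patient_id FROM dispensed WHERE lot = ? ORDER BY patient_id", ("LOT-A1",)
).fetchall()
print([r[0] for r in rows])  # ['P01', 'P03']
```

    An index on the `lot` column would keep this lookup fast as dispensing records accumulate.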

  5. Personal diabetes management system based on ubiquitous computing technology.

    PubMed

    Park, Kyung-Soon; Kim, Nam-Jin; Hong, Joo-Hyun; Park, Mi-Sook; Cha, Eun-Jung; Lee, Tae-Soo

    2006-01-01

    Assisting diabetes patients to self-manage blood glucose testing and insulin injection is of great importance for their healthcare. This study presents a PDA-based system that manages personal glucose level data, interfaced with a small glucometer through a serial port. The data stored in the PDA can be transmitted by cradle or wireless communication to a remote web server, where further medical analysis and service are provided. This system enables more efficient and systematic care of diabetes patients through self-management and remote medical practice.

  6. The Potential of Knowing More: A Review of Data-Driven Urban Water Management.

    PubMed

    Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max

    2017-03-07

    The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.

  7. Generic functional requirements for a NASA general-purpose data base management system

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.

    1981-01-01

    Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user friendliness; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high-level nonprocedural interactive query and data manipulation language; (10) data base maintenance utilities; (11) high-rate input/output and large-data-volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  8. An approach for management of geometry data

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Herron, G. J.; Schweitzer, J. E.; Warkentine, E. R.

    1980-01-01

    The strategies for managing Integrated Programs for Aerospace Design (IPAD) computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. IPAD's data base system makes this information available to all authorized departments in a company. A discussion of the data structures and algorithms required to support geometry in IPIP (IPAD's data base management system) is presented. Through the use of IPIP's data definition language, the structure of the geometry components is defined. The data manipulation language is the vehicle by which a user defines an instance of the geometry. The manipulation language also allows a user to edit, query, and manage the geometry. The selection of canonical forms is a very important part of the IPAD geometry. IPAD has a canonical form for each entity and provides transformations to alternate forms; in particular, IPAD will provide a transformation to the ANSI standard. The DBMS schemas required to support IPAD geometry are explained.
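
    The canonical-form idea, one canonical representation per entity with transformations to alternate forms, can be sketched as follows; the circle entity and its parametric alternate form are illustrative assumptions, not IPAD's actual canonical forms:

```python
import math

def circle_canonical(cx, cy, r):
    """Hypothetical canonical form of a circle: center plus radius."""
    return {"type": "circle", "center": (cx, cy), "radius": r}

def to_parametric(circle, n=4):
    """Transformation to an alternate form: n points sampled on the circle."""
    cx, cy = circle["center"]
    r = circle["radius"]
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n))
            for k in range(n)]

c = circle_canonical(0.0, 0.0, 1.0)
pts = to_parametric(c)
print([(round(x, 3), round(y, 3)) for x, y in pts])
```

    Keeping one canonical form and deriving the rest on demand avoids storing redundant, possibly inconsistent representations of the same geometry.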

  9. A Data Base Management System for Clinical and Epidemiologic Studies In Systemic Lupus Erythematosus: Design and Maintenance

    PubMed Central

    Kosmides, Victoria S.; Hochberg, Marc C.

    1984-01-01

    This report describes the development, design specifications, features and implementation of a data base management system (DBMS) for clinical and epidemiologic studies in SLE. The DBMS is multidimensional with arrays formulated across patients, studies and variables. The major impact of this DBMS has been to increase the efficiency of managing and analyzing vast amounts of clinical and laboratory data and, as a result, to allow for continued growth in research productivity in areas related to SLE.

  10. Reusing Information Management Services for Recommended Decadal Study Missions to Facilitate Aerosol and Cloud Studies

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve

    2008-01-01

    NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.

  11. AOIPS data base management systems support for GARP data sets

    NASA Technical Reports Server (NTRS)

    Gary, J. P.

    1977-01-01

    A data base management system, developed to provide flexible access to data sets produced by GARP during its data systems tests, is identified. The content and coverage of the data base are defined, and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user-specified data subsets, is described. The computer programs developed to provide this capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user-interactive but menu-guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user-selected window defined by time and latitude/longitude boundaries. The DRS also permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.
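    The window-subsetting step the DRS performs can be sketched in a few lines. This is a minimal illustration, not the GARP tape formats: the record layout and field names below are invented.

```python
from dataclasses import dataclass

@dataclass
class Record:
    time: float      # hypothetical time stamp, e.g. hours into the test
    lat: float       # degrees north
    lon: float       # degrees east
    payload: dict    # the remaining data items in the record

def in_window(rec, t0, t1, lat0, lat1, lon0, lon1):
    """True if the record falls inside the user-selected time and
    latitude/longitude window."""
    return (t0 <= rec.time <= t1
            and lat0 <= rec.lat <= lat1
            and lon0 <= rec.lon <= lon1)

def subset(records, **window):
    """Create a subset data set from a user-selected window."""
    return [r for r in records if in_window(r, **window)]
```

    A duplicate data set is then just the degenerate case in which the window spans the whole tape.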

  12. A CERIF-Compatible Research Management System Based on the MARC 21 Format

    ERIC Educational Resources Information Center

    Ivanovic, Dragan; Milosavljevic, Gordana; Milosavljevic, Branko; Surla, Dusan

    2010-01-01

    Purpose: The entry of data about published research results should be implemented as a web application that enables authors to input their own data without knowledge of the bibliographic standard. The aim of this research is to develop a research management system based on a bibliographic standard and to provide data exchange with other research…

  13. Students' Groupwork Management in Online Collaborative Learning Environments

    ERIC Educational Resources Information Center

    Xu, Jianzhong; Du, Jianxia; Fan, Xitao

    2015-01-01

    The present study investigates empirical models of groupwork management in online collaborative learning environments, based on data from 298 students (86 groups) in the United States. Data revealed that, at the group level, groupwork management was positively associated with feedback and help seeking. Data further revealed that, at the individual…

  14. Keys to success for data-driven decision making: Lessons from participatory monitoring and collaborative adaptive management

    USDA-ARS?s Scientific Manuscript database

    Recent years have witnessed a call for evidence-based decisions in conservation and natural resource management, including data-driven decision-making. Adaptive management (AM) is one prevalent model for integrating scientific data into decision-making, yet AM has faced numerous challenges and limit...

  15. SDMS: A scientific data management system

    NASA Technical Reports Server (NTRS)

    Massena, W. A.

    1978-01-01

    SDMS is a data base management system developed specifically to support scientific programming applications. It consists of a data definition program to define the forms of data bases, and FORTRAN-compatible subroutine calls to create and access data within them. Each SDMS data base contains one or more data sets. A data set has the form of a relation. Each column of a data set is defined to be either a key or data element. Key elements must be scalar. Data elements may also be vectors or matrices. The data elements in each row of the relation form an element set. SDMS permits direct storage and retrieval of an element set by specifying the corresponding key element values. To support the scientific environment, SDMS allows the dynamic creation of data bases via subroutine calls. It also allows intermediate or scratch data to be stored in temporary data bases which vanish at job end.
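    The relation-like organization described above can be sketched as follows. This is a minimal illustration in Python, not SDMS's actual FORTRAN-compatible call interface; all names are invented.

```python
class DataSet:
    """A data set has the form of a relation: scalar key elements
    identify an element set (row) whose data elements may be
    scalars, vectors, or matrices."""
    def __init__(self, key_cols, data_cols):
        self.key_cols, self.data_cols = key_cols, data_cols
        self._rows = {}

    def store(self, keys, data):
        # direct storage of an element set by its key element values
        self._rows[tuple(keys)] = dict(zip(self.data_cols, data))

    def retrieve(self, keys):
        # direct retrieval by specifying the corresponding key values
        return self._rows[tuple(keys)]

# a scratch data base of one data set; "p_profile" is a vector data element
pressures = DataSet(key_cols=["run", "station"], data_cols=["p_profile"])
pressures.store(keys=[1, 7], data=[[101.3, 97.2, 88.0]])
```

    A temporary data base in this sketch is simply a `DataSet` never written to disk, mirroring SDMS's scratch data bases that vanish at job end.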

  16. Starting a research data management program based in a university library.

    PubMed

    Henderson, Margaret E; Knott, Teresa L

    2015-01-01

    As the need for research data management grows, many libraries are considering adding data services to help with the research mission of their institution. The Virginia Commonwealth University (VCU) Libraries created a position and hired a director of research data management in September 2013. The position was new to the libraries and the university. With the backing of the library administration, a plan for building relationships with VCU faculty, researchers, students, service and resource providers, including grant administrators, was developed to educate and engage the community in data management plan writing and research data management training.

  17. Starting a Research Data Management Program Based in a University Library

    PubMed Central

    Henderson, Margaret E.; Knott, Teresa L.

    2015-01-01

    As the need for research data management grows, many libraries are considering adding data services to help with the research mission of their institution. The VCU Libraries created a position and hired a director of research data management in September 2013. The position was new to the libraries and the university. With the backing of the library administration, a plan for building relationships with VCU faculty, researchers, students, service and resource providers, including grant administrators, was developed to educate and engage the community in data management plan writing and research data management training. PMID:25611440

  18. Summaries of Minnehaha Creek Watershed District Plans/Studies/Reports

    DTIC Science & Technology

    2004-01-30

    + Management of all wetland functional assessment data in a Microsoft Access© database + Development of a GIS wetland data management system + Recommendations...Task B: Design GIS-Based Decision Making Model: Scenario-Based Model of Landuse Hydro Data Monitoring $125,000 $125,000 Task C: Water Quality...+ Landuse and land cover data + Watershed GIS data layers + Flood Insurance Rate Maps + Proposed project locations + Stream miles, reaches and conditions

  19. Associative programming language and virtual associative access manager

    NASA Technical Reports Server (NTRS)

    Price, C.

    1978-01-01

    APL provides convenient associative data manipulation functions in a high level language. Six statements were added to PL/1 via a preprocessor: CREATE, INSERT, FIND, FOR EACH, REMOVE, and DELETE. They allow complete control of all data base operations. During execution, data base management programs perform the functions required to support the APL language. VAAM is the data base management system designed to support the APL language. APL/VAAM is used by CADANCE, an interactive graphic computer system. VAAM is designed to support heavily referenced files. Virtual memory files, which utilize the paging mechanism of the operating system, are used. VAAM supports a full network data structure. The two basic blocks in a VAAM file are entities and sets. Entities are the basic information element and correspond to PL/1 based structures defined by the user. Sets contain the relationship information and are implemented as arrays.
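    The entity/set organization can be illustrated with a small sketch, written in Python rather than PL/1. The class and method names are invented to mirror the six APL statements; they are not VAAM's actual interface.

```python
class Vaam:
    """Toy network data structure: entities are information records,
    sets are arrays holding owner-to-member relationships."""
    def __init__(self):
        self.entities = {}   # entity id -> attribute record
        self.sets = {}       # set name -> {owner id: [member ids]}
        self._next = 0

    def create(self, **attrs):                   # CREATE an entity
        self._next += 1
        self.entities[self._next] = attrs
        return self._next

    def insert(self, set_name, owner, member):   # INSERT into a set
        self.sets.setdefault(set_name, {}).setdefault(owner, []).append(member)

    def find(self, **attrs):                     # FIND first matching entity
        return next(eid for eid, a in self.entities.items()
                    if all(a.get(k) == v for k, v in attrs.items()))

    def for_each(self, set_name, owner):         # FOR EACH member of a set
        for m in self.sets.get(set_name, {}).get(owner, []):
            yield self.entities[m]

    def remove(self, set_name, owner, member):   # REMOVE from a set
        self.sets[set_name][owner].remove(member)

    def delete(self, eid):                       # DELETE the entity itself
        self.entities.pop(eid)
```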

  20. Regional lists of plant species that occur in wetlands: data base user's guide

    USGS Publications Warehouse

    Reed, Porter B.; Auble, Gregor T.; Muhlenbruck, Jill E.; Manci, Karen M.

    1989-01-01

    The Data Base List of Plant Species that Occur in Wetlands (LIST) currently contains records for 6,728 plant species. Each record provides information on nomenclature, plant characteristics and lifeforms, distribution, and frequency of occurrence in wetlands. The List of Plant Species that Occur in Wetlands, developed to supplement the U.S. Fish and Wildlife Service's Classification of Wetlands and Deepwater Habitats of the United States (Cowardin et al. 1979), underwent an intensive review by field botanists across the country. This review was coordinated by national and regional interagency wetland plant list review panels composed of representatives from the U.S. Fish and Wildlife Service, U.S. Army Corps of Engineers, Soil Conservation Service, and the Environmental Protection Agency. Initial and updated versions of the Data Base List of Plant Species that Occur in Wetlands are available in hardcopy (Reed 1986, 1988). Regional lists are available as U.S. Fish and Wildlife Service Biological Report Series 88(26.1-26.13). State lists are available as National Ecology Research Center Report Series 88(18.01-18.50). The computerized data base tracks and documents indicator assignments made by regional interagency review panels and facilitates generation of reports. This user's guide describes the format and contents of the LIST Data Base. The Data Base is available on 5-1/4" floppy disks in ASCII format for use with a data base management system on an IBM PC/XT/AT-compatible computer. The LIST Data Base was developed using the QUICKTEXT Data Base Management System (Osborn and Strong 1984). Use of QUICKTEXT with the LIST Data Base is strongly recommended. Instructions for loading LIST into QUICKTEXT are included in this user's guide. Other data base management systems capable of handling variable-length fields can be used by individuals familiar with these software packages. LIST distribution disks are available for 13 regions (Table 1). QUICKTEXT (course QT100--Data Base Management Techniques) and regional subsets of the LIST Data Base (distributed as self-tutorial courses, Table 1) are available through the Office of Conference Services, Colorado State University.

  1. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. Electronic notebooks, on the other hand, were developed to record and manage scientific data and to facilitate data sharing. Software that enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system which has the potential to close the gap between electronic notebooks and LIMS, and can therefore be of great value for a broad scientific community. PMID:19941647

  2. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to make efficient use of the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution, and sharing was designed and developed. The major features of the system include a map-based navigation and search interface, full-resolution imagery displayed as an overlay on the map, and the exclusive use of Open Source Software (OSS) throughout the platform. The functions of the platform include browsing imagery in the map-based interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.

  3. Software Framework for Peer Data-Management Services

    NASA Technical Reports Server (NTRS)

    Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  4. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  5. Shuttle Program Information Management System (SPIMS) data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Shuttle Program Information Management System (SPIMS) is a computerized data base operations system. The central computer is the CDC 170-730 located at Johnson Space Center (JSC), Houston, Texas. There are several applications which have been developed and supported by SPIMS. A brief description is given.

  6. Langley experience with ADABAS/NATURAL

    NASA Technical Reports Server (NTRS)

    Swanson, A.

    1984-01-01

    The use of the data base management system ADABAS and the companion software NATURAL and COM-PLETE at the Langley Research Center is evaluated. A brief overview of data base management system technology is provided as well as system upgrading, user requirements, and use of the system for administrative support.

  7. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    NASA Astrophysics Data System (ADS)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.

  8. Temporal and Location Based RFID Event Data Management and Processing

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand, and manage the world. RFID provides fast data collection with precise identification of objects by unique IDs without line of sight, so it can be used for identifying, locating, tracking, and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management: RFID data are temporal and history-oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location-oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted to different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
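    As a toy illustration of such a temporal, location-oriented model, raw readings can be coalesced into stay intervals and an object's history queried by time. The record layout and names below are invented for illustration, not the authors' schema.

```python
from dataclasses import dataclass

@dataclass
class Stay:
    """Temporal-location fact: object was at location during [start, end)."""
    obj: str
    location: str
    start: float
    end: float

def location_at(stays, obj, t):
    """Query the history: where was `obj` at time t?"""
    for s in stays:
        if s.obj == obj and s.start <= t < s.end:
            return s.location
    return None   # no reading covers that instant

# a tagged item moves from the dock to a shelf
history = [Stay("tag42", "dock", 0.0, 5.0),
           Stay("tag42", "shelf_A", 5.0, 20.0)]
```

    An event-rule layer would then fire on transitions between stays (e.g. "item left the dock"), which is where the declarative framework described above comes in.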

  9. Technology-based management of environmental organizations using an Environmental Management Information System (EMIS): Design and development

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-01-01

    The adoption of Information and Communication Technologies (ICT) in environmental management has become a pressing need with the rapid growth of environmental information. This paper presents a prototype Environmental Management Information System (EMIS) that was developed to provide a systematic way of managing the environmental data and human resources of an environmental organization. The system was designed using programming languages, a Database Management System (DBMS), and other programming tools, and combines information from the relational database in order to achieve the principal goals of the environmental organization. The developed application can be used to store and elaborate information regarding human resources data, environmental projects, observations, reports, data about protected species, environmental measurements of pollutant factors or other kinds of analytical measurements, and the financial data of the organization. Furthermore, the system supports the visualization of spatial data structures by using geographic information system (GIS) and web mapping technologies. This paper describes this prototype software application, its structure, its functions, and how it can be utilized to facilitate technology-based environmental management and decision making.

  10. Antarctic Data Management as Part of the IPY Legacy

    NASA Astrophysics Data System (ADS)

    de Bruin, T.

    2006-12-01

    The Antarctic Treaty states that "scientific observations and results from Antarctica shall be exchanged and made freely available"; Antarctica here includes the Southern Ocean. In support of this, National Antarctic Data Centres (NADC) are being established to catalogue data sets and to provide information on them to scientists and others with an interest in Antarctic science. The Joint Committee on Antarctic Data Management (JCADM) was established by the Scientific Committee on Antarctic Research (SCAR) and the Council of Managers of National Antarctic Programs (COMNAP). JCADM comprises representatives of the National Antarctic Data Centres; currently 30 nations around the world are represented. JCADM is responsible for the Antarctic Master Directory (AMD), the internationally accessible, web-based, searchable record of Antarctic and Southern Ocean data set descriptions. The AMD is directly integrated into the international Global Change Master Directory (GCMD) to help further merge Antarctic science into global science. The AMD is a resource for scientists to advertise the data they have collected and to search for data they may need. JCADM is the Antarctic component of the IPY Data Infrastructure, which is presently being developed. This presentation will give an overview of the organization of Antarctic and Southern Ocean data management, with sections on the organizational structure of JCADM, the contents of the Antarctic Master Directory, relationships to the SCAR Scientific Research Programmes (SRP) and IPY, international embedding, and connections with discipline-based peer organizations such as the International Oceanographic Data and Information Exchange Committee (IODE). It will focus primarily on the role that an existing infrastructure such as JCADM may play in the development of the IPY Data Infrastructure, and will provide considerations for IPY data management based on experience in Antarctic and oceanographic data management.

  11. Badger Army Ammunition Plant groundwater data management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, J.P.

    1994-12-31

    At the Badger Army Ammunition Plant (Badger), there are currently over 200 wells that are monitored on a quarterly basis. Badger has had three active production periods since its construction in 1942. During these periods, various nitrocellulose-based propellants were produced, including single-base artillery propellant, double-base rocket propellant, and BALL POWDER® propellant. Intermediate materials used in the manufacture of these propellants were also produced, including nitroglycerine and sulfuric and nitric acids. To meet the challenge of managing the data in-house, a groundwater data management system (GDMS) was developed. Although such systems are commercially available, they were not able to provide the specific capabilities necessary for data management and reporting at Badger. The GDMS not only provides the routine database capabilities of data sorts and queries, but has provided an automated data reporting system as well. The reporting function alone has significantly reduced the time and effort that would normally be associated with this task. Since the GDMS was developed at Badger, the program can be continually adapted to site-specific needs. Future planned modifications include automated reconciliation, improved transfer of data to graphics software, and statistical analysis and interpretation of the data.

  12. Evidence-based human resource management: a study of nurse leaders' resource allocation.

    PubMed

    Fagerström, Lisbeth

    2009-05-01

    The aim was to illustrate how the RAFAELA system can be used to facilitate evidence-based human resource management. The theoretical framework of the RAFAELA system is based on a holistic view of humankind and a view of leadership founded on human resource management. Nine wards from three central hospitals in Finland participated in the study. The data, stemming from 2006-2007, were taken from the critical indicators (ward-related and nursing intensity information) for national benchmarking used in the RAFAELA system, and were analysed descriptively. The daily nursing resources per classified patient ratio is a more specific method of measurement than the nurse-to-patient ratio. For four wards, the nursing intensity per nurse surpassed the optimal level on 34% to 62.2% of days. Resource allocation was clearly improved in that a better balance between patients' care needs and available nursing resources was maintained. The RAFAELA system provides a rational, systematic, and objective foundation for evidence-based human resource management. Data from systematic use of the RAFAELA system offer objective facts and motives for evidence-based decision making in human resource management, and will therefore enhance nurse leaders' evidence- and science-based way of working.
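    The ratio the study reports on can be illustrated with a one-line calculation. Only the form of the ratio follows the abstract; the point values and the optimal level below are invented for illustration.

```python
def intensity_per_nurse(total_points, nurses_on_duty):
    """Daily nursing intensity per nurse: total classified nursing
    intensity points on the ward divided by nurses actually on duty."""
    return total_points / nurses_on_duty

optimal = 22.0   # assumed ward-specific optimal level, points per nurse
today = intensity_per_nurse(total_points=184.0, nurses_on_duty=7)
status = "over" if today > optimal else "within"
print(f"{today:.1f} points/nurse ({status} the optimal level)")
```

    Tracking how many days per year this ratio exceeds the optimal level yields exactly the kind of percentage (34% to 62.2% of days) the study reports for four wards.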

  13. [Infrastructure and contents of clinical data management plan].

    PubMed

    Shen, Tong; Xu, Lie-dong; Fu, Hai-jun; Liu, Yan; He, Jia; Chen, Ping-yan; Song, Yu-fei

    2015-11-01

    Establishment of a quality management system (QMS) plays a critical role in clinical data management (CDM). The objectives of CDM are to ensure the quality and integrity of the trial data, so every stage or element that may affect the quality of a clinical study should be kept under control across the full life cycle of CDM, from data collection and handling to the statistical analysis of trial data. Based on the QMS, this paper provides consensus on how to develop a compliant clinical data management plan (CDMP). According to the essential requirements of CDM, the CDMP should encompass each process of data collection, data capture and cleaning, medical coding, data verification and reconciliation, database monitoring and management, external data transmission and integration, data documentation, data quality assurance, and so on. Creating and following a data management plan at each designed data management step, and dynamically recording the systems used, actions taken, and parties involved, will establish and confirm regulated data management processes, standard operating procedures, and effective quality metrics across all data management activities. The CDMP is one of the most important data management documents and is the solid foundation for clinical data quality.

  14. Managing data from remeasured plots: An evaluation of existing systems

    Treesearch

    John C. Byrne; Michael D. Sweet

    1992-01-01

    Proper management of the valuable data from remeasured (or permanent) forest growth plots with data base management systems (DBMS) can greatly add to their utility. Twelve desired features for such a system (activities that facilitate the storage, accuracy, and use of the data for analysis) are described and used to evaluate the 36 systems found by a survey conducted...

  15. Adventure Program Risk Management Report: 1998 Edition. Narratives and Data from 1991-1997.

    ERIC Educational Resources Information Center

    Leemon, Drew, Ed.; Schimelpfenig, Tod, Ed.; Gray, Sky, Ed.; Tarter, Shana, Ed.; Williamson, Jed, Ed.

    The Wilderness Risk Managers Committee (WRMC), a consortium of outdoor schools and organizations, works toward better understanding and management of risks in the wilderness. Among other activities, the WRMC gathers data on incidents and accidents from member organizations and other wilderness-based programs. This book compiles incident data for…

  16. Data base management system and display software for the National Geophysical Data Center geomagnetic CD-ROM's

    NASA Technical Reports Server (NTRS)

    Papitashvili, N. E.; Papitashvili, V. O.; Allen, J. H.; Morris, L. D.

    1995-01-01

    The National Geophysical Data Center has the largest collection of geomagnetic data from the worldwide network of magnetic observatories. The data base management system and retrieval/display software have been developed for the archived geomagnetic data (annual means, monthly, daily, hourly, and 1-minute values) and placed on the center's CD-ROM's to provide users with 'user-oriented' and 'user-friendly' support. This system is described in this paper with a brief outline of provided options.

  17. A method of distributed avionics data processing based on SVM classifier

    NASA Astrophysics Data System (ADS)

    Guo, Hangyu; Wang, Jinyan; Kang, Minyang; Xu, Guojing

    2018-03-01

    In a system-level combat environment, in order to solve the problem of managing and analyzing the massive heterogeneous data of a multi-platform avionics system, this paper proposes a management solution called the avionics "resource cloud," based on big data technology, and designs an aided-decision classifier based on the SVM algorithm. We designed an experiment with an STK simulation; the results show that this method has high accuracy and broad application prospects.
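    An SVM-based aided-decision classifier of this general shape can be sketched with scikit-learn. The feature vectors, labels, and kernel choice below are invented for illustration; the paper's actual feature set and training data are not given in the abstract.

```python
# Generic sketch: classify avionics data records into decision classes
# with a support vector machine (scikit-learn's SVC).
from sklearn import svm

# hypothetical 2-D feature vectors extracted from heterogeneous
# avionics records, labelled with the decision they should trigger
X_train = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.3]]
y_train = ["alert", "alert", "nominal", "nominal"]

clf = svm.SVC(kernel="rbf")   # RBF kernel is the common default choice
clf.fit(X_train, y_train)

# a new record near the "alert" cluster is classified accordingly
prediction = clf.predict([[0.15, 0.85]])
```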

  18. [Discussion on developing a data management plan and its key factors in clinical study based on electronic data capture system].

    PubMed

    Li, Qing-na; Huang, Xiu-ling; Gao, Rui; Lu, Fang

    2012-08-01

    Data management has significant impact on the quality control of clinical studies. Every clinical study should have a data management plan to provide overall work instructions and ensure that all of these tasks are completed according to the Good Clinical Data Management Practice (GCDMP). Meanwhile, the data management plan (DMP) is an auditable document requested by regulatory inspectors and must be written in a manner that is realistic and of high quality. The significance of DMP, the minimum standards and the best practices provided by GCDMP, the main contents of DMP based on electronic data capture (EDC) and some key factors of DMP influencing the quality of clinical study were elaborated in this paper. Specifically, DMP generally consists of 15 parts, namely, the approval page, the protocol summary, role and training, timelines, database design, creation, maintenance and security, data entry, data validation, quality control and quality assurance, the management of external data, serious adverse event data reconciliation, coding, database lock, data management reports, the communication plan and the abbreviated terms. Among them, the following three parts are regarded as the key factors: designing a standardized database of the clinical study, entering data in time and cleansing data efficiently. In the last part of this article, the authors also analyzed the problems in clinical research of traditional Chinese medicine using the EDC system and put forward some suggestions for improvement.

  19. Research on spatio-temporal database techniques for spatial information service

    NASA Astrophysics Data System (ADS)

    Zhao, Rong; Wang, Liang; Li, Yuxiang; Fan, Rongshuang; Liu, Ping; Li, Qingyuan

    2007-06-01

    Geographic data should be described by spatial, temporal, and attribute components, but spatio-temporal queries are difficult to answer within current GIS. This paper describes research into the development and application of a spatio-temporal data management system based upon the GeoWindows GIS software platform developed by the Chinese Academy of Surveying and Mapping (CASM). Facing the current practical requirements of spatial information applications, and building on the existing GIS platform, we first established a spatio-temporal data model that integrates vector and grid data. Secondly, we solved the key problem of building temporal data topology and developed a suite of spatio-temporal database management software using object-oriented methods. The system provides temporal data collection, storage, management, display, and query functions. Finally, as a case study, we explored the application of the system using administrative region data from multiple historical periods of China as the basic data. With these efforts, the capacity of GIS to manage and manipulate temporal and attribute data has been enhanced, and a technical reference has been provided for the further development of temporal geographic information systems (TGIS).

  20. Cargo Data Management Demonstration System

    DOT National Transportation Integrated Search

    1974-02-01

    Delays in receipt and creation of cargo documents are a problem in international trade. The work described demonstrates some of the advantages and capabilities of a computer-based cargo data management system. A demonstration system for data manageme...

  1. Program Manager: The Journal of the Defense Systems Management College. Volume 15, Number 4, July-August 1986.

    DTIC Science & Technology

    1986-08-01

    Architect, troi systems, CAD/CAM, and common engineering data bases will be the trade...functional analysis, synthesis, and National Bureau of Standards...Recurrent analysis of a management problem combining real data and the evolutionary chain of data processing components of defense support systems...at the Defense Systems Management College, the...first constructed his support simulator by assembling appropriate analysis...Data Storage and Retrieval

  2. Incorporating hydrologic data and ecohydrologic relationships into ecological site descriptions

    Treesearch

    C. Jason Williams; Frederick B. Pierson; Kenneth E. Spaeth; Joel R. Brown; Osama Z. Al-Hamdan; Mark A. Weltz; Mark A. Nearing; Jeffrey E. Herrick; Jan Boll; Pete Robichaud; David C. Goodrich; Phillip Heilman; D. Phillip Guertin; Mariano Hernandez; Haiyan Wei; Stuart P. Hardegree; Eva K. Strand; Jonathan D. Bates; Loretta J. Metz; Mary H. Nichols

    2016-01-01

    The purpose of this paper is to recommend a framework and methodology for incorporating hydrologic data and ecohydrologic relationships in Ecological Site Descriptions (ESDs) and thereby enhance the utility of ESDs for assessing rangelands and guiding resilience-based management strategies. Resilience-based strategies assess and manage ecological state...

  3. Irrigation analysis based on long-term weather data

    USDA-ARS?s Scientific Manuscript database

    Irrigation management is based upon delivery of water to a crop in the correct amount and at the correct time; the crop's water need is determined by calculating evapotranspiration (ET) from weather data. In 1994 an ET network was established in the Texas High Plains to manage irrigation on a regional scale. ...
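    The abstract does not name the ET equation the network uses; as an illustration only, here is a minimal Python sketch of the Hargreaves temperature-based method, one common way to estimate reference ET when only temperature and radiation data are available. The sample input values are invented.

```python
import math

def hargreaves_et0(tmax_c, tmin_c, ra_mj):
    """Reference evapotranspiration (mm/day) via the Hargreaves equation.

    tmax_c, tmin_c -- daily max/min air temperature (deg C)
    ra_mj          -- extraterrestrial radiation (MJ m-2 day-1)
    """
    tmean = (tmax_c + tmin_c) / 2.0
    # 0.408 converts MJ m-2 day-1 of radiation into mm/day of evaporation.
    return 0.0023 * 0.408 * ra_mj * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

# A hot summer day on the Texas High Plains (illustrative numbers).
et0 = hargreaves_et0(tmax_c=34.0, tmin_c=19.0, ra_mj=40.0)
```

    An ET network would compute a value like this daily per station and scale it by a crop coefficient to schedule irrigation.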

  4. Neonatal Information System Using an Interactive Microcomputer Data Base Management Program

    PubMed Central

    Engelke, Stephen C.; Paulette, Ed W.; Kopelman, Arthur E.

    1981-01-01

    A low cost, interactive microcomputer data base management system is presented which is being used in a neonatal follow-up program at the East Carolina University School of Medicine. The features and flexibility of the system could be applied to a variety of medical care settings.

  5. Two Student Self-Management Techniques Applied to Data-Based Program Modification.

    ERIC Educational Resources Information Center

    Wesson, Caren

    Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…

  6. Securely and Flexibly Sharing a Biomedical Data Management System

    PubMed Central

    Wang, Fusheng; Hussels, Phillip; Liu, Peiya

    2011-01-01

    Biomedical database systems must not only manage complex data but also provide data security and access control. This includes system-level security as well as instance-level access control, such as access to documents, schemas, or aggregations of information. The latter is becoming more important as multiple users share a single scientific data management system to conduct their research, while data must be protected before they are published or while they are IP-protected. The problem is challenging because users' needs for data security vary dramatically from one application to another: whom to share with, which resources to share, and at what access level. We developed a comprehensive data access framework for the biomedical data management system SciPort. SciPort provides fine-grained, multi-level, space-based access control of resources at both the object level (documents and schemas) and the space level (sets of resources aggregated in a hierarchy). Furthermore, to simplify the management of users and privileges, a customizable role-based user model was developed. Access control is implemented efficiently by integrating access privileges into the backend XML database, so efficient queries are supported. This secure access approach makes it possible for multiple users to share the same biomedical data management system with flexible access management and high data security. PMID:21625285
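    The space-level inheritance described above can be sketched in a few lines: a privilege granted on a space applies to everything nested beneath it. The `Space` class and function names are hypothetical stand-ins, not SciPort's actual API.

```python
class Space:
    """A node in a resource hierarchy (space, document, or schema)."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.readers = set()  # users granted read access at this level

def grant_read(space, user):
    space.readers.add(user)

def can_read(space, user):
    # Walk up the hierarchy: access granted at any ancestor implies access here.
    node = space
    while node is not None:
        if user in node.readers:
            return True
        node = node.parent
    return False

lab = Space("lab")
project = Space("project", parent=lab)
doc = Space("doc-123", parent=project)
grant_read(project, "alice")  # alice can now read everything under "project"
```

    A role-based model would grant privileges to roles rather than users directly, but the inheritance walk stays the same.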

  7. A data management system for engineering and scientific computing

    NASA Technical Reports Server (NTRS)

    Elliot, L.; Kunii, H. S.; Browne, J. C.

    1978-01-01

    Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.

  8. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiersma, R; Grelewicz, Z; Belcher, A

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data are difficult to access, since one must physically hunt down records, (3) there are poor or no means of historical data analysis, and (4) there is no remote, off-site monitoring of machine performance. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/JavaScript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system distributes CPU-intensive tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers so that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information successfully and proactively identified performance degradation in a Linac CBCT scanner. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would otherwise have been missed by a paper or spreadsheet based QA system.
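    The pattern of offloading CPU-intensive QA analyses to workers so the web interface stays responsive can be sketched with Python's standard executor; `analyze_phantom` is a placeholder for the real image-analysis routine, not the system's actual code.

```python
# Minimal sketch: hand phantom analyses to a pool of workers and collect
# results, keeping the request-handling thread free.
from concurrent.futures import ThreadPoolExecutor

def analyze_phantom(scan_id):
    # Placeholder: pretend to measure one QA parameter for one phantom scan.
    return {"scan": scan_id, "uniformity_pct": 100.0 - scan_id * 0.1}

def run_qa_batch(scan_ids, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_phantom, scan_ids))  # order preserved

results = run_qa_batch(range(8))
```

    A production system would use separate processes or servers (as the abstract describes) since image analysis is CPU-bound, but the dispatch pattern is the same.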

  9. Case Studies of Ecological Integrative Information Systems: The Luquillo and Sevilleta Information Management Systems

    NASA Astrophysics Data System (ADS)

    San Gil, Inigo; White, Marshall; Melendez, Eda; Vanderbilt, Kristin

    The thirty-year-old United States Long Term Ecological Research Network has developed extensive metadata to document its scientific data. Standard, interoperable metadata is a core component of the data-driven analytical solutions developed by this research network. Content management systems offer an affordable solution for rapid deployment of metadata-centered information management systems. We developed a customized, integrative metadata management system based on the Drupal content management system. Building on knowledge and experience with the Sevilleta and Luquillo Long Term Ecological Research sites, we successfully deployed the first two medium-scale customized prototypes. In this paper, we describe the vision behind our Drupal-based information management instances and list the features these systems offer. We also outline plans to expand the information services offered through these metadata-centered management systems, and conclude with the growing list of participants deploying similar instances.

  10. Data Base On Cables And Connectors

    NASA Technical Reports Server (NTRS)

    Bowen, Arlen R.; Oliver, John D.

    1995-01-01

    Report describes Connector Adapter Cable Information Data Base (CONNAID) computer program, managing data base containing necessary information concerning electrical connectors, breakout boxes, adapter cables, backshells, and pertinent torque specifications for engineering project.

  11. I-Maculaweb: A Tool to Support Data Reuse in Ophthalmology

    PubMed Central

    Bonetto, Monica; Nicolò, Massimo; Gazzarata, Roberta; Fraccaro, Paolo; Rosa, Raffaella; Musetti, Donatella; Musolino, Maria; Traverso, Carlo E.

    2016-01-01

    This paper presents a Web-based application to collect and manage clinical data and clinical trials together in a single tool. I-maculaweb is a user-friendly Web application designed to manage, share, and analyze clinical data from patients affected by degenerative and vascular diseases of the macula. The unique and innovative scientific and technological element of this project is the integration of individual and population data relevant to degenerative and vascular diseases of the macula. Clinical records can also be extracted for statistical purposes and used in clinical decision support systems. I-maculaweb is based on an existing multilevel, multiscale data management model whose general principles suit several different clinical domains. The database structure has been specifically built to respect laterality, a key aspect in ophthalmology. Users can add and manage patient records, follow-up visits, treatments, diagnoses, and clinical history. Records can be extracted in two different modes: one for the patient's own center, in which personal details are shown, and one for statistical purposes, in which all of a center's anonymized data are visible. The Web platform allows effective management, sharing, and reuse of information within primary care and clinical research. Clear and precise clinical data will improve understanding of the real-life management of degenerative and vascular diseases of the macula and increase the precision of epidemiologic and statistical data. Furthermore, this Web-based application can easily be employed as an electronic clinical research file in clinical studies. PMID:27170913
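    The laterality-aware record structure the paper emphasizes can be sketched as follows; the class and field names are illustrative, not I-maculaweb's actual schema.

```python
# Sketch: each follow-up observation is keyed per eye ("OD" = right,
# "OS" = left), so the two eyes of one patient never get conflated.
from dataclasses import dataclass, field

@dataclass
class FollowUpVisit:
    date: str
    eye: str               # "OD" or "OS"
    visual_acuity: float   # decimal acuity, illustrative units

@dataclass
class Patient:
    patient_id: str
    visits: list = field(default_factory=list)

    def visits_for_eye(self, eye):
        """All follow-up records for one eye, for per-eye trend analysis."""
        return [v for v in self.visits if v.eye == eye]

p = Patient("anon-001")
p.visits.append(FollowUpVisit("2016-01-10", "OD", 0.8))
p.visits.append(FollowUpVisit("2016-01-10", "OS", 0.5))
```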

  12. Set processing in a network environment. [data bases and magnetic disks and tapes

    NASA Technical Reports Server (NTRS)

    Hardgrave, W. T.

    1975-01-01

    A combination of a local network, a mass storage system, and an autonomous set processor serving as a data/storage management machine is described. Its characteristics include: content-accessible data bases usable from all connected devices; efficient storage/access of large data bases; simple and direct programming with data manipulation and storage management handled by the set processor; simple data base design and entry from source representation to set processor representation with no predefinition necessary; capability available for user sort/order specification; significant reduction in tape/disk pack storage and mounts; flexible environment that allows upgrading hardware/software configuration without causing major interruptions in service; minimal traffic on data communications network; and improved central memory usage on large processors.

  13. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  14. SeaSketch: Implementation of a Decision-Support Platform for a Channel Islands National Marine Sanctuary Multi-sector Working Group

    NASA Astrophysics Data System (ADS)

    Goldberg, G.; McClintock, W.

    2016-12-01

    Effective interagency and cross-sector coordination is essential to ecosystem-based management, which depends on processes characterized by collaboration and science-based information. Many technological barriers in the development of science-based management plans are closely tied to process challenges, such as the sharing of data and information or the inclusion of parties with varied levels of technical experience. The Channel Islands National Marine Sanctuary has convened a diverse working group to develop recommendations for the management of marine shipping in and around the Santa Barbara Channel, as well as recommendations regarding research needs and outreach strategies. The working group takes a multi-issue approach with four distinct goals: reducing ship strikes on whales, emissions and air quality, conflicting ocean uses, and navigational safety. Members range from industry representatives and scientists to multiple local and federal government entities. The recommended management plans will be based on the best available science and will build on previous efforts, making this an interesting case study in adaptive management. In addition to support from the Sanctuary and professional facilitators, the group is using a decision-support platform, SeaSketch (safepassage.seasketch.org). SeaSketch is a web-based GIS that supports collaborative, science-based marine spatial planning (MSP). Its features support each step of the MSP process, from data gathering and identification of data needs, to the design of spatial plans and their evaluation with analytics, to map-based forums that facilitate data-driven discussions. Working group members can access these tools to explore management options and collaborate remotely, in addition to using the platform during in-person meetings and webinars. Empowering diverse audiences to engage in the design of science-based plans is of key importance to developing ecosystem-based management plans in which multi-sector participation and inter-agency coordination are critical.

  16. Collaborative data model and data base development for paleoenvironmental and archaeological domain using Semantic MediaWiki

    NASA Astrophysics Data System (ADS)

    Willmes, C.

    2017-12-01

    Within the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that must manage data, information, and knowledge from heterogeneous domains such as archaeology, the cultural sciences, and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open source MediaWiki software, best known as the software behind Wikipedia, which provides a web based collaborative knowledge and information management platform. This software is enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the wiki platform and provides complex query and API interfaces to the structured data stored in the SMW data base. An additional open source tool called mobo improves the data model development process as well as automated data imports, from small spreadsheets to large relational databases. Mobo is a command line tool that helps build and deploy SMW structure in an agile, schema-driven development way, and allows the data model formalizations, expressed in JSON-Schema format, to be managed and collaboratively developed using version control systems such as git. The combination of a well equipped collaborative web platform (MediaWiki), structured storage and querying in this collaborative database (SMW), and automated data import and data model development (mobo) results in a powerful yet flexible system for building and developing a collaborative knowledge base. Furthermore, SMW supports Semantic Web technology: the structured data can be exported to RDF, so a triple store with a SPARQL endpoint can be set up on top of the database. The JSON-Schema based data models can also be enhanced into JSON-LD to profit from the possibilities of Linked Data technology.
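    A minimal sketch of the JSON-Schema-driven modeling described above; the example "site" schema and the toy validator are illustrative (a real pipeline would feed the schema to mobo and use a full JSON-Schema validator), not the CRC 806 data model itself.

```python
# A data model expressed as JSON Schema, of the kind mobo compiles into
# Semantic MediaWiki forms/templates/properties.
site_schema = {
    "$schema": "http://json-schema.org/draft-04/schema#",
    "title": "ArchaeologicalSite",
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "latitude": {"type": "number"},
        "longitude": {"type": "number"},
        "period": {"type": "string"},
    },
    "required": ["name", "latitude", "longitude"],
}

def validate(record, schema):
    """Tiny required-fields/type check, standing in for a real validator."""
    types = {"string": str, "number": (int, float), "object": dict}
    for key in schema["required"]:
        if key not in record:
            return False
    for key, spec in schema["properties"].items():
        if key in record and not isinstance(record[key], types[spec["type"]]):
            return False
    return True

ok = validate({"name": "Site X", "latitude": 41.0, "longitude": 2.1}, site_schema)
```

    Keeping the schema itself in a git repository, as the abstract describes, lets the data model evolve under version control independently of the wiki content.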

  17. Recommendation System Based On Association Rules For Distributed E-Learning Management Systems

    NASA Astrophysics Data System (ADS)

    Mihai, Gabroveanu

    2015-09-01

    Traditional Learning Management Systems are installed on a single server that holds both learning materials and user data. To increase performance, a Learning Management System can be installed on multiple servers, with learning materials and user data distributed across them, yielding a Distributed Learning Management System. This paper proposes a prototype recommendation system based on association rules for a Distributed Learning Management System. Information from the LMS databases is analyzed using distributed data mining algorithms to extract association rules, which are then used as inference rules to provide personalized recommendations. The quality of the recommendations improves because the rules used for inference are more accurate, since they aggregate knowledge from all e-Learning systems in the Distributed Learning Management System.
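    The rule-firing step can be sketched as follows. The hard-coded rules stand in for the output of the distributed mining step, and the course identifiers are invented for illustration.

```python
# Each rule: (antecedent item set, consequent item, confidence), as produced
# by an Apriori-style association-rule miner over LMS access logs.
rules = [
    (frozenset({"intro-java"}), "oop-basics", 0.9),
    (frozenset({"oop-basics", "intro-java"}), "design-patterns", 0.7),
    (frozenset({"linear-algebra"}), "machine-learning", 0.8),
]

def recommend(visited, min_conf=0.6):
    """Fire every rule whose antecedent is covered by the user's history;
    rank candidate items by the best confidence that proposed them."""
    scored = {}
    for antecedent, consequent, conf in rules:
        if antecedent <= visited and consequent not in visited and conf >= min_conf:
            scored[consequent] = max(scored.get(consequent, 0.0), conf)
    return sorted(scored, key=scored.get, reverse=True)

recs = recommend({"intro-java", "oop-basics"})
```

    In the distributed setting, each LMS node would mine rules locally and the rule sets would be merged before this inference step.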

  18. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of earth-observing technology, remote sensing (RS) image data storage, management, and information publication have become a bottleneck for its application and popularization. Two problems stand out in RS image data storage and management systems. First, a single background server can hardly handle the heavy processing load of the great volume of RS data stored at different nodes in a distributed environment, placing a heavy burden on that server. Second, there is no unified, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. To address these two problems, this paper puts forward a framework for parallel, distributed management and storage of RS image data, aiming at an RS data information system based on a parallel background server and a distributed data management system. Toward these goals, the paper studies the following key techniques and draws some instructive conclusions. It proposes a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, multi-sensor RS image data of different resolutions, areas, bands, and periods are rationally organized. For storage, RS data are not divided into binary large objects in a conventional relational database; instead, they are reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. Architecturally, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common Web process and the parallel process.
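    The "Pyramid, Block, Layer, Epoch" solid index can be sketched as a composite key that locates one tile of multi-sensor imagery by resolution level, spatial block, spectral band, and acquisition period. The key layout and query helper are illustrative, not the paper's implementation.

```python
# Composite solid-index key: (pyramid level, block row, block col, layer, epoch).
def solid_key(pyramid_level, block_row, block_col, layer, epoch):
    """Compose a sortable index key for one image tile."""
    return (pyramid_level, block_row, block_col, layer, epoch)

tiles = {}  # key -> tile payload (bytes of one image block)
tiles[solid_key(3, 10, 12, "B4", "2008Q3")] = b"...tile bytes..."
tiles[solid_key(3, 10, 12, "B3", "2008Q3")] = b"...tile bytes..."

def bands_and_epochs(level, row, col):
    """All (layer, epoch) pairs stored for one spatial block at one resolution."""
    return sorted(k[3:] for k in tiles if k[:3] == (level, row, col))
```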

  19. The Evidence-base for Using Ontologies and Semantic Integration Methodologies to Support Integrated Chronic Disease Management in Primary and Ambulatory Care: Realist Review. Contribution of the IMIA Primary Health Care Informatics WG.

    PubMed

    Liyanage, H; Liaw, S-T; Kuziemsky, C; Terry, A L; Jones, S; Soler, J K; de Lusignan, S

    2013-01-01

    Most chronic diseases are managed in primary and ambulatory care. The chronic care model (CCM) suggests that a wide range of community, technological, team, and patient factors contribute to effective chronic disease management. Ontologies can enable formalised linkage of heterogeneous data sources such as those found across the elements of the CCM. Our aim was to describe the evidence base for using ontologies and other semantic integration methods to support chronic disease management. We reviewed the evidence base for the use of ontologies and other semantic integration methods within and across the elements of the CCM, reporting it as a realist review that describes the context in which each mechanism was applied and any outcome measures. Most evidence was descriptive, with an almost complete absence of empirical research and important gaps in the evidence base. We found some use of ontologies and semantic integration methods for community support of the medical home and for care in the community. Ubiquitous information technology (IT) and other IT tools were deployed to support self-management, shared registries, health behavioural models, and knowledge discovery tools to improve delivery system design. Data quality issues restricted the use of clinical data; however, there was increased use of interoperable data and health system integration. Ontologies and semantic integration methods are emergent, with a limited evidence base for their implementation. However, they have the potential to integrate disparate community-wide data sources to provide the information necessary for effective chronic disease management.

  20. Inventory-based landscape-scale simulation of management effectiveness and economic feasibility with BioSum

    Treesearch

    Jeremy S. Fried; Larry D. Potts; Sara M. Loreno; Glenn A. Christensen; R. Jamie Barbour

    2017-01-01

    The Forest Inventory and Analysis (FIA)-based BioSum (Bioregional Inventory Originated Simulation Under Management) is a free policy analysis framework and workflow management software solution. It addresses complex management questions concerning forest health and vulnerability for large, multimillion acre, multiowner landscapes using FIA plot data as the initial...

  1. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than to provide the type of data that management could use. The diversity of the many interfaces, and their relative difficulty, discouraged occasional users from attempting to use them. The MIDSE activity approached this problem by designing and building an interface to one JSC data base: the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (usable with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (a help facility that informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC managers who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, the personnel register, and plan/actual cost.

  2. Alert management for home healthcare based on home automation analysis.

    PubMed

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    The rising cost of healthcare for elderly and disabled people can be contained by using information technology to offer people autonomy at home. In this paper, we present an original, sensorless alert management solution that discriminates among multimedia and home automation services and extracts highly regular home activities to serve as sensors for alert management. Results on simulation data, based on a real context, allow us to evaluate our approach before applying it to real data.

  3. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage, and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System, the Design Data Management System (DDMS), based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) system. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract; notably, this MBD approach does use partially dimensioned drawings as auxiliary information to the model. The design data lifecycle implemented several new release states, used prior to formal release, that allowed models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments, and initial application to the Upper Stage design completion. Some of the high-value examples are reviewed.

  4. Compliance program data management system for The Idaho National Engineering Laboratory/Environmental Protection Agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzler, C.L.; Poloski, J.P.; Bates, R.A.

    1988-01-01

    The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs, and dBase III Plus in an interactive mode, to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and on CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation are performed using SAS, a statistical analysis software program. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste site investigation sampling results, is desired or requested. 3 refs.

  5. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data because of its volume and complexity. To improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework that manages and processes LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in parallel. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
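    The map/reduce pattern the framework applies to tiled point clouds can be sketched in pure Python; the per-tile statistics here stand in for the PCL algorithms run under Hadoop, and the tiny sample tiles are invented.

```python
# MapReduce-style sketch: map computes per-tile partial aggregates,
# reduce merges them into a global result, exactly as a Hadoop job would.
from functools import reduce

tiles = {  # tile id -> list of (x, y, z) points
    "t1": [(0.0, 0.0, 1.0), (1.0, 1.0, 3.0)],
    "t2": [(2.0, 2.0, 5.0)],
}

def map_tile(points):
    """Per-tile partial aggregate: (point count, sum of z, max z)."""
    zs = [p[2] for p in points]
    return (len(zs), sum(zs), max(zs))

def reduce_stats(a, b):
    """Merge two partial aggregates (associative, so order doesn't matter)."""
    return (a[0] + b[0], a[1] + b[1], max(a[2], b[2]))

count, z_sum, z_max = reduce(reduce_stats, map(map_tile, tiles.values()))
mean_z = z_sum / count
```

    Because the merge step is associative, tiles can be processed on any number of nodes and combined in any order, which is what makes the Hadoop decomposition work.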

  6. The NEEDS Data Base Management and Archival Mass Memory System

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.; Bryant, S. B.; Thomas, D. T.; Wagnon, F. W.

    1980-01-01

    A Data Base Management System and an Archival Mass Memory System are being developed that will have a 10^12-bit on-line and a 10^13-bit off-line storage capacity. The integrated system will accept packetized data from the data staging area at 50 Mbps, create a comprehensive directory, provide for file management, record the data, perform error detection and correction, accept user requests, retrieve the requested data files, and provide the data to multiple users at a combined rate of 50 Mbps. Stored and replicated data files will have a bit error rate of less than 10^-9 even after ten years of storage. The integrated system will be demonstrated to prove the technology late in 1981.

  7. Towards a Cloud Based Smart Traffic Management Framework

    NASA Astrophysics Data System (ADS)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing, and analysis of traffic big data, may hinder its efficient and real-time application. These challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining, and analysis. In this paper, we present a novel, distributed, scalable, and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient, real-time storage, management, processing, and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the speed of importing data into this framework exceeds 8000 records per second when the dataset size is near 5 million records, and that data retrieval exceeds 15000 records per second at the same scale. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  8. Merged data models for multi-parameterized querying: Spectral data base meets GIS-based map archive

    NASA Astrophysics Data System (ADS)

    Naß, A.; D'Amore, M.; Helbert, J.

    2017-09-01

    Current and upcoming planetary missions deliver a huge amount of different data (remote sensing data, in-situ data, and derived products). Within this contribution we present how different data (bases) can be managed and merged to enable multi-parameterized querying based on a constant spatial context.

  9. NASIS data base management system - IBM 360/370 OS MVT implementation. 7: Data base administrator user's guide

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Data Base Administrator User's Guide for the NASA Aerospace Safety information system is presented. The subjects discussed are: (1) multi-terminal tasking, (2) data base executive, (3) utilities, (4) maintenance, and (5) update mode functions.

  10. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program: an existing relational database was adapted and extended to handle biological data, and data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was establishing standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
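    The standardization the abstract emphasizes can be sketched with a reference table that constrains the main sample table. This is a hypothetical illustration, not BMAP's actual schema: the table names, columns, and sample values are invented.

    ```python
    import sqlite3

    # Hypothetical sketch of the pattern described above: a reference table of
    # standardized site names constrains the main sample table, so long-term
    # analyses never see ad-hoc spellings. All names here are invented.
    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")
    con.execute("CREATE TABLE site (site_code TEXT PRIMARY KEY, description TEXT)")
    con.execute("""CREATE TABLE sample (
        sample_id INTEGER PRIMARY KEY,
        site_code TEXT NOT NULL REFERENCES site(site_code),
        taxon     TEXT NOT NULL,
        count     INTEGER,
        flag      TEXT)""")
    con.execute("INSERT INTO site VALUES ('EFK24.1', 'stream site at km 24.1')")
    con.execute("INSERT INTO sample VALUES (1, 'EFK24.1', 'Etheostoma rufilineatum', 14, NULL)")
    # An unstandardized site name is rejected by the foreign-key constraint.
    try:
        con.execute("INSERT INTO sample VALUES (2, 'E. Fork 24', 'Cottus carolinae', 3, NULL)")
    except sqlite3.IntegrityError as e:
        print("rejected:", e)
    ```

    The same reference-table pattern extends naturally to taxonomic identifications and data-quality flags, which is why such tables are singled out as key to long-term comparability.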

  11. Short Fiction on Film: A Relational DataBase.

    ERIC Educational Resources Information Center

    May, Charles

    Short Fiction on Film is a database that was created and will run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…

  12. Swarming Reconnaissance Using Unmanned Aerial Vehicles in a Parallel Discrete Event Simulation

    DTIC Science & Technology

    2004-03-01

    Index and excerpt fragments recovered from the report: "4.3.1.4 Data Distribution Management"; "4.3.1.5 Breathing Time Warp Algorithm/Rolling Back"; acronyms BTW (Breathing Time Warp) and DDM (Data Distribution Management). Text excerpt: "...events based on the process algorithm. Data proxies/distribution management is the vital portion of the SPEEDES implementation that allows objects..."

  13. iRODS: A Distributed Data Management Cyberinfrastructure for Observatories

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; Vernon, F.

    2007-12-01

    Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long-periods in time (preservation). The system needs to manage data stored on multiple types of storage systems including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount. 
The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed not only to describe management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or that extracts its metadata automatically and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm, executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As the data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub-collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and its application to long-term and large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
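The event-condition-action pattern described in the abstract can be sketched in a few lines. This is only an illustration of the paradigm: the micro-service names, the rule table, and the context dictionary are invented here; real iRODS rules are written in its own rule language and operate on grid state, not Python dictionaries.

```python
# Minimal sketch of the event-condition-action paradigm described above.
# A rule pairs an event and a condition with a chain of micro-services;
# the engine fires every matching chain when the event occurs.

def replicate(ctx):           # micro-service: record a replica on a second resource
    ctx.setdefault("replicas", []).append("backupResc")
    return ctx

def extract_metadata(ctx):    # micro-service: register metadata in a searchable catalog
    ctx["catalog"] = {"name": ctx["path"], "size": ctx["size"]}
    return ctx

# rule = (event, condition, chain of micro-services)
RULES = [
    ("put", lambda c: c["collection"] == "/obs/critical", [replicate, extract_metadata]),
]

def fire(event, ctx):
    """Rule engine: on an event, run every workflow chain whose condition holds."""
    for ev, cond, chain in RULES:
        if ev == event and cond(ctx):
            for service in chain:
                ctx = service(ctx)
    return ctx

state = fire("put", {"path": "/obs/critical/a.dat", "size": 42,
                     "collection": "/obs/critical"})
print(state["replicas"], state["catalog"])
```

Because policies live in the rule table rather than in the services, a sub-collection can be given different behavior simply by registering different rules, which is the point the abstract makes about per-collection policy virtualization.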

  14. Survival models for harvest management of mourning dove populations

    USGS Publications Warehouse

    Otis, D.L.

    2002-01-01

    Quantitative models of the relationship between annual survival and harvest rate of migratory game-bird populations are essential to science-based harvest management strategies. I used the best available band-recovery and harvest data for mourning doves (Zenaida macroura) to build a set of models based on different assumptions about compensatory harvest mortality. Although these models suffer from a lack of contemporary data, they can be used in the development of an initial set of population models that synthesize existing demographic data on a management-unit scale and serve as a tool for prioritizing population demographic information needs. Credible harvest management plans for mourning dove populations will require a long-term commitment to population monitoring and iterative population analysis.

  15. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing an effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  16. Scientific data bases on a VAX-11/780 running VMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benkovitz, C.M.; Tichler, J.L.

    At Brookhaven National Laboratory, several current projects are developing and applying data management techniques to compile, analyze, and distribute scientific data sets that are the result of various multi-institutional experiments and data gathering projects. This paper will present an overview of a few of these data management projects.

  17. Integrated groundwater data management

    USGS Publications Warehouse

    Fitch, Peter; Brodaric, Boyan; Stenson, Matt; Booth, Nathaniel; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    The goal of a data manager is to ensure that data is safely stored, adequately described, discoverable and easily accessible. However, to keep pace with the evolution of groundwater studies in the last decade, the associated data and data management requirements have changed significantly. In particular, there is a growing recognition that management questions cannot be adequately answered by single discipline studies. This has led to a push towards the paradigm of integrated modeling, where diverse parts of the hydrological cycle and its human connections are included. This chapter describes groundwater data management practices, and reviews the current state of the art with enterprise groundwater database management systems. It also includes discussion on commonly used data management models, detailing typical data management lifecycles. We discuss the growing use of web services and open standards such as GWML and WaterML2.0 to exchange groundwater information and knowledge, and the need for national data networks. We also discuss cross-jurisdictional interoperability issues, based on our experience sharing groundwater data across the US/Canadian border. Lastly, we present some future trends relating to groundwater data management.

  18. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative data bases are identified, and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are those that are especially stringent for either scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  19. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
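    The "flexible query forms" mentioned above can be sketched as user-chosen criteria compiled into a parameterized query. This is not JCMS code: the `colony` table, its columns, and the whitelist of fields are invented for illustration.

    ```python
    import sqlite3

    # Hypothetical sketch of a flexible query form: user-defined criteria are
    # turned into a parameterized WHERE clause. Field names are whitelisted so
    # the generated SQL stays safe; the schema and data are invented.
    ALLOWED = {"strain", "room", "genotype"}

    def build_query(criteria):
        clauses, params = [], []
        for field, value in criteria.items():
            if field not in ALLOWED:
                raise ValueError(f"unknown field: {field}")
            clauses.append(f"{field} = ?")
            params.append(value)
        where = " AND ".join(clauses) or "1=1"
        return f"SELECT mouse_id FROM colony WHERE {where}", params

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE colony (mouse_id TEXT, strain TEXT, room TEXT, genotype TEXT)")
    con.executemany("INSERT INTO colony VALUES (?,?,?,?)",
                    [("m1", "C57BL/6J", "R2", "wt"), ("m2", "C57BL/6J", "R3", "ko")])
    sql, params = build_query({"strain": "C57BL/6J", "room": "R2"})
    print(con.execute(sql, params).fetchall())  # [('m1',)]
    ```

    A form-driven interface of this kind lets technicians combine criteria without writing SQL, which is the usability point the abstract makes for its target audience.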

  20. Modelling strategies to predict the multi-scale effects of rural land management change

    NASA Astrophysics Data System (ADS)

    Bulygina, N.; Ballard, C. E.; Jackson, B. M.; McIntyre, N.; Marshall, M.; Reynolds, B.; Wheater, H. S.

    2011-12-01

    Changes to the rural landscape due to agricultural land management are ubiquitous, yet predicting the multi-scale effects of land management change on hydrological response remains an important scientific challenge. Much empirical research has been of little generic value due to inadequate design and funding of monitoring programmes, while the modelling issues challenge the capability of data-based, conceptual and physics-based modelling approaches. In this paper we report on a major UK research programme, motivated by a national need to quantify effects of agricultural intensification on flood risk. Working with a consortium of farmers in upland Wales, a multi-scale experimental programme (from experimental plots to 2nd order catchments) was developed to address issues of upland agricultural intensification. This provided data support for a multi-scale modelling programme, in which highly detailed physics-based models were conditioned on the experimental data and used to explore effects of potential field-scale interventions. A meta-modelling strategy was developed to represent detailed modelling in a computationally-efficient manner for catchment-scale simulation; this allowed catchment-scale quantification of potential management options. For more general application to data-sparse areas, alternative approaches were needed. Physics-based models were developed for a range of upland management problems, including the restoration of drained peatlands, afforestation, and changing grazing practices. Their performance was explored using literature and surrogate data; although subject to high levels of uncertainty, important insights were obtained, of practical relevance to management decisions. In parallel, regionalised conceptual modelling was used to explore the potential of indices of catchment response, conditioned on readily-available catchment characteristics, to represent ungauged catchments subject to land management change. 
Although based in part on speculative relationships, this approach yielded significant predictive power. Finally, using a formal Bayesian procedure, these different sources of information (small-scale physical properties, regionalised signatures of flow, and available flow measurements) were combined with local flow data in a catchment-scale conceptual model application.

  1. Policy-based Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Moore, R. W.

    2009-12-01

    The analysis and understanding of climate variability and change builds upon access to massive collections of observational and simulation data. The analyses involve distributed computing, both at the storage systems (which support data subsetting) and at compute engines (for assimilation of observational data into simulations). The integrated Rule Oriented Data System (iRODS) organizes the distributed data into collections to facilitate enforcement of management policies, support remote data processing, and enable development of reference collections. Currently at RENCI, the iRODS data grid is being used to manage ortho-photos and lidar data for the State of North Carolina, provide a unifying storage environment for engagement centers across the state, and support distributed access to visualizations of weather data; it is also being explored as a way to manage and disseminate collections of ensembles of meteorological and hydrological model results. In collaboration with the National Climatic Data Center, an iRODS data grid is being established to support data transmission from NCDC to ORNL, and to integrate NCDC archives with ORNL compute services. To manage the massive data transfers, parallel I/O streams are used between High Performance Storage System tape archives and the supercomputers at ORNL. Further, we are exploring the movement and management of large RADAR and in situ datasets to be used for data mining between RENCI and NCDC, and for the distributed creation of decision support and climate analysis tools. The iRODS data grid supports all phases of the scientific data life cycle, from management of data products for a project, to sharing of data between research institutions, to publication of data in a digital library, to preservation of data for use in future research projects. Each phase is characterized by a broader user community, with higher expectations for more detailed descriptions and analysis mechanisms for manipulating the data.
The higher usage requirements are enforced by management policies that define the required metadata, the required data formats, and the required analysis tools. The iRODS policy based data management system automates the creation of the community chosen data products, validates integrity and authenticity assessment criteria, and enforces management policies across all accesses of the system.

  2. Armenian Virtual Observatory: Services and Data Sharing

    NASA Astrophysics Data System (ADS)

    Knyazyan, A. V.; Astsatryan, H. V.; Mickaelian, A. M.

    2016-06-01

    The main aim of this article is to introduce the data management and services of the Armenian Virtual Observatory (ArVO), which consists of user-friendly data management mechanisms, a new and productive cross-correlation service, and a data sharing API based on international standards and protocols.

  3. Content-Based Management of Image Databases in the Internet Age

    ERIC Educational Resources Information Center

    Kleban, James Theodore

    2010-01-01

    The Internet Age has seen the emergence of richly annotated image data collections numbering in the billions of items. This work makes contributions in three primary areas which aid the management of this data: image representation, efficient retrieval, and annotation based on content and metadata. The contributions are as follows. First,…

  4. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  5. Feature Analysis of Generalized Data Base Management Systems.

    ERIC Educational Resources Information Center

    Conference on Data Systems Languages, Monroeville, PA. Systems Committee.

    A more complete definition of the features offered in present day generalized data base management systems is provided by this second technical report of the CODASYL Systems Committee. In a tutorial format, each feature description is followed by either narrative information covering ten systems or by a table for all systems. The ten systems…

  6. NASA Administrative Data Base Management Systems, 1984

    NASA Technical Reports Server (NTRS)

    Radosevich, J. D. (Editor)

    1984-01-01

    Strategies for converting to a data base management system (DBMS) and the implementation of the necessary software packages are discussed. Experiences with DBMS at various NASA centers are related, including Langley's ADABAS/NATURAL and the NEMS subsystem of the NASA metrology information system. The value of the integrated workstation with a personal computer is explored.

  7. Explaining sex differences in managerial career satisfier preferences: the role of gender self-schema.

    PubMed

    Eddleston, Kimberly A; Veiga, John F; Powell, Gary N

    2006-03-01

    Using survey data from 400 managers, the authors examined whether gender self-schema would explain sex differences in preferences for status-based and socioemotional career satisfiers. Female gender self-schema, represented by femininity and family role salience, completely mediated the relationship between managers' sex and preferences for socioemotional career satisfiers. However, male gender self-schema, represented by masculinity and career role salience, did not mediate the relationship between managers' sex and preferences for status-based career satisfiers. As expected, male managers regarded status-based career satisfiers as more important and socioemotional career satisfiers as less important than female managers did. The proposed conceptualization of male and female gender self-schemas, which was supported by the data, enhances understanding of adult self-schema and work-related attitudes and behavior.

  8. An Ecosystem Perspective On Asset Management Information

    NASA Astrophysics Data System (ADS)

    Metso, Lasse; Kans, Mirka

    2017-09-01

    Big Data and the Internet of Things will greatly increase the amount of data involved in asset management. Sharing data with a growing number of partners in the area of asset management is important when developing business opportunities and new ecosystems. An asset management ecosystem is a complex set of relationships between the parties taking part in asset management actions. In this paper, the current barriers to and benefits of data sharing are identified based on the results of an interview study. The main benefits are transparency, access to data, and reuse of data; new services can be created by taking advantage of data sharing. The main barriers are an unclear view of the data sharing process and difficulty in recognizing the benefits of sharing. To overcome these barriers, this paper applies the ecosystem perspective to asset management information. The approach is explained using the Swedish railway industry as an example.

  9. Developing a Web-Based Nursing Practice and Research Information Management System: A Pilot Study.

    PubMed

    Choi, Jeeyae; Lapp, Cathi; Hagle, Mary E

    2015-09-01

    Many hospital information systems have been developed and implemented to collect clinical data from the bedside and have used the information to improve patient care. Because of a growing awareness that the use of clinical information improves quality of care and patient outcomes, measuring tools (electronic and paper based) have been developed, but most of them require multiple steps of data collection and analysis. This necessitated the development of a Web-based Nursing Practice and Research Information Management System that processes clinical nursing data to measure nurses' delivery of care and its impact on patient outcomes, and that provides useful information to clinicians, administrators, researchers, and policy makers at the point of care. This pilot study developed a computer algorithm based on a falls-prevention protocol and programmed the prototype Web-based Nursing Practice and Research Information Management System. The system successfully measured the performance of nursing care delivered and its impact on patient outcomes using clinical nursing data from the study site. Although the Nursing Practice and Research Information Management System was tested with small data sets, the results of the study revealed that it has the potential to measure nurses' delivery of care and its impact on patient outcomes, while pinpointing components of the nursing process in need of improvement.

  10. Tools for beach health data management, data processing, and predictive model implementation

    USGS Publications Warehouse

    ,

    2013-01-01

    This fact sheet describes utilities created for management of recreational waters to provide efficient data management, data aggregation, and predictive modeling, as well as a prototype geographic information system (GIS)-based tool for data visualization and summary. All of these utilities were developed to assist beach managers in making decisions to protect public health. The Environmental Data Discovery and Transformation (EnDDaT) Web service identifies, compiles, and sorts environmental data from a variety of sources, including multiple data sources within the U.S. Geological Survey and the National Oceanic and Atmospheric Administration, that help to define climatic, hydrologic, and hydrodynamic characteristics. The Great Lakes Beach Health Database (GLBH-DB) and Web application was designed to provide a flexible input, export, and storage platform for beach water quality and sanitary survey monitoring data to complement beach monitoring programs within the Great Lakes. A real-time predictive modeling strategy was implemented by combining the capabilities of EnDDaT and the GLBH-DB for timely, automated prediction of beach water quality. The GIS-based tool, developed to map beaches based on their physical and biological characteristics, was shared with multiple partners to provide concepts and information for future Web-accessible beach data outlets.

  11. NASIS data base management system: IBM 360 TSS implementation. Volume 8: Data base administrator user's guide

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Data Base Administrator User's Guide for the NASA Aerospace Safety Information System is presented. The subjects discussed are: (1) multi-terminal tasking, (2) data base executive, (3) utilities, (4) maintenance, (5) terminal support, and (6) retrieval subsystem.

  12. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific data bases are usually developed, managed, and populated in a tedious, error-prone, and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observation System (EOS), will be capable of generating data at a rate of over 300 megabits per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog, and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that uses an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented data base where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  13. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific data bases are usually developed, managed, and populated in a tedious, error-prone, and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation of Earth remote sensing platforms (i.e., the Earth Observation System (EOS)) will be capable of generating data at a rate of over 300 megabits per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog, and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that uses an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented data base where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  14. Data management plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing the core information necessary to support the Project Hanford Management Contract (PHMC). It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, comprising PassPort (PP) and PeopleSoft (PS) software, supports finance, supply, and chemical management/Material Safety Data Sheet functions.

  15. Managing to Payroll: An Evaluation of Local Activity Data Management

    DTIC Science & Technology

    1989-06-01

    of the long, complex formulation process from line manager input to receipt of payroll authority - serves only as a starting...information from T/A and labor cards may be input into a locally managed data base before these cards are returned to the FIPC at the end of a pay period...support future labor mix and utilization decisions. Data from the detailed reports is manually transferred to the fourth PC. Another operator using

  16. Intelligent data management

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1985-01-01

    Intelligent data management is the concept of interfacing a user to a database management system through a value-added service that allows a full range of data management operations at a high level of abstraction using written human language. The development of such a system will be based on expert systems and related artificial intelligence technologies, and will allow the capture of procedural and relational knowledge about data management operations and the support of a user with such knowledge in an on-line, interactive manner. Such a system will have the following capabilities: (1) the ability to construct a model of the user's view of the database, based on query syntax; (2) the ability to transform English queries and commands into database instructions and processes; (3) the ability to use heuristic knowledge to rapidly prune the data space in search processes; and (4) the ability to use an on-line explanation system to allow the user to understand what the system is doing and why. Additional information is given in outline form.
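
Capability (2) above, turning restricted English into database instructions, can be illustrated with a toy translator. The grammar, table name, and field names below are invented for illustration; a production system would use a full natural-language parser rather than a single pattern.

```python
# Toy sketch of an English-to-SQL translation step. Only one sentence shape
# is understood; everything else is rejected. Table and field names are
# hypothetical.

import re

FIELDS = {"name", "launch_date", "orbit"}

def english_to_sql(question):
    """Translate 'show <field> of satellites where <field> is <value>'."""
    m = re.match(r"show (\w+) of satellites where (\w+) is (\w+)",
                 question.lower())
    if not m:
        raise ValueError("query not understood")
    sel, col, val = m.groups()
    if sel not in FIELDS or col not in FIELDS:
        raise ValueError("unknown field")
    return f"SELECT {sel} FROM satellites WHERE {col} = '{val}'"
```

An explanation subsystem (capability 4) could then report which pattern matched and which fields were bound, so the user can see why a given instruction was produced.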

  17. The Cadmio XML healthcare record.

    PubMed

    Barbera, Francesco; Ferri, Fernando; Ricci, Fabrizio L; Sottile, Pier Angelo

    2002-01-01

    The management of clinical data is a complex task. Patient-related information reported in patient folders is a set of heterogeneous and structured data accessed by different users having different goals (in local or geographical networks). The XML language provides a mechanism for describing, manipulating, and visualising structured data in web-based applications. XML ensures that structured data are managed in a uniform and transparent manner, independently of the applications and their providers, thereby guaranteeing a degree of interoperability. Extracting data from the healthcare record and structuring them according to XML makes the data available through browsers. This paper describes the MIC/MIE model (Medical Information Category/Medical Information Elements), which allows the definition and management of healthcare records and is used in CADMIO, a HISA-based project, using XML to allow the data to be visualised through web browsers.
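
The structuring step described above (extracting record data and expressing it as XML for browser access) might look like the following sketch. The element names loosely mimic the MIC/MIE category/element split but are hypothetical, not the actual CADMIO schema.

```python
# Sketch: serialize heterogeneous patient data as XML so browser-side tools
# can consume it. "category" plays the MIC (grouping) role and "element" the
# MIE (item) role; both names are invented for illustration.

import xml.etree.ElementTree as ET

def build_record(patient_id, categories):
    rec = ET.Element("healthcareRecord", {"patient": patient_id})
    for cat, elems in categories.items():
        c = ET.SubElement(rec, "category", {"name": cat})
        for name, value in elems.items():
            e = ET.SubElement(c, "element", {"name": name})
            e.text = str(value)
    return rec

rec = build_record("p001", {"vitals": {"heartRate": 72},
                            "labs": {"glucose": 5.4}})
xml_text = ET.tostring(rec, encoding="unicode")
```

Because the result is plain XML, any consumer that understands the schema can render or query it without knowing which application produced it, which is the interoperability argument made in the abstract.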

  18. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    NASA Astrophysics Data System (ADS)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally, the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial data bases, based on in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors are entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In this way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  19. A Study and Model of Operating Level Financial Management Philosophy Under RMS.

    DTIC Science & Technology

    The lack of financial management education has prevented base-level managers from using PRIME data as intended. This study examines the Air Force...operating level financial management philosophy before and after PRIME and the environment of PRIME adoption. A model in the form of two case problems...with solutions is created to portray the financial management concepts under PRIME to help educate base-level Air Force logistics managers. The model

  20. Evolution of Information Management at the GSFC Earth Sciences (GES) Data and Information Services Center (DISC): 2006-2007

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Lynnes, Christopher; Vollmer, Bruce; Alcott, Gary; Berrick, Stephen

    2009-01-01

    Increasingly sophisticated National Aeronautics and Space Administration (NASA) Earth science missions have driven their associated data and data management systems from providing simple point-to-point archiving and retrieval to performing user-responsive distributed multisensor information extraction. To fully maximize the use of remote-sensor-generated Earth science data, NASA recognized the need for data systems that provide data access and manipulation capabilities responsive to research brought forth by advancing scientific analysis and the need to maximize the use and usability of the data. The decision by NASA to purposely evolve the Earth Observing System Data and Information System (EOSDIS) at the Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC) and other information management facilities was timely and appropriate. The GES DISC evolution focused on replacing the EOSDIS Core System (ECS) by reusing the in-house-developed, disk-based Simple, Scalable, Script-based Science Product Archive (S4PA) data management system and migrating data to the disk archives. The transition was completed in December 2007.

  1. A data management life-cycle

    USGS Publications Warehouse

    Ferderer, David A.

    2001-01-01

    Documented, reliable, and accessible data and information are essential building blocks supporting scientific research and applications that enhance society's knowledge base (fig. 1). The U.S. Geological Survey (USGS), a leading provider of science data, information, and knowledge, is uniquely positioned to integrate science and natural resource information to address societal needs. The USGS Central Energy Resources Team (USGS-CERT) provides critical information and knowledge on the quantity, quality, and distribution of the Nation's and the world's oil, gas, and coal resources. By using a life-cycle model, the USGS-CERT Data Management Project is developing an integrated data management system to (1) promote access to energy data and information, (2) increase data documentation, and (3) streamline product delivery to the public, scientists, and decision makers. The project incorporates web-based technology, data cataloging systems, data processing routines, and metadata documentation tools to improve data access, enhance data consistency, and increase office efficiency.

  2. Managing Engineering Design Information

    DTIC Science & Technology

    1989-10-01

    aerospace industry, and design operations cannot be delayed until a prior task is completed [Ref. 9]. ...Figure 4. Translator Interface Between Application Tools... 2. Directory Data Base Approach. The directory approach uses a data base with the traditional...Technologies, 1985, pp. 313-320. 17. Bray, O.H., "Computer-Integrated Manufacturing: The Data Management Strategy," Digital Press, Bedford, MA, 1988. 18. Atre

  3. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    PubMed

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it is already adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits efficiency in managing huge amounts of patients' clinical data, especially in emerging cloud-based systems, which are distributed. In this paper, we present an OAIS healthcare architecture for managing a huge number of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Data Base Management System deployed in the cloud, so as to benefit from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
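
The wide-row, column-oriented layout the architecture relies on can be sketched with an in-memory stand-in: one row per patient, one column per HL7 document. The row keys and column qualifiers below are invented; a real deployment would use a column store such as HBase or Cassandra.

```python
# In-memory stand-in for a column-family store: each row key maps to a wide
# row of {column qualifier: value}. Key and qualifier formats are invented
# for illustration.

class ColumnFamilyStore:
    def __init__(self):
        self._rows = {}  # row key -> {column qualifier: value}

    def put(self, row_key, column, value):
        self._rows.setdefault(row_key, {})[column] = value

    def get_row(self, row_key):
        return dict(self._rows.get(row_key, {}))

    def scan_columns(self, row_key, prefix):
        # Wide-row access pattern: fetch all documents of one message type.
        return {c: v for c, v in self._rows.get(row_key, {}).items()
                if c.startswith(prefix)}

store = ColumnFamilyStore()
store.put("patient:42", "hl7:adt:2018-01-05", "<ADT_A01>...</ADT_A01>")
store.put("patient:42", "hl7:oru:2018-01-06", "<ORU_R01>...</ORU_R01>")
store.put("patient:42", "hl7:oru:2018-02-11", "<ORU_R01>...</ORU_R01>")
```

The design choice this models is that a patient's entire document history lives in one row, so retrieving or scanning it touches a single key rather than joining across tables.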

  4. Data-Driven Decision Making in Out-of-School Time Programs. Part 6 in a Series on Implementing Evidence-Based Practices in Out-of-School Time Programs: The Role of Organization-Level Activities. Research-to-Results Brief. Publication #2009-34

    ERIC Educational Resources Information Center

    Bandy, Tawana; Burkhauser, Mary; Metz, Allison J. R.

    2009-01-01

    Although many program managers look to data to inform decision-making and manage their programs, high-quality program data may not always be available. Yet such data are necessary for effective program implementation. The use of high-quality data facilitates program management, reduces reliance on anecdotal information, and ensures that data are…

  5. Integrating Ecosystem-Based Management Principles of Adaptive Management and Stakeholder Engagement in California Fisheries

    NASA Astrophysics Data System (ADS)

    Erickson, A.; Martone, R. G.; Hazen, L.; Mease, L.; Gourlie, D.; Le Cornu, E.; Ourens, R.; Micheli, F.

    2016-12-01

    California's fisheries management law, the Marine Life Management Act (MLMA) of 1998, signaled a transformative shift from traditional single-species management to an ecosystem-based approach. In response, the fisheries management community in California is striving to integrate new science and management innovations while maximizing its limited capacity. However, data gaps, high compliance costs, capacity constraints, and limited access to the best available data and technologies persist. Here we present two decision support tools being developed to aid California fisheries managers as they continue to implement ecosystem-based management (EBM). First, to practice adaptive management, a key principle of EBM, managers must know whether and how their decisions are meeting their management objectives over time. Based on a cross-walk of MLMA goals with metrics and indicators from sustainable fishery certification programs, we present a flexible and practical tool for tracking fishery management performance in California. We showcase a draft series of decision trees and questionnaires managers can use to quantitatively or qualitatively measure both ecological and social outcomes, helping them to prioritize management options and limited resources. Second, state fisheries managers acknowledge the need for more effective stakeholder engagement to facilitate and inform decision-making and long-term outcomes, another key principle of EBM. Here, we present a pilot version of a decision-support tool to aid managers in choosing the most appropriate stakeholder engagement strategies in various types of decision contexts. This online tool will help staff identify their engagement goals, when they can strategically engage stakeholders based on their needs, and the fishery characteristics that will inform how engagement strategies are tailored to specific contexts. We also share opportunities to expand these EBM tools to other resource management contexts and scales.

  6. Integrating Ecosystem-Based Management Principles of Adaptive Management and Stakeholder Engagement in California Fisheries

    NASA Astrophysics Data System (ADS)

    Erickson, A.; Martone, R. G.; Hazen, L.; Mease, L.; Gourlie, D.; Le Cornu, E.; Ourens, R.; Micheli, F.

    2016-02-01

    California's fisheries management law, the Marine Life Management Act (MLMA) of 1998, signaled a transformative shift from traditional single-species management to an ecosystem-based approach. In response, the fisheries management community in California is striving to integrate new science and management innovations while maximizing its limited capacity. However, data gaps, high compliance costs, capacity constraints, and limited access to the best available data and technologies persist. Here we present two decision support tools being developed to aid California fisheries managers as they continue to implement ecosystem-based management (EBM). First, to practice adaptive management, a key principle of EBM, managers must know whether and how their decisions are meeting their management objectives over time. Based on a cross-walk of MLMA goals with metrics and indicators from sustainable fishery certification programs, we present a flexible and practical tool for tracking fishery management performance in California. We showcase a draft series of decision trees and questionnaires managers can use to quantitatively or qualitatively measure both ecological and social outcomes, helping them to prioritize management options and limited resources. Second, state fisheries managers acknowledge the need for more effective stakeholder engagement to facilitate and inform decision-making and long-term outcomes, another key principle of EBM. Here, we present a pilot version of a decision-support tool to aid managers in choosing the most appropriate stakeholder engagement strategies in various types of decision contexts. This online tool will help staff identify their engagement goals, when they can strategically engage stakeholders based on their needs, and the fishery characteristics that will inform how engagement strategies are tailored to specific contexts. We also share opportunities to expand these EBM tools to other resource management contexts and scales.

  7. Wilderness monitoring and data management

    USGS Publications Warehouse

    Riebau, A. R.

    1994-01-01

    In the last decade, increased public interest in natural areas has resulted in increased monitoring activity by federal wilderness managers to assess the status of wilderness values. Wilderness values are those large-scale entities of wilderness which comprise, in sum, wilderness character. Data collected through wilderness monitoring must support the maintenance of wilderness values. Wilderness monitoring must include the development of clear data management strategies and provisions for hypothesis testing. Unfortunately, some monitoring programs do not support the status assessment of wilderness values. Often wilderness monitoring programs have neglected even the most rudimentary principles of data management. This paper presents a model for wilderness monitoring, guidelines for data management, and an overview of a PC-compatible wilderness monitoring data base, the Monitoring Information Data Analysis System (MIDAS).

  8. The quantification of instream flow rights to water

    USGS Publications Warehouse

    Milhous, Robert T.

    1990-01-01

    Energy development of all types continues to grow in the Rocky Mountain Region of the western United States. Federal resource managers increasingly need to balance energy demands, their effects on the natural and human landscape, and public perceptions of these issues. The Western Energy Citation Clearinghouse (WECC v.1.0), part of a suite of data and information management tools developed and managed by the Wyoming Landscape Conservation Initiative (WLCI), provides resource managers with a searchable online database of citations covering a broad spectrum of energy- and landscape-related topics, such as energy sources, natural and human landscape effects, and new research, methods, and models. Based on the 2011 USGS Open-File Report "Abbreviated bibliography on energy development" (Montag et al., 2011), WECC is an extensive collection of energy-related citations, as well as categorized lists of additional online resources related to oil and gas development, best practices, energy companies, and Federal agencies. WECC incorporates the powerful web services of ScienceBase 2.0, the enterprise data and information platform for USGS scientists and partners, to provide secure, role-based data management features. For example, public/unauthenticated WECC users have full search and read access to the entire energy citation collection, while authenticated WLCI data stewards can manage WECC's citation collection using ScienceBase data management forms.

  9. Data management in pattern recognition and image processing systems

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1976-01-01

    Data management considerations are important to any system which handles large volumes of data or where the manipulation of data is technically sophisticated. A particular problem is the introduction of image-formatted files into the mainstream of data processing applications. This report describes a comprehensive system for the manipulation of image, tabular, and graphical data sets which involves conversions between the various data types. A key characteristic is the use of image processing technology to accomplish data management tasks. Because of this, the term 'image-based information system' has been adopted.

  10. Non-Procedural Languages for Information Resource Management.

    ERIC Educational Resources Information Center

    Bearley, William L.

    The future of information resources management requires new approaches to implementing systems which will include a type of data base management that frees users to solve data processing problems logically by telling the system what they want, together with powerful non-procedural languages that will permit communication in simple, concise…

  11. 75 FR 1547 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ...: Notice of Determination. SUMMARY: Using data from Management Information System annual reports, FRA has... taken from FRA's Management Information System. Based on this data, the Administrator publishes a... effective upon publication. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  12. 75 FR 79308 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... from Management Information System annual reports, FRA has determined that the 2009 rail industry... program data taken from FRA's Management Information System. Based on this data, the Administrator... effective December 20, 2010. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  13. Creation of a Book Order Management System Using a Microcomputer and a DBMS.

    ERIC Educational Resources Information Center

    Neill, Charlotte; And Others

    1985-01-01

    Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…

  14. Managing mapping data using commercial data base management software.

    USGS Publications Warehouse

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  15. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  16. An Analytics-Based Approach to Managing Cognitive Load by Using Log Data of Learning Management Systems and Footprints of Social Media

    ERIC Educational Resources Information Center

    Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru

    2015-01-01

    Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…

  17. Foundations of Constructing a Marketing Data Base; Profitable Applications of the Computer to Marketing Management.

    ERIC Educational Resources Information Center

    Podell, Harold J.

    An introduction into the foundations of constructing a marketing data base is presented for the systems and marketing executives who are familiar with basic computer technology methods. The techniques and concepts presented are now being implemented by major organizations in the development of Management Information Systems (MIS). A marketing data…

  18. A management-oriented framework for selecting metrics used to assess habitat- and path-specific quality in spatially structured populations

    USGS Publications Warehouse

    Nicol, Sam; Wiederholt, Ruscena; Diffendorfer, James E.; Mattsson, Brady; Thogmartin, Wayne E.; Semmens, Darius J.; Laura Lopez-Hoffman,; Norris, Ryan

    2016-01-01

    Mobile species with complex spatial dynamics can be difficult to manage because their population distributions vary across space and time, and because the consequences of managing particular habitats are uncertain when evaluated at the level of the entire population. Metrics to assess the importance of habitats and pathways connecting habitats in a network are necessary to guide a variety of management decisions. Given the many metrics developed for spatially structured models, it can be challenging to select the most appropriate one for a particular decision. To guide the management of spatially structured populations, we define three classes of metrics describing habitat and pathway quality based on their data requirements (graph-based, occupancy-based, and demographic-based metrics) and synopsize the ecological literature relating to these classes. Applying the first steps of a formal decision-making approach (problem framing, objectives, and management actions), we assess the utility of metrics for particular types of management decisions. Our framework can help managers with problem framing, choosing metrics of habitat and pathway quality, and to elucidate the data needs for a particular metric. Our goal is to help managers to narrow the range of suitable metrics for a management project, and aid in decision-making to make the best use of limited resources.
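
The least data-hungry of the three metric classes named above (graph-based) can be illustrated with a minimal example: given only network topology, patch importance can be scored by how many pathways touch each patch. The habitat network below is invented for illustration.

```python
# Graph-based habitat metric sketch: degree centrality over an undirected
# network of habitat patches and migration pathways. Patch names and edges
# are hypothetical.

def degree_centrality(edges):
    """Count the pathways touching each habitat patch."""
    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return degree

# Patches A-D linked by migration pathways.
pathways = [("A", "B"), ("B", "C"), ("B", "D"), ("C", "D")]
deg = degree_centrality(pathways)
ranking = sorted(deg, key=deg.get, reverse=True)  # most-connected first
```

Occupancy- and demographic-based metrics in the framework would replace the simple count with occupancy probabilities or vital rates, at correspondingly higher data cost.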

  19. NASA and the U.S. climate program - A problem in data management

    NASA Technical Reports Server (NTRS)

    Quann, J. J.

    1978-01-01

    NASA's contribution to the total data base for the National Climate Plan will be to produce climate data sets from its experimental space observing systems and to maximize the value of these data for climate analysis and prediction. Validated data sets will be provided to NOAA for inclusion in their overall diagnostic data base. NASA data management for the Climate Plan will involve: (1) cataloging and retrieval of large integrated and distributed data sets upon user demand, and (2) the storage equivalent of 100,000 digital data tapes. It will be the largest, most complex data system ever developed by NASA.

  20. A Hadoop-based Molecular Docking System

    NASA Astrophysics Data System (ADS)

    Dong, Yueli; Guo, Quan; Sun, Bin

    2017-10-01

    Molecular docking routinely faces the challenge of managing datasets of tens of terabytes, making it necessary to improve the efficiency of both storage and docking. We propose a Hadoop-based molecular docking platform for virtual screening; it provides preprocessing of ligand datasets and an analysis function for the docking results. A molecular cloud database that supports mass data management is constructed. Through this platform, docking time is reduced, data storage is efficient, and management of the ligand datasets is convenient.
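
The distributed docking such a platform performs follows the classic map/reduce pattern: score each ligand independently, then keep the best hits. The sketch below imitates that pattern in plain Python with a placeholder scoring function; it is not a real docking engine or a Hadoop job.

```python
# Map/reduce sketch of distributed virtual screening. Ligand records and the
# scoring rule are placeholders; a real mapper would invoke a docking engine
# on each ligand, and Hadoop would shuffle and reduce the scored output.

def map_dock(ligand):
    """Map step: score one ligand against the receptor (placeholder)."""
    name, weight = ligand
    score = round(-0.1 * weight, 2)  # pretend lower is better
    return (name, score)

def reduce_top(scored, k):
    """Reduce step: keep the k best-scoring ligands."""
    return sorted(scored, key=lambda pair: pair[1])[:k]

ligands = [("lig-a", 310.0), ("lig-b", 475.5), ("lig-c", 122.1)]
top_hits = reduce_top([map_dock(lig) for lig in ligands], k=2)
```

Because each map call is independent, the work parallelizes trivially across a cluster, which is where the reported reduction in docking time comes from.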

  1. Allocations for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing the core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, comprising PassPort and PeopleSoft software, supports finance, supply, chemical management/Material Safety Data Sheet, and human resources functions. Allocations at Fluor Daniel Hanford are burdens added to base costs using a predetermined rate.

  2. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.
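
The central-database design described above can be sketched as a minimal relational schema: raw spectra in one table, peptide identifications keyed to them in another. The table and column names below are invented, not the actual ms_lims schema.

```python
# Minimal LIMS-style schema sketch using SQLite: spectra and the peptide
# identifications derived from them, linked by foreign key. Names and sample
# values are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE spectrum (
        id INTEGER PRIMARY KEY,
        filename TEXT NOT NULL
    );
    CREATE TABLE identification (
        id INTEGER PRIMARY KEY,
        spectrum_id INTEGER REFERENCES spectrum(id),
        peptide TEXT NOT NULL,
        score REAL
    );
""")
conn.execute("INSERT INTO spectrum (id, filename) VALUES (1, 'run01.mgf')")
conn.execute("INSERT INTO identification (spectrum_id, peptide, score) "
             "VALUES (1, 'LNNLER', 42.7)")

# Typical LIMS query: which peptides were identified in a given raw file?
rows = conn.execute("""
    SELECT i.peptide, i.score FROM identification i
    JOIN spectrum s ON s.id = i.spectrum_id
    WHERE s.filename = 'run01.mgf'
""").fetchall()
```

Keeping results keyed to their source spectra is what lets high-throughput pipelines automate reprocessing: rerunning identification only touches the second table.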

  3. Making USGS Science Data more Open, Accessible, and Usable: Leveraging ScienceBase for Success

    NASA Astrophysics Data System (ADS)

    Chang, M.; Ignizio, D.; Langseth, M. L.; Norkin, T.

    2016-12-01

    In 2013, the White House released initiatives requiring federally funded research to be made publicly available and machine readable. In response, the U.S. Geological Survey (USGS) has been developing a unified approach to make USGS data available and open. This effort has involved the establishment of internal policies and the release of a Public Access Plan, which outlines a strategy for the USGS to move forward into the modern era in scientific data management. Originally designed as a catalog and collaborative data management platform, ScienceBase (www.sciencebase.gov) is being leveraged to serve as a robust data hosting solution for USGS researchers to make scientific data accessible. With the goal of maintaining persistent access to formal data products and developing a management approach to facilitate stable data citation, the ScienceBase Data Release Team was established to ensure the quality, consistency, and meaningful organization of USGS data through standardized workflows and best practices. These practices include the creation and maintenance of persistent identifiers for data, improving the use of open data formats, establishing permissions for read/write access, validating the quality of standards compliant metadata, verifying that data have been reviewed and approved prior to release, and connecting to external search catalogs such as the USGS Science Data Catalog (data.usgs.gov) and data.gov. The ScienceBase team is actively building features to support this effort by automating steps to streamline the process, building metrics to track site visits and downloads, and connecting published digital resources in line with USGS and Federal policy. By utilizing ScienceBase to achieve stewardship quality and employing a dedicated team to help USGS scientists improve the quality of their data, the USGS is helping to meet today's data quality management challenges and ensure that reliable USGS data are available to and reusable for the public.

  4. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and higher data usability. DAS now automatically calculates molecule number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. standard atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.
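
The molecule number density calculation mentioned above follows directly from the ideal gas law, n = P / (k_B T), evaluated at each sounding level. The sample levels below are standard-atmosphere-like values, not actual radiosonde data.

```python
# Molecule number density from pressure and temperature via the ideal gas
# law. The two sample levels approximate the U.S. Standard Atmosphere; real
# profiles would come from GPS radiosonde soundings.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def number_density(pressure_pa, temperature_k):
    """Molecules per cubic metre from pressure (Pa) and temperature (K)."""
    return pressure_pa / (K_B * temperature_k)

# (altitude km, pressure Pa, temperature K) sample levels
levels = [(0.0, 101325.0, 288.15), (5.0, 54019.0, 255.65)]
profile = [(z, number_density(p, t)) for z, p, t in levels]
```

Profiles like this are what lidar retrievals are normalized against, which is why the system computes them automatically alongside the sonde data.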

  5. Web-Based Course Management and Web Services

    ERIC Educational Resources Information Center

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  6. Applications of Landsat data and the data base approach

    USGS Publications Warehouse

    Lauer, D.T.

    1986-01-01

    A generalized methodology for applying digital Landsat data to resource inventory and assessment tasks is currently being used by several bureaux and agencies within the US Department of the Interior. The methodology includes definition of project objectives and output, identification of source materials, construction of the digital data base, performance of computer-assisted analyses, and generation of output. The USGS, Bureau of Land Management, US Fish and Wildlife Service, Bureau of Indian Affairs, Bureau of Reclamation, and National Park Service have used this generalized methodology to assemble comprehensive digital data bases for resource management. Advanced information processing techniques have been applied to these data bases for making regional environmental surveys on millions of acres of public lands at costs ranging from $0.01 to $0.08 an acre.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockwood, Jr., Neil; McLellan, Jason G; Crossley, Brian

    The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, commonly known as the Joint Stock Assessment Project (JSAP), is a management tool that uses ecosystem principles to manage artificial fish assemblages and native fish in the altered environments of the Columbia River System above Chief Joseph and Grand Coulee Dams (the blocked area). The three-phase approach of this project will enhance the fisheries resources of the blocked area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. Housing the blocked area fisheries information in a central location will allow managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP (NWPPC program measure 10.8B.26) is designed and guided jointly by fisheries managers in the blocked area and the Columbia Basin blocked area management plan (1998). The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of blocked area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the blocked area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. To ensure that any additional information collected throughout the life of this project can be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies among the JSAP fisheries managers. The use of common collection and analytical tools is essential to the process of streamlining joint management decisions. In 1999 and 2000 the project began to address some of the identified data gaps throughout the blocked area with a variety of newly developed sampling projects, as well as continuing data collection on established projects.

  8. Development and Implementation of Team-Based Panel Management Tools: Filling the Gap between Patient and Population Information Systems.

    PubMed

    Watts, Brook; Lawrence, Renée H; Drawz, Paul; Carter, Cameron; Shumaker, Amy Hirsch; Kern, Elizabeth F

    2016-08-01

    Effective team-based models of care, such as the Patient-Centered Medical Home, require electronic tools to support proactive population management strategies that emphasize care coordination and quality improvement. Despite the spread of electronic health records (EHRs) and vendors marketing population health tools, clinical practices still may lack the ability to have: (1) local control over types of data collected/reports generated, (2) timely data (eg, up-to-date data, not several months old), and accordingly (3) the ability to efficiently monitor and improve patient outcomes. This article describes a quality improvement project at the hospital system level to develop and implement a flexible panel management (PM) tool to improve care of subpopulations of patients (eg, panels of patients with diabetes) by clinical teams. An in-depth case analysis approach is used to explore barriers and facilitators in building a PM registry tool for team-based management needs using standard data elements (eg, laboratory values, pharmacy records) found in EHRs. Also described are factors that may contribute to sustainability; to date the tool has been adapted to 6 disease-focused subpopulations encompassing more than 200,000 patients. Two key lessons emerged from this initiative: (1) though challenging, team-based clinical end users and information technology needed to work together consistently to refine the product, and (2) locally developed population management tools can provide efficient data tracking for frontline clinical teams and leadership. The preliminary work identified critical gaps that were successfully addressed by building local PM registry tools from EHR-derived data and offers lessons learned for others engaged in similar work. (Population Health Management 2016;19:232-239).
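The panel-selection idea above can be sketched as a simple query over EHR-derived rows. The field names and the 180-day overdue threshold are hypothetical, not the project's actual registry schema:

```python
from datetime import date

# Hypothetical EHR-derived rows: team assignment, diagnoses, last HbA1c lab date.
patients = [
    {"id": 1, "team": "A", "dx": ["diabetes"], "last_a1c": date(2016, 5, 5)},
    {"id": 2, "team": "A", "dx": ["hypertension"], "last_a1c": None},
    {"id": 3, "team": "A", "dx": ["diabetes"], "last_a1c": date(2015, 2, 1)},
]

def diabetes_panel(rows, team, as_of, overdue_days=180):
    """Return (patient_id, overdue?) pairs for a team's diabetes subpopulation."""
    panel = []
    for p in rows:
        if p["team"] == team and "diabetes" in p["dx"]:
            overdue = p["last_a1c"] is None or (as_of - p["last_a1c"]).days > overdue_days
            panel.append((p["id"], overdue))
    return panel

panel = diabetes_panel(patients, "A", date(2016, 8, 1))
```

The same pattern (filter a panel, flag patients against a timeliness rule) generalizes to other disease-focused subpopulations.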

  9. ClinData Express – A Metadata Driven Clinical Research Data Management System for Secondary Use of Clinical Data

    PubMed Central

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, a standalone software tool for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in m-designer. The web interface for data collection and the specific database for data storage are then generated automatically. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease data collections in Chinese and one form from dbGaP. The flexibility of the system gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress. PMID:23304327
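The metadata-driven generation step can be sketched as deriving a table definition from a metadata dictionary. The form name, fields, and type mapping below are illustrative, not ClinData Express's actual schema:

```python
# Map abstract metadata types to SQL column types (illustrative mapping).
TYPE_MAP = {"string": "VARCHAR(255)", "int": "INTEGER", "date": "DATE"}

def create_table_sql(form_name: str, fields: dict) -> str:
    """Generate a CREATE TABLE statement from a metadata definition."""
    cols = ", ".join(f"{name} {TYPE_MAP[t]}" for name, t in fields.items())
    return f"CREATE TABLE {form_name} ({cols});"

sql = create_table_sql(
    "tumor_registry",
    {"patient_id": "int", "diagnosis": "string", "visit_date": "date"},
)
```

A data-entry form can be generated from the same metadata dictionary, which is what makes the metadata-driven approach attractive: one definition drives both storage and collection.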

  10. Research and design on system of asset management based on RFID

    NASA Astrophysics Data System (ADS)

    Guan, Peng; Du, HuaiChang; Jing, Hua; Zhang, MengYue; Zhang, Meng; Xu, GuiXian

    2011-10-01

    By analyzing the problems in current asset management, this paper proposes applying RFID technology to asset management in order to improve the level of automation and informatization in management. The paper presents a design for equipment identification based on a 433 MHz RFID tag and reader, developed from an in-depth study of RFID tag and card reader circuits, and also illustrates the asset management system as a whole. An RS232-to-Ethernet converter is an innovative means of transferring data to the PC monitoring software, and the asset management system is implemented with Web techniques (PHP and MySQL).
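Reading tag IDs from such a reader typically means parsing a small binary frame. The frame layout below (start byte, 4-byte tag ID, XOR checksum) is hypothetical, for illustration only:

```python
# Hypothetical 433 MHz reader frame (illustrative layout, not the paper's protocol):
# 1 start byte (0xAA) | 4-byte tag ID | 1-byte XOR checksum over the ID bytes.
def parse_frame(frame: bytes):
    """Return the tag ID as a hex string, or None if the frame is invalid."""
    if len(frame) != 6 or frame[0] != 0xAA:
        return None
    tag, checksum = frame[1:5], frame[5]
    xor = 0
    for b in tag:
        xor ^= b
    return tag.hex() if xor == checksum else None

frame = bytes([0xAA, 0x12, 0x34, 0x56, 0x78]) + bytes([0x12 ^ 0x34 ^ 0x56 ^ 0x78])
tag_id = parse_frame(frame)
```

In the described system, the parsed tag ID would then be forwarded over the RS232-to-Ethernet link to the PC monitoring software for lookup against the asset database.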

  11. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
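The advisor's core decision, matching a transfer to a transport service, can be sketched as a simple rule set. The thresholds and recommendation strings below are illustrative, not WINDMA's actual logic:

```python
# Sketch of an advisor rule set: recommend a transport method from the
# transfer size and service availability (rules are illustrative only).
def recommend_transport(size_gb: float, gridftp_available: bool) -> str:
    if size_gb >= 100 and gridftp_available:
        return "GridFTP (parallel streams)"
    if size_gb >= 1:
        return "GlobusOnline managed transfer"
    return "HTTPS download"

choice = recommend_transport(250, gridftp_available=True)
```

A real advisor would fold in discovered network resources and monitored status rather than static thresholds, but the workflow shape is the same: discover, decide, move, monitor.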

  12. 3D-Monitoring Big Geo Data on a seaport infrastructure based on FIWARE

    NASA Astrophysics Data System (ADS)

    Fernández, Pablo; Suárez, José Pablo; Trujillo, Agustín; Domínguez, Conrado; Santana, José Miguel

    2018-04-01

    Many organizations of all kinds are using new technologies to assist the acquisition and analysis of data. Seaports are a good example of this trend. Seaports generate data regarding the management of marine traffic and other elements, as well as environmental conditions given by meteorological sensors and buoys. However, this enormous amount of data, also known as Big Data, is useless without a proper system to organize, analyze and visualize it. SmartPort is an online platform for the visualization and management of seaport data that has been built as a GIS application. This work offers a Rich Internet Application that allows the user to visualize and manage the different sources of information produced in a port environment. The Big Data management is based on the FIWARE platform, as well as "Internet of Things" solutions for the data acquisition. At the same time, the Glob3 Mobile (G3M) framework has been used for the development of map requirements. In this way, SmartPort supports 3D visualization of the port scenery and its data sources.
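Data acquisition on FIWARE revolves around context entities. Below is a sketch of a buoy observation as an NGSI-v2 entity payload, of the kind a platform like SmartPort could POST to an Orion Context Broker; the entity ID and attribute names are assumptions:

```python
import json

# Build an NGSI-v2 context entity for a buoy observation (names illustrative).
def buoy_entity(buoy_id: str, wave_height_m: float, water_temp_c: float) -> dict:
    return {
        "id": f"urn:ngsi-ld:Buoy:{buoy_id}",
        "type": "Buoy",
        "waveHeight": {"value": wave_height_m, "type": "Number"},
        "waterTemperature": {"value": water_temp_c, "type": "Number"},
    }

payload = json.dumps(buoy_entity("B01", 1.8, 19.2))
```

The broker then serves the latest context to visualization clients, which is how the acquisition side feeds the 3D GIS front end.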

  14. Learning Agents for Autonomous Space Asset Management (LAASAM)

    NASA Astrophysics Data System (ADS)

    Scally, L.; Bonato, M.; Crowder, J.

    2011-09-01

    Current and future space systems will continue to grow in complexity and capabilities, creating a formidable challenge to monitor, maintain, and utilize these systems and manage their growing network of space and related ground-based assets. Integrated System Health Management (ISHM), and in particular, Condition-Based System Health Management (CBHM), is the ability to manage and maintain a system using dynamic real-time data to prioritize, optimize, maintain, and allocate resources. CBHM entails the maintenance of systems and equipment based on an assessment of current and projected conditions (situational and health related conditions). A complete, modern CBHM system comprises a number of functional capabilities: sensing and data acquisition; signal processing; conditioning and health assessment; diagnostics and prognostics; and decision reasoning. In addition, an intelligent Human System Interface (HSI) is required to provide the user/analyst with relevant context-sensitive information, the system condition, and its effect on overall situational awareness of space (and related) assets. Colorado Engineering, Inc. (CEI) and Raytheon are investigating and designing an Intelligent Information Agent Architecture that will provide a complete range of CBHM and HSI functionality from data collection through recommendations for specific actions. The research leverages CEI’s expertise with provisioning management network architectures and Raytheon’s extensive experience with learning agents to define a system to autonomously manage a complex network of current and future space-based assets to optimize their utilization.

  15. Results of phase one of land use information Delphi study

    NASA Technical Reports Server (NTRS)

    Paul, C. K.; Landini, A. J.

    1975-01-01

    The Land Use Management Information System (LUMIS) is being developed for the city portion of the Santa Monica mountains. LUMIS incorporates data developed from maps and aerial photos as well as traditional land based data associated with routine city and county record keeping activities and traditional census data. To achieve the merging of natural resource data with governmental data LUMIS is being designed in accordance with restrictions associated with two other land use information systems currently being constructed by Los Angeles city staff. The two city systems are LUPAMS (Land Use Planning and Management System) which is based on data recorded by the County Assessor's office for each individual parcel of land in the city, and Geo-BEDS, a geographically based environmental data system.

  16. The mass remote sensing image data management based on Oracle InterMedia

    NASA Astrophysics Data System (ADS)

    Zhao, Xi'an; Shi, Shaowei

    2013-07-01

    With the development of remote sensing technology, more and more image data are being acquired, and how to apply and manage these massive image data safely and efficiently has become an urgent problem to be solved. According to the methods and characteristics of mass remote sensing image data management and application, this paper puts forward a new method that uses the Oracle Call Interface and Oracle InterMedia to store the image data, and then uses these components to realize the system's function modules. Finally, it successfully uses VC and the Oracle InterMedia component to realize image data storage and management.

  17. Free web-based modelling platform for managed aquifer recharge (MAR) applications

    NASA Astrophysics Data System (ADS)

    Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia

    2017-04-01

    Managed aquifer recharge represents a valuable instrument for sustainable water resources management. The concept implies purposeful infiltration of surface water into underground for later recovery or environmental benefits. Over decades, MAR schemes were successfully installed worldwide for a variety of reasons: to maximize the natural storage capacity of aquifers, physical aquifer management, water quality management, and ecological benefits. The INOWAS-DSS platform provides a collection of free web-based tools for planning, management and optimization of main components of MAR schemes. The tools are grouped into 13 specific applications that cover most relevant challenges encountered at MAR sites, both from quantitative and qualitative perspectives. The applications include among others the optimization of MAR site location, the assessment of saltwater intrusion, the restoration of groundwater levels in overexploited aquifers, the maximization of natural storage capacity of aquifers, the improvement of water quality, the design and operational optimization of MAR schemes, clogging development and risk assessment. The platform contains a collection of about 35 web-based tools of various degrees of complexity, which are either included in application specific workflows or used as standalone modelling instruments. Among them are simple tools derived from data mining and empirical equations, analytical groundwater related equations, as well as complex numerical flow and transport models (MODFLOW, MT3DMS and SEAWAT). Up to now, the simulation core of the INOWAS-DSS, which is based on the finite differences groundwater flow model MODFLOW, is implemented and runs on the web. A scenario analyser helps to easily set up and evaluate new management options as well as future development such as land use and climate change and compare them to previous scenarios. 
Additionally, simple tools such as analytical equations to assess saltwater intrusion are already running online. Besides the simulation tools, a web-based data base is under development where geospatial and time series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various developed tools and applications as well as basic information on MAR and related topics is published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, data base and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is an open-source software accessible via the internet using standard web browsers, it offers new ways for data sharing and collaboration among various partners and decision makers.
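One of the simple analytical tools mentioned, assessment of saltwater intrusion, can be illustrated with the classic Ghyben-Herzberg relation, which estimates the depth of the freshwater/saltwater interface below sea level from the freshwater head:

```python
# Ghyben-Herzberg relation: interface depth z = rho_f / (rho_s - rho_f) * h,
# roughly 40 * h for typical freshwater (1000 kg/m^3) and seawater (1025 kg/m^3).
def ghyben_herzberg_depth(head_m: float, rho_f: float = 1000.0, rho_s: float = 1025.0) -> float:
    """Depth of the freshwater/saltwater interface below sea level, in meters."""
    return rho_f / (rho_s - rho_f) * head_m

z = ghyben_herzberg_depth(1.5)  # 1.5 m of freshwater head -> 60 m interface depth
```

This kind of closed-form estimate is what makes such tools suitable for quick web-based screening before running a full SEAWAT transport model.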

  18. CIMS: The Cartographic Information Management System,

    DTIC Science & Technology

    1981-01-01

    information, composites of overlays to demonstrate the decision-making possibilities and slides of the cadastral sheet. System Use After data base ... create a national soils data base that can be used in managing the soil (Johnson, 1979). Small-scale information systems can be used in planning the ... maps/charts over the base map, etc.). An example of the manual phase to be found in the literature is the Overlay Information System used in Prince

  19. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  20. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  1. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  2. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  3. Tracking-Data-Conversion Tool

    NASA Technical Reports Server (NTRS)

    Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  4. smwrBase—An R package for managing hydrologic data, version 1.1.1

    USGS Publications Warehouse

    Lorenz, David L.

    2015-12-09

    This report describes an R package called smwrBase, which consists of a collection of functions to import, transform, manipulate, and manage hydrologic data within the R statistical environment. Functions in the package allow users to import surface-water and groundwater data from the U.S. Geological Survey’s National Water Information System database and other sources. Additional functions are provided to transform, manipulate, and manage hydrologic data in ways necessary for analyzing the data.
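As a Python sketch of the kind of transformation such a package provides (smwrBase itself is an R package), here is the USGS water-year convention, where the water year runs October 1 through September 30 and is labeled by the calendar year in which it ends:

```python
from datetime import date

def water_year(d: date) -> int:
    """USGS water year: Oct 1 - Sep 30, labeled by the year in which it ends."""
    return d.year + 1 if d.month >= 10 else d.year

wy = water_year(date(2015, 12, 9))  # Oct-Dec dates belong to the next water year
```

Grouping daily discharge records by this key is a routine first step in hydrologic analysis.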

  5. Medical Big Data Warehouse: Architecture and System Design, a Case Study: Improving Healthcare Resources Distribution.

    PubMed

    Sebaa, Abderrazak; Chikh, Fatima; Nouicer, Amina; Tari, AbdelKamel

    2018-02-19

    The huge increase in medical devices and clinical applications, which generate enormous amounts of data, has raised a big issue in managing, processing, and mining this massive amount of data. Indeed, traditional data warehousing frameworks cannot be effective when managing the volume, variety, and velocity of current medical applications. As a result, several data warehouses face many issues over medical data, and many challenges need to be addressed. New solutions have emerged, and Hadoop is one of the best examples; it can be used to process these streams of medical data. However, without an efficient system design and architecture, this performance will not be significant or valuable to medical managers. In this paper, we provide a short review of the literature on research issues of traditional data warehouses and present some important Hadoop-based data warehouses. In addition, a Hadoop-based architecture and a conceptual data model for designing a medical Big Data warehouse are given. In our case study, we provide implementation details of a big data warehouse based on the proposed architecture and data model on the Apache Hadoop platform to ensure an optimal allocation of health resources.

  6. Guidelines for the creation and management of geographic data bases within a GIS environment, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, R.C.; Land, M.L.; McCord, R.A.

    1994-07-01

    A Geographic Information System (GIS) provides the ability to manage and analyze all types of geographic and environmental information. It performs these functions by providing the tools necessary to capture, access, analyze, and display spatially referenced information in graphic and tabular form. Typical data elements that can be visualized in a map might include roads, buildings, topography, streams, waste areas, monitoring wells, groundwater measurements, soil sample results, landcover, and demography. The intent of this document is to provide data management and quality assurance (QA) guidelines that will aid implementors and users of GIS technology and data bases. These guidelines should be useful in all phases of GIS activities, including the following: (1) project planning, (2) data collection and generation, (3) data maintenance and management, (4) QA and standards, (5) project implementation, (6) spatial analysis and data interpretation, (7) data transformation and exchange, and (8) output and reporting. The daily use of desktop GIS technologies within Martin Marietta Energy Systems, Inc. (Energy Systems), is a relatively new phenomenon, but usage is increasing rapidly. Large volumes of GIS-related data are now being collected and analyzed for the U.S. Department of Energy (DOE) Oak Ridge Reservation (ORR) and its facilities. It is very important to establish and follow good data management practices for GIS. In the absence of such practices, data-related problems will overwhelm users for many years. In comparison with traditional data processing and software life-cycle management, there is limited information on GIS QA techniques, data standards and structures, configuration control, and documentation practices. This lack of information partially results from the newness of the technology and the complexity of spatial information and geographic analysis techniques as compared to typical tabular data management.

  7. Land management: data availability and process understanding for global change studies.

    PubMed

    Erb, Karl-Heinz; Luyssaert, Sebastiaan; Meyfroidt, Patrick; Pongratz, Julia; Don, Axel; Kloster, Silvia; Kuemmerle, Tobias; Fetzel, Tamara; Fuchs, Richard; Herold, Martin; Haberl, Helmut; Jones, Chris D; Marín-Spiotta, Erika; McCallum, Ian; Robertson, Eddy; Seufert, Verena; Fritz, Steffen; Valade, Aude; Wiltshire, Andrew; Dolman, Albertus J

    2017-02-01

    In the light of daunting global sustainability challenges such as climate change, biodiversity loss and food security, improving our understanding of the complex dynamics of the Earth system is crucial. However, large knowledge gaps related to the effects of land management persist, in particular those human-induced changes in terrestrial ecosystems that do not result in land-cover conversions. Here, we review the current state of knowledge of ten common land management activities for their biogeochemical and biophysical impacts, the level of process understanding and data availability. Our review shows that ca. one-tenth of the ice-free land surface is under intense human management, half under medium and one-fifth under extensive management. Based on our review, we cluster these ten management activities into three groups: (i) management activities for which data sets are available, and for which a good knowledge base exists (cropland harvest and irrigation); (ii) management activities for which sufficient knowledge on biogeochemical and biophysical effects exists but robust global data sets are lacking (forest harvest, tree species selection, grazing and mowing harvest, N fertilization); and (iii) land management practices with severe data gaps concomitant with an unsatisfactory level of process understanding (crop species selection, artificial wetland drainage, tillage and fire management and crop residue management, an element of crop harvest). Although we identify multiple impediments to progress, we conclude that the current status of process understanding and data availability is sufficient to advance with incorporating management in, for example, Earth system or dynamic vegetation models in order to provide a systematic assessment of their role in the Earth system. This review contributes to a strategic prioritization of research efforts across multiple disciplines, including land system research, ecological research and Earth system modelling. 
© 2016 John Wiley & Sons Ltd.

  8. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  9. Study on Global GIS architecture and its key technologies

    NASA Astrophysics Data System (ADS)

    Cheng, Chengqi; Guan, Li; Lv, Xuefeng

    2009-09-01

    Global GIS (G2IS) is a system that supports huge data processing and direct global manipulation on a global grid based on a spheroid or ellipsoid surface. Based on the global subdivision grid (GSG), a Global GIS architecture is presented in this paper, taking advantage of computer cluster theory, space-time integration technology, and virtual reality technology. The Global GIS system architecture is composed of five layers: the data storage layer, data representation layer, network and cluster layer, data management layer, and data application layer. Within this architecture, a four-level protocol framework and a three-layer data management pattern are designed for the organization, management, and publication of spatial information in Global GIS. Three kinds of core supporting technologies, namely computer cluster theory, space-time integration technology, and virtual reality technology, and their application patterns in Global GIS are introduced in detail. The primary ideas of Global GIS presented in this paper represent an important development tendency for GIS.

  11. Application of Bayesian Classification to Content-Based Data Management

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
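
    A minimal sketch of the kind of per-pixel Bayesian classification described, assuming Gaussian per-band statistics and two classes; the band names, statistics and priors are invented for illustration, not MODIS-derived:

```python
import math

# Hypothetical per-class (mean, std) radiance statistics for two bands.
CLASS_STATS = {
    "clear-ocean": {"band1": (0.05, 0.02), "band2": (0.04, 0.02)},
    "cloud":       {"band1": (0.60, 0.15), "band2": (0.55, 0.15)},
}
PRIORS = {"clear-ocean": 0.7, "cloud": 0.3}

def gauss_log_pdf(x, mean, std):
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def classify_pixel(pixel):
    """Naive Bayes: pick the class maximizing log prior plus the sum of
    per-band log likelihoods (bands treated as independent)."""
    best, best_score = None, float("-inf")
    for cls, stats in CLASS_STATS.items():
        score = math.log(PRIORS[cls])
        score += sum(gauss_log_pdf(pixel[b], *stats[b]) for b in stats)
        if score > best_score:
            best, best_score = cls, score
    return best

print(classify_pixel({"band1": 0.06, "band2": 0.05}))  # -> clear-ocean
print(classify_pixel({"band1": 0.62, "band2": 0.58}))  # -> cloud
```

    Because each pixel is scored independently, a scheme like this is fast enough to drive the content-based subsetting and subscription services the abstract mentions.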

  12. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) uses an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. With this approach, the IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data are the most desirable input for the IMM. Non-attributable inflight data are collected from the Lifetime Surveillance of Astronaut Health study as well as from engineers, flight surgeons, and astronauts themselves. When inflight data are unavailable, cohort studies, other models and Bayesian analyses are used, supplemented on occasion by subject matter expert input. To characterize the quality of evidence for a medical condition, each data source is categorized and assigned a level of evidence from 1 to 5, with 1 being the highest. The collected data reside in and are managed through a relational SQL database with a web-based interface for data entry and review. The database can also interface with outside applications, which extends its capabilities. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented configuration management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for the IMM. The IMM database's structure and architecture have proven to support additional uses, as seen in the analysis of resource utilization across medical conditions. In addition, the database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. This functionality will provide Exploration Medical Capability with the evidence base for its medical condition list.
CONCLUSION: The IMM database, in conjunction with the IMM, is helping the NASA aerospace program improve health care for and reduce risk to the astronaut crew. Both the database and the model will continue to expand to meet customer needs through a multi-disciplinary, evidence-based approach to managing data. Future expansion could serve as a platform for a Space Medicine Wiki of medical conditions.
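
    A minimal sketch of the kind of relational store the abstract describes, using SQLite; the table, columns and rows are illustrative assumptions, not the actual IMM schema:

```python
import sqlite3

# Toy medical-condition table with the 1-5 level-of-evidence scale
# described in the abstract (level 1 = highest quality).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE medical_condition (
    name TEXT PRIMARY KEY,
    evidence_source TEXT,
    level_of_evidence INTEGER CHECK (level_of_evidence BETWEEN 1 AND 5))""")
con.executemany("INSERT INTO medical_condition VALUES (?, ?, ?)", [
    ("space motion sickness", "inflight surveillance", 1),
    ("renal stone",           "cohort study",          2),
    ("dental abscess",        "expert opinion",        5),
])

# A CLiFF-style report: conditions ordered by quality of evidence.
for name, src, lvl in con.execute(
        "SELECT * FROM medical_condition ORDER BY level_of_evidence"):
    print(f"{name:22s} level {lvl} ({src})")
```

    Keeping the evidence level as a constrained column is one way a relational design can make the categorization queryable alongside the condition data itself.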

  13. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data because the task is both data- and computing-intensive. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo Toolbox, a ready-to-use tool for large-image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo Toolbox and MapReduce, remote sensing images can be processed in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
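
    The map/reduce pattern the framework builds on can be sketched in miniature: each "map" task processes one image tile independently and a "reduce" step merges the partial results. In Hadoop the tiles would live in HDFS and the map tasks (e.g. Orfeo Toolbox operators) would run in parallel across the cluster; this toy version runs serially on invented pixel values.

```python
from functools import reduce

def map_tile(tile, threshold=128):
    """Map phase: count above-threshold pixels in one tile."""
    return sum(1 for v in tile if v > threshold)

tiles = [
    [10, 200, 130, 90],
    [255, 255, 0, 40],
    [129, 1, 2, 3],
]

partials = map(map_tile, tiles)               # map phase (parallel in Hadoop)
total = reduce(lambda a, b: a + b, partials)  # reduce phase merges partials
print(total)  # -> 5 pixels above threshold across the whole image
```

    Because each tile is processed without reference to the others, adding cluster nodes scales the map phase almost linearly, which is the property the experiments in the paper measure.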

  14. Data Model Management for Space Information Systems

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, Chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time-consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation-independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example, the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search.
This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool. We will also describe the current effort to provide interoperability with the European Space Agency (ESA)/Planetary Science Archive (PSA) which is critically dependent on a common data model.
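
    The separation of data model from software, and the validation of ingested instances against the model, can be sketched as follows; the class definition and attribute names are invented for illustration, not taken from the PDS data model:

```python
# The "model" lives in data, not code, so it can evolve independently
# of the software that consumes it (the RASIM principle above).
PRODUCT_CLASS = {
    "required": {"identifier": str, "title": str, "instrument": str},
    "optional": {"target": str},
}

def validate(instance, model=PRODUCT_CLASS):
    """Return a list of validation errors for an ingested instance."""
    errors = []
    for attr, typ in model["required"].items():
        if attr not in instance:
            errors.append(f"missing required attribute: {attr}")
        elif not isinstance(instance[attr], typ):
            errors.append(f"wrong type for {attr}")
    known = set(model["required"]) | set(model["optional"])
    errors += [f"unknown attribute: {a}" for a in instance if a not in known]
    return errors

good = {"identifier": "urn:prod:001", "title": "Radio science", "instrument": "RSS"}
print(validate(good))                # -> []
print(validate({"identifier": 42}))  # missing attributes plus a type error
```

    An ontology tool performs the same role with far richer semantics (class hierarchies, relations, cardinalities), but the division of labor is the same: instances are checked against a model that the software never hard-codes.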

  15. Acquisition-Management Program

    NASA Technical Reports Server (NTRS)

    Avery, Don E.; Vann, A. Vernon; Jones, Richard H.; Rew, William E.

    1987-01-01

    The NASA Acquisition Management Subsystem (AMS) is an integrated, NASA-wide standard automated procurement system developed in 1985. It is designed to provide each NASA installation with a procurement data base and on-line terminals for managing, tracking, reporting, and controlling contractual actions and associated procurement data. The subsystem provides control, status, and reporting for various procurement areas. The purpose of standardization is to decrease the costs of procurement and of automatic data processing, increase procurement productivity, furnish accurate on-line management information, and improve customer support. Written in ADABAS NATURAL.

  16. The influence of environmental conditions on safety management in hospitals: a qualitative study.

    PubMed

    Alingh, Carien W; van Wijngaarden, Jeroen D H; Huijsman, Robbert; Paauwe, Jaap

    2018-05-02

    Hospitals are confronted with increasing safety demands from a diverse set of stakeholders, including governmental organisations, professional associations, health insurance companies, patient associations and the media. However, little is known about the effects of these institutional and competitive pressures on hospital safety management. Previous research has shown that organisations generally shape their safety management approach along the lines of control- or commitment-based management. Using a heuristic framework, based on the contextually-based human resource theory, we analysed how environmental pressures affect the safety management approach used by hospitals. A qualitative study was conducted into hospital care in the Netherlands. Five hospitals were selected for participation, based on organisational characteristics as well as variation in their reputation for patient safety. We interviewed hospital managers and staff with a central role in safety management. A total of 43 semi-structured interviews were conducted with 48 respondents. The heuristic framework was used as an initial model for analysing the data, though new codes emerged from the data as well. In order to ensure safe care delivery, institutional and competitive stakeholders often impose detailed safety requirements, strong forces for compliance and growing demands for accountability. As a consequence, hospitals experience a decrease in their room to manoeuvre. Hence, organisations increasingly choose a control-based management approach to make sure that safety demands are met. In contrast, in the case of more abstract safety demands and an organisational culture which favours patient safety, hospitals generally experience more leeway. This often results in a stronger focus on commitment-based management. Institutional and competitive conditions as well as strategic choices that hospitals make have resulted in various combinations of control- and commitment-based safety management.
A balanced approach is required. A strong focus on control-based management generates extrinsic motivation in employees but may, at the same time, undermine or even diminish intrinsic motivation to work on patient safety. Emphasising commitment-based management may, in contrast, strengthen intrinsic motivation but increases the risk of priorities being set elsewhere. Currently, external pressures frequently lead to the adoption of control-based management. A balanced approach requires a shift towards more trust-based safety demands.

  17. Knowledge Based System Applications for Guidance and Control (Application des Systemes a Base de Connaissances au Guidage-Pilotage)

    DTIC Science & Technology

    1991-01-01

    techniques and integration concepts. Recent advances in digital computation techniques including data base management, represent the core enabling...tactical information management and effective pilot interaction are essential. Pilot decision aiding, combat automation, sensor fusion and on-board...tactical battle management concepts offer the opportunity for substantial mission effectiveness improvements. Although real-time tactical military

  18. A Grid Metadata Service for Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Critical challenges for climate modeling researchers are strongly connected with increasingly complex simulation models and the huge quantities of datasets they produce, and future trends in climate modeling will only increase computational and storage requirements. For this reason, the ability to access both computational and data resources transparently for large-scale, complex climate simulations must be considered a key requirement for distributed Earth science and environmental systems. From the data management perspective: (i) the quantity of data will continuously increase; (ii) data will become more and more distributed and widespread; (iii) data sharing/federation will be a key challenge among sites distributed worldwide; and (iv) a large and heterogeneous community of users will want to discover experimental results, search metadata, browse collections of files, compare results and display output. A key element for carrying out data search and discovery, and for managing and accessing huge, distributed amounts of data, is the metadata handling framework. For the management of distributed datasets we propose the GRelC service, a data grid solution focused on metadata management. Unlike classical approaches, the proposed data grid solution addresses scalability, transparency, security, efficiency and interoperability. The GRelC service provides access to metadata stored in different and widespread data sources (relational databases running on top of MySQL, Oracle, DB2, etc., queried with SQL, as well as XML databases such as XIndice and eXist and libxml2-based documents, queried with XPath or XQuery), providing a strong data virtualization layer in a grid environment.
This technological solution for distributed metadata management (i) leverages well-known, widely adopted standards (W3C, OASIS, etc.); (ii) supports role-based management (based on VOMS), which increases flexibility and scalability; (iii) provides full support for the Grid Security Infrastructure (authorization, mutual authentication, data integrity, data confidentiality and delegation); (iv) is compatible with existing grid middleware such as gLite and Globus; and finally (v) is currently adopted at the Euro-Mediterranean Centre for Climate Change (CMCC, Italy) to manage the entire CMCC data production activity, as well as in the international Climate-G testbed.
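
    The XPath-style metadata search that such a virtualization layer exposes can be illustrated with Python's standard library, which supports a subset of XPath; the catalog schema here is invented for illustration, not a CMCC or GRelC format:

```python
import xml.etree.ElementTree as ET

# Toy XML metadata catalog; element and attribute names are assumptions.
catalog = ET.fromstring("""
<catalog>
  <dataset model="ECHAM5"><variable>tas</variable><year>2001</year></dataset>
  <dataset model="ECHAM5"><variable>pr</variable><year>2002</year></dataset>
  <dataset model="CMCC-MED"><variable>tas</variable><year>2001</year></dataset>
</catalog>""")

# All datasets produced by a given model; a full XPath/XQuery engine
# (as in XIndice or eXist) would accept much richer predicates.
hits = catalog.findall(".//dataset[@model='ECHAM5']")
print([d.find("variable").text for d in hits])  # -> ['tas', 'pr']
```

    The point of the virtualization layer is that the same logical query can be routed to relational back ends (as SQL) or to XML back ends (as XPath/XQuery) without the client caring which.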

  19. Beliefs Regarding Classroom Management Style: Relationships to Particular Teacher Personality Characteristics.

    ERIC Educational Resources Information Center

    Martin, Nancy K.; And Others

    This study was a continuation of an in-process research effort to further refine the Inventory of Classroom Management Styles (ICMS), an instrument designed to measure teachers' perceptions of their classroom management beliefs and practices. Using preliminary data analysis based on partial data collection, the primary objective of this study was…

  20. e-Leadership of School Principals: Increasing School Effectiveness by a School Data Management System

    ERIC Educational Resources Information Center

    Blau, Ina; Presser, Ofer

    2013-01-01

    In recent years, school management systems have become an important tool for effective e-leadership and data-based decision making. School management systems emphasize information flow and e-communication between teachers, students and parents. This study examines e-leadership by secondary-school principals through the Mashov school management…

  1. Managing University Research Microdata Collections

    ERIC Educational Resources Information Center

    Woolfrey, Lynn; Fry, Jane

    2015-01-01

    This article examines the management of microdata collections in a university context. It is a cross-country analysis: Collection management at data services in Canada and South Africa are considered. The case studies are of two university sub-contexts: One collection is located in a library; the other at a Faculty-based Data Service. Stages in…

  2. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  3. Site-Based Management versus Systems-Based Thinking: The Impact of Data-Driven Accountability and Reform

    ERIC Educational Resources Information Center

    Mette, Ian M.; Bengtson, Ed

    2015-01-01

    This case was written to help prepare building-level and central office administrators who are expected to effectively lead schools and systems in an often tumultuous world of educational accountability and reform. The intent of this case study is to allow educators to examine the impact data management has on the types of thinking required when…

  4. A Service Oriented Web Application for Learner Knowledge Representation, Management and Sharing Conforming to IMS LIP

    ERIC Educational Resources Information Center

    Lazarinis, Fotis

    2014-01-01

    iLM is a Web-based application for the representation, management and sharing of IMS LIP-conformant user profiles. The tool is developed using a service-oriented architecture with emphasis on easy data sharing. Data elicitation from user profiles is based on XQuery scripts, and sharing with other applications is achieved through…

  5. An on-line image data base system: Managing image collections

    Treesearch

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    Many researchers and land management personnel want photographic records of the phases of their studies or projects. Depending on the personnel and the type of project, a study can result in a few or hundreds of photographic images. A data base system allows users to query using various parameters, such as key words, dates, and project locations, and to view images...

  6. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  7. Estuary Data Mapper: A coastal information system to propel emerging science and inform environmental management decisions

    EPA Science Inventory

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data, aimed at promoting research and aiding environmental management. The graphical user interface allows users to select and subset data based on their spatial and temporal interests, giving them...

  8. Generalized Data Management Systems--Some Perspectives.

    ERIC Educational Resources Information Center

    Minker, Jack

    A Generalized Data Management System (GDMS) is a software environment provided as a tool for analysts, administrators, and programmers who are responsible for the maintenance, query and analysis of a data base to permit the manipulation of newly defined files and data with the existing programs and system. Because the GDMS technology is believed…

  9. Using a data base management system for modelling SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings of the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS that included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-specific evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.
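
    The kind of query a relationally complete language makes straightforward can be sketched with SQLite; the tables, columns and test identifiers below are invented for illustration, not the actual SSME test-history schema:

```python
import sqlite3

# Toy test-history schema: firings plus anomalies logged against them.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE test_firing (test_id TEXT PRIMARY KEY, stand TEXT, duration_s REAL);
CREATE TABLE anomaly (test_id TEXT REFERENCES test_firing, component TEXT);
INSERT INTO test_firing VALUES ('901-001','A1',520.0),('901-002','A1',300.0),
                               ('902-115','A2',750.0);
INSERT INTO anomaly VALUES ('901-001','turbopump'),('902-115','nozzle');
""")

# A representative ad hoc query: long-duration firings that logged a
# turbopump anomaly -- a join plus selection, stated declaratively.
rows = db.execute("""
    SELECT t.test_id, t.duration_s FROM test_firing t
    JOIN anomaly a ON a.test_id = t.test_id
    WHERE a.component = 'turbopump' AND t.duration_s > 400
""").fetchall()
print(rows)  # -> [('901-001', 520.0)]
```

    In the hierarchical and network systems the study compared, the same question requires navigating record structures procedurally, which is exactly the user-interface-complexity criterion the evaluation measured.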

  10. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard consisting of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resource management. The results show that address coding based on postal codes is stable and easy to memorize, that two-dimensional coding based on direction and distance is easy to locate and memorize, and that the extended code enhances the extensibility and flexibility of the address geocoding.
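
    The three-part code structure proposed (an absolute position code from the postal zone, a relative position code from direction and distance, plus an extended code) might be composed as in this sketch; the field widths and the digit encoding of directions are assumptions for illustration, not the paper's standard:

```python
# Hypothetical direction-to-digit encoding for the relative code.
DIRECTIONS = {"N": "1", "NE": "2", "E": "3", "SE": "4",
              "S": "5", "SW": "6", "W": "7", "NW": "8"}

def rural_address_code(postal_code, direction, distance_m, extension="00"):
    """Compose absolute + relative + extended parts into one address code."""
    absolute = postal_code                                  # stable, memorable
    relative = DIRECTIONS[direction] + f"{distance_m:04d}"  # direction + distance
    return f"{absolute}-{relative}-{extension}"

code = rural_address_code("100101", "NE", 350)
print(code)  # -> 100101-20350-00
```

    Keeping the extended part as a separate trailing field is what gives the scheme the extensibility the abstract claims: new attributes can be appended without disturbing codes already assigned.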

  11. An integrated solution for remote data access

    NASA Astrophysics Data System (ADS)

    Sapunenko, Vladimir; D'Urso, Domenico; dell'Agnello, Luca; Vagnoni, Vincenzo; Duranti, Matteo

    2015-12-01

    Data management constitutes one of the major challenges that a geographically-distributed e-Infrastructure has to face, especially when remote data access is involved. We discuss an integrated solution which enables transparent and efficient access to on-line and near-line data through high-latency networks. The solution is based on the joint use of the General Parallel File System (GPFS) and the Tivoli Storage Manager (TSM). Both products, developed by IBM, are well known and extensively used in the HEP computing community. Owing to a new feature introduced in GPFS 3.5, called Active File Management (AFM), it becomes possible to define a single, geographically-distributed namespace with automated data flow management between different locations. As a practical example, we present the implementation of AFM-based remote data access between two data centres located in Bologna and Rome, demonstrating the validity of the solution for the use case of the AMS experiment, an astro-particle experiment supported by the INFN CNAF data centre with large disk space requirements (more than 1.5 PB).

  12. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  13. An image based information system - Architecture for correlating satellite and topological data bases

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1978-01-01

    The paper describes the development of an image based information system and its use to process a Landsat thematic map showing land use or land cover in conjunction with a census tract polygon file to produce a tabulation of land use acreages per census tract. The system permits the efficient cross-tabulation of two or more geo-coded data sets, thereby setting the stage for the practical implementation of models of diffusion processes or cellular transformation. Characteristics of geographic information systems are considered, and functional requirements, such as data management, geocoding, image data management, and data analysis are discussed. The system is described, and the potentialities of its use are examined.
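
    The cross-tabulation at the heart of the system (land-cover acreage per census tract) reduces to counting co-registered cells; this toy sketch uses invented data and a nominal per-cell area:

```python
from collections import defaultdict

# Each co-registered cell carries a (tract, land_cover) pair; the
# per-cell area is a stand-in value, roughly a Landsat MSS pixel.
CELL_AREA_ACRES = 0.22

cells = [("tract-01", "urban"), ("tract-01", "urban"), ("tract-01", "forest"),
         ("tract-02", "water"), ("tract-02", "urban")]

acreage = defaultdict(float)
for tract, cover in cells:
    acreage[(tract, cover)] += CELL_AREA_ACRES

for (tract, cover), acres in sorted(acreage.items()):
    print(f"{tract} {cover:7s} {acres:.2f} acres")
```

    Once both layers share a common geocode, any number of data sets can be cross-tabulated the same way, which is the property that enables the diffusion and cellular-transformation models the abstract mentions.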

  14. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and ensure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  15. Enabling long-term oceanographic research: Changing data practices, information management strategies and informatics

    NASA Astrophysics Data System (ADS)

    Baker, Karen S.; Chandler, Cynthia L.

    2008-09-01

    Interdisciplinary global ocean science requires new ways of thinking about data and data management. With new data policies and growing technological capabilities, datasets of increasing variety and complexity are being made available digitally and data management is coming to be recognized as an integral part of scientific research. To meet the changing expectations of scientists collecting data and of data reuse by others, collaborative strategies involving diverse teams of information professionals are developing. These changes are stimulating the growth of information infrastructures that support multi-scale sampling, data repositories, and data integration. Two examples of oceanographic projects incorporating data management in partnership with science programs are discussed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned from a decade of data management within these communities provide an experience base from which to develop information management strategies—short-term and long-term. Ocean Informatics provides one example of a conceptual framework for managing the complexities inherent to sharing oceanographic data. Elements are introduced that address the economies-of-scale and the complexities-of-scale pertinent to a broader vision of information management and scientific research.

  16. Novel, Web-based, information-exploration approach for improving operating room logistics and system processes.

    PubMed

    Nagy, Paul G; Konewko, Ramon; Warnock, Max; Bernstein, Wendy; Seagull, Jacob; Xiao, Yan; George, Ivan; Park, Adrian

    2008-03-01

    Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a seamless and proactive approach to streamlining operations and minimizing delays. The challenge lies in aggregating and displaying these data in an easily accessible format that provides useful, timely information on current operations. A Web-based, graphical dashboard is described in this study, which can be used to interpret clinical operational data, allow managers to see trends in data, and help identify inefficiencies that were not apparent with more traditional, paper-based approaches. The dashboard provides a visual decision support tool that assists managers in pinpointing areas for continuous quality improvement. The limitations of paper-based techniques, the development of the automated display system, and key performance indicators in analyzing aggregate delays, time, specialties, and teamwork are reviewed. Strengths, weaknesses, opportunities, and threats associated with implementing such a program in the perioperative environment are summarized.

  17. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  18. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  19. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  20. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  1. 15 CFR Appendix II to Subpart P of... - Existing Management Areas Boundary Coordinates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....43.8′ N 81 deg.48.6′ W. Key West National Wildlife Refuge [Based on the North American Datum of 1983... COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT NATIONAL MARINE SANCTUARY PROGRAM REGULATIONS Florida Keys... Administration Key Largo-Management Area [Based on differential Global Positioning Systems data] Point Latitude...

  2. Study on data model of large-scale urban and rural integrated cadastre

    NASA Astrophysics Data System (ADS)

    Peng, Liangyong; Huang, Quanyi; Gao, Dequan

    2008-10-01

Urban and Rural Integrated Cadastre (URIC) has been the subject of great interest in modern cadastre management, and it is highly desirable to develop a rational data model for establishing a URIC information system. In this paper, the old cadastral management mode in China is first introduced, its limitations are analyzed, and the concept of URIC and its development course in China are described. Then, based on the requirements of cadastre management in developed regions, the goal of URIC and two key ideas for realizing it are proposed, a conceptual management mode is studied, and a data model of URIC is designed. Finally, based on the raw data of a 1:1000-scale land use survey and a 1:500-scale urban conventional cadastral survey in Jiangyin city, a well-defined URIC information system was established according to the data model, and uniform management of land use rights and landownership in urban and rural areas was successfully realized, demonstrating the model's feasibility and practicability.

  3. Data management integration for biomedical core facilities

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Qiang; Szymanski, Jacek; Wilson, David

    2007-03-01

We present the design, development, and pilot-deployment experiences of MIMI, a web-based, Multi-modality Multi-Resource Information Integration environment for biomedical core facilities. This is an easily customizable, web-based software tool that integrates scientific and administrative support for a biomedical core facility involving a common set of entities: researchers; projects; equipment and devices; support staff; services; samples and materials; experimental workflow; large and complex data. With this software, one can: register users; manage projects; schedule resources; bill services; perform site-wide search; and archive, back up, and share data. With its customizable, expandable, and scalable characteristics, MIMI not only provides a cost-effective solution, unavailable in the marketplace, to the overarching data management problem of biomedical core facilities, but also lays a foundation for data federation to facilitate and support discovery-driven research.

  4. Geospatial considerations for a multiorganizational, landscape-scale program

    USGS Publications Warehouse

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  5. Pathogen profiling for disease management and surveillance.

    PubMed

    Sintchenko, Vitali; Iredell, Jonathan R; Gilbert, Gwendolyn L

    2007-06-01

    The usefulness of rapid pathogen genotyping is widely recognized, but its effective interpretation and application requires integration into clinical and public health decision-making. How can pathogen genotyping data best be translated to inform disease management and surveillance? Pathogen profiling integrates microbial genomics data into communicable disease control by consolidating phenotypic identity-based methods with DNA microarrays, proteomics, metabolomics and sequence-based typing. Sharing data on pathogen profiles should facilitate our understanding of transmission patterns and the dynamics of epidemics.

  6. RDBMS Applications as Online Based Data Archive: A Case of Harbour Medical Center in Pekanbaru

    NASA Astrophysics Data System (ADS)

    Febriadi, Bayu; Zamsuri, Ahmad

    2017-12-01

Kantor Kesehatan Pelabuhan Kelas II Pekanbaru is a government office concerned with health, especially environmental health. The office has a problem storing electronic data and analyzing daily data, both internal and external. Although it has computers and other tools useful for storing electronic data, in practice the data are still kept in cupboards, which is inefficient for important data that must be analyzed more than once; in other words, it is unsuitable for data that need to be analyzed continuously. The Relational Database Management System (RDBMS) application presented here is an online data archive developed using the System Development Life Cycle (SDLC) method. The application is expected to be very useful for employees of Kantor Kesehatan Pelabuhan Pekanbaru in managing their work.

  7. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operating on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general enough to handle other user-definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resulting MINI System provides all of these capabilities and several other features to complement the data management system. MINIS is currently implemented on two minicomputer systems, is in the process of being installed on a third, and is operational on four different data bases.

  8. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: manages strategies for distributing the data over nodes. Data Manager: handles the import of domain data from long-term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: a kernel is an encapsulated computational unit which executes a processor's compute task; each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
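The WPS request pattern described above can be sketched as a plain keyword-value HTTP call. This is a minimal illustration of WPS 1.0-style KVP encoding only; the endpoint URL, kernel name, and input parameters below are hypothetical placeholders, not the actual CDAS API:

```python
# Sketch of composing a WPS-style Execute request URL (KVP encoding).
# The server URL and process/input names are hypothetical, for illustration only.
import urllib.parse

def build_wps_execute_url(base_url, process_id, inputs):
    """Compose a WPS 1.0-style Execute request URL from key-value inputs."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "datainputs": data_inputs,   # inputs are packed into one KVP field
    }
    return base_url + "?" + urllib.parse.urlencode(params)

url = build_wps_execute_url(
    "https://example.gov/cdas/wps",          # hypothetical server
    "average",                               # hypothetical kernel name
    {"variable": "tas", "domain": "d0"},     # hypothetical inputs
)
print(url)
```

A client library would wrap such a call, poll for status, and fetch the result, so the computation runs server-side next to the data.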

  9. Archive, Access, and Supply of Scientifically Derived Data: A Data Model for Multi-Parameterized Querying Where Spectral Data Base Meets GIS-Based Mapping Archive

    NASA Astrophysics Data System (ADS)

    Nass, A.; D'Amore, M.; Helbert, J.

    2018-04-01

Based on recent discussions within information science and management, an archiving structure and reference level for derived, already-published data significantly supports the scientific community through a constant rise of knowledge and understanding.

  10. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    PubMed

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  11. GraphMeta: Managing HPC Rich Metadata in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Chen, Yong; Carns, Philip

High-performance computing (HPC) systems face increasingly critical metadata management challenges, especially in the approaching exascale era. These challenges arise not only from exploding metadata volumes, but also from increasingly diverse metadata, which contains data provenance and arbitrary user-defined attributes in addition to traditional POSIX metadata. This ‘rich’ metadata is becoming critical to supporting advanced data management functionality such as data auditing and validation. In our prior work, we identified a graph-based model as a promising solution to uniformly manage HPC rich metadata due to its flexibility and generality. However, at the same time, graph-based HPC rich metadata management also introduces significant challenges to the underlying infrastructure. In this study, we first identify the challenges on the underlying infrastructure to support scalable, high-performance rich metadata management. Based on that, we introduce GraphMeta, a graph-based engine designed for this use case. It achieves performance scalability by introducing a new graph partitioning algorithm and a write-optimal storage engine. We evaluate GraphMeta under both synthetic and real HPC metadata workloads, compare it with other approaches, and demonstrate its advantages in terms of efficiency and usability for rich metadata management in HPC systems.
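The graph model described above can be sketched in miniature: files, jobs, and users become vertices with user-defined attributes, provenance relationships become labeled edges, and vertices are hash-partitioned across servers. This is an illustrative toy, not GraphMeta's actual partitioning algorithm or storage engine:

```python
# Sketch: rich metadata as a property graph with simple hash partitioning.
# Illustrative only; GraphMeta's real partitioning and storage are more involved.
from collections import defaultdict

class MetadataGraph:
    def __init__(self, num_partitions=4):
        self.num_partitions = num_partitions
        # one adjacency map per partition (stand-in for one map per server)
        self.partitions = [defaultdict(list) for _ in range(num_partitions)]
        self.attrs = {}  # vertex -> arbitrary user-defined attributes

    def _part(self, vertex):
        # hash partitioning: each vertex (and its out-edges) lands on one partition
        return hash(vertex) % self.num_partitions

    def add_vertex(self, vertex, **attributes):
        self.attrs[vertex] = attributes

    def add_edge(self, src, dst, label):
        # edges are co-located with their source vertex (edge-cut style)
        self.partitions[self._part(src)][src].append((label, dst))

    def neighbors(self, vertex, label=None):
        edges = self.partitions[self._part(vertex)][vertex]
        return [dst for (lbl, dst) in edges if label is None or lbl == label]

g = MetadataGraph()
g.add_vertex("job42", user="alice")
g.add_vertex("out.nc", size=1024)
g.add_edge("out.nc", "job42", "wasGeneratedBy")  # provenance: output -> producing job
print(g.neighbors("out.nc", "wasGeneratedBy"))
```

Data auditing then becomes a graph traversal (e.g. follow "wasGeneratedBy" edges from a suspect file back to the jobs and users that produced it).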

  12. DataHub: Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  13. DataHub - Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  14. Using the web for recruitment, screen, tracking, data management, and quality control in a dietary assessment clinical validation trial.

    PubMed

    Arab, Lenore; Hahn, Harry; Henry, Judith; Chacko, Sara; Winter, Ashley; Cambou, Mary C

    2010-03-01

Screening and tracking subjects and data management in clinical trials require significant investments in manpower that can be reduced through the use of web-based systems. To support a validation trial of various dietary assessment tools that required multiple clinic visits and eight repeats of online assessments, we developed an interactive web-based system to automate all levels of management of a biomarker-based clinical trial. The "Energetics System" was developed to support 1) the work of the study coordinator in recruiting, screening and tracking subject flow, 2) the need of the principal investigator to review study progress, and 3) continuous data analysis. The system was designed to automate web-based self-screening into the trial. It supported scheduling tasks and triggered tailored messaging for late and non-responders. For the investigators, it provided real-time status overviews on all subjects, created electronic case reports, supported data queries and prepared analytic data files. Encryption and multi-level password protection were used to ensure data privacy. The system was programmed iteratively and required six months of a web programmer's time along with active team engagement. In this study the enhancement in speed and efficiency of recruitment and quality of data collection as a result of this system outweighed the initial investment. Web-based systems have the potential to streamline the process of recruitment and day-to-day management of clinical trials in addition to improving efficiency and quality. Because of their added value they should be considered for trials of moderate size or complexity.

  15. East African wetland-catchment data base for sustainable wetland management

    NASA Astrophysics Data System (ADS)

    Leemhuis, Constanze; Amler, Esther; Diekkrüger, Bernd; Gabiri, Geofrey; Näschen, Kristian

    2016-10-01

Wetlands cover an area of approximately 18 million ha in the East African countries of Kenya, Rwanda, Uganda and Tanzania, with still a relatively small share being used for food production. Intensification of upland agricultural use in these countries, driven by demographic growth, climate change and globalization effects, is leading to over-exploitation of the resource base, followed by an intensification of agricultural wetland use. We aim to translate, transfer and upscale knowledge on experimental test-site wetland properties, small-scale hydrological processes, and water-related ecosystem services under different types of management from the local to the national scale. This information, gained at the experimental wetland/catchment scale, will be embedded as reference data within an East African wetland-catchment data base that includes catchment physical properties and a regional wetland inventory, serving as a basis for policy advice and the development of sustainable wetland management strategies.

  16. Improving healthcare services using web based platform for management of medical case studies.

    PubMed

    Ogescu, Cristina; Plaisanu, Claudiu; Udrescu, Florian; Dumitru, Silviu

    2008-01-01

The paper presents a web-based platform for the management of medical cases, supporting healthcare specialists in making the best clinical decisions. Research has been oriented mostly toward multimedia data management and classification algorithms for querying, retrieving and processing different medical data types (text and images). The medical case studies can be accessed by healthcare specialists, and by students as anonymous case studies, providing trust and confidentiality in the Internet virtual environment. The MIDAS platform develops an intelligent framework to manage sets of medical data (text, static or dynamic images) in order to optimize the diagnosis and decision process, which will reduce medical errors and increase the quality of the medical act. MIDAS is an integrated project working on medical information retrieval from heterogeneous, distributed medical multimedia databases.

  17. The provision and utilisation of casemix and demographic data by nursing managers in seven hospitals.

    PubMed

    Blay, Nicole; Donoghue, Judith

    2003-01-01

The role of the nursing manager has evolved from clinician and bed manager to one with greater accountability for evidence based practice, benchmarking and, more recently, budget liability. Casemix data are widely believed to be a means of providing essential information for effective decision making and financial management, but have not been widely utilised by nursing managers (Diers & Bozzo, 1999). This paper will report the results of a survey of nursing managers in seven hospitals within a metropolitan area health service. The hospitals include tertiary referral hospitals, specialist public hospitals and an affiliated public hospital for aged care and rehabilitation services. The survey sought to establish what casemix and related data were provided to nurse managers, who provided these data and how supplied data were utilised by the nurse managers. Results demonstrated that the majority of nursing managers surveyed received minimal (if any) casemix and/or demographic data on a routine basis. Some were provided with data in response to specific requests. The information that was provided varied both within and across hospitals, and no consistent methods of data distribution were available. Few nursing managers believed that the information provided aided their decision-making processes, partly due to the minimal nature of the data supplied, while some nursing managers demonstrated a lack of understanding of the potential benefit of casemix data as a resource to support management decision making.

  18. The relationship between energy information management and energy management performance in higher education sector in Thailand, considering from resource and process based views

    NASA Astrophysics Data System (ADS)

    Mongkolsawat, Darunee

The performance of energy management is usually judged by the energy reduction achieved; however, this is not sufficient for managing a facility's energy in the long term. Accordingly, this study investigates the relationship between the effectiveness of energy information management and energy management performance. The sector of interest is higher education institutions in Thailand, owing to their organisational complexity in both management and property aspects. Rather than centring on quantitative energy reduction, the study seeks to establish a framework, or tool, for understanding this relationship qualitatively through resource- and process-based views of the organisation, with energy management structure also accounted for as an initial factor. Within this framework, the performance of energy management is considered through its primary results concerning the data available, analysis results, and energy actions. The investigation found various specific relationships between the factors concerned and primary performance. For example, some tend to be direct connections, such as those between the energy management structure and implemented actions, and between investment in organisational resources and the data available, while others are more flexible, such as the relation between data collection and the results of analysed data. Furthermore, the load of energy management was found to influence an organisation's motivation to invest in energy management. The paper closes with proposals for further application of the study.

  19. A GIS-based Model for Natural Gas Data Conversion

    NASA Astrophysics Data System (ADS)

    Bitik, E.; Seker, D. Z.; Denli, H. H.

    2014-12-01

In Turkey, the gas utility sector has undergone major changes in terms of increased competition between gas providers, efforts to improve services, and the application of new technological solutions. This paper discusses the challenges gas companies face in moving from long workflows of gas distribution, sales and maintenance to IT-driven, efficient management of complex spatial and non-spatial information. The aim of this study is the migration of all gas data and information into a GIS environment so that all infrastructure investments can be managed and operated with a Utility Management System. A data conversion model for the migration was designed and tested during the study, and a flowchart was formed to transfer the old data layers to the new geodatabase-based structure.

  20. Data bases and data base systems related to NASA's Aerospace Program: A bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 641 reports, articles, and other documents introduced into the NASA scientific and technical information system during the period January 1, 1981 through June 30, 1982. The directory was compiled to assist in the location of numerical and factual data bases and data base handling and management systems.

  1. Database Administration: Concepts, Tools, Experiences, and Problems.

    ERIC Educational Resources Information Center

    Leong-Hong, Belkis; Marron, Beatrice

    The concepts of data base administration, the role of the data base administrator (DBA), and computer software tools useful in data base administration are described in order to assist data base technologists and managers. A study of DBA's in the Federal Government is detailed in terms of the functions they perform, the software tools they use,…

  2. A Distributed Data Base Version of INGRES.

    ERIC Educational Resources Information Center

    Stonebraker, Michael; Neuhold, Eric

    Extensions are required to the currently operational INGRES data base system for it to manage a data base distributed over multiple machines in a computer network running the UNIX operating system. Three possible user views include: (1) each relation in a unique machine, (2) a user interaction with the data base which can only span relations at a…

  3. Research of the small satellite data management system

    NASA Astrophysics Data System (ADS)

    Yu, Xiaozhou; Zhou, Fengqi; Zhou, Jun

    2007-11-01

Small satellites integrate light weight, small volume and low launch cost, and are a promising approach to realizing future space missions. A detailed study of the data management system has been carried out using a new reconfiguration method based on System On Programmable Chip (SOPC). Compared with the common satellite structure, the Central Terminal Unit (CTU), the Remote Terminal Unit (RTU) and the Serial Data Bus (SDB) of the data management system are all integrated on a single chip, greatly improving the reliability of the satellite. At the same time, the data management system offers powerful performance owing to modern FPGA processing ability.

  4. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time due to the introduction of the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered in each duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.
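The core mechanic of a Monte Carlo treatment of interval-valued criteria can be sketched as follows: draw one realization from each interval, score every alternative, and count which alternative wins most often over many draws. The alternatives, intervals, weights, and scoring rule below are illustrative assumptions, not the paper's actual model:

```python
# Sketch of Monte Carlo sampling over interval-valued criteria, in the spirit
# of interval-based MCDA. All numbers and the weighted-sum rule are illustrative.
import random

# each alternative: criterion -> (lower, upper) interval; lower is better
alternatives = {
    "action_A": {"cost": (1.0, 2.0), "risk": (0.2, 0.4)},
    "action_B": {"cost": (1.5, 1.8), "risk": (0.1, 0.5)},
}
weights = {"cost": 0.6, "risk": 0.4}  # assumed criterion weights

def rank_once():
    """Draw one realization from each interval and return the best alternative."""
    scores = {}
    for name, criteria in alternatives.items():
        scores[name] = sum(weights[c] * random.uniform(lo, hi)
                           for c, (lo, hi) in criteria.items())
    return min(scores, key=scores.get)  # lower weighted score is better

def most_desirable(n_samples=10_000):
    """Alternative winning most often across Monte Carlo realizations."""
    wins = {name: 0 for name in alternatives}
    for _ in range(n_samples):
        wins[rank_once()] += 1
    return max(wins, key=wins.get)

random.seed(0)
print(most_desirable())
```

Sampling within the intervals is what lets the ranking reflect data uncertainty rather than a single point estimate per criterion.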

  5. IT Operational Risk Measurement Model Based on Internal Loss Data of Banks

    NASA Astrophysics Data System (ADS)

    Hao, Xiaoling

Business operation of banks relies increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. Therefore, IT risk management efforts need to be seen from the perspective of operational continuity. Traditional IT risk studies focused on IT asset-based risk analysis and risk-matrix-based qualitative risk evaluation. In practice, IT risk management in the banking industry is still limited to the IT department and is not integrated into business risk management, which causes the two departments to work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events and uses Monte Carlo simulation to predict potential losses. We establish the correlation between IT resources and business processes so that IT and business risk management can work synergistically.
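A common quantitative pattern for this kind of loss prediction is the loss-distribution approach: fit an event-frequency and an event-severity distribution to internal loss data, then Monte Carlo the aggregate annual loss. The Poisson/lognormal choice and all parameters below are illustrative assumptions, not the paper's fitted model:

```python
# Sketch of a loss-distribution Monte Carlo: Poisson event frequency,
# lognormal severity per event, aggregated per simulated year.
# Distributions and parameters are illustrative assumptions.
import math
import random

def poisson_sample(lam, rng):
    """Knuth's algorithm: count uniform draws until their product falls below e^-lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def annual_loss(lam, mu, sigma, rng):
    """One simulated year: Poisson event count, lognormal loss per event."""
    n_events = poisson_sample(lam, rng)
    return sum(rng.lognormvariate(mu, sigma) for _ in range(n_events))

rng = random.Random(1)
# simulate 5000 years with ~12 IT loss events/year (assumed parameters)
losses = sorted(annual_loss(12, 8.0, 1.2, rng) for _ in range(5000))
var_99 = losses[int(0.99 * len(losses))]  # 99th-percentile annual loss
print(f"99th-percentile simulated annual IT loss: {var_99:,.0f}")
```

The tail quantile of the simulated aggregate distribution is the kind of figure that lets IT events be expressed in the same operational-risk terms the business side already uses.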

  6. Towards building high performance medical image management system for clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel

    2011-03-01

Medical image based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large scale image assessment is often performed by a large group of experts who retrieve images from a centralized image repository to workstations to mark up and annotate them. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of data versioning for audit trails. We study the major bottlenecks for such a system, and propose and evaluate a solution using hybrid image storage with solid state drives and hard disk drives, RESTful Web Services-based protocols for exchanging image data, and a database-based versioning scheme for efficient archival of image revision history. Our experiments show promising results for these methods, and our work provides a guideline for building enterprise-level, high performance medical image management systems.
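A database-backed versioning scheme of the kind described above can be sketched as an append-only revision table: each markup is a new row keyed by (image id, version), so prior versions survive for the audit trail. The schema and column names here are illustrative assumptions, not the paper's actual design:

```python
# Sketch: append-only revision table for image annotation history.
# Schema and names are illustrative; uses an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_revision (
        image_id   TEXT NOT NULL,
        version    INTEGER NOT NULL,
        annotation TEXT,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP,
        PRIMARY KEY (image_id, version)
    )
""")

def save_revision(image_id, annotation):
    """Append a new revision; earlier versions stay intact for the audit trail."""
    cur = conn.execute(
        "SELECT COALESCE(MAX(version), 0) + 1 FROM image_revision "
        "WHERE image_id = ?", (image_id,))
    next_version = cur.fetchone()[0]
    conn.execute(
        "INSERT INTO image_revision (image_id, version, annotation) "
        "VALUES (?, ?, ?)", (image_id, next_version, annotation))
    return next_version

save_revision("scan-001", "lesion at (120, 88)")
v = save_revision("scan-001", "lesion at (121, 90), revised")
print(v)
```

Because rows are never updated in place, the full revision history of every image remains queryable, which is exactly what an audit trail requires.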

  7. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities, including, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as bridge engineering models and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.

  8. International Expert Consensus Document on Takotsubo Syndrome (Part II): Diagnostic Workup, Outcome, and Management.

    PubMed

    Ghadri, Jelena-Rima; Wittstein, Ilan Shor; Prasad, Abhiram; Sharkey, Scott; Dote, Keigo; Akashi, Yoshihiro John; Cammann, Victoria Lucia; Crea, Filippo; Galiuto, Leonarda; Desmet, Walter; Yoshida, Tetsuro; Manfredini, Roberto; Eitel, Ingo; Kosuge, Masami; Nef, Holger M; Deshmukh, Abhishek; Lerman, Amir; Bossone, Eduardo; Citro, Rodolfo; Ueyama, Takashi; Corrado, Domenico; Kurisu, Satoshi; Ruschitzka, Frank; Winchester, David; Lyon, Alexander R; Omerovic, Elmir; Bax, Jeroen J; Meimoun, Patrick; Tarantini, Guiseppe; Rihal, Charanjit; Y-Hassan, Shams; Migliore, Federico; Horowitz, John D; Shimokawa, Hiroaki; Lüscher, Thomas Felix; Templin, Christian

    2018-06-07

    The clinical expert consensus statement on takotsubo syndrome (TTS) part II focuses on the diagnostic workup, outcome, and management. The recommendations are based on interpretation of the limited clinical trial data currently available and the experience of international TTS experts. It summarizes the diagnostic approach, which may facilitate correct and timely diagnosis. Furthermore, the document covers areas where controversies still exist in the risk stratification and management of TTS. Based on the available data, the document provides practising physicians with recommendations on the optimal care of such patients.

  9. International Expert Consensus Document on Takotsubo Syndrome (Part II): Diagnostic Workup, Outcome, and Management

    PubMed Central

    Ghadri, Jelena-Rima; Wittstein, Ilan Shor; Prasad, Abhiram; Sharkey, Scott; Dote, Keigo; Akashi, Yoshihiro John; Cammann, Victoria Lucia; Crea, Filippo; Galiuto, Leonarda; Desmet, Walter; Yoshida, Tetsuro; Manfredini, Roberto; Eitel, Ingo; Kosuge, Masami; Nef, Holger M; Deshmukh, Abhishek; Lerman, Amir; Bossone, Eduardo; Citro, Rodolfo; Ueyama, Takashi; Corrado, Domenico; Kurisu, Satoshi; Ruschitzka, Frank; Winchester, David; Lyon, Alexander R; Omerovic, Elmir; Bax, Jeroen J; Meimoun, Patrick; Tarantini, Guiseppe; Rihal, Charanjit; Y.-Hassan, Shams; Migliore, Federico; Horowitz, John D; Shimokawa, Hiroaki; Lüscher, Thomas Felix; Templin, Christian

    2018-01-01

    The clinical expert consensus statement on takotsubo syndrome (TTS) part II focuses on the diagnostic workup, outcome, and management. The recommendations are based on interpretation of the limited clinical trial data currently available and the experience of international TTS experts. It summarizes the diagnostic approach, which may facilitate correct and timely diagnosis. Furthermore, the document covers areas where controversies still exist in the risk stratification and management of TTS. Based on the available data, the document provides practising physicians with recommendations on the optimal care of such patients. PMID:29850820

  10. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can greatly enhance the capabilities of a GIS, particularly in handling the very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, and new data representations. The system, which will be developed further and applied, is a prototype of the next generation of GISs, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.
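    The defining property named in this abstract, spatially indexed data, can be illustrated with a toy uniform-grid index in which a box query visits only the cells it overlaps. Real systems, KBGIS included, use far richer structures (quadtrees, R-trees); the entity names here are invented:

```python
from collections import defaultdict

# Toy spatial index: a uniform grid mapping cells to the entities they
# contain, so a bounding-box query scans only the overlapping cells
# instead of every record. Illustrative only; entity names are invented.
CELL = 10.0  # grid cell size in map units

index = defaultdict(list)  # (cell_x, cell_y) -> [(entity_id, x, y), ...]

def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))

def insert(entity_id, x, y):
    index[cell_of(x, y)].append((entity_id, x, y))

def query_box(xmin, ymin, xmax, ymax):
    """Return entities inside the box, visiting only overlapping cells."""
    hits = []
    for cx in range(int(xmin // CELL), int(xmax // CELL) + 1):
        for cy in range(int(ymin // CELL), int(ymax // CELL) + 1):
            for eid, x, y in index[(cx, cy)]:
                if xmin <= x <= xmax and ymin <= y <= ymax:
                    hits.append(eid)
    return hits

insert("well-1", 3.0, 4.0)
insert("well-2", 55.0, 80.0)
print(query_box(0, 0, 10, 10))  # -> ['well-1']
```

    The query cost scales with the area of the box rather than the size of the data base, which is what makes spatial indexing the foundation a GIS builds on.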

  11. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Integrated information processing requirements

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1979-01-01

    The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.

  12. A coastal information system to propel emerging science and inform environmental management decisions

    EPA Science Inventory

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data, aimed at promoting research and aiding environmental management. The graphical user interface allows users to custom select and subset data based on their spatial and temporal interests, giving them...

  13. 44 CFR 65.3 - Requirement to submit new technical data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... technical data. 65.3 Section 65.3 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY... IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.3 Requirement to submit new technical data. A community's base flood elevations may increase or decrease resulting from physical changes affecting flooding...

  14. Data Element Dictionary: Staff. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those staff-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  15. Data Element Dictionary: Facilities. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those facilities-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  16. Data Element Dictionary: Course. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those course-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  17. Data Element Dictionary: Finance. Second Edition.

    ERIC Educational Resources Information Center

    Martin, James S.

    This document is intended to serve as a guide for institutions in the development of data bases to support the implementation of planning and management systems. This publication serves to identify and describe those finance-related data elements: (1) required to support current National Center for Higher Education Management Systems (NCHEMS)…

  18. Optical mass memory system (AMM-13). AMM/DBMS interface control document

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.

    1980-01-01

    The baseline for the external interfaces of a 10^13-bit optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; the AMM-13/Data Base Management System/NASA End-to-End Data System computer interconnect; data/control input and output interfaces; the test input data source; file management; and the facilities interface.

  19. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts. Final Report, Year 2.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    This document examines the design and structure of PMIS (Planning and Management Information System), an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time,…

  20. Region and database management for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing the core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, comprising PassPort and PeopleSoft software, supports finance, supply and chemical management/Material Safety Data Sheets, and human resources.

  1. Office automation: The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1985-01-01

    In parallel with the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three- to five-year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system, manned by a team of facilitators seeking opportunities to serve end users, could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the manager's Management Information System.

  2. Internationally coordinated multi-mission planning is now critical to sustain the space-based rainfall observations needed for managing floods globally

    NASA Astrophysics Data System (ADS)

    Reed, Patrick M.; Chaney, Nathaniel W.; Herman, Jonathan D.; Ferringer, Matthew P.; Wood, Eric F.

    2015-02-01

    At present 4 of 10 dedicated rainfall observing satellite systems have exceeded their design life, some by more than a decade. Here, we show operational implications for flood management of a ‘collapse’ of space-based rainfall observing infrastructure as well as the high-value opportunities for a globally coordinated portfolio of satellite missions and data services. Results show that the current portfolio of rainfall missions fails to meet operational data needs for flood management, even when assuming a perfectly coordinated data product from all current rainfall-focused missions (i.e., the full portfolio). In the full portfolio, satellite-based rainfall data deficits vary across the globe and may preclude climate adaptation in locations vulnerable to increasing flood risks. Moreover, removing satellites that are currently beyond their design life (i.e., the reduced portfolio) dramatically increases data deficits globally and could cause entire high intensity flood events to be unobserved. Recovery from the reduced portfolio is possible with internationally coordinated replenishment of as few as 2 of the 4 satellite systems beyond their design life, yielding rainfall data coverages that outperform the current full portfolio (i.e., an optimized portfolio of eight satellites can outperform ten satellites). This work demonstrates the potential for internationally coordinated satellite replenishment and data services to substantially enhance the cost-effectiveness, sustainability and operational value of space-based rainfall observations in managing evolving flood risks.

  3. MushBase: A Mushroom Information Database Application

    PubMed Central

    Le, Vang Quy; Lee, Hyun-Sook

    2007-01-01

    A database application, MushBase, has been built on Microsoft Access to store and manage various kinds of biological information about mushroom species and strains, including physiological characteristics such as geometries and growth conditions. In addition, it is designed to store a second group of information: experimental data on mushroom classification by Random Amplification of Polymorphic DNA (RAPD). The two groups of information are stored and managed in such a way that it is convenient to retrieve each group of data and to cross-reference between them as well. PMID:24015087

  4. Graph-based urban scene analysis using symbolic data

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-07-01

    A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. The method has been designed to use a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the results presented. This structure and uncertainty management tend to reflect the hierarchy of the available data and the interpretation levels.

  5. DataHub knowledge based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.

    1991-01-01

    Viewgraphs on DataHub knowledge based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.

  6. Secure web book to store structural genomics research data.

    PubMed

    Manjasetty, Babu A; Höppner, Klaus; Mueller, Uwe; Heinemann, Udo

    2003-01-01

    Recently established collaborative structural genomics programs aim at significantly accelerating the crystal structure analysis of proteins. These large-scale projects require efficient data management systems to ensure seamless collaboration between different groups of scientists working towards the same goal. Within the Berlin-based Protein Structure Factory, the synchrotron X-ray data collection and the subsequent crystal structure analysis tasks are located at BESSY, a third-generation synchrotron source. To organize file-based communication and data transfer at the BESSY site of the Protein Structure Factory, we have developed the web-based BCLIMS, the BESSY Crystallography Laboratory Information Management System. BCLIMS is a relational data management system powered by MySQL as the database engine and the Apache HTTP Server as the web server. The database interface routines are written in the Python programming language. The software is freely available to academic users. Here we describe the storage, retrieval and manipulation of laboratory information, mainly pertaining to the synchrotron X-ray diffraction experiments and the subsequent protein structure analysis, using BCLIMS.

  7. Adoption of Building Information Modelling in project planning risk management

    NASA Astrophysics Data System (ADS)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    Efficient and effective risk management requires a systematic and proper methodology, besides knowledge and experience. However, if risk management is not discussed from the start of the project, the task becomes notably complicated and no longer efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify traditional risk management practices and their functions, to determine the best function of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. In order to obtain data, a quantitative approach is adopted in this research. Based on the data analysis, the risks that occur when traditional risk management is implemented are a lack of compliance with project requirements and a failure to recognise risks and develop responses to opportunities. When BIM is used in project planning, its tracking of cost control and cash flow helps the project cycle to be completed on time. 5D cost estimation and cash-flow modelling benefit risk management by allowing budget and cost to be planned, controlled and managed reasonably. The two factors that benefit most from BIM-based technology were the formwork plan with an integrated fall-protection plan and the design-for-safety model check. By adopting risk management, the potential risks linked with a project can be identified and responded to, reducing them to an acceptable extent; that is, potential risks are recognized and threats avoided by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects. It benefits the construction players in various aspects. It is important to know the application of BIM-based risk management, as it can serve as a lesson learnt for others implementing BIM and can increase the quality of the project.

  8. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM), and to establish a framework for integrating the three with each other. The paper also seeks to establish a KM- and aCRM-based framework using data mining (DM) techniques, which…

  9. Using School Performance Data to Drive School and Education District Office Accountability and Improvement: The Case of Ghana

    ERIC Educational Resources Information Center

    Prew, Martin; Quaigrain, Kenneth

    2010-01-01

    This article looks at a school management tool that allows school managers and education district offices to review the performance of their schools and use the broad-based data to undertake orchestrated planning, with districts planning delivery based on the needs of schools and in support of school improvement plans. The review process also…

  10. Proceedings of the National Conference on Energy Resource Management. Volume 1: Techniques, Procedures and Data Bases

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O. (Editor); Schiffman, Y. M. (Editor)

    1982-01-01

    Topics dealing with the integration of remotely sensed data with geographic information systems for application in energy resources management are discussed. Associated remote sensing and image analysis techniques are also addressed.

  11. Application of Ontology Technology in Health Statistic Data Analysis.

    PubMed

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistic data. Proposed methods: this paper established a health management ontology based on analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of the health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. Through ontology instantiation, multi-source heterogeneous data can be integrated, enabling administrators to have an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and for the enhancement of management efficiency.
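    The ingredients this abstract lists (top-level classes with subclasses, object properties, data properties) can be sketched with plain data structures. All names below are invented for illustration; the actual ontology was built in Protégé from the China Health Statistics Yearbook concepts:

```python
# Hand-rolled sketch of an ontology's building blocks: classes with
# subclasses, object properties linking classes, and data properties
# attaching literal values. All names are invented for illustration.
ontology = {
    "classes": {
        "HealthInstitution": {"subclasses": ["Hospital", "CommunityClinic"]},
        "HealthPersonnel":   {"subclasses": ["Physician", "Nurse"]},
    },
    "object_properties": {
        # an object property links individuals of one class to another
        "employs": {"domain": "HealthInstitution", "range": "HealthPersonnel"},
    },
    "data_properties": {
        # a data property attaches a literal value to an individual
        "bedCount": {"domain": "Hospital", "range": "int"},
    },
}

def is_subclass(onto, child, parent):
    """True if `child` is declared a direct subclass of `parent`."""
    return child in onto["classes"].get(parent, {}).get("subclasses", [])

print(is_subclass(ontology, "Hospital", "HealthInstitution"))  # -> True
```

    Instantiating such classes with individuals drawn from different yearbook tables is what lets multi-source, heterogeneous records be queried through one shared vocabulary.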

  12. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  13. Using Geo-Data Corporately on the Response Phase of Emergency Management

    NASA Astrophysics Data System (ADS)

    Demir Ozbek, E.; Ates, S.; Aydinoglu, A. C.

    2015-08-01

    The response phase of emergency management is the most complex phase in the entire cycle because it requires cooperation between the various actors in the emergency sectors. A variety of geo-data is needed during emergency response, such as existing data provided by different institutions and dynamic data collected by different sectors at the time of the disaster. A disaster event is managed according to an elaborately defined activity-actor-task-geodata cycle. In this concept, every activity of emergency response is determined by a Standard Operating Procedure that enables users to understand their tasks and the data required for any activity. In this study, a general conceptual approach for a disaster and emergency management system is developed based on the regulations, to serve applications in the Istanbul Governorship Provincial Disaster and Emergency Directorate. The approach is applied to an industrial facility explosion example. In the preparation phase, optimum ambulance locations are determined according to the general response time of ambulances to all injury cases, in addition to areas with industrial fire risk. Management of the industrial fire case is organized according to the defined actors, activities and working cycle that describe the required geo-data. A response scenario was prepared and performed for an industrial facility explosion event to exercise an effective working cycle of actors. This scenario provides for the corporate use of geo-data between different actors, while the data required for each task is defined to manage the industrial facility explosion event. Following developments in web technologies, this scenario-based approach can be effective for the corporate use of geo-data on the web.

  14. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
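    The central idea above, letting the database perform part of the analysis through queries instead of looping over binary files, can be sketched minimally. The schema and values are illustrative; the authors' system targets full fMRI time series on production database servers and cluster/Grid resources, not in-memory SQLite:

```python
import sqlite3

# Sketch of query-as-analysis: time-series samples live in a relational
# table, and an aggregate query computes a per-condition summary directly.
# Schema and values are illustrative, not the authors' actual design.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ts (subject TEXT, condition TEXT, t INTEGER, bold REAL)")
conn.executemany("INSERT INTO ts VALUES (?, ?, ?, ?)", [
    ("s1", "speech", 0, 1.0),
    ("s1", "speech", 1, 3.0),
    ("s1", "rest",   0, 0.5),
    ("s2", "speech", 0, 2.0),
])

# Per-condition mean signal, computed inside the query itself:
for cond, mean in conn.execute(
        "SELECT condition, AVG(bold) FROM ts "
        "GROUP BY condition ORDER BY condition"):
    print(cond, mean)  # rest 0.5, then speech 2.0
```

    Compared with a directory of binary files, the same table supports ad hoc filters (by subject, condition, time window) without new parsing code, which is the flexibility the paper contrasts against file-based frameworks.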

  15. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  16. Administering a Web-Based Course on Database Technology

    ERIC Educational Resources Information Center

    de Oliveira, Leonardo Rocha; Cortimiglia, Marcelo; Marques, Luis Fernando Moraes

    2003-01-01

    This article presents a managerial experience with a web-based course on database technology for enterprise management. The course has been developed and managed by a Department of Industrial Engineering at a public university in Brazil. The project's managerial experiences are described, covering its conception stage, where the Virtual Learning…

  17. Data-Centric Situational Awareness and Management in Intelligent Power Systems

    NASA Astrophysics Data System (ADS)

    Dai, Xiaoxiao

    The rapid development of technology and society has made the current power system a much more complicated system than ever. The need for big-data-based situational awareness and management is urgent today. In this dissertation, responding to this grand challenge, two data-centric power system situational awareness and management approaches are proposed, addressing security problems in the transmission/distribution grids and the augmentation of social benefits at the distribution-customer level, respectively. To address the security problem in the transmission/distribution grids using big data, the first approach provides a fault analysis solution based on characterization and analytics of synchrophasor measurements. Specifically, an optimal synchrophasor measurement device selection algorithm (OSMDSA) and a matching pursuit decomposition (MPD) based spatial-temporal synchrophasor data characterization method were developed to reduce data volume while preserving comprehensive information for the big data analyses. A weighted Granger causality (WGC) method was investigated to conduct fault-impact causal analysis during system disturbances for fault localization. Numerical results and comparison with other methods demonstrate the effectiveness and robustness of this analytic approach. As social effects become more important considerations in power system management, the goal of situational awareness should be expanded to include achievements in social benefits. The second approach investigates the concept and application of social energy on the University of Denver campus grid to provide management improvement solutions for optimizing social cost. The social element (human working productivity cost) and the economic element (electricity consumption cost) are both considered in the evaluation of overall social cost. Moreover, power system simulation, numerical experiments for smart building modeling, distribution-level real-time pricing, and social response to the pricing signals are studied for implementing the interactive artificial-physical management scheme.

  18. 75 FR 7234 - South Atlantic Fishery Management Council; Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-18

    ... Atlantic Fishery Management Council; Public Meetings AGENCY: National Marine Fisheries Service (NMFS... meetings. SUMMARY: The South Atlantic Fishery Management Council (Council) will hold meetings of its..., Southeast Data Assessment and Review (SEDAR) Committee, Ecosystem-Based Management Committee, joint Shrimp...

  19. [Accidents at work and occupational diseases trend in agriculture insurance management. The contribution of INAIL data's for the knowledge of a worrying phenomenon].

    PubMed

    Calandriello, Luigi; Goggiamani, Angela; Ienzi, Emanuela; Naldini, Silvia; Orsini, Dario

    2013-01-01

    The authors describe outcome measures for accidents at work and occupational diseases in the Agriculture insurance management, acquired through a statistical approach based on processing of the data provided by the INAIL data bank. Accident incidence in Agriculture is compared with that of the main insurance managements, using the frequency index of accident occurrence selected by line of work and type of consequence. Concerning occupational diseases, the authors describe the complaints and compensations, with a comparison referring the analysis to general statistical data. The data define a worrying phenomenon.

  20. Evidence-based practice beliefs and behaviors of nurses providing cancer pain management: a mixed-methods approach.

    PubMed

    Eaton, Linda H; Meins, Alexa R; Mitchell, Pamela H; Voss, Joachim; Doorenbos, Ardith Z

    2015-03-01

    To describe evidence-based practice (EBP) beliefs and behaviors of nurses who provide cancer pain management. Descriptive, cross-sectional with a mixed-methods approach. Two inpatient oncology units in the Pacific Northwest. 40 RNs. Data collected by interviews and web-based surveys. EBP beliefs, EBP implementation, evidence-based pain management. Nurses agreed with the positive aspects of EBP and their ability to implement it, although the implementation level was low. They were satisfied with their pain management practices. Oncology nursing certification was associated with innovativeness, and innovativeness was associated with EBP beliefs. Themes identified were (a) a limited definition of EBP, (b) varied evidence-based pain management decision making, (c) limited identification of evidence-based pain management practices, and (d) integration of nonpharmacologic interventions into patient care. Nurses' low level of EBP implementation in the context of pain management was explained by their trust that standards of care and medical orders were evidence based. Nurses' EBP beliefs and behaviors should be considered when developing strategies for sustaining evidence-based pain management practices. Implementation of the EBP process by nurses may not be realistic in the inpatient setting; therefore, hospital pain management policies need to be evidence based and reinforced with nurses.

  1. Trust-Based Design of Human-Guided Algorithms

    DTIC Science & Technology

    2007-06-01

    Management Interdepartmental Program in Operations Research 17 May, 2007 Approved by: Laura Major Forest The Charles Stark Draper Laboratory...2. Information Analysis: predicting based on data, integrating and managing information, augmenting human operator perception and cognition. 3...allocation of automation by designers and managers . How an operator decides between manual and automatic control of a system is a necessary

  2. The Influence of Performance-Based Management on Teaching and Research Performance of Finnish Senior Academics

    ERIC Educational Resources Information Center

    Kivistö, Jussi; Pekkola, Elias; Lyytinen, Anu

    2017-01-01

    Despite the widespread use of performance-based management in higher education, empirical research on its actual impact has remained scarce, particularly in Europe. With agency theory as a framework, our study utilised survey data collected from Finnish universities in order to explore the influence of performance management on perceived teaching…

  3. A 3-D terrain visualization database for highway information management

    DOT National Transportation Integrated Search

    1999-07-26

    A Multimedia-based Highway Information System (MMHIS) is described in the paper to improve the existing photologging system for various operation and management needs. The fully digital, computer-based MMHIS uses technologies of video, multimedia data...

  4. Promoting evidence-based childhood fever management through a peer education programme based on the theory of planned behaviour.

    PubMed

    Edwards, Helen; Walsh, Anne; Courtney, Mary; Monaghan, Sarah; Wilson, Jenny; Young, Jeanine

    2007-10-01

    This study examined the effectiveness of a theoretically based education programme in reducing inappropriate antipyretic use in fever management. Paediatric nurses' inconsistent, ritualistic antipyretic use in fever management is influenced by many factors including inconsistent beliefs and parental requests. Determinants of antipyretic administration, identified by the theory of planned behaviour, were belief-based attitudes and subjective norms. A quasi-experiment explored group effects of a peer education programme, based on the theory of planned behaviour, on factors influencing paediatric nurses' antipyretic administration. Surveys and chart audits collected data from medical wards at experimental and control hospitals one month before and one and four months after the peer education programme. All nurses employed in targeted wards were eligible to participate in surveys and all eligible charts were audited. The peer education programme consisted of four one-hour sessions targeting evidence-based knowledge; myths and misconceptions; normative, attitudinal and control influences over, and rehearsal of, evidence-based fever management. All nurses in experimental hospital targeted wards were eligible to attend. Peer education and support helped session information reach those unable to attend sessions. Two-way univariate ANOVAs explored between-subjects factors (experimental and control group) and within-subjects factors (pre, post and latency data). Significant interactions in normative influence (p = 0.01) and intentions (p = 0.01), a significant main group effect in control influence (p = 0.01) and a significant main effect between audit data across time points (p = 0.03) highlight the peer education programme's effectiveness in behaviour change. Normative, control and intention changes after the peer education programme were maintained in latency data; mean temperature was not. 
The peer education programme, based on a behaviour change theory, initiated and maintained evidence-based intentions for antipyretic use in fever management. The promotion of evidence-based change in organizational unit intentions and behaviour highlights the crucial role peer support and education can play in continuing educational programmes.

  5. Study of Collaborative Management for Transportation Construction Project Based on BIM Technology

    NASA Astrophysics Data System (ADS)

    Jianhua, Liu; Genchuan, Luo; Daiquan, Liu; Wenlei, Li; Bowen, Feng

    2018-03-01

    Building Information Modeling (BIM) is a building modeling technology based on the relevant information and data of a construction project. It is an advanced technology and management concept, widely used across the whole life cycle of planning, design, construction and operation. Based on BIM technology, collaborative management of transportation construction projects can achieve better communication through realistic simulation and architectural visualization, and can obtain basic and real-time information such as project schedule, engineering quality, cost and environmental impact. The main services of highway construction management are integrated on a unified BIM platform for collaborative management, to realize information intercommunication and exchange, end the isolation of information that prevailed in the past, and improve the level of information management. The final BIM model supports not only project information management and the integration of preliminary documents and design drawings, but also the automatic generation of completion data and final accounts; it covers the whole life cycle of traffic construction projects and lays a good foundation for smart highway construction.

  6. Integrating modal-based NDE techniques and bridge management systems using quality management

    NASA Astrophysics Data System (ADS)

    Sikorsky, Charles S.

    1997-05-01

    The intent of bridge management systems is to help engineers and managers determine when and where to spend bridge funds such that the needs of commerce and the motoring public are satisfied. A major shortcoming that states are experiencing is that the available NBIS data are insufficient to perform certain functions required by new bridge management systems, such as modeling bridge deterioration and predicting costs. This paper will investigate how modal-based nondestructive damage evaluation techniques can be integrated into bridge management using quality management principles. First, quality from the manufacturing perspective will be summarized. Next, the implementation of quality management in design and construction will be reinterpreted for bridge management. Based on this, a theory of approach will be formulated to improve the productivity of a highway transportation system.

  7. Data Base Directions: Information Resource Management - Strategies and Tools. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Ft. Lauderdale, Florida, October 20-22, 1980).

    ERIC Educational Resources Information Center

    Goldfine, Alan H., Ed.

    This workshop investigated how managers can evaluate, select, and effectively use information resource management (IRM) tools, especially data dictionary systems (DDS). An executive summary, which provides a definition of IRM as developed by workshop participants, precedes the keynote address, "Data: The Raw Material of a Paper Factory,"…

  8. X-PAT: a multiplatform patient referral data management system for small healthcare institution requirements.

    PubMed

    Masseroli, Marco; Marchente, Mario

    2008-07-01

    We present X-PAT, a platform-independent software prototype that is able to manage patient referral multimedia data in an intranet network scenario according to the specific control procedures of a healthcare institution. It is a self-developed storage framework based on a file system, implemented in eXtensible Markup Language (XML) and PHP Hypertext Preprocessor Language, and addressed to the requirements of limited-dimension healthcare entities (small hospitals, private medical centers, outpatient clinics, and laboratories). In X-PAT, healthcare data descriptions, stored in a novel Referral Base Management System (RBMS) according to Health Level 7 Clinical Document Architecture Release 2 (CDA R2) standard, can be easily applied to the specific data and organizational procedures of a particular healthcare working environment thanks also to the use of standard clinical terminology. Managed data, centralized on a server, are structured in the RBMS schema using a flexible patient record and CDA healthcare referral document structures based on XML technology. A novel search engine allows defining and performing queries on stored data, whose rapid execution is ensured by expandable RBMS indexing structures. Healthcare personnel can interface the X-PAT system, according to applied state-of-the-art privacy and security measures, through friendly and intuitive Web pages that facilitate user acceptance.
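    The XML-based referral storage described above can be sketched with Python's standard library. The document below is a heavily simplified, hypothetical referral record, not actual CDA R2 (real CDA documents use the HL7 `urn:hl7-org:v3` namespace and a far richer schema); the lookup-by-code step illustrates the kind of query the RBMS indexing structures are meant to accelerate.

```python
import xml.etree.ElementTree as ET

# A heavily simplified, hypothetical referral record. Real CDA R2 documents
# use the HL7 "urn:hl7-org:v3" namespace and a much richer structure.
doc = """<referral id="r001">
  <patient><name>Jane Doe</name><birth>1970-02-14</birth></patient>
  <observation code="271649006">Systolic blood pressure 142 mmHg</observation>
  <observation code="38341003">Hypertensive disorder</observation>
</referral>"""

root = ET.fromstring(doc)
# Index observations by clinical code -- the kind of lookup the RBMS
# indexing structures described above are meant to make fast.
by_code = {obs.get("code"): obs.text for obs in root.iter("observation")}
print(root.get("id"), by_code["38341003"])  # r001 Hypertensive disorder
```

A production system would validate against the CDA schema and map codes through a standard clinical terminology rather than raw attribute strings.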

  9. Information-based management mode based on value network analysis for livestock enterprises

    NASA Astrophysics Data System (ADS)

    Liu, Haoqi; Lee, Changhoon; Han, Mingming; Su, Zhongbin; Padigala, Varshinee Anu; Shen, Weizheng

    2018-01-01

    With the development of computer and IT technologies, enterprise management has gradually become information-based management. Moreover, due to poor technical competence and non-uniform management, most breeding enterprises show a lack of organisation in data collection and management. In addition, low levels of efficiency result in increasing production costs. This paper adopts the 'struts2' framework to construct an information-based management system for standardised and normalised management of the production process in beef cattle breeding enterprises. We present a radio-frequency identification system, studying multiple-tag anti-collision via a dynamic grouping ALOHA algorithm. The algorithm builds on the existing ALOHA algorithm with improved dynamic grouping of tags, and is characterised by a high throughput rate. The new algorithm can reach a throughput 42% higher than that of the general ALOHA algorithm, and the system throughput remains relatively stable as the number of tags changes.
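    The dynamic grouping idea can be illustrated with a small simulation of framed-slotted ALOHA in which the reader resizes each frame to the estimated backlog of unresolved tags. This is a generic sketch of the anti-collision approach, not the authors' algorithm; the initial frame size and tag count are assumptions, and theory puts the per-frame ceiling near 1/e (~0.368) tags per slot when the frame matches the backlog.

```python
import random

def dfsa_frame(num_tags, frame_size, rng):
    """One frame of framed-slotted ALOHA: each tag picks a slot at random."""
    slots = [0] * frame_size
    for _ in range(num_tags):
        slots[rng.randrange(frame_size)] += 1
    resolved = sum(1 for s in slots if s == 1)   # singleton slots succeed
    collided = sum(s for s in slots if s > 1)    # these tags must retry
    return resolved, collided

def identify_all(num_tags, rng, initial_frame=16):
    """Dynamic grouping: resize each frame to the estimated backlog,
    which keeps per-frame throughput near the slotted-ALOHA optimum."""
    frame, remaining, slots_used = initial_frame, num_tags, 0
    while remaining > 0:
        _, collided = dfsa_frame(remaining, frame, rng)
        slots_used += frame
        remaining = collided
        frame = max(1, collided)  # next frame sized to the leftover tags
    return num_tags / slots_used  # overall throughput, tags per slot

throughput = identify_all(200, random.Random(42))
print(throughput)  # overall tags-per-slot ratio for this run
```

With a fixed seed the run is reproducible; averaging many runs would show the throughput staying roughly flat as the tag count varies, which is the property the abstract highlights.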

  10. EcoPrinciples Connect: A Pilot Project Matching Ecological Principles with Available Data to Promote Ecosystem-Based Management

    NASA Astrophysics Data System (ADS)

    Martone, R. G.; Erickson, A.; Mach, M.; Hale, T.; McGregor, A.; Prahler, E. E.; Foley, M.; Caldwell, M.; Hartge, E. H.

    2016-02-01

    Ocean and coastal practitioners work within existing financial constraints, jurisdictions, and legislative authorities to manage coastal and marine resources while seeking to promote and maintain a healthy and productive coastal economy. Fulfilling this mandate necessitates incorporation of best available science, including ecosystem-based management (EBM) into coastal and ocean management decisions. To do this, many agencies seek ways to apply lessons from ecological theory into their decision processes. However, making direct connections between science and management can be challenging, in part because there is no process for linking ecological principles (e.g., maintaining species diversity, habitat diversity, connectivity and populations of key species) with available data. Here we explore how incorporating emerging data and methods into resource management at a local scale can improve the overall health of our coastal and marine ecosystems. We introduce a new web-based interface, EcoPrinciples Connect, that links marine managers to scientific and geospatial information through the lens of these ecological principles, ultimately helping managers become more efficient, more consistent, and advance the integration of EBM. The EcoPrinciples Connect tool grew directly out of needs identified in response to a Center for Ocean Solutions reference guide, Incorporating Ecological Principles into California Ocean and Coastal Management: Examples from Practice. Here we illustrate how we have worked to translate the information in this guide into a co-developed, user-centric tool for agency staff. Specifically, we present a pilot project where we match publicly available data to the ecological principles for the California San Francisco Bay Conservation and Development Commission. We will share early lessons learned from pilot development and highlight opportunities for future transferability to an expanded group of practitioners.

  12. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach.

    PubMed

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-06-01

    Traffic accidents are one of the most important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet, based on the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency and police centers. Sampling was purposive. Data was collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 0.75. Data was analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing these data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data, providing statistical analysis in table, chart and zoning formats, managing ill-structured issues, and determining the cost effectiveness of decisions and prioritizing their implementation were the capabilities of GIS that can be most efficient in the management of traffic accident information.
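    For reference, the reliability statistic the study reports can be computed directly from item-level scores; a minimal pure-Python sketch, where the 3-item, 4-respondent Likert data are invented purely for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire: items is a list of columns,
    one per item, each holding one score per respondent."""
    k = len(items)

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # per-respondent total scores across all items
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Invented 3-item, 4-respondent Likert-scale responses (illustration only).
data = [[3, 4, 5, 2], [3, 5, 4, 2], [4, 4, 5, 3]]
print(round(cronbach_alpha(data), 2))  # prints 0.9
```

Values of 0.7 and above, like the study's 0.75, are conventionally read as acceptable internal consistency.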

  13. Science framework for conservation and restoration of the sagebrush biome: Linking the Department of the Interior’s Integrated Rangeland Fire Management Strategy to long-term strategic conservation actions, Part 1. Science basis and applications

    USGS Publications Warehouse

    Chambers, Jeanne C.; Beck, Jeffrey L.; Bradford, John B.; Bybee, Jared; Campbell, Steve; Carlson, John; Christiansen, Thomas J; Clause, Karen J.; Collins, Gail; Crist, Michele R.; Dinkins, Jonathan B.; Doherty, Kevin E.; Edwards, Fred; Espinosa, Shawn; Griffin, Kathleen A.; Griffin, Paul; Haas, Jessica R.; Hanser, Steven E.; Havlina, Douglas W.; Henke, Kenneth F.; Hennig, Jacob D.; Joyce, Linda A; Kilkenny, Francis F.; Kulpa, Sarah M; Kurth, Laurie L; Maestas, Jeremy D; Manning, Mary E.; Mayer, Kenneth E.; Mealor, Brian A.; McCarthy, Clinton; Pellant, Mike; Perea, Marco A.; Prentice, Karen L.; Pyke, David A.; Wiechman , Lief A.; Wuenschel, Amarina

    2017-01-01

    The Science Framework is intended to link the Department of the Interior’s Integrated Rangeland Fire Management Strategy with long-term strategic conservation actions in the sagebrush biome. The Science Framework provides a multiscale approach for prioritizing areas for management and determining effective management strategies within the sagebrush biome. The emphasis is on sagebrush (Artemisia spp.) ecosystems and Greater sage-grouse (Centrocercus urophasianus). The approach provided in the Science Framework links sagebrush ecosystem resilience to disturbance and resistance to nonnative, invasive plant species to species habitat information based on the distribution and abundance of focal species. A geospatial process is presented that overlays information on ecosystem resilience and resistance, species habitats, and predominant threats and that can be used at the mid-scale to prioritize areas for management. A resilience and resistance habitat matrix is provided that can help decisionmakers evaluate risks and determine appropriate management strategies. Prioritized areas and management strategies can be refined by managers and stakeholders at the local scale based on higher resolution data and local knowledge. Decision tools are discussed for determining appropriate management actions for areas that are prioritized for management. Geospatial data, maps, and models are provided through the U.S. Geological Survey (USGS) ScienceBase and Bureau of Land Management (BLM) Landscape Approach Data Portal. The Science Framework is intended to be adaptive and will be updated as additional data become available on other values and species at risk. 
It is anticipated that the Science Framework will be widely used to: (1) inform emerging strategies to conserve sagebrush ecosystems, sagebrush dependent species, and human uses of the sagebrush system, and (2) assist managers in prioritizing and planning on-the-ground restoration and mitigation actions across the sagebrush biome.

  14. The Research of Spatial-Temporal Analysis and Decision-Making Assistant System for Disabled Person Affairs Based on Mapworld

    NASA Astrophysics Data System (ADS)

    Zhang, J. H.; Yang, J.; Sun, Y. S.

    2015-06-01

    This system combines the Mapworld platform with the informationization of disabled person affairs, using basic information on disabled persons as its central frame. Based on the disabled person population database, the affairs management system and the statistical account system, the data were effectively integrated and a united information resource database was built. Through data analysis and mining, the system provides powerful data support for decision making, affairs management and public service. It finally realizes the rationalization, normalization and scientization of disabled person affairs management. It also makes significant contributions to the great-leap-forward development of the informationization of the China Disabled Persons' Federation.

  15. Research on rebuilding the data information environment for aeronautical manufacturing enterprise

    NASA Astrophysics Data System (ADS)

    Feng, Xilan; Jiang, Zhiqiang; Zong, Xuewen; Shi, Jinfa

    2005-12-01

    The data environment of an integrated information system and basic standards for information resource management are key to effective remote collaborative design and manufacturing of complex products. A study project on rebuilding the data information environment for aeronautical manufacturing enterprises (Aero-ME) is put forward. Firstly, the data environment of the integrated information system, the basic standards for information resource management, the basic establishment of corporate informationization, the development of the integrated information system, and information education are discussed in depth, based on the practical information resource and technology requirements of contemporary Aero-ME. Then, an idea and method for rebuilding the data environment based on I-CASE within the corporation is put forward, and an effective method and implementation approach for manufacturing enterprise informationization is presented. This provides the foundation and assurance for rebuilding the corporate data environment and standardizing information resource management in the development of Aero-ME information engineering.

  16. Improved Discovery and Re-Use of Oceanographic Data through a Data Management Center

    NASA Astrophysics Data System (ADS)

    Rauch, S.; Allison, M. D.; Groman, R. C.; Chandler, C. L.; Galvarino, C.; Gegg, S. R.; Kinkade, D.; Shepherd, A.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

    Effective use and reuse of ecological data are not only contingent upon those data being well-organized and documented, but also upon data being easily discoverable and accessible by others. As funding agency and publisher policies begin placing more emphasis on, or even requiring, sharing of data, some researchers may feel overwhelmed in determining how best to manage and share their data. Other researchers may be frustrated by the inability to easily find data of interest, or they may be hesitant to use datasets that are poorly organized and lack complete documentation. In all of these scenarios, the data management and sharing process can be facilitated by data management centers, as demonstrated by the Biological and Chemical Oceanography Data Management Office (BCO-DMO). BCO-DMO was created in 2006 to work with investigators to manage data from research funded by the Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and the Division of Polar Programs (PLR) Antarctic Organisms and Ecosystems Program of the US National Science Foundation (NSF). BCO-DMO plays a role throughout the data lifecycle, from the early stages of offering support to researchers in developing data management plans to the final stages of depositing data in a permanent archive. An overarching BCO-DMO goal is to provide open access to data through a system that enhances data discovery and reuse. Features have been developed that allow users to find data of interest, assess fitness for purpose, and download the data for reuse. Features that enable discovery include both text-based and geospatial-based search interfaces, as well as a semantically-enabled faceted search [1]. BCO-DMO data managers work closely with the contributing investigators to develop robust metadata, an essential component to enable data reuse. 
The metadata, which describe data acquisition and processing methods, instrumentation, and parameters, are enhanced by the mapping of local vocabulary terms to community accepted controlled vocabularies. This use of controlled vocabularies allows for terms to be defined unambiguously, so users of the data know definitively what parameter was measured and/or analyzed and what instruments were used. Users can further assess fitness for use by visualizing data in the geospatial interface in various ways depending on the data type. Both the text- and geospatial-based interfaces provide easy access to view the datasets and download them in multiple formats. The BCO-DMO system, including the geospatial interface, relies largely on the use of open source software and tools. The data themselves are made available via the JGOFS/GLOBEC system [2], a distributed object-oriented data management system. Researchers contributing data to BCO-DMO benefit from the data management and sharing resources. Researchers looking for data can use BCO-DMO's system to find and use data of interest. This role of the data management center in facilitating discovery and reuse is one that can be extended to other research disciplines for the benefit of the science community. References: [1] Maffei, A. et al. 2011. Open Standards and Technologies in the S2S Framework. Abstract IN31A-1435 presented at AGU Fall Meeting, San Francisco, CA, 7 Dec 2011. [2] Flierl, G.R. et al. 2004. JGOFS Data System Overview, http://globec.whoi.edu/globec-dir/doc/datasys/jgsys.html.
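    The vocabulary-mapping step described above can be sketched in a few lines. The local names and controlled terms below are illustrative assumptions (CF-style standard names), not BCO-DMO's actual mapping tables; the point is that known local terms are rewritten unambiguously while unknown ones are flagged for a data manager.

```python
# Hypothetical local-to-controlled vocabulary mapping. The controlled terms
# are CF-style standard names used purely for illustration; this is not
# BCO-DMO's actual mapping table.
CONTROLLED = {
    "temp": "sea_water_temperature",
    "sal": "sea_water_salinity",
}

def normalize(records):
    """Rewrite each (name, value) pair to use controlled-vocabulary terms,
    collecting any local names a data manager still has to map by hand."""
    unmapped, out = set(), []
    for name, value in records:
        term = CONTROLLED.get(name.lower())
        if term is None:
            unmapped.add(name)
        out.append((term or name, value))
    return out, sorted(unmapped)

rows, todo = normalize([("TEMP", 11.2), ("sal", 35.1), ("o2", 210.0)])
print(rows)  # controlled names substituted where known
print(todo)  # ['o2'] -- still needs manual curation
```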

  17. QAIT: a quality assurance issue tracking tool to facilitate the improvement of clinical data quality.

    PubMed

    Zhang, Yonghong; Sun, Weihong; Gutchell, Emily M; Kvecher, Leonid; Kohr, Joni; Bekhash, Anthony; Shriver, Craig D; Liebman, Michael N; Mural, Richard J; Hu, Hai

    2013-01-01

    In clinical and translational research as well as clinical trial projects, clinical data collection is prone to errors such as missing data, and misinterpretation or inconsistency of the data. A good quality assurance (QA) program can resolve many such errors though this requires efficient communications between the QA staff and data collectors. Managing such communications is critical to resolving QA problems but imposes a major challenge for a project involving multiple clinical and data processing sites. We have developed a QA issue tracking (QAIT) system to support clinical data QA in the Clinical Breast Care Project (CBCP). This web-based application provides centralized management of QA issues with role-based access privileges. It has greatly facilitated the QA process and enhanced the overall quality of the CBCP clinical data. As a stand-alone system, QAIT can supplement any other clinical data management systems and can be adapted to support other projects.

  18. Data Base Management Systems Panel. Third workshop summary

    NASA Technical Reports Server (NTRS)

    Urena, J. L. (Editor)

    1981-01-01

    The discussions and results of a review by a panel of data base management system (DBMS) experts of various aspects of the use of DBMSs within NASA/Office of Space and Terrestrial Applications (OSTA) and related organizations are summarized. The topics discussed included the present status of the use of DBMS technology and of the various ongoing DBMS-related efforts within NASA. The report drafts of a study that seeks to determine the functional requirements for a generalized DBMS for the NASA/OSTA and related data bases are examined. Future problems and possibilities with the use of DBMS technology are also considered. A list of recommendations for NASA/OSTA data systems is included.

  19. Assessment of Hyperspectral and SAR Remote Sensing for Solid Waste Landfill Management

    NASA Astrophysics Data System (ADS)

    Ottavianelli, Giuseppe; Hobbs, Stephen; Smith, Richard; Bruno, Davide

    2005-06-01

    Globally, waste management is one of the most critical environmental concerns that modern society is facing. Controlled disposal to land (landfill) is currently important, and due to the potentially harmful effects of gas emissions and leachate land contamination, the monitoring of a landfill is inherent in all phases of the site's life cycle. Data from satellite platforms can provide key support to a number of landfill management and monitoring practices, potentially reducing operational costs and hazards, and meeting the challenges of the future waste management agenda. The few previous studies performed show the value of EO data for mapping land cover around landfills and monitoring vegetation health. However, these were largely qualitative studies limited to single sensor types. The review of these studies highlights three key aspects. Firstly, with regard to leachate and gas monitoring, space-borne remote sensing has not proved to be a valid tool for an accurate quantitative analysis; it can only support ground remediation efforts based on the expertise of the visual interpreter and the knowledge of the landfill operator. Secondly, the additional research that focuses on landfill detection concentrates only on the images' data dimension (spatial and spectral), paying less attention to the sensor-independent bio- and geo-physical variables and the modelling of remote sensing physical principles for both active and restored landfill sites. These studies show some ambiguity in their results, and additional aerial images or ground truth visits are always required to support the results. 
Thirdly, none of the studies explores the potential of Synthetic Aperture Radar (SAR) remote sensing and SAR interferometric processing to achieve a more robust automatic detection algorithm and extract additional information and knowledge for landfill management. Based on our previous work with ERS radar images and SAR interferometry, expertise in the waste management sector, and practical knowledge of landfill management practices, we propose to evaluate the use of hyperspectral and radar images for landfill monitoring and management. CHRIS offers hyperspectral data of commensurate spatial resolution with Envisat radar images and thus appears ideally suited for studies using multi-sensor data fusion. The goal of the research is to identify practical ways in which EO data can support landfill management and monitoring, providing quantitative data where possible. Our objectives (based on fieldwork in UK landfills) are (1) to develop robust methods of detecting and mapping landfill sites, (2) to correlate EO data with on-site operational procedures, and (3) to investigate data fusion techniques based on our findings with the separate sensors. Dissemination of the findings will be through scientific journals, professional waste management publications and workshops. It is expected that the research will help the development of techniques which could be applied to monitor waste disposal to land beyond the UK scope of this study, including global monitoring.

  20. Research on IoT-based water environment benchmark data acquisition management

    NASA Astrophysics Data System (ADS)

    Yan, Bai; Xue, Bai; Ling, Lin; Jin, Huang; Ren, Liu

    2017-11-01

    Over the more than 30 years since reform and opening up, China's economy has developed at full speed. However, this rapid growth is constrained by resource exhaustion and environmental pollution, and green, sustainable development has become a common goal for all of humanity. As part of environmental resources, water resources face problems such as pollution and shortage that hinder sustainable development. The top priority in water resources protection and research is to manage the basic data on water resources, which are the cornerstone and scientific foundation of water environment management. Drawing on studies of the aquatic organisms in the Yangtze River Basin, the Yellow River Basin, the Liaohe River Basin and five lake areas, this paper puts forward an IoT-based water environment benchmark data management platform that transforms measured parameters into electric signals by way of chemical probe identification and then sends the benchmark test data of the water environment to node servers. The management platform will provide data and theoretical support for environmental chemistry, toxicology, ecology, etc., promote research in the environmental sciences, lay a solid foundation for comprehensive and systematic research on China's regional environmental characteristics, biotoxicity effects and environmental criteria, and provide objective data for compiling water environment benchmark standards.
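    The probe-to-server path the abstract outlines amounts to a calibration step followed by packaging a record for transmission; a minimal sketch in which the calibration constants, station identifier, and record format are all hypothetical placeholders:

```python
import json
import time

def probe_to_value(raw_mv, slope=0.05, offset=6.0):
    """Hypothetical probe calibration: raw millivolt signal -> pH units.
    The slope and offset are invented placeholders, not real constants."""
    return offset + slope * raw_mv

def benchmark_record(station, parameter, raw_mv):
    """Package one reading as the JSON a node server might ingest."""
    return json.dumps({
        "station": station,           # hypothetical station identifier
        "parameter": parameter,
        "value": round(probe_to_value(raw_mv), 2),
        "timestamp": int(time.time()),
    })

record = benchmark_record("yangtze-03", "pH", 28.0)
print(record)  # JSON payload with value 7.4
```

In a deployed platform this payload would be posted to the node server over the network and validated against the benchmark data schema on arrival.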

  1. Sharing our data—An overview of current (2016) USGS policies and practices for publishing data on ScienceBase and an example interactive mapping application

    USGS Publications Warehouse

    Chase, Katherine J.; Bock, Andrew R.; Sando, Roy

    2017-01-05

    This report provides an overview of current (2016) U.S. Geological Survey policies and practices related to publishing data on ScienceBase, along with an example interactive mapping application to display those data. ScienceBase is an integrated data sharing platform managed by the U.S. Geological Survey. This report describes resources that U.S. Geological Survey scientists can use for writing data management plans, formatting data, and creating metadata, as well as for data and metadata review, uploading data and metadata to ScienceBase, and sharing metadata through the U.S. Geological Survey Science Data Catalog. Because data publishing policies and practices are evolving, scientists should consult the resources cited in this paper for definitive policy information. An example is provided in which, using the content of a published ScienceBase data release associated with an interpretive product, a simple user interface is constructed to demonstrate how the open source capabilities of the R programming language and environment can interact with the properties and objects of the ScienceBase item and be used to generate interactive maps.
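
As a rough sketch of the kind of interaction described, a ScienceBase item can be retrieved as JSON and mined for the fields a map needs. The URL pattern and field names here are assumptions based on the public catalog API, and the parsing is shown offline on a trimmed sample item rather than a live request.

```python
import json

# ScienceBase serves catalog items as JSON; this URL pattern is an
# assumption based on the public catalog API.
def item_url(item_id):
    return f"https://www.sciencebase.gov/catalog/item/{item_id}?format=json"

# A trimmed sample item (field names are illustrative assumptions).
sample_item = json.loads("""
{
  "title": "Example data release",
  "spatial": {"boundingBox": {"minX": -112.0, "maxX": -104.0,
                              "minY": 44.0,  "maxY": 49.0}}
}
""")

def bounding_box(item):
    # Pull the corner coordinates an interactive map would need.
    bb = item["spatial"]["boundingBox"]
    return (bb["minX"], bb["minY"], bb["maxX"], bb["maxY"])
```

In the published example this role is played by R; the same item metadata drives whatever mapping front end is used.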

  2. PRMS Data Warehousing Prototype

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2001-01-01

    Project and Resource Management System (PRMS) is a web-based, mid-level management tool developed at KSC to provide a unified enterprise framework for Project and Mission management. The addition of a data warehouse as a strategic component to the PRMS is investigated through the analysis, design, and implementation processes of a data warehouse prototype. As a proof of concept, a demonstration of the prototype with its OLAP technology for multidimensional data analysis is made. The results of the data analysis and the design constraints are discussed. The prototype can be used to motivate interest in and support for an operational data warehouse.

  3. PRMS Data Warehousing Prototype

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2002-01-01

    Project and Resource Management System (PRMS) is a web-based, mid-level management tool developed at KSC to provide a unified enterprise framework for Project and Mission management. The addition of a data warehouse as a strategic component to the PRMS is investigated through the analysis, design, and implementation processes of a data warehouse prototype. As a proof of concept, a demonstration of the prototype with its OLAP technology for multidimensional data analysis is made. The results of the data analysis and the design constraints are discussed. The prototype can be used to motivate interest in and support for an operational data warehouse.
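
The multidimensional (OLAP-style) analysis mentioned in these summaries can be illustrated with a tiny roll-up over a fact table. The projects, dimensions, and hours below are invented, not PRMS data.

```python
from collections import defaultdict

# Toy fact table: (project, year, quarter, hours) -- hypothetical rows.
facts = [
    ("Alpha", 2001, "Q1", 120), ("Alpha", 2001, "Q2", 90),
    ("Beta",  2001, "Q1", 200), ("Beta",  2002, "Q1", 160),
]

def roll_up(facts, dims):
    """Aggregate hours along the chosen dimensions (an OLAP roll-up)."""
    index = {"project": 0, "year": 1, "quarter": 2}
    cube = defaultdict(int)
    for row in facts:
        key = tuple(row[index[d]] for d in dims)
        cube[key] += row[3]
    return dict(cube)

by_project = roll_up(facts, ["project"])
by_year_quarter = roll_up(facts, ["year", "quarter"])
```

A warehouse with an OLAP engine does this at scale with precomputed aggregates; the roll-up itself is just grouped summation along selected dimensions.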

  4. Development and implementation of an Integrated Water Resources Management System (IWRMS)

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.; Busch, C.

    2011-04-01

    One of the innovative objectives of the EC project BRAHMATWINN was the development of a stakeholder-oriented Integrated Water Resources Management System (IWRMS). The toolset integrates the findings of the project and presents them in a user-friendly way for decision support in sustainable integrated water resources management (IWRM) in river basins. IWRMS is a framework that integrates different types of basin information and supports the development of IWRM options for climate change mitigation. It is based on the River Basin Information System (RBIS) data models and delivers a graphical user interface for stakeholders. A special interface was developed for the integration of the enhanced DANUBIA model input and the NetSyMod model with its Mulino decision support system (mDSS) component. The web-based IWRMS contains and combines different types of data and methods to provide river basin data and information for decision support. IWRMS is based on a three-tier software framework which uses (i) HTML/JavaScript at the client tier, (ii) the PHP programming language at the application tier, and (iii) a PostgreSQL/PostGIS database tier to manage and store all data, except the DANUBIA modelling raw data, which are file based and registered in the database tier. All three tiers can reside on one computer or on several, and are adapted to the local hardware infrastructure. IWRMS and RBIS are based on Open Source Software (OSS) components, and flexible, time-saving access to the database is provided by web-based interfaces for data visualization and retrieval. The IWRMS is accessible via the BRAHMATWINN homepage, http://www.brahmatwinn.uni-jena.de, and a user manual for the RBIS is available for download as well.
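
A notable design point above is that the file-based DANUBIA raw data are registered in, rather than stored by, the database tier. That split can be sketched in miniature; all names here are illustrative stand-ins for the real PHP/PostgreSQL implementation.

```python
# Data tier: a registry dict standing in for a database table that
# records where file-based model output lives (paths are hypothetical).
registry = {}

def register_raw_data(dataset_id, file_path):
    registry[dataset_id] = {"path": file_path}

# Application tier: resolves a dataset reference for the client tier,
# returning metadata rather than the (large) file contents.
def resolve_dataset(dataset_id):
    entry = registry.get(dataset_id)
    if entry is None:
        return {"error": "unknown dataset"}
    return {"dataset": dataset_id, "path": entry["path"]}

register_raw_data("danubia_run_01", "/data/danubia/run_01.nc")
response = resolve_dataset("danubia_run_01")
```

Keeping only references in the database avoids loading bulky model output into the data tier while still letting the application tier locate and serve it.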

  5. 78 FR 68416 - South Atlantic Fishery Management Council (SAFMC); Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ... Atlantic Fishery Management Council (SAFMC); Public Meeting AGENCY: National Marine Fisheries Service (NMFS... South Atlantic Fishery Management Council (Council). SUMMARY: The Council will hold a Council Member... the Habitat and Ecosystem-Based Management Committees; Protected Resources Committee, Southeast Data...

  6. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  7. An evaluation of the telehealth facilitation of diabetes and cardiovascular care in remote Australian Indigenous communities: - protocol for the telehealth eye and associated medical services network [TEAMSnet] project, a pre-post study design.

    PubMed

    Brazionis, Laima; Jenkins, Alicia; Keech, Anthony; Ryan, Chris; Bursell, Sven-Erik

    2017-01-05

    Despite substantial investment in detection, early intervention and evidence-based treatments, current management strategies for diabetes-associated retinopathy and cardiovascular disease are largely based on real-time and face-to-face approaches. There are limited data regarding telehealth facilitation in type 2 diabetes management. Therefore, we aim to investigate the efficacy of telehealth facilitation of diabetes and cardiovascular disease care for high-risk, vulnerable Aboriginal and Torres Strait Islander people in remote/very remote Australia. Using a pre-post intervention design, 600 Indigenous Australians with type 2 diabetes will be recruited from three primary-care health services in the Northern Territory. Diabetes status will be based on clinical records. There will be four technological interventions: 1. Baseline retinal imaging [as a real-time patient education/engagement tool and telehealth screening strategy]. 2. A lifestyle survey tool administered at ≈6 months. 3. At ≈6 and 18 months, an electronic cardiovascular disease and diabetes decision-support tool, based on current guidelines in the Standard Treatment Manual of the Central Australian Rural Practitioners Association, to generate clinical recommendations. 4. Mobile tablet technology developed to enhance participant engagement in self-management. Data will include pre-intervention clinical and encounter-history data, baseline retinopathy status, decision-support and survey data, and opportunistic mobile tablet encounter data. The primary outcome is increased participant adherence to clinical appointments, a marker of engagement and self-management. A cost-benefit analysis will be performed. Remoteness is a major barrier to the provision and uptake of best-practice chronic disease management. Telehealth, beyond videoconferencing of consultations, could facilitate evidence-based management of diabetes and cardiovascular disease in Indigenous Australians and serve as a model for other conditions. Australia and New Zealand Clinical Trials Registry (ANZCTR): ACTRN 12616000370404, retrospectively registered on 22/03/2016.

  8. Documenting historical data and accessing it on the World Wide Web

    Treesearch

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    New computer technologies facilitate the storage, retrieval, and summarization of watershed-based data sets on the World Wide Web. These data sets are used by researchers when testing and validating predictive models, managers when planning and implementing watershed management practices, educators when learning about hydrologic processes, and decisionmakers when...

  9. Space station data system analysis/architecture study. Task 5: Program plan

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Cost estimates for both the on-board and ground segments of the Space Station Data System (SSDS) are presented along with summary program schedules. Advanced technology development recommendations are provided in the areas of distributed data base management, end-to-end protocols, command/resource management, and flight qualified artificial intelligence machines.

  10. Data Recording in Performance Management: Trouble With the Logics

    ERIC Educational Resources Information Center

    Groth Andersson, Signe; Denvall, Verner

    2017-01-01

    In recent years, performance management (PM) has become a buzzword in public sector organizations. Well-functioning PM systems rely on valid performance data, but critics point out that conflicting rationale or logic among professional staff in recording information can undermine the quality of the data. Based on a case study of social service…

  11. Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 3. Subsystem Functional Description.

    DOT National Transportation Integrated Search

    1974-02-01

    The volume presents a detailed description of the subsystems that comprise the Satellite-Based Advanced Air Traffic Management System. Described in detail are the surveillance, navigation, communications, data processing, and airport subsystems. The ...

  12. A web-based clinical trial management system for a sham-controlled multicenter clinical trial in depression.

    PubMed

    Durkalski, Valerie; Wenle Zhao; Dillon, Catherine; Kim, Jaemyung

    2010-04-01

    Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members, including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need, but what distinguishes one system from another are user needs/requirements and cost. This article illustrates the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation over sham treatment for patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe the approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow for the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient for addressing project-specific issues than creating a generic code application would have been. As a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice of a commercially available system or an internally developed system is determined by the requirements of the study and users. Pros and cons of both approaches are discussed. If the intention is to use the system for various trials (single and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.
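
Central randomization, one of the features listed above, is commonly implemented with permuted blocks so that arm sizes stay balanced as enrollment proceeds. This sketch is a generic illustration of that technique, not the OPT-TMS study's actual scheme; the block size and seed are arbitrary.

```python
import random

def permuted_block_randomization(n_participants, block_size=4, seed=42):
    """Assign participants to 'active' or 'sham' in balanced blocks.

    Each block contains equal counts of both arms in random order,
    so the allocation never drifts far from 1:1.
    """
    rng = random.Random(seed)
    arms = []
    while len(arms) < n_participants:
        block = ["active"] * (block_size // 2) + ["sham"] * (block_size // 2)
        rng.shuffle(block)
        arms.extend(block)
    return arms[:n_participants]

assignments = permuted_block_randomization(8)
```

A central server running such a routine hands out the next assignment on registration, which prevents sites from predicting or influencing allocation.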

  13. The Use of Computer-Based Data Banks in the Teaching of Clinical Problem-Solving and Medical Case Management to Veterinary Students

    ERIC Educational Resources Information Center

    Conzelman, Gaylord M.; And Others

    1975-01-01

    Describes a program at the School of Veterinary Medicine, University of California at Davis, to give students experience in the diagnosis and management of urinary tract diseases. Students request from computer data banks the laboratory information they deem most useful in the medical management of each clinical problem. (JT)

  14. A Lessons Learned Knowledge Warehouse to Support the Army Knowledge Management Command-Centric

    DTIC Science & Technology

    2004-03-01

    Warehouse to Support the Army Knowledge Management Command-Centric: increase the quality and availability of information in context (knowledge) to the... information, geographical information, knowledge base, intelligence data (HUMINT, SIGINT, etc.); and • Human Computer Interaction (HCI): allows... the Data Fusion Process from the HCI point of view? Can the LL Knowledge Base provide any valuable information to achieve better estimates of the...

  15. Managing Astronomy Research Data: Data Practices in the Sloan Digital Sky Survey and Large Synoptic Survey Telescope Projects

    ERIC Educational Resources Information Center

    Sands, Ashley Elizabeth

    2017-01-01

    Ground-based astronomy sky surveys are massive, decades-long investments in scientific data collection. Stakeholders expect these datasets to retain scientific value well beyond the lifetime of the sky survey. However, the necessary investments in knowledge infrastructures for managing sky survey data are not yet in place to ensure the long-term…

  16. Recent Experience of the Application of a Commercial Data Base Management System (ADABAS) to a Scientific Data Bank (ECDIN).

    ERIC Educational Resources Information Center

    Town, William G.; And Others

    1980-01-01

    Discusses the problems encountered and solutions adopted in the application of the ADABAS database management system to the ECDIN (Environmental Chemicals Data and Information Network) data bank. SIMAS, the pilot system, and ADABAS are compared, and ECDIN ADABAS design features are described. Appendices provide additional facts about ADABAS and SIMAS.…

  17. Using the Web for Recruitment, Screening, Tracking, Data Management, and Quality Control in a Dietary Assessment Clinical Validation Trial

    PubMed Central

    Hahn, Harry; Henry, Judith; Chacko, Sara; Winter, Ashley; Cambou, Mary C

    2010-01-01

    Screening and tracking subjects and data management in clinical trials require significant investments in manpower that can be reduced through the use of web-based systems. To support a validation trial of various dietary assessment tools that required multiple clinic visits and eight repeats of online assessments, we developed an interactive web-based system to automate all levels of management of a biomarker-based clinical trial. The “Energetics System” was developed to support 1) the work of the study coordinator in recruiting, screening and tracking subject flow, 2) the need of the principal investigator to review study progress, and 3) continuous data analysis. The system was designed to automate web-based self-screening into the trial. It supported scheduling tasks and triggered tailored messaging for late and non-responders. For the investigators, it provided real-time status overviews of all subjects, created electronic case reports, supported data queries and prepared analytic data files. Encryption and multi-level password protection were used to ensure data privacy. The system was programmed iteratively and required six months of a web programmer's time along with active team engagement. In this study, the gains in recruitment speed and efficiency and in data collection quality from this system outweighed the initial investment. Web-based systems have the potential to streamline the process of recruitment and day-to-day management of clinical trials in addition to improving efficiency and quality. Because of their added value, they should be considered for trials of moderate size or complexity. Grant support: NIH funded R01CA105048. PMID:19925884
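
Two of the behaviors described above, automated self-screening and tailored messaging for late responders, reduce to simple rule checks. The eligibility criteria and grace period below are hypothetical; the abstract does not give the Energetics System's actual rules.

```python
from datetime import date, timedelta

# Hypothetical eligibility rules for web-based self-screening.
def self_screen(age, has_email):
    return 21 <= age <= 69 and has_email

def needs_reminder(last_response, today, grace_days=7):
    # Trigger tailored messaging for late/non-responders once a
    # participant has been silent longer than the grace period.
    return (today - last_response) > timedelta(days=grace_days)

eligible = self_screen(age=35, has_email=True)
late = needs_reminder(date(2010, 1, 1), date(2010, 1, 10))
```

Run on a schedule against the participant table, checks like these replace the manual tracking a coordinator would otherwise do daily.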

  18. Design, Development and Utilization Perspectives on Database Management Systems

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1977-01-01

    This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)

  19. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment among the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using the design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has well supported several completed and ongoing clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  20. Taming Data to Make Decisions: Using a Spatial Fuzzy Logic Decision Support Framework to Inform Conservation and Land Use Planning

    NASA Astrophysics Data System (ADS)

    Sheehan, T.; Baker, B.; Degagne, R. S.

    2015-12-01

    With the abundance of data sources, analytical methods, and computer models, land managers are faced with the overwhelming task of making sense of a profusion of data of wildly different types. Luckily, fuzzy logic provides a method to work with different types of data using language-based propositions such as "the landscape is undisturbed," and a simple set of logic constructs. Just as many surveys allow different levels of agreement with a proposition, fuzzy logic allows values reflecting different levels of truth for a proposition. Truth levels fall within a continuum ranging from Fully True to Fully False. Hence a fuzzy logic model produces continuous results. The Environmental Evaluation Modeling System (EEMS) is a platform-independent, tree-based, fuzzy logic modeling framework. An EEMS model provides a transparent definition of an evaluation model and is commonly developed as a collaborative effort among managers, scientists, and GIS experts. Managers specify a set of evaluative propositions used to characterize the landscape. Scientists, working with managers, formulate functions that convert raw data values into truth values for the propositions and produce a logic tree to combine results into a single metric used to guide decisions. Managers, scientists, and GIS experts then work together to implement and iteratively tune the logic model and produce final results. We present examples of two successful EEMS projects that provided managers with map-based results suitable for guiding decisions: sensitivity and climate change exposure in Utah and the Colorado Plateau modeled for the Bureau of Land Management; and terrestrial ecological intactness in the Mojave and Sonoran region of southern California modeled for the Desert Renewable Energy Conservation Plan.
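
The proposition-to-truth conversion and logic-tree combination described above can be sketched concretely. Truth is scaled here to [0, 1]; the thresholds, example variables, and min/max combination operators are illustrative choices, not EEMS defaults.

```python
def truth(value, false_at, true_at):
    """Linearly map a raw data value to a truth level for a proposition.

    0.0 = Fully False, 1.0 = Fully True; works whether higher raw
    values make the proposition more true or less true.
    """
    t = (value - false_at) / (true_at - false_at)
    return max(0.0, min(1.0, t))

def fuzzy_and(*truths):
    return min(truths)   # conservative: as true as the weakest branch

def fuzzy_or(*truths):
    return max(truths)   # optimistic: as true as the strongest branch

# "The landscape is undisturbed" (road density, km/km^2, lower is better).
undisturbed = truth(0.5, false_at=2.0, true_at=0.0)
# "The landscape has intact native cover" (percent, higher is better).
intact = truth(80.0, false_at=20.0, true_at=100.0)

# Root of a tiny two-branch logic tree: both propositions must hold.
intactness = fuzzy_and(undisturbed, intact)
```

Because every node yields a value on a continuum rather than a pass/fail flag, the final metric varies smoothly across the landscape, which is what makes map-based results of this kind useful for ranking rather than merely classifying.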

  1. Innovative technology for web-based data management during an outbreak

    PubMed Central

    Mukhi, Shamir N; Chester, Tammy L Stuart; Klaver-Kibria, Justine DA; Nowicki, Deborah L; Whitlock, Mandy L; Mahmud, Salah M; Louie, Marie; Lee, Bonita E

    2011-01-01

    Lack of automated and integrated data collection and management, and poor linkage of clinical, epidemiological and laboratory data during an outbreak can inhibit effective and timely outbreak investigation and response. This paper describes an innovative web-based technology, referred to as Web Data, developed for the rapid set-up and provision of interactive and adaptive data management during outbreak situations. We also describe the benefits and limitations of the Web Data technology, identified through a questionnaire developed to evaluate the implementation and application of Web Data during the 2009 H1N1 pandemic by the Winnipeg Regional Health Authority and the Provincial Laboratory for Public Health of Alberta. Some of the main benefits include: improved and secure data access, increased efficiency and reduced error, enhanced electronic collection and transfer of data, rapid creation and modification of the database, conversion of specimen-level to case-level data, and user-defined data extraction and query capabilities. Areas requiring improvement include: better understanding of privacy policies, and increased capability for data sharing and linkage between jurisdictions to alleviate duplicate data entry. PMID:23569597
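
The specimen-level to case-level conversion listed among the benefits amounts to a grouped roll-up of laboratory records. The field names and the any-positive rule below are illustrative assumptions, not Web Data's actual schema.

```python
from collections import defaultdict

# Specimen-level lab records (hypothetical fields and values).
specimens = [
    {"case_id": "C1", "specimen": "NP-1", "h1n1": "positive"},
    {"case_id": "C1", "specimen": "NP-2", "h1n1": "negative"},
    {"case_id": "C2", "specimen": "NP-3", "h1n1": "negative"},
]

def to_case_level(specimens):
    by_case = defaultdict(list)
    for s in specimens:
        by_case[s["case_id"]].append(s["h1n1"])
    # Assumed rule: a case is positive if any of its specimens is.
    return {cid: ("positive" if "positive" in results else "negative")
            for cid, results in by_case.items()}

cases = to_case_level(specimens)
```

Automating this roll-up is what lets epidemiologists report by case while the laboratory continues to record by specimen.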

  2. 32 CFR 185.5 - Responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... development of an MSCA data base and emergency reporting system, as described in paragraph (j) of this section... parameters of the DoD Resources Data Base (DODRDB) for MSCA, which is described in paragraph (n) of this section. Facilitate use of that data base to support decentralized management of MSCA in time of emergency...

  3. IMS Version 3 Student Data Base Maintenance Program.

    ERIC Educational Resources Information Center

    Brown, John R.

    Computer routines that update the Instructional Management System (IMS) Version 3 student data base which supports the Southwest Regional Laboratory's (SWRL) student monitoring system are described. Written in IBM System 360 FORTRAN IV, the program updates the data base by adding, changing and deleting records, as well as adding and deleting…

  4. Community Data Management and the Exchange for Local Observations and Knowledge of the Arctic

    NASA Astrophysics Data System (ADS)

    Duerr, R.; Pulsifer, P. L.; Strawhacker, C.; Mccann, H. S.

    2016-12-01

    The mission of the Exchange for Local Observations and Knowledge of the Arctic (ELOKA) is to facilitate the collection, preservation, exchange, and use of local observations and knowledge by Indigenous communities in the Arctic by providing data management services and user support, and by fostering collaboration between resident Arctic experts and visiting researchers. ELOKA's overarching philosophy is that Local and Traditional Knowledge (LTK) and scientific data and expertise are complementary and reinforcing ways of understanding the Arctic system. Collecting, documenting, preserving, and sharing knowledge is a cooperative endeavor, and ELOKA is dedicated to fostering ethical knowledge sharing among Arctic residents and communities, scientists, educators, policy makers, and the general public. But what does that mean in practice and what are the next steps for ELOKA in the coming years? In this presentation, we discuss the ethical issues involved with data management for LTK and community-based projects, some of the tools ELOKA has developed for interacting with communities and researchers and for managing LTK data, and our plans for the future. These include a discussion of the considerations local and community-based projects should make when planning and conducting research. It is clear, for example, that research projects should either include Indigenous voices at the outset of the project or have a prominent Indigenous voice so that appropriate methods or approaches can be adopted. Discussion of data access and funder obligations will be included. The data management tools that ELOKA employs and is developing for the future that can manage the wide range of data types typical of a community or LTK project will also be described, as will ELOKA's program for transferring long-term data management skills to communities that wish to take that on. Finally, ELOKA's plans for the future will be described.

  5. STP 4-06 Model-Based Technical Data in Procurement, 3D PDF Technology Data Demonstration Project. Phase 1 Summary

    DTIC Science & Technology

    2015-07-01

    STP 4-06 Model-Based Technical Data in Procurement: 3D PDF Technology Data Demonstration Project, Phase 1 Summary Report (DL309T2..., July 2015), produced under LMI’s ISO-certified quality management procedures. Contents include: Model-Based Technical Data; 3D PDF Demonstration Team.

  6. Generalized File Management System or Proto-DBMS?

    ERIC Educational Resources Information Center

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  7. The T.M.R. Data Dictionary: A Management Tool for Data Base Design

    PubMed Central

    Ostrowski, Maureen; Bernes, Marshall R.

    1984-01-01

    In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.
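
The dictionary-driven adaptation described above can be sketched as a validator whose rules live in an editable dictionary rather than in code, so changing a field's bounds or codes needs no reprogramming. The fields and rules below are invented for illustration, not TMR's actual dictionary.

```python
# Editable data dictionary: each entry defines a field's type and checks.
data_dictionary = {
    "systolic_bp": {"type": float, "min": 50, "max": 260},
    "visit_type":  {"type": str, "allowed": {"new", "follow-up"}},
}

def validate(field, value):
    """Check a value against whatever rules the dictionary holds."""
    rules = data_dictionary[field]
    if not isinstance(value, rules["type"]):
        return False
    if "min" in rules and not (rules["min"] <= value <= rules["max"]):
        return False
    if "allowed" in rules and value not in rules["allowed"]:
        return False
    return True
```

Adding a field or tightening a range is a dictionary edit, which is the property that let the medical group adapt the software without programming support.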

  8. Data analytics approach to create waste generation profiles for waste management and collection.

    PubMed

    Niska, Harri; Serkkola, Ari

    2018-04-30

    Extensive monitoring data on waste generation are increasingly collected in order to implement cost-efficient and sustainable waste management operations. In addition, geospatial data from different registries of society are being opened for free use. Novel data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as a basis for waste management and collection. In this paper, a data-driven approach based on the self-organizing map (SOM) and the k-means algorithm is developed for creating a set of waste generation type profiles. The approach is demonstrated using the extensive container-level waste weighting data collected in the metropolitan area of Helsinki, Finland. The results highlight the potential of advanced data analytics approaches to produce more detailed waste generation information, e.g. as the basis for tailored feedback services for waste producers and for the planning and optimization of waste collection and recycling. Copyright © 2018 Elsevier Ltd. All rights reserved.
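
A minimal one-dimensional k-means over container fill weights sketches the clustering step behind such profiles. The paper pairs k-means with a self-organizing map, which is omitted here, and the weights are invented.

```python
def kmeans_1d(values, k, iters=20):
    """Cluster scalar values into k groups (plain k-means, k >= 2)."""
    s = sorted(values)
    # Spread initial centers across the sorted range.
    centers = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    clusters = [[] for _ in centers]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            i = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Weekly container weights (kg) from three hypothetical generator types.
weights = [12, 14, 13, 80, 85, 78, 40, 42]
centers, clusters = kmeans_1d(weights, k=3)
```

Each resulting center is a candidate "waste generation profile": a typical weight for one generator type, around which collection schedules can be tailored.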

  9. [Medication error management climate and perception for system use according to construction of medication error prevention system].

    PubMed

    Kim, Myoung Soo

    2012-08-01

    The purpose of this cross-sectional study was to examine the current status of IT-based medication error prevention system construction and the relationships among system construction, medication error management climate, and perception of system use. The participants were 124 patient safety chief managers working for 124 hospitals with over 300 beds in Korea. The characteristics of the participants, the construction status and perception of four systems (electric pharmacopoeia, electric drug dosage calculation system, computer-based patient safety reporting, and bar-code system), and the medication error management climate were measured in this study. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation and MANCOVA were used for data analysis. Electric pharmacopoeias had been constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems in 50.8%, and electric drug dosage calculation systems in 32.3%. Bar-code systems showed the lowest construction rate, at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems were associated with greater safety and a more positive error management climate. Supportive strategies for improving the perception of IT-based system use would encourage system construction and more easily promote a positive error management climate.

  10. A general scientific information system to support the study of climate-related data

    NASA Technical Reports Server (NTRS)

    Treinish, L. A.

    1984-01-01

    The development and use of NASA's Pilot Climate Data System (PCDS) are discussed. The PCDS is used as a focal point for managing and providing access to a large collection of actively used data for the Earth, ocean and atmospheric sciences. The PCDS provides uniform data catalogs, inventories, and access methods for selected NASA and non-NASA data sets. Scientific users can preview the data sets using graphical and statistical methods. The system has evolved from its original purpose as a climate data base management system in response to a national climate program, into an extensive package of capabilities to support many types of data sets from both spaceborne and surface based measurements with flexible data selection and analysis functions.

  11. Hierarchical adaptation scheme for multiagent data fusion and resource management in situation analysis

    NASA Astrophysics Data System (ADS)

    Benaskeur, Abder R.; Roy, Jean

    2001-08-01

    Sensor Management (SM) has to do with how to best manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on the contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the realtime improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM Concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on the control theory and within the process refinement paradigm of the JDL data fusion model, and taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.

  12. An Advanced IoT-based System for Intelligent Energy Management in Buildings.

    PubMed

    Marinakis, Vangelis; Doukas, Haris

    2018-02-16

    The energy sector is closely interconnected with the building sector, and integrated Information and Communication Technologies (ICT) solutions for effective energy management, supporting decision-making at building, district and city level, are key fundamental elements for making a city Smart. The available systems are designed and intended exclusively for a predefined number of cases and systems, without allowing for expansion and interoperability with other applications; this is partially due to the lack of semantics. This paper presents an advanced Internet of Things (IoT) based system for intelligent energy management in buildings. A semantic framework is introduced, aiming at the unified and standardised modelling of the entities that constitute the building environment. Suitable rules are formed, aiming at intelligent energy management and the general modus operandi of the Smart Building. In this context, an IoT-based system was implemented, which enhances the interactivity of the buildings' energy management systems. The results from its pilot application are presented and discussed. The proposed system extends existing approaches and integrates cross-domain data, such as the building's data (e.g., energy management systems), energy production, energy prices, weather data and end-users' behaviour, in order to produce daily and weekly action plans for the energy end-users with actionable personalised information.

  13. An Advanced IoT-based System for Intelligent Energy Management in Buildings

    PubMed Central

    Doukas, Haris

    2018-01-01

    The energy sector is closely interconnected with the building sector, and integrated Information and Communication Technologies (ICT) solutions for effective energy management, supporting decision-making at building, district and city level, are key fundamental elements for making a city Smart. The available systems are designed and intended exclusively for a predefined number of cases and systems, without allowing for expansion and interoperability with other applications; this is partially due to the lack of semantics. This paper presents an advanced Internet of Things (IoT) based system for intelligent energy management in buildings. A semantic framework is introduced, aiming at the unified and standardised modelling of the entities that constitute the building environment. Suitable rules are formed, aiming at intelligent energy management and the general modus operandi of the Smart Building. In this context, an IoT-based system was implemented, which enhances the interactivity of the buildings’ energy management systems. The results from its pilot application are presented and discussed. The proposed system extends existing approaches and integrates cross-domain data, such as the building’s data (e.g., energy management systems), energy production, energy prices, weather data and end-users’ behaviour, in order to produce daily and weekly action plans for the energy end-users with actionable personalised information. PMID:29462957

  14. C-A1-03: Considerations in the Design and Use of an Oracle-based Virtual Data Warehouse

    PubMed Central

    Bredfeldt, Christine; McFarland, Lela

    2011-01-01

    Background/Aims The amount of clinical data available for research is growing exponentially. As it grows, increasing the efficiency of both data storage and data access becomes critical. Relational database management systems (rDBMS) such as Oracle are ideal solutions for managing longitudinal clinical data because they support large-scale data storage and highly efficient data retrieval. In addition, they can greatly simplify the management of large data warehouses, including security management and regular data refreshes. However, the HMORN Virtual Data Warehouse (VDW) was originally designed based on SAS datasets, and this design choice has a number of implications for both the design and use of an Oracle-based VDW. From a design standpoint, VDW tables are designed as flat SAS datasets, which do not take full advantage of Oracle indexing capabilities. From a data retrieval standpoint, standard VDW SAS scripts do not take advantage of SAS pass-through SQL capabilities to enable Oracle to perform the processing required to narrow datasets to the population of interest. Methods Beginning in 2009, the research department at Kaiser Permanente in the Mid-Atlantic States (KPMA) has developed an Oracle-based VDW according to the HMORN v3 specifications. In order to take advantage of the strengths of relational databases, KPMA introduced an interface layer to the VDW data, using views to provide access to standardized VDW variables. In addition, KPMA has developed SAS programs that provide access to SQL pass-through processing for first-pass data extraction into SAS VDW datasets for processing by standard VDW scripts. Results We discuss both the design and performance considerations specific to the KPMA Oracle-based VDW. We benchmarked performance of the Oracle-based VDW using both standard VDW scripts and an initial pre-processing layer to evaluate speed and accuracy of data return. 
Conclusions Adapting the VDW for deployment in an Oracle environment required minor changes to the underlying structure of the data. Further modifications of the underlying data structure would lead to performance enhancements. Maximally efficient data access for standard VDW scripts requires an extra step that involves restricting the data to the population of interest at the data server level prior to standard processing.
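
    The restriction-before-processing step described in the conclusions, narrowing the data to the population of interest at the database server before standard processing, can be illustrated generically. The sketch below uses Python's sqlite3 rather than SAS/Oracle, and the table, columns, and MRN values are invented for the example.

```python
import sqlite3

# Toy stand-in for a VDW pharmacy table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vdw_rx (mrn TEXT, ndc TEXT, rxdate TEXT)")
conn.executemany("INSERT INTO vdw_rx VALUES (?, ?, ?)", [
    ("A1", "00093-7155", "2010-01-05"),
    ("A2", "00093-7155", "2010-02-10"),
    ("B9", "55111-0123", "2010-03-02"),
])

# Without pass-through: pull the whole table, then filter client-side.
all_rows = conn.execute("SELECT * FROM vdw_rx").fetchall()
cohort_client = [r for r in all_rows if r[0] in {"A1", "A2"}]

# With pass-through: let the database restrict the rows to the
# population of interest before any data leaves the server.
cohort_server = conn.execute(
    "SELECT * FROM vdw_rx WHERE mrn IN ('A1', 'A2')").fetchall()

assert cohort_client == cohort_server  # same result, far less data moved
```

The design point is that both paths yield identical cohorts, but the server-side restriction avoids shipping the full table to the analysis client.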

  15. Designing CIS to improve decisions in depression disease management: a discourse analysis of front line practice.

    PubMed

    Mirel, Barbara; Ackerman, Mark S; Kerber, Kevin; Klinkman, Michael

    2006-01-01

    Clinical care management promises to help diminish the major health problem of depression. To realize this promise, front line clinicians must know which care management interventions are best for which patients and act accordingly. Unfortunately, the detailed intervention data required for such differentiated assessments are missing in most clinical information systems (CIS). To determine frontline clinicians' needs for these data and to identify the data that CIS should keep, we conducted an 18 month ethnographic study and discourse analysis of telehealth depression care management. Results show care managers need data-based evidence to choose best options, and discourse analysis suggests some personalized interventions that CIS should and can feasibly capture for evidence.

  16. A parallel data management system for large-scale NASA datasets

    NASA Technical Reports Server (NTRS)

    Srivastava, Jaideep

    1993-01-01

    The past decade has seen phenomenal growth in the amount of data, and resultant information, generated by NASA's operations and research projects. A key application is the reprocessing problem, which has been identified as requiring data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project with similar requirements. Deriving our understanding of NASA's future data management needs from the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system that addresses those needs. Specifically, we propose to investigate issues in low-level record organization and management, complex query processing, and query compilation and scheduling.

  17. Design and Implementation of CNEOST Image Database Based on NoSQL System

    NASA Astrophysics Data System (ADS)

    Wang, X.

    2013-07-01

    The China Near Earth Object Survey Telescope (CNEOST) is the largest Schmidt telescope in China, and it has acquired more than 3 TB of astronomical image data since it saw first light in 2006. After the upgrade of the CCD camera in 2013, over 10 TB of data will be obtained every year. The management of these massive images is not only an indispensable part of the data processing pipeline but also the basis of data sharing. Based on an analysis of the requirements, an image management system is designed and implemented by employing a non-relational database.

  18. Design and Implementation of CNEOST Image Database Based on NoSQL System

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2014-04-01

    The China Near Earth Object Survey Telescope is the largest Schmidt telescope in China, and it has acquired more than 3 TB of astronomical image data since it saw first light in 2006. After the upgrade of the CCD camera in 2013, over 10 TB of data will be obtained every year. The management of the massive images is not only an indispensable part of the data processing pipeline but also the basis of data sharing. Based on an analysis of the requirements, an image management system is designed and implemented by employing a non-relational database.

  19. Information Interaction Study for DER and DMS Interoperability

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui

    The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements of DER operation and management for DMS advanced applications. Modeling of DERs was studied from a system point of view, and the article initially proposes a CIM-extended information model. By analysing the basic structure of message interaction between DMS and DER, a bidirectional message-mapping method based on data exchange is proposed.

  20. Dynamic XML-based exchange of relational data: application to the Human Brain Project.

    PubMed

    Tang, Zhengming; Kadiyska, Yana; Li, Hao; Suciu, Dan; Brinkley, James F

    2003-01-01

    This paper discusses an approach to exporting relational data in XML format for data exchange over the web. We describe the first real-world application of SilkRoute, a middleware program that dynamically converts existing relational data to a user-defined XML DTD. The application, called XBrain, wraps SilkRoute in a Java Server Pages framework, thus permitting a web-based XQuery interface to a legacy relational database. The application is demonstrated as a query interface to the University of Washington Brain Project's Language Map Experiment Management System, which is used to manage data about language organization in the brain.
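
    The core idea, mapping relational rows onto a user-defined XML view for web exchange, can be shown in miniature. The sketch below uses Python's sqlite3 and xml.etree rather than SilkRoute itself, and the table, columns, and element names are invented for the example, not the XBrain schema.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Toy relational data standing in for an experiment-management schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE experiment (id INTEGER, subject TEXT, site TEXT)")
conn.executemany("INSERT INTO experiment VALUES (?, ?, ?)",
                 [(1, "P1", "Broca"), (2, "P2", "Wernicke")])

# Export each row as an XML element under a user-chosen root element,
# mirroring the idea of exposing relational data as an XML view.
root = ET.Element("experiments")
for row in conn.execute("SELECT id, subject, site FROM experiment"):
    exp = ET.SubElement(root, "experiment", id=str(row[0]))
    ET.SubElement(exp, "subject").text = row[1]
    ET.SubElement(exp, "site").text = row[2]

xml_doc = ET.tostring(root, encoding="unicode")
print(xml_doc)
```

A dynamic converter like SilkRoute generalises this: the target structure comes from a user-defined DTD and queries are translated on the fly instead of being hand-coded per table.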

  1. The Biological and Chemical Oceanography Data Management Office

    NASA Astrophysics Data System (ADS)

    Allison, M. D.; Chandler, C. L.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Gegg, S. R.

    2011-12-01

    Oceanography and marine ecosystem research are inherently interdisciplinary fields of study that generate and require access to a wide variety of measurements. In late 2006 the Biological and Chemical Oceanography Sections of the National Science Foundation (NSF) Geosciences Directorate Division of Ocean Sciences (OCE) funded the Biological and Chemical Oceanography Data Management Office (BCO-DMO). In late 2010 additional funding was contributed to support management of research data from the NSF Office of Polar Programs Antarctic Organisms & Ecosystems Program. The BCO-DMO is recognized in the 2011 Division of Ocean Sciences Sample and Data Policy as one of several program specific data offices that support NSF OCE funded researchers. BCO-DMO staff members offer data management support throughout the project life cycle to investigators from large national programs and medium-sized collaborative research projects, as well as researchers from single investigator awards. The office manages and serves all types of oceanographic data and information generated during the research process and contributed by the originating investigators. BCO-DMO has built a data system that includes the legacy data from several large ocean research programs (e.g. United States Joint Global Ocean Flux Study and United States GLOBal Ocean ECosystems Dynamics), to which data have been contributed from recently granted NSF OCE and OPP awards. The BCO-DMO data system can accommodate many different types of data including: in situ and experimental biological, chemical, and physical measurements; modeling results and synthesis data products. The system enables reuse of oceanographic data for new research endeavors, supports synthesis and modeling activities, provides availability of "real data" for K-12 and college level use, and provides decision-support field data for policy-relevant investigations. 
We will present an overview of the data management system capabilities including: map-based and text-based data discovery and access systems; recent enhancements to data search tools; data export and download utilities; and strategic use of controlled vocabularies to facilitate data integration and to improve data system interoperability.

  2. A data management system to enable urgent natural disaster computing

    NASA Astrophysics Data System (ADS)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    Civil protection, in particular natural disaster management, is very important to most nations and civilians in the world. When disasters like flash floods, earthquakes and tsunamis are expected or have taken place, it is of utmost importance to make timely decisions for managing the affected areas and reducing casualties. Computer simulations can generate information and provide predictions to facilitate this decision-making process. Getting the data to the required resources is a critical requirement for enabling the timely computation of the predictions. An urgent data management system to support natural disaster computing is thus necessary to effectively carry out data activities within a stipulated deadline. Since the trigger of a natural disaster is usually unpredictable, it is not always possible to prepare the required resources well in advance. As such, an urgent data management system for natural disaster computing has to be able to work with any type of resource. Additional requirements include the need to manage deadlines and huge volumes of data, fault tolerance, reliability, flexibility to changes, ease of use, etc. The proposed data management platform includes a service manager to provide a uniform and extensible interface for the supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform, and a data manager to initiate and perform the data activities. These managers enable the selection of the most appropriate resource, transfer protocol, etc. such that the hard deadline of an urgent computation can be met for a particular urgent activity, e.g. data staging or computation. We associate two types of deadlines [2] with an urgent computing system. 
Soft/firm deadline: missing a soft or firm deadline renders the computation less useful, resulting in a cost that can have severe consequences. Hard deadline: missing a hard deadline renders the computation useless and results in fully catastrophic consequences. A prototype of this system has a REST-based service manager. The REST-based implementation provides a uniform interface that is easy to use, and new and upcoming file transfer protocols can easily be added and accessed via the service manager. The service manager interacts with the other four managers to coordinate the data activities so that the fundamental natural disaster urgent computing requirement, i.e. the deadline, can be fulfilled in a reliable manner. A data activity can include data staging, data archiving and data storing. Reliability is ensured by the choice of a network-of-managers organisation model [1], the configuration manager and the fault tolerance manager. With this proposed design, an easy-to-use, resource-independent data management system that can support and fulfil the computation of a natural disaster prediction within stipulated deadlines can thus be realised. References: [1] H. G. Hegering, S. Abeck, and B. Neumair, Integrated Management of Networked Systems: Concepts, Architectures, and Their Operational Application, Morgan Kaufmann Publishers, San Francisco, CA, USA, 1999. [2] H. Kopetz, Real-Time Systems: Design Principles for Distributed Embedded Applications, second edition, Springer, New York, NY, USA, 2011. [3] S. H. Leong, A. Frank, and D. Kranzlmüller, Leveraging e-infrastructures for urgent computing, Procedia Computer Science 18 (2013), 2177-2186, 2013 International Conference on Computational Science. [4] N. Trebon, Enabling urgent computing within the existing distributed computing infrastructure, Ph.D. thesis, University of Chicago, August 2011, http://people.cs.uchicago.edu/~ntrebon/docs/dissertation.pdf.
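
    The scheduler manager's core decision, picking a resource/protocol combination whose estimated completion time still meets the hard deadline, can be sketched as follows. The protocol names, bandwidths, overheads, and selection rule below are illustrative assumptions, not the system's actual logic.

```python
from dataclasses import dataclass

@dataclass
class TransferOption:
    protocol: str
    bandwidth_mb_s: float   # estimated sustained throughput
    setup_s: float          # connection/authentication overhead

def pick_option(options, data_mb, deadline_s):
    """Return the fastest option that finishes before the hard deadline,
    or None if no option can meet it (the urgent activity fails)."""
    feasible = []
    for opt in options:
        eta = opt.setup_s + data_mb / opt.bandwidth_mb_s
        if eta <= deadline_s:
            feasible.append((eta, opt))
    return min(feasible, key=lambda t: t[0])[1] if feasible else None

options = [
    TransferOption("gridftp", bandwidth_mb_s=100.0, setup_s=20.0),
    TransferOption("https",   bandwidth_mb_s=40.0,  setup_s=2.0),
]
# 5 GB staging job with a 120 s hard deadline:
# gridftp: 20 + 5000/100 = 70 s (feasible); https: 2 + 5000/40 = 127 s (misses).
best = pick_option(options, data_mb=5000, deadline_s=120)
print(best.protocol)
```

Returning None for an unmeetable hard deadline mirrors the paper's semantics: a computation that cannot finish in time is useless, so it should not be started at all.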

  3. Vegetation and terrain mapping in Alaska using Landsat MSS and digital terrain data

    USGS Publications Warehouse

    Shasby, Mark; Carneggie, David M.

    1986-01-01

    During the past 5 years, the U.S. Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center Field Office in Anchorage, Alaska has worked cooperatively with Federal and State resource management agencies to produce land-cover and terrain maps for 245 million acres of Alaska. The need for current land-cover information in Alaska comes principally from the mandates of the Alaska National Interest Lands Conservation Act (ANILCA), December 1980, which requires major land management agencies to prepare comprehensive management plans. The land-cover mapping projects integrate digital Landsat data, terrain data, aerial photographs, and field data. The resultant land-cover and terrain maps and associated data bases are used for resource assessment, management, and planning by many Alaskan agencies including the U.S. Fish and Wildlife Service, U.S. Forest Service, Bureau of Land Management, and Alaska Department of Natural Resources. Applications addressed through use of the digital land-cover and terrain data bases range from comprehensive refuge planning to multiphased sampling procedures designed to inventory vegetation statewide. The land-cover mapping programs in Alaska demonstrate the operational utility of digital Landsat data and have resulted in a new land-cover mapping program by the USGS National Mapping Division to compile 1:250,000-scale land-cover maps in Alaska using a common statewide land-cover map legend.

  4. Implementation of behavioral health interventions in real world scenarios: Managing complex change.

    PubMed

    Clark, Khaya D; Miller, Benjamin F; Green, Larry A; de Gruy, Frank V; Davis, Melinda; Cohen, Deborah J

    2017-03-01

    A practice embarks on a radical reformulation of how care is designed and delivered when it decides to integrate medical and behavioral health care for its patients, and success depends on managing complex change in a complex system. We examined the ways change is managed when integrating behavioral health and medical care. Observational cross-case comparative study of 19 primary care and community mental health practices. We collected mixed methods data through practice surveys, observation, and semistructured interviews. We analyzed data using a data-driven, emergent approach. The change management strategies that leadership employed to manage the changes of integrating behavioral health and medical care included: (a) advocating for a mission and vision focused on integrated care; (b) fostering collaboration, with a focus on population care and a team-based approach; (c) attending to learning, which includes viewing the change process as continuous, and creating a culture that promoted reflection and continual improvement; (d) using data to manage change; and (e) developing approaches to finance integration. This paper reports the change management strategies employed by practice leaders making changes to integrate care, as observed by independent investigators. We offer an empirically based set of actionable recommendations that are relevant to a range of leaders (policymakers, medical directors) and practice members who wish to effectively manage the complex changes associated with integrated primary care. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Web-GIS based information management system for the Bureau of Law Enforcement for Urban Management

    NASA Astrophysics Data System (ADS)

    Sun, Hai; Wang, Cheng; Ren, Bo

    2007-06-01

    The daily work of the Law Enforcement Bureau is crucial to urban management. However, as the city develops, the information and data related to the Bureau's daily work keep growing and changing, and the increasing volume of data makes some traditional working methods limited and inefficient. Analyzing the demands and obstacles of the Law Enforcement Bureau, this paper proposes a new method to solve these problems: a web-GIS based information management system built for the Bureau of Law Enforcement for Urban Management of Foshan. The first part of the paper provides an overview of the system. The second part introduces the architecture of the system and its data organization. In the third part, the paper describes the design and implementation of the functional modules in detail. Finally, the paper concludes and proposes some strategic recommendations for the further development of the system. The paper focuses on the architecture and implementation of the system, solves development issues based on ArcServer, and introduces a new concept to help the local government solve its current problems. Practical application showed that the system plays a very important role in the Law Enforcement Bureau's work.

  6. Are performance indicators used for hospital quality management: a qualitative interview study amongst health professionals and quality managers in The Netherlands.

    PubMed

    Botje, Daan; Ten Asbroek, Guus; Plochg, Thomas; Anema, Helen; Kringos, Dionne S; Fischer, Claudia; Wagner, Cordula; Klazinga, Niek S

    2016-10-13

    Hospitals are under increasing pressure to share indicator-based performance information. These indicators can also serve as a means to promote quality improvement and boost hospital performance. Our aim was to explore hospitals' use of performance indicators for internal quality management activities. We conducted a qualitative interview study among 72 health professionals and quality managers in 14 acute care hospitals in The Netherlands. Concentrating on orthopaedic and oncology departments, our goal was to gain insight into data collection and use of performance indicators for two conditions: knee and hip replacement surgery and breast cancer surgery. The semi-structured interviews were recorded and summarised. Based on the data, themes were synthesised and the analyses were executed systematically by two analysts independently. The findings were validated through comparison. The hospitals we investigated collect data for performance indicators in different ways. Similarly, these hospitals have different ways of using such data to support their quality management, while some do not seem to use the data for this purpose at all. Factors like 'linking pin champions', pro-active quality managers and engaged medical specialists seem to make a difference. In addition, a comprehensive hospital data infrastructure with electronic patient records and robust data collection software appears to be a prerequisite to produce reliable external performance indicators for internal quality improvement. Hospitals often fail to use performance indicators as a means to support internal quality management. Such data, then, are not used to their full potential. Hospitals are recommended to focus their human resource policy on 'linking pin champions', the engagement of professionals and a pro-active quality manager, and to invest in a comprehensive data infrastructure. 
Furthermore, the differences in data collection processes between Dutch hospitals make it difficult to draw comparisons between outcomes of performance indicators.

  7. A framework for a diabetes mellitus disease management system in southern Israel.

    PubMed

    Fox, Matthew A; Harman-Boehm, Ilana; Weitzman, Shimon; Zelingher, Julian

    2002-01-01

    Chronic diseases are a significant burden on western healthcare systems and national economies. It has been suggested that automated disease management for chronic disease, like diabetes mellitus (DM), improves the quality of care and reduces inappropriate utilization of diagnostic and therapeutic measures. We have designed a comprehensive DM Disease Management system for the Negev region in southern Israel. This system takes advantage of currently used clinical and administrative information systems. Algorithms for DM disease management have been created based on existing and accepted Israeli guidelines. All data fields and tables in the source information systems have been analyzed, and interfaces for periodic data loads from these systems have been specified. Based on this data, four subsets of decision support algorithms have been developed. The system generates alerts in these domains to multiple end users. We plan to use the products of this information system analysis and disease management specification in the actual development process of such a system shortly.

  8. 48 CFR 908.7101-7 - Government license tags.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... online vehicle license tag ordering data base. Contractors must obtain approval from their Federal fleet manager or OPMO for authorization to utilize the UNICOR data base. Director, Personal Property Policy...

  9. Study on Adaptive Parameter Determination of Cluster Analysis in Urban Management Cases

    NASA Astrophysics Data System (ADS)

    Fu, J. Y.; Jing, C. F.; Du, M. Y.; Fu, Y. L.; Dai, P. P.

    2017-09-01

    The fine management of cities is an important way to realize the smart city. Data mining using spatial clustering analysis of urban management cases can be used in the evaluation of urban public facility deployment, can support policy decisions, and provides technical support for the fine management of the city. Aiming at the problem that the density-based DBSCAN algorithm cannot determine its parameters adaptively, this paper proposes an optimizing method for adaptive parameter determination based on spatial analysis. First, Ripley's K function is analysed for the data set to determine the global parameter Eps adaptively, taking the maximum aggregation scale as the range of data clustering. Then, each point object's most frequent neighbour count within Eps is calculated using a K-D tree and set as the clustering density, to determine the global parameter MinPts adaptively. The R language was then used to implement the above process and accomplish precise clustering of typical urban management cases. Experimental results based on typical urban management cases in the XiCheng district of Beijing show that the new DBSCAN clustering algorithm presented in this paper takes full account of the spatial and statistical characteristics of data with obvious clustering features, and has better applicability and higher quality. The results of the study are not only helpful for the formulation of urban management policies and the allocation of urban management supervisors in the XiCheng District of Beijing, but can also be applied to other cities and related fields.
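
    The parameter-determination idea can be sketched numerically. The sketch below fixes Eps by hand instead of deriving it from Ripley's K, uses the median neighbour count within Eps as a stand-in for the paper's most-frequent-count rule (computed with brute-force distances rather than a K-D tree), and runs a minimal DBSCAN, so it illustrates the idea rather than reproducing the paper's algorithm. All data are synthetic.

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN; returns one label per point (-1 = noise)."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neigh = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neigh[i]) < min_pts:
            continue  # already assigned, or not a core point
        labels[i] = cluster
        stack = list(neigh[i])
        while stack:  # expand the cluster outward from core points
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neigh[j]) >= min_pts:
                    stack.extend(neigh[j])
        cluster += 1
    return labels

rng = np.random.default_rng(0)
# Two dense clusters of "cases" plus scattered noise reports.
X = np.vstack([rng.normal((0, 0), 0.3, (40, 2)),
               rng.normal((5, 5), 0.3, (40, 2)),
               rng.uniform(-2, 7, (10, 2))])

eps = 0.8  # in the paper this scale comes from the Ripley's K analysis
counts = (np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) <= eps).sum(axis=1)
min_pts = int(np.median(counts))  # data-driven density threshold
labels = dbscan(X, eps, min_pts)
```

With both parameters derived from the data rather than guessed, the two dense case clusters are recovered while the scattered points fall below the density threshold.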

  10. Metadata based management and sharing of distributed biomedical data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Liu, Peiya

    2014-01-01

    Biomedical research data sharing is becoming increasingly important for researchers to reuse experiments, pool expertise and validate approaches. However, there are many hurdles for data sharing, including the unwillingness to share, lack of flexible data model for providing context information, difficulty to share syntactically and semantically consistent data across distributed institutions, and high cost to provide tools to share the data. SciPort is a web-based collaborative biomedical data sharing platform to support data sharing across distributed organisations. SciPort provides a generic metadata model to flexibly customise and organise the data. To enable convenient data sharing, SciPort provides a central server based data sharing architecture with a one-click data sharing from a local server. To enable consistency, SciPort provides collaborative distributed schema management across distributed sites. To enable semantic consistency, SciPort provides semantic tagging through controlled vocabularies. SciPort is lightweight and can be easily deployed for building data sharing communities. PMID:24834105
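
    The semantic-tagging idea, accepting a metadata value only if it belongs to a controlled vocabulary so that distributed sites stay consistent, can be sketched simply. The vocabulary, field names, and rejection behaviour below are invented for illustration and are not SciPort's API.

```python
# Hypothetical controlled vocabulary for an "organ" metadata field.
ORGAN_VOCAB = {"liver", "lung", "kidney", "heart"}

def tag_record(record, field, value, vocab):
    """Attach a metadata tag only if the value is a controlled term;
    otherwise reject it so shared data stays semantically consistent."""
    if value not in vocab:
        raise ValueError(f"{value!r} is not a controlled term for {field!r}")
    tagged = dict(record)
    tagged[field] = value
    return tagged

rec = tag_record({"id": "exp-001"}, "organ", "liver", ORGAN_VOCAB)
print(rec)
```

Centralising the vocabulary (rather than letting each site free-text its tags) is what makes cross-site queries like "all liver experiments" reliable.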

  11. Towards an ontology for data quality in integrated chronic disease management: a realist review of the literature.

    PubMed

    Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A

    2013-01-01

    Effective use of routine data to support integrated chronic disease management (CDM) and population health depends on the underlying data quality (DQ) and, for cross-system use of data, on semantic interoperability. An ontological approach to DQ is a potential solution, but research in this area is limited and fragmented. We aimed to identify mechanisms, including ontologies, to manage DQ in integrated CDM, and to determine whether improved DQ will better measure health outcomes. We conducted a realist review of English-language studies (January 2001-March 2011) that addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers and excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and the complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature, with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. Research on image evidence in land supervision and GIS management

    NASA Astrophysics Data System (ADS)

    Li, Qiu; Wu, Lixin

    2006-10-01

    Land resource development and utilization brings many problems. The number, scale and volume of illegal land use cases are increasing. Because the territory is vast and land violations are easily concealed, effective land supervision and management is difficult. In this paper, the concepts of evidence and preservation of evidence are described first. The concepts of image evidence (IE), natural evidence (NE), natural preservation of evidence (NPE) and general preservation of evidence (GPE) are then proposed, based on the characteristics of remote sensing imagery (RSI): objectivity, truthfulness, high spatial resolution and rich information content. Using MapObjects and Visual Basic 6.0, with Access managing the connection between the spatial vector database and the attribute data table, taking RSI as the data source and background layer, and combining the powerful spatial data management and visual analysis of a geographic information system (GIS), a land supervision and GIS management system was designed and implemented based on NPE. Practical use in Beijing shows that the system runs well and has solved several problems in land supervision and management.

  13. Wiki-Based Data Management to Support Systems Toxicology

    EPA Science Inventory

    As the field of toxicology relies more heavily on systems approaches for mode of action discovery, evaluation, and modeling, the need for integrated data management is greater than ever. To meet these needs, we have developed a flexible system that assists individual or multiple...

  14. An object-based storage model for distributed remote sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Zhanwu; Li, Zhongmin; Zheng, Sheng

    2006-10-01

    It is very difficult to design an integrated storage solution for distributed remote sensing images that offers high-performance network storage services and secure cross-platform data sharing using current network storage models such as direct-attached storage, network-attached storage and storage area networks. Object-based storage, a new generation of network storage technology that has emerged recently, separates the data path, the control path and the management path, which removes the metadata bottleneck of traditional storage models, and has the characteristics of parallel data access, cross-platform data sharing, intelligent storage devices and secure data access. We apply object-based storage to the storage management of remote sensing images and construct an object-based storage model for distributed remote sensing images, in which remote sensing images are organized as remote sensing objects stored on object-based storage devices. Based on this storage model, we present the architecture of a distributed remote sensing image application system and give test results comparing the write performance of the traditional network storage model and the object-based storage model.
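
    The separation of data and metadata paths that the abstract describes can be illustrated with a toy in-memory store. This is a hypothetical sketch, not the system from the paper; the class, method and attribute names are all invented.

```python
class ObjectStore:
    """Toy object-based store: metadata and payload live on separate
    paths, so metadata queries never touch the object data."""
    def __init__(self):
        self._data = {}   # object id -> bytes (data path)
        self._meta = {}   # object id -> dict  (metadata/control path)

    def put(self, oid, payload, **attrs):
        self._data[oid] = payload
        self._meta[oid] = {"size": len(payload), **attrs}

    def stat(self, oid):
        # Metadata-only access: the payload is never read here.
        return self._meta[oid]

    def get(self, oid):
        return self._data[oid]

store = ObjectStore()
store.put("scene-001", b"\x00" * 1024, sensor="Landsat", level="L1T")
info = store.stat("scene-001")
```

    In a real object-based storage device the metadata path would be served by a separate metadata server so that many clients can read object data in parallel; the dictionary split above only illustrates the idea.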

  15. More than Anecdotes: Fishers' Ecological Knowledge Can Fill Gaps for Ecosystem Modeling.

    PubMed

    Bevilacqua, Ana Helena V; Carvalho, Adriana R; Angelini, Ronaldo; Christensen, Villy

    2016-01-01

    Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers' knowledge could fill this gap, improving participation in and the management of fisheries. The same fishing area was modeled using two approaches: one based on fishers' knowledge and one based on scientific information. For the former, data was collected by interviews through the Delphi methodology; for the latter, data was gathered from the literature. Agreement between the attributes generated by the fishers' knowledge model and the scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. The ecosystem attributes produced by the fishers' knowledge model were consistent with those produced by the scientific model, which was elaborated using only scientific data from the literature. This study provides evidence that fishers' knowledge may suitably complement scientific data and may improve the modeling tools for the research and management of fisheries.

  16. Substance abuse treatment management information systems: balancing federal, state, and service provider needs.

    PubMed

    Camp, J M; Krakow, M; McCarty, D; Argeriou, M

    1992-01-01

    There is increased interest in documenting the characteristics and treatment outcomes of clients served with Alcohol, Drug Abuse, and Mental Health Block Grant funds. The evolution of federal client-based management systems for substance abuse treatment services demonstrates that data collection systems are important but require continued support. A review of the Massachusetts substance abuse management information system illustrates the utility of a client-based data set. The development and implementation of a comprehensive information system require overcoming organizational barriers and project delays, fostering collaborative efforts among staff from diverse agencies, and employing considerable resources. In addition, the need to develop mechanisms for increasing the reliability of the data, and for ongoing training of the users, is presented. Finally, three applications of the management information system's role in shaping policy are reviewed: developing services for special populations (communities of color; women, including pregnant substance abusers; and injection drug users), utilizing MIS data for evaluation purposes, and determining funding allocations.

  17. Oak Ridge Environmental Information System (OREIS) functional system design document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birchfield, T.E.; Brown, M.O.; Coleman, P.R.

    1994-03-01

    The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1-System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the data base structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.

  18. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster.
Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.

  19. Coordination and standardization of federal sedimentation activities

    USGS Publications Warehouse

    Glysson, G. Douglas; Gray, John R.

    1997-01-01

    - precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.

  20. Real-Time Management of Multimodal Streaming Data for Monitoring of Epileptic Patients.

    PubMed

    Triantafyllopoulos, Dimitrios; Korvesis, Panagiotis; Mporas, Iosif; Megalooikonomou, Vasileios

    2016-03-01

    A new generation of healthcare is represented by wearable health monitoring systems, which provide real-time monitoring of a patient's physiological parameters. It is expected that continuous ambulatory monitoring of vital signals will improve the treatment of patients and enable proactive personal health management. In this paper, we present the implementation of a multimodal real-time system for epilepsy management. The proposed methodology is based on a data streaming architecture and efficient management of a large flow of physiological parameters. The performance of this architecture is examined for varying spatial resolutions of the recorded data.
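
    A minimal sketch of the kind of windowed buffering a multimodal streaming architecture needs, with each channel holding its own sliding window. The channel names, window sizes and sample values are invented for illustration; the actual system is far more elaborate.

```python
from collections import deque

class StreamWindow:
    """Fixed-length sliding window over one physiological channel."""
    def __init__(self, size):
        self.buf = deque(maxlen=size)  # old samples drop off automatically

    def push(self, sample):
        self.buf.append(sample)

    def full(self):
        return len(self.buf) == self.buf.maxlen

    def mean(self):
        return sum(self.buf) / len(self.buf)

# Two modalities sampled at different rates, each with its own window.
eeg = StreamWindow(size=4)
hr = StreamWindow(size=2)
for v in [10, 12, 11, 13, 14]:
    eeg.push(v)
for v in [70, 72]:
    hr.push(v)
```

    Because `deque(maxlen=...)` evicts the oldest sample on overflow, each channel always exposes only its most recent window, which is what downstream real-time analysis would consume.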

  1. The Characteristics of Project Managers: An Exploration of Complex Projects in the National Aeronautics and Space Administration

    NASA Technical Reports Server (NTRS)

    Mulenburg, Gerald M.

    2000-01-01

    This study explores the characteristics and relationships of project managers of complex projects in the National Aeronautics and Space Administration. It is based on research design, data collection, interviews, case studies, and data analysis across disciplines such as biological research, space research, advanced aeronautical test facilities, aeronautic flight demonstrations, and projects at different NASA centers, to ensure that findings were not endemic to one type of project management or to one center's management philosophies. Each project is treated as a separate case, with the primary data collected during semi-structured interviews with the project manager responsible for the overall project. Results show definite similarities in characteristics and relationships among the project managers studied. A model of how the project managers formulated and managed their projects is included.

  2. HGDB: A web retrieving cardiovascular-associated gene data.

    PubMed

    Noorabad-Ghahroodi, Faezeh; Abdi, Samaneh; Zand, Amir Hossein; Najafi, Mohammad

    2017-04-01

    The use of data obtained from high-throughput techniques is an essential subject in genetics studies. System approaches such as networking and enrichment may improve data management. Here, we annotated the molecular features of cardiovascular-associated genes and present the HGDB search-based database (www.hgdb.ir). The initial seed data was primarily taken from Gene Ontology and was automatically enriched with other molecular features. The data is managed in a popular open-source SQL database. The search tabs on the HGDB homepage cover gene ID/name, chromosome, cell organelle and all-gene options. Search results are presented as text-based gene descriptions with source links. HGDB is a user-friendly website presenting gene data in the cardiovascular field. Copyright © 2017 Elsevier B.V. All rights reserved.
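
    A hedged sketch of how such single-field search tabs might be backed by a SQL table, using Python's built-in sqlite3. The schema, gene rows and field names below are invented for illustration and are not taken from HGDB.

```python
import sqlite3

# In-memory stand-in for a gene table; values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gene (
    id TEXT PRIMARY KEY, name TEXT, chromosome TEXT, organelle TEXT)""")
conn.executemany("INSERT INTO gene VALUES (?, ?, ?, ?)", [
    ("G1", "APOB", "2", "endoplasmic reticulum"),
    ("G2", "LDLR", "19", "plasma membrane"),
    ("G3", "PCSK9", "1", "extracellular"),
])

def search(field, value):
    """Mimic one search tab: look up genes by a single field.
    `field` is a trusted column name here; user input goes through
    the parameterised placeholder."""
    cur = conn.execute(
        f"SELECT id, name FROM gene WHERE {field} = ?", (value,))
    return cur.fetchall()

hits = search("chromosome", "19")
```

    Each search tab then maps to one call such as `search("organelle", "plasma membrane")`; only the column name varies, and the value is always bound as a parameter.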

  3. Atmospheric Radiation Measurement's Data Management Facility captures metadata and uses visualization tools to assist in routine data management.

    NASA Astrophysics Data System (ADS)

    Keck, N. N.; Macduff, M.; Martin, T.

    2017-12-01

    The Atmospheric Radiation Measurement (ARM) program's Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected in near real time from hundreds of observational instruments spread all over the globe, then ingested hourly to produce time-series data in NetCDF (network Common Data Format) with standardized metadata. Based on automated processes and a variety of user reviews, the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools has been developed to facilitate the operational processes that manage and maintain the more than 18,000 real-time events that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.

  4. [Risk management--a new aspect of quality assessment in intensive care medicine: first results of an analysis of the DIVI's interdisciplinary quality assessment research group].

    PubMed

    Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch

    2006-10-01

    Patient safety has become one of the major aspects of clinical management in recent years, with research crucially focused on malpractice. In contrast to process analysis in non-medical fields, the analysis of errors during in-patient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register, and was evaluated in clinically working ICU departments participating in the register database. The results of the risk point evaluation will be integrated in the next database update. This may be a step toward improving the reliability of the register for measuring quality assessment in the ICU.

  5. The research and implementation of PDM systems based on the .NET platform

    NASA Astrophysics Data System (ADS)

    Gao, Hong-li; Jia, Ying-lian; Yang, Ji-long; Jiang, Wei

    2005-12-01

    A new kind of PDM system scheme based on the .NET platform, designed to solve application problems of current PDM systems in enterprises, is described. The key technologies of this system, such as .NET, data access, information processing and the Web, are discussed. The 3-tier architecture of a PDM system based on a mixed C/S and B/S mode is presented. In this system, all users share the same database server to ensure the coherence and safety of client data. ADO.NET leverages the power of XML to provide disconnected access to data, which frees the connection to be used by other clients and improves system performance. Moreover, the important function modules of a PDM system, such as project management, product structure management and document management, were developed and realized.

  6. Impact of IPAD on CAD/CAM database university research

    NASA Technical Reports Server (NTRS)

    Leach, L. M.; Wozny, M. J.

    1984-01-01

    The IPAD program has provided direction, focus and software products which have impacted CAD/CAM data base research and follow-on research. The relationship of IPAD to research projects involving the storage of geometric data in common data base facilities such as data base machines, the exchange of data between heterogeneous data bases, the development of IGES processors, the migration of large CAD/CAM data base management systems to noncompatible hosts, and the value of RIM as a research tool is described.

  7. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach

    PubMed Central

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-01-01

    Background Traffic accidents are one of the more important national and international issues, and their consequences are important at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. Objective The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. Methods This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet, based on the inclusion criteria; the review continued until data saturation was reached, and a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency staff, statisticians, and IT experts in trauma, emergency and police centers. Sampling was purposive. Data was collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 75%. Data was analyzed using the decision Delphi technique. Results GIS capabilities were identified in ten categories and 64 sub-categories. Import and processing of spatial and descriptive data, and the analysis of that data, were the most important capabilities of GIS in traffic accident information management. Conclusion Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart and zoning formats; managing ill-structured issues; determining the cost effectiveness of decisions; and prioritizing their implementation were the most important capabilities of GIS that can be efficient in the management of traffic accident information. PMID:28848627

  8. The Design of Data Disaster Recovery of National Fundamental Geographic Information System

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Chen, J.; Liu, L.; Liu, J.

    2014-04-01

    With the development of information technology, the data security of information systems faces more and more challenges. Surveying and mapping geographic information is a fundamental and strategic resource applied in all areas of national economic, defence and social development, and classified geographic information directly concerning Chinese sovereignty is especially vital to national and social interests. Urgent problems for surveying and mapping include mass data storage and backup, establishing and improving a disaster backup system (especially for sudden natural calamities and accidents), and ensuring that all sectors of the information system can be rapidly restored to correct operation. To overcome various disaster risks, protect data security and reduce the impact of disasters, the effective way is to analyse the features of storage, management and security requirements and to design a data disaster recovery system suited to surveying and mapping. This article analyses the features of fundamental geographic information data and its storage management requirements, and presents a three-site DBMS disaster recovery plan based on widely used network, storage and backup, data replication and remote application-switching technologies. Within the LAN, data is replicated synchronously between the database management servers and the local storage of the backup management system; simultaneously, data is replicated asynchronously between the local storage backup management system and the remote database management servers. The core of the system is resolving a local disaster at the remote site, ensuring data security and business continuity for the local site. The article covers the background, the necessity of a disaster recovery system, the analysis of the data holdings, and the data disaster recovery plan. A feature of this design is the use of hardware-based hot data backup and remote online disaster recovery support for the Oracle database system. The contribution of this paper is to summarize and analyse the common disaster recovery requirements of surveying and mapping business systems and, based on the actual situation of the industry, to design a basic GIS disaster recovery solution; conclusions about the key technologies of RTO and RPO are also given.

  9. Development of a mobile probe-based traffic data fusion and flow management platform for innovative public-private information-based partnerships.

    DOT National Transportation Integrated Search

    2011-10-17

    "Under the aegis of Intelligent Transportation Systems (ITS), real-time traffic information provision strategies are being proposed to manage traffic congestion, alleviate the effects of incidents, enhance response efficiency after disasters, and imp...

  10. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    PubMed

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  11. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
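
    The map-based identification of high-yield areas can be illustrated with a simple threshold-and-rank filter. The catchment names and yield values below are invented for illustration and are not SPARROW output.

```python
# Hypothetical per-catchment nitrogen yields (kg/ha/yr).
yields = {
    "Susquehanna-01": 14.2,
    "Potomac-03": 6.1,
    "Choptank-02": 18.7,
    "James-05": 4.9,
}

def high_yield(yields, threshold):
    """Return catchments at or above a yield threshold, highest first,
    the kind of ranking a nutrient-yield map would highlight."""
    return sorted((c for c, y in yields.items() if y >= threshold),
                  key=lambda c: -yields[c])

targets = high_yield(yields, threshold=10.0)
```

    A manager could then concentrate restoration resources on the top-ranked catchments, which is the prioritization the NYM tool is meant to support visually.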

  12. Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.

    PubMed

    2018-01-04

    An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater and reuse plants to improve the management of sensors, short- and long-term maintenance plans, and asset and investment management plans. It is based on an integrated approach to capturing data from different computer systems and files, and adds a layer of intelligence to that data. It serves as a repository of the key current and future operations and maintenance conditions that a plant needs to be aware of. With this information, it can simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance and asset management, using the IViewOps (Intelligent View of Operations) model. Based on optimization through model runs, it can create output files that feed data to other systems and inform staff of optimal solutions to the conditions experienced or anticipated in the future.

  13. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and Home Energy Management Systems (HEMS) play an important role in saving energy without decreasing quality of life (QoL). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules; the Rete algorithm is a typical pattern-matching algorithm for such rules. We have proposed a rule-based HEMS using the Rete algorithm in which rules for managing energy are processed by smart taps in the network, so that the loads of processing rules and collecting data are distributed across the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules with the Rete algorithm. In this paper, we evaluate the proposed system by simulation. In the simulation environment, each rule is processed by the smart tap that relates to the action part of that rule. We also implemented the proposed system as a HEMS using smart taps.
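
    The core memoization idea behind Rete, retesting a condition only when a fact it reads changes, can be sketched as follows. This is a heavily reduced illustration (roughly an alpha-memory cache), not the full Rete network or the authors' smart-tap implementation; all rule and fact names are invented.

```python
class RuleEngine:
    """Tiny IF-THEN engine. Like Rete, it caches each rule's condition
    result and re-tests it only when a fact the rule reads changes."""
    def __init__(self):
        self.facts = {}
        self.rules = []   # (name, reads, condition, action)
        self.alpha = {}   # rule name -> cached condition result

    def add_rule(self, name, reads, condition, action):
        self.rules.append((name, set(reads), condition, action))

    def assert_fact(self, key, value):
        self.facts[key] = value
        fired = []
        for name, reads, cond, act in self.rules:
            if key in reads:              # only affected rules re-evaluate
                self.alpha[name] = cond(self.facts)
            if self.alpha.get(name):
                act(self.facts)
                fired.append(name)
        return fired

engine = RuleEngine()
# Invented rule: switch a smart tap off when a room is empty but
# an appliance still draws more than 50 W.
engine.add_rule(
    "standby", reads=["occupied", "power_w"],
    condition=lambda f: not f.get("occupied") and f.get("power_w", 0) > 50,
    action=lambda f: f.__setitem__("tap_off", True))

engine.assert_fact("power_w", 120)
fired = engine.assert_fact("occupied", False)
```

    Facts whose keys a rule does not read never trigger re-evaluation of that rule, which is the load reduction the abstract attributes to Rete-style processing.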

  14. Exemplars in the use of technology for management of depression in primary care.

    PubMed

    Serrano, Neftali; Molander, Rachel; Monden, Kimberley; Grosshans, Ashley; Krahn, Dean D

    2012-06-01

    Depression care management as part of larger efforts to integrate behavioral health care into primary care has been shown to be effective in helping patients and primary care clinicians achieve improved outcomes within the primary care environment. Central to care management systems is the use of registries which enable effective clinic population management. The aim of this article is to detail the methods and utility of technology in depression care management processes while also highlighting the real-world variations and barriers that exist in different clinical environments, namely a federally qualified health center and a Veterans Administration clinic. We analyzed descriptive data from the registries of Access Community Health Centers and the William S. Middleton Veterans Administration clinics along with historical reviews of their respective care management processes. Both registry reviews showed trend data indicating improvement in scores of depression and provided baseline data on important system variables, such as the number of patients who are not making progress, the percentage of patients who are unreachable by phone, and the kind of actions needed to ensure evidence-based and efficient care. Both sites also highlighted systemic technical barriers to more complete implementation of care management processes. Care management processes are an effective and efficient part of population-based care for depression in primary care. Implementation depends on available resources including hardware, software, and clinical personnel. Additionally, care management processes and technology have evolved over time based on local needs and are part of an integrated method to support the work of primary care clinicians in providing care for patients with depression.

  15. The Design and Usage of the New Data Management Features in NASTRAN

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Brown, W. K.

    1984-01-01

Two new data management features are installed in the April 1984 release of NASTRAN: the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card-image format and can be easily maintained and expanded with standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means of changing a Rigid Format or generating new Rigid Formats without unnecessary compilation and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset, and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from it. There is no limit to the number of external secondary files that may be referenced and read.

  16. Community-based primary care: improving and assessing diabetes management.

    PubMed

    Gannon, Meghan; Qaseem, Amir; Snow, Vincenza

    2010-01-01

    Morbidity and mortality associated with diabetes make it a prime target for quality improvement research. Quality gaps and racial/gender disparities persist throughout this population of patients, necessitating a sustainable improvement in the clinical management of diabetes. The authors of this study sought (1) to provide a population perspective on diabetes management, and (2) to reinforce evidence-based clinical guidelines through a Web-based educational module. The project also aimed to gain insight into working remotely with a community of rural physicians. This longitudinal pre-post intervention study involved 18 internal medicine physicians and included 3 points of medical record data abstraction over 24 months. A Web-based educational module was introduced after the baseline data abstraction. This module contained chapters on clinical education, practice tools, and self-assessment. The results showed a sustained improvement in most clinical outcomes and demonstrated the effectiveness of using Web-based mediums to reinforce clinical guidelines and change physician behavior.

  17. Development of the prototype data management system of the solar H-alpha full disk observation

    NASA Astrophysics Data System (ADS)

    Wei, Ka-Ning; Zhao, Shi-Qing; Li, Qiong-Ying; Chen, Dong

    2004-06-01

    The Solar Chromospheric Telescope at Yunnan Observatory generates about 2 GB of FITS-format data per day. Data volumes of this size are inconvenient to work with directly, so data searching and sharing are important. A prototype data management system for the solar H-alpha full-disk observations was developed to support data searching, on-line browsing, remote access, and download, and was improved with workflow technology. Built on the Windows XP operating system and the MySQL database management system, the prototype follows a browser/server model implemented in Java and JSP. Data compression, searching, browsing, authorized deletion, and real-time download have been achieved.
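As an illustration of the search capability the abstract describes, the sketch below models the metadata-query core of such a system. The real prototype uses MySQL with Java/JSP; SQLite stands in here, and the table and column names (`halpha_fits`, `obs_date`, and so on) are hypothetical, not taken from the actual system.

```python
# A minimal sketch, assuming a hypothetical metadata table for the daily
# H-alpha FITS frames; SQLite stands in for the prototype's MySQL back end.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE halpha_fits (
    filename TEXT PRIMARY KEY,
    obs_date TEXT,          -- ISO date of observation
    obs_time TEXT,          -- time of observation, HH:MM
    size_mb  REAL)""")
conn.executemany(
    "INSERT INTO halpha_fits VALUES (?, ?, ?, ?)",
    [("ha20040612_0100.fits", "2004-06-12", "01:00", 4.2),
     ("ha20040612_0110.fits", "2004-06-12", "01:10", 4.2),
     ("ha20040613_0100.fits", "2004-06-13", "01:00", 4.2)])

# Search: all frames taken on a given day, as a browse page might request.
rows = conn.execute(
    "SELECT filename FROM halpha_fits WHERE obs_date = ? ORDER BY obs_time",
    ("2004-06-12",)).fetchall()
print([r[0] for r in rows])
```

A real system would add indexes on the date/time columns and per-user permissions for the deletion operations the abstract mentions.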

  18. Wildlife tracking data management: a new vision.

    PubMed

    Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus

    2010-07-27

    To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.

  19. Wildlife tracking data management: a new vision

    PubMed Central

    Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus

    2010-01-01

    To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling. PMID:20566495

  20. [Informatics data quality and management].

    PubMed

    Feng, Rung-Chuang

    2009-06-01

    While the quality of data affects every aspect of business, it is frequently overlooked in customer data integration, data warehousing, business intelligence, and enterprise applications. Regardless of which data terms are used, a high level of data quality is a critical precondition for satisfying user needs and facilitating the development of effective applications. In this paper, the author introduces the methods, a management framework, and the major factors involved in data quality assessment. The author also integrates expert opinions to develop data quality assessment tools.

  1. Key Management Scheme Based on Route Planning of Mobile Sink in Wireless Sensor Networks.

    PubMed

    Zhang, Ying; Liang, Jixing; Zheng, Bingxin; Jiang, Shengming; Chen, Wei

    2016-01-29

    In many wireless sensor network application scenarios, the key management scheme with a Mobile Sink (MS) should be fully investigated. This paper proposes a key management scheme based on dynamic clustering and optimal route choice for the MS. The concept of the Traveling Salesman Problem with Neighbor areas (TSPN) is applied to dynamic clustering for data exchange, and selection probability is used in MS route planning. The proposed scheme extends static key management to dynamic key management by considering the dynamic clustering and mobility of MSs, which can effectively balance the total energy consumption during network activities. Considering the different resources available to the member nodes and the sink node, the session key between a cluster head and the MS is established by a modified elliptic curve cryptography (ECC) encryption with Diffie-Hellman key exchange (ECDH) algorithm, and the session key between a member node and its cluster head is built with a binary symmetric polynomial. Analysis of data storage security, data transfer security, and the dynamic key management mechanism shows that the proposed scheme improves the resilience of the network's key management system while satisfying higher connectivity and storage efficiency.
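The "binary symmetric polynomial" used between member nodes and cluster heads is a symmetric bivariate polynomial: because f(x, y) = f(y, x), two nodes that each hold a univariate share f(id, y) can independently compute the same pairwise key. The sketch below shows this general idea with a small illustrative modulus and coefficient matrix; it is not the paper's actual construction or parameters.

```python
# Sketch of symmetric-bivariate-polynomial key agreement. The modulus and
# coefficients are illustrative only, not the paper's parameters.

P = 2**31 - 1  # a prime modulus (illustrative)

# f(x, y) = sum COEFF[i][j] * x^i * y^j with COEFF[i][j] == COEFF[j][i],
# so f(x, y) == f(y, x) for all x, y.
COEFF = [
    [5, 7, 11],
    [7, 3, 13],
    [11, 13, 2],
]

def f(x, y):
    """Evaluate the symmetric bivariate polynomial mod P."""
    total = 0
    for i, row in enumerate(COEFF):
        for j, c in enumerate(row):
            total += c * pow(x, i, P) * pow(y, j, P)
    return total % P

def share(node_id):
    """Key material preloaded on a node: the univariate polynomial
    f(node_id, y), represented here as a closure."""
    return lambda other_id: f(node_id, other_id)

# Two nodes derive the same pairwise key from their own shares:
g_member = share(42)   # member node with ID 42
g_head = share(99)     # cluster head with ID 99
assert g_member(99) == g_head(42)
```

The design choice here is the usual trade-off of such schemes: storage per node grows with the polynomial degree, and the degree bounds how many compromised nodes the shared secret can tolerate.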

  2. Estimating missing hourly climatic data using artificial neural network for energy balance based ET mapping applications

    USDA-ARS?s Scientific Manuscript database

    Remote sensing based evapotranspiration (ET) mapping is an important improvement for water resources management. Hourly climatic data and reference ET are crucial for implementing remote sensing based ET models such as METRIC and SEBAL. In Turkey, data on all climatic variables may not be available ...

  3. The intelligent database machine

    NASA Technical Reports Server (NTRS)

    Yancey, K. E.

    1985-01-01

    The IDM 500 data base machine was compared with Oracle to determine whether it would better serve the needs of the MSFC data base management system. The two were compared and the performance of the IDM was studied. The implementations that work best on each database are indicated; the choice is left to the database administrator.

  4. [Basic research on digital logistic management of hospital].

    PubMed

    Cao, Hui

    2010-05-01

    This paper analyzes and explores how digital, information-based management can be realized by the equipment department, general services department, supply room, and other material flow departments in hospitals, in order to optimize the procedures of information-based asset management. Various analytical methods for medical supplies business models are presented, providing analytical data to support correct decisions by hospital departments, hospital leaders, and the governing authorities.

  5. Defining a data management strategy for USGS Chesapeake Bay studies

    USGS Publications Warehouse

    Ladino, Cassandra

    2013-01-01

    The mission of U.S. Geological Survey’s (USGS) Chesapeake Bay studies is to provide integrated science for improved understanding and management of the Chesapeake Bay ecosystem. Collective USGS efforts in the Chesapeake Bay watershed began in the 1980s, and by the mid-1990s the USGS adopted the watershed as one of its national place-based study areas. Great focus and effort by the USGS have been directed toward Chesapeake Bay studies for almost three decades. The USGS plays a key role in using “ecosystem-based adaptive management, which will provide science to improve the efficiency and accountability of Chesapeake Bay Program activities” (Phillips, 2011). Each year USGS Chesapeake Bay studies produce published research, monitoring data, and models addressing aspects of bay restoration such as, but not limited to, fish health, water quality, land-cover change, and habitat loss. The USGS is responsible for collaborating and sharing this information with other Federal agencies and partners as described under the President’s Executive Order 13508—Strategy for Protecting and Restoring the Chesapeake Bay Watershed signed by President Obama in 2009. Historically, the USGS Chesapeake Bay studies have relied on national USGS databases to store only major nationally available sources of data such as streamflow and water-quality data collected through local monitoring programs and projects, leaving a multitude of other important project data out of the data management process. This practice has led to inefficient methods of finding Chesapeake Bay studies data and underutilization of data resources. Data management by definition is “the business functions that develop and execute plans, policies, practices and projects that acquire, control, protect, deliver and enhance the value of data and information.” (Mosley, 2008a). 
In other words, data management is a way to preserve, integrate, and share data to address the needs of the Chesapeake Bay studies to better manage data resources, work more efficiently with partners, and facilitate holistic watershed science. It is now the goal of the USGS Chesapeake Bay studies to implement an enhanced and all-encompassing approach to data management. This report discusses preliminary efforts to implement a physical data management system for program data that is not replicated nationally through other USGS databases.

  6. SFB754 - data management in large interdisciplinary collaborative research projects: what matters?

    NASA Astrophysics Data System (ADS)

    Mehrtens, Hela; Springer, Pina; Schirnick, Carsten; Schelten, Christiane K.

    2016-04-01

    Data management for SFB 754 is an integral part of the joint data management team at GEOMAR Helmholtz Centre for Ocean Research Kiel, a cooperation of the Cluster of Excellence "Future Ocean", the SFB 754, and other current and former nationally and EU-funded projects. The coalition successfully established one common data management infrastructure for marine sciences in Kiel. It aims to help researchers better document the data lifecycle from acquisition to publication and share their results already during the project phase. The infrastructure is continuously improved by integrating standard tools and developing extensions in close cooperation with scientists, data centres, and other research institutions. Open and frequent discussion of data management topics during SFB 754 meetings and seminars, together with efficient cooperation with its coordination office, allowed the gradual establishment of better data management practices. Furthermore, a data policy was agreed upon that ensures proper usage of data sets, even unpublished ones, schedules data upload and dissemination, and enforces long-term public availability of the research outcome. Acceptance of the infrastructure is also backed by the easy usage of the web-based platform for data set documentation and exchange among all research disciplines of the SFB 754. Members of the data management team act as data curators and assist in data publication in World Data Centres (e.g., PANGAEA). Cooperation with World Data Centres makes the research data globally searchable and accessible, while links to the data producers ensure citability and provide points of contact for the scientific community. A complete record of SFB 754 publications is maintained within the institutional repository for full-text print publications by the GEOMAR library. 
This repository is strongly linked with the data management information system, providing dynamic and up-to-date overviews of the various ties between publications and available data sets, expeditions, and projects. Such views are also frequently used for the website and for reports by the SFB 754 scientific community. The concept of a joint approach, initiated by large-scale projects and participating institutions in order to establish a single data management infrastructure, has proven very successful. We have experienced a snowball-like propagation among marine researchers at GEOMAR and Kiel University, who continue to use the data management services they know from collaborating with the SFB 754. But we also observe an ongoing demand for training of new junior (and senior) scientists and a continuous need for adaptation to new methods and techniques. Only standardized and consistent data management guarantees the long-term completeness and integrity of published research data related to their peer-reviewed journal publications. Based on our daily experience, this is best achieved by skilled and experienced staff in a dedicated data management team that persists beyond the funding period of research projects. Such a team can carry its work forward and have impact through continuous personal contact, consultation, and on-site training of researchers. (This poster is linked to the presentation by Dr. Christiane K. Schelten)

  7. Design and application of BIM based digital sand table for construction management

    NASA Astrophysics Data System (ADS)

    Fuquan, JI; Jianqiang, LI; Weijia, LIU

    2018-05-01

    This paper explores the design and application of a BIM-based digital sand table for construction management. Given the demands and features of construction management planning for bridge and tunnel engineering, the key functional features of the digital sand table should include three-dimensional GIS, model navigation, virtual simulation, information layers, and data exchange. These involve BIM technologies for 3D visualization and 4D virtual simulation, breakdown structures for the BIM model and project data, multi-dimensional information layers, and multi-source data acquisition and interaction. Overall, the digital sand table is a visual, virtual, integrated engineering information terminal under a unified data standard system. Its applications include visual construction schemes, virtual construction schedules, and construction monitoring. Finally, the applicability of several basic software packages to the digital sand table is analyzed.

  8. Factual Approach in Decision Making - the Prerequisite of Success in Quality Management

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Škůrková Lestyánszka, Katarína

    2013-12-01

    In a quality management system, as in other managerial systems, effective decisions must always be based on the analysis of data and information, i.e., based on facts, in accordance with the factual-approach principle of quality management. It is therefore necessary to measure and collect data and information about processes. The article presents the results of a survey focused on the application of the factual approach in decision making. It also offers suggestions for improving the application of this principle in business practice. This article was prepared using the research results of VEGA project No. 1/0229/08 "Perspectives of the quality management development in relation to the requirements of market in the Slovak Republic".

  9. Comprehensive transportation asset management : risk-based inventory expansion and data needs.

    DOT National Transportation Integrated Search

    2011-12-01

    Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...

  10. A data base and analysis program for shuttle main engine dynamic pressure measurements. Appendix F: Data base plots for SSME tests 750-120 through 750-200

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is presented. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is also included to estimate spectral trends with SSME power level.
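The interpolation scheme mentioned above can be illustrated by linearly interpolating band amplitudes between tested power levels. All power levels and spectral values below are made up for illustration; the actual data base interpolates measured SSME dynamic pressure spectra.

```python
# A minimal sketch, assuming illustrative power levels and band amplitudes,
# of estimating a spectrum at an untested SSME power level.

def interpolate_spectrum(power, levels, spectra):
    """Linearly interpolate a pressure spectrum at a given power level.

    levels  -- sorted list of tested power levels (percent rated power)
    spectra -- list of spectra (lists of band amplitudes), one per level
    """
    if power <= levels[0]:
        return list(spectra[0])        # clamp below the tested range
    if power >= levels[-1]:
        return list(spectra[-1])       # clamp above the tested range
    for k in range(1, len(levels)):
        if power <= levels[k]:
            lo, hi = levels[k - 1], levels[k]
            t = (power - lo) / (hi - lo)
            # interpolate each spectral band independently
            return [a + t * (b - a)
                    for a, b in zip(spectra[k - 1], spectra[k])]

levels = [65, 100, 109]                          # tested levels (illustrative)
spectra = [[1.0, 2.0], [2.0, 4.0], [2.5, 5.0]]   # two bands per level
print(interpolate_spectrum(80.0, levels, spectra))
```

Per-band linear interpolation is the simplest choice; a production scheme might instead fit each band's trend across all tested levels.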

  11. A Location Based Communication Proposal for Disaster Crisis Management

    NASA Astrophysics Data System (ADS)

    Gülnerman, A. G.; Goksel, C.; Tezer, A.

    2014-12-01

    Among urban applications of Geographical Information Systems (GIS), disaster applications are the most vital. In Turkey especially, the impacts of earthquakes, the most frequent disaster type, are hard to contain in urban areas because of the size of the affected area, the volume of data, and the number of affected residents and victims. Currently, communication between victims and institutions becomes congested and collapses after a disaster, which delays emergency services and thus causes secondary deaths and desperation. To avoid this kind of loss of life, communication must be established between the public and institutions. GIS technology can serve both as a data management technique and as a communication tool. In this study, a Life Saving Kiosk Model is proposed as a GIS-based communication tool that, after a disaster, collects locational emergency demands, meets those demands through notification maps created from them, increases public solidarity by showing areas with emergency demands close to one another, gathers notifications from the institutions responding to emergency demands, and thereby aims to increase management capability. The leading role in this proposal belongs to the public: the increase in capability depends on major public contribution to disaster management through the required communication infrastructure. The aim is to mobilize public power instead of public desperation. Beyond a general view of disaster crisis management approaches, the Life Saving Kiosk Model addresses the preparedness and response phases of the disaster cycle, with the design organized in the preparedness phase and used in the response phase. The model's flow diagram connects the public, the communication tool (the kiosk), and the response force. The software embedded in the communication tool provides functions, interface designs, and user algorithms developed with public participation in mind. 
In this study, disaster crisis management with public participation, built on a location-based data flow model, is brought up for discussion through comparison with other available applications in terms of time, level of data detail, required staff and expertise, data reliability, and data archiving.

  12. Building an Ontology-driven Database for Clinical Immune Research

    PubMed Central

    Ma, Jingming

    2006-01-01

    Clinical research on immune responses usually generates a huge amount of biomedical testing data over a certain period of time. User-friendly data management systems based on relational databases help immunologists and clinicians fully manage these data. At the same time, the same biological assays, such as ELISPOT and flow cytometric assays, are involved in immunological experiments regardless of study purpose. The reuse of such biological knowledge is one of the driving forces behind ontology-driven data management. An ontology-driven database will therefore help handle different clinical immune research studies and help immunologists and clinicians easily understand one another's immunological data. We discuss an outline for building an ontology-driven data management system for clinical immune research (ODMim). PMID:17238637

  13. Combining the Generic Entity-Attribute-Value Model and Terminological Models into a Common Ontology to Enable Data Integration and Decision Support.

    PubMed

    Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte

    2018-01-01

    The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity attribute value model and could be extended to other domains.
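A minimal sketch of the generic entity-attribute-value pattern the paper builds on: every fact is stored as an (entity, attribute, value) triple, so new clinical attributes require no schema change. The attribute names below are hypothetical examples, not drawn from the DESIREE models.

```python
# A toy EAV store, assuming hypothetical clinical attribute names.
from collections import defaultdict

class EAVStore:
    def __init__(self):
        self._triples = []                       # (entity, attribute, value)

    def add(self, entity, attribute, value):
        self._triples.append((entity, attribute, value))

    def values(self, entity, attribute):
        """All values recorded for one attribute of one entity."""
        return [v for e, a, v in self._triples
                if e == entity and a == attribute]

    def entity(self, entity):
        """All attribute/value pairs recorded for one entity."""
        result = defaultdict(list)
        for e, a, v in self._triples:
            if e == entity:
                result[a].append(v)
        return dict(result)

store = EAVStore()
store.add("patient:001", "diagnosis", "breast carcinoma")
store.add("patient:001", "tumor_grade", 2)
print(store.entity("patient:001"))
```

Binding each attribute name to a concept in a termino-ontological resource, as the paper proposes, is what lets ontological reasoning and rule-based inference operate over these otherwise schema-less triples.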

  14. Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform

    NASA Astrophysics Data System (ADS)

    Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.

    2012-12-01

    This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data management, analysis, and visualization platform for earth sciences data. Traditionally, web applications have been built by accessing data directly from a database with a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs and to send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, whether web-based for human consumption or programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have even thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also makes it easy to use distributed datasets. Companion mechanisms for querying data snapshots (time travel), for usage tracking and license management, and for verifying the semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
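The query pattern described above amounts to a client composing a RESTful URI and decoding a JSON reply from the data application tier. In the sketch below the host, path, and parameter names are hypothetical, and the JSON response is mocked rather than fetched over the network.

```python
# A minimal sketch of the URI-query / JSON-result pattern, assuming a
# hypothetical endpoint and parameter names.
import json
from urllib.parse import urlencode

def build_query_uri(base, resource, **params):
    """Compose a RESTful query URI of the kind a data tier would accept.
    Parameters are sorted so equivalent queries yield identical URIs."""
    return "{}/{}?{}".format(base.rstrip("/"), resource,
                             urlencode(sorted(params.items())))

uri = build_query_uri("https://earth-base.example.org/api/v1",
                      "occurrences", taxon="Trilobita", interval="Cambrian")

# A JSON response, as the data application might return it (mocked here):
payload = '{"records": [{"taxon": "Trilobita", "max_ma": 497.0}]}'
records = json.loads(payload)["records"]
print(uri)
print(records[0]["taxon"])
```

Because the contract is just a URI in and JSON out, any outside application, human-facing or machine-facing, can consume the same endpoint, which is the flexibility the abstract emphasizes.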

  15. Using Android and Open Data Kit Technology in Data Management for Research in Resource-Limited Settings in the Niger Delta Region of Nigeria: Cross-Sectional Household Survey

    PubMed Central

    Maleghemi, Sylvester

    2017-01-01

    Background Data collection in Sub-Saharan Africa has traditionally been paper-based. However, the popularization of Android mobile devices and data capture software has brought paperless data management within reach. We used Open Data Kit (ODK) technology on Android mobile devices during a household survey in the Niger Delta region of Nigeria. Objective The aim of this study was to describe the pros and cons of deploying ODK for data management. Methods A descriptive cross-sectional household survey was carried out by 6 data collectors between April and May 2016. Data were obtained from 1706 persons in 601 households across 6 communities in 3 states in the Niger Delta. The use of Android mobile devices and ODK technology involved form building, testing, collection, aggregation, and download for data analysis. The median duration for data collection per household and per individual was 25.7 and 9.3 min, respectively. Results Data entries per device ranged from 33 (33/1706, 1.93%) to 482 (482/1706, 28.25%) individuals between 9 (9/601, 1.5%) and 122 (122/601, 20.3%) households. The most entries (470) were made by data collector 5. Only 2 respondents had data entry errors (2/1706, 0.12%). However, 73 (73/601, 12.1%) households had inaccurate date and time entries for when data collection started and ended. The cost of deploying ODK was estimated at US $206.7 in comparison with the estimated cost of US $466.7 for paper-based data management. Conclusions We found the use of mobile data capture technology to be efficient and cost-effective. As Internet services improve in Africa, we advocate their use as effective tools for health information management. PMID:29191798

  16. Reporting Capabilities and Management of the DSN Energy Data Base

    NASA Technical Reports Server (NTRS)

    Hughes, R. D.; Boyd, S. T.

    1981-01-01

    The DSN Energy Data Base is a collection of computer files developed and maintained by DSN Engineering. The energy consumption data must be updated monthly and summarized and displayed in printed output as desired. The methods used to handle the data and perform these tasks are described.

  17. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  18. Implementation of system intelligence in a 3-tier telemedicine/PACS hierarchical storage management system

    NASA Astrophysics Data System (ADS)

    Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.

    1995-05-01

    Our tele-medicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PC) and Microsoft Windows NT], presents a very scaleable and distributed solution ideal for meeting the needs of client/server environments such as tele-medicine, tele-radiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte, with 10+ years of storage) and patient data retrieval times at near on-line performance, as demanded by radiologists. With the scaleable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) or those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage the migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible, based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape media. Clustering patient data on the same tape eliminates multiple tape loads and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drives' high-performance data-streaming capabilities, thereby reducing the data retrieval delays typically associated with streaming tape devices.
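The rule-based prefetching described above can be sketched as a toy tier map: when HIS/RIS schedules a patient, that patient's prior studies are staged from tape up to disk before the radiologist asks for them. The tier names and the single prefetch rule below are illustrative only, not the system's actual rule set.

```python
# A toy sketch of rule-driven HSM prefetching, assuming illustrative tier
# names and a single HIS/RIS-triggered rule.

class HSM:
    TIERS = ["disk", "optical", "tape"]          # fastest to slowest

    def __init__(self):
        self.location = {}                       # study_id -> current tier

    def archive(self, study_id, tier="tape"):
        self.location[study_id] = tier

    def prefetch_patient(self, studies):
        """Rule: a scheduled patient's studies are staged onto disk."""
        for s in studies:
            self.location[s] = "disk"

    def retrieve(self, study_id):
        """Serve a study and cache it on the fast tier; returns the tier
        it was served from, as a stand-in for retrieval latency."""
        tier = self.location[study_id]
        self.location[study_id] = "disk"
        return tier

hsm = HSM()
hsm.archive("CT-1998-042")                       # old study sits on tape
hsm.prefetch_patient(["CT-1998-042"])            # HIS/RIS schedules patient
print(hsm.retrieve("CT-1998-042"))               # served from disk, not tape
```

A real migration manager would also demote cold studies down the tiers and, as the abstract notes, cluster one patient's studies on a single tape so a miss costs one tape load instead of several.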

  19. Applications of Satellite Data to Support Improvements in Irrigation and Groundwater Management in California

    NASA Technical Reports Server (NTRS)

    Melton, Forrest S.

    2017-01-01

    In agricultural regions around the world, threats to water supplies from drought and groundwater depletion are driving increased demand for tools to advance agricultural water use efficiency and support sustainable groundwater management. Satellite mapping of evapotranspiration (ET) from irrigated agricultural lands can provide agricultural producers and water resource managers with information that can be used to both optimize ag water use and improve estimates of groundwater withdrawals for irrigation. We describe the development of two remote sensing-based tools for ET mapping in California, including important lessons in terms of system design, partnership development, and transition to operations. For irrigation management, the integration of satellite data and surface sensor networks to provide timely delivery of information on crop water requirements can make irrigation scheduling more practical, convenient, and accurate.Developed through a partnership between NASA and the CA Department of Water Resources, the Satellite Irrigation Management Support (SIMS) framework integrates satellite data with information from agricultural weather networks to map crop canopy development and crop water requirements at the scale of individual fields. Information is distributed to agricultural producers and water managers via a web-based interface and web data services. SIMS also provides an API that facilitates integration with other irrigation decision support tools, such as CropManage and IrriQuest. Field trials using these integrated tools have shown that they can be used to sustain yields while improving water use efficiency and nutrient management. For sustainable groundwater management, the combination of satellite-derived estimates of ET and data on surface water deliveries for irrigation can increase the accuracy of estimates of groundwater pumping. 
We are developing an OpenET platform to facilitate access to ET data from multiple models and accelerate operational use of ET data in support of a range of water management applications, including implementation of the Sustainable Groundwater Management Act in CA. By providing a shared basis for decision making, we anticipate that the OpenET platform will accelerate implementation of solutions for sustainable groundwater management.
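
As a rough illustration of the crop-coefficient approach that underlies satellite-based irrigation scheduling tools like SIMS, the sketch below maps a satellite-derived fractional canopy cover value to a crop coefficient (Kc) and multiplies by reference ET (ETo) from a weather station to estimate crop evapotranspiration (ETc = Kc × ETo). The linear Kc relation and its endpoints are hypothetical values for illustration, not the SIMS model.

```python
# Illustrative sketch (not the SIMS implementation) of the crop-coefficient
# approach: ETc = Kc * ETo, with Kc estimated from satellite-derived
# fractional canopy cover.

def crop_coefficient(fractional_cover):
    """Map fractional canopy cover (0-1) to a crop coefficient Kc using a
    simple linear relation (slope and intercept are hypothetical)."""
    kc_min, kc_max = 0.15, 1.10
    f = max(0.0, min(1.0, fractional_cover))
    return kc_min + (kc_max - kc_min) * f

def crop_et(fractional_cover, eto_mm_day):
    """Daily crop evapotranspiration in mm/day: ETc = Kc * ETo."""
    return crop_coefficient(fractional_cover) * eto_mm_day

# Example: a half-covered field on a day with ETo of 6.0 mm.
etc = crop_et(0.5, 6.0)   # Kc = 0.625, so ETc = 3.75 mm/day
```

In an operational system the cover fraction would come from satellite reflectance data per field and ETo from the agricultural weather network; the point here is only the structure of the calculation.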

  20. Applications of Satellite Data to Support Improvements in Irrigation and Groundwater Management in California

    NASA Astrophysics Data System (ADS)

    Melton, F. S.; Huntington, J. L.; Johnson, L.; Guzman, A.; Morton, C.; Zaragoza, I.; Dexter, J.; Rosevelt, C.; Michaelis, A.; Nemani, R. R.; Cahn, M.; Temesgen, B.; Trezza, R.; Frame, K.; Eching, S.; Grimm, R.; Hall, M.

    2017-12-01

In agricultural regions around the world, threats to water supplies from drought and groundwater depletion are driving increased demand for tools to advance agricultural water use efficiency and support sustainable groundwater management. Satellite mapping of evapotranspiration (ET) from irrigated agricultural lands can provide agricultural producers and water resource managers with information that can be used to both optimize agricultural water use and improve estimates of groundwater withdrawals for irrigation. We describe the development of two remote sensing-based tools for ET mapping in California, including important lessons in terms of system design, partnership development, and transition to operations. For irrigation management, the integration of satellite data and surface sensor networks to provide timely delivery of information on crop water requirements can make irrigation scheduling more practical, convenient, and accurate. Developed through a partnership between NASA and the CA Department of Water Resources, the Satellite Irrigation Management Support (SIMS) framework integrates satellite data with information from agricultural weather networks to map crop canopy development and crop water requirements at the scale of individual fields. Information is distributed to agricultural producers and water managers via a web-based interface and web data services. SIMS also provides an API that facilitates integration with other irrigation decision support tools, such as CropManage and IrriQuest. Field trials using these integrated tools have shown that they can be used to sustain yields while improving water use efficiency and nutrient management. For sustainable groundwater management, the combination of satellite-derived estimates of ET and data on surface water deliveries for irrigation can increase the accuracy of estimates of groundwater pumping.
We are developing an OpenET platform to facilitate access to ET data from multiple models and accelerate operational use of ET data in support of a range of water management applications, including implementation of the Sustainable Groundwater Management Act in CA. By providing a shared basis for decision making, we anticipate that the OpenET platform will accelerate implementation of solutions for sustainable groundwater management.

  1. Evidence-based management - healthcare manager viewpoints.

    PubMed

    Janati, Ali; Hasanpoor, Edris; Hajebrahimi, Sakineh; Sadeghi-Bazargani, Homayoun

    2018-06-11

Purpose Hospital manager decisions can have a significant impact on service effectiveness and hospital success, so using an evidence-based approach can improve hospital management. The purpose of this paper is to identify evidence-based management (EBMgt) components and challenges. Consequently, the authors provide a framework for improving evidence-based decision making. Design/methodology/approach A total of 45 semi-structured interviews were conducted in 2016. The authors also established three focus group discussions with health service managers. Data analysis followed deductive qualitative analysis guidelines. Findings Four basic themes emerged from the interviews, including EBMgt evidence sources (including sub-themes: scientific and research evidence, facts and information, political-social development plans, managers' professional expertise and ethical-moral evidence); predictors (sub-themes: stakeholder values and expectations, functional behavior, knowledge, key competencies and skill, evidence sources, evidence levels, uses and benefits and government programs); EBMgt barriers (sub-themes: managers' personal characteristics, decision-making environment, training and research system and organizational issues); and evidence-based hospital management processes (sub-themes: asking, acquiring, appraising, aggregating, applying and assessing). Originality/value Findings suggest that most participants have positive EBMgt attitudes. A fully evidence-based hospital manager is a person who uses all evidence sources in a six-step decision-making process. EBMgt frameworks are a good tool to manage healthcare organizations. The authors found factors affecting hospital EBMgt and identified six evidence sources that healthcare managers can use in evidence-based decision-making processes.

  2. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for the MUSER. Based on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
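
A minimal sketch of the negative-database idea described above (an illustration of the concept only, not the MUSER implementation): when almost every expected observation is present, storing only the *absent* record keys (the complement set) and deriving presence by membership test can be far cheaper than storing every record. The class and key scheme below are invented for illustration.

```python
# Conceptual sketch of a "negative" index: store only the complement
# (missing keys) of a known universe of expected record keys.

class NegativeIndex:
    def __init__(self, expected_keys):
        # The full universe of record keys expected for an observing run.
        self.expected = set(expected_keys)
        self.missing = set()          # only the complement is stored

    def mark_missing(self, key):
        self.missing.add(key)

    def present(self, key):
        # A record exists iff it is expected and not in the complement set.
        return key in self.expected and key not in self.missing

    def present_keys(self):
        # Derive the (large) set of existing records from the (small)
        # stored complement.
        return self.expected - self.missing

# Example: 10 expected frames for a run, 2 lost during observation.
idx = NegativeIndex(range(10))
idx.mark_missing(3)
idx.mark_missing(7)
```

When the missing set is tiny relative to the universe (as with a mostly complete observation stream), storage is proportional to what is absent rather than what is present, which is the intuition behind the reported storage savings.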

  3. Policy, practice and decision making for zoonotic disease management: water and Cryptosporidium.

    PubMed

    Austin, Zoë; Alcock, Ruth E; Christley, Robert M; Haygarth, Philip M; Heathwaite, A Louise; Latham, Sophia M; Mort, Maggie; Oliver, David M; Pickup, Roger; Wastling, Jonathan M; Wynne, Brian

    2012-04-01

Decision making for zoonotic disease management should be based on many forms of appropriate data and sources of evidence. However, the criteria and timing for policy response and the resulting management decisions are often altered when a disease outbreak occurs and captures full media attention. In the case of waterborne disease caused by the robust protozoan Cryptosporidium spp., exposure can pose significant human health risks, and preventing exposure by maintaining high standards of biological and chemical water quality remains a priority for water companies in the UK. Little has been documented on how knowledge and information are translated between the many stakeholders involved in the management of Cryptosporidium, which is surprising given the different drivers that have shaped management decisions. Such information, coupled with the uncertainties that surround these data, is essential for improving future management strategies that minimise disease outbreaks. Here, we examine the interplay between scientific information, the media, and emergent government and company policies using qualitative and quantitative data relating to Cryptosporidium management decisions by a water company in the North West of England. Our results show that political and media influences are powerful drivers of management decisions if fuelled by high profile outbreaks. Furthermore, the strength of the scientific evidence is often constrained by uncertainties in the data, and in the way knowledge is translated between policy levels during established risk management procedures. In particular, under- or over-estimating risk during risk assessment procedures, together with uncertainty regarding risk factors in the wider environment, was found to restrict the knowledge base for decision-making in Cryptosporidium management.
Our findings highlight some key current and future challenges facing the management of such diseases that are widely applicable to other risk management situations. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Guide to data collection

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Guidelines and recommendations are presented for the collection of software development data. Motivation and planning for, and implementation and management of, a data collection effort are discussed. Topics covered include types, sources, and availability of data; methods and costs of data collection; types of analyses supported; and warnings and suggestions based on software engineering laboratory (SEL) experiences. This document is intended as a practical guide for software managers and engineers, abstracted and generalized from 5 years of SEL data collection.

  5. VoCATS User Guide. [Draft].

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Vocational Education Services.

    This guide focuses on use of the North Carolina Vocational Competency Achievement Tracking System (VoCATS)-designated software in the instructional management process. (VoCATS is a competency-based, computer-based instructional management system that allows the collection of data on student performance achievement prior to, during, and following…

  6. A system to build distributed multivariate models and manage disparate data sharing policies: implementation in the scalable national network for effectiveness research.

    PubMed

    Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila

    2015-11-01

    Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. 
Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage requirements to participate in sophisticated analyses based on federated research networks. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
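
A hedged sketch of the general idea behind distributed multivariate estimation without patient-level data transport (a conceptual illustration, not the SCANNER implementation): each site computes the gradient of the logistic-regression log-likelihood on its own rows and shares only that aggregate, and because summed site gradients equal the pooled gradient, gradient descent converges to the same model as a centralized fit. The site data below are synthetic and the coordinator loop is simplified.

```python
# Distributed (federated) logistic regression via gradient aggregation:
# only per-site gradient vectors cross site boundaries, never patient rows.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def site_gradient(X, y, beta):
    """Gradient of the negative log-likelihood on one site's local data."""
    return X.T @ (sigmoid(X @ beta) - y)

def federated_fit(sites, n_features, lr=0.5, iters=1000):
    beta = np.zeros(n_features)
    for _ in range(iters):
        # Coordinator sums the site gradients (the only shared quantity).
        grad = sum(site_gradient(X, y, beta) for X, y in sites)
        n = sum(len(y) for _, y in sites)
        beta -= lr * grad / n
    return beta

# Two hypothetical sites with synthetic data from the same true model.
rng = np.random.default_rng(0)
true_beta = np.array([1.0, -2.0])

def make_site(n):
    X = rng.normal(size=(n, 2))
    y = (rng.random(n) < sigmoid(X @ true_beta)).astype(float)
    return X, y

sites = [make_site(400), make_site(600)]
beta_hat = federated_fit(sites, 2)
```

Real deployments layer policy enforcement, secure transport, and iterative protocols for other model families on top of this pattern; the sketch shows only why patient-level pooling is unnecessary for the estimation itself.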

  7. Designing Extensible Data Management for Ocean Observatories, Platforms, and Devices

    NASA Astrophysics Data System (ADS)

    Graybeal, J.; Gomes, K.; McCann, M.; Schlining, B.; Schramm, R.; Wilkin, D.

    2002-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) has been collecting science data for 15 years from all kinds of oceanographic instruments and systems, and is building a next-generation observing system, the MBARI Ocean Observing System (MOOS). To meet the data management requirements of the MOOS, the Institute began developing a flexible, extensible data management solution, the Shore Side Data System (SSDS). This data management system must address a wide variety of oceanographic instruments and data sources, including instruments and platforms of the future. Our data management solution will address all elements of the data management challenge, from ingest (including suitable pre-definition of metadata) through to access and visualization. Key to its success will be ease of use, and automatic incorporation of new data streams and data sets. The data will be of many different forms, and come from many different types of instruments. Instruments will be designed for fixed locations (as with moorings), changing locations (drifters and AUVs), and cruise-based sampling. Data from airplanes, satellites, models, and external archives must also be considered. Providing an architecture which allows data from these varied sources to be automatically archived and processed, yet readily accessed, is only possible with the best practices in metadata definition, software design, and re-use of third-party components. The current status of SSDS development will be presented, including lessons learned from our science users and from previous data management designs.

  8. What Health Service Provider Factors Are Associated with Low Delivery of HIV Testing to Children with Acute Malnutrition in Dowa District of Malawi?

    PubMed

    Chitete, Lusungu; Puoane, Thandi

    2015-01-01

The Community-based Management of Acute Malnutrition is the national program for treating acute malnutrition in Malawi. Under this program's guidelines all children enrolled should undergo an HIV test, so that those infected can receive appropriate treatment and care. However, the national data of 2012 show a low delivery of testing. Prior studies have investigated client-related factors affecting uptake of HIV testing in the Community-based Management of Acute Malnutrition program; lacking is information on the service provider factors associated with the delivery of testing. This study investigated service provider factors that affect delivery of HIV testing among children enrolled in the program and explored ways in which this could be improved. This was a descriptive study that used qualitative methods of data collection. Client registers were reviewed to obtain the number of children enrolled in Community-based Management of Acute Malnutrition and the number of children who were tested for HIV over a 12-month period. In-depth interviews were conducted with Community-based Management of Acute Malnutrition and HIV Testing and Counselling focal persons to investigate factors affecting HIV test delivery. Descriptive statistics were used to analyze data from client registers. Information from interviews was analyzed using a thematic approach. Quantitative data revealed that 1738 (58%) of 2981 children enrolled in Community-based Management of Acute Malnutrition were tested for HIV. From in-depth interviews four themes emerged: lack of resources for HIV tests; shortage of staff skilled in HIV testing and counseling; lack of commitment among staff in referring children for HIV testing; and inadequately trained staff. There is a need for a functioning health system to help reduce child mortality resulting from HIV related conditions.

  9. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    PubMed Central

    2011-01-01

    Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. 
The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. Conclusions Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making. PMID:21214905
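
The scenario-adaptation step described above can be sketched as simple template substitution (the field names and scenario wording below are hypothetical; the paper's actual templates and information-system schema are not shown here): cues in a generic scenario are filled in from a facility's OR information system so that rooms and procedures are ones the local managers actually recognize.

```python
# Minimal sketch of adapting a generic OR-management scenario with
# facility-specific cues drawn from an information system.
from string import Template

scenario = Template(
    "At 1:30 PM, the case in $room ($procedure) is running 2 hours late. "
    "Should the following case be moved to another OR?"
)

# One row as it might come from the facility's case-scheduling tables
# (values invented for illustration).
facility_case = {"room": "OR 1", "procedure": "laparoscopic cholecystectomy"}

adapted = scenario.substitute(facility_case)
```

In practice the adaptation is driven by queries against the OR information system or AIMS, so that each of the adapted scenarios reflects procedures actually performed in the named room.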

  10. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data.

    PubMed

    Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H

    2011-01-07

    No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. 
Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making.

  11. U.S. Geological Survey community for data integration: data upload, registry, and access tool

    USGS Publications Warehouse

    ,

    2012-01-01

As a leading science and information agency and in fulfillment of its mission to provide reliable scientific information to describe and understand the Earth, the U.S. Geological Survey (USGS) ensures that all scientific data are effectively hosted, adequately described, and appropriately accessible to scientists, collaborators, and the general public. To succeed in this task, the USGS established the Community for Data Integration (CDI) to address data and information management issues affecting the proficiency of earth science research. Through the CDI, the USGS is providing data and metadata management tools, cyberinfrastructure, collaboration tools, and training in support of scientists and technology specialists throughout the project life cycle. One of the significant tools recently created to contribute to this mission is the Uploader tool. This tool allows scientists with limited data management resources to address many of the key aspects of the data life cycle: the ability to protect, preserve, publish, and share data. By implementing this application inside ScienceBase, scientists also can take advantage of other collaboration capabilities provided by the ScienceBase platform.

  12. Decision support tools to support the operations of traffic management centers (TMC)

    DOT National Transportation Integrated Search

    2011-01-31

    The goal of this project is to develop decision support tools to support traffic management operations based on collected intelligent transportation system (ITS) data. The project developments are in accordance with the needs of traffic management ce...

  13. Development of bilateral data transferability in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2006-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...

  14. Family vs Village-Based: Intangible View on the Sustainable of Seaweed Farming

    NASA Astrophysics Data System (ADS)

    Teniwut, Wellem A.; Teniwut, Yuliana K.; Teniwut, Roberto M. K.; Hasyim, Cawalinya L.

    2017-10-01

Compared to other fishery activities, such as fish mariculture and capture fisheries, seaweed farming is considered easier, and the market for seaweed is wider and continues to grow. This makes seaweed farming one of the fastest routes to improving the welfare of a coastal community. Seaweed farming management involves both technical and non-technical factors, and the non-technical, intangible factors vary between family-based and village-based management. The aim of this study was therefore to model farmers' decisions to choose between family-based and village-based seaweed management systems as triggered by intangible factors. We conducted the study in Southeast Maluku, collecting data from October to December 2016 through in-depth interviews and questionnaires with seaweed farmers, and used logistic regression to compare the intangible factors in family-based and village-based seaweed farming management. The results showed that in family-based management, farmers were willing to transfer their knowledge among members of the household. For village-based management, farmers with higher educational backgrounds tended to participate, members were those with better capabilities and skills, and the probability of conflict was smaller than in family-based management.

  15. [A Study on the Classification of Nursing Management Competencies and Development of related Behavioral Indicators in Hospitals].

    PubMed

    Kim, Seong Yeol; Kim, Jong Kyung

    2016-06-01

The aim of this study was to classify nursing management competencies and develop behavioral indicators for nurse managers in hospitals. Also, levels of importance and performance based on the developed criteria were to be identified and compared. Using an expert survey, we classified nursing management competencies and behavioral indicators with data from 34 nurse managers and professors. Subsequently, data from a survey of 216 nurse managers in 7 cities were used to analyze the importance-performance comparison of the classified nursing management competencies and behavioral indicators. Forty-two nursing management competencies were identified together with 181 behavioral indicators. The mean score for importance of nursing management competency was higher than the mean score for performance. According to the importance-performance analysis, 5 of the 42 nursing management competencies require further development: vision-building, analysis, change management, human resource development, and self-management competency. The classification of nursing management competencies and behavioral indicators for nurse managers in hospitals provides basic data for the development and evaluation of programs designed to increase the competency of nurse managers in hospitals.

  16. Early Training Estimation System (ETES). Appendix F. User’s Guide

    DTIC Science & Technology

    1984-06-01

    [Extraction residue from the report's table of contents: Related to Early Training Estimation; Organizations Interviewed During Task 1; Potential Problem Solving Aids; Task Deletion.] ...tasks are available, only the training program elements must be estimated. Thus, by adding comparability analysis procedures to SDT data base management... data base management capabilities of the SDT, and (3) conduct trade-off studies of proposed solutions to identified training problems.

  17. Research and Design of Embedded Wireless Meal Ordering System Based on SQLite

    NASA Astrophysics Data System (ADS)

    Zhang, Jihong; Chen, Xiaoquan

    The paper describes the features, internal architecture, and development method of SQLite, and then presents the design and implementation of a meal ordering system. The system realizes information interaction between users and embedded devices, with SQLite as the database system: the embedded SQLite database manages the data, and wireless communication is achieved using Bluetooth. A system program based on Qt/Embedded and Linux drivers provides local management of the environmental data.
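
As a hedged illustration of using SQLite as an embedded application database (the table and column names below are invented, not taken from the paper, and Python's `sqlite3` module stands in for the C API an embedded device would likely use), the sketch creates an orders table, records orders, and queries the pending orders for one table:

```python
# SQLite as an embedded, serverless database: create, insert, query.
import sqlite3

conn = sqlite3.connect(":memory:")   # on a device this would be a file path
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, table_no INTEGER, "
    "dish TEXT, qty INTEGER)"
)
conn.executemany(
    "INSERT INTO orders (table_no, dish, qty) VALUES (?, ?, ?)",
    [(3, "fried rice", 2), (3, "soup", 1), (5, "noodles", 1)],
)
conn.commit()

# Fetch the pending orders for table 3, as a terminal might on request.
rows = conn.execute(
    "SELECT dish, qty FROM orders WHERE table_no = ? ORDER BY id", (3,)
).fetchall()
```

Because SQLite stores the whole database in a single file and needs no server process, this pattern fits resource-constrained embedded systems like the one described.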

  18. Medical informatics in medical research - the Severe Malaria in African Children (SMAC) Network's experience.

    PubMed

    Olola, C H O; Missinou, M A; Issifou, S; Anane-Sarpong, E; Abubakar, I; Gandi, J N; Chagomerana, M; Pinder, M; Agbenyega, T; Kremsner, P G; Newton, C R J C; Wypij, D; Taylor, T E

    2006-01-01

Computers are widely used for data management in clinical trials in developed countries, unlike in developing countries. Dependable systems are vital for data management and medical decision making in clinical research, and monitoring and evaluation of data management is critical. In this paper we describe the database structures and procedures of systems used to implement, coordinate, and sustain data management in Africa. We outline the major lessons, challenges, and successes, and offer recommendations to improve the application of medical informatics in biomedical research in sub-Saharan Africa. A consortium of research units at five sites in Africa, experienced in studying children with severe illness, formed a new clinical trials network, Severe Malaria in African Children. In December 2000, the network introduced an observational study involving these hospital-based sites. After prototyping, relational database management systems were implemented for data entry and verification, data submission, and quality assurance monitoring. Between 2000 and 2005, 25,858 patients were enrolled. Failure to meet data submission deadlines and data entry errors correlated positively (correlation coefficient, r = 0.82), with more errors occurring when data were submitted late. Data submission lateness correlated inversely with hospital admissions (r = -0.62). Developing and sustaining a dependable DBMS, with ongoing modifications to optimize data management, is crucial for clinical studies. Monitoring and communication systems are vital in multi-center networks for good data management. Data timeliness is associated with data quality and hospital admissions.
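
The monitoring metric reported above is a Pearson correlation between submission lateness and entry-error counts. A minimal sketch of that computation (the numbers below are invented for illustration; the paper reports r = 0.82 on the network's real monitoring data):

```python
# Pearson correlation coefficient between two monitoring series,
# computed from the definition (covariance over product of std. devs.).
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

days_late = [0, 2, 5, 1, 9, 4]         # hypothetical per-batch lateness
entry_errors = [3, 4, 11, 2, 15, 8]    # hypothetical error counts

r = pearson_r(days_late, entry_errors)
```

Tracking such correlations over time is one way a coordinating center can detect when late submissions are degrading data quality.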

  19. Ubiquitous Mobile Educational Data Management by Teachers, Students and Parents: Does Technology Change School-Family Communication and Parental Involvement?

    ERIC Educational Resources Information Center

    Blau, Ina; Hameiri, Mira

    2017-01-01

    Digital educational data management has become an integral part of school practices. Accessing school database by teachers, students, and parents from mobile devices promotes data-driven educational interactions based on real-time information. This paper analyses mobile access of educational database in a large sample of 429 schools during an…

  20. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.
