Science.gov

Sample records for active database management

  1. Database Manager

    ERIC Educational Resources Information Center

    Martin, Andrew

    2010-01-01

    It is normal practice today for organizations to store large quantities of records of related information as computer-based files or databases. Purposeful information is retrieved by performing queries on the data sets. The purpose of DATABASE MANAGER is to communicate to students the method by which the computer performs these queries. This…

  2. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  3. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  4. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  5. Requirements Management Database

    2009-08-13

    This application is a simplified and customized version of the RBA and CTS databases. It captures federal, site, and facility requirements and links them to the actions that must be performed to maintain compliance with contractual and other requirements.

  6. Use of database management software in a management information system.

    PubMed

    Sullivan, M

    1991-01-01

    If information is to play a useful role in the decision making process, an organization must design a system for capturing, organizing and presenting information from daily activities to its managers. Use of relational database management software to design a custom application as part of a management information system is both practical and cost effective. Designing a database management application is very challenging but well worth the effort, asserts the author of this case study. PMID:10119686

  7. REFEREE: BIBLIOGRAPHIC DATABASE MANAGER, DOCUMENTATION

    EPA Science Inventory

    The publication is the user's manual for 3.xx releases of REFEREE, a general-purpose bibliographic database management program for IBM-compatible microcomputers. The REFEREE software also is available from NTIS. The manual has two main sections--Quick Tour and References Guide--a...

  8. Interconnecting heterogeneous database management systems

    NASA Technical Reports Server (NTRS)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  9. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  10. Database management system for instrument data management

    SciTech Connect

    Tatum, C.P.

    1990-01-01

    Data from many measuring devices throughout the Savannah River Site (SRS) is transmitted to a central location for processing as a vital component in the SRS emergency preparedness and response program. The data processing is currently accomplished using VAX-based FORTRAN programs with the data stored in Digital's Record Management System (RMS) files, which are shared using global COMMON. A program is underway to store and process this data using a Structured Query Language (SQL)-based Database Management System (DBMS). The advantages of replacing the current system with one using an SQL-based DBMS are discussed.
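    The migration this abstract describes, from record-oriented RMS files read by custom FORTRAN programs to an SQL-queryable store, can be sketched briefly. The schema and instrument IDs below are invented for illustration, with SQLite standing in for the production DBMS:

```python
import sqlite3

# In-memory SQLite stands in for the site-wide SQL DBMS; the table
# layout and instrument IDs are illustrative, not the actual SRS design.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        instrument_id TEXT NOT NULL,
        measured_at   TEXT NOT NULL,   -- ISO-8601 timestamp
        value         REAL NOT NULL
    )
""")

# Archive a few instrument readings (formerly RMS records).
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
    ("RAD-01", "1990-06-01T12:00:00", 0.12),
    ("RAD-01", "1990-06-01T13:00:00", 0.15),
    ("RAD-02", "1990-06-01T12:00:00", 0.09),
])

# One ad hoc SQL query replaces a purpose-built retrieval program.
results = sorted(conn.execute(
    "SELECT instrument_id, MAX(value) FROM readings GROUP BY instrument_id"
).fetchall())
print(results)  # [('RAD-01', 0.15), ('RAD-02', 0.09)]
```

    The advantage the abstract alludes to is visible even at this scale: retrieval logic becomes a declarative query rather than compiled code tied to the file layout.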

  11. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  12. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  13. AGRICULTURAL BEST MANAGEMENT PRACTICE EFFECTIVENESS DATABASE

    EPA Science Inventory

    Resource Purpose:The Agricultural Best Management Practice Effectiveness Database contains the results of research projects which have collected water quality data for the purpose of determining the effectiveness of agricultural management practices in reducing pollutants ...

  14. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  15. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification Ranking Table (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data - Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.
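    The six "custom views" of risk data mentioned above are, in relational terms, stored queries over the risk register. A minimal sketch of one such view (a hypothetical schema, with SQLite standing in for the RMS database) might look like:

```python
import sqlite3

# Hypothetical miniature of a project risk register; the real NGNP
# RMS schema is not public, so all names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE risks (risk_id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE tasks (
        task_id INTEGER PRIMARY KEY,
        risk_id INTEGER REFERENCES risks(risk_id),
        phase   TEXT,
        status  TEXT
    );
    -- A custom view such as "Tasks by Status" is just a stored query.
    CREATE VIEW tasks_by_status AS
        SELECT status, COUNT(*) AS n_tasks
        FROM tasks GROUP BY status ORDER BY status;
""")
conn.execute("INSERT INTO risks VALUES (1, 'Fuel qualification delay')")
conn.executemany("INSERT INTO tasks VALUES (?, 1, ?, ?)", [
    (1, 'Conceptual', 'Open'),
    (2, 'Conceptual', 'Closed'),
    (3, 'Preliminary', 'Open'),
])
summary = conn.execute("SELECT * FROM tasks_by_status").fetchall()
print(summary)  # [('Closed', 1), ('Open', 2)]
```

    Each of the other views (Tasks by Project Phase, Tasks by Impact/WBS, and so on) would follow the same pattern with a different grouping.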

  16. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  17. Central Asia Active Fault Database

    NASA Astrophysics Data System (ADS)

    Mohadjer, Solmaz; Ehlers, Todd A.; Kakar, Najibullah

    2014-05-01

    The ongoing collision of the Indian subcontinent with Asia controls active tectonics and seismicity in Central Asia. This motion is accommodated by faults that have historically caused devastating earthquakes and continue to pose serious threats to the population at risk. Despite international and regional efforts to assess seismic hazards in Central Asia, little attention has been given to development of a comprehensive database for active faults in the region. To address this issue and to better understand the distribution and level of seismic hazard in Central Asia, we are developing a publicly available database for active faults of Central Asia (including but not limited to Afghanistan, Tajikistan, Kyrgyzstan, northern Pakistan and western China) using ArcGIS. The database is designed to allow users to store, map and query important fault parameters such as fault location, displacement history, rate of movement, and other data relevant to seismic hazard studies including fault trench locations, geochronology constraints, and seismic studies. Data sources integrated into the database include previously published maps and scientific investigations as well as strain rate measurements and historic and recent seismicity. In addition, high resolution Quickbird, Spot, and Aster imagery are used for selected features to locate and measure offset of landforms associated with Quaternary faulting. These features are individually digitized and linked to attribute tables that provide a description for each feature. Preliminary observations include inconsistent and sometimes inaccurate information for faults documented in different studies. For example, the Darvaz-Karakul fault, which roughly defines the western margin of the Pamir, has been mapped with differences in location of up to 12 kilometers. The sense of motion for this fault ranges from unknown to thrust and strike-slip in three different studies despite documented left-lateral displacements of Holocene and late
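    The attribute-table queries this abstract describes can be sketched in a few lines. The fault records and values below are invented placeholders, not data from the actual Central Asia database:

```python
# Each digitized fault feature carries an attribute table; these
# records are illustrative stand-ins for the real GIS attributes.
faults = [
    {"name": "Fault A", "slip_sense": "left-lateral", "slip_rate_mm_yr": 10.0},
    {"name": "Fault B", "slip_sense": "thrust",       "slip_rate_mm_yr": 6.0},
    {"name": "Fault C", "slip_sense": "right-lateral", "slip_rate_mm_yr": 8.0},
]

def query(records, **criteria):
    """Return records whose attributes match every given criterion,
    mimicking an attribute-table query against the fault database."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

thrust_faults = [f["name"] for f in query(faults, slip_sense="thrust")]
print(thrust_faults)  # ['Fault B']
```

    In the real database the same filter would run against ArcGIS attribute tables rather than Python dictionaries, but the query model is the same.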

  18. Management of virtualized infrastructure for physics databases

    NASA Astrophysics Data System (ADS)

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-12-01

    Demands for information storage of physics metadata are rapidly increasing together with the requirements for its high availability. Most of the HEP laboratories are struggling to squeeze more from their computer centers, thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualization solutions for database and middle tier together with the associated management applications - Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on the newest Oracle OVM 3.0 for x86 and Oracle Enterprise Manager functionality for efficiently managing your virtualized database infrastructure.

  19. Using Online Databases in Corporate Issues Management.

    ERIC Educational Resources Information Center

    Thomsen, Steven R.

    1995-01-01

    Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)

  20. MANAGING LARGE DATABASES WITH CUSTOMIZED SAS WINDOWS

    EPA Science Inventory

    This paper discusses the principles of database management through customized windows using SAS/AF, particularly PROC BUILD, to invoke interactive and batch processing of data entry, editing, updating, automatic report generation, and custom report generation functions, including...

  1. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
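    The cross-referencing step the abstract describes, joining orbital characteristics to ownership and status metadata on the NORAD catalog number, can be sketched with a plain dictionary join; the records below are hypothetical examples, and a dict stands in for the real Access database:

```python
# Two record sets keyed on the NORAD catalog number, the unique
# identifier SAM-D uses; all values here are illustrative samples.
orbital = {
    25544: {"period_min": 92.9, "inclination_deg": 51.6},
    43013: {"period_min": 101.0, "inclination_deg": 98.7},
}
metadata = {
    25544: {"status": "active", "category": "station"},
    43013: {"status": "active", "category": "weather"},
}

def lookup(norad_id):
    """Join orbital data and metadata on the NORAD identifier,
    returning one merged record per tracked object."""
    rec = {"norad_id": norad_id}
    rec.update(orbital.get(norad_id, {}))
    rec.update(metadata.get(norad_id, {}))
    return rec

print(lookup(25544)["category"])  # station
```

    In SAM-D the orbital side of this join is computed from two-line element sets rather than stored directly, but the keying on the NORAD number is the same.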

  2. The land management and operations database (LMOD)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for reference land management and operation reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  3. Requirements for a database management system

    SciTech Connect

    Lawrence, J.D.; McCarthy, J.

    1984-09-01

    This document discusses the requirements for a database management system that would satisfy the scientific needs of the Scientific Database Project. We give the major requirements of scientific data management, based on a system developed by Deutsch. Actual requirements, for each category, are identified as mandatory, important, and optional. Mandatory - we should not consider a DBMS unless it satisfies all mandatory requirements. Important - these requirements, while not as crucial as the mandatory ones, are important to the easy and convenient implementation and operation of a scientific database. Optional - such features are nice extras. We expect that the scientific database project can be implemented and operated in any DBMS that meets all of the mandatory and most of the important requirements.

  4. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a lethal threat and a research focus in network information security. New viruses keep emerging, the total number of viruses keeps growing, and virus classification is becoming increasingly complex. Because agencies capture samples at different times, virus naming cannot be unified. Although each agency maintains its own virus database, communication between agencies is lacking, virus information is incomplete, or only a small amount of sample information is available. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and complete the description of virus characteristics, and then presents a computer virus database design scheme that provides information integrity, storage security, and manageability.

  5. BIOSOLIDS DATABASE MANAGEMENT SYSTEM (BDMS)

    EPA Science Inventory

    Resource Purpose:see hard copy attachment "EPA's Biosolids Data Management System and Plans for Evaluating Biosolids Quality"
    Legislation/Enabling Authority:CWA Section 402
    Supported Program:OW, OWM, OECA, ORD, OSW, Regions 1-10, states, local facilitie...

  6. Design and management of energy databases

    SciTech Connect

    Groscurth, H.M.

    1995-07-01

    The planning of energy supply systems is getting more and more complex and involves increasing amounts of data. In the past, these data have been stored on the computer in simple text files, which were accessible via text editors. At present, powerful database management systems (DBMS) provide mature tools for handling data. However, these instruments are not yet widely used in the energy sector. Therefore, sample solutions for two standard problems of energy planning are presented, namely, the management of time series and of technology data. In addition, the database structure of the energy-optimization model ecco is discussed in detail.

  7. Organizing a breast cancer database: data management.

    PubMed

    Yi, Min; Hunt, Kelly K

    2016-06-01

    Developing and organizing a breast cancer database can provide data and serve as valuable research tools for those interested in the etiology, diagnosis, and treatment of cancer. Depending on the research setting, the quality of the data can be a major issue. Assuring that the data collection process does not contribute inaccuracies can help to assure the overall quality of subsequent analyses. Data management is work that involves the planning, development, implementation, and administration of systems for the acquisition, storage, and retrieval of data while protecting it by implementing high security levels. A properly designed database provides you with access to up-to-date, accurate information. Database design is an important component of application design. If you take the time to design your databases properly, you'll be rewarded with a solid application foundation on which you can build the rest of your application. PMID:27197511

  8. TREATABILITY DATABASE (NATIONAL RISK MANAGEMENT RESEARCH LABORATORY)

    EPA Science Inventory

    The National Risk Management Research Laboratory has developed and is continuing to expand a database on the effectiveness of proven treatment technologies in the removal/destruction of chemicals in various types of media, including water, wastewater, soil, debris, sludge, and se...

  9. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  10. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  11. Creative Classroom Assignment Through Database Management.

    ERIC Educational Resources Information Center

    Shah, Vivek; Bryant, Milton

    1987-01-01

    The Faculty Scheduling System (FSS), a database management system designed to give administrators the ability to schedule faculty in a fast and efficient manner is described. The FSS, developed using dBASE III, requires an IBM compatible microcomputer with a minimum of 256K memory. (MLW)

  12. A Database Management System for Interlibrary Loan.

    ERIC Educational Resources Information Center

    Chang, Amy

    1990-01-01

    Discusses the increasing complexity of dealing with interlibrary loan requests and describes a database management system for interlibrary loans used at Texas Tech University. System functions are described, including file control, records maintenance, and report generation, and the impact on staff productivity is discussed. (CLB)

  13. Six database management systems for the Macintosh

    SciTech Connect

    Wheatley, M.R.; Rock, N.M.S.

    1989-12-01

    Six multi-file database management packages (DBMs) for the Apple Macintosh are reviewed. The reviews of Fourth Dimension, dBase Mac, FoxBase +/Mac, Omnis 3+, Omnis 5, and Oracle are based on several months of extensive testing, including actual performance with geological databases up to 4 megabytes. The MacDBMs range from those thoroughly implementing the Mac philosophy of user-friendliness (e.g. 4th Dimension) to variably Macified versions of established mainframe and PC DBMs, such as Oracle, FoxBase, and dBase. Although most of these DBMs are billed as relational, only Macintosh Oracle strictly obeys this condition. Mac users have a wide choice, from simple flat-file DBMs (MS File, etc.) to complex packages capable of interacting with many users, many files, and external mainframe or PC databases.

  14. Producing an Index with Your Microcomputer Database Manager.

    ERIC Educational Resources Information Center

    Jonassen, David

    1985-01-01

    Describes a procedure for using commonly available database management systems to produce indexes on microcomputers. Production steps discussed include creation of the database, data entry, database sort, formatting, and editing. (Author/MBR)

  15. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  16. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
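    The trigger-plus-event-log pattern this abstract describes can be sketched compactly. The table, column, and antenna names below are invented, and SQLite stands in for the Oracle RDBMS:

```python
import sqlite3

# Miniature of the SMDB pattern: a status change on a service request
# fires a trigger that appends to an event log for monitoring tools.
# Schema names are illustrative, not the actual DSN design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE service_requests (
        request_id INTEGER PRIMARY KEY,
        antenna    TEXT NOT NULL,
        status     TEXT NOT NULL DEFAULT 'scheduled'
    );
    CREATE TABLE event_log (
        event_id   INTEGER PRIMARY KEY AUTOINCREMENT,
        request_id INTEGER,
        message    TEXT
    );
    -- The trigger records every status change, as SMDB's integrated
    -- event logging does for real-time monitoring and later analysis.
    CREATE TRIGGER log_status_change
    AFTER UPDATE OF status ON service_requests
    BEGIN
        INSERT INTO event_log (request_id, message)
        VALUES (NEW.request_id,
                'status: ' || OLD.status || ' -> ' || NEW.status);
    END;
""")
conn.execute("INSERT INTO service_requests (request_id, antenna) VALUES (1, 'DSS-14')")
conn.execute("UPDATE service_requests SET status = 'configured' WHERE request_id = 1")
events = conn.execute("SELECT message FROM event_log").fetchall()
print(events)  # [('status: scheduled -> configured',)]
```

    The production system additionally publishes such events to a JMS messaging system; here the log table alone illustrates the database side of the design.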

  17. CONSOLIDATED HUMAN ACTIVITY DATABASE (CHAD) WEBSITE

    EPA Science Inventory

    The Consolidated Human Activity Database (CHAD) has been developed by ManTech Environmental for the Environmental Protection Agency's National Exposure Research Laboratory (NERL). This database was created to support exposure/intake dose/risk assessments. The overall design incor...

  18. Database activities at Brookhaven National Laboratory

    SciTech Connect

    Trahern, C.G.

    1995-12-01

    Brookhaven National Laboratory is a multi-disciplinary lab in the DOE system of research laboratories. Database activities are correspondingly diverse within the restrictions imposed by the dominant relational database paradigm. The authors discuss related activities and tools used in RHIC and in the other major projects at BNL. The others are the Protein Data Bank being maintained by the Chemistry department, and a Geographical Information System (GIS)--a Superfund sponsored environmental monitoring project under development in the Office of Environmental Restoration.

  19. Pre-Validated Signal Database Management System

    SciTech Connect

    Gross, Kenny C.

    1996-12-18

    SPRT/DBMS is a pre-validated experimental database management system for industries where large volumes of process signals are acquired and archived. This system implements a new and powerful pattern recognition method, the spectrum transformed sequential testing (STST or ST2) procedure. A network of interacting ST2 modules deployed in parallel is integrated with a relational DBMS to fully validate process signals as they are archived. This reliable, secure DBMS then provides system modelers, code developers, and safety analysts with an easily accessible source of fully validated process data.

  1. HGDBMS: a human genetics database management system.

    PubMed

    Seuchter, S A; Skolnick, M H

    1988-10-01

    Human genetics research involves a large number of complex data sets naturally organized in hierarchical structures. Data collection is performed on different levels, e.g., the project level, pedigree level, individual level, and sample level. Different aspects of a study utilize different views of the data, requiring a flexible database management system (DBMS) which satisfies these different needs for data collection and retrieval. We describe HGDBMS, a comprehensive relational DBMS, implemented as an application of the GENISYS I DBMS, which allows embedding the hierarchical structure of pedigrees in a relational structure. The system's file structure is described in detail. Currently our Melanoma and Chromosome 17 map studies are managed with HGDBMS. Our initial experience demonstrates the value of a flexible system which supports the needs for data entry, update, storage, reporting, and analysis required during different phases of genetic research. Further developments will focus on the integration of HGDBMS with a human genetics expert system shell and analysis programs. PMID:3180747
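    Embedding a hierarchical pedigree in a relational structure, as HGDBMS does on top of GENISYS I, can be sketched with a self-referencing table and a recursive query. The schema and the tiny pedigree below are invented for illustration, with SQLite standing in for the underlying DBMS:

```python
import sqlite3

# Self-referencing table: each individual optionally points to a
# father and mother row, embedding the pedigree hierarchy in a
# relational structure. All names and IDs are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE individuals (
        ind_id    INTEGER PRIMARY KEY,
        name      TEXT,
        father_id INTEGER REFERENCES individuals(ind_id),
        mother_id INTEGER REFERENCES individuals(ind_id)
    );
    INSERT INTO individuals VALUES
        (1, 'grandfather', NULL, NULL),
        (2, 'grandmother', NULL, NULL),
        (3, 'father', 1, 2),
        (4, 'child', 3, NULL);
""")
# A recursive query walks from one individual up through all
# recorded ancestors -- one "view" of the hierarchical data.
names = [row[0] for row in conn.execute("""
    WITH RECURSIVE
    parent_of(child, parent) AS (
        SELECT ind_id, father_id FROM individuals WHERE father_id IS NOT NULL
        UNION ALL
        SELECT ind_id, mother_id FROM individuals WHERE mother_id IS NOT NULL
    ),
    anc(ind_id) AS (
        SELECT parent FROM parent_of WHERE child = 4
        UNION
        SELECT parent_of.parent FROM parent_of
        JOIN anc ON parent_of.child = anc.ind_id
    )
    SELECT name FROM individuals JOIN anc USING (ind_id) ORDER BY ind_id
""").fetchall()]
print(names)  # ['grandfather', 'grandmother', 'father']
```

    The same self-referencing design supports the other views the abstract mentions, such as collecting all members of a pedigree or all samples below a given study level.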

  2. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  3. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  4. Active Design Database (ADDB) user's manual

    SciTech Connect

    Schwarz, R.L.; Nations, J.A.; Rosser, J.H.

    1991-02-01

    This manual is a guide to the Active Design Database (ADDB) on the Martin Marietta Energy Systems, Inc., IBM 3084 unclassified computer. The ADDB is an index to all CADAM models in the unclassified CADAM database and provides query and report capabilities. Section 2.0 of this manual presents an overview of the ADDB, describing the system's purpose; the functions it performs; hardware, software, and security requirements; and help and error functions. Section 3.0 describes how to access the system and how to operate the system functions using Database 2 (DB2), Time Sharing Option (TSO), and Interactive System Productivity Facility (ISPF) features employed by this system. Appendix A contains a dictionary of data elements maintained by the system. The data values are collected from the unclassified CADAM database. Appendix B provides a printout of the system help and error screens.

  5. Practical considerations in the management of large multiinstitutional databases.

    PubMed

    Edwards, F H; Clark, R E; Schwartz, M

    1994-12-01

    Large multiinstitutional databases are excellent sources of information that provide clinically useful insight into the practice of cardiac surgery. Fully informed subscribers should be aware of the practical concerns associated with the management and interpretation of database results. During development of The Society of Thoracic Surgeons National Database, three such areas have become particularly important: the database population, the database quality, and the significance of results. Appreciation of the real and philosophical problems associated with these issues will allow for greater appreciation of the intricacies of the database and will enhance the users' ability to interpret information gained from the database. PMID:7979779

  6. Integrated Space Asset Management Database and Modeling

    NASA Astrophysics Data System (ADS)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interfaces for visualization. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a government off-the-shelf information-sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data is shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  7. Resident database interfaces to the DAVID system, a heterogeneous distributed database management system

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha

    1988-01-01

    A methodology for building interfaces of resident database management systems to a heterogeneous distributed database management system under development at NASA, the DAVID system, was developed. The feasibility of that methodology was demonstrated by construction of the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented. The work performed and the results are summarized.

  8. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C., Jr.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
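The SQL transaction handling described above, used to keep a database like the KFD consistent, can be sketched as follows. SQLite serves as a stand-in DBMS, and the `karst_features` table is a hypothetical simplification of the real schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE karst_features (id INTEGER PRIMARY KEY, name TEXT, type TEXT)")

# Wrap related changes in one transaction so the database stays consistent:
# either all inserts commit together, or none of them do.
try:
    with conn:  # commits on success, rolls back on exception
        conn.execute("INSERT INTO karst_features VALUES (1, 'Mystery Cave', 'cave')")
        conn.execute("INSERT INTO karst_features VALUES (1, 'Duplicate', 'sinkhole')")  # violates PK
except sqlite3.IntegrityError:
    pass  # the DBMS rejected the batch; nothing was left half-applied

count = conn.execute("SELECT COUNT(*) FROM karst_features").fetchone()[0]
print(count)  # 0: the failed transaction was rolled back in full
```

The same all-or-nothing guarantee, together with logs and regular backups, is what makes recovery of a working database possible after a failure.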

  9. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  10. Database and Related Activities in Japan

    NASA Astrophysics Data System (ADS)

    Murakami, Izumi; Kato, Daiji; Kato, Masatoshi; Sakaue, Hiroyuki A.; Kato, Takako; Ding, Xiaobin; Morita, Shigeru; Kitajima, Masashi; Koike, Fumihiro; Nakamura, Nobuyuki; Sakamoto, Naoki; Sasaki, Akira; Skobelev, Igor; Tsuchida, Hidetsugu; Ulantsev, Artemiy; Watanabe, Tetsuya; Yamamoto, Norimasa

    2011-05-01

    We have constructed and made available atomic and molecular (AM) numerical databases on collision processes such as electron-impact excitation and ionization, recombination and charge transfer of atoms and molecules relevant for plasma physics, fusion research, astrophysics, applied-science plasma, and other related areas. The retrievable data is freely accessible via the internet. We also work on atomic data evaluation and constructing collisional-radiative models for spectroscopic plasma diagnostics. Recently we have worked on Fe ions and W ions theoretically and experimentally. The atomic data and collisional-radiative models for these ions are examined and applied to laboratory plasmas. A visible M1 transition of W26+ ion is identified at 389.41 nm by EBIT experiments and theoretical calculations. We have small non-retrievable databases in addition to our main database. Recently we evaluated photo-absorption cross sections for 9 atoms and 23 molecules and we present them as a new database. We established a new association "Forum of Atomic and Molecular Data and Their Applications" to exchange information among AM data producers, data providers and data users in Japan and we hope this will help to encourage AM data activities in Japan.

  11. An authoritative global database for active submarine hydrothermal vent fields

    NASA Astrophysics Data System (ADS)

    Beaulieu, Stace E.; Baker, Edward T.; German, Christopher R.; Maffei, Andrew

    2013-11-01

    The InterRidge Vents Database is available online as the authoritative reference for locations of active submarine hydrothermal vent fields. Here we describe the revision of the database to an open source content management system and conduct a meta-analysis of the global distribution of known active vent fields. The number of known active vent fields has almost doubled in the past decade (521 as of year 2009), with about half visually confirmed and others inferred active from physical and chemical clues. Although previously known mainly from mid-ocean ridges (MORs), active vent fields at MORs now comprise only half of the total known, with about a quarter each now known at volcanic arcs and back-arc spreading centers. Discoveries in arc and back-arc settings resulted in an increase in known vent fields within exclusive economic zones, consequently reducing the proportion known in high seas to one third. The increase in known vent fields reflects a number of factors, including increased national and commercial interests in seafloor hydrothermal deposits as mineral resources. The purpose of the database now extends beyond academic research and education and into marine policy and management, with at least 18% of known vent fields in areas granted or pending applications for mineral prospecting and 8% in marine protected areas.

  12. Selecting a PC database management system for health physics applications

    SciTech Connect

    Slaback, L.A.; Webber, W.R.

    1987-01-01

    An integrated system of data management is a necessity for the variety and volume of data encountered in many health physics programs. A Personal Computer (PC) Database Management System (DBMS) can fill these data management needs if it is designed and constructed properly. This article presents a suggested approach to PC database design and outlines the specific features that should be examined when choosing DBMS software. This approach was used to set up a health physics database system at the National Bureau of Standards in 1985. The NBS system is described, and an example of dosimetry data entry is used to illustrate how the system works.

  14. The role of databases in areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A database is a comprehensive collection of related data organized for convenient access, generally in a computer. The evolution of computer software and the need to distinguish the specialized computer systems for storing and manipulating data, stimulated development of database management systems...

  15. Teaching Database Management System Use in a Library School Curriculum.

    ERIC Educational Resources Information Center

    Cooper, Michael D.

    1985-01-01

    Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…

  16. Improving Recall Using Database Management Systems: A Learning Strategy.

    ERIC Educational Resources Information Center

    Jonassen, David H.

    1986-01-01

    Describes the use of microcomputer database management systems to facilitate the instructional uses of learning strategies relating to information processing skills, especially recall. Two learning strategies, cross-classification matrixing and node acquisition and integration, are highlighted. (Author/LRW)

  17. NATIONAL DATABASE ON ENVIRONMENTAL MANAGEMENT (NDEMS)

    EPA Science Inventory

    Resource Purpose:Provides data on the environmental and economic performance of 70-80 facilities nationwide that are implementing EMSs. Database included data on historical compliance and other environmental performance indicators as a "baseline" from which to measure futu...

  18. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  19. High Performance Database Management for Earth Sciences

    NASA Technical Reports Server (NTRS)

    Rishe, Naphtali; Barton, David; Urban, Frank; Chekmasov, Maxim; Martinez, Maria; Alvarez, Elms; Gutierrez, Martha; Pardo, Philippe

    1998-01-01

    The High Performance Database Research Center at Florida International University is completing the development of a highly parallel database system based on the semantic/object-oriented approach. This system provides exceptional usability and flexibility. It allows shorter application design and programming cycles and gives the user control via an intuitive information structure. It empowers the end-user to pose complex ad hoc decision support queries. Superior efficiency is provided through a high level of optimization, which is transparent to the user. Manifold reduction in storage size is allowed for many applications. This system allows for operability via internet browsers. The system will be used for the NASA Applications Center program to store remote sensing data, as well as for Earth Science applications.

  20. EPILEPSY AND EDUCATION AND PREVENTION ACTIVITIES (EP) DATABASE

    EPA Science Inventory

    This database contains entries that focus on epilepsy education and prevention emphasizing the application of effective early detection and control program activities and risk reduction efforts. The database provides bibliographic citations and abstracts of various types of mater...

  1. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164
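The active-database mechanism described above, a trigger that detects and reacts to an event entirely inside the DBMS, can be sketched as follows. SQLite stands in for the platform's DBMS; the sensor and alert tables and the bed-exit rule are hypothetical simplifications of the AAL services.

```python
import sqlite3

# In-memory database standing in for the AAL platform's DBMS.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sensor_events (sensor TEXT, state TEXT, ts TEXT);
CREATE TABLE alerts (message TEXT, ts TEXT);

-- Active-database rule: react inside the DBMS when the bed sensor releases.
CREATE TRIGGER bed_exit AFTER INSERT ON sensor_events
WHEN NEW.sensor = 'bed' AND NEW.state = 'released'
BEGIN
    INSERT INTO alerts VALUES ('possible bed exit', NEW.ts);
END;
""")

# The reaction happens inside the database, not in application code.
conn.execute("INSERT INTO sensor_events VALUES ('bed', 'released', '02:14')")
alerts = conn.execute("SELECT message FROM alerts").fetchall()
print(alerts)  # [('possible bed exit',)]
```

Because the rule fires within the DBMS, the raw sensor data never has to leave the database for this detection to occur, which is the privacy and performance argument the paper makes.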

  2. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  3. Development of a Relational Database for Learning Management Systems

    ERIC Educational Resources Information Center

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-based Distance Education Systems, usually known as Learning Management Systems (LMS), have great importance. In this article, a database design developed to serve an educational institution as a Learning Management System is described. In this sense, developed Learning…

  4. Expansion of the MANAGE database with forest and drainage studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  5. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists that take advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscience investigations have led toward building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  6. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  7. Creating a benefit management information database.

    PubMed

    Martin, R D

    1998-09-01

    In this case study, a real-life information technology conundrum is fictionalized to illustrate the potential benefits of retaining corporate knowledge through the creation of an information repository in a managed health care plan. PMID:10187587

  8. TRENDS: The aeronautical post-test database management system

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test database are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  9. Evidence generation from healthcare databases: recommendations for managing change.

    PubMed

    Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C

    2016-07-01

    There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time so it is vital to consider the impact of changes in data, access methodology and the environment. The authors discuss change in communication and management, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27183900

  10. DOE technology information management system database study report

    SciTech Connect

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  11. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 70's E.F. Codd, an IBM research computer scientists, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that now made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and who required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.
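Codd's central idea as summarized above, flat tables with no physical or internal links, related only through matching values at query time, can be illustrated with a minimal sketch. The tables and data are hypothetical, and SQLite is used purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Two flat tables: no pointers or physical links between them.
CREATE TABLE departments (dept_id INTEGER, name TEXT);
CREATE TABLE employees (emp_id INTEGER, name TEXT, dept_id INTEGER);
INSERT INTO departments VALUES (1, 'Research'), (2, 'Operations');
INSERT INTO employees VALUES (10, 'Codd', 1), (11, 'Date', 1);
""")

# The relationship exists only at query time, expressed by matching values.
rows = conn.execute("""
    SELECT e.name, d.name
    FROM employees e JOIN departments d ON e.dept_id = d.dept_id
    ORDER BY e.emp_id
""").fetchall()
print(rows)  # [('Codd', 'Research'), ('Date', 'Research')]
```

This declarative join, rather than a programmer-maintained pointer chain as in the hierarchical and network models, is what made data directly accessible to the general database user.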

  12. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    DBMS with several features particularly useful to scientists and engineers. RIM5 interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  13. Generic database design for patient management information.

    PubMed Central

    Johnson, S. B.; Paul, T.; Khenina, A.

    1997-01-01

    Patient management information tracks general facts about the location of the patient and the providers assigned to care for the patient. The Clinical Data Repository at Columbia Presbyterian Medical Center employs a generic schema to record patient management events. The schema is extremely simple, yet can support several different views of patient information, as required by different applications: a longitudinal view of patient visits, including both inpatient and outpatient encounters; a visit-oriented view, to record facts related to a current encounter; a location-based view to provide a census of a nursing ward; and a provider-based view to give a list of the patients currently being cared for by a given clinician. All of these views can be supported in a highly efficient manner by the use of appropriate indexes. PMID:9357581
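The generic-schema approach described above, one simple event table serving several application views, can be sketched as follows. SQLite is used for illustration, and the table, column, and view names are hypothetical, not those of the Columbia Presbyterian repository.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- One generic event table records every patient-management fact.
CREATE TABLE pm_events (patient_id INTEGER, event TEXT, location TEXT, provider TEXT, ts TEXT);

-- Different applications see the same rows through different views.
CREATE VIEW ward_census AS
    SELECT location, COUNT(*) AS n FROM pm_events
    WHERE event = 'admit' GROUP BY location;
CREATE VIEW provider_patients AS
    SELECT provider, patient_id FROM pm_events WHERE event = 'admit';

INSERT INTO pm_events VALUES (1, 'admit', 'Ward A', 'Dr. Smith', '2024-01-01');
INSERT INTO pm_events VALUES (2, 'admit', 'Ward A', 'Dr. Jones', '2024-01-02');
""")

census = conn.execute("SELECT * FROM ward_census").fetchall()
print(census)  # [('Ward A', 2)]
```

Because every view queries the same narrow table, adding a new perspective (longitudinal, visit-oriented, location-based, or provider-based) means defining a view and an index rather than redesigning the schema.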

  14. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  15. Development of the ageing management database of PUSPATI TRIGA reactor

    NASA Astrophysics Data System (ADS)

    Ramli, Nurhayati; Maskin, Mazleha; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum; Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal

    2016-01-01

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as prominent issues. To address them, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  16. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Tyupikova, T. V.; Samoilov, V. N.

    2003-04-01

    Modern information technologies drive the natural sciences to further development, but this development must be accompanied by an evolution of infrastructures that creates favorable conditions for the growth of science and its financial base, and that proves and legally protects new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization and control of shared databases, using as an example the electronic document handling system that functions in several departments of the Joint Institute for Nuclear Research.

  17. Building a GIS Database for Space and Facilities Management

    ERIC Educational Resources Information Center

    Valcik, Nicolas A.; Huesca-Dorantes, Patricia

    2003-01-01

    Growth and technology have driven the University of Texas at Dallas (UTD) to build a geographic information system (GIS) database for its facilities and space management. This chapter reviews several issues concerning this implementation: (1) the challenges involved in implementing the system; (2) the lack of efficiency and accuracy that existed…

  18. Interface between astrophysical datasets and distributed database management systems (DAVID)

    NASA Technical Reports Server (NTRS)

    Iyengar, S. S.

    1988-01-01

    This is a status report on the progress of the DAVID (Distributed Access View Integrated Database Management System) project being carried out at Louisiana State University, Baton Rouge, Louisiana. The objective is to implement an interface between astrophysical datasets and DAVID. Design details and implementation specifics of the interface between DAVID and astrophysical datasets are discussed.

  19. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  20. Integration of Information Retrieval and Database Management Systems.

    ERIC Educational Resources Information Center

    Deogun, Jitender S.; Raghavan, Vijay V.

    1988-01-01

    Discusses the motivation for integrating information retrieval and database management systems, and proposes a probabilistic retrieval model in which records in a file may be composed of attributes (formatted data items) and descriptors (content indicators). The details and resolutions of difficulties involved in integrating such systems are…

  1. Use of Knowledge Bases in Education of Database Management

    ERIC Educational Resources Information Center

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid the teaching of database management. You can follow the order of the course from the beginning, when some topics first appear in elementary school, through the topics covered in secondary…

  2. Database Management Principles of the UCLA Library's Orion System.

    ERIC Educational Resources Information Center

    Fayollat, James; Coles, Elizabeth

    1987-01-01

    Describes an integrated online library system developed at the University of California at Los Angeles (UCLA) which incorporates a number of database management features that enhance the efficiency of record retrieval and display. Design features related to record storage and retrieval and the design of linked files are described in detail.…

  3. Database Management Information System. Annual Report, 1986-87.

    ERIC Educational Resources Information Center

    Long Beach Unified School District, CA.

    This report summarizes the findings of the Long Beach (California) Unified School District's first Database Management Information System Survey. These findings are based on the responses of 71,521 parents, students in grades 4-12, and school site staff members. Each school independently selected items from a research-based data bank to develop…

  4. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  5. Implementation of schema management in STEP-based object-oriented engineering database management system

    NASA Astrophysics Data System (ADS)

    Xiao, Ke; Zhao, Zhige; Sun, Jiaguang

    1996-03-01

    Engineering database management system (EDBMS) is the kernel of CAD/CAM system integration, and an object-oriented EDBMS (OOEDBMS) is the best implementation. As STEP becomes the standard for product data exchange and representation, supporting STEP in engineering databases becomes more and more important. In this paper we introduce the architecture of the STEP-based OOEDBMS in our CAD/CAM integrated system GHCAD. We focus on schema management and three-grade database management in the OOEDBMS. Topics such as the DDL compiler, the transformation from EXPRESS to DDL, and DDL tools are discussed. Finally, further research directions for schema management in OOEDBMS are presented.

  6. A database system for enhancing fuel records management capabilities

    SciTech Connect

    Rieke, Phil; Razvi, Junaid

    1994-07-01

    The need to modernize the system of managing a large variety of fuel-related data at the TRIGA Reactors Facility at General Atomics, as well as the need to improve compliance with NRC nuclear material reporting requirements, prompted the development of a database to cover all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system of recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to help ensure that accurate and meaningful data are entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis.
The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
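
The burnup bookkeeping the abstract describes, combining fuel moves, power history and position power factors, can be sketched as follows. The move log, power-factor values and figures are invented for illustration; they are not General Atomics data.

```python
from dataclasses import dataclass

@dataclass
class Move:
    element: str
    position: str       # core position, or "storage"
    start_day: int      # day the element entered this position

# Assumed per-position power factors (fraction of core power seen there).
POWER_FACTORS = {"B1": 1.20, "C4": 0.95, "storage": 0.0}

def burnup_mwd(moves, power_history, element):
    """Integrate megawatt-days for one element over a daily power history,
    weighting each day by the power factor of the position occupied."""
    moves = sorted((m for m in moves if m.element == element),
                   key=lambda m: m.start_day)
    total = 0.0
    for day, power_mw in enumerate(power_history):
        pos = "storage"
        for m in moves:          # last move at or before this day wins
            if m.start_day <= day:
                pos = m.position
        total += power_mw * POWER_FACTORS[pos]
    return total

moves = [Move("E-101", "B1", 0), Move("E-101", "storage", 3)]
history = [1.0, 1.0, 1.0, 1.0, 1.0]          # 1 MW for five days
print(burnup_mwd(moves, history, "E-101"))   # 3 days in B1 at factor 1.2
```

A real system would read the move log and power history from tables rather than literals, but the accumulation logic is the same.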

  7. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods in which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
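
The subset-generation workflow described above can be illustrated with a small sketch: metadata rows are stored under anonymized identifiers, and a study-specific subset is pulled with an ordinary query. Field names and the hashing step are assumptions (the actual system uses MySQL with an HTML/PHP front end); Python's built-in sqlite3 stands in here.

```python
import sqlite3
import hashlib

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE image_meta (
    anon_id TEXT, modality TEXT, finding TEXT, biopsy_proven INTEGER)""")

def anonymize(patient_id: str) -> str:
    # A one-way hash stands in for the system's anonymization step.
    return hashlib.sha256(patient_id.encode()).hexdigest()[:12]

rows = [("MRN001", "FFDM", "mass", 1),
        ("MRN002", "US",   "cyst", 0),
        ("MRN003", "FFDM", "mass", 0)]
con.executemany("INSERT INTO image_meta VALUES (?,?,?,?)",
                [(anonymize(p), m, f, b) for p, m, f, b in rows])

# Subset matching one study's criteria: digital mammograms with
# biopsy-proven masses.
subset = con.execute("""SELECT anon_id FROM image_meta
                        WHERE modality='FFDM' AND finding='mass'
                          AND biopsy_proven=1""").fetchall()
print(len(subset))  # 1 matching case, identified only by its anonymous id
```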

  8. The MAO NASU glass archive database: search and management tools

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.

    2005-06-01

    At the Main Astronomical Observatory of the National Academy of Sciences of Ukraine (MAO NASU), the astronomical glass archive contains more than 50,000 plates obtained in various observational projects during the last 50 years of the past century. The local single-user database of the glass archive, created on the basis of observational logs and partly on measurement results, has been transformed into an online multi-user system to provide remote access to the plate archive. In this paper, online tools for data searching and database management are presented.

  9. Database system for analysing and managing coiled tubing drilling data

    NASA Astrophysics Data System (ADS)

    Suh, J.; Choi, Y.; Park, H.; Choe, J.

    2009-05-01

    This study presents a prototype database system for analysing and managing petrophysical data from coiled tubing drilling in the oil and gas industry. The characteristics of coiled tubing drilling data from cores were analyzed and categorized according to the whole drilling process, and data modeling, including object-relation and class diagrams, was carried out to design the schema of an effective database system, such as the relationships between tables and the key index fields that create those relationships. The database system, called DrillerGeoDB, consists of 22 tables classified into four groups: project information, stratum information, drilling/logging information, and operation evaluation information. DrillerGeoDB provides the results of each process in a spreadsheet such as MS Excel, via the application of various logging-theory algorithms and statistical cost-evaluation functions. This presentation describes the details of the system's development and implementation.

  10. Data Processing on Database Management Systems with Fuzzy Query

    NASA Astrophysics Data System (ADS)

    Şimşek, Irfan; Topuz, Vedat

    In this study, a fuzzy query tool (SQLf) for non-fuzzy database management systems was developed. In addition, sample fuzzy queries were made on real data with the tool. The performance of SQLf was tested on data about Marmara University students' food grants. The food grant data were collected in a MySQL database via a form filled in on the web, in which students described their social and economic conditions for the food grant request. This form consists of questions with both fuzzy and crisp answers. The main purpose of the fuzzy query is to determine the students who deserve the grant. SQLf easily found the eligible students through predefined fuzzy values. The fuzzy query tool (SQLf) could be used just as easily with other database systems such as Oracle and SQL Server.
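
The core idea of a fuzzy query over a crisp database can be sketched briefly: rows are scored by membership functions and ranked, rather than filtered by hard thresholds. The membership functions, cut-off value and sample records below are invented; they are not the authors' SQLf implementation.

```python
def low_income(income):          # membership in the fuzzy set "low income"
    if income <= 500:
        return 1.0
    if income >= 1500:
        return 0.0
    return (1500 - income) / 1000

def large_family(size):          # membership in the fuzzy set "large family"
    if size >= 6:
        return 1.0
    if size <= 2:
        return 0.0
    return (size - 2) / 4

# (name, monthly income, family size) -- invented records
students = [("Ali", 400, 6), ("Banu", 900, 4), ("Cem", 1400, 2)]

# Fuzzy AND as min(); rank by overall score, then apply a satisfaction cut.
ranked = sorted(((name, min(low_income(inc), large_family(fam)))
                 for name, inc, fam in students),
                key=lambda t: -t[1])
eligible = [name for name, score in ranked if score >= 0.5]
print(eligible)  # ['Ali', 'Banu']
```

A crisp query (`income < 500 AND family_size > 5`) would have rejected Banu outright; the fuzzy version keeps borderline cases with a graded score.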

  11. Database Design Learning: A Project-Based Approach Organized through a Course Management System

    ERIC Educational Resources Information Center

    Dominguez, Cesar; Jaime, Arturo

    2010-01-01

    This paper describes an active method for database design learning through practical tasks development by student teams in a face-to-face course. This method integrates project-based learning, and project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…

  12. The Golosiiv on-line plate archive database, management and maintenance

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Sergeeva, T.

    2007-08-01

    We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with the principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface, supports detailed traditional form-filling radial search of plates, yields auxiliary samplings and the listing of each collection, and permits browsing the detailed descriptions of the collections. An administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed according to the demands and principles of international data archives, and has to be strongly generalized in order to support data mining through standard interfaces and to best fit the demands of the WFPDB Group for databases of plate catalogues. Ongoing enhancements of the database toward the WFPDB bring the problem of data verification to the forefront, as they demand a high degree of data reliability. The process of data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and hence the variety of techniques for identifying and fixing them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: the enhancement of the log-book database with new sets of observational data, as well as the creation of the generalized database and the cross-identification between them. The VO-compatible version of the database is being supplied with digitized data of plates obtained with a MicroTek ScanMaker 9800 XL TMA. The scanning procedure is not total but is conducted selectively in the frames of special

  13. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

    Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of the standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.

  14. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  15. The Oil and Natural Gas Knowledge Management Database from NETL

    DOE Data Explorer

    The Knowledge Management Database (KMD) Portal provides four options for searching the documents and data that NETL-managed oil and gas research has produced over the years for DOE’s Office of Fossil Energy. Information includes R&D carried out under both historical and ongoing DOE oil and gas research and development (R&D). The Document Repository, the CD/DVD Library, the Project Summaries from 1990 to the present, and the Oil and Natural Gas Program Reference Shelf provide a wide range of flexibility and coverage.

  16. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as queries and visualization, across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to the users' data access policy.
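
The "common time scale" operation the abstract mentions can be illustrated with a small sketch: two series sampled at different rates are aligned onto one grid so they can be queried and plotted together. This is a pure-Python illustration of the idea, not TSDSystem code, and the sample-and-hold alignment rule is an assumption.

```python
def align(series, grid):
    """For each grid time, take the latest sample at or before it (None if
    the series has not started yet). `series` is a sorted list of (t, value)."""
    out, i, last = [], 0, None
    for t in grid:
        while i < len(series) and series[i][0] <= t:
            last = series[i][1]
            i += 1
        out.append(last)
    return out

temp = [(0, 20.0), (10, 21.0), (20, 22.5)]     # sampled every 10 s
tremor = [(0, 0.1), (5, 0.3), (15, 0.2)]       # sampled irregularly
grid = [0, 5, 10, 15, 20]                      # the common time scale

print(align(temp, grid))    # [20.0, 20.0, 21.0, 21.0, 22.5]
print(align(tremor, grid))  # [0.1, 0.3, 0.3, 0.2, 0.2]
```

Once both series live on the same grid, a joint query over a time range reduces to slicing two equal-length lists.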

  17. [A computerized database for managing otorhinolaryngologic oncology patients].

    PubMed

    Mira, E; Lanza, L; Castelli, A; Benazzo, M; Tinelli, C

    1998-06-01

    In recent years the management and interdisciplinary treatment of oncological patients has become extremely complex due to the progress made in diagnosis and therapy. As a result, the knowledge required to treat patients can no longer be simply memorized or manually filed. Computer technology provides the ideal instrument for organizing, saving and analyzing data from head and neck tumor patients. The authors have prepared a computerized database to meet the following needs: ease of use, even for non-computer-savvy users; minimal ambiguity in data entry; use for both clinical and scientific purposes; the possibility to create a network with similar databases at other centers; and the possibility to expand to include image management. The archive is based on a personal computer with an INTEL 80486 microprocessor, 40 Mb RAM, DOS 6.0 and Windows 3.1. The software includes four main routines: a) formulation and management of the tables where oncological data are gathered; b) entry and management of patient-related clinical data; c) statistical processing for epidemiological and oncological research; and d) management of basic computer services. In clinical practice the database allows the following: a) preparation of a monthly chart of check-ups, b) rapid tracking of patients lost to follow-up, c) printout of a summary of the clinical history of each patient at the time of check-up and rapid updating at the end of the examination, and d) automatic production of forms such as discharge letters and reports to be shared with related services (i.e., medical oncology, radiotherapy). In addition, the database is a powerful, versatile research tool which can promptly provide all sorts of oncological data and can automatically prepare tables, diagrams, correlations and survival curves. The system was developed from 1993 to 1995 and has been operative, with a few minor modifications and updates, since 1995. Today the database contains more than 1200 oncological cases and the system is used daily by

  18. Student Activities. Managing Liability.

    ERIC Educational Resources Information Center

    Bennett, Barbara; And Others

    This monograph suggests ways that college or university administrations can undertake a systematic and careful review of the risks posed by students' activities. Its purpose is to provide guidance in integrating the risk management process into a school's existing approaches to managing student organizations and activities. It is noted that no…

  19. HyperCare: a prototype of an active database for compliance with essential hypertension therapy guidelines.

    PubMed Central

    Caironi, P. V.; Portoni, L.; Combi, C.; Pinciroli, F.; Ceri, S.

    1997-01-01

    HyperCare is a prototype of a decision support system for essential hypertension care management. The medical knowledge implemented in HyperCare derives from the guidelines for the management of mild hypertension of the World Health Organization/International Society of Hypertension, and from the recommendations of the United States Joint National Committee on Detection, Evaluation and Treatment of High Blood Pressure. HyperCare has been implemented using Chimera, an active database language developed at the Politecnico di Milano. HyperCare demonstrates the possibility of using active database systems to develop a medical data-intensive application where inferential elaboration of moderate complexity is required. PMID:9357634
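
Active-database behavior of the kind HyperCare relies on can be sketched with an ordinary SQL trigger: an event (inserting a blood-pressure reading) fires a condition-action rule that records an alert. The threshold and table names are illustrative, not the WHO/ISH guideline encoding, and SQLite stands in for Chimera.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE bp_reading (patient INTEGER, systolic INTEGER, diastolic INTEGER);
CREATE TABLE alert (patient INTEGER, message TEXT);

-- Event: insertion of a reading. Condition: guideline threshold exceeded.
-- Action: record an alert for the clinician to review.
CREATE TRIGGER high_bp AFTER INSERT ON bp_reading
WHEN NEW.systolic >= 140 OR NEW.diastolic >= 90
BEGIN
    INSERT INTO alert VALUES (NEW.patient,
        'Reading exceeds guideline threshold; review therapy');
END;
""")
con.executemany("INSERT INTO bp_reading VALUES (?,?,?)",
                [(1, 130, 85), (1, 150, 95), (2, 120, 80)])
alerts = con.execute("SELECT patient FROM alert").fetchall()
print(alerts)  # only the 150/95 reading fired the rule
```

The application code never checks the guideline itself; the rule lives in the database and fires on every relevant update, which is the defining property of an active database.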

  20. Computerized database management system for breast cancer patients.

    PubMed

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is 50 to 59 years. Results suggest that the chance of developing breast cancer increases in older women and is reduced by breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings. PMID:25045606
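
The kind of automatic tally the abstract describes, counting cases per ethnicity and per age band, can be sketched in a few lines. The records below are invented for illustration; they are not patient data from the study.

```python
from collections import Counter

# (ethnicity, age at diagnosis) -- invented sample records
cases = [("Malay", 52), ("Malay", 58), ("Chinese", 55),
         ("Malay", 61), ("Indian", 47), ("Chinese", 63)]

by_eth = Counter(eth for eth, _ in cases)
by_decade = Counter(10 * (age // 10) for _, age in cases)

print(by_eth.most_common(1))     # [('Malay', 3)]  -- highest incidence group
print(by_decade.most_common(1))  # [(50, 3)]       -- peak decade 50-59
```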

  1. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the 'future computer' will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This 'network computer' will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) that permit the use of relational, hierarchical, object-oriented, GIS, and other databases. Reaching this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of 'Jvv' concepts and capabilities to distributed and/or parallel computing environments.

  2. Data-base location problems in distributed data-base management systems

    SciTech Connect

    Chahande, A.I.

    1989-01-01

    Recent years have witnessed an increasing number of systems, usually heterogeneous, that are geographically distributed and connected by high-capacity communication channels, e.g., the ARPANET, the CYCLADES network, TymNET, etc. In the design and management of such systems, a major portion of the planning is concerned with storing large quantities of information (data) at judiciously selected nodes in the network, in adherence to some optimality criterion. This necessitates analysis of information storage costs, transaction (update and query) costs, response times, processing locality, etc. There are essentially two definitions of optimality: cost measures and performance measures. The two measures of optimality parallel each other. This research essentially considers the minimal-cost objective, but incorporates the performance objectives as well by considering cost penalties for sub-optimal performance. The distributed database design problem is fully characterized by two sub-problems: (a) design of the fragmentation schema, and (b) design of the allocation schema for these fragments. These problems have been addressed independently in the literature. This research, recognizing the mutual interdependence of the issues, addresses the distributed database location problem considering both aspects in unison (logical as well as physical criteria). The problem can be succinctly stated as follows: given a set of user nodes with their respective transaction (update and query) frequencies, and a set of application programs, the database location problem assigns copies of various database files (or fragments thereof) to candidate nodes such that the total cost is minimized. The decision must trade off the cost of accessing, which is reduced by additional copies, against the cost of updating and storing these additional copies.
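
The access-versus-update trade-off stated in the last sentence can be made concrete with a toy cost model: extra copies divide the read cost but multiply the update and storage costs, so the total cost has an interior minimum. All cost coefficients below are invented for illustration; the actual thesis treats fragmentation and allocation jointly over a network topology.

```python
def total_cost(copies, queries, updates,
               read_cost=4.0, update_cost=3.0, storage_cost=10.0):
    """Toy replication cost model for one file."""
    # With more copies, a query can be served from a nearer site (cheaper).
    read = queries * read_cost / copies
    # Every copy must receive every update, and every copy occupies storage.
    write = updates * update_cost * copies
    store = storage_cost * copies
    return read + write + store

queries, updates = 1000, 50
best = min(range(1, 9), key=lambda c: total_cost(c, queries, updates))
print(best, total_cost(best, queries, updates))  # 5 copies, cost 1600.0
```

A read-heavy file (high `queries`) pushes the minimum toward more copies; an update-heavy one pushes it back toward a single copy, which is exactly the tension the allocation problem resolves per file and per node.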

  3. An expert system to facilitate selecting a database management system

    SciTech Connect

    Roseberry, L.M.; Kilgore, D.C.

    1989-06-06

    An investigation has been initiated to develop an expert system to assist information professionals in selecting a database management system (DBMS). The system attempts to consider DBMS basic design, theory, and performance standards as well as the specific needs of the project. The user is queried about needs, wants, and resource restrictions. The inference engine tests these data against its rule set and generates prioritized recommendations. The design of the rule set is discussed, as are the usefulness of such a tool and plans for its continued evolution.

  4. SPRT/DBMS. Pre-Validated Signal Database Management System

    SciTech Connect

    Gross, K.C.

    1996-01-01

    SPRT/DBMS is a pre-validated experimental database management system for industries where large volumes of process signals are acquired and archived. This system implements a new and powerful pattern recognition method, the spectrum transformed sequential testing (STST or ST2) procedure. A network of interacting ST2 modules deployed in parallel is integrated with a relational DBMS to fully validate process signals as they are archived. This reliable, secure DBMS then provides system modelers, code developers, and safety analysts with an easily accessible source of fully validated process data.

  5. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  6. Management Guidelines for Database Developers' Teams in Software Development Projects

    NASA Astrophysics Data System (ADS)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and which practices should be used during the DBDs' work. Therefore, in this paper we develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be useful for other companies that use a DBD team, and could contribute to increasing the efficiency of these teams in their work on software development projects.

  7. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S CONSOLIDATED HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from 12 U.S. studies related to human activities into one comprehensive data system that can be accessed via the Internet. The data system is called the Consolidated Human Activity Database (CHAD), and it is ...

  8. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S COMPREHENSIVE HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from nine U.S. studies related to human activities into one comprehensive data system that can be accessed via the world-wide web. The data system is called CHAD-Consolidated Human Activity Database-and it is ...

  9. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  10. Active fault database of Japan: Its construction and search system

    NASA Astrophysics Data System (ADS)

    Yoshioka, T.; Miyamoto, F.

    2011-12-01

    The Active Fault Database of Japan was constructed by the Active Fault and Earthquake Research Center, GSJ/AIST, and opened to the public on the Internet in 2005 to enable probabilistic evaluation of future faulting events and earthquake occurrence on major active faults in Japan. The database consists of three sub-databases: 1) a sub-database on individual sites, which includes long-term slip data and paleoseismicity data with error ranges and reliability; 2) a sub-database on details of paleoseismicity, which includes the excavated geological units and faulting event horizons with age control; and 3) a sub-database on characteristics of behavioral segments, which includes fault length, long-term slip rate, recurrence interval, most recent event, slip per event, and the best estimate of the cascade earthquake. Major seismogenic faults, which are approximately the best-estimate segments of cascade earthquakes, are included in the database; each has a length of 20 km or longer and a slip rate of 0.1 m/ky or larger, and is composed of about two behavioral segments on average. The database contains information on active faults in Japan sorted by the concept of "behavioral segments" (McCalpin, 1996). The faults are subdivided into 550 behavioral segments based on surface trace geometry and rupture history revealed by paleoseismic studies. Behavioral segments can be searched on Google Maps: one behavioral segment can be selected directly, or segments can be searched within a rectangular area on the map. The result of a search is shown on a fixed map or on Google Maps with geologic and paleoseismic parameters, including slip rate, slip per event, recurrence interval, and the calculated rupture probability in the future. Behavioral segments can also be searched by name or by a combination of fault parameters. All data are compiled from journal articles, theses, and other documents. We are currently developing a revised edition based on an improved database system. More than ten ...
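A combination search over fault parameters, as the database's search system allows, might look like the following sketch. The segment records and threshold values are hypothetical, not values from the actual database.

```python
# Hypothetical records mimicking the behavioral-segment parameters described above
segments = [
    {"name": "Segment A", "length_km": 25.0, "slip_rate_m_per_ky": 0.8},
    {"name": "Segment B", "length_km": 12.0, "slip_rate_m_per_ky": 0.3},
    {"name": "Segment C", "length_km": 40.0, "slip_rate_m_per_ky": 0.05},
]

def search(min_length=0.0, min_slip_rate=0.0):
    """Return segment names matching all supplied parameter thresholds."""
    return [s["name"] for s in segments
            if s["length_km"] >= min_length
            and s["slip_rate_m_per_ky"] >= min_slip_rate]

# Only Segment A is both long enough and fast-slipping enough
print(search(min_length=20.0, min_slip_rate=0.1))  # ['Segment A']
```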

  11. Enhanced DIII-D Data Management Through a Relational Database

    NASA Astrophysics Data System (ADS)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Data in the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
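A cross-shot query of the kind described can be sketched with SQLite standing in for the production relational database. The `shots` and `summary` tables, their column names, and all values below are hypothetical stand-ins, not the DIII-D schema.

```python
import sqlite3

# Hypothetical schema standing in for per-shot summary tables
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shots (shot INTEGER PRIMARY KEY, date TEXT)")
conn.execute("CREATE TABLE summary (shot INTEGER, ip_max REAL, ne_avg REAL)")
conn.executemany("INSERT INTO shots VALUES (?, ?)",
                 [(101, "2000-06-01"), (102, "2000-06-02"), (103, "2000-06-03")])
conn.executemany("INSERT INTO summary VALUES (?, ?, ?)",
                 [(101, 1.2, 3.0), (102, 2.1, 4.5), (103, 0.8, 2.2)])

# Mine across many shots at once: which discharges exceeded a threshold?
rows = conn.execute("""
    SELECT s.shot, s.date, m.ip_max
    FROM shots s JOIN summary m ON s.shot = m.shot
    WHERE m.ip_max > 1.0
    ORDER BY s.shot
""").fetchall()
print(rows)  # [(101, '2000-06-01', 1.2), (102, '2000-06-02', 2.1)]
```

The point of the relational layout is exactly this join: one declarative query scans every shot, instead of opening each shot's files in turn.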

  12. Management of the life and death of an earth-science database: some examples from geotherm

    USGS Publications Warehouse

    Bliss, J.D.

    1986-01-01

    Productive earth-science databases require managers who are familiar with and skilled at using available software developed specifically for database management. There also should be a primary user with a clearly understood mission. The geologic phenomenon addressed by the database must be sufficiently understood, and adequate appropriate data must be available to construct a useful database. The database manager, in concert with the primary user, must ensure that data of adequate quality are available in the database, as well as prepare for mechanisms of releasing the data when the database is terminated. The primary user needs to be held accountable along with the database manager to ensure that a useful database will be created. Quality of data and maintenance of database relevancy to the user's mission are important issues during the database's lifetime. Products prepared at termination may be used more than the operational database and thus are of critical importance. These concepts are based, in part, on both the shortcomings and successes of GEOTHERM, a comprehensive system of databases and software used to store, locate, and evaluate the geology, geochemistry, and hydrology of geothermal systems. © 1986.

  13. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    Italian levees and historical breach failures are to be exploited in the framework of an operational procedure for the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. By its structure, INLED is a dynamic geospatial database, with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database is aimed at providing the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy. References: Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6, (2), 149-162. Camici S., Barbetta S. & Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H.R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19, (4), 717-731.

  14. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  15. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    NASA Astrophysics Data System (ADS)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm consisting of four steps. In conclusion, we analyze the password encryption of Oracle Database.

  16. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department that has occurred during the last 5 years, resulted in a reliable, high performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDaV endpoints. Besides these services, an Oracle database facility is in production characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook to forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  17. Database of Active Structures From the Indo-Asian Collision

    NASA Astrophysics Data System (ADS)

    Styron, Richard; Taylor, Michael; Okoronkwo, Kelechi

    2010-05-01

    The ongoing collision of India and Asia has produced a vast system of folds and faults, many of which are active today, as evidenced by such recent deadly earthquakes as the 12 May 2008 Sichuan quake [Parsons et al., 2008]. Understanding these events requires knowledge of the region’s geologic structures. Taylor and Yin [2009] have assembled HimaTibetMap-1.0, a multiformat, comprehensive database of first-order active structures in central Asia that may aid researchers, educators, and students in their studies of Indo-Asian tectonics. For example, this database may be used by seismologists, geodesists, and modelers to identify structures in particular locations that contribute to active deformation, or it may be used by teachers to illustrate concepts such as continental collision or distributed deformation of continents.

  18. A Conceptual Model and Database to Integrate Data and Project Management

    NASA Astrophysics Data System (ADS)

    Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.

    2015-12-01

    database and build it in a way that is modular and can be changed or expanded to meet user needs. Our hope is that others, especially those managing large collaborative research grants, will be able to use our project model and database design to enhance the value of their project and data management both during and following the active research period.

  19. Statistical databases

    SciTech Connect

    Kogalovskii, M.R.

    1995-03-01

    This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases consisting of data used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored-data compression techniques, and statistical data representation means. Also examined is whether present database management systems (DBMSs) satisfy SDB requirements. Some current research directions in SDB systems are considered.

  20. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programing tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  1. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    DOE PAGESBeta

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population, which is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is of high priority, which requires a broad understanding of the locations of such day care centers. As concentrations of at risk population, the spatial location of day care centers is critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high resolution population distribution and dynamics models and databases.
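One EPR-style query such a geolocated database enables is finding the centers near an incident site. The sketch below uses the standard haversine great-circle distance; the center names and coordinates are hypothetical examples, not records from the actual database.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical geolocated records; the query: centers within 5 km of an incident
centers = [
    ("Sunshine Day Care", 35.9606, -83.9207),
    ("Little Steps",      36.1627, -86.7816),
]
incident = (35.9732, -83.9423)
nearby = [name for name, lat, lon in centers
          if haversine_km(incident[0], incident[1], lat, lon) <= 5.0]
print(nearby)  # ['Sunshine Day Care']
```

A production system would push this radius filter into a spatial index rather than scan every record, but the distance test itself is the same.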

  2. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    SciTech Connect

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population, which is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is of high priority, which requires a broad understanding of the locations of such day care centers. As concentrations of at risk population, the spatial location of day care centers is critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high resolution population distribution and dynamics models and databases.

  3. Database technology and the management of multimedia data in the Mirror project

    NASA Astrophysics Data System (ADS)

    de Vries, Arjen P.; Blanken, H. M.

    1998-10-01

    Multimedia digital libraries require an open distributed architecture instead of a monolithic database system. In the Mirror project, we use the Monet extensible database kernel to manage different representation of multimedia objects. To maintain independence between content, meta-data, and the creation of meta-data, we allow distribution of data and operations using CORBA. This open architecture introduces new problems for data access. From an end user's perspective, the problem is how to search the available representations to fulfill an actual information need; the conceptual gap between human perceptual processes and the meta-data is too large. From a system's perspective, several representations of the data may semantically overlap or be irrelevant. We address these problems with an iterative query process and active user participating through relevance feedback. A retrieval model based on inference networks assists the user with query formulation. The integration of this model into the database design has two advantages. First, the user can query both the logical and the content structure of multimedia objects. Second, the use of different data models in the logical and the physical database design provides data independence and allows algebraic query optimization. We illustrate query processing with a music retrieval application.
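The abstract's retrieval model is based on inference networks; as a simpler, classical stand-in for the relevance-feedback loop it describes, the following sketch shows Rocchio-style query reweighting, where judged documents reshape the query vector. The term names, weights, and parameter values are illustrative assumptions.

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio relevance feedback: move the query vector toward the centroid
    of relevant documents and away from the centroid of non-relevant ones.
    All vectors are dicts mapping term -> weight."""
    terms = set(query)
    terms |= {t for d in relevant for t in d}
    terms |= {t for d in nonrelevant for t in d}
    new_query = {}
    for t in terms:
        w = alpha * query.get(t, 0.0)
        if relevant:
            w += beta * sum(d.get(t, 0.0) for d in relevant) / len(relevant)
        if nonrelevant:
            w -= gamma * sum(d.get(t, 0.0) for d in nonrelevant) / len(nonrelevant)
        if w > 0:  # negative weights are conventionally dropped
            new_query[t] = round(w, 4)
    return new_query

# Marking one document relevant pulls its other terms into the query,
# narrowing the conceptual gap the user could not bridge by hand.
print(rocchio({"jazz": 1.0}, [{"jazz": 1.0, "saxophone": 1.0}], []))
```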

  4. Database Design and Implementation of Game Management System for Rescue Robot Contest

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hitoshi; Kojima, Atsuhiro; Koeda, Masanao

    The Rescue Robot Contest is one of the robot contests concerning lifesaving in urban disasters. In this contest, the loads of rescue dummies simulating disaster victims and the contest progression status are presented to audiences and to the team members operating the robots. This presentation is important both for evaluating robot activity and for production effect. For these purposes, a game management system for the Rescue Robot Contest was originally constructed and put into operation. The system is built on a database system, which serves two roles: recording all data and events, and real-time processing for the presentation. In this paper, the design and implementation of the tables and built-in functions of the database underlying this system are presented. For real-time processing, embedded functions and trigger functions are implemented. These functions generate unique latest records in specific tables that store only the latest data for quick access.
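The latest-record tables maintained by trigger functions can be sketched in SQLite. The schema, the event values, and the use of `INSERT OR REPLACE` are illustrative assumptions, not the contest system's actual implementation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (robot TEXT, score INTEGER, ts INTEGER);
CREATE TABLE latest (robot TEXT PRIMARY KEY, score INTEGER, ts INTEGER);

-- On every new event, overwrite that robot's row in 'latest', so the
-- presentation layer reads one small table instead of scanning history.
CREATE TRIGGER keep_latest AFTER INSERT ON events
BEGIN
    INSERT OR REPLACE INTO latest (robot, score, ts)
    VALUES (NEW.robot, NEW.score, NEW.ts);
END;
""")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [("alpha", 10, 1), ("beta", 7, 2), ("alpha", 25, 3)])
rows = conn.execute("SELECT robot, score FROM latest ORDER BY robot").fetchall()
print(rows)  # [('alpha', 25), ('beta', 7)]
```

The full event log in `events` remains intact for post-contest evaluation, while `latest` answers the real-time display queries in constant time per robot.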

  5. Health technology management: a database analysis as support of technology managers in hospitals.

    PubMed

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy. PMID:22129945

  6. A computerized representation of a medical school curriculum: integration of relational and text management software in database design.

    PubMed Central

    Mattern, W. D.; Wagner, J. A.; Brown, J. S.; Fisher-Neenan, L.

    1991-01-01

    We describe the development of a computer-based representation of the medical school curriculum at the University of North Carolina at Chapel Hill (UNC-CH). Over the past seven years the Medical School's Office of Academic Affairs has employed both relational database and text management software to design an integrated curriculum database system. Depending on the function selected--exploring the curriculum, searching through course outlines, retrieving elective descriptions, identifying teaching faculty, or searching for specific topics--either text management or relational database management routines are activated in a manner transparent to the user. Initial evaluation of the system has been positive but highlights the need for a more robust biomedical language for use as a controlled vocabulary to index content. Efforts are now underway, with support from the Association of American Medical Colleges (AAMC), to engage other interested schools in the U.S. and Canada in collaborating on further development of a system. PMID:1807615

  7. EADB: An Estrogenic Activity Database for Assessing Potential Endocrine Activity

    EPA Science Inventory

    Endocrine-active chemicals can potentially have adverse effects on both humans and wildlife. They can interfere with the body’s endocrine system through direct or indirect interactions with many protein targets. Estrogen receptors (ERs) are one of the major targets, and many ...

  8. Evaluating, Migrating, and Consolidating Databases and Applications for Long-Term Surveillance and Maintenance Activities at the Rocky Flats Site

    SciTech Connect

    Surovchak, S.; Marutzky, S.; Thompson, B.; Miller, K.; Labonte, E.

    2006-07-01

    The U.S. Department of Energy (DOE) Office of Legacy Management (LM) is assuming responsibilities for long-term surveillance and maintenance (LTS and M) activities at the Rocky Flats Environmental Technology Site (RFETS) during fiscal year 2006. During the transition, LM is consolidating the databases and applications that support these various functions into a few applications, which will streamline future management and retrieval of data. This paper discusses the process of evaluating, migrating, and consolidating these databases and applications for LTS and M activities and provides lessons learned that will benefit future transitions. (authors)

  9. Database of Pesticides and Off-flavors for Health Crisis Management.

    PubMed

    Ueda, Yasuhito; Itoh, Mitsuo

    2016-01-01

    In this experiment, 351 pesticides and 441 different organic compounds were analyzed by GC/MS, and a database of retention time, retention index, monoisotopic mass, two selected ions, molecular formula, and CAS numbers was created. The database includes compounds such as alcohols, aldehydes, carboxylic acids, esters, ethers and hydrocarbons with unpleasant odors. This database is expected to be useful for health crisis management in the future. PMID:27211918
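A lookup against such a database typically matches an unknown GC/MS peak by retention index and monoisotopic mass within tolerances. In the sketch below, the schema, tolerance values, and compound rows are illustrative (the compounds are real off-flavor examples, but the numbers are approximate and should not be taken as reference values).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE compounds (
    name TEXT, retention_index REAL, monoisotopic_mass REAL, cas TEXT)""")
conn.executemany("INSERT INTO compounds VALUES (?, ?, ?, ?)", [
    ("geosmin", 1401.0, 182.1671, "19700-21-1"),   # earthy off-flavor
    ("2-MIB",   1183.0, 168.1514, "2371-42-8"),    # musty off-flavor
    ("hexanal",  801.0, 100.0888, "66-25-1"),      # grassy off-flavor
])

def match(ri, mass, ri_tol=5.0, mass_tol=0.01):
    """Return candidate compound names whose retention index and
    monoisotopic mass both fall within the given tolerances."""
    return [row[0] for row in conn.execute(
        """SELECT name FROM compounds
           WHERE ABS(retention_index - ?) <= ?
             AND ABS(monoisotopic_mass - ?) <= ?""",
        (ri, ri_tol, mass, mass_tol))]

print(match(1402.3, 182.17))  # ['geosmin']
```

Matching on two independent parameters at once is what makes such a table useful in a crisis: either value alone would return many false candidates.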

  10. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  11. Expert systems identify fossils and manage large paleontological databases

    SciTech Connect

    Beightol, D.S.; Conrad, M.A.

    1988-02-01

    EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.

  12. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  13. A Web Database To Manage and Organize ANSI Standards Collections.

    ERIC Educational Resources Information Center

    Matylonek, John C.; Peasley, Maren

    2001-01-01

    Discusses collections of standards by ANSI (American National Standards Institute) and the problems they create for technical libraries. Describes a custom-designed Web database at Oregon State University that is linked to online catalog records, thus enhancing access to the standards collection. (LRW)

  14. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  15. Integrated Standardized Database/Model Management System: Study management concepts and requirements

    SciTech Connect

    Baker, R.; Swerdlow, S.; Schultz, R.; Tolchin, R.

    1994-02-01

    Data-sharing among planners and planning software for utility companies is the motivation for creating the Integrated Standardized Database (ISD) and Model Management System (MMS). The purpose of this document is to define the requirements for the ISD/MMS study management component in a manner that will enhance the use of the ISD. After an analysis period which involved EPRI member utilities across the United States, the study concept was formulated. It is defined in terms of its entities, relationships and its support processes, specifically for implementation as the key component of the MMS. From the study concept definition, requirements are derived. There are unique requirements, such as the necessity to interface with DSManager, EGEAS, IRPManager, MIDAS and UPM and there are standard information systems requirements, such as create, modify, delete and browse data. An initial ordering of the requirements is established, with a section devoted to future enhancements.

  16. The carbohydrate-active enzymes database (CAZy) in 2013.

    PubMed

    Lombard, Vincent; Golaconda Ramulu, Hemalatha; Drula, Elodie; Coutinho, Pedro M; Henrissat, Bernard

    2014-01-01

    The Carbohydrate-Active Enzymes database (CAZy; http://www.cazy.org) provides online and continuously updated access to a sequence-based family classification linking the sequence to the specificity and 3D structure of the enzymes that assemble, modify and break down oligo- and polysaccharides. Functional and 3D structural information is added and curated on a regular basis based on the available literature. In addition to the use of the database by enzymologists seeking curated information on CAZymes, the dissemination of a stable nomenclature for these enzymes is probably a major contribution of CAZy. The past few years have seen the expansion of the CAZy classification scheme to new families, the development of subfamilies in several families and the power of CAZy for the analysis of genomes and metagenomes. This article outlines the changes that have occurred in CAZy during the past 5 years and presents our novel effort to display the resolution and the carbohydrate ligands in crystallographic complexes of CAZymes. PMID:24270786

  17. GAS CHROMATOGRAPHIC RETENTION PARAMETERS DATABASE FOR REFRIGERANT MIXTURE COMPOSITION MANAGEMENT

    EPA Science Inventory

    Composition management of mixed refrigerant systems is a challenging problem in the laboratory, manufacturing facilities, and large refrigeration machinery. This issue of composition management is especially critical for the maintenance of machinery that utilizes zeotropic mixture...

  18. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    PubMed

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills. PMID:26421518

  19. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    PubMed

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-01-01

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and, by applying a schema mapping, reports any differences in parameters between the two versions of the same plan. A console application queries the database to obtain a list of active or in-preparation plans to be tested; it then runs in batch mode to compare all the plans, and a report of the success or failure of each comparison is saved for review. This software tool was used as part of the software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that rely on repeating pretreatment QA measurements or on labor-intensive and fallible hand comparisons. PMID:24257281
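
    The comparison step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the element names, the flattening of a plan into path/value pairs, and the trivial "schema mapping" (identical paths in both versions) are all assumptions.

```python
# Hypothetical sketch of comparing two XML serializations of the same
# treatment plan; element names are invented for illustration.
import xml.etree.ElementTree as ET

def plan_params(xml_text):
    """Flatten a plan XML document into {path: text} pairs."""
    root = ET.fromstring(xml_text)
    params = {}
    def walk(node, path):
        p = f"{path}/{node.tag}"
        if node.text and node.text.strip():
            params[p] = node.text.strip()
        for child in node:
            walk(child, p)
    walk(root, "")
    return params

def compare_plans(old_xml, new_xml):
    """Return (path, old_value, new_value) for every differing parameter."""
    old, new = plan_params(old_xml), plan_params(new_xml)
    diffs = []
    for path in sorted(set(old) | set(new)):
        if old.get(path) != new.get(path):
            diffs.append((path, old.get(path), new.get(path)))
    return diffs

old = "<plan><beam><mu>120.0</mu><gantry>180</gantry></beam></plan>"
new = "<plan><beam><mu>120.0</mu><gantry>181</gantry></beam></plan>"
print(compare_plans(old, new))  # [('/plan/beam/gantry', '180', '181')]
```

    In the real tool the schema mapping would translate between the Aria 8.9 and Aria 11 representations rather than assume identical paths.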

  20. Database mining applied to central nervous system (CNS) activity.

    PubMed

    Pintore, M; Taboureau, O; Ros, F; Chrétien, J R

    2001-04-01

    A data set of 389 compounds, active in the central nervous system (CNS) and divided into eight classes according to the receptor type, was extracted from the RBI database and analyzed by Self-Organizing Maps (SOM), also known as Kohonen Artificial Neural Networks. This method gives a 2D representation of the distribution of the compounds in the hyperspace derived from their molecular descriptors. As SOM belongs to the category of unsupervised techniques, it has to be combined with another method in order to generate classification models with predictive ability. The fuzzy clustering (FC) approach seems to be particularly suitable to delineate clusters in a rational way from SOM and to get an automatic objective map interpretation. Maps derived by SOM showed specific regions associated with a unique receptor type and zones in which two or more activity classes are nested. Then, the modeling ability of the proposed SOM/FC Hybrid System tools applied simultaneously to eight activity classes was validated after dividing the 389 compounds into a training set and a test set, including 259 and 130 molecules, respectively. The proper experimental activity class, among the eight possible ones, was predicted simultaneously and correctly for 81% of the test set compounds. PMID:11461760
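
    As an illustration of the SOM stage only (not the authors' RBI/CNS pipeline), the sketch below trains a tiny map on two artificial "activity classes" with distinct descriptor profiles; the descriptors, grid size, and decay schedules are all invented for the example.

```python
# Minimal self-organizing map: a small grid of weight vectors is pulled
# toward the input descriptors, so similar compounds land in nearby cells.
import math, random

def train_som(data, rows=3, cols=3, epochs=50, lr0=0.5, radius0=1.5):
    random.seed(0)
    dim = len(data[0])
    grid = [[[random.random() for _ in range(dim)] for _ in range(cols)]
            for _ in range(rows)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
        radius = radius0 * (1 - epoch / epochs) + 0.01
        for x in data:
            bi, bj = bmu(grid, x)                  # best-matching unit
            for i in range(rows):
                for j in range(cols):
                    d2 = (i - bi) ** 2 + (j - bj) ** 2
                    h = math.exp(-d2 / (2 * radius ** 2))  # neighborhood
                    w = grid[i][j]
                    for k in range(dim):
                        w[k] += lr * h * (x[k] - w[k])
    return grid

def bmu(grid, x):
    """Grid coordinates of the cell whose weights are closest to x."""
    rows, cols = len(grid), len(grid[0])
    return min(((i, j) for i in range(rows) for j in range(cols)),
               key=lambda ij: sum((a - b) ** 2 for a, b in
                                  zip(grid[ij[0]][ij[1]], x)))

# Two artificial "activity classes" with well-separated descriptors:
class_a = [[0.1, 0.1], [0.15, 0.05], [0.05, 0.2]]
class_b = [[0.9, 0.9], [0.85, 0.95], [0.95, 0.8]]
grid = train_som(class_a + class_b)
print(bmu(grid, [0.1, 0.1]), bmu(grid, [0.9, 0.9]))  # distinct map regions
```

    In the paper, fuzzy clustering is then applied on top of the trained map to delineate class regions automatically.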

  1. DOG-SPOT database for comprehensive management of dog genetic research data.

    PubMed

    Powell, Julie As; Allen, Jeremy; Sutter, Nathan B

    2010-01-01

    Research laboratories studying the genetics of companion animals have no database tools specifically designed to aid in the management of the many kinds of data that are generated, stored and analyzed. We have developed a relational database, "DOG-SPOT," to provide such a tool. Implemented in MS-Access, the database is easy to extend or customize to suit a lab's particular needs. With DOG-SPOT a lab can manage data relating to dogs, breeds, samples, biomaterials, phenotypes, owners, communications, amplicons, sequences, markers, genotypes and personnel. Such an integrated data structure helps ensure high quality data entry and makes it easy to track physical stocks of biomaterials and oligonucleotides. PMID:21159202

  2. Military Services Fitness Database: Development of a Computerized Physical Fitness and Weight Management Database for the U.S. Army

    PubMed Central

    Williamson, Donald A.; Bathalon, Gaston P.; Sigrist, Lori D.; Allen, H. Raymond; Friedl, Karl E.; Young, Andrew J.; Martin, Corby K.; Stewart, Tiffany M.; Burrell, Lolita; Han, Hongmei; Hubbard, Van S.; Ryan, Donna

    2009-01-01

    The Department of Defense (DoD) has mandated development of a system to collect and manage data on the weight, percent body fat (%BF), and fitness of all military personnel. This project aimed to (1) develop a computerized weight and fitness database to track individuals and Army units over time allowing cross-sectional and longitudinal evaluations and (2) test the computerized system for feasibility and integrity of data collection over several years of usage. The computer application, the Military Services Fitness Database (MSFD), was designed for (1) storage and tracking of data related to height, weight, %BF for the Army Weight Control Program (AWCP) and Army Physical Fitness Test (APFT) scores and (2) generation of reports using these data. A 2.5-year pilot test of the MSFD indicated that it monitors population and individual trends of changing body weight, %BF, and fitness in a military population. PMID:19216292

  3. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; read CSV data into a table or generate it from one; and perform similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for the smaller databases typically needed by a small research group. PylotDB can also be used as a learning tool for database applications in general.

  4. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    SciTech Connect

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; read CSV data into a table or generate it from one; and perform similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for the smaller databases typically needed by a small research group. PylotDB can also be used as a learning tool for database applications in general.
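
    PylotDB itself targets MySQL through its own UI; as a hedged stand-in, the sketch below uses the stdlib sqlite3 module to illustrate the class of operations the tool automates: reading CSV data into a table, querying it, and generating statistics on a numerical field.

```python
# Illustrative sketch (sqlite3 standing in for MySQL): load CSV data
# into a table, query it, and summarize a numerical field.
import csv, io, sqlite3, statistics

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (name TEXT, runtime REAL)")

# Read CSV data into the table (file contents simulated with StringIO).
csv_data = io.StringIO("name,runtime\njob1,1.5\njob2,2.5\njob3,2.0\n")
rows = [(r["name"], float(r["runtime"])) for r in csv.DictReader(csv_data)]
conn.executemany("INSERT INTO runs VALUES (?, ?)", rows)

# Query the table and compute statistics on the numerical field.
runtimes = [r[0] for r in conn.execute("SELECT runtime FROM runs")]
print(len(runtimes), statistics.mean(runtimes))  # 3 2.0
```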

  5. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets: binary large objects (BLOBs) in database systems versus files managed by the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available.
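
    The two strategies can be contrasted in miniature. In this hedged sketch, stdlib sqlite3 stands in for the shared-nothing DBMS and a dict stands in for the external filesystem or HDFS; the table and path names are invented.

```python
# Strategy 1: store the data chunk as a BLOB inside the database.
# Strategy 2: store only a path in the database; keep the bytes in an
# external filesystem (simulated here with a dict).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tiles_blob (tile_id INTEGER, data BLOB)")
conn.execute("CREATE TABLE tiles_ptr  (tile_id INTEGER, path TEXT)")

chunk = bytes(range(16))  # a tiny stand-in for a raster sub-file

# Strategy 1: the DBMS manages the bytes directly.
conn.execute("INSERT INTO tiles_blob VALUES (?, ?)", (1, chunk))

# Strategy 2: the DBMS manages only a pointer; an external filesystem
# (or HDFS) holds the bytes.
external_fs = {"/data/tiles/tile_0001.bin": chunk}
conn.execute("INSERT INTO tiles_ptr VALUES (?, ?)",
             (1, "/data/tiles/tile_0001.bin"))

blob = conn.execute("SELECT data FROM tiles_blob WHERE tile_id=1").fetchone()[0]
path = conn.execute("SELECT path FROM tiles_ptr WHERE tile_id=1").fetchone()[0]
print(blob == external_fs[path])  # True: both routes recover the same bytes
```

    In the BLOB route, subsetting would run as UDFs inside the DBMS; in the pointer route, it would run as external (e.g. MapReduce) programs against the filesystem.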

  6. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    SciTech Connect

    Wolery, T W; Sutton, M

    2011-09-19

    they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
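
    The saturation indices mentioned above are conventionally computed as SI = log10(IAP/Ksp), the ion activity product over the equilibrium constant, with SI > 0 indicating supersaturation. The sketch below uses illustrative activity values, not data from any of the databases discussed.

```python
# Saturation index: SI = log10(IAP / Ksp). SI = 0 means equilibrium,
# SI > 0 supersaturation, SI < 0 undersaturation.
import math

def saturation_index(ion_activity_product, ksp):
    return math.log10(ion_activity_product / ksp)

# e.g. calcite: IAP = a(Ca2+) * a(CO3 2-); activities below are made up
# purely for illustration.
iap = 1e-8
ksp_calcite = 10 ** -8.48
si = saturation_index(iap, ksp_calcite)
print(round(si, 2))  # 0.48: mildly supersaturated in this toy example
```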

  7. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  8. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, and the statistical processing of student information is an important part of student performance review under such a system. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.

  9. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, centered on the development of an integrated management and administrative database, are discussed.

  10. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938

  11. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938

  12. Is Library Database Searching a Language Learning Activity?

    ERIC Educational Resources Information Center

    Bordonaro, Karen

    2010-01-01

    This study explores how non-native speakers of English think of words to enter into library databases when they begin the process of searching for information in English. At issue is whether or not language learning takes place when these students use library databases. Language learning in this study refers to the use of strategies employed by…

  13. Development of a Database Program for Managing Drilling Data in the Oil and Gas Industry

    NASA Astrophysics Data System (ADS)

    Suh, J.; Choi, Y.; Park, H.; Choe, J.

    2008-12-01

    This study presents a prototype database program for managing drilling data in the oil and gas industry. The characteristics of petrophysical data from drilling cores were categorized to define the schema of the database system: the data fields in tables, the relationships between those tables, and the key index fields that create the relationships. Many types of drilling reports and previous drilling database systems were reviewed to design the relational database program, and various logging-tool algorithms were analyzed in order to offer a range of functions to the user. The database program developed in this study provides well-organized graphical user interfaces for creating, editing, querying, exporting, and visualizing drilling data, as well as for interchanging data with a spreadsheet such as MS Excel.
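
    A minimal sketch of the kind of schema the abstract describes, with wells, cores, and petrophysical measurements linked by key fields, is shown below using stdlib sqlite3; the table and field names are illustrative assumptions, not the paper's actual schema.

```python
# Illustrative relational schema for drilling data: three tables linked
# by foreign keys, plus a join query across all three.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE well (
    well_id INTEGER PRIMARY KEY,
    name TEXT, latitude REAL, longitude REAL
);
CREATE TABLE core (
    core_id INTEGER PRIMARY KEY,
    well_id INTEGER REFERENCES well(well_id),
    top_depth_m REAL, bottom_depth_m REAL
);
CREATE TABLE measurement (
    core_id INTEGER REFERENCES core(core_id),
    depth_m REAL, porosity REAL, permeability_md REAL
);
""")
conn.execute("INSERT INTO well VALUES (1, 'W-01', 35.1, 129.0)")
conn.execute("INSERT INTO core VALUES (10, 1, 1200.0, 1210.0)")
conn.execute("INSERT INTO measurement VALUES (10, 1204.5, 0.18, 35.0)")

row = conn.execute("""
    SELECT w.name, m.depth_m, m.porosity
    FROM measurement m JOIN core c ON m.core_id = c.core_id
                       JOIN well w ON c.well_id = w.well_id
""").fetchone()
print(row)  # ('W-01', 1204.5, 0.18)
```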

  14. Flight Deck Interval Management Display. [Elements, Information and Annunciations Database User Guide

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  15. A database to manage flood risk in Catalonia

    NASA Astrophysics Data System (ADS)

    Echeverria, S.; Toldrà, R.; Verdaguer, I.

    2009-09-01

    We call priority action spots those local sites where heavy rain, increased river flow, sea storms and other flooding phenomena can cause human casualties or severe damage to property. Some examples are campsites, car parks, roads, chemical factories… In order to keep the risk at these spots to a minimum, both a prevention programme and an emergency response programme are required. The flood emergency plan of Catalonia (INUNCAT), prepared in 2005, already included a listing of priority action spots compiled by the Catalan Water Agency (ACA), elaborated on the basis of past experience, hydraulic studies and information available from several knowledgeable sources. However, since land use evolves with time, this listing of priority action spots has become outdated and incomplete. A new database is being built. Not only does this new database update and expand the previous listing, but it adds to each entry information regarding prevention measures and emergency response: which spots are the most hazardous, under which weather conditions problems arise, which ones should have their access closed as soon as these conditions are forecast or actually occur, which ones should be evacuated, who is in charge of the preventive actions or emergency response, and so on. This programme has to be carried out with the help and collaboration of all the organizations involved, foremost the local authorities in the areas at risk. To achieve this goal a suitable geographical information system is necessary, one that can be easily used by all actors involved in the project. The best option has turned out to be the Spatial Data Infrastructure of Catalonia (IDEC), a platform for sharing spatial data on the Internet involving the Generalitat de Catalunya, Localret (a consortium of local authorities that promotes information technology) and other institutions.

  16. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
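
    NETMARK's Oracle 8i internals are not detailed in the abstract; as a hedged illustration of what keyword search spanning both context (element names and paths) and content (element text) means, the sketch below walks a small XML fragment with the stdlib parser.

```python
# Keyword search over a semi-structured document: a hit is recorded when
# the keyword matches an element name ("context") or its text ("content").
import xml.etree.ElementTree as ET

def keyword_search(xml_text, keyword):
    root = ET.fromstring(xml_text)
    hits = []
    def walk(node, path):
        p = f"{path}/{node.tag}"
        if keyword.lower() in node.tag.lower():
            hits.append(("context", p))
        if node.text and keyword.lower() in node.text.lower():
            hits.append(("content", p))
        for child in node:
            walk(child, p)
    walk(root, "")
    return hits

doc = "<report><title>Engine test</title><engine>RS-25</engine></report>"
print(keyword_search(doc, "engine"))
# [('content', '/report/title'), ('context', '/report/engine')]
```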

  17. An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  18. An Extensible "SCHEMA-LESS" Database Framework for Managing High-Throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  19. Design and Performance of a Xenobiotic Metabolism Database Manager for Building Metabolic Pathway Databases

    EPA Science Inventory

    A major challenge for scientists and regulators is accounting for the metabolic activation of chemicals that may lead to increased toxicity. Reliable forecasting of chemical metabolism is a critical factor in estimating a chemical’s toxic potential. Research is underway to develo...

  20. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    PubMed Central

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
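
    The pattern described, SQL sent as ASCII text and results returned as ASCII, can be mimicked in miniature. In this sketch, stdlib sqlite3 stands in for SYBASE and a single function plays the server side; the table and column names are illustrative.

```python
# The "client" builds SQL as plain strings (as the MUMPS layer does);
# the "server" executes them and returns results as tab-separated text.
import sqlite3

conn = sqlite3.connect(":memory:")

def rdbms_execute(sql_text):
    """Server side: execute SQL received as text, return rows as text."""
    cur = conn.execute(sql_text)
    if cur.description is None:          # DDL/DML: no result set
        return ""
    return "\n".join("\t".join(str(v) for v in row) for row in cur)

# Client side: only ASCII statements cross the interface.
rdbms_execute("CREATE TABLE patient (id INTEGER, name TEXT)")
rdbms_execute("INSERT INTO patient VALUES (1, 'DOE,JOHN')")
print(rdbms_execute("SELECT id, name FROM patient"))  # prints: 1<TAB>DOE,JOHN
```

    Keeping the interface text-only is what lets any host language (MUMPS, C, FORTRAN, COBOL) drive the same RDBMS.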

  1. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  2. Data management and database structure at the ARS Culture Collection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  3. Environmental management activities

    SciTech Connect

    1997-07-01

The Office of Environmental Management (EM) has been delegated the responsibility for the Department of Energy's (DOE's) cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem require the identification of technologies and scientific expertise from domestic and foreign sources. Within the United States, operational DOE facilities, as well as the decontamination and decommissioning of inactive facilities, have produced significant amounts of radioactive, hazardous, and mixed wastes. In order to ensure worker safety and the protection of the public, DOE must: (1) assess, remediate, and monitor sites and facilities; (2) store, treat, and dispose of wastes from past and current operations; and (3) develop and implement innovative technologies for environmental restoration and waste management. The EM directive necessitates looking beyond domestic capabilities to technological solutions found outside US borders. Following the collapse of the Soviet regime, formerly restricted elite Soviet scientific expertise became available to the West. EM has established a cooperative technology development program with Russian scientific institutes that meets domestic cleanup objectives by: (1) identifying and accessing Russian EM-related technologies, thereby leveraging investments and providing cost savings; (2) improving access to technical information, scientific expertise, and technologies applicable to EM needs; and (3) increasing US private sector opportunities in Russia in EM-related areas.

  4. Environmental Management vitrification activities

    SciTech Connect

    Krumrine, P.H.

    1996-05-01

Both the Mixed Waste and Landfill Stabilization Focus Areas, as part of the Office of Technology Development efforts within the Department of Energy's (DOE) Environmental Management (EM) Division, have been developing various vitrification technologies as a treatment approach for the large quantities of transuranic (TRU), TRU mixed, and mixed low-level wastes that are stored in either landfills or above-ground storage facilities. The technologies being developed include joule-heated, plasma torch, plasma arc, induction, microwave, combustion, molten metal, and in situ methods. Related efforts are going into developing glass, ceramic, and slag waste-form windows of opportunity for the diverse quantities of heterogeneous wastes needing treatment. These studies examine both processing parameters and long-term performance parameters as a function of composition, to assure that the developed technologies have the right chemistry for success.

  5. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  6. Database machines for large statistical databases. Final report, January 16, 1983-January 15, 1986

    SciTech Connect

    DeWitt, D.J.

    1986-01-01

Research activities have been directed towards the design and implementation of a high-performance database machine for accessing, manipulating, and analyzing very large statistical data sets. Topics of discussion include statistical databases, parallel algorithms and secondary storage methods, methodology for database system performance evaluation, implementation techniques for main memory database systems, intelligent buffer management systems, Gamma (a high-performance dataflow database machine), and extensible database management systems. 1 fig. (DWL)

  7. CAD-CAM database management at Bendix Kansas City

    SciTech Connect

    Witte, D.R.

    1985-05-01

The Bendix Kansas City Division of Allied Corporation began integrating mechanical CAD-CAM capabilities into its operations in June 1980. The primary capabilities include a wireframe modeling application, a solid modeling application, and the Bendix Integrated Computer Aided Manufacturing (BICAM) System, a set of software programs and procedures that provides user-friendly access to graphic applications and data, and user-friendly sharing of data between applications and users. BICAM also provides for enforcement of corporate/enterprise policies. Three access categories (private, local, and global) are realized through data-management metaphors: the desk, reading rack, file cabinet, and library serve for the storage, retrieval, and sharing of drawings and models. Access is provided through menu selections; searching for designs is done by a paging method or a search-by-attribute-value method. The sharing of designs among all users of part data is key. The BICAM System supports 375 unique users per quarter and manages over 7500 drawings and models. The BICAM System demonstrates the need for generalized models, a high-level system framework, prototyping, information-modeling methods, and an understanding of the entire enterprise. Future BICAM System implementations are planned to take advantage of this knowledge.

  8. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  9. Database management for power system planning: Final report

    SciTech Connect

    Carlsen, K.; Fink, L.H.

    1988-05-01

This project was undertaken to assess the suitability of a representative commercial data base management system (DBMS) for serving as a basis for integrating power system planning analysis software. A data base design was developed for that purpose, and implementation was attempted using the chosen commercial DBMS. Implementation was carried only far enough to demonstrate limitations due to the current state of the art in DBMS practice, which are related in large part to a commercial rather than engineering orientation. The data base design for power system analysis software integration that was developed is described in full. It appears suitable for implementation, given suitable means, and alternative means are discussed. Such means include existing special-purpose software for the integration of engineering software, and possibly special-purpose data base machines. 7 figs., 13 tabs.

  10. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative data bases are identified and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are especially stringent for either the scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  11. Information flow in the DAMA Project beyond database managers: Information flow managers

    SciTech Connect

    Russell, L.; Wolfson, O.; Yu, C.

    1996-03-01

To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point-of-sale information, is being considered in the Demand Activated Manufacturing Project of the American Textile Partnership project. A scenario is examined in which 100,000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26,000 suppliers through the use of bill-of-materials explosions at four levels of detail. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced to keep estimates of demand as current as possible.
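    The bill-of-materials explosion the abstract refers to can be sketched as a recursive multiplication of demand through per-parent component quantities. The products, quantities, and function name below are invented for illustration and bear no relation to the project's actual data.

    ```python
    # Hypothetical bill of materials: each product maps to (component, qty-per-unit).
    bom = {
        "shirt": [("fabric_panel", 4), ("button", 8)],
        "fabric_panel": [("yarn_spool", 2)],
    }

    def explode(item, demand, requirements=None):
        """Accumulate the component-level demand implied by `demand` units of `item`,
        walking the BOM tree and multiplying quantities at each level."""
        if requirements is None:
            requirements = {}
        for component, qty in bom.get(item, []):
            requirements[component] = requirements.get(component, 0) + demand * qty
            explode(component, demand * qty, requirements)
        return requirements

    # Retail demand for 100 shirts propagates down the chain:
    print(explode("shirt", 100))  # → {'fabric_panel': 400, 'yarn_spool': 800, 'button': 800}
    ```

    An information flow manager would run this kind of propagation incrementally across the supplier chain, approximating the contribution of suppliers that fail to report.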

  12. Management of three-dimensional and anthropometric databases: Alexandria and Cleopatra

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; Robinette, Kathleen; Rioux, Marc

    2000-10-01

    This paper describes two systems for managing 3D and anthropometric databases, namely Alexandria and Cleopatra. Each system is made out of three parts: the crawler, the analyzer, and the search engine. The crawler retrieves the content from the network while the analyzer describes automatically the shape, scale, and color of each retrieved object and writes down a compact descriptor. The search engine applies the query by example paradigm to find and retrieve similar or related objects from the database based on different aspects of 3D shape, scale, and color distribution. The descriptors are defined and the implementation of the system is detailed. The application of the system to the CAESAR anthropometric survey is discussed. Experimental results from the CAESAR database and from generic databases are presented.

  13. ``STANDARD LIBRARY'': A relational database for the management of electron microprobe standards

    NASA Astrophysics Data System (ADS)

    Diamond, Larryn W.; Schmatz, Dirk; Würsten, Felix

    1994-05-01

    Laboratory collections of well-characterized solid materials are an indispensable basis for the calibration of quantitative electron microprobe analyses. The STANDARD LIBRARY database has been designed to manage the wide variety of information needed to characterize such standards, and to provide a rapid way by which these data can be accessed. In addition to physical storage information, STANDARD LIBRARY includes a full set of chemical and mineralogic characterization variables, and a set of variables specific to microprobe calibration (instrumental setup, standard homogeneity, etc.). Application programs for STANDARD LIBRARY provide a series of interactive screen views for database search, retrieval, and editing operations (including inventories). Search and inventory results can be written as UNIX data files, some of which are formatted to be read directly by the software that controls CAMECA SX50™ electron microprobes. The application programs are coded in OSL for the INGRES™ database-management system, and run within any environment that supports INGRES™ (e.g. UNIX, VMS, DOS, etc.). STANDARD LIBRARY has been generalized, however, such that only the physical storage structure of the database is dependent on the selected database-management system.

  14. Data Acquisition and Database Management System for Samsung Superconductor Test Facility

    NASA Astrophysics Data System (ADS)

    Chu, Yong

    In order to fulfill the test requirement of KSTAR (Korea Superconducting Tokamak Advanced Research) superconducting magnet system, a large scale superconducting magnet and conductor test facility, SSTF (Samsung Superconductor Test Facility), has been constructed at Samsung Advanced Institute of Technology. The computer system for SSTF DAC (Data Acquisition and Control) is based on UNIX system and VxWorks is used for the real-time OS of the VME system. EPICS (Experimental Physics and Industrial Control System) is used for the communication between IOC server and client. A database program has been developed for the efficient management of measured data and a Linux workstation with PENTIUM-4 CPU is used for the database server. In this paper, the current status of SSTF DAC system, the database management system and recent test results are presented.

  15. Integrative database management for mouse development: systems and concepts.

    PubMed

    Singh, Amar V; Rouchka, Eric C; Rempala, Greg A; Bastian, Caleb D; Knudsen, Thomas B

    2007-03-01

    Cells in the developing embryo must integrate complex signals from the genome and environment to make decisions about their behavior or fate. The ability to understand the fundamental biology of the decision-making process, and how these decisions may go awry during abnormal development, requires a systems biology paradigm. Presently, the ability to build models with predictive capability in birth defects research is constrained by an incomplete understanding of the fundamental parameters underlying embryonic susceptibility, sensitivity, and vulnerability. Key developmental milestones must be parameterized in terms of system structure and dynamics, the relevant control methods, and the overall design logic of metabolic and regulatory networks. High-content data from genome-based studies provide some comprehensive coverage of these operational processes but a key research challenge is data integration. Analysis can be facilitated by data management resources and software to reveal the structure and function of bionetwork motifs potentially associated with an altered developmental phenotype. Borrowing from applied mathematics and artificial intelligence, we conceptualize a system that can help address the new challenges posed by the transformation of birth defects research into a data-driven science. PMID:17539026

  16. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

We present Insight as an integrated database and analysis platform for epilepsy self-management research as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies with several new data management features and user-friendly functionalities. The features of Insight include: (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying, (2) visualization tools to support real-time exploration of data distribution across research studies, and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represents over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails Web technology and the open source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a Role-based Access Control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee consisting of representatives of all current collaborating centers of the Managing Epilepsy Well Network. New research studies are being continuously added to the Insight database and the size as well as the unique coverage of the dataset allows investigators to conduct

  17. Design and implementation of dEDBMS: a distributed engineering database management system

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Lin, Zongkai; Guo, Yuzhai

    1996-03-01

dEDBMS is a practical distributed engineering database management system for CAD applications. This paper systematically introduces its system architecture, data distribution, security mechanism, and concurrency-control strategy, and discusses in detail the concept and implementation of the authors' `lock' mechanism, which comprises two categories and three kinds of locks.

  18. The Subject-Object Relationship Interface Model in Database Management Systems.

    ERIC Educational Resources Information Center

    Yannakoudakis, Emmanuel J.; Attar-Bashi, Hussain A.

    1989-01-01

    Describes a model that displays structures necessary to map between the conceptual and external levels in database management systems, using an algorithm that maps the syntactic representations of tuples onto semantic representations. A technique for translating tuples into natural language sentences is introduced, and a system implemented in…

  19. Functions and Relations: Some Applications from Database Management for the Teaching of Classroom Mathematics.

    ERIC Educational Resources Information Center

    Hauge, Sharon K.

    While functions and relations are important concepts in the teaching of mathematics, research suggests that many students lack an understanding and appreciation of these concepts. The present paper discusses an approach for teaching functions and relations that draws on the use of illustrations from database management. This approach has the…

  20. Typical applications of a microcomputer database manager to power system problems

    SciTech Connect

    Rao, N.D.

    1987-08-01

This paper reports on the application of a popular DBMS (database management system) to the solution of selected problems in power systems with the IBM Personal Computer, in the context of the power engineering curriculum at the University of Calgary. The problems selected are inductance and capacitance calculations for overhead transmission lines and per-unit calculations for large power networks. These examples illustrate how to structure, organize, manipulate, and query a database, and also how to use it to service one or more independent application program modules that perform specific tasks without unnecessary data duplication. The power and flexibility of command-file programming involving multiple database file access, as well as program modularization, are also shown. The application programs developed in the paper amply demonstrate not only the benefits of the built-in query language, but also the ability of the DBMS to interface these programs with one or several database files. Because of these versatile features, microcomputer database management systems hold considerable promise as important, cost-effective supplementary tools to conventional programming for power utilities.
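    The pattern described, a single shared database serving independent application modules without duplicating data, can be sketched with Python's built-in sqlite3 (not the microcomputer DBMS used in the paper). The schema, line parameters, and module below are invented for illustration.

    ```python
    import sqlite3

    # One shared database of transmission-line data; application modules
    # query it rather than keeping private copies (schema is hypothetical).
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE line (name TEXT, length_km REAL,
                                       r_ohm_per_km REAL, x_ohm_per_km REAL)""")
    conn.execute("INSERT INTO line VALUES ('L1', 120.0, 0.5, 0.25)")
    conn.execute("INSERT INTO line VALUES ('L2', 80.0, 0.25, 0.5)")

    def series_impedance(conn, name):
        """One 'application module': total series impedance R + jX of a line,
        computed from per-km parameters fetched from the shared database."""
        length, r, x = conn.execute(
            "SELECT length_km, r_ohm_per_km, x_ohm_per_km FROM line WHERE name = ?",
            (name,)).fetchone()
        return complex(r * length, x * length)

    print(series_impedance(conn, "L1"))  # → (60+30j)
    ```

    Further modules (per-unit conversion, capacitance calculations) would query the same table, which is the data-duplication point the abstract makes.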

  1. A pilot GIS database of active faults of Mt. Etna (Sicily): A tool for integrated hazard evaluation

    NASA Astrophysics Data System (ADS)

    Barreca, Giovanni; Bonforte, Alessandro; Neri, Marco

    2013-02-01

A pilot GIS-based system has been implemented for the assessment and analysis of hazard related to active faults affecting the eastern and southern flanks of Mt. Etna. The system was developed in the ArcGis® environment and consists of different thematic datasets that include spatially referenced arc features and an associated database. Arc-type features, georeferenced to the WGS84 ellipsoid in the UTM zone 33 projection, represent the five main fault systems that develop in the analysed region. The backbone of the GIS-based system is the large amount of information that was collected from the literature and then stored and properly geocoded in a digital database. This consists of thirty-five alphanumeric fields that include all fault parameters available from the literature, such as location, kinematics, landform, slip rate, etc. Although the system has been implemented according to the most common procedures used by GIS developers, the architecture and content of the database represent a pilot backbone for the digital storage of fault parameters, providing a powerful tool for modelling hazard related to the active tectonics of Mt. Etna. The database collects, organises, and shares all currently available scientific information about the active faults of the volcano. Furthermore, thanks to the strong effort spent on defining the fields of the database, the structure proposed in this paper is open to the collection of further data coming from future improvements in the knowledge of the fault systems. By layering additional user-specific geographic information and querying the proposed database (topological querying), the user can produce a great diversity of hazard and vulnerability maps. This is a proposal of a backbone for a comprehensive geographical database of fault systems, universally applicable to other sites.

  2. PRAIRIEMAP: A GIS database for prairie grassland management in western North America

    USGS Publications Warehouse

    U.S. Geological Survey

    2003-01-01

    The USGS Forest and Rangeland Ecosystem Science Center, Snake River Field Station (SRFS) maintains a database of spatial information, called PRAIRIEMAP, which is needed to address the management of prairie grasslands in western North America. We identify and collect spatial data for the region encompassing the historical extent of prairie grasslands (Figure 1). State and federal agencies, the primary entities responsible for management of prairie grasslands, need this information to develop proactive management strategies to prevent prairie-grassland wildlife species from being listed as Endangered Species, or to develop appropriate responses if listing does occur. Spatial data are an important component in documenting current habitat and other environmental conditions, which can be used to identify areas that have undergone significant changes in land cover and to identify underlying causes. Spatial data will also be a critical component guiding the decision processes for restoration of habitat in the Great Plains. As such, the PRAIRIEMAP database will facilitate analyses of large-scale and range-wide factors that may be causing declines in grassland habitat and populations of species that depend on it for their survival. Therefore, development of a reliable spatial database carries multiple benefits for land and wildlife management. The project consists of 3 phases: (1) identify relevant spatial data, (2) assemble, document, and archive spatial data on a computer server, and (3) develop and maintain the web site (http://prairiemap.wr.usgs.gov) for query and transfer of GIS data to managers and researchers.

  3. GSIMF: a web service based software and database management system for the generation grids.

    SciTech Connect

    Wang, N.; Ananthan, B.; Gieraltowski, G.; May, E.; Vaniachine, A.; Tech-X Corp.

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provide a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

  4. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
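    The subdivision of a survey section into sixteen idealized quarter-quarter sections is a simple 4 x 4 gridding of the section's bounding square, and can be sketched in Python. The coordinate frame, cell labels, and function name are invented; the real system works on ARC/INFO coverages, not bare tuples.

    ```python
    def quarter_quarter_cells(x0, y0, section_size=1.0):
        """Return (label, xmin, ymin, xmax, ymax) for the 16 idealized
        quarter-quarter cells of a square section whose lower-left corner
        is (x0, y0). Labels 'r0c0'..'r3c3' are invented for illustration."""
        step = section_size / 4.0
        cells = []
        for row in range(4):
            for col in range(4):
                cells.append((f"r{row}c{col}",
                              x0 + col * step, y0 + row * step,
                              x0 + (col + 1) * step, y0 + (row + 1) * step))
        return cells

    cells = quarter_quarter_cells(0.0, 0.0)
    print(len(cells))   # → 16
    print(cells[0])     # → ('r0c0', 0.0, 0.0, 0.25, 0.25)
    ```

    A pesticide-application record located within a section can then be assigned to the cell whose extent contains its coordinates.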

  5. The Coral Triangle Atlas: an integrated online spatial database system for improving coral reef management.

    PubMed

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the 'Coral Triangle Area' in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region. PMID:24941442

  6. The Coral Triangle Atlas: An Integrated Online Spatial Database System for Improving Coral Reef Management

    PubMed Central

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the ‘Coral Triangle Area’ in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region. PMID:24941442

  7. Problems underlying the use of referential integrity mechanisms in relational database management systems

    SciTech Connect

    Markowitz, V.M.

    1990-12-01

Referential integrity is used in relational databases for expressing existence dependencies between tuples. Relational database management systems (RDBMSs) provide diverse referential integrity capabilities. Thus, in some RDBMSs referential integrity constraints can be specified non-procedurally (declaratively), while in other RDBMSs they must be specified procedurally. Moreover, some RDBMSs restrict the class of allowed referential integrity constraints. We examine in this paper the main problems underlying the use of referential integrity mechanisms in three representative RDBMSs: DB2, SYBASE, and INGRES. 11 refs.
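    A declarative referential-integrity constraint of the kind the abstract contrasts with procedural specification can be sketched with Python's sqlite3 (the paper's systems are DB2, SYBASE, and INGRES; sqlite3 is used here only because it ships with Python, and the schema is invented). Note that SQLite enforces foreign keys only when the pragma is enabled.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only on request
    conn.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY)")
    # Declarative constraint: every emp.dept_id must reference an existing dept.
    conn.execute("""CREATE TABLE emp (id INTEGER PRIMARY KEY,
                                      dept_id INTEGER REFERENCES dept(id))""")
    conn.execute("INSERT INTO dept VALUES (10)")
    conn.execute("INSERT INTO emp VALUES (1, 10)")      # parent exists: accepted

    try:
        conn.execute("INSERT INTO emp VALUES (2, 99)")  # no such dept: rejected
    except sqlite3.IntegrityError as exc:
        print("rejected:", exc)
    ```

    In a system without declarative support, the same existence dependency would have to be coded procedurally, for example in triggers or application logic, which is one source of the problems the paper examines.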

  8. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

The database stores information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments of landslide activity. It also stores spatial objects that represent the components of a landslide, in particular the scarps and accumulation areas, along with waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data. Examples of spatial queries to the database include intersections of raster and vector data for calculating slope gradients or aspects of landslide areas, creation of multiple overlaying sections for the comparison of slopes, and computation of distances to infrastructure or to the nearest receiving drainage, as well as queries on landslide magnitudes, distribution, and clustering, and on potential correlations with geomorphological or geological conditions. The data-management concept in this study can be implemented for any academic, public, or private use, because it is independent of any obligatory licenses. The spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping.

  9. Metadata Dictionary Database: A Proposed Tool for Academic Library Metadata Management

    ERIC Educational Resources Information Center

    Southwick, Silvia B.; Lampert, Cory

    2011-01-01

    This article proposes a metadata dictionary (MDD) be used as a tool for metadata management. The MDD is a repository of critical data necessary for managing metadata to create "shareable" digital collections. An operational definition of metadata management is provided. The authors explore activities involved in metadata management in…

  10. Reef Ecosystem Services and Decision Support Database

    EPA Science Inventory

    This scientific and management information database utilizes systems thinking to describe the linkages between decisions, human activities, and provisioning of reef ecosystem goods and services. This database provides: (1) Hierarchy of related topics - Click on topics to navigat...

  11. Rhode Island Water Supply System Management Plan Database (WSSMP-Version 1.0)

    USGS Publications Warehouse

    Granato, Gregory E.

    2004-01-01

    In Rhode Island, the availability of water of sufficient quality and quantity to meet current and future environmental and economic needs is vital to life and the State's economy. Water suppliers, the Rhode Island Water Resources Board (RIWRB), and other State agencies responsible for water resources in Rhode Island need information about available resources, the water-supply infrastructure, and water use patterns. These decision makers need historical, current, and future water-resource information. In 1997, the State of Rhode Island formalized a system of Water Supply System Management Plans (WSSMPs) to characterize and document relevant water-supply information. All major water suppliers (those that obtain, transport, purchase, or sell more than 50 million gallons of water per year) are required to prepare, maintain, and carry out WSSMPs. An electronic database for this WSSMP information has been deemed necessary by the RIWRB for water suppliers and State agencies to consistently document, maintain, and interpret the information in these plans. Availability of WSSMP data in standard formats will allow water suppliers and State agencies to improve the understanding of water-supply systems and to plan for future needs or water-supply emergencies. In 2002, however, the Rhode Island General Assembly passed a law that classifies some of the WSSMP information as confidential to protect the water-supply infrastructure from potential terrorist threats. Therefore the WSSMP database was designed for an implementation method that will balance security concerns with the information needs of the RIWRB, suppliers, other State agencies, and the public. A WSSMP database was developed by the U.S. Geological Survey in cooperation with the RIWRB. The database was designed to catalog WSSMP information in a format that would accommodate synthesis of current and future information about Rhode Island's water-supply infrastructure. 
This report documents the design and implementation of…

  12. Watershed Data Management (WDM) Database for Salt Creek Streamflow Simulation, DuPage County, Illinois

    USGS Publications Warehouse

    Murphy, Elizabeth A.; Ishii, Audrey

    2006-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Department of Engineering, Stormwater Management Division, maintains a database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. The majority of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Illinois. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. This report describes a version of the WDM database that was quality-assured and quality-controlled annually to ensure the datasets were complete and accurate. This version of the WDM database contains data from January 1, 1997, through September 30, 2004, and is named SEP04.WDM. This report provides a record of time periods of poor data for each precipitation dataset and describes methods used to estimate the data for the periods when data were missing, flawed, or snowfall-affected. The precipitation dataset data-filling process was changed in 2001, and both processes are described. The other meteorologic and hydrologic datasets in the database are fully described in the annual U.S. Geological Survey Water Data Report for Illinois and, therefore, are described in less detail than the precipitation datasets in this report.
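The report's description of filling missing precipitation records suggests a simple illustration. The sketch below uses generic linear interpolation for interior gaps; it is not the USGS estimation method, which the report itself documents:

```python
def fill_gaps(series):
    """Linearly interpolate interior gaps (None) in an hourly series.

    A generic illustration only; the USGS report describes its own
    data-filling methods, which are not reproduced here.
    """
    filled = list(series)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i - 1          # last good value (assumed to exist)
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1             # scan to the end of the gap
            lo, hi = filled[start], filled[j]
            step = (hi - lo) / (j - start)
            for k in range(i, j):
                filled[k] = lo + step * (k - start)
            i = j
        else:
            i += 1
    return filled

print(fill_gaps([0.0, None, None, 3.0]))  # [0.0, 1.0, 2.0, 3.0]
```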

  13. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    NASA Technical Reports Server (NTRS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays, or binary large objects. We will discuss Kepler DB's APIs, implementation, usage, and deployment at the Kepler Science Operations Center.
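A sparse array of the kind Kepler DB manages can be illustrated with a small sketch. The class below stores only non-fill pixel values; it is purely a data-model illustration, not Kepler DB's actual storage format:

```python
class SparseArray:
    """Minimal sketch of a sparse one-dimensional array: only indices
    whose values differ from a fill value are stored. This illustrates
    the data model only; Kepler DB's on-disk format is not described
    in the abstract and is not reproduced here."""

    def __init__(self, length, fill=0.0):
        self.length = length
        self.fill = fill
        self.values = {}                   # index -> stored value

    def __setitem__(self, index, value):
        if value == self.fill:
            self.values.pop(index, None)   # never store fill values
        else:
            self.values[index] = value

    def __getitem__(self, index):
        return self.values.get(index, self.fill)

    def to_dense(self):
        return [self[i] for i in range(self.length)]

pixels = SparseArray(6)
pixels[2] = 41.5
pixels[5] = 7.25
print(pixels.to_dense())  # [0.0, 0.0, 41.5, 0.0, 0.0, 7.25]
```

The payoff is that storage scales with the number of measured pixels rather than the full array length, which matters when most entries are empty.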

  14. Hazards Control Department use of the Sperry database management system, MAPPER

    SciTech Connect

    King, W.C.

    1987-10-19

    The Hazards Control data files are maintained in MAPPER, a Sperry-developed database management system that has been used for many years by Univac users. In its current version, MAPPER is extremely versatile and is considered one of the best fourth generation programs. It contains, in addition to a database management system, a word processor, office automation program including electronic mail, and color graphics routines. All of these routines are available for the IBM PC user employing the PEP program of Computer Logics Ltd. with proper board configuration. Some of the routines of MAPPER listed above will not be available if the user is employing VTERM through the Sytek net or by direct dial-up.

  15. Pan Air Geometry Management System (PAGMS): A data-base management system for PAN AIR geometry data

    NASA Technical Reports Server (NTRS)

    Hall, J. F.

    1981-01-01

    A data-base management system called PAGMS was developed to facilitate the data transfer in applications computer programs that create, modify, plot or otherwise manipulate PAN AIR type geometry data in preparation for input to the PAN AIR system of computer programs. PAGMS is composed of a series of FORTRAN callable subroutines which can be accessed directly from applications programs. Currently only a NOS version of PAGMS has been developed.

  16. University Management of Research: A Data-Based Policy and Planning. AIR 1989 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Strubbe, J.

    The development of an appropriate research policy for a university as well as for the national and international levels can be accomplished only if quantitative data and qualitative evaluations (scientific contribution, results, goal-achievement) are made available to illustrate research activities. A database is described that would enable…

  17. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards, such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  18. Metabolonote: a wiki-based database for managing hierarchical metadata of metabolome analyses.

    PubMed

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics - technology for comprehensive detection of small molecules in an organism - lags behind the other "omics" in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of published data, which leaves submitters (the researchers who generate the data) insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called "Togo Metabolome Data" (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http

  19. Metabolonote: A Wiki-Based Database for Managing Hierarchical Metadata of Metabolome Analyses

    PubMed Central

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics – technology for comprehensive detection of small molecules in an organism – lags behind the other “omics” in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of published data, which leaves submitters (the researchers who generate the data) insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called “Togo Metabolome Data” (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers’ understanding and use of data but also submitters’ motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http
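The tree-structured, ID-addressable metadata described above can be sketched briefly. The dotted ID syntax and field names below are invented for illustration and are not the actual TogoMD format:

```python
# Sketch of ID-addressable, tree-structured metadata in the spirit of
# TogoMD. The ID syntax ("S1", "S1.M1", ...) and fields are invented;
# the real TogoMD ID system is defined by Metabolonote itself.
metadata = {
    "S1": {
        "title": "Fruit ripening study",
        "children": {
            "S1.M1": {"title": "LC-MS, positive mode", "children": {}},
            "S1.M2": {"title": "GC-MS", "children": {}},
        },
    }
}

def lookup(tree, node_id):
    """Resolve a dotted ID by walking one tree level per segment."""
    parts = node_id.split(".")
    node = tree[parts[0]]
    for depth in range(2, len(parts) + 1):
        node = node["children"][".".join(parts[:depth])]
    return node

print(lookup(metadata, "S1.M2")["title"])  # GC-MS
```

Because each level has its own stable ID, readers and APIs can link to exactly the study, sample, or analytical-method node they mean, which is the property the abstract emphasizes.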

  20. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  1. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
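A fault signature as described, a structured representation of the evidence an expert would use, can be sketched as follows. The schema and symptom names are invented for illustration and are not the FW-PHM Suite's actual AFS format:

```python
from dataclasses import dataclass

# Illustrative sketch only: the AFS Database schema is not given in the
# abstract. A fault signature is modeled here as a fault type plus the
# set of symptoms an expert would expect to observe together.
@dataclass(frozen=True)
class FaultSignature:
    fault_type: str
    expected_symptoms: frozenset

    def matches(self, observed):
        """A signature fires when all expected symptoms are observed."""
        return self.expected_symptoms <= set(observed)

signatures = [
    FaultSignature("winding insulation degradation",
                   frozenset({"high dissolved gas", "rising hot-spot temp"})),
    FaultSignature("cooling system fault",
                   frozenset({"rising hot-spot temp", "low oil flow"})),
]

observed = {"high dissolved gas", "rising hot-spot temp"}
diagnoses = [s.fault_type for s in signatures if s.matches(observed)]
print(diagnoses)  # ['winding insulation degradation']
```

Cataloging signatures in a shared database, as the AFS Database does, lets this match step run automatically across a fleet instead of relying on each expert's memory.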

  2. Database Administrator

    ERIC Educational Resources Information Center

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  3. Advanced Scientific Computing Environment Team new scientific database management task. Progress report

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and with NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) permitting the use of relational, hierarchical, object-oriented, GIS, and other databases. Reaching this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; developing a new J90 prototype to provide JOSHUA capabilities on Unix platforms; developing portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extending "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  4. A Prescribed Fire Emission Factors Database for Land Management and Air Quality Applications

    NASA Astrophysics Data System (ADS)

    Lincoln, E.; Hao, W.; Baker, S.; Yokelson, R. J.; Burling, I. R.; Urbanski, S. P.; Miller, W.; Weise, D. R.; Johnson, T. J.

    2010-12-01

    Prescribed fire is a significant emissions source in the U.S. that needs to be adequately characterized in atmospheric transport/chemistry models. In addition, the Clean Air Act, its amendments, and air quality regulations require that prescribed fire managers estimate the quantity of emissions that a prescribed fire will produce. Several published papers contain a few emission factors for prescribed fire, and additional results are found in unpublished documents whose quality has to be assessed. In conjunction with three research projects developing detailed new emissions data and meteorological tools to assist prescribed fire managers, the Strategic Environmental Research and Development Program (SERDP) is supporting development of a database that contains emissions information related to prescribed burning. Ultimately, this database will be available on the Internet and will contain older emissions information that has been assessed and newer emissions information that has been developed from both laboratory-scale and field measurements. The database currently contains emissions information from over 300 burns of different wildland vegetation types, including grasslands, shrublands, woodlands, forests, and tundra over much of North America. A summary of the compiled data will be presented, along with suggestions for additional categories.

  5. The Use of SQL and Second Generation Database Management Systems for Data Processing and Information Retrieval in Libraries.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1989-01-01

    Describes Structured Query Language (SQL), the result of an American National Standards Institute effort to standardize language used to query computer databases and a common element in second generation database management systems. The discussion covers implementations of SQL, associated products, and techniques for its use in online catalogs,…
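As a minimal illustration of the kind of SQL query the article discusses for online catalogs, here is a hypothetical catalog table queried through Python's built-in sqlite3 module (schema and titles invented):

```python
import sqlite3

# A hypothetical online-catalog table; the parameterized SELECT below is
# the style of standard SQL the article describes as common across
# second-generation database management systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE catalog (title TEXT, author TEXT, year INTEGER);
    INSERT INTO catalog VALUES
        ('PostGIS in Action', 'Obe', 2011),
        ('Database Systems', 'Date', 1989);
""")
hits = conn.execute(
    "SELECT title FROM catalog WHERE year >= ? ORDER BY title", (2000,)
).fetchall()
print(hits)  # [('PostGIS in Action',)]
```

Because SQL was standardized by ANSI, the same statement runs largely unchanged across vendors, which is the portability argument the article makes for library systems.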

  6. Head-to-Head Evaluation of the Pro-Cite and Sci-Mate Bibliographic Database Management Systems.

    ERIC Educational Resources Information Center

    Saari, David S.; Foster, George A., Jr.

    1989-01-01

    Compares two full featured database management systems for bibliographic information in terms of programs and documentation; record creation and editing; online database citations; search procedures; access to references in external text files; sorting and printing functions; style sheets; indexes; and file operations. (four references) (CLB)

  7. PlantCAZyme: a database for plant carbohydrate-active enzymes

    PubMed Central

    Ekstrom, Alexander; Taujale, Rahil; McGinn, Nathan; Yin, Yanbin

    2014-01-01

    PlantCAZyme is a database built upon dbCAN (database for automated carbohydrate active enzyme annotation), aiming to provide pre-computed sequence and annotation data of carbohydrate active enzymes (CAZymes) to plant carbohydrate and bioenergy research communities. The current version contains data of 43 790 CAZymes of 159 protein families from 35 plants (including angiosperms, gymnosperms, lycophyte and bryophyte mosses) and chlorophyte algae with fully sequenced genomes. Useful features of the database include: (i) a BLAST server and a HMMER server that allow users to search against our pre-computed sequence data for annotation purpose, (ii) a download page to allow batch downloading data of a specific CAZyme family or species and (iii) protein browse pages to provide an easy access to the most comprehensive sequence and annotation data. Database URL: http://cys.bios.niu.edu/plantcazyme/ PMID:25125445

  8. Managing database under the DPSIR tool for the implementation of European Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Cinnirella, S.; Trombino, G.; Pesenti, E.; Algieri, A.; Pirrone, N.

    2003-04-01

    With the purpose of establishing European legislation to protect inland surface waters, transitional waters, coastal waters, and groundwater, the European Parliament and the Council of the European Union adopted the Water Framework Directive 2000/60/EC (WFD). The holistic approach of the WFD to the management of waters has been adopted to protect water bodies as interlinked systems from the spring to the sea. With the above in mind, the EUROCAT project, funded by the European Commission as part of FP5, was started two years ago; it aims to define, for different catchment-coastal zone continuums in Europe, possible policy responses to mitigate environmental pressures driven by the main economic activities existing in each area. In order to account for different aspects of the spatial interactions between the watershed and the coastal zone, the Drivers-Pressures-State-Impact-Response (DPSIR) framework was developed for the Po Catchment-Adriatic Coastal Zone continuum and applied as a policy tool to evaluate important aspects to be considered during the implementation of the WFD. The DPSIR framework includes ecological as well as socio-economic indicators that represent the Drivers that create Pressures, which are gradually integrated as the Status of the system is assessed. Evaluation of the Impact of different Drivers and Pressure factors allows the identification of the optimal strategy(-ies) for achieving the objectives and goals of the WFD. 
The aim of this paper is to discuss and present a multi-layer database that includes socio-economic, physical, and ecological indicators that have been used to run different biogeochemical and socio-economic models, including one for assessing nutrient fluxes in the catchment (MONERIS; Behrendt, 1996), the nutrient cycle in surface and deep seawaters of the Adriatic Sea (the WASP model), and socio-economic models (i.e., DEFINITE) for different socio-economic and environmental scenarios, including Business As Usual (BAU) and

  9. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary, and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies, and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships, and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. PMID:25682266

  10. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owners. FM updates the site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients, including ARM and RSM (Remote Sensing Mast), update the rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
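The frame-tree query idea, composing transforms along parent links so any two frames can be related, can be sketched in one dimension. Scalar offsets stand in for the full 3-D transforms FM maintains; the frame names follow the abstract, while the parent links and offset values are invented:

```python
# Minimal one-dimensional frame-tree sketch: scalar offsets stand in
# for full 3-D transforms. Frame names follow the abstract; parent
# links and offset values are invented for illustration.
parents = {"site": "surface", "rover": "site", "rsm": "rover", "arm": "rover"}
offsets = {"site": 100.0, "rover": 5.0, "rsm": 1.5, "arm": -0.5}  # frame -> parent

def to_root(frame):
    """Cumulative offset from `frame` up to the root ('surface') frame."""
    total = 0.0
    while frame in parents:
        total += offsets[frame]
        frame = parents[frame]
    return total

def transform(src, dst):
    """Offset mapping coordinates expressed in `src` into `dst`."""
    return to_root(src) - to_root(dst)

print(transform("rsm", "arm"))  # 2.0
```

Storing only each frame's transform to its parent, and deriving everything else by composition, is what lets a centralized tree answer queries between arbitrary frame pairs without hand-computed coordinate entries.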

  11. National information network and database system of hazardous waste management in China

    SciTech Connect

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  12. A Method to Calculate and Analyze Residents' Evaluations by Using a Microcomputer Data-Base Management System.

    ERIC Educational Resources Information Center

    Mills, Myron L.

    1988-01-01

    A system developed for more efficient evaluation of graduate medical students' progress uses numerical scoring and a microcomputer database management system as an alternative to manual methods to produce accurate, objective, and meaningful summaries of resident evaluations. (Author/MSE)

  13. Activated charcoal. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-06-01

    The bibliography contains citations concerning theoretical aspects and industrial applications of activated charcoal. Topics include adsorption capacity and mechanism studies, kinetic and thermodynamic aspects, and description and evaluation of adsorptive abilities. Applications include use in water analyses and waste treatment, air pollution control and measurement, and in nuclear facilities. (Contains a minimum of 151 citations and includes a subject term index and title list.)

  14. Development of genome viewer (Web Omics Viewer) for managing databases of cucumber genome

    NASA Astrophysics Data System (ADS)

    Wojcieszek, M.; RóŻ, P.; Pawełkowicz, M.; Nowak, R.; Przybecki, Z.

    Cucumber is an important plant in horticulture and in science. Sequencing projects for the C. sativus genome enable new methodological approaches in further investigation of this species. Accessibility is crucial to fully exploit the obtained information about the detailed structure of genes, markers, and other characteristic features such as contigs, scaffolds, and chromosomes. A genome viewer is one of the tools providing a plain and easy way to present genome data to users and to administer the databases. GBrowse, the main viewer, has several very useful features but lacks simplicity of management. Our group developed a new genome browser, Web Omics Viewer (WOV), keeping this functionality while improving usability of and accessibility to the cucumber genome data.

  15. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    SciTech Connect

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  16. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    NASA Astrophysics Data System (ADS)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices for achieving compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.

  17. Application of a microcomputer-based database management system to distribution system reliability evaluation

    SciTech Connect

    Hsu, Y.Y.; Chen, L.M.; Chen, J.L.; Hsueh, M.C.; Lin, C.T.; Chen, Y.W.; Chen, J.J.; Liu, T.S.S.; Chen, W.C.

    1990-01-01

    The experience with the application of a database management system (DBMS) to handle the large amounts of data involved in distribution system reliability evaluation is reported. To demonstrate the capability of the DBMS in data manipulation, reliability evaluation of a distribution system in Taiwan is performed using a DBMS installed on an IBM PC-AT. It is found that a DBMS is a very efficient tool for organizing the data required by distribution planners. Moreover, the DBMS approach is very cost-effective since it is installed on a personal computer.

  18. Performance of online drug information databases as clinical decision support tools in infectious disease medication management.

    PubMed

    Polen, Hyla H; Zapantis, Antonia; Clauson, Kevin A; Jebrock, Jennifer; Paris, Mark

    2008-01-01

    Infectious disease (ID) medication management is complex, and clinical decision support tools (CDSTs) can provide valuable assistance. This study evaluated the scope and completeness of ID drug information found in online databases by evaluating their ability to answer 147 question/answer pairs. Scope scores produced the highest rankings (%) for: Micromedex (82.3), Lexi-Comp/American Hospital Formulary Service (81.0), and Medscape Drug Reference (81.0); the lowest included: Epocrates Online Premium (47.0), Johns Hopkins ABX Guide (45.6), and PEPID PDC (40.8). PMID:18999059

  19. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    SciTech Connect

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.; Ghosh, Dr. Joydeep

    2014-01-01

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility for users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
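
The idea of serving relational data as a graph can be illustrated in miniature. The table contents and node names below are hypothetical; the sketch only shows how a normalized (patient, provider) relation becomes an adjacency structure on which a "shared providers" query is a one-hop set intersection rather than a multi-table join.

```python
# Hypothetical rows from a normalized visits table: (patient, provider).
visits = [
    ("p1", "dr_a"), ("p1", "dr_b"),
    ("p2", "dr_b"), ("p2", "dr_c"),
]

# Build an undirected bipartite graph: patients and providers are nodes,
# each visit row becomes an edge in both directions.
graph = {}
for patient, provider in visits:
    graph.setdefault(patient, set()).add(provider)
    graph.setdefault(provider, set()).add(patient)

def shared_providers(a, b):
    """Providers adjacent to both patients: one hop plus a set intersection,
    instead of a self-join on the relational table."""
    return graph[a] & graph[b]

print(shared_providers("p1", "p2"))  # {'dr_b'}
```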

  20. ‘Isotopo’ a database application for facile analysis and management of mass isotopomer data

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eylert, Eva; Eisenreich, Wolfgang; Dandekar, Thomas

    2014-01-01

    The composition of stable-isotope labelled isotopologues/isotopomers in metabolic products can be measured by mass spectrometry and supports the analysis of pathways and fluxes. As a prerequisite, the original mass spectra have to be processed, managed and stored to rapidly calculate, analyse and compare isotopomer enrichments to study, for instance, bacterial metabolism in infection. For such applications, we provide here the database application ‘Isotopo’. This software package includes (i) a database to store and process isotopomer data, (ii) a parser to upload and translate different data formats for such data and (iii) an improved application to process and convert signal intensities from mass spectra of 13C-labelled metabolites such as tert-butyldimethylsilyl-derivatives of amino acids. Relative mass intensities and isotopomer distributions are calculated applying a partial least squares method with iterative refinement for high precision data. The data output includes formats such as graphs for overall enrichments in amino acids. The package is user-friendly for easy and robust data management of multiple experiments. Availability: The ‘Isotopo’ software is available at the following web link (section Download): http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. The package contains three additional files: a software executable setup (installer), one data set file (discussed in this article) and one Excel file (which can be used to convert data from Excel to ‘.iso’ format). The ‘Isotopo’ software is compatible only with the Microsoft Windows operating system. Database URL: http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. PMID:25204646
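
The basic normalization step behind an isotopomer distribution can be sketched as follows. The function name and intensity values are hypothetical; the actual 'Isotopo' package goes further, applying a partial least squares method with iterative refinement to the raw spectra.

```python
# Sketch: convert raw mass-spectral signal intensities for the M+0..M+n
# isotopomer peaks of one metabolite into relative fractions that sum to 1.
def isotopomer_fractions(intensities):
    """Normalize peak intensities to a relative isotopomer distribution."""
    total = sum(intensities)
    return [i / total for i in intensities]

# Hypothetical intensities for M+0, M+1, M+2 of a 13C-labelled amino acid.
fractions = isotopomer_fractions([800.0, 150.0, 50.0])
print(fractions)  # [0.8, 0.15, 0.05]
```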

  1. Database management research for the Human Genome Project: Progress report, 7/1/96-3/15/97

    SciTech Connect

    Goodman, N.

    1997-03-01

    Progress is reported on the development of software that works in conjunction with database management systems (DBMSs) in ways that are useful for genomics. This new release of LabBase has two major advantages over the previous version: it runs on the Sybase relational DBMS rather than ObjectStore, and it offers more complete data modeling features, making it suitable for more kinds of genetic databases.

  2. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and considerable supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  3. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    NASA Technical Reports Server (NTRS)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
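
The selection logic described, desirability parameters filtered by constraints and ranked by priority, can be sketched generically. The parameter names and limits below are hypothetical illustrations; the actual expert system expressed such rules in Prolog.

```python
# Sketch of rule/priority-driven picture selection: keep the candidates
# that satisfy hard constraints, then rank by desirability. The fields
# "desirability" and "exposure_ok" are hypothetical stand-ins for the
# parameters stored in the candidate-picture database records.
candidates = [
    {"id": "P1", "desirability": 0.9, "exposure_ok": True},
    {"id": "P2", "desirability": 0.7, "exposure_ok": False},
    {"id": "P3", "desirability": 0.6, "exposure_ok": True},
]

def select(cands, limit=2):
    feasible = [c for c in cands if c["exposure_ok"]]              # hard constraint
    feasible.sort(key=lambda c: c["desirability"], reverse=True)   # priority rule
    return [c["id"] for c in feasible[:limit]]

print(select(candidates))  # ['P1', 'P3']
```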

  4. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    NASA Astrophysics Data System (ADS)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing DICOM-RT structure sets in a database management system in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
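
As a rough illustration of the kind of secondary geometric calculation delegated to the database: for a simple closed planar contour, an area query (what PostGIS exposes server-side as ST_Area) reduces to the shoelace formula. The sketch below is a plain-Python stand-in with made-up contour points, not the PostGIS implementation.

```python
# Shoelace formula: area of a simple closed polygon given as vertex list.
def polygon_area(points):
    """Sum the cross products of consecutive vertices, halve, take abs."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]   # wrap around to close the contour
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Hypothetical contour: a 10 x 10 square in slice-plane coordinates.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(polygon_area(square))  # 100.0
```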

  5. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
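
A conventional risk-matrix scoring step of the kind such tools standardize can be sketched as follows; the scales and risk names are hypothetical and not taken from ePORT.

```python
# Sketch of a standard 5x5 risk-matrix scoring step: each risk carries a
# likelihood and a consequence on a 1-5 scale, and the product ranks it.
def risk_score(likelihood, consequence):
    """Higher product = higher-priority risk (simple standardized criterion)."""
    return likelihood * consequence

# Hypothetical risk register: name -> (likelihood, consequence).
risks = {"late_delivery": (4, 3), "test_failure": (2, 5)}

ranked = sorted(risks, key=lambda r: risk_score(*risks[r]), reverse=True)
print(ranked)  # ['late_delivery', 'test_failure']
```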

  6. A Toolkit for Active Object-Oriented Databases with Application to Interoperability

    NASA Technical Reports Server (NTRS)

    King, Roger

    1996-01-01

    In our original proposal we stated that our research would 'develop a novel technology that provides a foundation for collaborative information processing.' The essential ingredient of this technology is the notion of 'deltas,' which are first-class values representing collections of proposed updates to a database. The Heraclitus framework provides a variety of algebraic operators for building up, combining, inspecting, and comparing deltas. Deltas can be directly applied to the database to yield a new state, or used 'hypothetically' in queries against the state that would arise if the delta were applied. The central point here is that the step of elevating deltas to 'first-class' citizens in database programming languages will yield tremendous leverage on the problem of supporting updates in collaborative information processing. In short, our original intention was to develop the theoretical and practical foundation for a technology based on deltas in an object-oriented database context, develop a toolkit for active object-oriented databases, and apply this toward collaborative information processing.

  8. Integrated Database Construction for Efficient Support of Yeongsan River Estuary Management System

    NASA Astrophysics Data System (ADS)

    Lee, G. H.; Kim, K. H.; Lee, S. J.

    2014-02-01

    Yeongsan River is one of the four major rivers in South Korea; it flows toward the Yellow Sea, passing through Damyang and Gwangju. In particular, the skewness of the main stream of Yeongsan River is relatively high compared to other rivers. Accordingly, flood damage occurred frequently due to the inflow of sea water during tidal periods. Additionally, the environment of the estuary of Yeongsan River has been severely damaged by indiscreet development and the inflow of various waste waters, so water quality improvement and management are crucial. For better water quality management, the government ministry is collecting various data from different fields to identify water quality conditions, and these collected data are increasingly needed for application in the estuary management system. However, the observed fields and items are frequently modified according to social interests, and an index is needed to search the massive amount of observation data, which makes constructing the database relatively difficult. Therefore, in this study, these characteristics were considered in constructing the integrated DB.

  9. Developing a database management system to support birth defects surveillance in Florida.

    PubMed

    Salemi, Jason L; Hauser, Kimberlea W; Tanner, Jean Paul; Sampat, Diana; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2010-01-01

    The value of any public health surveillance program is derived from the ways in which data are managed and used to improve the public's health. Although birth defects surveillance programs vary in their case volume, budgets, staff, and objectives, the capacity to operate efficiently and maximize resources remains critical to long-term survival. The development of a fully-integrated relational database management system (DBMS) can enrich a surveillance program's data and improve efficiency. To build upon the Florida Birth Defects Registry--a statewide registry relying solely on linkage of administrative datasets and unconfirmed diagnosis codes--the Florida Department of Health provided funding to the University of South Florida to develop and pilot an enhanced surveillance system in targeted areas with a more comprehensive approach to case identification and diagnosis confirmation. To manage operational and administrative complexities, a DBMS was developed, capable of managing transmission of project data from multiple sources, tracking abstractor time during record reviews, offering tools for defect coding and case classification, and providing reports to DBMS users. Since its inception, the DBMS has been used as part of our surveillance projects to guide the receipt of over 200 case lists and review of 12,924 fetuses and infants (with associated maternal records) suspected of having selected birth defects in over 90 birthing and transfer facilities in Florida. The DBMS has provided both anticipated and unexpected benefits. Automation of the processes for managing incoming case lists has reduced clerical workload considerably, while improving accuracy of working lists for field abstraction. Data quality has improved through more effective use of internal edits and comparisons with values for other data elements, while simultaneously increasing abstractor efficiency in completion of case abstraction. We anticipate continual enhancement to the DBMS in the future.

  10. Activity Management System user reference manual. Revision 1

    SciTech Connect

    Gates, T.A.; Burdick, M.B.

    1994-09-22

    The Activity Management System (AMS) was developed in response to the need for a simple-to-use, low-cost, user interface system for collecting and logging Hanford Waste Vitrification Plant Project (HWVP) activities. This system needed to run on user workstations and provide common user access to a database stored on a local network file server. Most important, users wanted a system that provided a management tool supporting their individual process for completing activities; existing systems treated the performer as a tool of the system. All AMS data is maintained in encrypted format. Users can feel confident that any activities they have entered into the database are private and that, as the originator, they retain sole control over who can see them. Once entered into the AMS database, the activities cannot be accessed by anyone other than the originator, the designated agent, or authorized viewers who have been explicitly granted the right to look at specific activities by the originator. This user guide is intended to assist new AMS users in learning how to use the application and, after the initial learning process, will serve as an ongoing reference for experienced users in performing infrequently used functions. Online help screens provide reference to some of the key information in this manual. Additional help screens, encompassing all the applicable material in this manual, will be incorporated into future AMS revisions. A third, and most important, source of help is the AMS administrator(s). This guide describes the initial production version of AMS, which has been designated Revision 1.0.
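
The originator-controlled visibility rule described above can be sketched as a simple predicate. The field names are hypothetical, and the real AMS additionally stores the records encrypted.

```python
# Sketch: an activity is visible only to its originator, its designated
# agent, or viewers the originator has explicitly authorized.
def can_view(activity, user):
    return (user == activity["originator"]
            or user == activity.get("agent")
            or user in activity.get("viewers", set()))

# Hypothetical activity record.
act = {"originator": "alice", "agent": "bob", "viewers": {"carol"}}
print([can_view(act, u) for u in ("alice", "bob", "carol", "dave")])
# [True, True, True, False]
```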

  11. Outcome Management in Cardiac Surgery Using the Society of Thoracic Surgeons National Database.

    PubMed

    Halpin, Linda S; Gallardo, Bret E; Speir, Alan M; Ad, Niv

    2016-09-01

    Health care reform has helped streamline patient care and reimbursement by encouraging providers to provide the best outcome for the best value. Institutions with cardiac surgery programs need a methodology to monitor and improve outcomes linked to reimbursement. The Society of Thoracic Surgeons National Database (STSND) is a tool for monitoring outcomes and improving care. This article identifies the purpose, goals, and reporting system of the STSND and ways these data can be used for benchmarking, linking outcomes to the effectiveness of treatment, and identifying factors associated with mortality and complications. We explain the methodology used at Inova Heart and Vascular Institute, Falls Church, Virginia, to perform outcome management by using the STSND and address our performance-improvement cycle through discussion of data collection, analysis, and outcome reporting. We focus on the revision of clinical practice and offer examples of how patient outcomes have been improved using this methodology. PMID:27568532

  12. On the evaluation of fuzzy quantified queries in a database management system

    NASA Technical Reports Server (NTRS)

    Bosc, Patrick; Pivert, Olivier

    1992-01-01

    Many propositions to extend database management systems have been made in the last decade. Some of them aim at the support of a wider range of queries involving fuzzy predicates. Unfortunately, these queries are somewhat complex and the question of their efficiency is a subject under discussion. In this paper, we focus on a particular subset of queries, namely those using fuzzy quantified predicates. More precisely, we will consider the case where such predicates apply to individual elements as well as to sets of elements. Thanks to some interesting properties of alpha-cuts of fuzzy sets, we are able to show that the evaluation of these queries can be significantly improved with respect to a naive strategy based on exhaustive scans of sets or files.
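
The role of an alpha-cut can be sketched on a single fuzzy predicate. The membership function and thresholds below are hypothetical; the point is that the alpha-cut turns a fuzzy condition into a crisp one that an ordinary scan or index can evaluate, avoiding membership-grade computation over the whole set.

```python
# Hypothetical fuzzy predicate "well-paid": grade 0 below `lo`, 1 above
# `hi`, rising linearly in between.
def mu_well_paid(salary, lo=30000, hi=60000):
    if salary <= lo:
        return 0.0
    if salary >= hi:
        return 1.0
    return (salary - lo) / (hi - lo)

# The alpha-cut at level 0.5 is a crisp condition:
#   mu(s) >= alpha  <=>  s >= lo + alpha * (hi - lo)
alpha = 0.5
threshold = 30000 + alpha * (60000 - 30000)   # 45000

salaries = [25000, 40000, 55000, 70000]
cut = [s for s in salaries if s >= threshold]  # no fuzzy evaluation needed
print(cut)  # [55000, 70000]
```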

  13. Demonstration of SLUMIS: a clinical database and management information system for a multi organ transplant program.

    PubMed Central

    Kurtz, M.; Bennett, T.; Garvin, P.; Manuel, F.; Williams, M.; Langreder, S.

    1991-01-01

    Because of the rapid evolution of the heart, heart/lung, liver, kidney and kidney/pancreas transplant programs at our institution, and because of a lack of an existing comprehensive database, we were required to develop a computerized management information system capable of supporting both clinical and research requirements of a multifaceted transplant program. SLUMIS (ST. LOUIS UNIVERSITY MULTI-ORGAN INFORMATION SYSTEM) was developed for the following reasons: 1) to comply with the reporting requirements of various transplant registries, 2) for reporting to an increasing number of government agencies and insurance carriers, 3) to obtain updates of our operative experience at regular intervals, 4) to integrate the Histocompatibility and Immunogenetics Laboratory (HLA) for online test result reporting, and 5) to facilitate clinical investigation. PMID:1807741

  14. Data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Data Management System-1100 is designed to operate in conjunction with the UNIVAC 1100 Series Operating System on any 1100 Series computer. DMS-1100 is divided into the following four major software components: (1) Data Definition Languages (DDL); (2) Data Management Routine (DMR); (3) Data Manipulation Languages (DML); and (4) Data Base Utilities (DBU). These software components are described in detail.

  15. An intelligent interactive visual database management system for Space Shuttle closeout image management

    NASA Technical Reports Server (NTRS)

    Ragusa, James M.; Orwig, Gary; Gilliam, Michael; Blacklock, David; Shaykhian, Ali

    1994-01-01

    Status is given of an applications investigation on the potential for using an expert system shell for classification and retrieval of high resolution, digital, color space shuttle closeout photography. This NASA funded activity has focused on the use of integrated information technologies to intelligently classify and retrieve still imagery from a large, electronically stored collection. A space shuttle processing problem is identified, a working prototype system is described, and commercial applications are identified. A conclusion reached is that the developed system has distinct advantages over the present manual system and cost efficiencies will result as the system is implemented. Further, commercial potential exists for this integrated technology.

  16. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-03-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complex-wide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and the level and type of current on-site treatment at Department of Energy installations.

  17. [Cystic Fibrosis Cloud database: An information system for storage and management of clinical and microbiological data of cystic fibrosis patients].

    PubMed

    Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra

    2016-01-01

    The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and the microorganisms isolated from them. In this work we have designed and developed an on-line database based on an information system that allows users to store, manage and visualize data from clinical studies and microbiological analysis of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named Cystic Fibrosis Cloud database, is available at http://servoy.infocomsa.com/cfc_database and is composed of a main database and a web-based interface, which uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients. PMID:26895996

  18. An Integrated Photogrammetric and Spatial Database Management System for Producing Fully Structured Data Using Aerial and Remote Sensing Images

    PubMed Central

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economic data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save time and cost in producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs for using map products of photogrammetric workstations. Also, by means of these integrated systems, providing structured spatial data based on OGC (Open GIS Consortium) standards and topological relations between different feature classes is possible at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented. PMID:22574014

  20. Rainforests: Conservation and resource management. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1994-12-01

    The bibliography contains citations concerning conservation of rainforest ecology and management of natural resources. Topics include plant community structure and development, nutrient dynamics, rainfall characteristics and water budgets, and forest dynamics. Studies performed in specific forest areas are included. Effects of human activities are also considered. (Contains a minimum of 154 citations and includes a subject term index and title list.)

  1. Karlsruhe Database for Radioactive Wastes (KADABRA) - Accounting and Management System for Radioactive Waste Treatment - 12275

    SciTech Connect

    Himmerkus, Felix; Rittmeyer, Cornelia

    2012-07-01

    The data management system KADABRA was designed according to the purposes of the Central Decontamination Department (HDB) of the Wiederaufarbeitungsanlage Karlsruhe Rueckbau- und Entsorgungs-GmbH (WAK GmbH), which is specialized in the treatment and conditioning of radioactive waste. The layout considers the major treatment processes of the HDB as well as regulatory and legal requirements. KADABRA is designed as an SAG ADABAS application on an IBM System Z mainframe. The main function of the system is the data management of all processes related to treatment, transfer and storage of radioactive material within HDB. KADABRA records the relevant data concerning radioactive residues, interim products and waste products as well as the production parameters relevant for final disposal. Analytical data from the laboratory and non-destructive assay systems, which describe the chemical and radiological properties of residues, production batches, interim products and final waste products, can be linked to the respective dataset for documentation and declaration. The system enables the operator to trace the radioactive material through processing and storage. Information on the actual status of the material as well as radiological data and storage position can be obtained immediately on request. A variety of programs accessing the database allow the generation of individual reports on periodic or special request. KADABRA offers a high security standard and is constantly adapted to the recent requirements of the organization. (authors)

  2. Overlay metrology productivity and stability enhancements using an offline recipe database manager (RDM)

    NASA Astrophysics Data System (ADS)

    DeMoor, Stephen J.; Peters, Robert M.; Calvert, Todd E.; Hilbun, Stephanie L.; Beck, George P., III; Bushman, Kristi L.; Fields, Russell D.

    2000-06-01

    Tool cost of ownership and manufacturing productivity continue to be key factors in equipment selection discussions. Products that differentiate themselves by maximizing tool utilization and minimizing engineering resources make the best economic impact in a time of increasing fab capital costs. This paper will demonstrate the use of a single off-line recipe database manager (RDM) in conjunction with multiple optical misregistration measurement tools for the purpose of misregistration recipe creation and management in a high-volume ASIC manufacturing line. A strategy for minimizing the number of recipe elements and the amount of time required to create and maintain all recipes will be discussed. Data will be presented which demonstrates a significant reduction in the tool time required for recipe setup, leading directly to increased tool availability for production use. In addition, the RDM allows for standardization of misregistration measurement setup for similar process levels across multiple product devices within a single product family. Data will be shown demonstrating TIS stability and consistency as a result of the standardized setup. Future work, including fully automated recipe creation via CAD output data, will also be discussed.

  3. Using non-local databases for the environmental assessment of industrial activities: The case of Latin America

    SciTech Connect

    Osses de Eicker, Margarita; Hischier, Roland; Hurni, Hans; Zah, Rainer

    2010-04-15

    Nine non-local databases were evaluated with respect to their suitability for the environmental assessment of industrial activities in Latin America. Three assessment methods were considered, namely Life Cycle Assessment (LCA), Environmental Impact Assessment (EIA) and air emission inventories. The analysis focused on data availability in the databases and the applicability of their international data to Latin American industry. The study showed that the European EMEP/EEA Guidebook and the U.S. EPA AP-42 database are the most suitable ones for air emission inventories, whereas the LCI database Ecoinvent is the most suitable one for LCA and EIA. Due to the data coverage in the databases, air emission inventories are easier to develop than LCA or EIA, which require more comprehensive information. One strategy to overcome the limitations of non-local databases for Latin American industry is the combination of validated data from international databases with newly developed local datasets.

  4. Activity, assay and target data curation and quality in the ChEMBL database.

    PubMed

    Papadatos, George; Gaulton, Anna; Hersey, Anne; Overington, John P

    2015-09-01

    The emergence of a number of publicly available bioactivity databases, such as ChEMBL, PubChem BioAssay and BindingDB, has raised awareness about the topics of data curation, quality and integrity. Here we provide an overview and discussion of the current and future approaches to activity, assay and target data curation of the ChEMBL database. This curation process involves several manual and automated steps and aims to: (1) maximise data accessibility and comparability; (2) improve data integrity and flag outliers, ambiguities and potential errors; and (3) add further curated annotations and mappings, thus increasing the usefulness and accuracy of the ChEMBL data for all users, and modellers in particular. Issues related to activity, assay and target data curation and integrity, along with their potential impact on users of the data, are discussed, alongside robust selection and filter strategies to avoid or minimise their impact, depending on the desired application. PMID:26201396
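    The selection-and-filter strategy the authors describe can be illustrated with a minimal sketch. The field names mirror ChEMBL conventions (standard_value, data_validity_comment, potential_duplicate, confidence_score), but the records and the threshold are invented for illustration:

    ```python
    # Toy activity records; in ChEMBL these would come from the activities and
    # assays tables, here they are hand-written stand-ins.
    records = [
        {"standard_type": "IC50", "standard_value": 12.0, "standard_units": "nM",
         "data_validity_comment": None, "potential_duplicate": 0, "confidence_score": 9},
        {"standard_type": "IC50", "standard_value": 5000.0, "standard_units": "nM",
         "data_validity_comment": "Outside typical range", "potential_duplicate": 0,
         "confidence_score": 9},
        {"standard_type": "IC50", "standard_value": 8.5, "standard_units": "nM",
         "data_validity_comment": None, "potential_duplicate": 1, "confidence_score": 4},
    ]

    def clean(rows, min_confidence=8):
        """Keep only unflagged, non-duplicate activities against confidently
        assigned targets -- one plausible filter for modelling use."""
        return [r for r in rows
                if r["data_validity_comment"] is None
                and not r["potential_duplicate"]
                and r["confidence_score"] >= min_confidence]

    print(len(clean(records)))  # 1
    ```

    Two of the three records are dropped: one carries a validity flag, the other is a potential duplicate with a low target-assignment confidence.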

  5. Second Research Coordination Meeting on Reference Database for Neutron Activation Analysis -- Summary Report

    SciTech Connect

    Firestone, Richard B.; Kellett, Mark A.

    2008-03-19

    The second meeting of the Co-ordinated Research Project on "Reference Database for Neutron Activation Analysis" was held at the IAEA, Vienna, from 7-9 May 2007. A summary of the presentations made by participants is given, along with reports on specifically assigned tasks and subsequent discussions. In order to meet the overall objectives of this CRP, the outputs have been reiterated and new task assignments made.

  6. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces had been developed specifically to support operations rather than the kind of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC database - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (usable with any relational NOMAD database); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (a help facility which informs users about the database structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, the personnel register, and plan/actual cost.

  7. PARPs database: A LIMS system for protein-protein interaction data mining or laboratory information management system

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  8. The Use of INFO, a Database Management System, in Teaching Library and Information Studies at Manchester Polytechnic.

    ERIC Educational Resources Information Center

    Rowley, J. E.; And Others

    1988-01-01

    Outlines the courses offered by the Department of Library and Information Studies at Manchester Polytechnic, and describes the use of a database management system to teach aspects of information science. Details of a number of specific applications are given and future developments are discussed. (CLB)

  9. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    EPA Science Inventory

    Managing the world’s largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a hierarchi...

  10. A Web-Based Multi-Database System Supporting Distributed Collaborative Management and Sharing of Microarray Experiment Information

    PubMed Central

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database Web-based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all the multidisciplinary actors involved in spotted microarray experiments. PMID:17238488

  11. Enhancing the Management of a High School's Non-Print Media Collection through a Computer Databased Bibliographic Cataloging System.

    ERIC Educational Resources Information Center

    Fischer, David L.

    Through the development and use of a computer databased bibliographic cataloging system, this practicum aimed to improve the organization and management of the non-print media collections housed in a Native American Indian reservation's high school language arts department and library in order for the teaching staff to gain better access to the…

  12. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a Mapserver interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.

  13. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    PubMed

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large amount of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions, using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base-pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base-stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534
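    The covariation signal underlying this kind of base-pair prediction can be sketched with the classic mutual-information score between two alignment columns. The tiny alignment below is invented; the paper computes related coevolution scores inside a relational database rather than in application code:

    ```python
    from collections import Counter
    from math import log2

    # Six aligned sequences, two columns each; the columns covary perfectly
    # (G<->C, A<->U, C<->G), as a Watson-Crick pair would.
    alignment = ["GC", "GC", "AU", "AU", "CG", "CG"]

    def mutual_information(seqs, i, j):
        """Mutual information (bits) between alignment columns i and j."""
        n = len(seqs)
        fi = Counter(s[i] for s in seqs)            # marginal counts, column i
        fj = Counter(s[j] for s in seqs)            # marginal counts, column j
        fij = Counter((s[i], s[j]) for s in seqs)   # joint counts
        return sum((c / n) * log2((c / n) / ((fi[a] / n) * (fj[b] / n)))
                   for (a, b), c in fij.items())

    print(round(mutual_information(alignment, 0, 1), 3))  # 1.585
    ```

    A perfectly covarying pair over three base combinations scores log2(3) ≈ 1.585 bits; unrelated columns score near zero.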

  14. COMPARISON OF EXERCISE PARTICIPATION RATES FOR CHILDREN IN THE LITERATURE WITH THOSE IN EPA'S CONSOLIDATED HUMAN ACTIVITY DATABASE (CHAD)

    EPA Science Inventory

    CHAD contains over 22,000 person-days of human activity pattern survey data. Part of the database includes exercise participation rates for children 0-17 years old, as well as for adults. Analyses of this database indicate that approximately 34% of the 0-17 age group (herea...

  15. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  16. Southern African Treatment Resistance Network (SATuRN) RegaDB HIV drug resistance and clinical management database: supporting patient management, surveillance and research in southern Africa.

    PubMed

    Manasa, Justen; Lessells, Richard; Rossouw, Theresa; Naidu, Kevindra; Van Vuuren, Cloete; Goedhals, Dominique; van Zyl, Gert; Bester, Armand; Skingsley, Andrew; Stott, Katharine; Danaviah, Siva; Chetty, Terusha; Singh, Lavanya; Moodley, Pravi; Iwuji, Collins; McGrath, Nuala; Seebregts, Christopher J; de Oliveira, Tulio

    2014-01-01

    Substantial amounts of data have been generated from patient management and academic exercises designed to better understand the human immunodeficiency virus (HIV) epidemic and design interventions to control it. A number of specialized databases have been designed to manage huge data sets from HIV cohort, vaccine, host genomic and drug resistance studies. Besides databases from cohort studies, most of the online databases contain limited curated data and are thus sequence repositories. HIV drug resistance has been shown to have a great potential to derail the progress made thus far through antiretroviral therapy. Thus, a lot of resources have been invested in generating drug resistance data for patient management and surveillance purposes. Unfortunately, most of the data currently available relate to subtype B even though >60% of the epidemic is caused by HIV-1 subtype C. A consortium of clinicians, scientists, public health experts and policy makers working in southern Africa came together and formed a network, the Southern African Treatment and Resistance Network (SATuRN), with the aim of increasing curated HIV-1 subtype C and tuberculosis drug resistance data. This article describes the HIV-1 data curation process using the SATuRN Rega database. The data curation is a manual and time-consuming process done by clinical, laboratory and data curation specialists. Access to the highly curated data sets is through applications that are reviewed by the SATuRN executive committee. Examples of research outputs from the analysis of the curated data include trends in the level of transmitted drug resistance in South Africa, analysis of the levels of acquired resistance among patients failing therapy and factors associated with the absence of genotypic evidence of drug resistance among patients failing therapy. All these studies have been important for informing first- and second-line therapy. This database is a free password-protected open source database available on

  17. 15 years of zooming in and zooming out: Developing a new single scale national active fault database of New Zealand

    NASA Astrophysics Data System (ADS)

    Ries, William; Langridge, Robert; Villamor, Pilar; Litchfield, Nicola; Van Dissen, Russ; Townsend, Dougal; Lee, Julie; Heron, David; Lukovic, Biljana

    2014-05-01

    In New Zealand, we are currently reconciling multiple digital coverages of mapped active faults into a national coverage at a single scale (1:250,000). This seems at first glance to be a relatively simple task. However, the methods used to capture data, the scale of capture, and the initial purpose of the fault mapping have produced datasets with very different characteristics. The New Zealand digital active fault database (AFDB) was initially developed as a way of managing active fault locations and fault-related features within a computer-based spatial framework. The data contained within the AFDB come from a wide range of studies, from plate tectonic (1:500,000) to cadastral (1:2,000) scale. The database was designed to allow capture of field observations and remotely sourced data without a loss in data resolution. This approach has worked well as a method for compiling a centralised database for fault information, but not for providing a complete national coverage at a single scale. During the last 15 years other complementary projects have used and also contributed data to the AFDB, most notably the QMAP project (a national series of geological maps completed over 19 years that includes coverage of active and inactive faults at 1:250,000). AFDB linework and attributes were incorporated into this series, but linework and attributes were simplified to maintain map clarity at 1:250,000 scale. Also, during this period ongoing mapping of active faults has improved upon these data. Other projects of note that have used data from the AFDB include the National Seismic Hazard Model of New Zealand and the Global Earthquake Model (GEM). The main goal of the current project has been to provide the best digital spatial representation of a fault trace at 1:250,000 scale and combine this with the most up-to-date attributes. In some areas this has required a simplification of very finely detailed data, and in some cases new mapping, to provide a complete coverage.

  18. The Carbohydrate-Active EnZymes database (CAZy): an expert resource for Glycogenomics.

    PubMed

    Cantarel, Brandi L; Coutinho, Pedro M; Rancurel, Corinne; Bernard, Thomas; Lombard, Vincent; Henrissat, Bernard

    2009-01-01

    The Carbohydrate-Active Enzyme (CAZy) database is a knowledge-based resource specialized in the enzymes that build and break down complex carbohydrates and glycoconjugates. As of September 2008, the database describes the present knowledge on 113 glycoside hydrolase, 91 glycosyltransferase, 19 polysaccharide lyase, 15 carbohydrate esterase and 52 carbohydrate-binding module families. These families are created based on experimentally characterized proteins and are populated by sequences from public databases with significant similarity. Protein biochemical information is continuously curated based on the available literature and structural information. Over 6400 proteins have assigned EC numbers and 700 proteins have a PDB structure. The classification (i) reflects the structural features of these enzymes better than their sole substrate specificity, (ii) helps to reveal the evolutionary relationships between these enzymes and (iii) provides a convenient framework to understand mechanistic properties. This resource has been available for over 10 years to the scientific community, contributing to information dissemination and providing a transversal nomenclature to glycobiologists. More recently, this resource has been used to improve the quality of functional predictions of a number of genome projects by providing expert annotation. The CAZy resource resides at URL: http://www.cazy.org/. PMID:18838391

  19. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    PubMed

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered the basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; the physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a Client/Server model, was used to implement medical case and biospecimen management. This system can perform input, browsing, querying and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. The system supports management not only of long-term follow-up on individuals, but also of grouped cases organized according to the aim of the research. This system can improve the efficiency and quality of clinical research when biospecimens are used in coordination. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform. PMID:20481307

  20. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  1. A Parallel Relational Database Management System Approach to Relevance Feedback in Information Retrieval.

    ERIC Educational Resources Information Center

    Lundquist, Carol; Frieder, Ophir; Holmes, David O.; Grossman, David

    1999-01-01

    Describes a scalable, parallel, relational database-driven information retrieval engine. To support portability across a wide range of execution environments, all algorithms adhere to the SQL-92 standard. By incorporating relevance feedback algorithms, accuracy is enhanced over prior database-driven information retrieval efforts. Presents…
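    The core idea of relational database-driven retrieval can be sketched in portable SQL: documents and queries are both term-weight relations, and ranking is a join-and-sum. The table and column names below are invented, and the paper's engine additionally layers relevance feedback on top of this:

    ```python
    import sqlite3

    # Toy inverted index as relations: one row per (document, term, weight)
    # and one row per (query term, weight).
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE doc_term (doc_id INTEGER, term TEXT, weight REAL);
    CREATE TABLE query_term (term TEXT, weight REAL);
    """)
    con.executemany("INSERT INTO doc_term VALUES (?, ?, ?)", [
        (1, "database", 0.8), (1, "parallel", 0.5),
        (2, "database", 0.3), (2, "retrieval", 0.9),
    ])
    con.executemany("INSERT INTO query_term VALUES (?, ?)",
                    [("database", 1.0), ("retrieval", 1.0)])

    # Dot-product ranking expressed entirely in standard SQL.
    ranked = con.execute("""
        SELECT d.doc_id, ROUND(SUM(d.weight * q.weight), 2) AS score
        FROM doc_term d JOIN query_term q ON d.term = q.term
        GROUP BY d.doc_id ORDER BY score DESC
    """).fetchall()
    print(ranked)  # [(2, 1.2), (1, 0.8)]
    ```

    Relevance feedback fits the same mold: judged-relevant documents' term rows are folded back into query_term before re-running the ranking query.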

  2. Deadlock detection and resolution in data-base management systems: A comprehensive approach

    SciTech Connect

    Park, Y.C.

    1989-01-01

    In this dissertation, several algorithms for deadlock detection and resolution in database management systems are presented, where two-phase locking is assumed for ensuring serializability, lock requests obey the granularity locking protocol, and each granule may be locked in one of the following lock modes: IS, IX, S, SIX and X. For each object, lock requests are honored on a first-come-first-served basis except for lock upgrades. The author presents algorithms for deadlock detection and resolution in sequential transaction processing that achieve the goal of early deadlock detection with appropriate victim selection. He also presents a deadlock detection and resolution algorithm for parallel transaction processing which achieves the same objectives, and an algorithm for distributed transaction processing which minimizes the amount of inter-site message communication. He proposes a new efficient algorithm for deadlock detection in sequential transaction processing, whose basic idea is the construction of a directed graph called a Holder/Waiter-Transaction Waited-By Graph. He establishes guidelines for the identification of a victim in a deadlock cycle and shows how deadlocks can be resolved with minimal victim cost. In addition, his algorithm allows some deadlocks to be resolved without aborting any transaction. In the case of parallel transaction processing, a transaction can have multiple outstanding lock requests. He introduces two types of deadlocks: explicit deadlocks and implicit deadlocks. To detect deadlocks in this environment, he introduces a new type of directed graph called a transaction waited-by graph. He presents deadlock detection mechanisms, identifies deadlock detection times, and shows how victims can be selected with minimal cost.
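    The core of any waits-for-graph deadlock detector is cycle detection; the dissertation's holder/waiter graphs add lock modes and upgrade handling on top of this idea. A minimal sketch, with invented transaction names, where an edge T1 -> T2 means "T1 waits for a lock held by T2":

    ```python
    def find_cycle(waits_for):
        """Return one deadlock cycle as a list of transactions, or None.
        Standard three-color depth-first search over the waits-for graph."""
        WHITE, GREY, BLACK = 0, 1, 2
        color = {t: WHITE for t in waits_for}

        def dfs(t, path):
            color[t] = GREY
            path.append(t)
            for u in waits_for.get(t, ()):
                if color.get(u, WHITE) == GREY:      # back edge: cycle found
                    return path[path.index(u):]
                if color.get(u, WHITE) == WHITE:
                    cycle = dfs(u, path)
                    if cycle:
                        return cycle
            color[t] = BLACK
            path.pop()
            return None

        for t in list(waits_for):
            if color[t] == WHITE:
                cycle = dfs(t, [])
                if cycle:
                    return cycle
        return None

    graph = {"T1": ["T2"], "T2": ["T3"], "T3": ["T1"], "T4": ["T1"]}
    print(find_cycle(graph))  # ['T1', 'T2', 'T3']
    ```

    Once the cycle is known, victim selection (choosing which transaction in the cycle to abort at minimal cost) is a separate policy decision, as the dissertation emphasizes.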

  3. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced database management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL), which is capable of detecting and acting upon complex patterns of events that are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
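    The overheating example above amounts to one of the temporal patterns EPL expresses: N repetitions of an event inside a time window. A hedged sketch of that idea (this is not EPL's actual CLIPS-based syntax; names and values are invented):

    ```python
    from collections import deque

    def make_repeat_detector(event_name, count, window_seconds):
        """Build a detector that fires when `count` occurrences of
        `event_name` fall within a sliding window of `window_seconds`."""
        times = deque()

        def observe(name, t):
            """Feed one (event, timestamp) pair; return True when the pattern fires."""
            if name != event_name:
                return False
            times.append(t)
            # Drop occurrences that have aged out of the window.
            while times and t - times[0] > window_seconds:
                times.popleft()
            return len(times) >= count

        return observe

    detect = make_repeat_detector("overheat", 3, 60)
    stream = [("overheat", 0), ("pressure_ok", 10), ("overheat", 20), ("overheat", 50)]
    print([detect(name, t) for name, t in stream])  # [False, False, False, True]
    ```

    In the actual system, the firing of such a pattern would trigger a database action (e.g. notifying the plant manager) via the Sybase trigger interface.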

  4. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    PubMed Central

    Köhl, Karin I; Basler, Georg; Lüdemann, Alexander; Selbig, Joachim; Walther, Dirk

    2008-01-01

    Background For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, meta-analysis of several experiments in systems-biology-based approaches makes it necessary to store this information in a standardised manner, preferentially in relational databases. In the Golm Plant Database System, we devised a data management system based on a classical Laboratory Information Management System combined with web-based user interfaces for data entry and retrieval to collect this information in an academic environment. Results The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows such as genetic modification (transformation) and vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures, and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner-based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique names generated by the system
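    The pedigree idea described above (workflows create new lines that point back to a parent line) reduces to a walk up parent links. The miniature below is invented for illustration and is not the Golm schema:

    ```python
    # Each line records the workflow that created it and its parent line;
    # the imported founder has no parent.
    lines = {
        "L1": {"workflow": "import", "parent": None},
        "L2": {"workflow": "transformation", "parent": "L1"},
        "L3": {"workflow": "sexual_reproduction", "parent": "L2"},
    }

    def pedigree(line_id, lines):
        """Return the chain of ancestors from this line back to the founder."""
        chain = []
        while line_id is not None:
            chain.append(line_id)
            line_id = lines[line_id]["parent"]
        return chain

    print(pedigree("L3", lines))  # ['L3', 'L2', 'L1']
    ```

    In a relational database the same walk is a self-join (or a recursive query) on a lines table with a parent-id foreign key.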

  5. Status Report for Remediation Decision Support Project, Task 1, Activity 1.B – Physical and Hydraulic Properties Database and Interpretation

    SciTech Connect

    Rockhold, Mark L.

    2008-09-26

    The objective of Activity 1.B of the Remediation Decision Support (RDS) Project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the objectives of Activity 1.B of the RDS Project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database maintained by PNNL, (2) transfer the physical and hydraulic property data from the Microsoft Access database files used by SoilVision® into HEIS, which has most recently been maintained by Fluor-Hanford, Inc., (3) develop a Virtual Library module for accessing these data from HEIS, and (4) write a User's Manual for the Virtual Library module. The development of the Virtual Library module was to be performed by a third party under subcontract to Fluor. The intent of these activities is to make the available physical and hydraulic property data more readily accessible and useable by technical staff and operable unit managers involved in waste site assessments and

  6. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    NASA Technical Reports Server (NTRS)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.
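    The version-control idea described for SuMS (many stored versions of each named surface dataset, with retrieval of the latest or a specific one) can be sketched minimally in Python. This is an illustrative toy, not the real SuMS schema or API; the class, method names, and metadata fields are assumptions.

```python
class SurfaceStore:
    """Toy version-controlled store for named surface datasets.

    Hypothetical sketch of the SuMS versioning idea: each dataset name
    maps to an ordered list of (version, metadata, payload) tuples.
    """

    def __init__(self):
        self._versions = {}  # dataset name -> list of (version, metadata, payload)

    def add(self, name, payload, **metadata):
        # Each add appends a new immutable version rather than overwriting.
        versions = self._versions.setdefault(name, [])
        versions.append((len(versions) + 1, metadata, payload))
        return versions[-1][0]

    def get(self, name, version=None):
        # Default to the most recent version; older ones stay retrievable.
        versions = self._versions[name]
        return versions[-1] if version is None else versions[version - 1]

store = SurfaceStore()
store.add("case01.flat", b"v1 bytes", config="flat map")
store.add("case01.flat", b"v2 bytes", config="flat map", note="smoothing fixed")
print(store.get("case01.flat")[0])  # 2
```

    A real system would add the classification framework, access levels, and persistent storage on top of this retrieval core.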

  7. Question Database Management and Program for Generation of Examinations in National Board of Medical Examiners Format

    PubMed Central

    Hall, James R.; Weitz, Fredric I.

    1983-01-01

    A microcomputer-based program has been developed to maintain a database of exam questions from which examinations can be administered in a standard format — based upon the question types of the National Board of Medical Examiners. This program maintains databases on each exam given and also on the use of questions from the master question files. Selection of microcomputers and peripherals and of operating systems and languages was done to provide economy and enhance portability.

  8. Design and utilization of a Flight Test Engineering Database Management System at the NASA Dryden Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Knighton, Donna L.

    1992-01-01

    A Flight Test Engineering Database Management System (FTE DBMS) was designed and implemented at the NASA Dryden Flight Research Facility. The X-29 Forward Swept Wing Advanced Technology Demonstrator flight research program was chosen for the initial system development and implementation. The FTE DBMS greatly assisted in planning and 'mass production' card preparation for an accelerated X-29 research program. The system demonstrated improved Test Plan tracking and maneuver management for a high flight-rate program, and flight rates of up to three flights per day, twice per week, were maintained.

  9. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single online database system for managing reservations and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as the database and PHP/HTML as the scripting language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server and enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. Access can be secured using general password protection (e.g. htaccess). The web interface consists of six main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed ...
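    The kind of relational layout the abstract describes (tables for standards, X-ray lines, and analysis setups) might be sketched as below. This uses Python's stdlib sqlite3 as a stand-in for MySQL; all table and column names are invented for illustration and are not taken from De-MA.

```python
import sqlite3

# Hypothetical schema sketch; De-MA's actual tables and columns may differ.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE standards (
    id        INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    material  TEXT,          -- e.g. mineral, glass, pure metal
    reference TEXT,
    comments  TEXT
);
CREATE TABLE xray_lines (
    element  TEXT NOT NULL,  -- chemical symbol
    line     TEXT NOT NULL,  -- e.g. Ka, La
    crystal  TEXT,           -- diffraction crystal used
    PRIMARY KEY (element, line)
);
CREATE TABLE setups (
    id          INTEGER PRIMARY KEY,
    label       TEXT,        -- human-readable setup name
    standard_id INTEGER REFERENCES standards(id)
);
""")
conn.execute("INSERT INTO standards (name, material) VALUES (?, ?)",
             ("Diopside", "mineral"))
row = conn.execute("SELECT name FROM standards").fetchone()
print(row[0])  # Diopside
```

    The point of such a schema is that setups reference standards by key, so a change to a standard's documentation is immediately visible to every setup that uses it.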

  10. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…

  11. A database about the tornadic activity in Catalonia (NE Spain) since 1994

    NASA Astrophysics Data System (ADS)

    Morales, M. E.; Arús, J.; Llasat, M. C.; Castán, S.

    2009-09-01

    Although tornadic activity is not the most important hazard in Spain, the damage that tornadoes and downbursts generate in urban areas is considerable and has on occasion caused casualties. In Spain, the oldest systematic work collecting data on tornadoes refers to the Balearic Islands, although some series on tornadoes across Spain have also been collected and analysed (Gayà, 2005). These series show a positive trend that is probably related more to a change in the perception level of the population than to climate change. On some occasions it is difficult to separate the damage produced by the tornado itself from that produced by other associated hazards such as heavy rain, hail, or wind storms; this was the case in the September 2006 event, in which both flash floods and tornadoes were recorded. Likewise, damage produced by a downburst is sometimes confused with that produced by a tornado. With these facts in mind, a good systematic database on tornadoes is necessary before drawing conclusions that would otherwise be insufficiently justified. Such a database is not easy to build, because it requires detailed information about damage, meteorological observations, and testimonies, all of which must be filtered by good quality control. After a general presentation on tornadoes and downbursts in the Mediterranean region, this contribution presents the database of tornadoes that have affected Catalonia during the period 1994-2009, starting with the tornado recorded at l'Espluga de Francolí on 31 August 1994. This database has been built on the basis of AEMET information, the Consorcio de Compensación de Seguros (the Spanish insurance consortium for natural disasters), newspapers, and field visits to the affected places.

  12. Preparing a clinical activities database in plastic surgery using existing information systems.

    PubMed

    Dunn, K; Bazire, N; Housley, L; Shakespeare, P

    1994-11-01

    The collection of data for audit and clinical activity analysis is time consuming. This report outlines a technique by which data held on hospital data systems, such as the PAS and theatre management systems, can be made available to clinical investigators. The saving in time and effort can be enormous. PMID:7598400

  13. Watershed Data Management (WDM) database for Salt Creek streamflow simulation, DuPage County, Illinois, water years 2005-11

    USGS Publications Warehouse

    Bera, Maitreyee

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Stormwater Management Division, maintains a USGS database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. Most of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. An earlier report describes the WDM database development in detail, including the processing of data from January 1, 1997, through September 30, 2004, in the SEP04.WDM database. SEP04.WDM was updated with appended data from October 1, 2004, through September 30, 2011 (water years 2005–11), and renamed SEP11.WDM. This report details the processing of meteorologic and hydrologic data in SEP11.WDM. This report provides a record of snow-affected periods and the data used to fill missing-record periods for each precipitation site during water years 2005–11. The meteorologic data filling methods are described in detail in Over and others (2010), and an update is provided in this report.

  14. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    USGS Publications Warehouse

    Wang, Lizhu; Riseng, Catherine M.; Mason, Lacey; Werhrly, Kevin; Rutherford, Edward; McKenna Jr, James E.; Castiglione, Chris; Johnson, Lucinda B.; Infante, Dana M.; Sowa, Scott P.; Robertson, Mike; Schaeffer, Jeff; Khoury, Mary; Gaiot, John; Hollenhurst, Tom; Brooks, Colin N.; Coscarelli, Mark

    2015-01-01

    Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a spatial classification framework and database — Great Lakes Aquatic Habitat Framework (GLAHF). GLAHF consists of catchments, coastal terrestrial, coastal margin, nearshore, and offshore zones that encompass the entire Great Lakes Basin. The catchments captured in the database as river pour points or coastline segments are attributed with data known to influence physicochemical and biological characteristics of the lakes from the catchments. The coastal terrestrial zone consists of 30-m grid cells attributed with data from the terrestrial region that has direct connection with the lakes. The coastal margin and nearshore zones consist of 30-m grid cells attributed with data describing the coastline conditions, coastal human disturbances, and moderately to highly variable physicochemical and biological characteristics. The offshore zone consists of 1.8-km grid cells attributed with data that are spatially less variable compared with the other aquatic zones. These spatial classification zones and their associated data are nested within lake sub-basins and political boundaries and allow the synthesis of information from grid cells to classification zones, within and among political boundaries, lake sub-basins, Great Lakes, or within the entire Great Lakes Basin. This spatially structured database could help the development of basin-wide management plans, prioritize locations for funding and specific management actions, track protection and restoration progress, and conduct research for science-based decision making.
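    The synthesis the abstract describes, rolling attributed grid cells up to classification zones, sub-basins, or lakes, amounts to grouped aggregation over a nested key hierarchy. The sketch below shows that idea in plain Python; the record fields and values are invented for illustration and are not the GLAHF schema.

```python
from collections import defaultdict

# Illustrative stand-ins for attributed grid cells; keys ("zone",
# "sub_basin", "chlorophyll") are hypothetical, not GLAHF field names.
cells = [
    {"zone": "nearshore", "sub_basin": "LM-south", "chlorophyll": 2.1},
    {"zone": "nearshore", "sub_basin": "LM-south", "chlorophyll": 2.5},
    {"zone": "offshore",  "sub_basin": "LM-south", "chlorophyll": 0.8},
]

def mean_by(records, key, attr):
    """Aggregate a cell attribute up one level of the classification."""
    sums = defaultdict(lambda: [0.0, 0])
    for rec in records:
        acc = sums[rec[key]]
        acc[0] += rec[attr]
        acc[1] += 1
    return {group: total / n for group, (total, n) in sums.items()}

zone_means = mean_by(cells, "zone", "chlorophyll")
print(zone_means)  # nearshore mean ~2.3, offshore 0.8
```

    Because each cell carries both zone and sub-basin keys, the same function aggregates to either level, which is the "synthesis within and among political boundaries and sub-basins" the abstract claims.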

  15. Identification of promiscuous ene-reductase activity by mining structural databases using active site constellations

    PubMed Central

    Steinkellner, Georg; Gruber, Christian C.; Pavkov-Keller, Tea; Binter, Alexandra; Steiner, Kerstin; Winkler, Christoph; Łyskowski, Andrzej; Schwamberger, Orsolya; Oberer, Monika; Schwab, Helmut; Faber, Kurt; Macheroux, Peter; Gruber, Karl

    2014-01-01

    The exploitation of catalytic promiscuity and the application of de novo design have recently opened the access to novel, non-natural enzymatic activities. Here we describe a structural bioinformatic method for predicting catalytic activities of enzymes based on three-dimensional constellations of functional groups in active sites (‘catalophores’). As a proof-of-concept we identify two enzymes with predicted promiscuous ene-reductase activity (reduction of activated C–C double bonds) and compare them with known ene-reductases, that is, members of the Old Yellow Enzyme family. Despite completely different amino acid sequences, overall structures and protein folds, high-resolution crystal structures reveal equivalent binding modes of typical Old Yellow Enzyme substrates and ligands. Biochemical and biocatalytic data show that the two enzymes indeed possess ene-reductase activity and reveal an inverted stereopreference compared with Old Yellow Enzymes for some substrates. This method could thus be a tool for the identification of viable starting points for the development and engineering of novel biocatalysts. PMID:24954722

  16. The U.S. Geological Survey mapping and cartographic database activities, 2006-2010

    USGS Publications Warehouse

    Craun, Kari J.; Donnelly, John P.; Allord, Gregory J.

    2011-01-01

    The U.S. Geological Survey (USGS) began systematic topographic mapping of the United States in the 1880s, beginning with scales of 1:250,000 and 1:125,000 in support of geological mapping. Responding to the need for higher resolution and more detail, the 1:62,500-scale, 15-minute topographic map series was begun early in the 20th century. Finally, in the 1950s the USGS adopted the 1:24,000-scale, 7.5-minute topographic map series to portray even more detail, completing the coverage of the conterminous 48 states of the United States with this series in 1992. In 2001, the USGS developed the vision and concept of The National Map, a topographic database for the 21st century and the source for a new generation of topographic maps (http://nationalmap.gov/). In 2008, the initial production of those maps began with a 1:24,000-scale digital product. In a separate, but related project, the USGS began scanning the existing inventory of historical topographic maps at all scales to accompany the new topographic maps. The USGS also developed a digital database of The National Atlas of the United States. The digital version of the Atlas is now available on the Web and supports a mapping engine for small-scale maps of the United States and North America. These three efforts define the topographic mapping activities of the USGS during the last few years and are discussed below.

  17. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: they are usually not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and consume considerable CPU and memory. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases to perform data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users can create customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the ability to execute arbitrary SQL statements directly.
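    The three-layer design (the browser never touches the database; the server runs the query and serializes results for the client) can be sketched as below. This uses Python's stdlib sqlite3 and json as stand-ins; the real tool uses SQLAlchemy against several backends, and the function and table names here are illustrative assumptions.

```python
import json
import sqlite3

# Server-side stand-in database; the client would only ever see JSON.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (id INTEGER, status TEXT)")
db.executemany("INSERT INTO runs VALUES (?, ?)",
               [(1, "done"), (2, "failed")])

def export_table(conn, table, fmt="json"):
    """Serialize a whole table for the client (XML export omitted here)."""
    # In a real server, `table` would come from a vetted list, never
    # from raw user input, mirroring the "no direct SQL" security model.
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    rows = [dict(zip(cols, r)) for r in cur.fetchall()]
    return json.dumps(rows)

print(export_table(db, "runs"))
# [{"id": 1, "status": "done"}, {"id": 2, "status": "failed"}]
```

    Keeping the single connection on the server side is what lets the tool avoid the short connections and concurrent sessions the abstract mentions.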

  18. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  19. Digital Database of Recently Active Traces of the Hayward Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.

    2006-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Hayward Fault Zone, California. The mapped traces represent the integration of the following three different types of data: (1) geomorphic expression, (2) creep (aseismic fault slip), and (3) trench exposures. This publication is a major revision of an earlier map (Lienkaemper, 1992), which both brings the evidence for faulting up to date and makes it available both as a digital database for use within a geographic information system (GIS) and in formats for broader public interactive access using widely available viewing software. The pamphlet describes in detail the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance on the use and limitations of the map. [Last revised Nov. 2008, a minor update for 2007 LiDAR and recent trench investigations; see version history below.]

  20. A novel meta-analytic approach: Mining frequent co-activation patterns in neuroimaging databases

    PubMed Central

    Caspers, Julian; Zilles, Karl; Beierle, Christoph; Rottschy, Claudia; Eickhoff, Simon B.

    2016-01-01

    In recent years, coordinate-based meta-analyses have become a powerful and widely used tool to study coactivity across neuroimaging experiments, a development that was supported by the emergence of large-scale neuroimaging databases like BrainMap. However, the evaluation of co-activation patterns is constrained by the fact that previous coordinate-based meta-analysis techniques like Activation Likelihood Estimation (ALE) and Multilevel Kernel Density Analysis (MKDA) reveal all brain regions that show convergent activity within a dataset without taking into account actual within-experiment co-occurrence patterns. To overcome this issue we here propose a novel meta-analytic approach named PaMiNI that utilizes a combination of two well-established data-mining techniques, Gaussian mixture modeling and the Apriori algorithm. By this, PaMiNI enables a data-driven detection of frequent co-activation patterns within neuroimaging datasets. The feasibility of the method is demonstrated by means of several analyses on simulated data as well as a real application. The analyses of the simulated data show that PaMiNI identifies the brain regions underlying the simulated activation foci and perfectly separates the co-activation patterns of the experiments in the simulations. Furthermore, PaMiNI still yields good results when activation foci of distinct brain regions become closer together or if they are non-Gaussian distributed. For the further evaluation, a real dataset on working memory experiments is used, which was previously examined in an ALE meta-analysis and hence allows a cross-validation of both methods. In this latter analysis, PaMiNI revealed a fronto-parietal “core” network of working memory and furthermore indicates a left-lateralization in this network. Finally, to encourage a widespread usage of this new method, the PaMiNI approach was implemented into a publicly available software system. PMID:24365675
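    PaMiNI combines Gaussian mixture modeling (to derive brain regions from activation foci) with the Apriori algorithm (to mine frequent co-activation patterns). The sketch below illustrates only the Apriori half on toy data: each experiment is reduced to the set of regions it activates, and larger candidate patterns are built only from frequent smaller ones. Region names and the support threshold are invented for illustration.

```python
from itertools import combinations

# Toy "experiments", each reduced to its set of activated regions.
experiments = [
    {"dlPFC", "PPC", "preSMA"},
    {"dlPFC", "PPC"},
    {"dlPFC", "PPC", "insula"},
    {"insula", "preSMA"},
]

def frequent_patterns(transactions, min_support=2):
    """Classic Apriori: return {pattern: support} for frequent itemsets."""
    items = sorted({i for t in transactions for i in t})
    frequent, size = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        size += 1
        # Apriori pruning: a candidate is kept only if every subset one
        # element smaller is itself frequent at the previous level.
        merged = {a | b for a in level for b in level if len(a | b) == size}
        candidates = [c for c in merged
                      if all(frozenset(s) in level
                             for s in combinations(c, size - 1))]
    return frequent

pats = frequent_patterns(experiments)
print(pats[frozenset({"dlPFC", "PPC"})])  # 3
```

    On this toy data the fronto-parietal pair {dlPFC, PPC} emerges as the only frequent co-activation beyond single regions, loosely analogous to the "core" working-memory network the abstract reports.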

  1. User support for a library-managed online database search service: the BMA Library free MEDLINE service.

    PubMed

    Rowlands, J; Yeadon, J; Forrester, W; McSeán, T

    1997-07-01

    This paper discusses user support in the context of a library-managed online database search service. Experience is drawn from the British Medical Association (BMA) Library's Free MEDLINE Service. More than 9,600 BMA members, who are largely unfamiliar with computer communications and database searching, have registered as users of the service. User support has played a significant role in the development of the service and has comprised four main aspects: an information pack, a help desk, online help, and MEDLINE courses. The paper includes an analysis of help desk usage statistics collected from January 1996 through June 1996, and highlights other relevant research. Plans for further service enhancements and their implications in terms of future user support are discussed. PMID:9285124

  2. SELCTV SYSTEM MANUAL FOR SELCTV AND REFER DATABASES AND THE SELCTV DATA MANAGEMENT PROGRAM

    EPA Science Inventory

    The SELCTV database is a compilation of the side effects of pesticides on arthropod predators and parasitoids that provide biological control of pest arthropods in the agricultural ecosystem. The primary source of side effects data is the published scientific literature; reference...

  3. Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard J.; Draper, Jesse M.

    A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…

  4. Creating Smarter Classrooms: Data-Based Decision Making for Effective Classroom Management

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; McDaniel, Sara

    2012-01-01

    The term "data-based decision making" (DBDM) has become pervasive in education and typically refers to the use of data to make decisions in schools, from assessment of an individual student's academic progress to whole-school reform efforts. Research suggests that special education teachers who use progress monitoring data (a DBDM approach) adapt…

  5. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... published in the Federal Register on September 2, 2011 (76 FR 54794). The workers of The Hartford, Corporate... Employment and Training Administration Hartford Financial Services, Inc., Corporate/EIT/CTO Database...) applicable to workers and former workers Hartford Financial Services, Inc., Corporate/EIT/CTO...

  6. The PRRS Host Genomic Consortium (PHGC) Database: Management of large data sets.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In any consortium project where large amounts of phenotypic and genotypic data are collected across several research labs, issues arise with maintenance and analysis of datasets. The PRRS Host Genomic Consortium (PHGC) Database was developed to meet this need for the PRRS research community. The sch...

  7. Development of a grape genomics database using IBM DB2 content manager software

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A relational database was created for the North American Grapevine Genome project at the Viticultural Research Center, at Florida A&M University. The collaborative project with USDA, ARS researchers is an important resource for viticulture production of new grapevine varieties which will be adapted ...

  8. Huntington's Disease Research Roster Support with a Microcomputer Database Management System

    PubMed Central

    Gersting, J. M.; Conneally, P. M.; Beidelman, K.

    1983-01-01

    This paper chronicles the MEGADATS (Medical Genetics Acquisition and DAta Transfer System) database development effort in collecting, storing, retrieving, and plotting human family pedigrees. The newest system, MEGADATS-3M, is detailed. Emphasis is on the microcomputer version of MEGADATS-3M and its use to support the Huntington's Disease research roster project. Examples of data input and pedigree plotting are included.

  9. Chronic Fatigue Syndrome (CFS): Managing Activities and Exercise

    MedlinePlus

    ... Fatigue Syndrome (CFS): Managing Activities and Exercise. On this page: Avoiding Extremes; Developing an Activity ... recent manageable level of activity. Strength and Conditioning Exercises: strength and conditioning exercises are an important component ...

  10. Managerial activities and skills of nurse managers: an exploratory study.

    PubMed

    Lin, Li-Min; Wu, Jen-Her; White, Louis P

    2005-01-01

    In this study, the authors used the activity competency model (Wu, Chen, and Lin 2004) to investigate the perceived importance of managerial activities and skills required of three levels of nurse managers. They identify the portfolio of management activities and the skills needed at each management level. The results of this study provide guidelines for management development programs, training, and career planning for nurse managers, and can also serve as guidelines for recruiting and selecting effective nurse managers. PMID:16190515

  11. Analysis of DOE international environmental management activities

    SciTech Connect

    Ragaini, R.C.

    1995-09-01

    The Department of Energy's (DOE) Strategic Plan (April 1994) states that DOE's long-term vision includes world leadership in environmental restoration and waste management activities. The activities of the DOE Office of Environmental Management (EM) can play a key role in DOE's goals of maintaining U.S. global competitiveness and ensuring the continuation of a world-class science and technology community. DOE's interest in attaining these goals stems partly from its participation in organizations like the Trade Policy Coordinating Committee (TPCC), with its National Environmental Export Promotion Strategy, which seeks to strengthen U.S. competitiveness and the building of public-private partnerships as part of U.S. industrial policy. The International Interactions Field Office task will build a communication network which will facilitate efficient and effective communication between DOE Headquarters, Field Offices, and contractors. Under this network, Headquarters will provide the Field Offices with information on the Administration's policies and activities (such as the DOE Strategic Plan), interagency activities, as well as relevant information from other field offices. Lawrence Livermore National Laboratory (LLNL) will, in turn, provide Headquarters with information on various international activities which, when appropriate, will be included in reports to groups like the TPCC and the EM Focus Areas. This task provides for the collection, review, and analysis of information on the more significant international environmental restoration and waste management initiatives and activities which have been used or are being considered at LLNL. Information gathering will focus on efforts and accomplishments in meeting the challenges of providing timely and cost-effective cleanup of its environmentally damaged sites and facilities, especially through international technical exchanges and/or the implementation of foreign-developed technologies.

  12. Soil Characterization Database for the Area 3 Radioactive Waste Management Site, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    R. D. Van Remortel; Y. J. Lee; K. E. Snyder

    2005-01-01

    Soils were characterized in an investigation at the Area 3 Radioactive Waste Management Site at the U.S. Department of Energy Nevada Test Site in Nye County, Nevada. Data from the investigation are presented in four parameter groups: sample and site characteristics, U.S. Department of Agriculture (USDA) particle size fractions, chemical parameters, and American Society for Testing Materials-Unified Soil Classification System (ASTM-USCS) particle size fractions. Spreadsheet workbooks based on these parameter groups are presented to evaluate data quality, conduct database updates, and set data structures and formats for later extraction and analysis. This document does not include analysis or interpretation of presented data.

  13. Soil Characterization Database for the Area 5 Radioactive Waste Management Site, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    Y. J. Lee; R. D. Van Remortel; K. E. Snyder

    2005-01-01

    Soils were characterized in an investigation at the Area 5 Radioactive Waste Management Site at the U.S. Department of Energy Nevada Test Site in Nye County, Nevada. Data from the investigation are presented in four parameter groups: sample and site characteristics, U.S. Department of Agriculture (USDA) particle size fractions, chemical parameters, and American Society for Testing Materials-Unified Soil Classification System (ASTM-USCS) particle size fractions. Spreadsheet workbooks based on these parameter groups are presented to evaluate data quality, conduct database updates, and set data structures and formats for later extraction and analysis. This document does not include analysis or interpretation of presented data.

  14. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  15. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  16. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  17. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. 
The project is now into its eleventh month of Phase 1.

  18. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its twenty-fourth month of development activities.

  19. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eighteenth month of development activities.

  20. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  1. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  2. Earthquake Model of the Middle East (EMME) Project: Active Fault Database for the Middle East Region

    NASA Astrophysics Data System (ADS)

    Gülen, L.; Wp2 Team

    2010-12-01

    The Earthquake Model of the Middle East (EMME) Project is a regional project under the umbrella GEM (Global Earthquake Model) project (http://www.emme-gem.org/). The EMME project region includes Turkey, Georgia, Armenia, Azerbaijan, Syria, Lebanon, Jordan, Iran, Pakistan, and Afghanistan. The EMME and SHARE projects overlap, and Turkey serves as a bridge connecting the two. The Middle East region is a tectonically and seismically very active part of the Alpine-Himalayan orogenic belt. Many major earthquakes have occurred in this region over the years, causing casualties in the millions. The EMME project will use the PSHA approach, and the existing source models will be revised or modified by the incorporation of newly acquired data. More importantly, the most distinguishing aspect of the EMME project from previous ones will be its dynamic character. This important characteristic is accomplished by the design of a flexible and scalable database that will permit continuous update, refinement, and analysis. A digital active fault map of the Middle East region is under construction in ArcGIS format. We are developing a database of fault parameters for active faults that are capable of generating earthquakes above a threshold magnitude of Mw≥5.5. Similar to the WGCEP-2007 and UCERF-2 projects, the EMME project database includes information on the geometry and rates of movement of faults in a “Fault Section Database”. The “Fault Section” concept has a physical significance, in that if one or more fault parameters change, a new fault section is defined along a fault zone. So far over 3,000 Fault Sections have been defined and parameterized for the Middle East region. A separate “Paleo-Sites Database” includes information on the timing and amounts of fault displacement for major fault zones. A digital reference library that includes the PDF files of the relevant papers and reports is also being prepared. Another task of WP-2 of the EMME project is to prepare
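The "Fault Section" rule described in this record, that a new section is opened along a fault zone wherever a parameter changes, can be sketched as a simple splitting routine. The class fields and sample values below are assumptions for illustration, not the EMME schema.

```python
from dataclasses import dataclass

# Illustrative fault-section splitting: walk along a fault zone and open a
# new section each time a parameter (here, slip rate) changes.
@dataclass
class FaultSection:
    fault_name: str
    start_km: float
    end_km: float
    slip_rate_mm_yr: float

def split_into_sections(fault_name, samples):
    """samples: list of (position_km, slip_rate_mm_yr) along the fault trace.
    Returns one FaultSection per run of constant slip rate."""
    sections = []
    start, current_rate = samples[0]
    for pos, rate in samples[1:]:
        if rate != current_rate:
            sections.append(FaultSection(fault_name, start, pos, current_rate))
            start, current_rate = pos, rate
    sections.append(FaultSection(fault_name, start, samples[-1][0], current_rate))
    return sections

# Hypothetical fault zone whose slip rate drops from 18 to 12 mm/yr at km 55:
secs = split_into_sections("demo-fault", [(0.0, 18.0), (55.0, 12.0), (120.0, 12.0)])
```

Under this rule the demo fault yields two sections, mirroring how a parameter change, rather than geography alone, defines section boundaries.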

  3. There must be a better way! Managing a corporate web site dynamically from a database

    SciTech Connect

    J. Z. Cohen

    1998-10-21

    This document is a set of slides available from http://www1.y12.org/lmes_sti/html/ycsdinf-98-8/index.htm that describes limitations of static web pages for conveying information, a plan for overcoming these limitations by generating web pages dynamically from a database, expected advantages and disadvantages of this method, design for a system using the method, and future plans.
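The method these slides describe, generating pages on request from database content so that edits happen in one place, can be sketched minimally as below. The table layout and page content are invented for illustration; the slides' actual design may differ.

```python
import sqlite3

# Minimal dynamic-page sketch: page content lives in a database and HTML is
# rendered per request, avoiding the maintenance burden of static files.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
db.execute("INSERT INTO page VALUES ('about', 'About Us', 'We make things.')")

def render(slug):
    """Return an HTML page for the given slug, or a 404 page if absent."""
    row = db.execute(
        "SELECT title, body FROM page WHERE slug = ?", (slug,)
    ).fetchone()
    if row is None:
        return "<h1>404 Not Found</h1>"
    title, body = row
    return (f"<html><head><title>{title}</title></head>"
            f"<body><h1>{title}</h1><p>{body}</p></body></html>")

html = render("about")
```

The expected advantage noted in the slides follows directly: updating the `body` column immediately changes every rendered page, with no static files to regenerate.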

  4. Third millennium ideal gas and condensed phase thermochemical database for combustion (with update from active thermochemical tables).

    SciTech Connect

    Burcat, A.; Ruscic, B.; Chemistry; Technion - Israel Inst. of Tech.

    2005-07-29

    The thermochemical database of species involved in combustion processes has been freely available for over 25 years. It was first published in print in 1984, approximately 8 years after it was first assembled, and contained 215 species at the time. This is the 7th printed edition and most likely will be the last one in print in the present format, which involves substantial manual labor. The database currently contains more than 1300 species, specifically organic molecules and radicals, but also inorganic species connected to combustion and air pollution. Since 1991 this database has been freely available on the internet, at the Technion-IIT ftp server, and it is continuously expanded and corrected. The database is mirrored daily at an official mirror site, and at random at about a dozen unofficial mirror and 'finger' sites. The present edition contains numerous corrections and many recalculations of provisional data by the G3//B3LYP method, a high-accuracy composite ab initio calculation. About 300 species are newly calculated and are not yet published elsewhere. In anticipation of the full coupling, which is under development, the database has started incorporating the available (as yet unpublished) values from Active Thermochemical Tables. The electronic version now also contains an XML file of the main database to allow transfer to other formats and ease finding specific information of interest. The database is used by scientists, educators, engineers and students at all levels, dealing primarily with combustion and air pollution, jet engines, rocket propulsion, and fireworks, but also by researchers involved in upper atmosphere kinetics, astrophysics, abrasion metallurgy, etc. This introductory article contains explanations of the database and the means to use it, its sources, ways of calculation, and assessments of the accuracy of the data.
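Databases of this kind store each species' thermodynamics as NASA polynomial coefficients, from which properties such as heat capacity are evaluated; in the 7-term form, cp/R = a1 + a2·T + a3·T² + a4·T³ + a5·T⁴. The evaluation can be sketched as below; the demo coefficients are placeholders, not values taken from the database.

```python
# Evaluate heat capacity from the first five NASA 7-term polynomial
# coefficients: cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4.
R = 8.314462618  # universal gas constant, J/(mol*K)

def cp(T, a):
    """Heat capacity in J/(mol*K) at temperature T (K) from coefficients a[0..4]."""
    return R * (a[0] + a[1] * T + a[2] * T**2 + a[3] * T**3 + a[4] * T**4)

# Placeholder coefficients giving a constant cp/R = 3.5, the rigid
# diatomic ideal-gas value; real database entries carry temperature-
# dependent terms and separate low/high temperature ranges.
a_demo = [3.5, 0.0, 0.0, 0.0, 0.0]
cp_300 = cp(300.0, a_demo)
```

Real entries also include two further coefficients (a6, a7) for enthalpy and entropy integration constants, which this sketch omits.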

  5. Spatial database for the management of "urban geology" geothematic information: the case of Drama City, Greece

    NASA Astrophysics Data System (ADS)

    Pantelias, Eustathios; Zervakou, Alexandra D.; Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.

    2008-10-01

    The aggregation of population in big cities leads to the concentration of human activities, economic wealth, overconsumption of natural resources, and urban growth without planning and sustainable management. As a result, urban societies are exposed to various dangers and threats with economic, social, and ecological-environmental impacts on the urban surroundings. Problems associated with urban development are related to the geological conditions of cities and their surroundings, e.g. flooding, land subsidence, groundwater pollution, soil contamination, earthquakes, landslides, etc. For these reasons, no sustainable urban planning can be done without geological information support. The first systematic recording, codification and documentation of "urban geology" geothematic information in Greece is implemented by the Institute of Geological and Mineral Exploration (I.G.M.E.) in the frame of the project "Collection, codification and documentation of geothematic information for urban and suburban areas in Greece - pilot applications". Through the implementation of this project, all geothematic information derived from geological mapping and geotechnical, geochemical and geophysical research and measurements in four pilot areas of Greece, Drama (northern Greece), Nafplio and Sparti (Peloponnese), and Thrakomakedones (Attica), is stored and processed in specially designed geodatabases in a GIS environment containing vector and raster data. For this GIS application an ArcGIS Personal Geodatabase is used. Data are classified in geothematic layers, grouped in geothematic datasets (e.g. Topography, Geology - Tectonics, Submarine Geology, Technical Geology, Hydrogeology, Soils, Radioactive elements, etc.) and processed in order to produce multifunctional geothematic maps. All compiled data constitute the essential base for land use planning and environmental protection in the specific urban areas. With the termination of the project the produced geodatabase and other digital data

  6. 77 FR 31615 - Improving Mail Management Policies, Procedures, and Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ... ADMINISTRATION Improving Mail Management Policies, Procedures, and Activities AGENCY: Office of Governmentwide... Services Administration (GSA) has issued Federal Management Regulation (FMR) Bulletin G-03 which provides guidance to Executive Branch agencies for improving mail management policies, procedures, and...

  7. Decision Support for Active Water Management (Invited)

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.; Salas, F.; Minsker, B. S.

    2013-12-01

    Active water management refers to real-time adjustment of water management decisions based on observation and modeling of current water conditions. A case study is presented of a decision-support system for active water management in the San Antonio and Guadalupe basins using web services and cloud computing to create at the University of Texas a repository of observations, forecasts and model simulations from federal, state and regional water agencies and academia. Each day, National Weather Service river flow forecasts at 47 points in the basin are densified to create corresponding flows in 5500 river reaches using the RAPID river flow model operated in "Model as a Service" mode at the University of Illinois. These flows are adjusted by using the "Declarations of Intent" to pump water compiled by the Texas Commission for Environmental Quality which is the WaterMaster for all surface water withdrawals in the basin. The results are viewed through web maps that convey both maps of the spatial pattern of flow at a particular point in time, and charts of time series of flows at particular points in space.
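The adjustment step described in this record, reducing forecast reach flows by declared withdrawals, can be sketched as below. The reach identifiers, flow values, and the simple subtract-and-floor rule are illustrative assumptions; they stand in for the RAPID model output and the TCEQ declarations, not for their actual formats.

```python
# Sketch of active-management flow adjustment: forecast flows per river reach
# are reduced by declared withdrawals, floored at zero so a reach cannot go
# negative. All identifiers and values are illustrative.
forecast_cms = {"reach_1": 12.0, "reach_2": 8.5, "reach_3": 3.0}       # m^3/s
declared_withdrawals_cms = {"reach_2": 1.5, "reach_3": 4.0}            # m^3/s

def adjust_flows(forecast, withdrawals):
    """Return forecast flows net of declared withdrawals, never below zero."""
    return {r: max(0.0, q - withdrawals.get(r, 0.0)) for r, q in forecast.items()}

adjusted = adjust_flows(forecast_cms, declared_withdrawals_cms)
```

A reach driven to zero (as `reach_3` is here) is exactly the kind of condition a decision-support map would flag for the WaterMaster in real time.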

  8. Woodlot management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1995-01-01

    The bibliography contains citations concerning worldwide woodlot and forest management. Topics cover private forest ecology, environmental policies, legislation, and land use. Timber harvesting, logging, timber valuation, farm management and forest inventories are examined. Forestry economics, including forecasting and planning, are also included. (Contains a minimum of 199 citations and includes a subject term index and title list.)

  9. Sustainable forest management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1996-12-01

    The bibliography contains citations concerning developments in sustainable forestry management. Topics include international regulations, economics, strategies, land use rights, ecological impact, political developments, and evaluations of sustainable forestry resource management programs. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  10. A new database on contaminant exposure and effects in terrestrial vertebrates for natural resource managers

    USGS Publications Warehouse

    Rattner, B.A.; Pearson, J.L.; Garrett, L.J.; Erwin, R.M.; Walz, A.; Ottinger, M.A.

    1997-01-01

    The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior is focused on identifying and understanding the effects of contaminant stressors on biological resources under its stewardship. Despite the desire of many to continuously monitor the environmental health of our estuaries, much can be learned by summarizing existing temporal, geographic, and phylogenetic contaminant information. To this end, retrospective contaminant exposure and effects data for amphibians, reptiles, birds, and mammals residing within 30 km of Atlantic coast estuaries are being assembled through searches of published literature (e.g., Fisheries Review, Wildlife Review, BIOSIS Previews) and databases (e.g., US EPA Ecological Incident Information System; USGS Diagnostic and Epizootic Databases), and compilation of summary data from unpublished reports of government natural resource agencies, private conservation groups, and universities. These contaminant exposure and effect data for terrestrial vertebrates (CEE-TV) are being summarized using Borland dBASE in a 96-field format, including species, collection time and site coordinates, sample matrix, contaminant concentration, biomarker and bioindicator responses, and source of information (N>1500 records). This CEE-TV database has been imported into the ARC/INFO geographic information system (GIS) for purposes of examining geographic coverage and trends, and to identify critical data gaps. A preliminary risk assessment will be conducted to identify and characterize contaminants and other stressors potentially affecting terrestrial vertebrates that reside in, migrate through, or reproduce in these estuaries. Evaluations are underway, using specific measurement and assessment endpoints, to rank and prioritize estuarine ecosystems in which terrestrial vertebrates are potentially at risk, for purposes of prediction and focusing future biomonitoring efforts.

  11. COMPILATION AND MANAGEMENT OF ORP GLASS FORMULATION DATABASE, VSL-12R2470-1 REV 0

    SciTech Connect

    Kruger, Albert A.; Pasieka, Holly K.; Muller, Isabelle; Gilbo, Konstantin; Perez-Cardenas, Fernando; Joseph, Innocent; Pegg, Ian L.; Kot, Wing K.

    2012-12-13

    The present report describes the first steps in the development of a glass property-composition database for WTP LAW and HLW glasses that includes all of the data that were used in the development of the WTP baseline models and all of the data collected subsequently as part of WTP enhancement studies performed for ORP. The data were reviewed to identify some of the more significant gaps in the composition space that will need to be filled to support waste processing at Hanford. The WTP baseline models have been evaluated against the new data in terms of range of validity and prediction performance.

  12. Management of patients with active caries.

    PubMed

    Milgrom, Peter

    2014-07-01

    This paper reports on a mechanism to manage caries as a disease and to medically intervene in the disease process to halt progression. The goal of this paper is to provide this alternative to a surgical-only approach. The management of caries begins with assessing lesion activity and the potential for arrest. This requires a clinical and radiological assessment and evaluation of risk. Hopeless teeth are extracted and large cavities filled to reduce infection. Risk reduction strategies are employed so efforts to arrest lesions can be successful. Teeth with lesions in the enamel or outer third of the dentin should be sealed, not restored, as restorations can weaken teeth and can be traumatic to pulps. PMID:25076627

  13. Design for manufacturability production management activity report

    NASA Astrophysics Data System (ADS)

    Miyazaki, Norihiko; Sato, T.; Honma, M.; Yoshioka, N.; Hosono, K.; Onodera, T.; Itoh, H.; Suzuki, H.; Uga, T.; Kadota, K.; Iriki, N.

    2006-05-01

    The Design For Manufacturability Production Management (DFM-PM) Subcommittee was started in 2005 as the successor to the Reticle Management Subcommittee (RMS) in the Semiconductor Manufacturing Technology Committee for Japan (SMTCJ). Our activity focuses on the SoC (System on Chip) business and pursues improved communication in manufacturing technology. The first theme of the activity is the investigation and examination of new trends in production (manufacturing) technology and related information, and proposals for business solutions. The second theme is standardization activity for manufacturing technology and cooperation with related semiconductor organizations. The third theme is holding workshops and supporting the promotion and spread of standardization technology throughout semiconductor companies. We are expanding our scope from design technology to wafer pattern reliability, and we will propose the competition domain, the collaboration area, and the standardization technology for DFM. Furthermore, we aim to build an SoC business model as a manufacturing platform for the 45 nm node and beyond, linking design information with production information through EDA technology.

  14. A database and model to support proactive management of sediment-related sewer blockages.

    PubMed

    Rodríguez, Juan Pablo; McIntyre, Neil; Díaz-Granados, Mario; Maksimović, Cedo

    2012-10-01

    Due to increasing customer and political pressures, and more stringent environmental regulations, sediment and other blockage issues are now a high priority when assessing sewer system operational performance. Blockages caused by sediment deposits reduce sewer system reliability and demand remedial action at considerable operational cost. Consequently, procedures are required for identifying which parts of the sewer system are in most need of proactive removal of sediments. This paper presents an exceptionally long (7.5 years) and spatially detailed (9658 grid squares--0.03 km² each--covering a population of nearly 7.5 million) data set obtained from a customer complaints database in Bogotá (Colombia). The sediment-related blockage data are modelled using homogeneous and non-homogeneous Poisson process models. In most of the analysed areas the inter-arrival time between blockages can be represented by the homogeneous process, but there are a considerable number of areas (up to 34%) for which there is strong evidence of non-stationarity. In most of these cases, the mean blockage rate increases over time, signifying a continual deterioration of the system despite repairs, this being particularly marked for pipe and gully pot related blockages. The physical properties of the system (mean pipe slope, diameter and pipe length) have a clear but weak influence on observed blockage rates. The Bogotá case study illustrates the potential value of customer complaints databases and formal analysis frameworks for proactive sewerage maintenance scheduling in large cities. PMID:22794800
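The stationarity question this abstract raises, whether the blockage rate is constant or drifting upward, is commonly checked with the Laplace trend test: under a homogeneous Poisson process the statistic is approximately standard normal, and large positive values indicate an increasing rate (deterioration). The sketch below uses invented event times, not the Bogotá data.

```python
import math

# Laplace trend test for a point process observed on [0, T]: under a
# homogeneous Poisson process the statistic is approximately N(0, 1);
# U >> 0 suggests the event rate increases with time.
def laplace_trend_statistic(event_times, observation_end):
    n = len(event_times)
    return (sum(event_times) - n * observation_end / 2.0) / (
        observation_end * math.sqrt(n / 12.0)
    )

# Hypothetical blockage dates (days since start of record) that cluster late
# in a ~7.5-year observation window:
events = [400, 900, 1300, 1600, 1800, 1950, 2100, 2250, 2400, 2600]
T = 2738
U = laplace_trend_statistic(events, T)
```

A persistently positive U for an area would mark it as deteriorating despite repairs, which is exactly the signal used here to prioritize proactive sediment removal.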

  15. Wetlands legislation and management. (Latest citations from the Selected Water Resources Abstracts database). Published Search

    SciTech Connect

    Not Available

    1994-02-01

    The bibliography contains citations concerning federal and state legislation governing coastal and fresh water wetlands. Studies of regional regulations and management of specific sites are included. Topics such as reconciling environmental considerations with economic pressures and landowners' rights are covered. Wetlands restoration projects, conservation projects, and development plans are also presented. Many citations discuss wetlands management in relation to the Clean Water Act. (Contains 250 citations and includes a subject term index and title list.)

  16. The Neotoma Paleoecology Database

    NASA Astrophysics Data System (ADS)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community
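The remote, programmatic access described above might look like the following client-side sketch. The endpoint URL, route, query parameter and JSON field names are assumptions for illustration, not the documented Neotoma API, and the canned payload stands in for a live response.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint -- consult the current Neotoma web-service documentation
# for the real routes and field names.
BASE = "https://api.neotomadb.org/v2.0/data/sites"

def build_query(params):
    """Build a web-service query URL; real-time access avoids stale local dumps."""
    return f"{BASE}?{urlencode(params)}"

# A canned response illustrating how a client might consume the JSON payload.
sample_response = json.loads("""
{"data": [
  {"siteid": 1, "sitename": "Example Bog",  "latitude": 44.5, "longitude": -93.2},
  {"siteid": 2, "sitename": "Example Lake", "latitude": 61.0, "longitude": -149.9}
]}
""")

def site_names(payload):
    """Extract site names from a (hypothetical) sites-endpoint payload."""
    return [rec["sitename"] for rec in payload["data"]]

url = build_query({"sitename": "Example%"})
print(url)
print(site_names(sample_response))
```

The point of the sketch is the access pattern, a parameterized HTTP query returning current records, rather than any particular schema.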

  17. Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2014-01-01

    We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demands greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.
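As a toy illustration of the kind of classification-rule learning described above, the sketch below learns a one-feature threshold rule from labelled examples. The colour feature, its values and the labels are all invented; Galaxy Zoo data mining uses far richer features and algorithms.

```python
def learn_threshold(values, labels):
    """Pick the cut on a single feature that best separates two classes
    (predict label 1 above the cut), by exhaustive search over midpoints."""
    pairs = sorted(zip(values, labels))
    candidates = [(pairs[i][0] + pairs[i + 1][0]) / 2 for i in range(len(pairs) - 1)]
    best_cut, best_acc = None, -1.0
    for cut in candidates:
        acc = sum((v > cut) == bool(y) for v, y in pairs) / len(pairs)
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut, best_acc

# Hypothetical g-r colours: ellipticals (label 1) tend to be redder than spirals (label 0).
colours = [0.35, 0.40, 0.45, 0.55, 0.80, 0.85, 0.90, 0.95]
labels  = [0,    0,    0,    0,    1,    1,    1,    1   ]

cut, acc = learn_threshold(colours, labels)
print(f"learned rule: elliptical if g-r > {cut:.3f} (training accuracy {acc:.0%})")
```

A rule learned this way from a labelled training set (here, citizen-science votes) can then be applied cheaply to billions of unlabelled survey objects.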

  18. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  20. DBAASP v.2: an enhanced database of structure and antimicrobial/cytotoxic activity of natural and synthetic peptides

    PubMed Central

    Pirtskhalava, Malak; Gabrielian, Andrei; Cruz, Phillip; Griggs, Hannah L.; Squires, R. Burke; Hurt, Darrell E.; Grigolava, Maia; Chubinidze, Mindia; Gogoladze, George; Vishnepolsky, Boris; Alekseev, Vsevolod; Rosenthal, Alex; Tartakovsky, Michael

    2016-01-01

    Antimicrobial peptides (AMPs) are anti-infectives that may represent a novel and untapped class of biotherapeutics. Increasing interest in AMPs means that new peptides (natural and synthetic) are discovered faster than ever before. We describe herein a new version of the Database of Antimicrobial Activity and Structure of Peptides (DBAASPv.2, which is freely accessible at http://dbaasp.org). This iteration of the database reports chemical structures and empirically-determined activities (MICs, IC50, etc.) against more than 4200 specific target microbes for more than 2000 ribosomal, 80 non-ribosomal and 5700 synthetic peptides. Of these, the vast majority are monomeric, but nearly 200 of these peptides are found as homo- or heterodimers. More than 6100 of the peptides are linear, but about 515 are cyclic and more than 1300 have other intra-chain covalent bonds. More than half of the entries in the database were added after the resource was initially described, which reflects the recent sharp uptick of interest in AMPs. New features of DBAASPv.2 include: (i) user-friendly utilities and reporting functions, (ii) a ‘Ranking Search’ function to query the database by target species and return a ranked list of peptides with activity against that target and (iii) structural descriptions of the peptides derived from empirical data or calculated by molecular dynamics (MD) simulations. The three-dimensional structural data are critical components for understanding structure–activity relationships and for design of new antimicrobial drugs. We created more than 300 high-throughput MD simulations specifically for inclusion in DBAASP. The resulting structures are described in the database by novel trajectory analysis plots and movies. Another 200+ DBAASP entries have links to the Protein DataBank. All of the structures are easily visualized directly in the web browser. PMID:26578581

  2. Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database

    SciTech Connect

    Wang, Jy-An John

    2010-08-01

    Materials behavior caused by neutron irradiation under fission and/or fusion environments can hardly be understood without practical examination. An easily accessible material information system built on a large material database is necessary for the design of nuclear materials and for analyses or simulations of these phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data. The EDB contains power reactor pressure vessel surveillance data, material test reactor data, foreign reactor data (obtained through bilateral agreements authorized by the NRC), and fracture toughness data. This report describes the lessons learned from building the EDB program and the associated database management activity with respect to material database design methodology, architecture, and the embedded QA protocol. The report also covers the development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM), a comparison of the EDB and IAEA IDRPVM databases, and the recommended database QA protocol and database infrastructure.

  3. Supporting the IT Service Management Processes at the Technische Universität München with a Configuration Management Database

    NASA Astrophysics Data System (ADS)

    Knittl, Silvia

    Due to increasing integration and IT support, university processes in teaching and administration require so-called business alignment of IT and thus more professional IT service management (ITSM). The IT Infrastructure Library (ITIL), with its descriptions of processes proven in practice, has established itself as the de facto standard in ITSM. One such process is configuration management. It maps the IT infrastructure as configuration items and their relationships in a tool called a Configuration Management Database (CMDB), thereby supporting ITSM. This report describes experiences with the prototype introduction of a CMDB at the Technische Universität München.
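The CMDB concept described here, configuration items plus their relationships, supporting ITSM processes such as impact analysis, can be sketched as a small graph. All item names and relationship types below are invented for illustration; a production CMDB has a far richer schema.

```python
from collections import defaultdict

# Configuration items (CIs): id -> type. Hypothetical names.
cis = {"srv-01": "server", "db-exams": "database", "app-portal": "web application"}

# Typed relationships between CIs: (source, relation, target).
relations = [
    ("app-portal", "runs_on", "srv-01"),
    ("app-portal", "depends_on", "db-exams"),
    ("db-exams", "runs_on", "srv-01"),
]

def impacted_by(ci):
    """Which CIs are (transitively) affected if `ci` fails? Reverse-dependency walk."""
    rev = defaultdict(set)
    for src, _, dst in relations:
        rev[dst].add(src)
    seen, stack = set(), [ci]
    while stack:
        node = stack.pop()
        for dependent in rev[node]:
            if dependent not in seen:
                seen.add(dependent)
                stack.append(dependent)
    return seen

print(impacted_by("srv-01"))  # everything that runs on or depends on the server
```

This reverse-dependency query is exactly the kind of ITSM support (e.g. for change and incident management) that motivates maintaining a CMDB.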

  4. Assessing urology and nephrology research activity in Arab countries using ISI web of science bibliometric database

    PubMed Central

    2014-01-01

    Background Bibliometric analysis is increasingly being used for research assessment. The main objective of this study was to assess research output in the “Urology and Nephrology” subject category from the Arab countries. Original scientific articles or reviews published from the 21 Arab countries in the “Urology and Nephrology” subject category were screened using the ISI Web of Science database. Research productivity was evaluated based on a methodology developed and used in other bibliometric studies by analyzing annual productivity, journal names, citations, the top 10 active institutions and authors, as well as country contributions to urology and nephrology research. Results Three thousand and seventy-six documents in the “urology and nephrology” subject category were retrieved from 104 journals. This represents 1.4% of the global research output in “urology and nephrology”. Four hundred and two documents (12.66%) were published in Annales D Urologie Journal. The h-index of the retrieved documents was 57. The total number of citations, at the time of data analysis, was 30401, with an average citation of 9.57 per document. Egypt, with a total publication count of 1284 (40.43%), ranked first among the Arab countries in the “urology and nephrology” subject category. Mansoura University in Egypt was the most productive institution with a total of 561 (15.33%) documents. Arab researchers collaborated most with researchers from the United States of America (226; 7.12%) in urology and nephrology research. Conclusion The present data reveal a good contribution of some Arab countries to the field of “urology and nephrology”. More efforts are needed by some other Arab countries to bridge the gap in urology and nephrology research. Overall, the quality of urology/nephrology research is considered relatively high as measured by the h-index. Cooperation in urology/nephrology research should be encouraged in the Arab world to bridge the gap with developed countries. PMID:24758477
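The h-index used above (57 for the retrieved document set) is straightforward to compute from a list of per-document citation counts: it is the largest h such that at least h documents have h or more citations each. The sample citation counts below are invented.

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for a small document set.
sample = [25, 8, 5, 3, 3, 2, 1, 0]
print(h_index(sample))  # 3: at least 3 papers have 3 or more citations each
```

Note that the h-index is insensitive to a few very highly cited outliers, which is one reason bibliometric studies report it alongside total and average citations.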

  5. Inland wetlands legislation and management. (Latest citations from the NTIS Bibliographic database). Published Search

    SciTech Connect

    Not Available

    1993-11-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 250 citations and includes a subject term index and title list.)

  6. Inland wetlands legislation and management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-03-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  7. Railroad management planning. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1993-01-01

    The bibliography contains citations concerning railroad management techniques and their impact on operations. Topics include freight statistics, impacts on communities, and yard operations. Forecasts of future trends and government policies regarding railroad operations are also discussed. (Contains a minimum of 76 citations and includes a subject term index and title list.)

  8. Content-Based Management of Image Databases in the Internet Age

    ERIC Educational Resources Information Center

    Kleban, James Theodore

    2010-01-01

    The Internet Age has seen the emergence of richly annotated image data collections numbering in the billions of items. This work makes contributions in three primary areas which aid the management of this data: image representation, efficient retrieval, and annotation based on content and metadata. The contributions are as follows. First,…

  9. Avibase – a database system for managing and organizing taxonomic concepts

    PubMed Central

    Lepage, Denis; Vaidya, Gaurav; Guralnick, Robert

    2014-01-01

    Abstract Scientific names of biological entities offer an imperfect resolution of the concepts that they are intended to represent. Often they are labels applied to entities ranging from entire populations to individual specimens representing those populations, even though such names only unambiguously identify the type specimen to which they were originally attached. Thus the real-life referents of names are constantly changing as biological circumscriptions are redefined and thereby alter the sets of individuals bearing those names. This problem is compounded by other characteristics of names that make them ambiguous identifiers of biological concepts, including emendations, homonymy and synonymy. Taxonomic concepts have been proposed as a way to address issues related to scientific names, but they have yet to receive broad recognition or implementation. Some efforts have been made towards building systems that address these issues by cataloguing and organizing taxonomic concepts, but most are still in conceptual or proof-of-concept stage. We present the on-line database Avibase as one possible approach to organizing taxonomic concepts. Avibase has been successfully used to describe and organize 844,000 species-level and 705,000 subspecies-level taxonomic concepts across every major bird taxonomic checklist of the last 125 years. The use of taxonomic concepts in place of scientific names, coupled with efficient resolution services, is a major step toward addressing some of the main deficiencies in the current practices of scientific name dissemination and use. PMID:25061375
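The core idea above, resolving a scientific name to a taxonomic concept in the context of a particular checklist rather than treating the bare name as an identifier, can be sketched as follows. All identifiers, checklist names and circumscriptions are invented for illustration and do not reflect Avibase's actual schema.

```python
# Concept id -> circumscription description (invented).
concepts = {
    "avb-001": "Larus argentatus sensu lato (includes smithsonianus)",
    "avb-002": "Larus argentatus sensu stricto (European birds only)",
    "avb-003": "Larus smithsonianus (split from argentatus)",
}

# The meaning of a name depends on which checklist applied it:
# (scientific name, checklist) -> concept id.
name_usages = {
    ("Larus argentatus", "Checklist 1990"): "avb-001",
    ("Larus argentatus", "Checklist 2010"): "avb-002",
    ("Larus smithsonianus", "Checklist 2010"): "avb-003",
}

def resolve(name, checklist):
    """Resolve a scientific name to a taxonomic concept in the context of a checklist."""
    concept_id = name_usages[(name, checklist)]
    return concept_id, concepts[concept_id]

print(resolve("Larus argentatus", "Checklist 1990"))
print(resolve("Larus argentatus", "Checklist 2010"))
```

The same string, "Larus argentatus", resolves to two different concepts depending on the checklist, which is precisely the ambiguity that concept-based resolution services are designed to remove.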

  10. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    SciTech Connect

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to the distribution of that total by agency and perhaps the distribution of the total by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided, including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock has changed for the Department of Energy from 2000 through 2008.

  11. The Recovery of a Clinical Database Management System after Destruction by Fire *

    PubMed Central

    Covvey, H.D.; McAlister, N.H.; Greene, J.; Wigle, E.D.

    1981-01-01

    In August 1980 a fire in the Cardiovascular Unit at Toronto General Hospital severely damaged the physical plant and rendered all on-site equipment unrecoverable. Among the hardware items in the fire was the computer which supports our cardiovascular database system. Within hours after the fire it was determined that the computer was no longer serviceable. Beyond off-site back-up tapes, there was the possibility that recent records on the computer had suffered a similar fate. Immediate procedures were instituted to obtain a replacement computer system and to clean media to permit data recovery. Within 2 months a partial system was supporting all users, and all data was recovered and being used. The destructive potential of a fire is rarely seriously considered relative to computer equipment in our clinical environments. Full-replacement value insurance; an excellent equipment supplier with the capacity to respond to an emergency; backup and recovery procedures with off-site storage; and dedicated staff are key hedges against disaster.

  12. The GTN-P Data Management System: A central database for permafrost monitoring parameters of the Global Terrestrial Network for Permafrost (GTN-P) and beyond

    NASA Astrophysics Data System (ADS)

    Lanckman, Jean-Pierre; Elger, Kirsten; Karlsson, Ævar Karl; Johannsson, Halldór; Lantuit, Hugues

    2013-04-01

    Permafrost is a direct indicator of climate change and has been identified as an Essential Climate Variable (ECV) by the global observing community. The monitoring of permafrost temperatures, active-layer thicknesses and other parameters has been performed for several decades, but it was only brought together within the Global Terrestrial Network for Permafrost (GTN-P) in the 1990s, including the development of measurement protocols to provide standardized data. GTN-P is the primary international observing network for permafrost, sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS), and managed by the International Permafrost Association (IPA). All GTN-P data is subject to an "open data policy" with free data access via the World Wide Web. The existing data, however, is far from being homogeneous: it is not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data has not been used by as many researchers as intended by the initiators of the programs. While the monitoring of many other ECVs has been tackled by organized international networks (e.g. FLUXNET), there is still no central database for all permafrost-related parameters. The European Union project PAGE21 created opportunities to develop this central database for permafrost monitoring parameters of GTN-P during the duration of the project and beyond. The database aims to be the one location where the researcher can find data, metadata, and information on all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with the GTN-P and the IPA.
The general framework of the GTN-P DMS is based on an object oriented model (OOM), open for as many parameters as possible, and

  13. System hazards in managing laboratory test requests and results in primary care: medical protection database analysis and conceptual model

    PubMed Central

    Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John

    2015-01-01

    Objectives To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients. To integrate these data with other published evidence sources to inform design of a systems-based conceptual model of related hazards. Design A retrospective database analysis. Setting General practices in the UK and Ireland. Participants 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Main outcome measures Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. Results CRSA visits were undertaken to 778 UK and Ireland general practices of which a range of systems hazards were recorded across the laboratory test ordering and results management systems in 647 practices (83.2%). A total of 45 discrete hazard categories were identified with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was the inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the ‘postanalytical test stage’ (n=702, 43.8%), followed closely by ‘communication outcomes issues’ (n=628, 39.1%). Conclusions Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. PMID:26614621

  14. Management of Reclaimed Produced Water in California Enhanced with the Expanded U.S. Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Kharaka, Y. K.; Reidy, M. E.; Conaway, C. H.; Thordsen, J. J.; Rowan, E. L.; Engle, M.

    2015-12-01

    In California, in 2014, every barrel of oil produced also produced 16 barrels of water. Approximately 3.2 billion barrels of water were co-produced with California oil in 2014. Half of California's produced water is generally used for steam and water injection for enhanced oil recovery. The other half (~215,000 acre-feet of water) is available for potential reuse. Concerns about the severe drought, groundwater depletion, and contamination have prompted petroleum operators and water districts to examine the recycling of produced water. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water reuse. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). Since a great proportion of California petroleum wells have produced water with relatively low salinity (generally 10,000-40,000 mg/L TDS), reclaiming produced water could be important as a drought mitigation strategy, especially in the parched southern San Joaquin Valley with many oil fields. The USGS Produced Waters Geochemical Database, available at http://eerscmap.usgs.gov/pwapp, will facilitate studies on the management of produced water for reclamation in California. Expanding on the USGS 2002 database, we have more accurately located California wells. We have added new data for 300 wells in the Sacramento Valley, San Joaquin Valley and the Los Angeles Basin for a total of ~ 1100 wells in California. In addition to the existing (2002) geochemical analyses of major ions and total dissolved solids, the new data also include geochemical analyses of minor ions and stable isotopes. We have added an interactive web map application which allows the user to filter data on chosen fields (e.g. TDS < 35,000 mg/L). Using the web map application as well as more in-depth investigation on the full data set can provide critical insight for better management of produced waters in water
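A filter such as the TDS < 35,000 mg/L example above can be sketched in a few lines. The well records and field names below are invented for illustration; the actual USGS database schema differs.

```python
# Hypothetical produced-water well records (TDS = total dissolved solids, mg/L).
wells = [
    {"well": "SJV-01", "basin": "San Joaquin", "tds_mg_l": 18000},
    {"well": "SJV-02", "basin": "San Joaquin", "tds_mg_l": 32000},
    {"well": "LA-01",  "basin": "Los Angeles", "tds_mg_l": 41000},
    {"well": "SAC-01", "basin": "Sacramento",  "tds_mg_l": 9000},
]

def filter_by_tds(records, max_tds):
    """Keep wells whose produced water is fresh enough to consider for reuse."""
    return [r for r in records if r["tds_mg_l"] < max_tds]

candidates = filter_by_tds(wells, 35000)
print([r["well"] for r in candidates])  # wells below the 35,000 mg/L cut
```

Screening on salinity like this is the first step in identifying produced water that might be reclaimed for irrigation, municipal or industrial use.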

  15. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  16. Knowledge Management in Cardiac Surgery: The Second Tehran Heart Center Adult Cardiac Surgery Database Report

    PubMed Central

    Abbasi, Kyomars; Karimi, Abbasali; Abbasi, Seyed Hesameddin; Ahmadi, Seyed Hossein; Davoodi, Saeed; Babamahmoodi, Abdolreza; Movahedi, Namdar; Salehiomran, Abbas; Shirzad, Mahmood; Bina, Peyvand

    2012-01-01

    Background: The Adult Cardiac Surgery Databank (ACSD) of Tehran Heart Center was established in 2002 with a view to providing clinical prediction rules for outcomes of cardiac procedures, developing risk score systems, and devising clinical guidelines. This is a general analysis of the collected data. Methods: All the patients referred to Tehran Heart Center for any kind of heart surgery between 2002 and 2008 were included, and their demographic, medical, clinical, operative, and postoperative data were gathered. This report presents general information as well as in-hospital mortality rates regarding all the cardiac procedures performed in the above time period. Results: There were 24959 procedures performed: 19663 (78.8%) isolated coronary artery bypass grafting surgeries (CABGs); 1492 (6.0%) isolated valve surgeries; 1437 (5.8%) CABGs concomitant with other procedures; 832 (3.3%) CABGs combined with valve surgeries; 722 (2.9%) valve surgeries concomitant with other procedures; 545 (2.2%) surgeries other than CABG or valve surgery; and 267 (1.1%) CABGs concomitant with valve and other types of surgery. The overall mortality was 205 (1.04%), with the lowest mortality rate (0.47%) in the isolated CABGs and the highest (4.49%) in the CABGs concomitant with valve surgeries and other types of surgery. Meanwhile, the overall mortality rate was higher in the female patients than in the males (1.90% vs. 0.74%, respectively). Conclusion: Isolated CABG was the most prevalent procedure at our center with the lowest mortality rate. However, the overall mortality was more prevalent in our female patients. This database can serve as a platform for the participation of the other countries in the region in the creation of a regional ACSD. PMID:23304179

  17. Contemporary Trends in the Immediate Surgical Management of Renal Trauma Using A National Database

    PubMed Central

    McClung, Christopher; Hotaling, James M.; Wang, Jin; Wessells, Hunter; Voelzke, Bryan B.

    2013-01-01

Background The National Trauma Data Bank was utilized to analyze open surgical management of renal trauma during the first 24 hours of hospital admission, excluding those who were treated with conservative measures. A descriptive analysis of initial management trends following renal trauma was also performed as a secondary analysis. Methods Using the NTDB, patients with renal injuries were identified, and AIS codes were stratified to a corresponding AAST renal injury grade. Trends in initial management were assessed using the following initial treatment categories: observation, minimally invasive surgery, and open renal surgery. Analysis of initial open surgery was further examined according to etiology of injury (blunt vs. penetrating), type of open renal surgery, concomitant abdominal surgery, patient demographics, and time to surgery. Results In total, 9,002 renal injuries (0.3%) mapped to an AAST renal grade. Of these, 1,183 patients underwent open surgery for their renal injury in the first 24 hours. There were 773 penetrating and 410 blunt injuries within this cohort. The majority of surgical patients sustained a high-grade renal injury (AAST 4–5: 64%). The overall nephrectomy rate in the first 24 hours was 54% and 83% for the penetrating and blunt groups, respectively. While the overall nephrectomy rate for AAST 1–3 renal injuries in the first 24 hours was low (1.8%), the nephrectomy rate was higher in the setting of an exploratory laparotomy (30%). Of those undergoing renal surgery in the first 24 hours, 86% had concomitant surgery performed for other abdominal injuries. Mean time from ED presentation to surgery was less for penetrating trauma. Conclusions Among patients requiring open surgery for renal trauma within 24 hours of admission, nephrectomy is the most common surgery. Continued efforts to reduce nephrectomy rates following abdominal trauma are necessary. Level of Evidence III (exploratory cohort analysis, nonrandomized) PMID:24064872

  18. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

'ADMS+/-' is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of work station data base systems, designated ADMS-; no communication takes place between these work stations. The use of this system radically decreases the response time of locally processed queries, since each work station runs in single-user mode and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy used reduces the overhead of update synchronization in message traffic.
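The deferred-update idea in the abstract can be illustrated with a minimal sketch. All class and variable names here are hypothetical, not from the ADMS+/- design: the mainframe keeps a sequenced update log, and each workstation refreshes its downloaded copy by pulling only the updates it has not yet applied, rather than being notified synchronously.

```python
# Hypothetical sketch of a deferred-update strategy: the mainframe keeps a
# sequenced update log; a workstation caches a downloaded portion and pulls
# only the log entries it has not yet seen when it refreshes.
class Mainframe:
    def __init__(self):
        self.data = {}   # authoritative copy
        self.log = []    # sequenced update log: (seq, key, value)

    def update(self, key, value):
        self.data[key] = value
        self.log.append((len(self.log), key, value))

class Workstation:
    def __init__(self, mainframe, keys):
        self.mf = mainframe
        self.keys = set(keys)
        self.cache = {k: mainframe.data[k] for k in keys if k in mainframe.data}
        self.synced_to = len(mainframe.log)

    def query(self, key):
        # Local, single-user read: no round trip, no dynamic security check.
        return self.cache.get(key)

    def refresh(self):
        # Deferred update: apply only log entries newer than the last sync.
        for seq, key, value in self.mf.log[self.synced_to:]:
            if key in self.keys:
                self.cache[key] = value
        self.synced_to = len(self.mf.log)

mf = Mainframe()
mf.update("a", 1)
ws = Workstation(mf, ["a"])
mf.update("a", 2)        # workstation is stale until it refreshes
stale = ws.query("a")    # -> 1
ws.refresh()
fresh = ws.query("a")    # -> 2
```

The point of the sketch is the trade-off the abstract describes: local queries never block on the mainframe, at the cost of reading slightly stale data between refreshes.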

  19. DATA COLLECTION AND DATABASE DEVELOPMENT FOR CLEAN COAL TECHNOLOGY BY-PRODUCT CHARACTERISTICS AND MANAGEMENT PRACTICES

    SciTech Connect

    1998-10-01

    The primary goal of this task is to provide an easily accessible compilation of characterization information on clean coal technology (CCT) by-products to government agencies and industry to facilitate sound regulatory and management decisions. Supporting objectives are to (1) fully utilize information from previous DOE projects, (2) coordinate with industry and other research groups, (3) focus on by-products from pressurized fluidized-bed combustion (PFBC) and gasification, and (4) provide information relevant to the EPA evaluation criteria for the decision on the Resource Conservation and Recovery Act (RCRA) status of fluidized-bed combustion (FBC) by-products.

  20. Development of a database for prompt gamma-ray neutron activation analysis: Summary report of the third research coordination meeting

    SciTech Connect

    Lindstrom, Richard M.; Firestone, Richard B.; Pavi, ???

    2003-04-01

    The main discussions and conclusions from the Third Co-ordination Meeting on the Development of a Database for Prompt Gamma-ray Neutron Activation Analysis are summarized in this report. All results were reviewed in detail, and the final version of the TECDOC and the corresponding software were agreed upon and approved for preparation. Actions were formulated with the aim of completing the final version of the TECDOC and associated software by May 2003.

  1. Being PRO-ACTive: What can a Clinical Trial Database Reveal About ALS?

    PubMed

    Zach, Neta; Ennist, David L; Taylor, Albert A; Alon, Hagit; Sherman, Alexander; Kueffner, Robert; Walker, Jason; Sinani, Ervin; Katsovskiy, Igor; Cudkowicz, Merit; Leitner, Melanie L

    2015-04-01

    Advancing research and clinical care, and conducting successful and cost-effective clinical trials requires characterizing a given patient population. To gather a sufficiently large cohort of patients in rare diseases such as amyotrophic lateral sclerosis (ALS), we developed the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) platform. The PRO-ACT database currently consists of >8600 ALS patient records from 17 completed clinical trials, and more trials are being incorporated. The database was launched in an open-access mode in December 2012; since then, >400 researchers from >40 countries have requested the data. This review gives an overview on the research enabled by this resource, through several examples of research already carried out with the goal of improving patient care and understanding the disease. These examples include predicting ALS progression, the simulation of future ALS clinical trials, the verification of previously proposed predictive features, the discovery of novel predictors of ALS progression and survival, the newly identified stratification of patients based on their disease progression profiles, and the development of tools for better clinical trial recruitment and monitoring. Results from these approaches clearly demonstrate the value of large datasets for developing a better understanding of ALS natural history, prognostic factors, patient stratification, and more. The increasing use by the community suggests that further analyses of the PRO-ACT database will continue to reveal more information about this disease that has for so long defied our understanding. PMID:25613183

  2. Idaho Senior Center Activities, Activity Participation Level, and Managers' Perceptions of Activity Success.

    ERIC Educational Resources Information Center

    Girvan, James T.; Harris, Frances

    A survey completed by managers of 77 senior centers in Idaho revealed that meals, blood pressure screening, and games and trips were the most successful activities offered. Alzheimer's support groups, library books for loan, and exercise classes were the least successful. Possible reasons for the success or failure of these activities were…

  3. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

Greater Lyon (1.3 million inhabitants, 650 km²), France, is subject to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have since 1988 developed a database built by field teams, which specifically identifies all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, those of the emergency services and of the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two bases is not negligible and usefully complements the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration, and height of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, the floods recorded in the database are compared spatially, using a GIS, with flooding simulated from the sewer network model (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flooding risk.

  4. Transfer of Physical and Hydraulic Properties Databases to the Hanford Environmental Information System - PNNL Remediation Decision Support Project, Task 1, Activity 6

    SciTech Connect

    Rockhold, Mark L.; Middleton, Lisa A.

    2009-03-31

This report documents the requirements for transferring physical and hydraulic property data compiled by PNNL into the Hanford Environmental Information System (HEIS). The Remediation Decision Support (RDS) Project is managed by Pacific Northwest National Laboratory (PNNL) to support Hanford Site waste management and remedial action decisions by the U.S. Department of Energy and one of their current site contractors, CH2M-Hill Plateau Remediation Company (CHPRC). The objective of Task 1, Activity 6 of the RDS project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into HEIS, and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. These physical and hydraulic property data are used to estimate parameters for analytical and numerical flow and transport models that are used for site risk assessments and evaluation of remedial action alternatives. In past years, efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and usable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the original objectives of this activity on the RDS project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database

  5. Curating and Preserving the Big Canopy Database System: an Active Curation Approach using SEAD

    NASA Astrophysics Data System (ADS)

    Myers, J.; Cushing, J. B.; Lynn, P.; Weiner, N.; Ovchinnikova, A.; Nadkarni, N.; McIntosh, A.

    2015-12-01

    Modern research is increasingly dependent upon highly heterogeneous data and on the associated cyberinfrastructure developed to organize, analyze, and visualize that data. However, due to the complexity and custom nature of such combined data-software systems, it can be very challenging to curate and preserve them for the long term at reasonable cost and in a way that retains their scientific value. In this presentation, we describe how this challenge was met in preserving the Big Canopy Database (CanopyDB) system using an agile approach and leveraging the Sustainable Environment - Actionable Data (SEAD) DataNet project's hosted data services. The CanopyDB system was developed over more than a decade at Evergreen State College to address the needs of forest canopy researchers. It is an early yet sophisticated exemplar of the type of system that has become common in biological research and science in general, including multiple relational databases for different experiments, a custom database generation tool used to create them, an image repository, and desktop and web tools to access, analyze, and visualize this data. SEAD provides secure project spaces with a semantic content abstraction (typed content with arbitrary RDF metadata statements and relationships to other content), combined with a standards-based curation and publication pipeline resulting in packaged research objects with Digital Object Identifiers. Using SEAD, our cross-project team was able to incrementally ingest CanopyDB components (images, datasets, software source code, documentation, executables, and virtualized services) and to iteratively define and extend the metadata and relationships needed to document them. We believe that both the process, and the richness of the resultant standards-based (OAI-ORE) preservation object, hold lessons for the development of best-practice solutions for preserving scientific data in association with the tools and services needed to derive value from it.

  6. mycoCLAP, the database for characterized lignocellulose-active proteins of fungal origin: resource and text mining curation support

    PubMed Central

    Strasser, Kimchi; McDonnell, Erin; Nyaga, Carol; Wu, Min; Wu, Sherry; Almeida, Hayda; Meurs, Marie-Jean; Kosseim, Leila; Powlowski, Justin; Butler, Greg; Tsang, Adrian

    2015-01-01

    Enzymes active on components of lignocellulosic biomass are used for industrial applications ranging from food processing to biofuels production. These include a diverse array of glycoside hydrolases, carbohydrate esterases, polysaccharide lyases and oxidoreductases. Fungi are prolific producers of these enzymes, spurring fungal genome sequencing efforts to identify and catalogue the genes that encode them. To facilitate the functional annotation of these genes, biochemical data on over 800 fungal lignocellulose-degrading enzymes have been collected from the literature and organized into the searchable database, mycoCLAP (http://mycoclap.fungalgenomics.ca). First implemented in 2011, and updated as described here, mycoCLAP is capable of ranking search results according to closest biochemically characterized homologues: this improves the quality of the annotation, and significantly decreases the time required to annotate novel sequences. The database is freely available to the scientific community, as are the open source applications based on natural language processing developed to support the manual curation of mycoCLAP. Database URL: http://mycoclap.fungalgenomics.ca PMID:25754864
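The abstract's key retrieval feature is ranking search results by the closest biochemically characterized homologue. The sketch below is illustrative only and is not mycoCLAP's actual method: it scores each characterized sequence against a query with a toy positional-identity measure and sorts descending, which is the general shape of homologue-based ranking.

```python
# Illustrative-only sketch of ranking by closest characterized homologue.
# The identity() metric and the sequence names are invented for the example;
# real systems would use a proper alignment-based similarity score.
def identity(a, b):
    """Fraction of matching positions over the shorter sequence (toy metric)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

characterized = {
    "endoglucanase_X": "MKLAVV",   # hypothetical characterized enzymes
    "xylanase_Y": "MKTWQP",
}

def rank(query):
    """Return characterized entries ordered from most to least similar."""
    return sorted(characterized,
                  key=lambda name: identity(query, characterized[name]),
                  reverse=True)

order = rank("MKLAVQ")   # closest characterized homologue listed first
```

Annotating a novel sequence then reduces to inspecting the top-ranked hit and its curated biochemical data, which is why the abstract reports faster, higher-quality annotation.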

  7. A New Activity-Based Financial Cost Management Method

    NASA Astrophysics Data System (ADS)

    Qingge, Zhang

The standard activity-based financial cost management model is a new model of financial cost management that builds on the standard cost system and activity-based costing and integrates the advantages of the two. By taking R&D expenses as the accounting starting point and after-sale service expenses as the terminal point, it covers the whole producing and operating process, the whole activity chain, and the value chain, and thereby provides more accurate and more adequate cost information in the service of internal management and decision-making.

  8. Active management of food allergy: an emerging concept.

    PubMed

    Anagnostou, Katherine; Stiefel, Gary; Brough, Helen; du Toit, George; Lack, Gideon; Fox, Adam T

    2015-04-01

    IgE-mediated food allergies are common and currently there is no cure. Traditionally, management has relied upon patient education, food avoidance and the provision of an emergency medication plan. Despite this, food allergy can significantly impact on quality of life. Therefore, in recent years, evolving research has explored alternative management strategies. A more active approach to management is being adopted, which includes early introduction of potentially allergenic foods, anticipatory testing, active monitoring, desensitisation to food allergens and active risk management. This review will discuss these areas in turn. PMID:25378378

  9. The intelligent database machine

    NASA Technical Reports Server (NTRS)

    Yancey, K. E.

    1985-01-01

The IDM 500 database machine was compared with the Oracle database to determine whether the IDM 500 would better serve the needs of the MSFC database management system than Oracle. The two were compared and the performance of the IDM was studied. The implementations that work best on each database are indicated. The choice is left to the database administrator.

  10. VIEWCACHE: An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Sellis, Timoleon

    1991-01-01

    The objective is to illustrate the concept of incremental access to distributed databases. An experimental database management system, ADMS, which has been developed at the University of Maryland, in College Park, uses VIEWCACHE, a database access method based on incremental search. VIEWCACHE is a pointer-based access method that provides a uniform interface for accessing distributed databases and catalogues. The compactness of the pointer structures formed during database browsing and the incremental access method allow the user to search and do inter-database cross-referencing with no actual data movement between database sites. Once the search is complete, the set of collected pointers pointing to the desired data are dereferenced.
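The pointer-based idea in the abstract can be sketched in a few lines. This is a hedged illustration, not the VIEWCACHE implementation: browsing accumulates (site, record-id) pointers rather than copying records between sites, and only when the search is complete are the pointers dereferenced to fetch the actual data.

```python
# Hedged sketch of pointer-based incremental access: cross-database
# browsing collects pointers (site, record-id) with no data movement;
# dereferencing happens once, after the search is finished.
sites = {  # hypothetical distributed catalogues
    "siteA": {1: {"name": "NGC 224", "type": "galaxy"},
              2: {"name": "Vega", "type": "star"}},
    "siteB": {7: {"name": "M13", "type": "cluster"}},
}

def browse(predicate):
    """Collect compact pointers to matching records; no records are copied."""
    return [(site, rid)
            for site, records in sites.items()
            for rid, rec in records.items()
            if predicate(rec)]

def dereference(pointers):
    """Fetch the actual records only for the final result set."""
    return [sites[site][rid] for site, rid in pointers]

ptrs = browse(lambda rec: rec["type"] != "star")
results = dereference(ptrs)
```

The design choice being illustrated: intermediate search results stay small (pointers only), so inter-database cross-referencing is cheap, and data moves across sites exactly once, at dereference time.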

  11. US - Former Soviet Union environmental management activities

    SciTech Connect

    1995-09-01

The Office of Environmental Management (EM) has been delegated responsibility for the US DOE's cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem require the identification of technologies and scientific expertise from domestic and foreign sources. This booklet makes comparisons and describes coordinated projects and workshops between the USA and the former Soviet Union.

  12. Database of Industrial Technological Information in Kanagawa : Networks for Technology Activities

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Shindo, Tadashi

This system is a database that requires participation by its members and whose premise is that all data in it are open. Aiming at free technological cooperation and exchange among industries, it was constructed by Kanagawa Prefecture in collaboration with enterprises located there. The input data comprise 36 items that technologically characterize each enterprise, such as major products, special and advantageous technologies, technologies wanted for cooperation, and facilities and equipment. They are expressed in up to 2,000 characters and written in natural language, including Kanji, except for some coded items. Twenty-four search items can be queried in natural language, so that in addition to interactive search procedures, including menu-driven ones, extensive searching is possible. The information service started in October 1986, covering data from 2,000 enterprises.

  13. Quarterly Briefing Book on Environmental and Waste Management Activities

    SciTech Connect

    Brown, M.C.

    1991-06-01

    The purpose of the Quarterly Briefing Book on Environmental and Waste Management Activities is to provide managers and senior staff at the US Department of Energy-Richland Operations Office and its contractors with timely and concise information on Hanford Site environmental and waste management activities. Each edition updates the information on the topics in the previous edition, deletes those determined not to be of current interest, and adds new topics to keep up to date with changing environmental and waste management requirements and issues. Section A covers current waste management and environmental restoration issues. In Section B are writeups on national or site-wide environmental and waste management topics. Section C has writeups on program- and waste-specific environmental and waste management topics. Section D provides information on waste sites and inventories on the site. 15 figs., 4 tabs.

  14. Contextualizing Solar Cycle 24: Report on the Development of a Homogenous Database of Bipolar Active Regions Spanning Four Cycles

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, A.; Werginz, Z. A.; DeLuca, M. D.; Vargas-Acosta, J. P.; Longcope, D. W.; Harvey, J. W.; Martens, P.; Zhang, J.; Vargas-Dominguez, S.; DeForest, C. E.; Lamb, D. A.

    2015-12-01

The solar cycle can be understood as a process that alternates the large-scale magnetic field of the Sun between poloidal and toroidal configurations. Although the process that transitions the solar cycle between toroidal and poloidal phases is still not fully understood, theoretical studies and observational evidence suggest that this process is driven by the emergence and decay of bipolar magnetic regions (BMRs) at the photosphere. Furthermore, the emergence of BMRs at the photosphere is the main driver behind solar variability and solar activity in general, making the study of their properties doubly important for heliospheric physics. However, in spite of their critical role, there is still no unified catalog of BMRs spanning multiple instruments and covering the entire period of systematic measurement of the solar magnetic field (i.e., 1975 to present). In this presentation we discuss an ongoing project to address this deficiency by applying our Bipolar Active Region Detection (BARD) code to full-disk magnetograms measured by the 512 (1975-1993) and SPMG (1992-2003) instruments at the Kitt Peak Vacuum Telescope (KPVT), SOHO/MDI (1996-2011), and SDO/HMI (2010-present). First we discuss the results of our revitalization of the 512 and SPMG KPVT data, then how the BARD code operates, and finally we report the results of our cross-calibration. The corrected and improved KPVT magnetograms will be made available through the National Solar Observatory (NSO) and Virtual Solar Observatory (VSO), including updated synoptic maps produced by running the corrected KPVT magnetograms through the SOLIS pipeline. The homogeneous active region database will be made public by the end of 2017, once it has reached a satisfactory level of quality and maturity. The Figure shows all bipolar active regions present in our database (as of Aug 2015), colored according to the sign of their leading polarity. Marker size is indicative of the total active region flux. Anti

  15. Implementation of a very large atmospheric correction lookup table for ASTER using a relational database management system

    NASA Astrophysics Data System (ADS)

    Murray, Alex T.; Eng, Bjorn T.; Thome, Kurtis J.

    1996-11-01

The advanced spaceborne thermal emission and reflection radiometer (ASTER) is designed to provide a high-resolution map of the Earth in the visible, near-infrared, and thermal spectral regions of the electromagnetic spectrum. The ASTER science team has developed several standard data product algorithms, but the most complex and computing-intensive of these is the estimation of surface radiance and reflectance values, which is done by modeling and correcting for the effects of the atmosphere. The algorithm for atmospheric correction in the visible bands sensed by ASTER calls for the use of a very large atmospheric correction lookup table (ACLUT). The ACLUT contains coefficients which describe atmospheric effects on ASTER data under various conditions. The parameters used to characterize the atmosphere and its effects on radiation in the ASTER bands include aerosol and molecular optical depth, aerosol size distribution, single scattering albedo, and solar, nadir view, and azimuth angles. The ACLUT coefficients are produced by thousands of runs of a radiative transfer code (RTC) program produced by Phil Slater and Kurt Thome of U. of A. The final version of the ACLUT is expected to be in the neighborhood of 10 gigabytes. The RDBMS Sybase is used to manage the process of generating the ACLUT as well as to host the table and service queries on it. Queries on the table are made using the ASTER band number and seven floating-point values as keys. The floating-point keys do not necessarily exactly match key values in the database, so the query involves a hierarchical closest-fit search. All aspects of the table implementation are described.
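The closest-fit search over floating-point keys can be sketched simply. The grids, table values, and parameter names below are invented for illustration (the real ACLUT has eight key dimensions and gigabytes of coefficients): each key dimension is tabulated on a grid, a query value is snapped to the nearest grid value per dimension, and the snapped tuple indexes the coefficient table.

```python
import bisect

# Hypothetical two-dimensional miniature of a closest-fit lookup table.
# Real keys would include optical depths, albedo, and several angles.
grids = {
    "optical_depth": [0.05, 0.10, 0.20, 0.40],
    "solar_zenith":  [0.0, 15.0, 30.0, 45.0, 60.0],
}

table = {  # (optical_depth, solar_zenith) -> illustrative coefficient
    (0.10, 30.0): 0.87,
    (0.20, 30.0): 0.81,
}

def snap(value, grid):
    """Return the grid value nearest to `value` (binary search, then compare)."""
    i = bisect.bisect_left(grid, value)
    candidates = grid[max(i - 1, 0): i + 1]
    return min(candidates, key=lambda g: abs(g - value))

def lookup(optical_depth, solar_zenith):
    """Snap each float key to its grid, then index the coefficient table."""
    key = (snap(optical_depth, grids["optical_depth"]),
           snap(solar_zenith, grids["solar_zenith"]))
    return table.get(key)

coeff = lookup(0.12, 28.0)   # snaps to the (0.10, 30.0) grid point
```

Snapping per dimension in a fixed order is what makes the search "hierarchical": each level narrows the candidate rows before the next key is considered, which maps naturally onto indexed RDBMS queries.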

  16. International Project Management Committee: Overview and Activities

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward

    2010-01-01

    This slide presentation discusses the purpose and composition of the International Project Management Committee (IMPC). The IMPC was established by members of 15 space agencies, companies and professional organizations. The goal of the committee is to establish a means to share experiences and best practices with space project/program management practitioners at the global level. The space agencies that are involved are: AEB, DLR, ESA, ISRO, JAXA, KARI, and NASA. The industrial and professional organizational members are Comau, COSPAR, PMI, and Thales Alenia Space.

  17. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software, with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
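The dual spatial/time retrieval for repeat-track altimetry can be sketched with two indexes. This is a hedged sketch of the general idea, not this package's design: measurements are keyed by (cycle, pass), so one ground track can be pulled across all repeat cycles (a "stack", the spatial domain) or all passes within one cycle can be pulled together (the time domain).

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for a stackfile store: the same record is
# indexed both by cycle and by pass number, so either axis is cheap to scan.
class StackStore:
    def __init__(self):
        self.by_cycle = defaultdict(dict)   # cycle -> pass -> measurements
        self.by_pass = defaultdict(dict)    # pass  -> cycle -> measurements

    def insert(self, cycle, pass_num, values):
        self.by_cycle[cycle][pass_num] = values
        self.by_pass[pass_num][cycle] = values

    def stack(self, pass_num):
        """All repeats of one ground track, ordered by cycle (spatial domain)."""
        return [v for _, v in sorted(self.by_pass[pass_num].items())]

    def cycle(self, cycle):
        """All passes measured within one repeat cycle (time domain)."""
        return [v for _, v in sorted(self.by_cycle[cycle].items())]

store = StackStore()
store.insert(1, 17, [2.1, 2.3])   # cycle 1, pass 17
store.insert(2, 17, [2.0, 2.4])   # same ground track, next cycle
store.insert(1, 18, [1.9])        # different pass, same cycle
```

A per-track stack is exactly what repeat-track analyses (mean profiles, sea-level anomalies) consume, which is why retrieval along either axis needs to be efficient.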

  18. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing project associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665
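The finely-grained, hierarchical permission model the abstract describes can be sketched as follows. This is a hypothetical illustration, not Djeen's actual implementation: roles are granted per (project, user), and a check walks up the project hierarchy so sub-projects inherit grants made higher up.

```python
# Hypothetical sketch of role-based permissions within a project hierarchy.
# Role names, projects, and users are invented for the example.
ROLE_RIGHTS = {
    "viewer":  {"read"},
    "editor":  {"read", "write"},
    "manager": {"read", "write", "grant"},
}

projects = {  # project -> parent project (None at the root)
    "lab": None,
    "lab/proteomics": "lab",
    "lab/proteomics/run42": "lab/proteomics",
}

grants = {  # (project, user) -> role
    ("lab", "alice"): "manager",
    ("lab/proteomics/run42", "bob"): "viewer",
}

def can(user, action, project):
    """Walk up the hierarchy until a grant for `user` is found."""
    while project is not None:
        role = grants.get((project, user))
        if role is not None:
            return action in ROLE_RIGHTS[role]
        project = projects[project]
    return False

ok = can("alice", "write", "lab/proteomics/run42")    # inherited from "lab"
denied = can("bob", "write", "lab/proteomics/run42")  # viewer: read only
```

The nearest grant wins, so a role assigned on a sub-project overrides one inherited from an ancestor, which is one simple way to get per-project, per-user, per-group granularity.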

  19. Databases for multilevel biophysiology research available at Physiome.jp

    PubMed Central

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications. PMID:26441671

  20. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    1996-01-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  1. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1993-10-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 250 citations and includes a subject term index and title list.)

  2. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1993-07-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 250 citations and includes a subject term index and title list.)

  3. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative data base management systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Previous administrative data base systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  4. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  5. Age-related patterns of vigorous-intensity physical activity in youth: The International Children's Accelerometry Database.

    PubMed

    Corder, Kirsten; Sharp, Stephen J; Atkin, Andrew J; Andersen, Lars B; Cardon, Greet; Page, Angie; Davey, Rachel; Grøntved, Anders; Hallal, Pedro C; Janz, Kathleen F; Kordas, Katarzyna; Kriemler, Susi; Puder, Jardena J; Sardinha, Luis B; Ekelund, Ulf; van Sluijs, Esther M F

    2016-12-01

    Physical activity declines during youth, but most evidence reports on combined moderate- and vigorous-intensity physical activity. We investigated how vigorous-intensity activity varies with age. We used cross-sectional data from 24,025 participants (5.0-18.0 y; from 20 studies in 10 countries, obtained 2008-2010) providing ≥ 1 day of accelerometer data in the International Children's Accelerometry Database (ICAD). Linear regression was used to investigate age-related patterns in vigorous-intensity activity; models included age (exposure) and adjustments for monitor wear-time and study. Moderate-intensity activity was examined for comparison. Interactions were used to investigate whether the age/vigorous-activity association differed by sex, weight status, ethnicity, maternal education and region. A 6.9% (95% CI 6.2, 7.5) relative reduction in mean vigorous-intensity activity with every year of age was observed; for moderate activity the relative reduction was 6.0% (5.6%, 6.4%). The age-related decrease in vigorous-intensity activity remained after adjustment for moderate activity. A larger age-related decrease in vigorous activity was observed for girls (-10.7%) versus boys (-2.9%), non-white (-12.9% to -9.4%) versus white individuals (-6.1%), lowest maternal education (high school; -2.0%) versus college/university (ns), and for overweight/obese (-6.1%) versus healthy-weight participants (-8.1%). In addition to larger annual decreases in vigorous-intensity activity, overweight/obese individuals, girls and North Americans had comparatively lower average vigorous-intensity activity at 5.0-5.9 y. Age-related declines in vigorous-intensity activity during youth appear relatively greater than those of moderate activity. However, due to a higher baseline, absolute moderate-intensity activity decreases more than vigorous. Overweight/obese individuals, girls, and North Americans appear especially in need of vigorous-intensity activity promotion due to low levels at 5
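    The relative reductions above come from a log-linear model: regressing ln(activity) on age gives a slope b, and the relative change per year is exp(b) − 1. A minimal sketch of this calculation, using made-up illustrative values rather than ICAD data:

```python
import math

# Hypothetical (age in years, vigorous activity in min/day) pairs -- illustrative only.
data = [(6, 30.0), (8, 26.0), (10, 22.5), (12, 19.5), (14, 17.0), (16, 14.8)]

# Ordinary least squares of ln(activity) on age.
xs = [age for age, _ in data]
ys = [math.log(v) for _, v in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

# A slope b on the log scale corresponds to a relative change of exp(b) - 1 per year.
relative_change = math.exp(slope) - 1
print(f"relative change per year: {relative_change:+.1%}")
```

    With these made-up values the fitted decline comes out at roughly 7% per year, in the same range as the reductions reported in the abstract.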

  6. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    NASA Technical Reports Server (NTRS)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  7. Multiple objective optimization for active sensor management

    NASA Astrophysics Data System (ADS)

    Page, Scott F.; Dolia, Alexander N.; Harris, Chris J.; White, Neil M.

    2005-03-01

    The performance of a multi-sensor data fusion system is inherently constrained by the configuration of the given sensor suite. Intelligent or adaptive control of sensor resources has been shown to offer improved fusion performance in many applications. Common approaches to sensor management select sensor observation tasks that are optimal in terms of a measure of information. However, optimising for information alone is inherently sub-optimal, as it does not take account of other system requirements such as stealth or sensor power conservation. We discuss the issues relating to developing a suite of performance metrics for optimising multi-sensor systems and propose some candidate metrics. In addition, it may not always be necessary to maximise information gain; in some cases small increases in information gain may come at the cost of large sensor resource requirements. Additionally, the problems of sensor tasking and placement are usually treated separately, leading to a lack of coherency between sensor management frameworks. We propose a novel approach based on a high-level decentralised information-theoretic sensor management architecture that unifies the processes of sensor tasking and sensor placement into a single framework. Sensors are controlled using a minimax multiple-objective optimisation approach in order to address probability of target detection, sensor power consumption, and sensor survivability whilst maintaining a target estimation covariance threshold. We demonstrate the potential of the approach through simulation of a multi-sensor target-tracking scenario and compare the results with a single-objective information-based approach.
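    The minimax selection step described above can be sketched as follows: each candidate sensor action is scored on several normalised objectives, actions violating the covariance threshold are discarded, and the action with the smallest worst-case objective is chosen. All action names and numbers here are hypothetical, not taken from the paper:

```python
# Toy minimax sensor tasking. Each candidate action carries three normalised
# objective costs in [0, 1] plus an estimated target covariance; actions whose
# covariance exceeds the threshold are infeasible. Hypothetical values only.
candidates = {
    # action: (detection_shortfall, power_cost, exposure_risk, est_covariance)
    "radar_high_power": (0.10, 0.90, 0.80, 0.5),
    "radar_low_power":  (0.30, 0.40, 0.50, 0.9),
    "passive_ir":       (0.45, 0.10, 0.05, 1.1),
    "handoff_to_uav":   (0.25, 0.35, 0.30, 0.8),
}
COV_THRESHOLD = 1.0

# Keep only feasible actions, then minimise the maximum objective cost.
feasible = {a: v[:3] for a, v in candidates.items() if v[3] <= COV_THRESHOLD}
best = min(feasible, key=lambda a: max(feasible[a]))
print(best, max(feasible[best]))  # the chosen action and its worst-case cost
```

    Unlike a purely information-driven rule, the minimax rule here rejects "radar_high_power" even though it has the best detection score, because its power and exposure costs dominate its worst case.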

  8. DataBase on Demand

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

    At CERN a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions traditionally done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, presently the open community version of MySQL and single-instance Oracle database servers. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  9. An expressed sequence tag database of T-cell-enriched activated chicken splenocytes: sequence analysis of 5251 clones.

    PubMed

    Tirunagaru, V G; Sofer, L; Cui, J; Burnside, J

    2000-06-01

    The cDNA and gene sequences of many mammalian cytokines and their receptors are known. However, corresponding information on avian cytokines is limited due to the lack of cross-species activity at the functional level or strong homology at the molecular level. To improve the efficiency of identifying cytokines and novel chicken genes, a directionally cloned cDNA library from T-cell-enriched activated chicken splenocytes was constructed, and the partial sequence of 5251 clones was obtained. Sequence clustering indicates that 2357 (42%) of the clones are present as a single copy, and 2961 are distinct clones, demonstrating the high level of complexity of this library. Comparisons of the sequence data with known DNA sequences in GenBank indicate that approximately 25% of the clones match known chicken genes, 39% have similarity to known genes in other species, and 11% had no match to any sequence in the database. Several previously uncharacterized chicken cytokines and their receptors were present in our library. This collection provides a useful database for cataloging genes expressed in T cells and a valuable resource for future investigations of gene expression in avian immunology. A chicken EST Web site (http://udgenome.ags.udel.edu/chickest/chick.htm) has been created to provide access to the data, and a set of unique sequences has been deposited with GenBank (Accession Nos. AI979741-AI982511). Our new Web site (http://www.chickest.udel.edu) will be active as of March 3, 2000, and will also provide keyword-searching capabilities for BLASTX and BLASTN hits of all our clones. PMID:10860659

  10. hERGAPDbase: a database documenting hERG channel inhibitory potentials and APD-prolongation activities of chemical compounds.

    PubMed

    Hishigaki, Haretsugu; Kuhara, Satoru

    2011-01-01

    Drug-induced QT interval prolongation is one of the most common reasons for the withdrawal of drugs from the market. In the past decade, at least nine drugs, i.e. terfenadine, astemizole, grepafloxacin, terodiline, droperidol, lidoflazine, sertindole, levomethadyl and cisapride, have been removed from the market or their use has been severely restricted because of drug-induced QT interval prolongation. Therefore, this irregularity is a major safety concern in the case of drugs submitted for regulatory approval. The most common mechanism of drug-induced QT interval prolongation may be drug-related inhibition of the human ether-à-go-go-related gene (hERG) channel, which subsequently results in prolongation of the cardiac action potential duration (APD). hERGAPDbase is a database of electrophysiological experimental data documenting potential hERG channel inhibitory actions and the APD-prolongation activities of chemical compounds. All data entries are manually collected from scientific papers and curated by a person. With hERGAPDbase, we aim to provide useful information for chemical and pharmacological scientists and enable easy access to electrophysiological experimental data on chemical compounds. Database URL: http://www.grt.kyushu-u.ac.jp/hergapdbase/. PMID:21586548

  11. US EPA’s Watershed Management Research Activities

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s Urban Watershed Management Branch (UWMB) is responsible for developing and demonstrating methods to manage the risk to public health, property and the environment from wet-weather flows (WWF) in urban watersheds. The activities are prim...

  12. Guide to good practices for line and training manager activities

    SciTech Connect

    1998-06-01

    The purpose of this guide is to provide direction for line and training managers in carrying out their responsibilities for training and qualifying personnel and to verify that existing training activities are effective.

  13. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  14. Nonexercise activity thermogenesis in obesity management.

    PubMed

    Villablanca, Pedro A; Alegria, Jorge R; Mookadam, Farouk; Holmes, David R; Wright, R Scott; Levine, James A

    2015-04-01

    Obesity is linked to cardiovascular disease. The global increase in sedentary lifestyle is an important factor contributing to the rising prevalence of the obesity epidemic. Traditionally, counseling has focused on moderate- to vigorous-intensity exercise, with disappointing results. Nonexercise activity thermogenesis (NEAT) is an important component of daily energy expenditure. It represents the common daily activities, such as fidgeting, walking, and standing. These high-effect NEAT movements could result in up to an extra 2000 kcal of expenditure per day beyond the basal metabolic rate, depending on body weight and level of activity. Implementing NEAT during leisure-time and occupational activities could be essential to maintaining a negative energy balance. NEAT can be applied by being upright, ambulating, and redesigning workplace and leisure-time environments to promote NEAT. The benefits of NEAT include not only the extra calories expended but also the reduced occurrence of the metabolic syndrome, cardiovascular events, and all-cause mortality. We believe that to overcome the obesity epidemic and its adverse cardiovascular consequences, NEAT should be part of the current medical recommendations. The content of this review is based on a literature search of PubMed and the Google search engine between January 1, 1960, and October 1, 2014, using the search terms physical activity, obesity, energy expenditure, nonexercise activity thermogenesis, and NEAT. PMID:25841254

  15. Biofuel Database

    National Institute of Standards and Technology Data Gateway

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  16. Nondestructive testing: Neutron radiography and neutron activation. (Latest citations from the INSPEC database). Published Search

    SciTech Connect

    1996-04-01

    The bibliography contains citations concerning the technology of neutron radiography and neutron activation for nondestructive testing of materials. The development and evaluation of neutron activation analysis and neutron diffraction examination of liquids and solids are presented. Citations also discuss nondestructive assay, verification, evaluation, and multielement analysis of biomedical, environmental, industrial, and geological materials. Nondestructive identification of chemical agents, explosives, weapons, and drugs in sealed containers is explored. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  17. Essential Learnings in Environmental Education--A Database for Building Activities and Programs.

    ERIC Educational Resources Information Center

    Ballard, Melissa, Comp.; Pandya, Mamata, Comp.

    The purpose of this book is to provide building blocks for designing and reviewing environmental education programs and activities. This handbook provides 600 basic concepts needed to attain the environmental education goals outlined at the Tbilisi, USSR, conference and generally agreed to be the fundamental core of quality environmental…

  18. Activated carbon: Utilization excluding industrial waste treatment. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-06-01

    The bibliography contains citations concerning the commercial use and theoretical studies of activated carbon. Topics include performance evaluations in water treatment processes, preparation and regeneration techniques, materials recovery, and pore structure studies. Adsorption characteristics for specific materials are discussed. Studies pertaining specifically to industrial waste treatment are excluded. (Contains 250 citations and includes a subject term index and title list.)

  19. Data-Based Active Learning in the Principles of Macroeconomics Course: A Mock FOMC Meeting

    ERIC Educational Resources Information Center

    Whiting, Cathleen

    2006-01-01

    The author presents an active-learning exercise for the introductory macroeconomics class in which students participate in a mock Federal Open Market Committee (FOMC) meeting. Preparation involves data gathering and writing both a research report and a policy recommendation. An FOMC meeting is simulated in which students give their policy…

  20. Electronic Databases.

    ERIC Educational Resources Information Center

    Williams, Martha E.

    1985-01-01

    Presents examples of bibliographic, full-text, and numeric databases. Also discusses how to access these databases online, aids to online retrieval, and several issues and trends (including copyright and downloading, transborder data flow, use of optical disc/videodisc technology, and changing roles in database generation and processing). (JN)

  1. Review of Medical Dispute Cases in the Pain Management in Korea: A Medical Malpractice Liability Insurance Database Study

    PubMed Central

    Moon, Hyun Seog

    2015-01-01

    Background Pain medicine often requires medico-legal involvement, even though diagnosis and treatments have improved considerably. Multiple guidelines for pain physicians contain many recommendations regarding interventional treatment. Unfortunately, no definite treatment guidelines exist because there is no complete consensus among individual guidelines. Pain intervention procedures are widely practiced and highly associated with adverse events and complications. However, a comprehensive, systematic review of medical-dispute cases (MDCs) in Korea has not yet been reported. The purpose of this article is to analyze the frequency and type of medical dispute activity undertaken by pain specialists in Korea. Methods Data on medical dispute cases were collected through the Korea Medical Association mutual aid and through a private medical malpractice liability insurance company. Data regarding the frequency and type of MDCs, along with brief case descriptions, were obtained. Results Pain in the lumbar region made up a major proportion of MDCs and compensation costs. Infection, nerve injury, and diagnosis-related cases were the most common categories of MDCs. Only a small proportion of cases involved patient death or unconsciousness, but their compensation costs were the highest. Conclusions More systematic guidelines and recommendations on interventional pain management are needed, especially those focused on medico-legal cases. Complications arising from pain management procedures and treatments may be avoided by physicians who have the required knowledge and expertise regarding anatomy and pain intervention procedures and know how to recognize procedural aberrations as soon as they occur. PMID:26495080

  2. Description of data base management activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    ARC's current and future data processing needs have been identified and defined. Requirements for improved data processing capabilities are listed: (1) Centralized control of system data files. Establish a DBA (data base administrator) position with responsibility for management of the Center's numerous data bases; (2) Programmer tools to improve efficiency of performance; (3) Direct and timely access to information. Presently, the user submits query requests to the data processing department, where they are prioritized with other queries and then batch processed using a report generator; (4) On-line data entry. With the merger of the Dryden facility with ARC, on-line data entry, editing, and updates have become mandatory for timely operation and reporting. A DBMS software package was purchased to meet the above requirements.

  3. The Global Terrestrial Network for Permafrost Database: metadata statistics and prospective analysis on future permafrost temperature and active layer depth monitoring site distribution

    NASA Astrophysics Data System (ADS)

    Biskaborn, B. K.; Lanckman, J.-P.; Lantuit, H.; Elger, K.; Streletskiy, D. A.; Cable, W. L.; Romanovsky, V. E.

    2015-03-01

    The Global Terrestrial Network for Permafrost (GTN-P) provides the first dynamic database associated with the Thermal State of Permafrost (TSP) and the Circumpolar Active Layer Monitoring (CALM) programs, which extensively collect permafrost temperature and active layer thickness data from Arctic, Antarctic and Mountain permafrost regions. The purpose of the database is to establish an "early warning system" for the consequences of climate change in permafrost regions and to provide standardized thermal permafrost data to global models. In this paper we perform statistical analysis of the GTN-P metadata, aiming to identify the spatial gaps in the GTN-P site distribution in relation to climate-effective environmental parameters. We describe the concept and structure of the Data Management System in regard to user operability, data transfer and data policy. We outline data sources and data processing, including quality control strategies. Assessment of the metadata and data quality reveals 63% metadata completeness at active layer sites and 50% completeness for boreholes. Voronoi tessellation analysis of the spatial sample distribution of boreholes and active layer measurement sites quantifies the distribution inhomogeneity and provides potential locations of additional permafrost research sites to improve the representativeness of thermal monitoring across areas underlain by permafrost. The depth distribution of the boreholes reveals that 73% are shallower than 25 m and 27% are deeper, reaching a maximum of 1 km depth. Comparison of the GTN-P site distribution with permafrost zones, soil organic carbon contents and vegetation types exhibits different local to regional monitoring situations on maps. Preferential slope orientation at the sites most likely causes a bias in the temperature monitoring and should be taken into account when using the data for global models. The distribution of GTN-P sites within zones of projected temperature change shows a high
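    The spatial-gap analysis can be illustrated with a simple largest-empty-region search: among candidate grid points, the one farthest from its nearest existing site marks the biggest coverage gap, which is the intuition behind inspecting Voronoi vertices. The coordinates below are hypothetical, not GTN-P sites:

```python
import math

# Hypothetical monitoring-site coordinates (x, y) -- illustrative only.
sites = [(1.0, 1.0), (2.0, 8.0), (8.0, 2.0), (9.0, 9.0)]

def nearest_site_distance(p):
    """Distance from point p to the closest existing site."""
    return min(math.dist(p, s) for s in sites)

# Grid search over a 10 x 10 study area at 0.5 spacing: the candidate with the
# largest nearest-site distance is the centre of the largest unmonitored region.
candidates = [(x / 2, y / 2) for x in range(21) for y in range(21)]
gap_centre = max(candidates, key=nearest_site_distance)
print(gap_centre, round(nearest_site_distance(gap_centre), 2))
```

    In a production analysis one would use a true Voronoi construction (e.g. scipy.spatial.Voronoi) restricted to permafrost-underlain areas, but the principle, ranking candidate locations by distance to the nearest existing site, is the same.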

  4. Active listening: The key of successful communication in hospital managers

    PubMed Central

    Jahromi, Vahid Kohpeima; Tabatabaee, Seyed Saeed; Abdar, Zahra Esmaeili; Rajabi, Mahboobeh

    2016-01-01

    Introduction One of the important causes of medical errors and unintentional harm to patients is ineffective communication. An important, and often forgotten, part of this skill is listening. The objective of this study was to determine whether managers in hospitals listen actively. Methods This study was conducted between May and June 2014 among three levels of managers at teaching hospitals in Kerman, Iran. Active listening skill among hospital managers was measured by a self-developed Active Listening Skill Scale (ALSS), which consists of the key elements of active listening and has five subscales, i.e., Avoiding Interruption, Maintaining Interest, Postponing Evaluation, Organizing Information, and Showing Interest. The data were analyzed by IBM-SPSS software, version 20, using the Pearson product-moment correlation coefficient, the chi-squared test, and multiple linear regression. Results The mean score of active listening in hospital managers was 2.32 out of 3. The highest score (2.27) was obtained by the first-level managers, and the top managers got the lowest score (2.16). Hospital managers were best in showing interest and worst in avoiding interruptions. The area of employment was a significant predictor of avoiding interruption, and the managers' gender was a strong predictor of skill in maintaining interest (p < 0.05). The type of management and education can predict postponing evaluation, and the length of employment can predict showing interest (p < 0.05). Conclusion There is a necessity for the development of strategies to create more awareness among hospital managers concerning their active listening skills. PMID:27123221

  5. Methods and pitfalls in searching drug safety databases utilising the Medical Dictionary for Regulatory Activities (MedDRA).

    PubMed

    Brown, Elliot G

    2003-01-01

    The Medical Dictionary for Regulatory Activities (MedDRA) is a unified standard terminology for recording and reporting adverse drug event data. Its introduction is widely seen as a significant improvement on the previous situation, where a multitude of terminologies of widely varying scope and quality were in use. However, there are some complexities that may cause difficulties, and these will form the focus for this paper. Two methods of searching MedDRA-coded databases are described: searching based on term selection from all of MedDRA and searching based on terms in the safety database. There are several potential traps for the unwary in safety searches. There may be multiple locations of relevant terms within a system organ class (SOC) and lack of recognition of appropriate group terms; the user may think that group terms are more inclusive than is the case. MedDRA may distribute terms relevant to one medical condition across several primary SOCs. If the database supports the MedDRA model, it is possible to perform multiaxial searching: while this may help find terms that might have been missed, it is still necessary to consider the entire contents of the SOCs to find all relevant terms and there are many instances of incomplete secondary linkages. It is important to adjust for multiaxiality if data are presented using primary and secondary locations. Other sources for errors in searching are non-intuitive placement and the selection of terms as preferred terms (PTs) that may not be widely recognised. Some MedDRA rules could also result in errors in data retrieval if the individual is unaware of these: in particular, the lack of multiaxial linkages for the Investigations SOC, Social circumstances SOC and Surgical and medical procedures SOC and the requirement that a PT may only be present under one High Level Term (HLT) and one High Level Group Term (HLGT) within any single SOC. Special Search Categories (collections of PTs assembled from various SOCs by
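    The multiaxial pitfall described here can be made concrete with a toy hierarchy: each preferred term (PT) has one primary SOC and possibly secondary SOC linkages, so a search over one SOC's primary terms alone misses PTs whose primary home is elsewhere, and a multiaxial search must deduplicate PTs found by both routes. The terms below are illustrative stand-ins, not actual MedDRA entries:

```python
# Toy MedDRA-like term records: one primary SOC per PT, optional secondary SOCs.
# The Investigations term carries no multiaxial links, mirroring the rule that
# the Investigations SOC has no secondary linkages. Illustrative data only.
pts = {
    "Myocardial infarction": {"primary": "Cardiac disorders", "secondary": []},
    "Cardiac failure":       {"primary": "Cardiac disorders", "secondary": []},
    "Chest pain":            {"primary": "General disorders", "secondary": ["Cardiac disorders"]},
    "Troponin increased":    {"primary": "Investigations",    "secondary": []},
}

def primary_only(soc):
    """PTs whose primary location is the given SOC."""
    return {pt for pt, r in pts.items() if r["primary"] == soc}

def multiaxial(soc):
    """PTs linked to the SOC via primary or secondary locations, deduplicated."""
    return {pt for pt, r in pts.items()
            if r["primary"] == soc or soc in r["secondary"]}

print(sorted(primary_only("Cardiac disorders")))
print(sorted(multiaxial("Cardiac disorders")))
```

    A primary-only search of "Cardiac disorders" misses "Chest pain", whose primary home is elsewhere; the multiaxial search retrieves it, while "Troponin increased" can only be found by also inspecting the Investigations SOC directly.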

  6. First Look: TRADEMARKSCAN Database.

    ERIC Educational Resources Information Center

    Fernald, Anne Conway; Davidson, Alan B.

    1984-01-01

    Describes database produced by Thomson and Thomson and available on Dialog which contains over 700,000 records representing all active federal trademark registrations and applications for registrations filed in United States Patent and Trademark Office. A typical record, special features, database applications, learning to use TRADEMARKSCAN, and…

  7. A Quality System Database

    NASA Technical Reports Server (NTRS)

    Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William

    2010-01-01

    A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.

  8. Activity-Based Costing: A Cost Management Tool.

    ERIC Educational Resources Information Center

    Turk, Frederick J.

    1993-01-01

    In college and university administration, overhead costs are often charged to programs indiscriminately, whereas the support activities that underlie those costs remain unanalyzed. It is time for institutions to decrease ineffective use of resources. Activity-based management attributes costs more accurately and can improve efficiency. (MSE)

  9. Intrauterine resuscitation: active management of fetal distress.

    PubMed

    Thurlow, J A; Kinsella, S M

    2002-04-01

    Acute fetal distress in labour is a condition of progressive fetal asphyxia with hypoxia and acidosis. It is usually diagnosed by finding characteristic features in the fetal heart rate pattern, wherever possible supported by fetal scalp pH measurement. Intrauterine resuscitation consists of applying specific measures with the aim of increasing oxygen delivery to the placenta and umbilical blood flow, in order to reverse hypoxia and acidosis. These measures include initial left lateral recumbent positioning followed by right lateral or knee-elbow if necessary, rapid intravenous infusion of a litre of non-glucose crystalloid, maternal oxygen administration at the highest practical inspired percentage, inhibition of uterine contractions usually with subcutaneous or intravenous terbutaline 250 microg, and intra-amniotic infusion of warmed crystalloid solution. Specific manoeuvres for umbilical cord prolapse are also described. Intrauterine resuscitation may be used as part of the obstetric management of labour, while preparing for caesarean delivery for fetal distress, or at the time of establishment of regional analgesia during labour in the compromised fetus. The principles may also be applied during inter-hospital transfers of sick or labouring parturients. PMID:15321562

  10. Using Sales Management Students to Manage Professional Selling Students in an Innovative Active Learning Project

    ERIC Educational Resources Information Center

    Young, Joyce A.; Hawes, Jon M.

    2013-01-01

    This paper describes an application of active learning within two different courses: professional selling and sales management. Students assumed the roles of sales representatives and sales managers for an actual fund-raiser--a golf outing--sponsored by a student chapter of the American Marketing Association. The sales project encompassed an…

  11. Database of the Geology and Thermal Activity of Norris Geyser Basin, Yellowstone National Park

    USGS Publications Warehouse

    Flynn, Kathryn; Graham Wall, Brita; White, Donald E.; Hutchinson, Roderick A.; Keith, Terry E.C.; Clor, Laura; Robinson, Joel E.

    2008-01-01

    This dataset contains contacts, geologic units and map boundaries from Plate 1 of USGS Professional Paper 1456, 'The Geology and Remarkable Thermal Activity of Norris Geyser Basin, Yellowstone National Park, Wyoming.' The features are contained in the Annotation, basins_poly, contours, geology_arc, geology_poly, point_features, and stream_arc feature classes, as well as a table of geologic units and their descriptions. This dataset was constructed to produce a digital geologic map as a basis for studying hydrothermal processes in Norris Geyser Basin. The original map does not contain registration tick marks. To create the geodatabase, the original scanned map was georegistered to USGS aerial photographs of the Norris Junction quadrangle collected in 1994. Manmade objects, i.e. roads, parking lots, and the visitor center, along with stream junctions and other hydrographic features, were used for registration.

  12. [Active career management needed for female doctors].

    PubMed

    Maas, Angela H E M; ter Braak, Edith W M T; Verbon, Annelies

    2015-01-01

    For more than 15 years two-thirds of medical students have been women. Despite this, they represent a minority (16-25 %) of professors in academic medicine. There is still a major gender gap to the disadvantage of women in leading positions in academia, with women earning only 80% of the salary of their male counterparts and fewer opportunities for scientific grants. Recent studies have shown that career ambition among men and women in medicine is comparable. However, successful women more often doubt their own achievements than men do. This is known as the 'imposter phenomenon' and acts as a barrier to career progression. Female leadership should be more actively promoted and encouraged to establish the diversity and creativity that we need in our current healthcare system. PMID:26959735

  13. An Interactive Geospatial Database and Visualization Approach to Early Warning Systems and Monitoring of Active Volcanoes: GEOWARN

    NASA Astrophysics Data System (ADS)

    Gogu, R. C.; Schwandner, F. M.; Hurni, L.; Dietrich, V. J.

    2002-12-01

    Large parts of southern and central Europe and the Pacific rim are situated in tectonically, seismically and volcanically highly active zones. With the growth of population and tourism, vulnerability and risk towards natural hazards have expanded over large areas. Socio-economic aspects, land use, tourist and industrial planning, as well as environmental protection, increasingly require natural hazard assessment. The availability of powerful and reliable satellite, geophysical and geochemical information and warning systems is therefore increasingly vital. Moreover, once such systems have proven effective, they can be applied for similar purposes in other European areas and worldwide. Technologies available today have demonstrated that early warning of volcanic activity can be achieved by monitoring measurable changes in geophysical and geochemical parameters. Correlation between different monitored data sets, which would improve any prediction, is very scarce or missing. Visualisation of all spatial information and its integration into an "intelligent cartographic concept" is of paramount interest in order to develop 2-, 3- and 4-dimensional models for risk and emergency assessment as well as environmental and socio-economic planning. In the framework of the GEOWARN project, a database prototype for an Early Warning System (EWS) and monitoring of volcanic activity in case of hydrothermal-explosive and volcanic reactivation has been designed. The platform-independent, web-based, Java-programmed, interactive multidisciplinary multiparameter visualization software being developed at ETH allows expansion and utilization to other volcanoes, worldwide databases of volcanic unrest, or other types of natural hazard assessment. Within the project consortium, scientific data have been acquired at two pilot sites: Campi Flegrei (Italy) and Nisyros (Greece), including 2D and 3D topography and bathymetry, elevation (DEM) and landscape models (DLM) derived from conventional

  14. Passive and active adaptive management: Approaches and an example

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    Adaptive management is a framework for resource conservation that promotes iterative, learning-based decision making. Yet there remains considerable confusion about what adaptive management entails, and how to actually make resource decisions adaptively. A key but somewhat ambiguous distinction in adaptive management is between active and passive forms of adaptive decision making. The objective of this paper is to illustrate some approaches to active and passive adaptive management with a simple example involving the drawdown of water impoundments on a wildlife refuge. The approaches are illustrated for the drawdown example, and contrasted in terms of objectives, costs, and potential learning rates. Some key challenges to the actual practice of adaptive management are discussed, and tradeoffs between implementation costs and long-term benefits are highlighted. © 2010 Elsevier Ltd.

  15. Optimized Dual Threshold Entity Resolution For Electronic Health Record Databases – Training Set Size And Active Learning

    PubMed Central

    Joffe, Erel; Byrne, Michael J.; Reeder, Phillip; Herskovic, Jorge R.; Johnson, Craig W.; McCoy, Allison B.; Bernstam, Elmer V.

    2013-01-01

    Clinical databases may contain several records for a single patient. Multiple general entity-resolution algorithms have been developed to identify such duplicate records. To achieve optimal accuracy, algorithm parameters must be tuned to a particular dataset. The purpose of this study was to determine the required training set size for probabilistic, deterministic and Fuzzy Inference Engine (FIE) algorithms with parameters optimized using the particle swarm approach. Each algorithm classified potential duplicates into three classes: definite match, non-match, and indeterminate (i.e., requires manual review). Training set sizes ranged from 2,000 to 10,000 randomly selected record-pairs. We also evaluated marginal uncertainty sampling for active learning. Optimization reduced manual review size (deterministic 11.6% vs. 2.5%; FIE 49.6% vs. 1.9%; and probabilistic 10.5% vs. 3.5%). FIE classified 98.1% of the records correctly (precision = 1.0). Best performance required training on all 10,000 randomly selected record-pairs. Active learning achieved comparable results with 3,000 records. Automated optimization is effective, and targeted sampling can reduce the required training set size. PMID:24551372
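
    The dual-threshold decision rule described in the abstract above can be sketched as follows. The function name, threshold values, and similarity scores are illustrative assumptions, not the study's tuned parameters; in the study, the thresholds themselves were optimized per dataset.

```python
# Hypothetical sketch of a dual-threshold entity-resolution decision rule.
# Thresholds and scores are invented for illustration.

def classify_pair(score, lower=0.35, upper=0.85):
    """Classify a candidate record pair by its similarity score.

    Scores at or above `upper` are definite matches, at or below
    `lower` are non-matches; anything in between goes to manual review.
    """
    if score >= upper:
        return "match"
    if score <= lower:
        return "non-match"
    return "indeterminate"  # requires manual review

pairs = {"A-B": 0.92, "A-C": 0.10, "B-C": 0.60}
labels = {pair: classify_pair(s) for pair, s in pairs.items()}

# The manual-review burden is the fraction of pairs left indeterminate;
# parameter optimization in the study aimed to shrink exactly this number.
review_rate = sum(v == "indeterminate" for v in labels.values()) / len(labels)
```

    Widening the gap between the two thresholds trades a larger manual-review queue for fewer automatic misclassifications.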

  16. Mechanisms and Management of Stress Fractures in Physically Active Persons

    PubMed Central

    Romani, William A.; Gieck, Joe H.; Perrin, David H.; Saliba, Ethan N.; Kahler, David M.

    2002-01-01

    Objective: To describe the anatomy of bone and the physiology of bone remodeling as a basis for the proper management of stress fractures in physically active people. Data Sources: We searched PubMed for the years 1965 through 2000 using the key words stress fracture, bone remodeling, epidemiology, and rehabilitation. Data Synthesis: Bone undergoes a normal remodeling process in physically active persons. Increased stress leads to an acceleration of this remodeling process, a subsequent weakening of bone, and a higher susceptibility to stress fracture. When a stress fracture is suspected, appropriate management of the injury should begin immediately. Effective management includes a cyclic process of activity and rest that is based on the remodeling process of bone. Conclusions/Recommendations: Bone continuously remodels itself to withstand the stresses involved with physical activity. Stress fractures occur as the result of increased remodeling and a subsequent weakening of the outer surface of the bone. Once a stress fracture is suspected, a cyclic management program that incorporates the physiology of bone remodeling should be initiated. The cyclic program should allow the physically active person to remove the source of the stress to the bone, maintain fitness, promote a safe return to activity, and permit the bone to heal properly. PMID:16558676

  17. The Impact of Environment and Occupation on the Health and Safety of Active Duty Air Force Members: Database Development and De-Identification.

    PubMed

    Erich, Roger; Eaton, Melinda; Mayes, Ryan; Pierce, Lamar; Knight, Andrew; Genovesi, Paul; Escobar, James; Mychalczuk, George; Selent, Monica

    2016-08-01

    Preparing data for medical research can be challenging, detail oriented, and time consuming. Transcription errors, missing or nonsensical data, and records not applicable to the study population may hamper progress and, if unaddressed, can lead to erroneous conclusions. In addition, study data may be housed in multiple disparate databases and complex formats. Merging methods may be incomplete to obtain temporally synchronized data elements. We created a comprehensive database to explore the general hypothesis that environmental and occupational factors influence health outcomes and risk-taking behavior among active duty Air Force personnel. Several databases containing demographics, medical records, health survey responses, and safety incident reports were cleaned, validated, and linked to form a comprehensive, relational database. The final step involved removing and transforming personally identifiable information to form a Health Insurance Portability and Accountability Act compliant limited database. Initial data consisted of over 62.8 million records containing 221 variables. When completed, approximately 23.9 million clean and valid records with 214 variables remained. With a clean, robust database, future analysis aims to identify high-risk career fields for targeted interventions or uncover potential protective factors in low-risk career fields. PMID:27483519

  18. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Kofoed, K. B.; Copenhaver, W.; Laney, C. M.; Gaylord, A. G.; Collins, J. A.; Tweedie, C. E.

    2014-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken in the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience are diverse and include research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Recent advances include the addition of more than 2,000 new research sites, provision of differential global positioning system (dGPS) and unmanned aerial vehicle (UAV) support to visiting scientists, surveying of over 80 miles of coastline to document rates of erosion, training of local GIS personnel to better make use of science in local decision making, deployment of and near-real-time connectivity to a wireless micrometeorological sensor network, links to Barrow area datasets housed at national data archives, and substantial upgrades to the BAID website and web mapping applications.

  19. Status Report on Transfer of Physical and Hydraulic Properties Databases to the Hanford Environmental Information System - PNNL Remediation Decision Support Project, Task 1, Activity 6

    SciTech Connect

    Rockhold, Mark L.; Middleton, Lisa A.; Cantrell, Kirk J.

    2009-06-30

    This document provides a status report on efforts to transfer physical and hydraulic property data from PNNL to CHPRC for incorporation into HEIS. The Remediation Decision Support (RDS) Project is managed by Pacific Northwest National Laboratory (PNNL) to support Hanford Site waste management and remedial action decisions by the U.S. Department of Energy and their contractors. The objective of Task 1, Activity 6 of the RDS project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. These physical and hydraulic property data are used to estimate parameters for analytical and numerical flow and transport models that are used for site risk assessments and evaluation of remedial action alternatives. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the original objectives of this activity on the RDS project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database maintained by PNNL, (2) transfer the physical and hydraulic property data from the Microsoft

  20. TRANSFORMATION OF DEVELOPMENTAL NEUROTOXICITY DATA INTO STRUCTURE-SEARCHABLE TOXML DATABASE IN SUPPORT OF STRUCTURE-ACTIVITY RELATIONSHIP (SAR) WORKFLOW.

    EPA Science Inventory

    Early hazard identification of new chemicals is often difficult due to lack of data on the novel material for toxicity endpoints, including neurotoxicity. At present, there are no structure searchable neurotoxicity databases. A working group was formed to construct a database to...

  1. WAX ActiveLibrary: a tool to manage information overload.

    PubMed

    Hanka, R; O'Brien, C; Heathfield, H; Buchan, I E

    1999-11-01

    WAX Active-Library (Cambridge Centre for Clinical Informatics) is a knowledge management system that seeks to support doctors' decision making through the provision of electronic books containing a wide range of clinical knowledge and locally based information. WAX has been piloted in several regions in the United Kingdom and formally evaluated in 17 GP surgeries based in Cambridgeshire. The evaluation has provided evidence that WAX Active-Library significantly improves GPs' access to relevant information sources; by increasing appropriate patient management and referrals, this might also lead to an improvement in clinical outcomes. PMID:10662094

  2. Open Clients for Distributed Databases

    NASA Astrophysics Data System (ADS)

    Chayes, D. N.; Arko, R. A.

    2001-12-01

    We are actively developing a collection of open source example clients that demonstrate use of our "back end" data management infrastructure. The data management system is reported elsewhere at this meeting (Arko and Chayes: A Scaleable Database Infrastructure). In addition to their primary goal of being examples for others to build upon, some of these clients may have limited utility in themselves. More information about the clients and the data infrastructure is available online at http://data.ldeo.columbia.edu. The examples to be demonstrated include several web-based clients, among them those developed for the Community Review System of the Digital Library for Earth System Education, a real-time watch stander's log book, an offline interface to use log book entries, and a simple client to search multibeam metadata; these are Internet-enabled, generally web-based front ends that support searches against one or more relational databases using industry-standard SQL queries. In addition to the web-based clients, simple SQL searches from within Excel and similar applications will be demonstrated. By defining, documenting and publishing a clear interface to the fully searchable databases, it becomes relatively easy to construct client interfaces that are optimized for specific applications, in contrast to building a monolithic data and user interface system.
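
    A thin client of the kind described above reduces to a parameterized SQL query against a documented schema. The sketch below uses Python's sqlite3 with an in-memory database; the `multibeam_metadata` table name, its columns, and the rows are invented stand-ins for the project's actual relational back end.

```python
# Minimal sketch of a thin SQL client. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE multibeam_metadata (cruise TEXT, year INTEGER, files INTEGER)"
)
conn.executemany(
    "INSERT INTO multibeam_metadata VALUES (?, ?, ?)",
    [("EW0001", 2000, 12), ("KN162", 2001, 8)],
)

# A client optimized for one application is just a parameterized query
# against the published interface, not a bespoke data system.
rows = conn.execute(
    "SELECT cruise, files FROM multibeam_metadata WHERE year >= ?", (2001,)
).fetchall()
```

    Because the interface is plain SQL, the same query could equally be issued from a web form or a spreadsheet connection, which is the point the abstract makes.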

  3. USER'S GUIDE FOR THE MUNICIPAL SOLID WASTE LIFE-CYCLE DATABASE

    EPA Science Inventory

    The report describes how to use the municipal solid waste (MSW) life cycle database, a software application with Microsoft Access interfaces, that provides environmental data for energy production, materials production, and MSW management activities and equipment. The basic datab...

  4. Prognostic and health management of active assets in nuclear power plants

    SciTech Connect

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; Rusaw, Richard; Bickford, Randall

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  5. Prognostic and health management of active assets in nuclear power plants

    DOE PAGESBeta

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; Rusaw, Richard; Bickford, Randall

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  6. Management of Water for Unconventional Oil and Gas Operations Enhanced with the Expanded U.S. Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Thordsen, J. J.; Thomas, B.; Reidy, M. E.; Engle, M.; Kharaka, Y. K.; Rowan, E. L.

    2014-12-01

    Increases in hydraulic fracturing practices for shale gas and tight oil reservoirs have dramatically increased petroleum production in the USA, but have also made the issue of water management from these operations a high priority. Hydraulic fracturing requires ~10,000 to 50,000 m3 of water per well for injection in addition to water used to drill the well. Initially much of the water used for hydraulic fracturing was fresh water, but attitudes and operations are changing in response to costs and concerns. Concerns about groundwater depletion and contamination have prompted operators to increase the amount of produced water that can be recycled for hydraulic fracturing and to find suitable locations for salt-water injection. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water recycling. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). The updated and expanded USGS Produced Waters Database available at http://eerscmap.usgs.gov/pwapp/ will facilitate and enhance studies on management of water, including produced water, for unconventional oil and gas drilling and production. The USGS database contains > 160,000 samples. Expanding on the 2002 database, we have filled in state and regional gaps with information from conventional and unconventional wells and have increased the number of constituents to include minor and trace chemicals, isotopes, and time series data. We currently have produced water data from 5,200 tight gas wells, 4,500 coal-bed methane (CBM) wells, 3,500 shale gas wells, and 700 tight oil wells. These numbers will increase as we continue to receive positive responses from oil companies, state oil and gas commissions, and scientists wanting to contribute their data. This database is an important resource for a wide range of interested parties. Scientists from universities, government agencies, public

  7. Maize databases

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter is a succinct overview of maize data held in the species-specific database MaizeGDB (the Maize Genomics and Genetics Database), and selected multi-species data repositories, such as Gramene/Ensembl Plants, Phytozome, UniProt and the National Center for Biotechnology Information (NCBI), ...

  8. Managing Rock and Paleomagnetic Data Flow with the MagIC Database: from Measurement and Analysis to Comprehensive Archive and Visualization

    NASA Astrophysics Data System (ADS)

    Koppers, A. A.; Minnett, R. C.; Tauxe, L.; Constable, C.; Donadini, F.

    2008-12-01

    The Magnetics Information Consortium (MagIC) is commissioned to implement and maintain an online portal to a relational database populated by rock and paleomagnetic data. The goal of MagIC is to archive all measurements and derived properties for studies of paleomagnetic directions (inclination, declination) and intensities, and for rock magnetic experiments (hysteresis, remanence, susceptibility, anisotropy). Organizing data for presentation in peer-reviewed publications or for ingestion into databases is a time-consuming task, and to facilitate these activities, three tightly integrated tools have been developed: MagIC-PY, the MagIC Console Software, and the MagIC Online Database. A suite of Python scripts is available to help users port their data into the MagIC data format. They allow the user to add important metadata, perform basic interpretations, and average results at the specimen, sample and site levels. These scripts have been validated for use as Open Source software under the UNIX, Linux, PC and Macintosh® operating systems. We have also developed the MagIC Console Software program to assist in collating rock and paleomagnetic data for upload to the MagIC database. The program runs in Microsoft Excel® on both Macintosh® computers and PCs. It performs routine consistency checks on data entries, and assists users in preparing data for uploading into the online MagIC database. The MagIC website is hosted under EarthRef.org at http://earthref.org/MAGIC/ and has two search nodes, one for paleomagnetism and one for rock magnetism. Both nodes provide query building based on location, reference, methods applied, material type and geological age, as well as a visual FlashMap interface to browse and select locations. Users can also browse the database by data type (inclination, intensity, VGP, hysteresis, susceptibility) or by data compilation to view all contributions associated with previous databases, such as PINT, GMPDB or TAFI or other user

  9. Efficient Management of Complex Striped Files in Active Storage

    SciTech Connect

    Piernas Canovas, Juan; Nieplocha, Jaroslaw

    2008-08-25

    Active Storage provides an opportunity for reducing the bandwidth requirements between the storage and compute elements of current supercomputing systems, and leveraging the processing power of the storage nodes used by some modern file systems. To achieve both objectives, Active Storage allows certain processing tasks to be performed directly on the storage nodes, near the data they manage. However, Active Storage must also support key requirements of scientific applications. In particular, Active Storage must be able to support striped files and files with complex formats (e.g., netCDF). In this paper, we describe how these important requirements can be addressed. The experimental results on a Lustre file system not only show that our proposal can reduce the network traffic to near zero and scale the performance with the number of storage nodes, but also that it provides an efficient treatment of striped files and can manage files with complex data structures.

  10. SU-E-J-129: A Strategy to Consolidate the Image Database of a VERO Unit Into a Radiotherapy Management System

    SciTech Connect

    Yan, Y; Medin, P; Yordy, J; Zhao, B; Jiang, S

    2014-06-01

    Purpose: To present a strategy to integrate the imaging database of a VERO unit with a treatment management system (TMS) to improve clinical workflow and consolidate image data to facilitate clinical quality control and documentation. Methods: A VERO unit is equipped with both kV and MV imaging capabilities for IGRT treatments. It has its own imaging database behind a firewall. It has been a challenge to transfer images from this unit to a TMS in a radiation therapy clinic so that registered images can be reviewed remotely with an approval or rejection record. In this study, a software system, iPump-VERO, was developed to connect the VERO unit and the TMS in our clinic. The patient database folder on the VERO unit was mapped to a read-only folder on a file server outside the VERO firewall. The application runs on a regular computer with read access to the patient database folder. It finds the latest registered images and fuses them in one of six predefined patterns before sending them via a DICOM connection to the TMS. The residual image registration errors are overlaid on the fused image to facilitate image review. Results: The fused images of either registered kV planar images or CBCT images are fully DICOM compatible. A sentinel module is built to sense newly registered images using negligible computing resources of the VERO ExacTrac imaging computer. It takes a few seconds to fuse registered images and send them to the TMS. The whole process is automated without any human intervention. Conclusion: Transferring images via a DICOM connection is the easiest way to consolidate images from various sources in a TMS. Technically, the attending does not have to go to the VERO treatment console to review image registration prior to delivery. It is a useful tool for a busy clinic with a VERO unit.
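
    The "sentinel" idea in the abstract, watching a read-only export of the patient database folder and picking up newly registered images since the last check, can be sketched as below. The file names, the `.dcm` suffix, and the polling logic are assumptions for illustration; the actual iPump-VERO tool additionally fuses the images and sends them over DICOM.

```python
# Hedged sketch of a polling sentinel over a read-only image folder.
# Paths and names are invented; only the watch-for-new-files idea is real.
from pathlib import Path
import tempfile

def new_files(folder, seen):
    """Return *.dcm file names not yet in `seen`, updating `seen` in place."""
    current = {p.name for p in Path(folder).glob("*.dcm")}
    fresh = sorted(current - seen)
    seen |= current
    return fresh

# Demonstrate with a throwaway folder standing in for the exported
# patient-database folder on the file server.
folder = tempfile.mkdtemp()
Path(folder, "reg_001.dcm").touch()
seen = set()
first = new_files(folder, seen)    # picks up reg_001.dcm
Path(folder, "reg_002.dcm").touch()
second = new_files(folder, seen)   # only the newly appeared file
```

    Polling a mapped read-only folder keeps the watcher entirely outside the vendor firewall, which mirrors the design choice described in the Methods section.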

  11. Optically selected BLR-less active galactic nuclei from the SDSS Stripe82 Database - I. The sample

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Guang

    2014-02-01

    This is the first paper in a dedicated series to study the properties of the optically selected broad-line-region-less (BLR-less) active galactic nuclei (AGNs; those with no hidden central broad emission line regions). We carried out a systematic search for BLR-less AGNs through the Sloan Digital Sky Survey Legacy Survey (SDSS Stripe82 Database). Based on the spectral decomposition results for all 136,676 spectroscopic objects (galaxies and quasars) with redshift less than 0.35 covered by the SDSS Stripe82 region, our spectroscopic sample of BLR-less AGNs includes 22,693 pure narrow line objects without broad emission lines but with apparent AGN continuum emission (R_AGN > 0.3) and apparent stellar light (R_ssp > 0.3). Then, using the properties of the photometric magnitude RMS and Pearson's coefficient (R_{1,2}) between two different SDSS band light curves, RMS_k > 3 × RMS_{M_k} and R_{1,2} ≳ 0.8, the final 281 pure narrow line objects with true photometric variability are our selected reliable candidates for BLR-less AGNs. The selected candidates with higher confidence levels not only have the expected spectral features of BLR-less AGNs, but also show significant true photometric variability. The reported sample enlarges the current sample of BLR-less AGNs at least fourfold, and will provide more reliable information to explain the lack of BLRs in AGNs in our following studies.
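
    The two photometric cuts described above, an RMS scatter well above a noise floor and a high inter-band Pearson coefficient, can be sketched with the standard library alone. The toy light curves, the 0.08 mag noise-floor value, and the variable names are illustrative assumptions, not the paper's data or exact estimator.

```python
# Sketch of an RMS cut plus an inter-band correlation cut for selecting
# true photometric variables. All numbers here are invented.
from statistics import mean, pstdev
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

g_band = [18.2, 18.6, 18.1, 18.9, 18.3]   # toy light curve, band 1 (mag)
r_band = [17.9, 18.3, 17.8, 18.6, 18.0]   # toy light curve, band 2 (mag)

rms = pstdev(g_band)              # RMS scatter of one band's light curve
r12 = pearson(g_band, r_band)     # correlation between the two bands
rms_floor = 0.08                  # stand-in for the measurement-error RMS

# Both cuts must hold for the object to count as a genuine variable:
is_variable = (rms > 3 * rms_floor) and (r12 > 0.8)
```

    Requiring correlated variability in two independently measured bands is what separates genuine variability from single-band measurement noise.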

  12. A Ranking Analysis of the Management Schools in Greater China (2000-2010): Evidence from the SSCI Database

    ERIC Educational Resources Information Center

    Hou, Mingjun; Fan, Peihua; Liu, Heng

    2014-01-01

    The authors rank the management schools in Greater China (including Mainland China, Hong Kong, Taiwan, and Macau) based on their academic publications in the Social Sciences Citation Index management and business journals from 2000 to 2010. Following K. Ritzberger's (2008) and X. Yu and Z. Gao's (2010) ranking method, the authors develop…

  13. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  14. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    PubMed

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap

  15. The spectral database Specchio: Data management, data sharing and initial processing of field spectrometer data within the Dimensions of Biodiversity project

    NASA Astrophysics Data System (ADS)

    Hueni, A.; Schweiger, A. K.

    2015-12-01

    Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.

  16. Draft position paper on knowledge management in space activities

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne; Moura, Denis

    2003-01-01

    Like other fields of industry, space activities face the challenge of Knowledge Management, and in 2002 the International Academy of Astronautics established a Study Group to analyse the problem and issue general guidelines. This communication presents the group's draft position paper, with a view to its discussion during the 2003 IAF Congress.

  17. Moodog: Tracking Student Activity in Online Course Management Systems

    ERIC Educational Resources Information Center

    Zhang, Hangjin; Almeroth, Kevin

    2010-01-01

    Many universities are currently using Course Management Systems (CMSes) to conduct online learning, for example, by distributing course materials or submitting homework assignments. However, most CMSes do not include comprehensive activity tracking and analysis capabilities. This paper describes a method to track students' online learning…

  18. Management of Hypertension: Adapting New Guidelines for Active Patients.

    ERIC Educational Resources Information Center

    Tanji, Jeffrey L.; Batt, Mark E.

    1995-01-01

    Discusses recent guidelines on hypertension from the National Heart, Lung, and Blood Institute and details the latest management protocols for patients with high blood pressure. The article helps physicians interpret the guidelines for treating active patients, highlighting diagnosis, step care revision, pharmacology, and sports participation…

  19. Assessment of global disease activity in RA patients monitored in the METEOR database: the patient's versus the rheumatologist's opinion.

    PubMed

    Gvozdenović, Emilia; Koevoets, Rosanne; Wolterbeek, Ron; van der Heijde, Désirée; Huizinga, Tom W J; Allaart, Cornelia F; Landewé, Robert B M

    2014-04-01

    The objectives of this study were to compare the patient's (PtGDA) and physician's (PhGDA) assessment of global disease activity and to identify factors that might influence these differences as well as factors that may influence the patient's and the physician's scores separately. Anonymous data were used from 2,117 Dutch patients included in the Measurement of efficacy of Treatment in the Era of Rheumatology database. PtGDA and PhGDA were scored independently on a 100-mm visual analog scale (VAS) with 0 and 100 as extremes. The agreement, expressed as intraclass correlation coefficients (ICC), was calculated, and a Bland-Altman plot was created to visualize the differences between PtGDA and PhGDA. Linear mixed model analysis was used to model PtGDA and PhGDA. Logistic repeated measurements were used to model the difference between PtGDA and PhGDA (PtGDA > PhGDA versus PtGDA ≤ PhGDA). Patient gender, physician gender, age, swollen joint count (SJC), tender joint count, VAS pain, disease duration, and erythrocyte sedimentation rate (ESR) were considered as possible determinants in both models. Mean (standard deviation) age was 57 (15) years and 67% of the patients were female. Agreement between PtGDA and PhGDA was moderate (ICC, 0.57). Patients scored on average 11 units higher (worse) than rheumatologists (95% limits of agreement, -25.2 to 47.6). Patient's perception of pain (VAS) was positively associated with a PtGDA being higher than PhGDA. Conversely, ESR and swollen joint counts were positively associated with a PtGDA being lower than or equal to the PhGDA. Patients rate global disease activity consistently higher than their rheumatologists. Patients base their judgment primarily on the level of pain, physicians on the level of SJC and ESR. PMID:24068385
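The agreement analysis used above (mean patient-physician difference with 95% limits of agreement, as visualized in a Bland-Altman plot) can be sketched in a few lines. The VAS scores below are invented for illustration; they are not data from the METEOR database:

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two raters."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    diff = a - b                 # e.g., patient score minus physician score
    bias = diff.mean()           # systematic difference between raters
    sd = diff.std(ddof=1)        # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical 0-100 VAS global disease activity scores for five patients
patient   = [60, 45, 70, 30, 55]
physician = [50, 40, 55, 25, 45]
bias, (lo, hi) = bland_altman(patient, physician)
print(f"bias={bias:.1f}, limits of agreement=({lo:.1f}, {hi:.1f})")
```

A positive bias, as in the study, means the first rater (here the patient) scores systematically higher than the second.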

  20. ePlantLIBRA: A composition and biological activity database for bioactive compounds in plant food supplements.

    PubMed

    Plumb, J; Lyons, J; Nørby, K; Thomas, M; Nørby, E; Poms, R; Bucchini, L; Restani, P; Kiely, M; Finglas, P

    2016-02-15

    The newly developed ePlantLIBRA database is a comprehensive and searchable database, with up-to-date, coherent and validated scientific information on plant food supplement (PFS) bioactive compounds, with putative health benefits as well as adverse effects, and contaminants and residues. It is the only web-based database available compiling peer-reviewed publications and case studies on PFS. A user-friendly, efficient and flexible interface has been developed for searching, extracting, and exporting the data, including links to the original references. Data from over 570 publications have been quality evaluated and entered, covering 70 PFS or their botanical ingredients. PMID:26433297

  1. Genome databases

    SciTech Connect

    Courteau, J.

    1991-10-11

    Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  2. Active surveillance for the management of localized prostate cancer: Guideline recommendations

    PubMed Central

    Morash, Chris; Tey, Rovena; Agbassi, Chika; Klotz, Laurence; McGowan, Tom; Srigley, John; Evans, Andrew

    2015-01-01

    Introduction: The objective is to provide guidance on the role of active surveillance (AS) as a management strategy for low-risk prostate cancer patients and to ensure that AS is offered to appropriate patients assessed by a standardized protocol. Prostate cancer is often a slowly progressive or sometimes non-progressive indolent disease diagnosed at an early stage with localized tumours that are unlikely to cause morbidity or death. Standard active treatments for prostate cancer include radiotherapy (RT) or radical prostatectomy (RP), but the harms from overdiagnosis and overtreatment are of significant concern. AS is increasingly being considered as a management strategy to avoid or delay the potential harms caused by unnecessary radical treatment. Methods: A literature search of MEDLINE, EMBASE, the Cochrane Library, guideline databases and relevant meeting proceedings was performed, and a systematic review of the identified evidence was synthesized to make recommendations relating to the role of AS in the management of localized prostate cancer. Results: No existing guidelines or reviews were suitable for use in the synthesis of evidence for the recommendations, but 59 reports of primary studies were identified. Because the studies were either non-comparative or heterogeneous, pooled meta-analyses were not conducted. Conclusion: The working group concluded that for patients with low-risk (Gleason score ≤6) localized prostate cancer, AS is the preferred disease management strategy. Active treatment (RP or RT) is appropriate for patients with intermediate-risk (Gleason score 7) localized prostate cancer. For select patients with low-volume Gleason 3+4=7 localized prostate cancer, AS can be considered. PMID:26225165

  3. Management and climate contributions to satellite-derived active fire trends in the contiguous United States

    NASA Astrophysics Data System (ADS)

    Lin, Hsiao-Wen; McCarty, Jessica L.; Wang, Dongdong; Rogers, Brendan M.; Morton, Douglas C.; Collatz, G. James; Jin, Yufang; Randerson, James T.

    2014-04-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001-2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001-2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems.

  4. Management and climate contributions to satellite-derived active fire trends in the contiguous United States

    PubMed Central

    Lin, Hsiao-Wen; McCarty, Jessica L; Wang, Dongdong; Rogers, Brendan M; Morton, Douglas C; Collatz, G James; Jin, Yufang; Randerson, James T

    2014-01-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001–2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001–2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems. Key Points: Wildland, cropland, and prescribed fires had different trends and patterns; sensitivity to climate varied with fire type; intensity of air quality regulation influenced cropland burning trends. PMID:26213662

  5. CAZymes Analysis Toolkit (CAT): web service for searching and analyzing carbohydrate-active enzymes in a newly sequenced organism using CAZy database.

    PubMed

    Park, Byung H; Karpinets, Tatiana V; Syed, Mustafa H; Leuze, Michael R; Uberbacher, Edward C

    2010-12-01

    The Carbohydrate-Active Enzyme (CAZy) database provides a rich set of manually annotated enzymes that degrade, modify, or create glycosidic bonds. Despite rich and invaluable information stored in the database, software tools utilizing this information for annotation of newly sequenced genomes by CAZy families are limited. We have employed two annotation approaches to fill the gap between manually curated high-quality protein sequences collected in the CAZy database and the growing number of other protein sequences produced by genome or metagenome sequencing projects. The first approach is based on a similarity search against the entire nonredundant sequences of the CAZy database. The second approach performs annotation using links or correspondences between the CAZy families and protein family domains. The links were discovered using the association rule learning algorithm applied to sequences from the CAZy database. The approaches complement each other and in combination achieved high specificity and sensitivity when cross-evaluated with the manually curated genomes of Clostridium thermocellum ATCC 27405 and Saccharophagus degradans 2-40. The capability of the proposed framework to predict the function of unknown protein domains and of hypothetical proteins in the genome of Neurospora crassa is demonstrated. The framework is implemented as a Web service, the CAZymes Analysis Toolkit, and is available at http://cricket.ornl.gov/cgi-bin/cat.cgi. PMID:20696711
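The second annotation approach described above, using learned links between protein-family domains and CAZy families, might be reduced to a simple lookup once the rule table has been mined. The domain-to-family links below are illustrative placeholders, not the association rules actually learned in this work:

```python
# Hypothetical links from association-rule learning:
# protein-family domain identifier -> CAZy family
DOMAIN_TO_CAZY = {
    "PF00150": "GH5",    # assumed link: cellulase domain -> glycoside hydrolase 5
    "PF00331": "GH10",
    "PF01670": "GH12",
}

def annotate(protein_domains):
    """Assign CAZy families to a protein from its detected domains."""
    return sorted({DOMAIN_TO_CAZY[d] for d in protein_domains if d in DOMAIN_TO_CAZY})

# A protein with one linked domain and one domain with no known link
print(annotate(["PF00150", "PF99999"]))
```

In the actual toolkit this lookup complements a similarity search against the full nonredundant CAZy sequence set, and the two predictions are combined.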

  6. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  7. Hazard Analysis Database Report

    SciTech Connect

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  8. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.

  9. Cytochrome P450 database.

    PubMed

    Lisitsa, A V; Gusev, S A; Karuzina, I I; Archakov, A I; Koymans, L

    2001-01-01

    This paper describes a specialized database dedicated exclusively to the cytochrome P450 superfamily. The system presents the superfamily's nomenclature and describes the structure and function of different P450 enzymes. Information on P450-catalyzed reactions, substrate preferences, and peculiarities of induction and inhibition is available through the database management system. The source genes and the corresponding translated proteins can also be retrieved, together with the relevant literature references. The programming solution developed provides a flexible interface for browsing, searching, grouping, and reporting the information. A local version of the database manager and the required data files are distributed on a compact disc, and a network version of the software is available on the Internet. The network version implements an original mechanism that supports permanent online extension of the data scope. PMID:11769119

  10. BIOMARKERS DATABASE

    EPA Science Inventory

    This database was developed by assembling and evaluating the literature relevant to human biomarkers. It catalogues and evaluates the usefulness of biomarkers of exposure, susceptibility and effect which may be relevant for a longitudinal cohort study. In addition to describing ...

  11. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  12. SECONDARY WASTE MANAGEMENT STRATEGY FOR EARLY LOW ACTIVITY WASTE TREATMENT

    SciTech Connect

    TW, CRAWFORD

    2008-07-17

    This study evaluates parameters relevant to River Protection Project secondary waste streams generated during Early Low Activity Waste operations and recommends a strategy for secondary waste management that considers groundwater impact, cost, and programmatic risk. The recommended strategy for managing River Protection Project secondary waste is focused on improvements in the Effluent Treatment Facility. Baseline plans to build a Solidification Treatment Unit adjacent to the Effluent Treatment Facility should be enhanced to improve solid waste performance and mitigate corrosion of tanks and piping supporting the Effluent Treatment Facility evaporator. This approach provides a life-cycle benefit to solid waste performance and reduction of groundwater contaminants.

  13. Redis database administration tool

    2013-02-13

    MyRedis is a product of the Lorenz subproject under the ASC Scientific Data Management effort. MyRedis is a web-based utility designed to allow easy administration of instances of Redis databases. It can be used to view and manipulate data as well as run commands directly against a variety of different Redis hosts.

  14. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.

    2012-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. BAID has been used to: optimize research site choice; reduce duplication of science effort; discover complementary and potentially detrimental research activities in an area of scientific interest; re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; exchange knowledge across disciplines and generations; facilitate communication between western science and traditional ecological knowledge; provide local residents access to science data that facilitates adaptation to arctic change; and educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1. Collecting data on research

  15. Short Fiction on Film: A Relational DataBase.

    ERIC Educational Resources Information Center

    May, Charles

    Short Fiction on Film is a database that was created and will run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…

  16. Small-scale and Global Dynamos and the Area and Flux Distributions of Active Regions, Sunspot Groups, and Sunspots: A Multi-database Study

    NASA Astrophysics Data System (ADS)

    Muñoz-Jaramillo, Andrés; Senkpeil, Ryan R.; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W.; Tlatov, Andrey G.; Nagovitsyn, Yury A.; Pevtsov, Alexei A.; Chapman, Gary A.; Cookson, Angela M.; Yeates, Anthony R.; Watson, Fraser T.; Balmaceda, Laura A.; DeLuca, Edward E.; Martens, Petrus C. H.

    2015-02-01

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up by a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10²¹ Mx (10²² Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
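The composite distribution described here, a weighted sum of Weibull and log-normal components over magnetic flux, can be evaluated with a numpy-only sketch. The weight and shape parameters below are illustrative choices, not the fitted values from this study:

```python
import numpy as np

def weibull_pdf(x, k, lam):
    """Weibull PDF with shape k and scale lam."""
    return (k / lam) * (x / lam) ** (k - 1) * np.exp(-((x / lam) ** k))

def lognormal_pdf(x, mu, sigma):
    """Log-normal PDF with log-mean mu and log-standard-deviation sigma."""
    return np.exp(-((np.log(x) - mu) ** 2) / (2 * sigma**2)) / (x * sigma * np.sqrt(2 * np.pi))

def composite_pdf(x, w=0.6, k=0.5, lam=1e21, mu=np.log(1e22), sigma=1.0):
    """Linear combination: the Weibull term dominates small fluxes,
    the log-normal term dominates large fluxes."""
    return w * weibull_pdf(x, k, lam) + (1 - w) * lognormal_pdf(x, mu, sigma)

x = np.logspace(19, 24, 6)   # magnetic fluxes in Mx
print(composite_pdf(x))
```

Fitting such a mixture to binned area or flux counts would additionally require a normalization per database, corresponding to the proportionality constant discussed in the abstract.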

  17. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    SciTech Connect

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W.; Senkpeil, Ryan R.; Tlatov, Andrey G.; Nagovitsyn, Yury A.; Pevtsov, Alexei A.; Chapman, Gary A.; Cookson, Angela M.; Yeates, Anthony R.; Watson, Fraser T.; Balmaceda, Laura A.; DeLuca, Edward E.; Martens, Petrus C. H.

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up by linear combination of Weibull and log-normal distributions—where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10²¹ Mx (10²² Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).

  18. The Gaia Parameter Database

    NASA Astrophysics Data System (ADS)

    de Bruijne, J. H. J.; Lammers, U.; Perryman, M. A. C.

    2005-01-01

    The parallel development of many aspects of a complex mission like Gaia, which includes numerous participants in ESA, industrial companies, and a large and active scientific collaboration throughout Europe, makes keeping track of the many design changes, instrument and operational complexities, and numerical values for the data analysis a very challenging problem. A comprehensive, easily-accessible, up-to-date, and definitive compilation of a large range of numerical quantities is required, and the Gaia parameter database has been established to satisfy these needs. The database is a centralised repository containing, besides mathematical, physical, and astronomical constants, many satellite and subsystem design parameters. At the end of 2004, more than 1600 parameters had been included. Version control has been implemented, providing, next to a `live' version with the most recent parameters, well-defined reference versions of the full database contents. The database can be queried or browsed using a regular Web browser (http://www.rssd.esa.int/Gaia/paramdb). Query results are formatted by default in HTML. Data can also be retrieved as Fortran-77, Fortran-90, Java, ANSI C, C++, or XML structures for direct inclusion into software codes in these languages. The idea is that all collaborating scientists can use the database parameters and values, once retrieved, directly linked to computational routines. An off-line access mode is also available, enabling users to automatically download the contents of the database. The database will be maintained actively, and significant extensions of the contents are planned. Consistent use of the database in the future by the Gaia community at large, including all industrial teams, will ensure correct numerical values throughout the complex software systems being built up as details of the Gaia design develop. The database is already being used for the telemetry simulation chain in ESTEC, and in the data simulations for GDAAS2.
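The version-control scheme described here, a mutable `live' version alongside frozen, well-defined reference versions, can be illustrated with a minimal in-memory sketch. The class, parameter name, and values are invented for illustration; the real database is accessed over the web, not through such a class:

```python
class ParameterDatabase:
    """Minimal sketch of a versioned parameter store: one mutable 'live'
    version plus immutable snapshots that act as reference versions."""

    def __init__(self):
        self.live = {}
        self.references = {}                     # version tag -> frozen snapshot

    def set(self, name, value):
        self.live[name] = value                  # edits always go to the live version

    def freeze(self, tag):
        self.references[tag] = dict(self.live)   # frozen copy of the full contents

    def get(self, name, version=None):
        source = self.live if version is None else self.references[version]
        return source[name]

db = ParameterDatabase()
db.set("Sat:FocalLength_m", 35.0)    # hypothetical parameter name and value
db.freeze("v1.0")
db.set("Sat:FocalLength_m", 35.5)    # later design change in the live version
print(db.get("Sat:FocalLength_m"), db.get("Sat:FocalLength_m", "v1.0"))
# -> 35.5 35.0
```

Freezing full snapshots is what lets every software team pin its build to one reference version while the live version keeps evolving.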

  19. Serials Management by Microcomputer: The Potential of DBMS.

    ERIC Educational Resources Information Center

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are included. (EJS)

  20. Briefing book on environmental and waste management activities

    SciTech Connect

    Quayle, T.A.

    1993-04-01

    The purpose of the Briefing Book is to provide current information on Environmental Restoration and Waste Management Activities at the Hanford Site. Each edition updates the information in the previous edition by deleting those sections determined not to be of current interest and adding new topics to keep up to date with the changing requirements and issues. This edition covers the period from October 15, 1992 through April 15, 1993.

  1. United States-Russia: Environmental management activities, Summer 1998

    SciTech Connect

    1998-09-01

    A Joint Coordinating Committee for Environmental Restoration and Waste Management (JCCEM) was formed between the US and Russia. This report describes the areas of research being studied under JCCEM, namely: Efficient separations; Contaminant transport and site characterization; Mixed wastes; High level waste tank remediation; Transuranic stabilization; Decontamination and decommissioning; and Emergency response. Other sections describe: Administrative framework for cooperation; Scientist exchange; Future actions; Non-JCCEM DOE-Russian activities; and JCCEM publications.

  2. Experiment Databases

    NASA Astrophysics Data System (ADS)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Besides running machine learning algorithms based on inductive queries, much can be learned by directly querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queryable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab, or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
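
A minimal sketch of the idea using an in-memory SQLite table (the schema, algorithm names, and accuracy figures are invented for illustration): once each run is stored as a row, meta-level questions become a single query.

```python
import sqlite3

# One row per experiment run: which algorithm, on which dataset, with what result.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE experiment (
    algorithm TEXT, dataset TEXT, accuracy REAL)""")
runs = [("J48", "iris", 0.94), ("J48", "letter", 0.88),
        ("NaiveBayes", "iris", 0.95), ("NaiveBayes", "letter", 0.64)]
conn.executemany("INSERT INTO experiment VALUES (?, ?, ?)", runs)

# Meta-level query: mean accuracy per algorithm across all stored runs.
rows = conn.execute("""SELECT algorithm, AVG(accuracy)
                       FROM experiment
                       GROUP BY algorithm
                       ORDER BY algorithm""").fetchall()
```

Real experiment databases record far richer detail (parameter settings, dataset properties, hardware), but the querying pattern is the same.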

  3. Drinking Water Database

    NASA Technical Reports Server (NTRS)

    Murray, ShaTerea R.

    2004-01-01

    This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team, or CS&AT. This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project. This is a project that is done on Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what they would change, if anything. After collecting this data we both had to take some courses in Microsoft Access in order to fix the problems. Next we looked at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.

  4. Waste management and technologies analytical database project for Los Alamos National Laboratory/Department of Energy. Final report, June 7, 1993--June 15, 1994

    SciTech Connect

    1995-04-17

    The Waste Management and Technologies Analytical Database System (WMTADS), supported by the Department of Energy's (DOE) Office of Environmental Management (EM), Office of Technology Development (EM-50), was developed and based at the Los Alamos National Laboratory (LANL), Los Alamos, New Mexico. Its purpose is to collect, identify, organize, track, update, and maintain information on existing, available, developing, and planned technologies to characterize, treat, and handle mixed, hazardous, and radioactive waste for storage and disposal, in support of EM strategies and goals and of focus area projects. WMTADS was developed as a centralized source of online information on technologies for environmental management processes. It can be accessed with a computer, modem, phone line, and communications software through a Local Area Network (LAN) and via server connectivity on the Internet; with file transfer protocol (FTP), files can also be transferred from the server to the user's computer, and the contents can be browsed on the World Wide Web (WWW) using Mosaic.

  5. Weight Management for Athletes and Active Individuals: A Brief Review.

    PubMed

    Manore, Melinda M

    2015-11-01

    Weight management for athletes and active individuals is unique because of their high daily energy expenditure; thus, the emphasis is usually placed on changing the diet side of the energy balance equation. When dieting for weight loss, active individuals also want to preserve lean tissue, which means that energy restriction cannot be too severe or lean tissue is lost. First, this brief review addresses the issues of weight management in athletes and active individuals and factors to consider when determining a weight-loss goal. Second, the concept of dynamic energy balance is reviewed, including two mathematical models developed to improve weight-loss predictions based on changes in diet and exercise. These models are now available on the Internet. Finally, dietary strategies for weight loss/maintenance that can be successfully used with active individuals are given. Emphasis is placed on teaching the benefits of consuming a low-energy-density (ED) diet (e.g., high-fiber, high-water, low-fat foods), which allows for the consumption of a greater volume of food to increase satiety while reducing energy intake. Health professionals and sport dietitians need to understand dynamic energy balance and be prepared with effective and evidence-based dietary approaches to help athletes and active individuals achieve their body-weight goals. PMID:26553496

  6. Database Quality: Label or Liable.

    ERIC Educational Resources Information Center

    Armstrong, C. J.

    The Centre for Information Quality Management (CIQM) was set up by the Library Association and UK (United Kingdom) Online User Group to act as a clearinghouse to which database users may report problems relating to the quality of any aspect of a database being used. CIQM acts as an intermediary between the user and information provider in…

  7. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    SciTech Connect

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.

    1994-10-01

    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (K{sub d} values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.

  8. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods

    PubMed Central

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the options to estimate GFR by the plasma sampling method as well as by SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine have been done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, the patient's information, and the calculated GFR for further processing and comparison. This is user-friendly software to calculate the GFR by various plasma sampling methods and blood parameters, and a good system for storing the raw and processed data for future analysis. PMID:26097422
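
Of the serum-creatinine methods named above, the Cockcroft-Gault formula is simple enough to sketch directly. The function below is an illustrative implementation of that published formula, not code from the described software:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula:
    CrCl = ((140 - age) * weight) / (72 * SCr), multiplied by 0.85 for females."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl
```

For example, a 60-year-old, 72 kg male with a serum creatinine of 1.0 mg/dL yields (80 × 72) / 72 = 80 mL/min. Note that Cockcroft-Gault estimates creatinine clearance, which is used as a proxy for GFR.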

  9. BAID: The Barrow Area Information Database - An Interactive Web Mapping Portal and Cyberinfrastructure Showcasing Scientific Activities in the Vicinity of Barrow, Arctic Alaska.

    NASA Astrophysics Data System (ADS)

    Escarzaga, S. M.; Cody, R. P.; Kassin, A.; Barba, M.; Gaylord, A. G.; Manley, W. F.; Mazza Ramsay, F. D.; Vargas, S. A., Jr.; Tarin, G.; Laney, C. M.; Villarreal, S.; Aiken, Q.; Collins, J. A.; Green, E.; Nelson, L.; Tweedie, C. E.

    2015-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The user community and target audience for BAID are diverse and include research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Additionally, data are described with metadata that meet Federal Geographic Data Committee standards. Recent advances include the addition of more than 2000 new research sites, the addition of a query builder user interface allowing rich and complex queries, and provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists. Recent field surveys include over 80 miles of coastline to document rates of erosion and the collection of high-resolution sonar data for bathymetric mapping of Elson Lagoon and the near-shore region of the Chukchi Sea. A network of five climate stations has been deployed across the peninsula to serve as a wireless net for the research community and to deliver near-real-time climatic data to the user community. Local GIS personnel have also been trained to make better use of scientific data for local decision making. 
Links to Barrow area datasets are housed at national data archives and substantial upgrades have

  10. Solubility Database

    National Institute of Standards and Technology Data Gateway

    SRD 106 IUPAC-NIST Solubility Database (Web, free access)   These solubilities are compiled from 18 volumes of the International Union of Pure and Applied Chemistry (IUPAC)-NIST Solubility Data Series. The database includes liquid-liquid, solid-liquid, and gas-liquid systems. Typical solvents and solutes include water, seawater, heavy water, inorganic compounds, and a variety of organic compounds such as hydrocarbons, halogenated hydrocarbons, alcohols, acids, esters, and nitrogen compounds. There are over 67,500 solubility measurements and over 1800 references.

  11. Nuclear Science References Database

    SciTech Connect

    Pritychenko, B.; Běták, E.; Singh, B.; Totans, J.

    2014-06-15

    The Nuclear Science References (NSR) database, together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information, covering more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation, and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction, and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr).

  12. DESIGN AND PERFORMANCE OF A XENOBIOTIC METABOLISM DATABASE MANAGER FOR METABOLIC SIMULATOR ENHANCEMENT AND CHEMICAL RISK ANALYSIS

    EPA Science Inventory

    A major uncertainty that has long been recognized in evaluating chemical toxicity is accounting for metabolic activation of chemicals resulting in increased toxicity. In silico approaches to predict chemical metabolism and to subsequently screen and prioritize chemicals for risk ...

  13. JDD, Inc. Database

    NASA Technical Reports Server (NTRS)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide their clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). This company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and the contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues relating to job safety (Environmental Protection Agency issues, workers' compensation, and safety and health training). My summer assignment was not considered "groundbreaking research" like that of many other summer interns in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to the numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (the training field index, employees who were present at these training courses, and who was absent). Once I completed this phase of the database, I decided to expand it and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day-to-day operations and been adding the

  14. Carbon sink activity and GHG budget of managed European grasslands

    NASA Astrophysics Data System (ADS)

    Klumpp, Katja; Herfurth, Damien; Soussana, Jean-Francois; Fluxnet Grassland Pi's, European

    2013-04-01

    In agriculture, a large proportion (89%) of the greenhouse gas (GHG) emission saving potential may be achieved by means of soil C sequestration. Recent demonstrations of the carbon sink activity of European ecosystems, however, have often questioned the existence of C-storing grasslands: though a net sink of C was observed, the uncertainty surrounding this estimate was larger than the sink itself (Janssens et al., 2003; Schulze et al., 2009). Then again, some of these estimates were based on a small number of measurements and on models. Not surprisingly, there is still a paucity of studies demonstrating the existence of grassland systems where C sequestration would exceed (in CO2 equivalents) methane emissions from the enteric fermentation of ruminants and nitrous oxide emissions from managed soils. Grasslands are heavily relied upon for food and forage production. A key component of the carbon sink activity in grasslands is thus the impact of changes in management practices or the effects of past and recent management, such as intensification, as well as of climate (and its variation). We analysed data (i.e. flux, ecological, management, and soil organic carbon) from a network of 36 European grassland flux observation sites. These sites covered different types and intensities of management, and offered the opportunity to understand grassland carbon cycling and trade-offs between C sinks and CH4 and N2O emissions. For some sites, assessments of carbon sink activity were compared using two methods: repeated soil inventory, and determination of the ecosystem C budget by continuous measurement of CO2 exchange in combination with quantification of other C imports and exports (net C storage, NCS). In general, grasslands were a potential sink of C, with 60±12 g C /m2.yr (median; min -456; max 645). 
Grazed sites had a higher NCS compared to cut sites (median 99 vs 67 g C /m2.yr), while permanent grassland sites tended to have a lower NCS compared to temporary sown grasslands (median 64 vs

  15. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. PMID:23749755
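
The basic trade-off the unit reviews (flat files are simple but require a full scan per lookup, while indexed and relational systems pay an up-front cost to make retrieval fast) can be shown in a few lines. The strain names and phenotypes below are invented for illustration:

```python
# Flat-file style: every lookup scans all records in order.
records = [
    {"strain": "yfg1-del", "phenotype": "slow growth"},
    {"strain": "yfg2-del", "phenotype": "wild type"},
    {"strain": "yfg3-del", "phenotype": "lethal"},
]

def flat_lookup(strain):
    for rec in records:          # O(n) scan; fine for small collections
        if rec["strain"] == strain:
            return rec["phenotype"]
    return None

# Indexed style: build the index once, then each lookup is O(1).
index = {rec["strain"]: rec["phenotype"] for rec in records}
```

Relational and NoSQL systems generalize this indexing idea while adding querying, concurrency, and integrity guarantees, which is why the choice of DBMS depends on data size and access patterns.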

  16. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    ERIC Educational Resources Information Center

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  17. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  18. 78 FR 21118 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... Agency Information Collection Activities; Submission to the Office of Management and Budget for Review... includes all schools that are not supported primarily by public funds, that provide classroom instruction... Management Services, Office of Management. BILLING CODE 4000-01-P...

  19. Reportable Nuclide Criteria for ORNL Radioactive Waste Management Activities - 13005

    SciTech Connect

    McDowell, Kip; Forrester, Tim; Saunders, Mark

    2013-07-01

    The U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee generates numerous radioactive waste streams. Many of those streams contain a large number of radionuclides with an extremely broad range of concentrations. To feasibly manage the radionuclide information, ORNL developed reportable nuclide criteria to distinguish between those nuclides in a waste stream that require waste tracking versus those nuclides of such minimal activity that do not require tracking. The criteria include tracking thresholds drawn from ORNL onsite management requirements, transportation requirements, and relevant treatment and disposal facility acceptance criteria. As a management practice, ORNL maintains waste tracking on a nuclide in a specific waste stream if it exceeds any of the reportable nuclide criteria. Nuclides in a specific waste stream that screen out as non-reportable under all these criteria may be dropped from ORNL waste tracking. The benefit of these criteria is to ensure that nuclides in a waste stream with activities which meaningfully affect safety and compliance are tracked, while documenting the basis for removing certain isotopes from further consideration. (authors)
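
The screening rule described above (a nuclide stays tracked if it exceeds any applicable threshold, and may be dropped only if it screens out under all of them) can be sketched as follows. The threshold categories and values are hypothetical, not ORNL's actual criteria:

```python
# Illustrative tracking thresholds per criterion, in becquerels (invented values).
THRESHOLDS_BQ = {"onsite": 1.0e4, "transport": 5.0e4, "disposal": 2.0e4}

def is_reportable(activity_bq, thresholds=THRESHOLDS_BQ):
    """Track the nuclide if its activity exceeds ANY applicable threshold."""
    return any(activity_bq > limit for limit in thresholds.values())

tracked = is_reportable(3.0e4)       # exceeds the disposal criterion
dropped = not is_reportable(5.0e3)   # below all criteria, may be dropped
```

The conservative `any()` test mirrors the stated practice: a single exceeded criterion is enough to keep a nuclide in the waste-tracking records.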

  20. Reportable Nuclide Criteria for ORNL Waste Management Activities - 13005

    SciTech Connect

    McDowell, Kip; Forrester, Tim; Saunders, Mark Edward

    2013-01-01

    The U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee generates numerous radioactive waste streams. Many of those streams contain a large number of radionuclides with an extremely broad range of concentrations. To feasibly manage the radionuclide information, ORNL developed reportable nuclide criteria to distinguish between those nuclides in a waste stream that require waste tracking versus those nuclides of such minimal activity that do not require tracking. The criteria include tracking thresholds drawn from ORNL onsite management requirements, transportation requirements, and relevant treatment and disposal facility acceptance criteria. As a management practice, ORNL maintains waste tracking on a nuclide in a specific waste stream if it exceeds any of the reportable nuclide criteria. Nuclides in a specific waste stream that screen out as non-reportable under all these criteria may be dropped from ORNL waste tracking. The benefit of these criteria is to ensure that nuclides in a waste stream with activities which meaningfully affect safety and compliance are tracked, while documenting the basis for removing certain isotopes from further consideration.

  1. Waste management activities and carbon emissions in Africa

    SciTech Connect

    Couth, R.; Trois, C.

    2011-01-15

    This paper summarizes research into waste management activities and carbon emissions from territories in sub-Saharan Africa with the main objective of quantifying emission reductions (ERs) that can be gained through viable improvements to waste management in Africa. It demonstrates that data on waste and carbon emissions is poor and generally inadequate for prediction models. The paper shows that the amount of waste produced and its composition are linked to national Gross Domestic Product (GDP). Waste production per person is around half that in developed countries with a mean around 230 kg/hd/yr. Sub-Saharan territories produce waste with a biogenic carbon content of around 56% (+/-25%), which is approximately 40% greater than developed countries. This waste is disposed in uncontrolled dumps that produce large amounts of methane gas. Greenhouse gas (GHG) emissions from waste will rise with increasing urbanization and can only be controlled through funding mechanisms from developed countries.

  2. Pollution effects on fisheries — potential management activities

    NASA Astrophysics Data System (ADS)

    Sindermann, C. J.

    1980-03-01

    Management of ocean pollution must be based on the best available scientific information, with adequate consideration of economic, social, and political realities. Unfortunately, the best available scientific information about pollution effects on fisheries is often fragmentary, and often conjectural; therefore a primary concern of management should be a critical review and assessment of available factual information about effects of pollutants on fish and shellfish stocks. A major problem in any such review and assessment is the separation of pollutant effects from the effects of all the other environmental factors that influence survival and well-being of marine animals. Data from long-term monitoring of resource abundance, and from monitoring of all determinant environmental variables, will be required for analyses that lead to resolution of the problem. Information must also be acquired about fluxes of contaminants through resource-related ecosystems, and about contaminant effects on resource species as demonstrated in field and laboratory experiments. Other possible management activities include: (1) encouragement of continued efforts to document clearly the localized and general effects of pollution on living resources; (2) continued pressure to identify and use reliable biological indicators of environmental degradation (indicators of choice at present are: unusually high levels of genetic and other anomalies in the earliest life history stages; presence of pollution-associated disease signs, particularly fin erosion and ulcers, in fish; and biochemical/physiological changes); and (3) major efforts to reduce inputs of pollutants clearly demonstrated to be harmful to living resources, from point sources as well as ocean dumping. Such pollution management activities, based on continuous efforts in stock assessment, environmental assessment, and experimental studies, can help to ensure that rational decisions will be made about uses and abuses of coastal

  3. Automating The Work at The Skin and Allergy Private Clinic : A Case Study on Using an Imaging Database to Manage Patients Records

    NASA Astrophysics Data System (ADS)

    Alghalayini, Mohammad Abdulrahman

    Today, many institutions and organizations are facing a serious problem due to the tremendously increasing size and number of documents, which in turn creates storage and retrieval problems as space and efficiency requirements continuously grow. The problem becomes more complex over time and with the growth in the size and number of documents in an organization; therefore, there is a growing worldwide demand to address it. This demand and challenge can be met by converting the tremendous amount of paper documents to images, using a process that enables specialized document imaging staff to select the most suitable image type and scanning resolution whenever document images need to be stored. This document management process, if applied, addresses the problem of image storage type and size to some extent. In this paper, we present a case study describing an applied process to manage the registration of new patients in a private clinic and to optimize the follow-up of registered patients once their information records are stored in an imaging database system; through this automation approach, we optimize the work process and maximize the efficiency of the Skin and Allergy Clinic's tasks.

  4. Comparative analysis of benign prostatic hyperplasia management by urologists and nonurologists: A Korean nationwide health insurance database study

    PubMed Central

    Park, Juhyun; Lee, Young Ju; Lee, Jeong Woo; Yoo, Tag Keun; Chung, Jae Il; Yun, Seok-Joong; Hong, Jun Hyuk; Seo, Seong Il; Cho, Sung Yong

    2015-01-01

    Purpose To compare the current management of benign prostatic hyperplasia (BPH) by urologists and nonurologists by use of Korean nationwide health insurance data. Materials and Methods We obtained patient data from the national health insurance system. New patients diagnosed with BPH in 2009 were divided into two groups depending on whether they were diagnosed by a urologist (U group) or by a nonurologist (NU group). Results A total of 390,767 individuals were newly diagnosed with BPH in 2009. Of these, 240,907 patients (61.7%) were in the U group and 149,860 patients (38.3%) were in the NU group. The rate of all initial evaluation tests, except serum creatinine, was significantly lower in the NU group. The initial prescription rate was higher in the U group, whereas the prescription period was longer in the NU group. Regarding the initial drugs prescribed, the use of alpha-blockers was common in both groups. However, the U group was prescribed combination therapy of an alpha-blocker and 5-alpha-reductase inhibitor as the second choice, whereas the NU group received monotherapy with a 5-alpha-reductase inhibitor. During the 1-year follow-up, the incidence of surgery was significantly different between the U group and the NU group. Conclusions There are distinct differences in the diagnosis and treatment of BPH by urologists and nonurologists in Korea. These differences may have adverse consequences for BPH patients. Urological societies should take a leadership role in the management of BPH and play an educational role for nonurologists as well as urologists. PMID:25763128

  5. Energy management and control of active distribution systems

    NASA Astrophysics Data System (ADS)

    Shariatzadeh, Farshid

    Advancements in communication, control, computation, and information technologies have driven the transition to the next generation of active power distribution systems. Novel control techniques and management strategies are required to achieve an efficient, economic, and reliable grid. The focus of this work is energy management and control of active distribution systems (ADS) with integrated renewable energy sources (RESs) and demand response (DR). Here, ADS means an automated distribution system with remotely operated controllers and distributed energy resources (DERs). DERs, as the active part of the next-generation distribution system, include distributed generation (DG), RESs, energy storage systems (ESS), plug-in hybrid electric vehicles (PHEVs), and DR. Integration of DR and RESs into ADS is critical to realize the vision of sustainability. The objective of this dissertation is the development of a management architecture to control and operate ADS in the presence of DR and RES. One of the most challenging issues in operating ADS is the inherent uncertainty of DR and RES, as well as the conflicting objectives of DERs and electric utilities. An ADS can consist of different layers, such as a system layer and a building layer, and coordination between these layers is essential. To address these challenges, a multi-layer energy management and control architecture with robust algorithms is proposed in this work. The first layer of the proposed multi-layer architecture has been implemented at the system level: a developed AC optimal power flow (AC-OPF) generates a fair price for all DR and non-DR loads, which is used as a control signal for the second layer. The second layer controls DR loads at buildings using a developed look-ahead robust controller. A load aggregator collects information from all buildings and sends the aggregated load to the system optimizer. Because these two management layers operate on different time scales, a time coordination scheme is developed. Robust and deterministic controllers

  6. MANAGING ENGINEERING ACTIVITIES FOR THE PLATEAU REMEDIATION CONTRACT - HANFORD

    SciTech Connect

    KRONVALL CM

    2011-01-14

    In 2008, the primary Hanford clean-up contract transitioned to the CH2MHill Plateau Remediation Company (CHPRC). Prior to transition, Engineering resources assigned to remediation/Decontamination and Decommissioning (D&D) activities were part of a centralized engineering organization and matrixed to the performing projects. Following transition, these resources were reassigned directly to the performing project, with a loose matrix through a smaller Central Engineering (CE) organization. The smaller (10 FTE) central organization has retained responsibility for the overall technical quality of engineering for the CHPRC, but no longer performs staffing and personnel functions. As the organization has matured, lessons have been learned that can be shared with other organizations undergoing, or contemplating, a similar change. Benefits that have been seen from the CHPRC CE organization structure include the following: (1) Staff are closely aligned with the 'Project/facility' that they are assigned to support; (2) Engineering priorities are managed to be consistent with the 'Project/facility' priorities; (3) Individual Engineering managers are accountable for identifying staffing needs and filling staffing positions; (4) Budget priorities are managed within the local organization structure; (5) Rather than being considered a 'functional' organization, engineering is considered part of a line, direct-funded organization; (6) The central engineering organization is able to provide 'overview' activities and maintain independence from the engineering organizations in the field; and (7) The central engineering organization is able to maintain a stable of specialized experts who can provide independent reviews of field projects and day-to-day activities.

  7. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Senoo, Tetsuo

    As computer technology, communication technology and related fields have progressed, many corporations have come to place the construction and use of their own databases at the center of their information activities, aiming to develop those activities in new directions. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes what in-house databases should be constructed and used, from the viewpoints of the requirements to be met, the types and forms of information handled, indexing, type and frequency of use, evaluation methods and so on. The author outlines Matsushita's information system, MATIS (Matsushita Technical Information System), as a concrete example, and describes the present status of, and some points to keep in mind when, constructing and using the REP, BOOK and SYMP databases.

  8. Drinking Water Treatability Database (Database)

    EPA Science Inventory

    The Drinking Water Treatability Database (TDB) will provide data taken from the literature on the control of contaminants in drinking water, and will be housed on an interactive, publicly-available USEPA web site. It can be used for identifying effective treatment processes, rec...

  9. FishTraits Database

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2009-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Descriptions of attributes are available here. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.

  10. Database management research for the Human Genome Project. Final progress report for period: 02/01/99 - 06/14/00

    SciTech Connect

    Bult, Carol J.

    1999-11-01

    The MouseBLAST server allows researchers to search a sequence against mouse/rodent sequence databases to find matching sequences that may be associated with mouse genes. Query results may be linked to gene detail records in the Mouse Genome Database (MGD). Searches are performed using WU-BLAST 2.0. All sequence databases are updated on a weekly basis.

  11. Active Piezoelectric Structures for Tip Clearance Management Assessed

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Managing blade tip clearance in turbomachinery stages is critical to developing advanced subsonic propulsion systems. Active casing structures with embedded piezoelectric actuators appear to be a promising solution. They can control static and dynamic tip clearance, compensate for uneven deflections, and accomplish electromechanical coupling at the material level. In addition, they have a compact design. To assess the feasibility of this concept and assist the development of these novel structures, the NASA Lewis Research Center developed in-house computational capabilities for composite structures with piezoelectric actuators and sensors, and subsequently used them to simulate candidate active casing structures. The simulations indicated the potential of active casings to modify the blade tip clearance enough to improve stage efficiency. They also provided valuable design information, such as preliminary actuator configurations (number and location) and the corresponding voltage patterns required to compensate for uneven casing deformations. An active ovalization of a casing with four discrete piezoceramic actuators attached on the outer surface is shown. The center figure shows the predicted radial displacements along the hoop direction that are induced when electrostatic voltage is applied at the piezoceramic actuators. This work, which has demonstrated the capabilities of in-house computational models to analyze and design active casing structures, is expected to contribute toward the development of advanced subsonic engines.

  12. Volcanic disasters and incidents: A new database

    NASA Astrophysics Data System (ADS)

    Witham, C. S.

    2005-12-01

    A new database on human mortality and morbidity, and civil evacuations arising from volcanic activity is presented. The aim is to quantify the human impacts of volcanic phenomena during the 20th Century. Data include numbers of deaths, injuries, evacuees and people made homeless, and the nature of the associated volcanic phenomena. The database has been compiled from a wide range of sources, and discrepancies between these are indicated where they arise. The quality of the data varies according to the source and the impacts reported. Data for homelessness are particularly poor and effects from ashfall and injuries appear to be under-reported. Of the 491 events included in the database, ~53% resulted in deaths, although the total death toll of 91,724 is dominated by the disasters at Mt Pelée and Nevado del Ruiz. Pyroclastic density currents account for the largest proportion of deaths, and lahars for the most injuries incurred. The Philippines, Indonesia, and Southeast Asia, as a region, were the worst affected, and middle-income countries experienced greater human impacts than low or high-income countries. Compilation of the database has highlighted a number of problems with the completeness and accuracy of the existing CRED EM-DAT disaster database that includes volcanic events. This database is used by a range of organisations involved with risk management. The new database is intended as a resource for future analysis and will be made available via the Internet. It is hoped that it will be maintained and expanded.
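A record structure of the kind the abstract describes can be sketched with an in-memory SQLite table. The schema, field names, and sample rows are illustrative only (the death tolls are rough public figures, not values taken from the database itself):

```python
import sqlite3

# Hedged sketch of an event table for the impacts the abstract lists:
# deaths, injuries, evacuees, homelessness, and the causal phenomenon.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE volcanic_event (
        id         INTEGER PRIMARY KEY,
        volcano    TEXT,
        year       INTEGER,
        phenomenon TEXT,     -- e.g. pyroclastic density current, lahar, ashfall
        deaths     INTEGER,
        injuries   INTEGER,
        evacuees   INTEGER,
        homeless   INTEGER   -- often poorly reported, so may be NULL
    )""")

rows = [
    (1, "Mt Pelee",        1902, "pyroclastic density current", 28000, 0,    0,      None),
    (2, "Nevado del Ruiz", 1985, "lahar",                       23000, 4500, 0,      None),
    (3, "Pinatubo",        1991, "ashfall",                     300,   100,  200000, None),
]
conn.executemany("INSERT INTO volcanic_event VALUES (?,?,?,?,?,?,?,?)", rows)

# Deaths grouped by phenomenon, mirroring the aggregate analysis in the text
# (pyroclastic density currents dominating the death toll).
deaths_by_phenomenon = dict(conn.execute(
    "SELECT phenomenon, SUM(deaths) FROM volcanic_event GROUP BY phenomenon"))
```

A NULLable `homeless` column reflects the abstract's observation that homelessness data are particularly poor.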

  13. The need for and provision of intrathecal baclofen therapy for the management of spasticity in England: an assessment of the Hospital Episode Statistics database

    PubMed Central

    Narendran, Rajesh C; Duarte, Rui V; Valyi, Andrea; Eldabe, Sam

    2015-01-01

    Objectives The aim of this study was to evaluate changes in the uptake of intrathecal baclofen (ITB) following commissioning of this therapy by the National Health Service (NHS) England in April 2013. The specific objectives of this study were: (i) to explore the gap between the need for and the actual provision of ITB services; and (ii) to compare England figures with other European countries with comparable data available. Setting Data for ITB-related procedures were obtained from the Hospital Episode Statistics (HES) database from 2009/2010 to 2013/2014. Participants Patients receiving ITB for the management of spasticity. Results The available data for implantation of ITB from 2009/2010 to 2013/2014 for the treatment of spasticity due to varied aetiologies show that there has not been an increase in uptake of this therapy. The estimated need for this treatment based on the incidence and prevalence of conditions susceptible to ITB therapy is between 4.6 and 5.7 per million population. Our analysis of the data available from the HES database showed that the actual number of implants is around 3.0 per million population. The same period 2009–2014 has seen an increase in the delivery of other neuromodulation techniques including spinal cord stimulation, deep brain stimulation and sacral nerve stimulation. Conclusions There is a considerable gap nationally between the need for and the provision of ITB. Additionally, within the same area, we have observed important differences in ITB service delivery between the various trusts. The reasons for this can be multifactorial, including individual experience and opinions, organisational structures, and resource and financial limitations. Further research analysing the efficacy and cost-effectiveness of this treatment in the UK might inform the development of Technology Appraisal Guidance for ITB, potentially leading to an improvement in service provision. PMID:26129634

  14. Research Data Management and Libraries: Relationships, Activities, Drivers and Influences

    PubMed Central

    Pinfield, Stephen; Cox, Andrew M.; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a ‘jurisdictional’ driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. Institutions may usefully benchmark their activities against

  15. Research data management and libraries: relationships, activities, drivers and influences.

    PubMed

    Pinfield, Stephen; Cox, Andrew M; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a 'jurisdictional' driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. Institutions may usefully benchmark their activities against the

  16. Activation of AMP-activated kinase as a strategy for managing autosomal dominant polycystic kidney disease.

    PubMed

    McCarty, Mark F; Barroso-Aranda, Jorge; Contreras, Francisco

    2009-12-01

    There is evidence that overactivity of both mammalian target of rapamycin (mTOR) and cystic fibrosis transmembrane conductance regulator (CFTR) contributes importantly to the progressive expansion of renal cysts in autosomal dominant polycystic kidney disease (ADPKD). Recent research has established that AMP-activated kinase (AMPK) can suppress the activity of each of these proteins. Clinical AMPK activators such as metformin and berberine may thus have potential in the clinical management of ADPKD. The traditional use of berberine in diarrhea associated with bacterial infections may reflect, in part, the inhibitory impact of AMPK on chloride extrusion by small intestinal enterocytes. PMID:19570618

  17. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    NASA Astrophysics Data System (ADS)

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A. F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.
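The "status" approach the protocols describe (tracking presence-absence of a phenophase per visit, plus an intensity measure) can be sketched as a simple record type. The field names, intensity categories, and sample observations below are illustrative, not the USA National Phenology Network's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhenophaseObservation:
    """One site visit's status record for one phenophase on one organism."""
    individual_id: str       # the monitored plant or animal population
    phenophase: str          # e.g. "open flowers"
    date: str                # ISO date of the visit
    status: Optional[bool]   # True = present, False = absent, None = not assessed
    intensity: Optional[str] = None  # e.g. "5-24%", recorded only when present

def onset_date(observations, phenophase):
    """First date the phenophase was recorded as present -- the kind of
    derived metric that repeat status observations make possible."""
    dates = sorted(o.date for o in observations
                   if o.phenophase == phenophase and o.status is True)
    return dates[0] if dates else None

obs = [
    PhenophaseObservation("maple-01", "open flowers", "2014-04-02", False),
    PhenophaseObservation("maple-01", "open flowers", "2014-04-09", True, "5-24%"),
    PhenophaseObservation("maple-01", "open flowers", "2014-04-16", True, "25-49%"),
]
first_flower = onset_date(obs, "open flowers")
```

Recording explicit absences (not just onsets) is what allows status data to be bounded in time and integrated with historical phenology data sets.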

  18. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    USGS Publications Warehouse

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A.F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-01-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species’ phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological “status”, or the ability to track presence–absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.

  19. Data Extraction and Management in Networks of Observational Health Care Databases for Scientific Research: A Comparison of EU-ADR, OMOP, Mini-Sentinel and MATRICE Strategies

    PubMed Central

    Gini, Rosa; Schuemie, Martijn; Brown, Jeffrey; Ryan, Patrick; Vacchi, Edoardo; Coppola, Massimo; Cazzola, Walter; Coloma, Preciosa; Berni, Roberto; Diallo, Gayo; Oliveira, José Luis; Avillach, Paul; Trifirò, Gianluca; Rijnbeek, Peter; Bellentani, Mariadonata; van Der Lei, Johan; Klazinga, Niek; Sturkenboom, Miriam

    2016-01-01

    Introduction: We see increased use of existing observational data in order to achieve fast and transparent production of empirical evidence in health care research. Multiple databases are often used to increase power, to assess rare exposures or outcomes, or to study diverse populations. For privacy and sociological reasons, original data on individual subjects can’t be shared, requiring a distributed network approach where data processing is performed prior to data sharing. Case Descriptions and Variation Among Sites: We created a conceptual framework distinguishing three steps in local data processing: (1) data reorganization into a data structure common across the network; (2) derivation of study variables not present in original data; and (3) application of study design to transform longitudinal data into aggregated data sets for statistical analysis. We applied this framework to four case studies to identify similarities and differences in the United States and Europe: Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR), Observational Medical Outcomes Partnership (OMOP), the Food and Drug Administration’s (FDA’s) Mini-Sentinel, and the Italian network—the Integration of Content Management Information on the Territory of Patients with Complex Diseases or with Chronic Conditions (MATRICE). Findings: National networks (OMOP, Mini-Sentinel, MATRICE) all adopted shared procedures for local data reorganization. The multinational EU-ADR network needed locally defined procedures to reorganize its heterogeneous data into a common structure. Derivation of new data elements was centrally defined in all networks but the procedure was not shared in EU-ADR. Application of study design was a common and shared procedure in all the case studies. Computer procedures were embodied in different programming languages, including SAS, R, SQL, Java, and C++. Conclusion: Using our conceptual framework
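The three local processing steps the framework distinguishes can be sketched as a small pipeline. The common data model, the exposure codes, and the aggregation rule below are all invented for illustration; none of the four networks' actual data models or procedures are reproduced here:

```python
# Step 1: reorganize heterogeneous source rows into a structure common
# across the network (a stand-in for a common data model).
def to_common_model(source_row):
    return {"person_id": source_row["pid"],
            "event": source_row["code"],
            "date": source_row["when"]}

# Step 2: derive a study variable not present in the original data,
# here a per-person exposure flag.
def derive_exposed(records, exposure_codes):
    return {r["person_id"] for r in records if r["event"] in exposure_codes}

# Step 3: apply the study design, turning longitudinal individual-level
# records into an aggregated data set that can be shared for analysis
# without releasing original data on individual subjects.
def aggregate(records, exposed):
    outcomes = [r for r in records if r["event"] == "OUTCOME"]
    n_exposed = sum(1 for r in outcomes if r["person_id"] in exposed)
    return {"outcomes_exposed": n_exposed,
            "outcomes_unexposed": len(outcomes) - n_exposed}

raw = [{"pid": 1, "code": "DRUG_A", "when": "2015-01-01"},
       {"pid": 1, "code": "OUTCOME", "when": "2015-02-01"},
       {"pid": 2, "code": "OUTCOME", "when": "2015-03-01"}]
records = [to_common_model(r) for r in raw]
summary = aggregate(records, derive_exposed(records, {"DRUG_A"}))
```

Only the output of step 3 leaves the local site, which is the distributed-network property the case studies share.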

  20. Simulated Medication Therapy Management Activities in a Pharmacotherapy Laboratory Course

    PubMed Central

    Thorpe, Joshua M.; Trapskin, Kari

    2011-01-01

    Objective. To measure the impact of medication therapy management (MTM) learning activities on students’ confidence and intention to provide MTM using the Theory of Planned Behavior. Design. An MTM curriculum combining lecture instruction and active-learning strategies was incorporated into a required pharmacotherapy laboratory course. Assessment. A validated survey instrument was developed to evaluate student confidence and intent to engage in MTM services using the domains comprising the Theory of Planned Behavior. Confidence scores improved significantly from baseline for all items (p < 0.001), including identification of billable services, documentation, and electronic billing. Mean scores improved significantly for all Theory of Planned Behavior items within the constructs of perceived behavioral control and subjective norms (p < 0.05). At baseline, 42% of students agreed or strongly agreed that they had knowledge and skills to provide MTM. This percentage increased to 82% following completion of the laboratory activities. Conclusion. Implementation of simulated MTM activities in a pharmacotherapy laboratory significantly increased knowledge scores, confidence measures, and scores on Theory of Planned Behavior constructs related to perceived behavioral control and subjective norms. Despite these improvements, intention to engage in future MTM services remained unchanged. PMID:21829269

  1. Annual Review of Database Developments 1991.

    ERIC Educational Resources Information Center

    Basch, Reva

    1991-01-01

    Review of developments in databases highlights a new emphasis on accessibility. Topics discussed include the internationalization of databases; databases that deal with finance, drugs, and toxic waste; access to public records, both personal and corporate; media online; reducing large files of data to smaller, more manageable files; and…

  2. Advanced Query Formulation in Deductive Databases.

    ERIC Educational Resources Information Center

    Niemi, Timo; Jarvelin, Kalervo

    1992-01-01

    Discusses deductive databases and database management systems (DBMS) and introduces a framework for advanced query formulation for end users. Recursive processing is described, a sample extensional database is presented, query types are explained, and criteria for advanced query formulation from the end user's viewpoint are examined. (31…

  3. Development of soybean gene expression database (SGED)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Large volumes of microarray expression data are a challenge to analyze. To address this problem, a web-based database, the Soybean Gene Expression Database (SGED), was built using PERL/CGI, C and an ORACLE database management system. SGED contains three components. The Data Mining component serves as a repos...

  4. Management practices that concentrate visitor activities: Camping impact management at Isle Royale National Park, USA

    USGS Publications Warehouse

    Marion, J.L.; Farrell, T.A.

    2002-01-01

    This study assessed campsite conditions and the effectiveness of campsite impact management strategies at Isle Royale National Park, USA. Protocols for assessing indicators of vegetation and soil conditions were developed and applied to 156 campsites and 88 shelters within 36 backcountry campgrounds. The average site was 68 m2 and 83% of sites lost vegetation over areas less than 47 m2. Results reveal that management actions to spatially concentrate camping activities and reduce camping disturbance have been highly successful. Comparisons of disturbed area/overnight stay among other protected areas reinforce this assertion. These reductions in area of camping disturbance are attributed to a designated site camping policy, limitation on site numbers, construction of sites in sloping terrain, use of facilities, and an ongoing program of campsite maintenance. Such actions are most appropriate in higher use backcountry and wilderness settings.

  5. Management practices that concentrate visitor activities: camping impact management at Isle Royale National Park, USA.

    PubMed

    Marion, Jeffrey L; Farrell, Tracy A

    2002-10-01

    This study assessed campsite conditions and the effectiveness of campsite impact management strategies at Isle Royale National Park, USA. Protocols for assessing indicators of vegetation and soil conditions were developed and applied to 156 campsites and 88 shelters within 36 backcountry campgrounds. The average site was 68 m2 and 83% of sites lost vegetation over areas less than 47 m2. We believe that management actions implemented to spatially concentrate camping activities and reduce camping disturbance have been highly successful. Comparisons of disturbed area/overnight stay among other protected areas reinforce this assertion. These reductions in area of camping disturbance are attributed to a designated site camping policy, limitation on site numbers, construction of sites in sloping terrain, use of facilities, and an ongoing program of campsite maintenance. Such actions are most appropriate in higher use backcountry and wilderness settings. PMID:12418164

  6. Tracing thyroid hormone-disrupting compounds: database compilation and structure-activity evaluation for an effect-directed analysis of sediment.

    PubMed

    Weiss, Jana M; Andersson, Patrik L; Zhang, Jin; Simon, Eszter; Leonards, Pim E G; Hamers, Timo; Lamoree, Marja H

    2015-07-01

    A variety of anthropogenic compounds has been found to be capable of disrupting the endocrine systems of organisms, in laboratory studies as well as in wildlife. The most widely described endpoint is estrogenicity, but other hormonal disturbances, e.g., thyroid hormone disruption, are gaining more and more attention. Here, we present a review and chemical characterization, using principal component analysis, of organic compounds that have been tested for their capacity to bind competitively to the thyroid hormone transport protein transthyretin (TTR). The database contains 250 individual compounds and technical mixtures, of which 144 compounds are defined as TTR binders. Almost one third of these compounds (n = 52) were even more potent than the natural hormone thyroxine (T4). The database was used as a tool to assist in the identification of thyroid hormone-disrupting compounds (THDCs) in an effect-directed analysis (EDA) study of a sediment sample. Two compounds could be confirmed to contribute to the detected TTR-binding potency in the sediment sample, i.e., triclosan and nonylphenol technical mixture. They constituted less than 1% of the TTR-binding potency of the unfractionated extract. The low rate of explained activity may be attributed to the challenges related to identification of unknown contaminants in combination with the limited knowledge about THDCs in general. This study demonstrates the need for databases containing compound-specific toxicological properties. In the framework of EDA, such a database could be used to assist in the identification and confirmation of causative compounds focusing on thyroid hormone disruption. PMID:25986900
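The potency bookkeeping implied by the abstract (potency relative to the natural ligand T4, and the subset of binders exceeding it) can be sketched as below. Compound names and potency values are invented for illustration and are not entries from the database:

```python
T4_POTENCY = 1.0  # reference: the natural hormone thyroxine (T4)

# Hypothetical relative TTR-binding potencies (1.0 = as potent as T4).
compounds = {
    "compound_A": 2.5,   # more potent than T4
    "compound_B": 0.4,
    "compound_C": 1.8,   # more potent than T4
    "compound_D": 0.0,   # non-binder
}

def ttr_binders(db, threshold=0.0):
    """Compounds with any measurable TTR-binding potency."""
    return {name for name, p in db.items() if p > threshold}

def more_potent_than_t4(db):
    """The subset the review highlights: binders exceeding T4 itself."""
    return {name for name, p in db.items() if p > T4_POTENCY}

binders = ttr_binders(compounds)
super_potent = more_potent_than_t4(compounds)
```

In an effect-directed analysis, such a lookup table is consulted after fractionation to check whether identified compounds can account for the measured TTR-binding potency of a sample.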

  7. Chemical Explosion Database

    NASA Astrophysics Data System (ADS)

    Johansson, Peder; Brachet, Nicolas

    2010-05-01

    A database containing information on chemical explosions, recorded and located by the International Data Center (IDC) of the CTBTO, should be established in the IDC prior to entry into force of the CTBT. Nearly all of the large chemical explosions occur in connection with mining activity. As a first step towards the establishment of this database, a survey has been carried out of presumed mining areas where sufficiently large explosions are conducted. This is dominated by the large coal mining areas like the Powder River (U.S.), Kuznetsk (Russia), Bowen (Australia) and Ekibastuz (Kazakhstan) basins. There are also several other smaller mining areas, in e.g. Scandinavia, Poland, Kazakhstan and Australia, with large enough explosions for detection. Events in the Reviewed Event Bulletin (REB) of the IDC that are located in or close to these mining areas, and which therefore are candidates for inclusion in the database, have been investigated. A comparison with a database of infrasound events has been made, as many mining blasts generate strong infrasound signals and are therefore also included in the infrasound database. Currently there are 66 such REB events in 18 mining areas in the infrasound database. Several hundred events in mining areas are recorded and included in the REB each year. Establishment of the database of chemical explosions requires confirmation and ground truth information from the States Parties regarding these events. For an explosion reported in the REB, the appropriate authority in whose country the explosion occurred is encouraged, on a voluntary basis, to seek out information on the explosion and communicate this information to the IDC.
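The screening step described above (flagging REB events located in or close to known mining areas) can be sketched with a simple bounding-box test. The box coordinates and event locations are invented for illustration; real screening would use proper area polygons and account for location uncertainty:

```python
# Hypothetical bounding boxes for two of the mining areas named in the
# abstract: (lat_min, lat_max, lon_min, lon_max). Approximate and
# illustrative only.
MINING_AREAS = {
    "Powder River Basin": (42.0, 46.0, -107.0, -104.0),
    "Ekibastuz Basin":    (51.0, 52.5, 74.5, 76.0),
}

def candidate_areas(lat, lon, areas=MINING_AREAS):
    """Mining areas whose bounding box contains the event location,
    marking the event as a candidate for the chemical explosion database."""
    return [name for name, (lat_lo, lat_hi, lon_lo, lon_hi) in areas.items()
            if lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi]

# An REB event located in northeastern Wyoming would be flagged...
hits = candidate_areas(43.9, -105.3)
# ...while a mid-Atlantic event would not.
misses = candidate_areas(0.0, -30.0)
```

Cross-checking flagged events against the infrasound event database, as the abstract describes, would then strengthen the case that a given event is a mining blast.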

  8. Cross exploitation of geo-databases and earth observation data for stakes characterization in the framework of multi-risk analysis and management: RASOR examples

    NASA Astrophysics Data System (ADS)

    Tholey, Nadine; Yesou, Herve; Maxant, Jerome; Montabord, Myldred; Studer, Mathias; Faivre, Robin; Rudari, Roberto; de Fraipont, Paul

    2016-04-01

    In the context of risk analysis and management, information is needed on the landscape under investigation, especially for vulnerability assessment purposes, where landuse and stakes characterization is of prime importance for the knowledge and description of exposure elements in modelling scenarios. Such thematic information over at-risk areas can be extracted from available global, regional or local scale open-source databases (e.g. ESA-Globcover, Natural Earth, Copernicus core services, OSM, …) or derived from the exploitation of EO satellite images at high and very high spatial resolution (e.g. SPOT, soon Sentinel2, Pleiades, WorldView, …) over territories where this type of information is not available or not sufficiently up to date. However, EO data processing and the derived results also highlight the gap between what would be needed for a complete representation of vulnerability, i.e. a functional description of the land use and a structural description of the buildings including their functional use, and what is reasonably accessible by exploiting EO data, i.e. a biophysical description of the land cover at different spatial resolutions, from decametric scales to sub-metric ones, especially for urban block and building information. The potential and limits of this multi-scale, multi-source geo-information will be illustrated by examples covering different types of landscape and urban settlements in Asia (Indonesia), Europe (Greece), and the Caribbean (Haiti), exploited within the framework of the RASOR (Rapid Analysis and Spatialisation Of Risk) project (European Commission FP7), which is developing a platform to perform multi-hazard risk analysis to support the full cycle of disaster management.

  9. Hidradenitis Suppurativa Management in the United States: An Analysis of the National Ambulatory Medical Care Survey and MarketScan Medicaid Databases

    PubMed Central

    Davis, Scott A.; Lin, Hsien-Chang; Balkrishnan, Rajesh; Feldman, Steven R.

    2015-01-01

    Purpose To present nationally representative data demonstrating how frequently hidradenitis suppurativa (HS) occurs in specific groups and how it is currently managed. Methods We analyzed data from the 1990–2009 National Ambulatory Medical Care Survey (NAMCS) and the 2003–2007 MarketScan Medicaid databases for patients with a diagnosis of HS (ICD-9-CM code 705.83). Visits per 100,000 population of each race and ethnicity were calculated using the 2000 US Census data for specific demographics. Results There were 164,000 patient visits (95% CI: 128,000–200,000) annually with a diagnosis of HS in the NAMCS, and 17,270 HS patients were found in the MarketScan Medicaid database over the 5-year period. Antibiotics were the most common treatment, followed by pain medications, topical steroids, and isotretinoin. Prescriptions of biologics and systemic methotrexate, cyclosporine, and acitretin were not observed in the NAMCS. Physicians prescribed medications in 74% of visits and used procedures in 11% of visits. African Americans, females, and young adults had higher numbers of visits for HS. Conclusions Our data, showing a maximum of 0.06% of the population being treated for HS in a given year, are consistent with the low estimates of HS prevalence. Compared to current prescribing patterns, more frequent prescription of biologics and systemic treatments may yield better outcomes. PMID:27172455

  10. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is an automated data-management technology and the most efficient tool for managing data. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. We then present the workflow for establishing a database, inputting data, verifying data, and managing the database. By discussing the application of databases in clinical research, we illustrate their important role in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research. PMID:27215009
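The establish-input-verify-manage workflow described above can be sketched with an embedded relational database. This is a minimal illustration assuming SQLite; the table, fields and value ranges are invented for the example, not taken from the article.

```python
import sqlite3

# Establish: create the database and define the schema, with CHECK
# constraints that will later serve as the verification step.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE patient (
        patient_id INTEGER PRIMARY KEY,
        age        INTEGER NOT NULL CHECK (age BETWEEN 0 AND 120),
        group_arm  TEXT    NOT NULL CHECK (group_arm IN ('treatment', 'control'))
    )
""")

# Input: valid records are accepted.
con.executemany("INSERT INTO patient VALUES (?, ?, ?)",
                [(1, 54, "treatment"), (2, 61, "control")])

# Verify: an out-of-range record is rejected at entry time.
try:
    con.execute("INSERT INTO patient VALUES (3, 300, 'treatment')")
except sqlite3.IntegrityError:
    pass  # bad record refused by the CHECK constraint

# Manage: query the curated data.
count = con.execute("SELECT COUNT(*) FROM patient").fetchone()[0]
```

Encoding range and category checks as constraints keeps data verification inside the database itself rather than in ad-hoc scripts.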

  11. The AMMA database

    NASA Astrophysics Data System (ADS)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, intensive use of satellite data and diverse modelling studies. The AMMA database therefore aims at storing a great amount and a large variety of data, and at providing the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data in West Africa from 1850 (operational networks and previous scientific programs); - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analysis and forecasts, and from research simulations, processed in the same way as the satellite products. Before accessing the data, every user has to sign the AMMA data and publication policy. This chart only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Collaboration between data producers and users, and mention of the AMMA project in any publication, is also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris and OMP, Toulouse). Users can access data of both data centres through a unique web portal. This website is composed of different modules: - Registration: forms to register, read and sign the data use chart when a user visits for the first time; - Data access interface: a user-friendly tool for building a data extraction request by selecting various criteria such as location, time, parameters... The request can
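The criteria-based extraction request described above can be sketched as a filter over a data catalogue. The catalogue records, parameter names and coordinates below are invented placeholders, not actual AMMA holdings.

```python
from datetime import date

# Hypothetical in-memory catalogue; the real portal queries the database
# centres' holdings instead.
CATALOGUE = [
    {"param": "rainfall", "lat": 13.5, "lon": 2.1,   "date": date(2006, 7, 15)},
    {"param": "rainfall", "lat": 45.0, "lon": 4.9,   "date": date(2006, 7, 16)},
    {"param": "sst",      "lat": 5.0,  "lon": -10.0, "date": date(2006, 8, 1)},
]

def extract(param, lat_range, lon_range, start, end):
    """Select records matching parameter, bounding box and time window."""
    return [r for r in CATALOGUE
            if r["param"] == param
            and lat_range[0] <= r["lat"] <= lat_range[1]
            and lon_range[0] <= r["lon"] <= lon_range[1]
            and start <= r["date"] <= end]
```

A request for July 2006 rainfall over a West African box would select only the first record.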

  12. Active Management of Flap-Edge Trailing Vortices

    NASA Technical Reports Server (NTRS)

    Greenblatt, David; Yao, Chung-Sheng; Vey, Stefan; Paschereit, Oliver C.; Meyer, Robert

    2008-01-01

    The vortex hazard produced by large airliners, and by increasingly larger airliners entering service, combined with projected rapid increases in the demand for air transportation, is expected to act as a major impediment to increased air traffic capacity. Significant reduction in the vortex hazard is possible, however, by employing active vortex alleviation techniques that reduce the wake severity by dynamically modifying its vortex characteristics, provided that the techniques do not degrade performance or compromise safety and ride quality. With this as background, a series of experiments was performed, initially at NASA Langley Research Center and subsequently at the Berlin University of Technology in collaboration with the German Aerospace Center. The investigations demonstrated the basic mechanism for managing trailing vortices using retrofitted devices that are decoupled from conventional control surfaces. The basic premise for managing vortices advanced here is rooted in the erstwhile forgotten hypothesis of Albert Betz, as extended and verified ingeniously by Coleman duPont Donaldson and his collaborators. Using these devices, vortices may be perturbed at wavelengths ranging from arbitrarily long down to less than a typical airliner wingspan, while the oscillatory loads on the wings, and hence the vehicle, remain small. Significant flexibility in the specific device has been demonstrated using local passive and active separation control as well as local circulation control via Gurney flaps. The method is now in a position to be tested in a wind tunnel with a longer test section on a scaled airliner configuration. Alternatively, the method can be tested directly in a towing tank, on a model aircraft, a light aircraft or a full-scale airliner. The authors believe that this method will have significant appeal from an industry perspective due to its retrofit potential with little to no impact on cruise (devices tucked away in the cove or retracted); low operating power

  13. The CEBAF Element Database

    SciTech Connect

    Theodore Larrieu, Christopher Slominski, Michele Joyce

    2011-03-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on the fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from the original C++ source code into native libraries for scripting languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous.
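An introspective schema of the kind described, where new element types and properties require only new rows rather than new columns, is commonly realized as an entity-attribute-value layout. The sketch below illustrates the idea with SQLite; the table and element names are invented and are not the CED's actual schema.

```python
import sqlite3

# Entity-attribute-value layout: types, properties and values are all
# rows, so defining a new property never requires ALTER TABLE.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE element_type (type_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE property     (prop_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE element      (elem_id INTEGER PRIMARY KEY,
                               type_id INTEGER REFERENCES element_type,
                               name TEXT);
    CREATE TABLE prop_value   (elem_id INTEGER REFERENCES element,
                               prop_id INTEGER REFERENCES property,
                               val TEXT);
""")
con.execute("INSERT INTO element_type VALUES (1, 'Quadrupole')")
con.execute("INSERT INTO property VALUES (1, 'Length_m')")
con.execute("INSERT INTO element VALUES (1, 1, 'MQA1S01')")
con.execute("INSERT INTO prop_value VALUES (1, 1, '0.3')")

# Adding a brand-new property later is just another INSERT.
con.execute("INSERT INTO property VALUES (2, 'GradientLimit')")

row = con.execute("""
    SELECT e.name, p.name, v.val
    FROM prop_value v
    JOIN element  e ON e.elem_id = v.elem_id
    JOIN property p ON p.prop_id = v.prop_id
""").fetchone()
```

The trade-off of this layout is that queries must join through the value table, which is why a well-documented API layer, as the CED provides, is valuable.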

  14. LHCb distributed conditions database

    NASA Astrophysics Data System (ADS)

    Clemencic, M.

    2008-07-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which run on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored in the LFC (LCG File Catalog) and managed with the interface provided by the LCG-developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications have been using the Conditions Database framework in production since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions-handling functionality) and the distribution framework itself. Stress tests on the CNAF-hosted replica of the Conditions Database have been performed and the results will be summarized here.
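The replica-selection idea above, accessing the conditions replica co-located with the event data, can be sketched as a lookup from site name to connection string with a fallback. The site names, connection strings and default below are illustrative assumptions, not the actual LFC contents.

```python
# Hypothetical mapping from Grid site to its conditions-database replica;
# the real system reads this information from the LFC via CORAL.
REPLICAS = {
    "CERN": "oracle://cern/conditions",
    "CNAF": "oracle://cnaf/conditions",
    "PIC":  "oracle://pic/conditions",
}

def replica_for(site, default="CERN"):
    """Replica at the job's site if one exists, else a default replica."""
    return REPLICAS.get(site, REPLICAS[default])
```

Restricting job submission to sites present in this mapping is the natural complement: a job only runs where its conditions are locally available.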

  15. The magnet components database system

    SciTech Connect

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs.
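The configuration-tracking idea above, recording which component went into which magnet so performance can be correlated with constituent properties, amounts to a join across component tables. The schema and values below are illustrative, not MagCom's (and the sketch uses SQLite rather than Sybase).

```python
import sqlite3

# Minimal sketch: one measured property per cable, one test result per
# magnet, linked by a foreign key recording which cable was used.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE cable  (cable_id  TEXT PRIMARY KEY,
                         critical_current_a REAL);
    CREATE TABLE magnet (magnet_id TEXT PRIMARY KEY,
                         cable_id  TEXT REFERENCES cable,
                         quench_field_t REAL);
""")
con.execute("INSERT INTO cable VALUES ('C-001', 9800.0)")
con.execute("INSERT INTO magnet VALUES ('DM-17', 'C-001', 6.6)")

# Correlate magnet performance with the measured cable property.
row = con.execute("""
    SELECT m.magnet_id, c.critical_current_a, m.quench_field_t
    FROM magnet m JOIN cable c ON c.cable_id = m.cable_id
""").fetchone()
```

Over many magnets, the same join feeds a regression of quench performance against constituent properties, which is the correlation the abstract describes.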

  16. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

    Background Gene expression databases are key resources for microarray data management and analysis, and the importance of proper annotation of their content is well understood. Public repositories as well as microarray database systems that can be implemented by single laboratories exist. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common, highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic cell and macrophage functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object-based MIAME-compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamic definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows precise control of the visibility of the database content to different subgroups in the community and facilitates export of its content to public repositories. It provides an interactive user interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of the genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analysing it. This fills a gap between a local
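The per-subgroup visibility control described above can be sketched as a check of a user's groups against the groups an experiment is shared with. The group and experiment names below are invented for illustration, not Genopolis identifiers.

```python
# Hypothetical sharing table: experiment -> set of groups allowed to see it.
VISIBILITY = {
    "exp_dc_lps":     {"dc_group", "admin"},
    "exp_macrophage": {"mac_group", "admin"},
}

def can_view(user_groups, experiment):
    """True if any of the user's groups may see the experiment."""
    return bool(VISIBILITY.get(experiment, set()) & set(user_groups))
```

Export to a public repository then simply means adding a public group to an experiment's visibility set once its authors release it.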

  17. 17 CFR 240.3b-15 - Definition of ancillary portfolio management securities activities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of incidental trading activities for portfolio management purposes; and (3) Are limited to risk... governing body of the dealer and included in the internal risk management control system for the dealer... portfolio management securities activities. 240.3b-15 Section 240.3b-15 Commodity and Securities...

  18. 17 CFR 240.3b-15 - Definition of ancillary portfolio management securities activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of incidental trading activities for portfolio management purposes; and (3) Are limited to risk... governing body of the dealer and included in the internal risk management control system for the dealer... portfolio management securities activities. 240.3b-15 Section 240.3b-15 Commodity and Securities...

  19. 17 CFR 240.3b-15 - Definition of ancillary portfolio management securities activities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of incidental trading activities for portfolio management purposes; and (3) Are limited to risk... governing body of the dealer and included in the internal risk management control system for the dealer... portfolio management securities activities. 240.3b-15 Section 240.3b-15 Commodity and Securities...

  20. 15 CFR 930.38 - Consistency determinations for activities initiated prior to management program approval.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT FEDERAL CONSISTENCY WITH APPROVED COASTAL MANAGEMENT PROGRAMS Consistency for Federal Agency Activities § 930.38 Consistency determinations for... activities initiated prior to management program approval. 930.38 Section 930.38 Commerce and Foreign...