Science.gov

Sample records for active database management

  1. Database Manager

    ERIC Educational Resources Information Center

    Martin, Andrew

    2010-01-01

    It is normal practice today for organizations to store large quantities of records of related information as computer-based files or databases. Purposeful information is retrieved by performing queries on the data sets. The purpose of DATABASE MANAGER is to communicate to students the method by which the computer performs these queries. This…

  2. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  3. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  4. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff of the NASA system they had originally created: a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second-largest-selling line of microcomputer database management software in the world.

  5. TWRS technical baseline database manager definition document

    SciTech Connect

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  6. Requirements Management Database

    2009-08-13

    This application is a simplified and customized version of the RBA and CTS databases. It captures federal, site, and facility requirements and links them to the actions that must be performed to maintain compliance with contractual and other requirements.

  7. Database Design for Preservation Project Management: The California Newspaper Project.

    ERIC Educational Resources Information Center

    Hayman, Lynne M.

    1997-01-01

    Describes a database designed to manage a serials preservation project in which issues from multiple repositories are gathered and collated for preservation microfilming. Management information, added to bibliographic and holdings records, supports the production of reports tracking preservation activity. (Author)

  8. Construction of file database management

    SciTech Connect

    Merrill, Kyle J.

    2000-03-01

    This work created a database for tracking data analysis files from multiple lab techniques and equipment stored on a central file server. Experimental details appropriate for each file type are pulled from the file header and stored in a searchable database. The database also stores specific location and self-directory structure for each data file. Queries can be run on the database according to file type, sample type or other experimental parameters. The database was constructed in Microsoft Access and Visual Basic was used for extraction of information from the file header.
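
    The indexing pattern this abstract describes (pulling experimental details from a file header into a searchable table, then querying by file type or other parameters) can be sketched in a few lines. The sketch below uses SQLite and an invented key=value header format purely for illustration; the original system used Microsoft Access with Visual Basic for header extraction.

    ```python
    import sqlite3

    # In-memory catalog of data files; the schema fields (file_type,
    # sample_type, instrument) are illustrative, not the original Access schema.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE data_files (
        path TEXT PRIMARY KEY,
        file_type TEXT,
        sample_type TEXT,
        instrument TEXT)""")

    def index_file(path, header_line):
        # Hypothetical header format: "type=XRD;sample=alloy-7;instrument=D8"
        fields = dict(kv.split("=") for kv in header_line.split(";"))
        con.execute("INSERT INTO data_files VALUES (?,?,?,?)",
                    (path, fields.get("type"), fields.get("sample"),
                     fields.get("instrument")))

    index_file("/srv/lab/run042.raw", "type=XRD;sample=alloy-7;instrument=D8")
    index_file("/srv/lab/run043.raw", "type=SEM;sample=alloy-7;instrument=JSM")

    # Query by file type, as the abstract describes
    rows = con.execute(
        "SELECT path FROM data_files WHERE file_type = ?", ("XRD",)).fetchall()
    ```

    The same query-by-parameter idea extends to sample type or any other header field stored as a column.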

  9. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  10. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  11. Frame-Based Approach To Database Management

    NASA Astrophysics Data System (ADS)

    Voros, Robert S.; Hillman, Donald J.; Decker, D. Richard; Blank, Glenn D.

    1989-03-01

    Practical knowledge-based systems need to reason in terms of knowledge that is already available in databases. This type of knowledge is usually represented as tables acquired from external databases and published reports. Knowledge based systems provide a means for reasoning about entities at a higher level of abstraction. What is needed in many of today's expert systems is a link between the knowledge base and external databases. One such approach is a frame-based database management system. Package Expert (PEx) designs packages for integrated circuits. The thrust of our work is to bring together diverse technologies, data and design knowledge in a coherent system. PEx uses design rules to reason about properties of chips and potential packages, including dimensions, possible materials and packaging requirements. This information is available in existing databases. PEx needs to deal with the following types of information consistently: material databases which are in several formats; technology databases, also in several formats; and parts files which contain dimensional information. It is inefficient and inelegant to have rules access the database directly. Instead, PEx uses a frame-based hierarchical knowledge management approach to databases. Frames serve as the interface between rule-based knowledge and databases. We describe PEx and the use of frames in database retrieval. We first give an overview and the design evolution of the expert system. Next, we describe the system implementation. Finally, we describe how the rules in the expert system access the databases via frames.

  12. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  13. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data - Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  14. Central Asia Active Fault Database

    NASA Astrophysics Data System (ADS)

    Mohadjer, Solmaz; Ehlers, Todd A.; Kakar, Najibullah

    2014-05-01

    The ongoing collision of the Indian subcontinent with Asia controls active tectonics and seismicity in Central Asia. This motion is accommodated by faults that have historically caused devastating earthquakes and continue to pose serious threats to the population at risk. Despite international and regional efforts to assess seismic hazards in Central Asia, little attention has been given to the development of a comprehensive database of active faults in the region. To address this issue and to better understand the distribution and level of seismic hazard in Central Asia, we are developing a publicly available database of active faults of Central Asia (including but not limited to Afghanistan, Tajikistan, Kyrgyzstan, northern Pakistan, and western China) using ArcGIS. The database is designed to allow users to store, map, and query important fault parameters such as fault location, displacement history, rate of movement, and other data relevant to seismic hazard studies, including fault trench locations, geochronology constraints, and seismic studies. Data sources integrated into the database include previously published maps and scientific investigations as well as strain rate measurements and historic and recent seismicity. In addition, high-resolution Quickbird, Spot, and Aster imagery are used for selected features to locate and measure offsets of landforms associated with Quaternary faulting. These features are individually digitized and linked to attribute tables that provide a description for each feature. Preliminary observations include inconsistent and sometimes inaccurate information for faults documented in different studies. For example, the Darvaz-Karakul fault, which roughly defines the western margin of the Pamir, has been mapped with differences in location of up to 12 kilometers. The sense of motion for this fault ranges from unknown to thrust and strike-slip in three different studies, despite documented left-lateral displacements of Holocene and late

  15. Ridesharing and the database management system

    SciTech Connect

    Taasevigen, D.

    1981-08-01

    Lawrence Livermore National Laboratory has operated a ridesharing program since 1977. As the volume of recordkeeping and information tracking for the program became more extensive, the need for an easily altered and operated database system became apparent. The following report describes the needs of the ridesharing program and how our database management system answers those needs.

  16. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
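
    As a small illustration of the kind of processing the abstract describes, the sketch below extracts the NORAD catalog number and mean motion from a two-line element (TLE) set and derives the orbital period. The column positions follow the standard TLE format; the element set itself is a placeholder written for this example, not real SAM-D data.

    ```python
    # Illustrative two-line element set (values are placeholders, not a real epoch)
    line1 = "1 25544U 98067A   24001.00000000  .00016717  00000-0  10270-3 0  9005"
    line2 = "2 25544  51.6400 208.9916 0006703 130.5360 325.0288 15.49560000212345"

    norad_id = int(line1[2:7])          # NORAD catalog number: the unique key
    mean_motion = float(line2[52:63])   # revolutions per day (TLE cols 53-63)
    period_min = 1440.0 / mean_motion   # orbital period in minutes per revolution
    ```

    Cross-referencing then amounts to joining on `norad_id` against whatever metadata tables (ownership, status, application category) the catalog maintains.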

  17. The land management and operations database (LMOD)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for land management and operations reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  18. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a lethal threat and a research focus in network information security. New viruses keep emerging, the total number of viruses keeps growing, and virus classification is becoming increasingly complex. Virus naming cannot be unified because agencies capture samples at different times. Although each agency maintains its own virus database, communication between agencies is lacking, virus information is often incomplete, and sample information is sparse. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and complete the description of virus characteristics, and then presents a computer virus database design scheme that provides information integrity, storage security, and manageability.

  20. Organizing a breast cancer database: data management.

    PubMed

    Yi, Min; Hunt, Kelly K

    2016-06-01

    Developing and organizing a breast cancer database can provide data and serve as valuable research tools for those interested in the etiology, diagnosis, and treatment of cancer. Depending on the research setting, the quality of the data can be a major issue. Assuring that the data collection process does not contribute inaccuracies can help to assure the overall quality of subsequent analyses. Data management is work that involves the planning, development, implementation, and administration of systems for the acquisition, storage, and retrieval of data while protecting it by implementing high security levels. A properly designed database provides you with access to up-to-date, accurate information. Database design is an important component of application design. If you take the time to design your databases properly, you'll be rewarded with a solid application foundation on which you can build the rest of your application. PMID:27197511

  1. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  2. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  3. How Should we Manage all These Databases?

    SciTech Connect

    Langley, K.E.

    1998-11-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  4. How Should We Manage All Those Databases?

    SciTech Connect

    Langley, K E

    1998-10-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  5. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  6. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
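
    The integrated event-logging pattern described here (state changes captured by database triggers into a queryable log table) can be sketched with a single trigger. SQLite stands in for Oracle below, and the table and column names are invented for illustration, not the actual SMDB schema.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE service_request (
        id INTEGER PRIMARY KEY,
        antenna TEXT,
        status TEXT DEFAULT 'SCHEDULED');

    CREATE TABLE event_log (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        request_id INTEGER,
        event TEXT,
        logged_at TEXT DEFAULT CURRENT_TIMESTAMP);

    -- Trigger: every status change is recorded in the event log,
    -- mirroring the integrated event logging the abstract describes.
    CREATE TRIGGER log_status_change
    AFTER UPDATE OF status ON service_request
    BEGIN
        INSERT INTO event_log (request_id, event)
        VALUES (NEW.id, 'status: ' || OLD.status || ' -> ' || NEW.status);
    END;
    """)

    con.execute("INSERT INTO service_request (id, antenna) VALUES (1, 'DSS-14')")
    con.execute("UPDATE service_request SET status = 'CONFIGURED' WHERE id = 1")

    events = con.execute("SELECT request_id, event FROM event_log").fetchall()
    ```

    A monitoring application can then poll or subscribe to the log table for real-time monitoring, or query it over a historical archive for troubleshooting, as the abstract outlines.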

  7. Database activities at Brookhaven National Laboratory

    SciTech Connect

    Trahern, C.G.

    1995-12-01

    Brookhaven National Laboratory is a multi-disciplinary lab in the DOE system of research laboratories. Database activities are correspondingly diverse within the restrictions imposed by the dominant relational database paradigm. The authors discuss related activities and tools used in RHIC and in the other major projects at BNL. The others are the Protein Data Bank being maintained by the Chemistry department, and a Geographical Information System (GIS)--a Superfund sponsored environmental monitoring project under development in the Office of Environmental Restoration.

  8. Pre-Validated Signal Database Management System

    1996-12-18

    SPRT/DBMS is a pre-validated experimental database management system for industries where large volumes of process signals are acquired and archived. This system implements a new and powerful pattern recognition method, the spectrum transformed sequential testing (STST or ST2) procedure. A network of interacting ST2 modules deployed in parallel is integrated with a relational DBMS to fully validate process signals as they are archived. This reliable, secure DBMS then provides system modelers, code developers, and safety analysts with an easily accessible source of fully validated process data.

  9. HGDBMS: a human genetics database management system.

    PubMed

    Seuchter, S A; Skolnick, M H

    1988-10-01

    Human genetics research involves a large number of complex data sets naturally organized in hierarchical structures. Data collection is performed on different levels, e.g., the project level, pedigree level, individual level, and sample level. Different aspects of a study utilize different views of the data, requiring a flexible database management system (DBMS) which satisfies these different needs for data collection and retrieval. We describe HGDBMS, a comprehensive relational DBMS, implemented as an application of the GENISYS I DBMS, which allows embedding the hierarchical structure of pedigrees in a relational structure. The system's file structure is described in detail. Currently our Melanoma and Chromosome 17 map studies are managed with HGDBMS. Our initial experience demonstrates the value of a flexible system which supports the needs for data entry, update, storage, reporting, and analysis required during different phases of genetic research. Further developments will focus on the integration of HGDBMS with a human genetics expert system shell and analysis programs. PMID:3180747

  11. Integrated Space Asset Management Database and Modeling

    NASA Astrophysics Data System (ADS)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques, and user interfaces for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off-the-shelf information sharing platform in use throughout the DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data is shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  12. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
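
    As a sketch of the SQL transaction handling the abstract mentions, the example below uses SQLite (standing in for the actual DBMS) with a hypothetical karst_feature table: a batch insert either commits as a whole or rolls back entirely on error, which is one way database consistency is preserved.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE karst_feature (
        feature_id INTEGER PRIMARY KEY,
        feature_type TEXT NOT NULL,   -- e.g. sinkhole, spring, cave
        county TEXT NOT NULL)""")

    def add_features(rows):
        # All-or-nothing insert: the 'with con' block commits on success
        # and rolls back on any error, keeping the database consistent.
        try:
            with con:
                con.executemany(
                    "INSERT INTO karst_feature VALUES (?,?,?)", rows)
            return True
        except sqlite3.IntegrityError:
            return False

    ok = add_features([(1, "sinkhole", "Fillmore"), (2, "spring", "Olmsted")])
    # Second batch fails on a duplicate feature_id, so neither row survives
    bad = add_features([(3, "cave", "Winona"), (1, "sinkhole", "Fillmore")])

    count = con.execute("SELECT COUNT(*) FROM karst_feature").fetchone()[0]
    ```

    Access-permission control and log-based recovery, also mentioned in the abstract, sit above this layer and are DBMS-specific.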

  13. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
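
    The filter-and-export capability described above can be sketched briefly. The task records and field names below are invented for illustration; the pattern is simply filtering records and writing the result in a spreadsheet-friendly format, as the paper's data filters and export utilities do.

    ```python
    import csv
    import io

    # Hypothetical task records; the fields loosely mirror those the paper
    # mentions (task, team, status) but the data is invented here.
    tasks = [
        {"task": "TBD-041 resolution", "team": "Propulsion", "status": "open"},
        {"task": "Lander animation", "team": "Graphics", "status": "done"},
        {"task": "Mass margin analysis", "team": "Propulsion", "status": "open"},
    ]

    # Filter to open tasks, then export to CSV for spreadsheet use.
    open_tasks = [t for t in tasks if t["status"] == "open"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["task", "team", "status"])
    writer.writeheader()
    writer.writerows(open_tasks)
    report = buf.getvalue()
    ```

    In a deployed system the filtered query would run against the database itself rather than an in-memory list, but the export step is the same.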

  14. An authoritative global database for active submarine hydrothermal vent fields

    NASA Astrophysics Data System (ADS)

    Beaulieu, Stace E.; Baker, Edward T.; German, Christopher R.; Maffei, Andrew

    2013-11-01

    The InterRidge Vents Database is available online as the authoritative reference for locations of active submarine hydrothermal vent fields. Here we describe the revision of the database to an open source content management system and conduct a meta-analysis of the global distribution of known active vent fields. The number of known active vent fields has almost doubled in the past decade (521 as of year 2009), with about half visually confirmed and the others inferred to be active from physical and chemical clues. Although previously known mainly from mid-ocean ridges (MORs), active vent fields at MORs now comprise only half of the total known, with about a quarter each now known at volcanic arcs and back-arc spreading centers. Discoveries in arc and back-arc settings resulted in an increase in known vent fields within exclusive economic zones, consequently reducing the proportion known on the high seas to one third. The increase in known vent fields reflects a number of factors, including increased national and commercial interests in seafloor hydrothermal deposits as mineral resources. The purpose of the database now extends beyond academic research and education and into marine policy and management, with at least 18% of known vent fields in areas granted or pending applications for mineral prospecting and 8% in marine protected areas.

  15. Integrated Space Asset Management Database and Modeling

    NASA Astrophysics Data System (ADS)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques, and user interfaces for visualization. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government-off-the-Shelf information sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data is shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  16. The role of databases in areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A database is a comprehensive collection of related data organized for convenient access, generally in a computer. The evolution of computer software and the need to distinguish the specialized computer systems for storing and manipulating data, stimulated development of database management systems...

  17. Adapting the rangeland database for managing ecological site description data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Field data collection for writing Ecological Site Descriptions (ESD) creates a paperwork burden that reduces efficiency of ESD preparation. The recently developed Rangeland Database and Field Data Entry System is well suited to managing ESD data. This database was developed to automate data entry an...

  18. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric, (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  19. Managing a large database of camera fingerprints

    NASA Astrophysics Data System (ADS)

    Goljan, Miroslav; Fridrich, Jessica; Filler, Tomáš

    2010-01-01

    Sensor fingerprint is a unique noise-like pattern caused by slightly varying pixel dimensions and inhomogeneity of the silicon wafer from which the sensor is made. The fingerprint can be used to prove that an image came from a specific digital camera. The presence of a camera fingerprint in an image is usually established using a detector that evaluates cross-correlation between the fingerprint and image noise. The complexity of the detector is thus proportional to the number of pixels in the image. While computing the detector statistic for a few-megapixel image takes several seconds on a single-processor PC, the processing time becomes impractically large if a sizeable database of camera fingerprints needs to be searched through. In this paper, we present a fast searching algorithm that utilizes special "fingerprint digests" and sparse data structures to address several tasks that forensic analysts will find useful when deploying camera identification from fingerprints in practice. In particular, we develop fast algorithms for finding if a given fingerprint already resides in the database and for determining whether a given image was taken by a camera whose fingerprint is in the database.
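The detector statistic described above is essentially a zero-mean normalized cross-correlation between the fingerprint and the image's noise residual. A minimal sketch with toy 1-D "pixel" data (real fingerprints are megapixel 2-D arrays, and the values below are invented):

```python
from math import sqrt

def normalized_correlation(fingerprint, noise):
    """Zero-mean normalized cross-correlation between a camera
    fingerprint and the noise residual extracted from an image."""
    n = len(fingerprint)
    mf = sum(fingerprint) / n
    mn = sum(noise) / n
    num = sum((f - mf) * (w - mn) for f, w in zip(fingerprint, noise))
    den = (sqrt(sum((f - mf) ** 2 for f in fingerprint))
           * sqrt(sum((w - mn) ** 2 for w in noise)))
    return num / den

# Toy data: a residual containing the fingerprint plus a small perturbation
# (same camera) versus an unrelated residual (different camera).
fp = [0.5, -0.2, 0.1, -0.4, 0.3, 0.0, -0.1, 0.2]
match = [f + 0.05 * ((-1) ** i) for i, f in enumerate(fp)]
other = [0.1, 0.3, -0.2, 0.2, -0.3, 0.1, 0.0, -0.2]
print(normalized_correlation(fp, match) > normalized_correlation(fp, other))  # True
```

The cost of one such correlation is linear in the number of pixels, which is why searching a large fingerprint database motivates the digest-based shortcut the paper proposes.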

  20. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  1. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
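The trigger mechanism described above can be illustrated with a small example. The schema and event names are hypothetical, and SQLite (via Python's sqlite3 module) stands in for the DBMS:

```python
import sqlite3

# In-memory database standing in for the AAL platform's DBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bed_sensor (ts INTEGER, occupied INTEGER)")
conn.execute("CREATE TABLE events (ts INTEGER, kind TEXT)")

# Active-database trigger: react inside the DBMS when occupancy
# flips to 0, i.e. a bed-exit, without exporting any sensor data.
conn.execute("""
    CREATE TRIGGER bed_exit AFTER INSERT ON bed_sensor
    WHEN NEW.occupied = 0
    BEGIN
        INSERT INTO events VALUES (NEW.ts, 'bed-exit');
    END
""")

conn.execute("INSERT INTO bed_sensor VALUES (1, 1)")
conn.execute("INSERT INTO bed_sensor VALUES (2, 0)")  # fires the trigger
events = conn.execute("SELECT * FROM events").fetchall()
print(events)  # [(2, 'bed-exit')]
```

Because the detection logic runs inside the database engine, raw sensor rows never need to leave the DBMS, which is the privacy and performance argument the abstract makes.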

  2. Expansion of the MANAGE database with forest and drainage studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  3. Development of a Relational Database for Learning Management Systems

    ERIC Educational Resources Information Center

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-Based Distance Education Systems have a great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  4. Representing clinical communication knowledge through database management system integration.

    PubMed

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels form a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough methodology literature review to identify strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument, and we introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository. PMID:22874366

  6. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists who take advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscientific investigations have led to building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  7. Evidence generation from healthcare databases: recommendations for managing change.

    PubMed

    Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C

    2016-07-01

    There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time, so it is vital to consider the impact of changes in data, access methodology and the environment. The authors discuss change in communication and management, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27183900

  8. DOE technology information management system database study report

    SciTech Connect

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  9. TRENDS: The aeronautical post-test database management system

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  10. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 1970s E.F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that now made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and who required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.

  11. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    RIM5 is a DBMS with several features particularly useful to scientists and engineers. It interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  12. Kingfisher: a system for remote sensing image database management

    NASA Astrophysics Data System (ADS)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing amount of images to be collected by the ground station of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application; at present many systems exist for photographic images, for example QBIC and IKONA, but they are not able to properly extract and describe remote sensing image content. The target database is set by an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without losses independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared, during the retrieval phase, with the query image, and results seem to be very encouraging.
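The retrieval step described above reduces to ranking archived images by distance between pre-extracted descriptor vectors. A minimal sketch, with invented image names and descriptor values:

```python
from math import sqrt

def euclidean(a, b):
    """Distance between two descriptor vectors."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical texture-descriptor vectors (e.g. contrast, energy, entropy)
# pre-extracted from quick-look images in the archive.
archive = {
    "scene_017": [0.82, 0.11, 2.4],
    "scene_042": [0.30, 0.55, 1.1],
    "scene_108": [0.78, 0.15, 2.2],
}

def retrieve(query, k=2):
    """Return the k archived images most similar to the query descriptors."""
    ranked = sorted(archive, key=lambda name: euclidean(archive[name], query))
    return ranked[:k]

print(retrieve([0.80, 0.12, 2.3]))  # ['scene_017', 'scene_108']
```

Because the descriptors are computed once at database-population time, each query compares short vectors rather than full-resolution imagery, which is the performance point the abstract makes about quick-look images.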

  13. Development of the ageing management database of PUSPATI TRIGA reactor

    NASA Astrophysics Data System (ADS)

    Ramli, Nurhayati; Maskin, Mazleha; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum; Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal

    2016-01-01

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as prominent issues. In addressing the ageing issues, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database taking into account all RTP major Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  14. Information flow in the DAMA project beyond database managers: information flow managers

    NASA Astrophysics Data System (ADS)

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
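A bill-of-materials explosion of the kind described above can be sketched as a recursive roll-down of demand through the supply chain. The products, components, and per-unit quantities below are invented:

```python
# Hypothetical bill of materials: product -> list of (component, qty per unit).
bom = {
    "jacket": [("fabric_m2", 2.5), ("zipper", 1)],
    "fabric_m2": [("yarn_kg", 0.3)],
}

def explode(product, units, demand=None):
    """Roll demand for a finished product down to component demand,
    recursing through every level of the bill of materials."""
    if demand is None:
        demand = {}
    for component, qty in bom.get(product, []):
        demand[component] = demand.get(component, 0) + units * qty
        explode(component, units * qty, demand)
    return demand

# Point-of-sale reports from outlets are aggregated into one demand estimate;
# here two toy reports stand in for the 100 000 outlets in the scenario.
total = {}
for outlet_units in (40, 60):
    explode("jacket", outlet_units, total)
print(total)  # {'fabric_m2': 250.0, 'yarn_kg': 75.0, 'zipper': 100}
```

An information flow manager would additionally have to refresh these estimates as reports trickle in over days and tolerate suppliers that stop reporting, which is where the approximation techniques mentioned in the abstract come in.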

  15. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Tyupikova, T. V.; Samoilov, V. N.

    2003-04-01

    Modern information technologies push the natural sciences toward further development, but this requires the evolution of supporting infrastructures that create favorable conditions for scientific work and its financial base, and that legally document and protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization, and control of shared databases, using the example of the electronic document handling system that functions in several departments of the Joint Institute for Nuclear Research.

  16. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  17. Use of Knowledge Bases in Education of Database Management

    ERIC Educational Resources Information Center

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject-matter which aid educating the database management. You can follow the order of the course from the beginning when some topics appearance and raise in elementary school, through the topics accomplish in secondary…

  18. Interface between astrophysical datasets and distributed database management systems (DAVID)

    NASA Technical Reports Server (NTRS)

    Iyengar, S. S.

    1988-01-01

    This is a status report on the progress of the DAVID (Distributed Access View Integrated Database Management System) project being carried out at Louisiana State University, Baton Rouge, Louisiana. The objective is to implement an interface between Astrophysical datasets and DAVID. Discussed are design details and implementation specifics between DAVID and astrophysical datasets.

  19. Database Management Principles of the UCLA Library's Orion System.

    ERIC Educational Resources Information Center

    Fayollat, James; Coles, Elizabeth

    1987-01-01

    Describes an integrated online library system developed at the University of California at Los Angeles (UCLA) which incorporates a number of database management features that enhance efficiency, for record retrieval and display. Design features related to record storage and retrieval and the design of linked files are described in detail.…

  20. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods in which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
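The subset-generation step described above amounts to filtering metadata rows against a researcher's search criteria. A minimal sketch with a hypothetical schema, using SQLite (via Python's sqlite3) in place of the MySQL backbone:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE cases
                (case_id TEXT, modality TEXT, finding TEXT, biopsy INTEGER)""")
# Invented, anonymized metadata rows consolidated from clinical sources.
conn.executemany("INSERT INTO cases VALUES (?, ?, ?, ?)", [
    ("c01", "ultrasound", "mass", 1),
    ("c02", "mammography", "calcification", 0),
    ("c03", "ultrasound", "mass", 0),
])

# Subset matching one project's criteria: biopsy-proven ultrasound masses.
rows = conn.execute("""SELECT case_id FROM cases
                       WHERE modality = 'ultrasound'
                         AND finding = 'mass' AND biopsy = 1""").fetchall()
print([r[0] for r in rows])  # ['c01']
```

Centralizing the metadata this way is what lets one database serve detection, diagnosis, risk-assessment, and prognosis projects with different case-selection criteria.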

  1. Management of Equipment Databases at CERN for the Atlas Experiment

    NASA Astrophysics Data System (ADS)

    Galvão, Kaio Karam; Pommès, Kathy; Molina-Pérez, Jorge; Maidantchik, Carmen; Grael, Felipe Fink

    2008-06-01

    The ATLAS experiment is about to finish its installation phase, entering into operation in the summer of 2008. This installation has represented an enormous challenge in terms of developing, setting up, and administrating the Equipment Databases, due to the large complexity of the detector, its associated services, and the necessary infrastructure. All major equipment is registered prior to installation including its electronic description and interconnectivity. This information is stored in Oracle databases. 3D visualization tools, user interfaces for portable devices, and generic retrieval/updating mechanisms have been developed in order to carry out the management of the sub-detector databases. The full traceability of all installed equipment is crucial from the ATLAS organizational point of view, and it is also required by the French authorities to fulfill the INB (Installation Nucléaire de Base) protocol.

  2. Overview of the LDEF MSIG databasing activities

    NASA Technical Reports Server (NTRS)

    Funk, Joan G.

    1995-01-01

    The Long Duration Exposure Facility (LDEF) and the accompanying experiments were composed of and contained a wide variety of materials, representing the largest collection of materials flown in low earth orbit (LEO) and retrieved for ground-based analysis to date. The results and implications of the mechanical, thermal, optical, and electrical data from these materials are the foundation on which future LEO spacecraft and missions will be built. The LDEF Materials Special Investigation Group (MSIG) has been charged with establishing and developing databases to document these materials and their performance to assure not only that the data are archived for future generations but also that the data are available to the spacecraft user community in an easily accessed, user-friendly form. This paper gives an overview of the current LDEF Materials Databases, their capabilities, and availability. An overview of the philosophy and format of a developing handbook on LEO effects on materials is also described.

  3. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  4. Database Design Learning: A Project-Based Approach Organized through a Course Management System

    ERIC Educational Resources Information Center

    Dominguez, Cesar; Jaime, Arturo

    2010-01-01

    This paper describes an active method for database design learning through practical tasks development by student teams in a face-to-face course. This method integrates project-based learning, and project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…

  5. Two Student Self-Management Techniques Applied to Data-Based Program Modification.

    ERIC Educational Resources Information Center

    Wesson, Caren

    Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…

  6. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.

  7. The Golosiiv on-line plate archive database, management and maintenance

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Sergeeva, T.

    2007-08-01

We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with the principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface, supports detailed traditional form-filling radial searches of plates and auxiliary samplings, lists each collection, and permits browsing of the detailed descriptions of the collections. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed according to the demands and principles of international data archives; it has to be strongly generalized in order to support data mining through standard interfaces and to best fit the demands of the WFPDB Group for databases of plate catalogues. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. The process of data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and, consequently, the variety of ploys for their identification and fixing. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: enhancement of the log-book database with new sets of observational data, creation of the generalized database, and cross-identification between them. The VO-compatible version of the database is being supplied with digitized data from plates scanned with a MicroTek ScanMaker 9800 XL TMA. The scanning procedure is not total but is conducted selectively in the frames of special

  8. Region and database management for HANDI 2000 business management system

    SciTech Connect

    Wilson, D.

    1998-08-26

The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, consisting of PassPort and PeopleSoft software, supports finance, supply, chemical management/Material Safety Data Sheets, and human resources.

  9. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between
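The abstract names a system hierarchy model that "replicates the physical relationships between" components as one of the three primary parts of HADS. A minimal sketch of such a hierarchy is a tree of components searchable by name; the class and component names below are hypothetical, not taken from the paper.

```python
# Minimal sketch of a system hierarchy model of the kind HADS is described
# as containing; all names here are invented for illustration.

class Component:
    """A node in the system hierarchy: a subsystem, sensor, or actuator."""
    def __init__(self, name, kind="subsystem"):
        self.name = name
        self.kind = kind
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first search for a component by name."""
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

    def path_to(self, name, prefix=""):
        """Return the slash-separated path to a named component, or None."""
        here = f"{prefix}/{self.name}"
        if self.name == name:
            return here
        for c in self.children:
            p = c.path_to(name, here)
            if p:
                return p
        return None

# Build a toy hierarchy for a rocket-engine test stand.
stand = Component("test_stand")
feed = stand.add(Component("lox_feed"))
feed.add(Component("PT-101", kind="pressure_sensor"))
feed.add(Component("TC-205", kind="thermocouple"))

print(stand.path_to("TC-205"))  # /test_stand/lox_feed/TC-205
```

A health-assessment algorithm can then resolve any sensor reading to its physical location in the system under test.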

  10. The Oil and Natural Gas Knowledge Management Database from NETL

    DOE Data Explorer

The Knowledge Management Database (KMD) Portal provides four options for searching the documents and data that NETL-managed oil and gas research has produced over the years for DOE's Office of Fossil Energy. Information covers both historical and ongoing DOE oil and gas research and development (R&D). The Document Repository, the CD/DVD Library, the Project Summaries from 1990 to the present, and the Oil and Natural Gas Program Reference Shelf provide a wide range of flexibility and coverage.

  11. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, the series is characterized by its period, or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization makes it possible to perform operations, such as queries and visualization, across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check on the working status of each running piece of software through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow real-time signal acquisition, according to the users' data access policy.
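The core idea described above, synchronizing heterogeneous measures on a common time scale, can be sketched as a simple alignment step: for each point of a shared grid, take the most recent sample of each series at or before that instant. This is only an illustration of the concept under that last-observation assumption, not TSDSystem's actual implementation.

```python
# Sketch of aligning two irregularly sampled series onto one common time
# scale (last observation carried forward). Illustrative only.
import bisect

def align(series, grid):
    """series: sorted list of (timestamp, value); grid: list of timestamps.
    Returns the last-known value at each grid timestamp (None if none yet)."""
    times = [t for t, _ in series]
    out = []
    for g in grid:
        i = bisect.bisect_right(times, g) - 1
        out.append(series[i][1] if i >= 0 else None)
    return out

tremor = [(0, 1.0), (7, 1.4), (13, 1.1)]      # sampled irregularly
tilt   = [(2, 0.02), (4, 0.03), (12, 0.05)]   # different instants
grid = [0, 5, 10, 15]                          # common time scale

print(align(tremor, grid))  # [1.0, 1.0, 1.4, 1.1]
print(align(tilt, grid))    # [None, 0.03, 0.03, 0.05]
```

Once both series share the grid, cross-signal queries and plots over a common time range become straightforward.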

  12. Computerized database management system for breast cancer patients.

    PubMed

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is 50 to 59 years. The results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
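The kind of age-band tally behind the "peak age 50 to 59" finding can be sketched with a single GROUP BY query. Here sqlite3 stands in for the paper's MySQL backend, and the table schema and patient rows are invented for illustration.

```python
# Sketch of an age-group incidence tally like the one the paper's automatic
# calculation tool performs. sqlite3 replaces MySQL; data is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, age INTEGER, race TEXT)")
rows = [(1, 44, "Malay"), (2, 52, "Chinese"), (3, 57, "Malay"),
        (4, 63, "Indian"), (5, 55, "Malay"), (6, 38, "Chinese")]
conn.executemany("INSERT INTO patient VALUES (?, ?, ?)", rows)

# Count cases per 10-year age band, most affected band first.
by_band = conn.execute("""
    SELECT (age / 10) * 10 AS band, COUNT(*) AS n
    FROM patient GROUP BY band ORDER BY n DESC, band
""").fetchall()
print(by_band)  # [(50, 3), (30, 1), (40, 1), (60, 1)]
```

The same pattern, grouped on race or breastfeeding status instead of age band, yields the other case-study breakdowns.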

  13. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) that permits use of relational, hierarchical, object-oriented, GIS, and other databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  14. Student Activities. Managing Liability.

    ERIC Educational Resources Information Center

    Bennett, Barbara; And Others

    This monograph suggests ways that college or university administrations can undertake a systematic and careful review of the risks posed by students' activities. Its purpose is to provide guidance in integrating the risk management process into a school's existing approaches to managing student organizations and activities. It is noted that no…

  15. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  16. Management Guidelines for Database Developers' Teams in Software Development Projects

    NASA Astrophysics Data System (ADS)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) that supports other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to the team or which practices should be used during the DBDs' work. In this paper we therefore develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team, and could contribute to an increase in the efficiency of these teams in their work on software development projects.

  17. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S CONSOLIDATED HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from 12 U.S. studies related to human activities into one comprehensive data system that can be accessed via the Internet. The data system is called the Consolidated Human Activity Database (CHAD), and it is ...

  18. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S COMPREHENSIVE HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from nine U.S. studies related to human activities into one comprehensive data system that can be accessed via the world-wide web. The data system is called CHAD-Consolidated Human Activity Database-and it is ...

  19. Active fault database of Japan: Its construction and search system

    NASA Astrophysics Data System (ADS)

    Yoshioka, T.; Miyamoto, F.

    2011-12-01

The Active fault database of Japan was constructed by the Active Fault and Earthquake Research Center, GSJ/AIST, and has been open to the public on the Internet since 2005 to support probabilistic evaluation of future faulting events and earthquake occurrence on major active faults in Japan. The database consists of three sub-databases: 1) a sub-database on individual sites, which includes long-term slip data and paleoseismicity data with error ranges and reliability; 2) a sub-database on details of paleoseismicity, which includes the excavated geological units and faulting event horizons with age control; and 3) a sub-database on characteristics of behavioral segments, which includes fault length, long-term slip rate, recurrence intervals, most recent event, slip per event, and best estimate of the cascade earthquake. The database covers the major seismogenic faults, approximately the best-estimate segments of cascade earthquakes: each has a length of 20 km or longer and a slip rate of 0.1 m/ky or larger, and is composed of about two behavioral segments on average. The database organizes the active faults of Japan by the concept of "behavioral segments" (McCalpin, 1996). The faults are subdivided into 550 behavioral segments based on surface trace geometry and rupture histories revealed by paleoseismic studies. Behavioral segments can be searched on Google Maps: one behavioral segment can be selected directly, or segments can be searched within a rectangular area on the map. The result of a search is shown on a fixed map or on Google Maps together with geologic and paleoseismic parameters, including slip rate, slip per event, recurrence interval, and the calculated future rupture probability. Behavioral segments can also be searched by name or by a combination of fault parameters. All data are compiled from journal articles, theses, and other documents. We are currently developing a revised edition based on an improved database system. More than ten
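The selection criterion quoted in the abstract (segments at least 20 km long with a slip rate of at least 0.1 m/ky qualify as major seismogenic faults) amounts to a simple parameter filter. The records below are invented for illustration; they are not entries from the actual database.

```python
# Sketch of the major-seismogenic-fault criterion from the abstract:
# length >= 20 km and slip rate >= 0.1 m/ky. Records are hypothetical.

segments = [
    {"name": "segment A", "length_km": 35.0, "slip_rate_m_per_ky": 0.6},
    {"name": "segment B", "length_km": 12.0, "slip_rate_m_per_ky": 0.9},
    {"name": "segment C", "length_km": 28.0, "slip_rate_m_per_ky": 0.05},
    {"name": "segment D", "length_km": 21.0, "slip_rate_m_per_ky": 0.1},
]

def major_seismogenic(seg, min_len=20.0, min_slip=0.1):
    """True if the segment meets both length and slip-rate thresholds."""
    return seg["length_km"] >= min_len and seg["slip_rate_m_per_ky"] >= min_slip

majors = [s["name"] for s in segments if major_seismogenic(s)]
print(majors)  # ['segment A', 'segment D']
```

The database's search-by-parameter-combination feature generalizes this: any conjunction of thresholds on the stored fault parameters selects a subset of the 550 behavioral segments.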

  20. Object and file management in the EXODUS extensible database system

    SciTech Connect

    Carey, M.J.; DeWitt, D.J.; Richardson, J.E.; Shekita, E.J.

    1986-03-01

    This paper describes the design of the object-oriented storage component of EXODUS, an extensible database management system currently under development at the University of Wisconsin. The basic abstraction in the EXODUS storage system is the storage object, an uninterpreted variable-length record of arbitrary size; higher level abstractions such as records and indices are supported via the storage object abstraction. One of the key design features described here is a scheme for managing large dynamic objects, as storage objects can occupy many disk pages and can grow or shrink at arbitrary points. The data structure and algorithms used to support such objects are described, and performance results from a preliminary prototype of the EXODUS large-object management scheme are presented. A scheme for maintaining versions of large objects is also described. The file structure used in the EXODUS storage system, which provides a mechanism for grouping and sequencing through a set of related storage objects and the EXODUS approach to buffer management, concurrency control, and recovery, both for small and large objects are discussed. 30 refs., 13 figs.
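The storage object described above is an uninterpreted variable-length byte sequence spread over many disk pages, able to grow or shrink at arbitrary points. A toy sketch of that interface follows; real EXODUS used far more sophisticated on-disk structures (the paper describes them), so this only illustrates the abstraction, with a deliberately tiny page size.

```python
# Toy sketch of the EXODUS "storage object" abstraction: bytes spread over
# fixed-size pages, with insertion and deletion at arbitrary offsets.
PAGE_SIZE = 4  # unrealistically small, to make paging visible

class StorageObject:
    def __init__(self, data=b""):
        self._pages = []
        self._repage(data)

    def read(self):
        return b"".join(self._pages)

    def insert(self, offset, data):
        """Grow the object by splicing bytes in at `offset`."""
        flat = self.read()
        self._repage(flat[:offset] + data + flat[offset:])

    def delete(self, offset, length):
        """Shrink the object by removing `length` bytes at `offset`."""
        flat = self.read()
        self._repage(flat[:offset] + flat[offset + length:])

    def _repage(self, flat):
        self._pages = [flat[i:i + PAGE_SIZE] for i in range(0, len(flat), PAGE_SIZE)]

obj = StorageObject(b"ABCDEFGH")      # occupies 2 pages of 4 bytes
obj.insert(4, b"xy")                  # grows mid-object
obj.delete(0, 2)                      # shrinks at the front
print(obj.read())                     # b'CDxyEFGH'
```

The naive rewrite-everything `_repage` here is exactly what the EXODUS design avoids: its large-object structure updates only the affected pages, which is what makes growth and shrinkage at arbitrary points cheap.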

  1. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  2. Enhanced DIII-D Data Management Through a Relational Database

    NASA Astrophysics Data System (ADS)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.

  3. Database activity in the Italian Astronet: DIRA 2

    NASA Technical Reports Server (NTRS)

    Benacchio, L.; Nanni, M.

    1992-01-01

The development and utilization of informational archives and databases started in the Italian Astronet Project in the middle of 1983. In that year, a small group of astronomers and some more technically oriented people, with a common, painful experience in managing astronomical catalogues and archives with computers, came together in an Astronet working group. Now, some years later, several software packages and the contents of both a relatively general database and several local databases represent the work and effort of the group. The systems were conceived and developed with the group's original goal in mind: to allow the single astronomer to make free use of original data. The main package (DIRA) was rewritten after some years of use, to take full advantage of the many suggestions of the astronomers who used it and gathered experience in astronomical catalogue management. A more technical goal was to install the whole system, born and developed in the VMS environment, on Unix and Unix-like systems. The new version, DIRA2, has a new user interface, a query language with SQL-style commands that also supports numerical and character functions, and a set of commands to create new catalogues from existing data. The graphics commands are also more powerful than in the previous version. The philosophy and design of DIRA (and of DIRA2, of course) are very simple and have proved to be much appreciated by astronomers: to normalize and homogenize astronomical catalogues at a minimal level, to collect satisfactory astronomical documentation on their contents, and, finally, to allow an astronomical approach to the dialogue with the database. DIRA2 is currently used in most Italian astronomical institutes to retrieve data from a still-growing database of about 140 well-documented and controlled astronomical catalogues, for the identification of objects, for the preparation of 'medium size' surveys, in astrometry, and in the creation of new catalogues.

  4. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

The National Levee Database (INLED) collects data on Italian levees and historical breach failures, to be exploited in the framework of an operational procedure addressed to the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. In its structure, INLED is a dynamic geospatial database, with ongoing efforts to add levee data from the authorities in charge of hydraulic risk mitigation. In particular, the database aims to provide the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy. References: Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6, (2), 149-162. Camici S., Barbetta S., Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H. R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19, (4), 717-731.

  5. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
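The abstract states that data write permissions are tied to user roles and responsibilities. A minimal sketch of such a check is a mapping from resource type to the roles allowed to write it; the role and resource names below are hypothetical, not taken from the GOPDb application.

```python
# Minimal sketch of role-based write control of the kind the abstract
# describes ("data write permissions tied to user roles and
# responsibilities"). Role and resource names are invented.

WRITE_ROLES = {
    "timeline": {"ops_planner", "admin"},
    "document": {"editor", "ops_planner", "admin"},
}

def can_write(user_roles, resource):
    """True if any of the user's roles grants write access to the resource."""
    allowed = WRITE_ROLES.get(resource, set())
    return bool(set(user_roles) & allowed)

print(can_write(["editor"], "document"))   # True
print(can_write(["editor"], "timeline"))   # False
print(can_write(["viewer"], "document"))   # False
```

In the deployed application this check would sit behind the two-factor authentication layer, so a write is only attempted for an already-authenticated user.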

  6. Management of the life and death of an earth-science database: some examples from geotherm

    USGS Publications Warehouse

    Bliss, J.D.

    1986-01-01

Productive earth-science databases require managers who are familiar with and skilled at using available software developed specifically for database management. There also should be a primary user with a clearly understood mission. The geologic phenomenon addressed by the database must be sufficiently understood, and adequate appropriate data must be available to construct a useful database. The database manager, in concert with the primary user, must ensure that data of adequate quality are available in the database, as well as prepare for mechanisms of releasing the data when the database is terminated. The primary user needs to be held accountable along with the database manager to ensure that a useful database will be created. Quality of data and maintenance of database relevancy to the user's mission are important issues during the database's lifetime. Products prepared at termination may be used more than the operational database and thus are of critical importance. These concepts are based, in part, on both the shortcomings and successes of GEOTHERM, a comprehensive system of databases and software used to store, locate, and evaluate the geology, geochemistry, and hydrology of geothermal systems. © 1986.

  7. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department that has occurred during the last 5 years, resulted in a reliable, high performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDaV endpoints. Besides these services, an Oracle database facility is in production characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook to forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  8. A Conceptual Model and Database to Integrate Data and Project Management

    NASA Astrophysics Data System (ADS)

    Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.

    2015-12-01

    database and build it in a way that is modular and can be changed or expanded to meet user needs. Our hope is that others, especially those managing large collaborative research grants, will be able to use our project model and database design to enhance the value of their project and data management both during and following the active research period.

  9. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  10. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    DOE PAGES

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population that is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is a high priority, which requires a broad understanding of the locations of such day care centers. Because day care centers are concentrations of at-risk population, their spatial locations are critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA, utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources, followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high-resolution population distribution and dynamics models and databases.

  12. A Vibroacoustic Database Management Center for Shuttle and expendable launch vehicle payloads

    NASA Technical Reports Server (NTRS)

    Thomas, Valerie C.

    1987-01-01

    A Vibroacoustic Database Management Center has recently been established at the Jet Propulsion Laboratory (JPL). The center uses the Vibroacoustic Payload Environment Prediction System (VAPEPS) computer program to maintain a database of flight and ground-test data and structural parameters for both Shuttle and expendable launch-vehicle payloads. Given the launch-vehicle environment, the VAPEPS prediction software, which employs Statistical Energy Analysis (SEA) methods, can be used with or without the database to establish the vibroacoustic environment for new payload components. This paper summarizes the VAPEPS program and describes the functions of the Database Management Center at JPL.

  13. EADB: An Estrogenic Activity Database for Assessing Potential Endocrine Activity

    EPA Science Inventory

    Endocrine-active chemicals can potentially have adverse effects on both humans and wildlife. They can interfere with the body’s endocrine system through direct or indirect interactions with many protein targets. Estrogen receptors (ERs) are one of the major targets, and many ...

  14. Evaluating, Migrating, and Consolidating Databases and Applications for Long-Term Surveillance and Maintenance Activities at the Rocky Flats Site

    SciTech Connect

    Surovchak, S.; Marutzky, S.; Thompson, B.; Miller, K.; Labonte, E.

    2006-07-01

    The U.S. Department of Energy (DOE) Office of Legacy Management (LM) is assuming responsibility for long-term surveillance and maintenance (LTS and M) activities at the Rocky Flats Environmental Technology Site (RFETS) during fiscal year 2006. During the transition, LM is consolidating the databases and applications that support these various functions into a few applications that will streamline future management and retrieval of data. This paper discusses the process of evaluating, migrating, and consolidating these databases and applications for LTS and M activities and provides lessons learned that will benefit future transitions. (authors)

  15. Database of Pesticides and Off-flavors for Health Crisis Management.

    PubMed

    Ueda, Yasuhito; Itoh, Mitsuo

    2016-01-01

    In this experiment, 351 pesticides and 441 different organic compounds were analyzed by GC/MS, and a database of retention time, retention index, monoisotopic mass, two selected ions, molecular formula, and CAS numbers was created. The database includes compounds such as alcohols, aldehydes, carboxylic acids, esters, ethers and hydrocarbons with unpleasant odors. This database is expected to be useful for health crisis management in the future. PMID:27211918
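A database like the one described, keyed on retention data and compound identifiers, is typically queried by matching an unknown peak's retention index against the stored values within a tolerance window. The sketch below illustrates this with an in-memory SQLite table; the schema, compound entries, and 5-unit tolerance are illustrative assumptions, not the authors' actual data.

```python
import sqlite3

# Hypothetical schema mirroring the fields named in the abstract: retention
# index, monoisotopic mass, molecular formula, and CAS number.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE compounds (
        name              TEXT,
        retention_index   REAL,
        monoisotopic_mass REAL,
        formula           TEXT,
        cas_number        TEXT
    )
""")
conn.executemany(
    "INSERT INTO compounds VALUES (?, ?, ?, ?, ?)",
    [
        # Illustrative values only (a pesticide and an off-flavor compound).
        ("chlorpyrifos", 1945.0, 348.9263, "C9H11Cl3NO3PS", "2921-88-2"),
        ("hexanal",       800.0, 100.0888, "C6H12O",        "66-25-1"),
    ],
)

def match_by_retention_index(ri, tolerance=5.0):
    """Return candidate compounds whose retention index lies within the window."""
    rows = conn.execute(
        "SELECT name FROM compounds WHERE ABS(retention_index - ?) <= ?",
        (ri, tolerance),
    )
    return [r[0] for r in rows]

print(match_by_retention_index(1947.0))  # chlorpyrifos is within the 5-unit window
```

In practice such a lookup would be combined with the two selected ions and the monoisotopic mass to narrow the candidate list further.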

  16. Expert systems identify fossils and manage large paleontological databases

    SciTech Connect

    Beightol, D.S.; Conrad, M.A.

    1988-02-01

    EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.
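The core of a descriptor-based expert system of this kind is matching an observed specimen's morphological descriptors against a knowledge base and ranking the candidate taxa. The sketch below shows the idea in Python; the genera are real dasyclad names but the descriptors and scoring rule are invented for illustration, not EXPAL's actual logic.

```python
# Toy knowledge base: taxon -> morphological descriptors (values invented).
KNOWLEDGE_BASE = {
    "Cylindroporella": {"thallus": "cylindrical", "laterals": "verticillate"},
    "Salpingoporella": {"thallus": "cylindrical", "laterals": "phloiophorous"},
}

def identify(observed):
    """Rank taxa by the fraction of observed descriptors they match."""
    scores = {}
    for taxon, traits in KNOWLEDGE_BASE.items():
        matches = sum(1 for key, value in observed.items() if traits.get(key) == value)
        scores[taxon] = matches / len(observed)
    # Best-matching taxon first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

specimen = {"thallus": "cylindrical", "laterals": "verticillate"}
print(identify(specimen))  # Cylindroporella matches both descriptors
```

A production system would of course weight descriptors by diagnostic value and handle missing or uncertain observations.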

  17. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  18. GAS CHROMATOGRAPHIC RETENTION PARAMETERS DATABASE FOR REFRIGERANT MIXTURE COMPOSITION MANAGEMENT

    EPA Science Inventory

    Composition management of mixed refrigerant systems is a challenging problem in the laboratory, manufacturing facilities, and large refrigeration machinery. The issue of composition management is especially critical for the maintenance of machinery that utilizes zeotropic mixture...

  19. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe the activities needed to produce the analytical products. Discipline specialists produce the specified products and load the results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at these three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  20. Integrated Standardized Database/Model Management System: Study management concepts and requirements

    SciTech Connect

    Baker, R.; Swerdlow, S.; Schultz, R.; Tolchin, R.

    1994-02-01

    Data-sharing among planners and planning software for utility companies is the motivation for creating the Integrated Standardized Database (ISD) and Model Management System (MMS). The purpose of this document is to define the requirements for the ISD/MMS study management component in a manner that will enhance the use of the ISD. After an analysis period which involved EPRI member utilities across the United States, the study concept was formulated. It is defined in terms of its entities, relationships and its support processes, specifically for implementation as the key component of the MMS. From the study concept definition, requirements are derived. There are unique requirements, such as the necessity to interface with DSManager, EGEAS, IRPManager, MIDAS and UPM and there are standard information systems requirements, such as create, modify, delete and browse data. An initial ordering of the requirements is established, with a section devoted to future enhancements.

  1. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    PubMed

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  2. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    PubMed

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-01-01

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database for a list of active or in-preparation plans to be tested; it then runs in batch mode to compare all the plans, and a report of the success or failure of each comparison is saved for review. This software tool was used as part of the software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that rely on repeating pretreatment QA measurements or on labor-intensive and fallible hand comparisons. PMID:24257281
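The comparison described, exporting the same plan from both TMS versions as XML and diffing parameter by parameter, can be sketched as follows. The XML layout, element names, and values here are invented for illustration and are not Varian's actual plan schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical plan exports from the old and new TMS databases.
OLD_PLAN = "<plan><beam><mu>120.5</mu><gantry>180.0</gantry></beam></plan>"
NEW_PLAN = "<plan><beam><mu>120.5</mu><gantry>179.9</gantry></beam></plan>"

def flatten(elem, path=""):
    """Map each leaf element's path (e.g. /plan/beam/mu) to its text value."""
    here = f"{path}/{elem.tag}"
    if len(elem) == 0:
        return {here: (elem.text or "").strip()}
    out = {}
    for child in elem:
        out.update(flatten(child, here))
    return out

def compare(old_xml, new_xml):
    """Report (path, old_value, new_value) for every parameter that differs."""
    old = flatten(ET.fromstring(old_xml))
    new = flatten(ET.fromstring(new_xml))
    return [
        (key, old.get(key), new.get(key))
        for key in sorted(set(old) | set(new))
        if old.get(key) != new.get(key)
    ]

print(compare(OLD_PLAN, NEW_PLAN))  # only the gantry angle differs
```

Flattening each document into a path-to-value map is what makes the batch comparison of thousands of parameters mechanical: an empty difference list means the plan migrated intact.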

  4. Military Services Fitness Database: Development of a Computerized Physical Fitness and Weight Management Database for the U.S. Army

    PubMed Central

    Williamson, Donald A.; Bathalon, Gaston P.; Sigrist, Lori D.; Allen, H. Raymond; Friedl, Karl E.; Young, Andrew J.; Martin, Corby K.; Stewart, Tiffany M.; Burrell, Lolita; Han, Hongmei; Hubbard, Van S.; Ryan, Donna

    2009-01-01

    The Department of Defense (DoD) has mandated development of a system to collect and manage data on the weight, percent body fat (%BF), and fitness of all military personnel. This project aimed to (1) develop a computerized weight and fitness database to track individuals and Army units over time allowing cross-sectional and longitudinal evaluations and (2) test the computerized system for feasibility and integrity of data collection over several years of usage. The computer application, the Military Services Fitness Database (MSFD), was designed for (1) storage and tracking of data related to height, weight, %BF for the Army Weight Control Program (AWCP) and Army Physical Fitness Test (APFT) scores and (2) generation of reports using these data. A 2.5-year pilot test of the MSFD indicated that it monitors population and individual trends of changing body weight, %BF, and fitness in a military population. PMID:19216292

  5. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big-data science. Through deeper analyses of the characteristics, pathogenesis, and other core issues of gastric cancer, precision medicine improves its diagnosis and treatment. A clinical cancer database is important for promoting the development of precision medicine, so close attention must be paid to its construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. To ensure the quality of the database, its design and management should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data, so the construction and management of clinical databases must also be strengthened and innovated.

  6. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    SciTech Connect

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) for interacting with, analyzing, graphing data from, and managing open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; generate CSV data from a table or read CSV data into one; and perform similar operations. Since much of the database information is brought under the control of the Python language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited to the smaller databases typically needed by a small research group. PylotDB can also be used as a learning tool for database applications in general.
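One of the listed operations, comparing the same table across two different databases, reduces to diffing the two row sets. The sketch below shows the idea using Python's built-in sqlite3 as a stand-in for MySQL (PylotDB itself targets MySQL; the table and data here are invented):

```python
import sqlite3

def table_rows(conn, table):
    # Table-name interpolation is acceptable here only because the name is
    # a trusted constant, never user input.
    return set(conn.execute(f"SELECT * FROM {table}"))

def compare_tables(conn_a, conn_b, table):
    """Return rows present in one database's copy of the table but not the other's."""
    rows_a, rows_b = table_rows(conn_a, table), table_rows(conn_b, table)
    return {"only_in_a": rows_a - rows_b, "only_in_b": rows_b - rows_a}

# Two databases holding slightly different copies of the same table.
db1, db2 = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (db1, db2):
    db.execute("CREATE TABLE runs (id INTEGER, value REAL)")
db1.executemany("INSERT INTO runs VALUES (?, ?)", [(1, 0.5), (2, 0.7)])
db2.executemany("INSERT INTO runs VALUES (?, ?)", [(1, 0.5)])

print(compare_tables(db1, db2, "runs"))  # row (2, 0.7) exists only in db1
```

Set difference works here because each row is hashable as a tuple; comparing across servers only changes how the two connections are opened.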

  8. [Active management of labor].

    PubMed

    Ruiz Ortiz, E; Villalobos Román, M; Flores Murrieta, G; Sotomayor Alvarado, L

    1991-01-01

    Eighty-three primigravid patients at the end of latent labor, with effaced cervix, 3 cm dilation, vertex presentation, and adequate pelvis, were studied. Two groups were formed: 53 patients in the study group, who received active management of labor, and 30 patients in the control group, treated in the traditional way. In all patients a graphic recording of labor was carried out; it included all the events and, as labor advanced, a sigmoidal curve of cervical dilatation was registered, as well as the hyperbolic one for descent of the presentation. The study group received the method in a systematized manner, as follows: 1. Peridural block. 2. Amniotomy. 3. IV oxytocin one hour after amniotomy. 4. Fetal heart rate monitoring. 5. Detection of the origin of dystocia. Maternal-fetal morbidity was registered in both groups, as well as cesarean section rate, instrumental delivery and its indications, labor duration, and time of stay in the labor room. Reduction of the above items and timely detection of dystocia were determined. It was concluded that a constructive action plan, starting at hospital admission, allows a normal delivery of brief duration in most healthy women.

  9. An improved FORTRAN 77 recombinant DNA database management system with graphic extensions in GKS.

    PubMed

    Van Rompuy, L L; Lesage, C; Vanderhaegen, M E; Telemans, M P; Zabeau, M F

    1986-12-01

    We have improved an existing clone database management system written in FORTRAN 77 and adapted it to our software environment. One improvement is that the database can be interrogated for any type of information, not just keywords. Also, recombinant DNA constructions can be represented in a simplified 'shorthand', whereafter a program assembles the full nucleotide sequence from the contributing fragments, which may be obtained from nucleotide sequence databases. Another improvement is the replacement of the database manager by programs running in batch to maintain the databank and verify its consistency automatically. Finally, graphic extensions are written in the Graphical Kernel System to draw linear and circular restriction maps of recombinants. Besides restriction sites, recombinant features can be presented from the feature lines of recombinant database entries, or from the feature tables of nucleotide databases. The clone database management system is fully integrated into the sequence analysis software package from the Pasteur Institute, Paris, and is made accessible through the same menu. As a result, recombinant DNA sequences can be analysed directly by the sequence analysis programs.

  10. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    SciTech Connect

    Wolery, T W; Sutton, M

    2011-09-19

    they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
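The abstract mentions quantifying deviation from equilibrium as saturation indices. The standard definition is SI = log10(IAP/Ksp), where IAP is the ion activity product and Ksp the solubility product: SI = 0 at equilibrium, SI > 0 for oversaturation, SI < 0 for undersaturation. A minimal numeric sketch, using assumed (not measured) activities and a commonly cited calcite Ksp near 10^-8.48 at 25 C:

```python
import math

def saturation_index(ion_activity_product, ksp):
    """SI = log10(IAP / Ksp): 0 at equilibrium, >0 oversaturated, <0 undersaturated."""
    return math.log10(ion_activity_product / ksp)

# Illustrative calcite example: IAP = a(Ca2+) * a(CO3 2-), with assumed activities.
iap = 1e-4 * 1e-5
si = saturation_index(iap, 10 ** -8.48)
print(round(si, 2))  # -0.52: the solution is undersaturated with respect to calcite
```

Geochemical modeling codes of the kind discussed compute such indices for every mineral in the thermodynamic database at once, which is why the completeness and quality of that database matter so much.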

  11. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available.
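The two storage strategies contrasted above can be illustrated side by side in miniature: tiles stored as BLOBs inside the database versus tiles stored on the filesystem with only a path recorded in a table. The sketch uses SQLite and a temporary directory as stand-ins for a cluster DBMS and a parallel filesystem; the table and tile names are invented.

```python
import os
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tiles_blob (tile_id TEXT PRIMARY KEY, data BLOB)")
conn.execute("CREATE TABLE tiles_file (tile_id TEXT PRIMARY KEY, path TEXT)")

payload = b"\x00\x01" * 4  # stand-in for a binary raster tile

# Strategy 1: store the tile itself as a BLOB; the DBMS manages the bytes.
conn.execute("INSERT INTO tiles_blob VALUES (?, ?)", ("t_0_0", payload))

# Strategy 2: store only a pointer; the filesystem manages the bytes.
tile_dir = tempfile.mkdtemp()
tile_path = os.path.join(tile_dir, "t_0_0.bin")
with open(tile_path, "wb") as f:
    f.write(payload)
conn.execute("INSERT INTO tiles_file VALUES (?, ?)", ("t_0_0", tile_path))

def read_blob(tile_id):
    row = conn.execute("SELECT data FROM tiles_blob WHERE tile_id = ?", (tile_id,)).fetchone()
    return row[0]

def read_file(tile_id):
    row = conn.execute("SELECT path FROM tiles_file WHERE tile_id = ?", (tile_id,)).fetchone()
    with open(row[0], "rb") as f:
        return f.read()

# Both strategies recover identical bytes; they differ in who manages them.
assert read_blob("t_0_0") == read_file("t_0_0")
```

The trade-off the abstract describes shows up even here: the BLOB path keeps one consistent store under transactional control, while the pointer path keeps the database small but leaves the files free to drift out of sync with the table.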

  12. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  13. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Implementing an office and departmental target responsibility system is an inevitable outcome of higher-education reform, and the statistical processing of student information is an important part of reviewing student performance under such a system. Based on an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
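The design elements listed, data sheets and fields, an association between sheets, and statistical processing on top, can be sketched as a minimal relational schema. The table and column names below are assumptions for illustration, not the paper's actual design; SQLite stands in for whatever RDBMS the authors used.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Two sheets linked by a foreign-key association: students and their evaluations.
conn.executescript("""
    CREATE TABLE students (
        student_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        department TEXT NOT NULL
    );
    CREATE TABLE evaluations (
        eval_id    INTEGER PRIMARY KEY,
        student_id INTEGER NOT NULL REFERENCES students(student_id),
        term       TEXT NOT NULL,
        score      REAL NOT NULL
    );
""")
conn.execute("INSERT INTO students VALUES (1, 'Li Wei', 'Physics')")
conn.executemany(
    "INSERT INTO evaluations (student_id, term, score) VALUES (?, ?, ?)",
    [(1, "2008-fall", 88.0), (1, "2009-spring", 92.0)],
)

# Statistical processing for the target responsibility review:
# per-department average score across all evaluations.
row = conn.execute("""
    SELECT s.department, AVG(e.score)
    FROM students s JOIN evaluations e ON e.student_id = s.student_id
    GROUP BY s.department
""").fetchone()
print(row)
```

The JOIN plus GROUP BY is the relational form of the "statistical processing" the abstract refers to: once the association between sheets is declared, departmental aggregates fall out of a single query.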

  14. Is Library Database Searching a Language Learning Activity?

    ERIC Educational Resources Information Center

    Bordonaro, Karen

    2010-01-01

    This study explores how non-native speakers of English think of words to enter into library databases when they begin the process of searching for information in English. At issue is whether or not language learning takes place when these students use library databases. Language learning in this study refers to the use of strategies employed by…

  15. Design and Performance of a Xenobiotic Metabolism Database Manager for Building Metabolic Pathway Databases

    EPA Science Inventory

    A major challenge for scientists and regulators is accounting for the metabolic activation of chemicals that may lead to increased toxicity. Reliable forecasting of chemical metabolism is a critical factor in estimating a chemical’s toxic potential. Research is underway to develo...

  16. MADMAX - Management and analysis database for multiple ~omics experiments.

    PubMed

    Lin, Ke; Kools, Harrie; de Groot, Philip J; Gavai, Anand K; Basnet, Ram K; Cheng, Feng; Wu, Jian; Wang, Xiaowu; Lommen, Arjen; Hooiveld, Guido J E J; Bonnema, Guusje; Visser, Richard G F; Muller, Michael R; Leunissen, Jack A M

    2011-01-01

    The rapid increase in ~omics datasets generated by microarray, mass spectrometry, and next generation sequencing technologies requires an integrated platform that can combine results from different ~omics datasets to provide novel insights into biological systems. MADMAX is designed to provide a solution for the storage and analysis of complex ~omics datasets. In addition, analysis results (such as lists of genes) can be merged to reveal candidate genes supported by all datasets. The system constitutes an ISA-Tab compliant LIMS part which is independent of the different analysis pipelines. A pilot study of different types of ~omics data in Brassica rapa demonstrates the possible use of MADMAX. The web-based user interface provides easy access to the data and analysis tools on top of the database. PMID:21778530
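The merging step described, revealing candidate genes supported by all datasets, is at heart a set intersection across the per-platform gene lists. A minimal sketch, with invented gene identifiers and dataset names:

```python
# Hypothetical candidate gene lists from three ~omics analyses (names invented).
results = {
    "microarray":  {"BrFLC2", "BrGA20ox", "BrMYB12", "BrCHS"},
    "ms_proteome": {"BrFLC2", "BrCHS", "BrPAL1"},
    "rna_seq":     {"BrFLC2", "BrCHS", "BrMYB12"},
}

def supported_by_all(gene_lists):
    """Return, sorted, the candidate genes present in every result set."""
    sets = list(gene_lists.values())
    merged = set(sets[0])
    for s in sets[1:]:
        merged &= s
    return sorted(merged)

print(supported_by_all(results))  # ['BrCHS', 'BrFLC2']
```

Only genes that survive every dataset's filter remain, which is exactly what makes such cross-platform candidates stronger than any single-platform hit.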

  18. Managing Geological Profiles in Databases for 3D Visualisation

    NASA Astrophysics Data System (ADS)

    Jarna, A.; Grøtan, B. O.; Henderson, I. H. C.; Iversen, S.; Khloussy, E.; Nordahl, B.; Rindstad, B. I.

    2016-10-01

    Geology and all geological structures are three-dimensional in space. GIS and databases are common tools used by geologists to interpret and communicate geological data. The NGU (Geological Survey of Norway) is the national institution for the study of bedrock, mineral resources, surficial deposits, groundwater and marine geology. 3D geology is usually described by geological profiles, or vertical sections through a map, which show the rock structure below the surface. The goal is to gradually expand the usability of existing and new geological profiles, to make them more available in end-user applications, and to enable easier entry and registration of profiles. The project target is to develop a methodology for the acquisition, modification and use of profile data and its presentation on the web, through a user interface directly linked to NGU's webpage. This will allow users to visualise profiles in a 3D model.

  19. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal, without a consistent design approach, over the past twenty years. These systems are now proving inadequate for effective management of tasks and administration of the Laboratory, and new approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, centered on the development of an integrated management and administrative database, are discussed.

  20. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever-increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: the Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both the consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create online data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system, along with its documentation, is open source and available from the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC) site. PMID:20567938
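
    The federated "fan the same query out to every site, then combine rows" pattern described in this abstract can be sketched in a few lines. This is a minimal illustration only, not the HID implementation: the table name (`subjects`), its schema, and the thread-per-site approach are all invented for the example.

```python
import sqlite3
import threading

def query_site(db_path, sql, params, results, idx):
    """Run one query against a single federated site database."""
    con = sqlite3.connect(db_path)
    try:
        results[idx] = con.execute(sql, params).fetchall()
    finally:
        con.close()

def federated_query(db_paths, sql, params=()):
    """Fan the same query out to every site in parallel, then combine the rows."""
    results = [None] * len(db_paths)
    threads = [threading.Thread(target=query_site, args=(p, sql, params, results, i))
               for i, p in enumerate(db_paths)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    combined = []
    for rows in results:
        combined.extend(rows or [])
    return combined
```

    Each site is queried on its own connection in its own thread, so a slow site does not serialize the others; the combiner here is a simple concatenation, whereas a real system would also deduplicate and merge schemas.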

  1. Flight Deck Interval Management Display. [Elements, Information and Annunciations Database User Guide

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  2. An Extensible "SCHEMA-LESS" Database Framework for Managing High-Throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    The object-relational database management system is an integrated, hybrid approach that combines the best practices of the relational model, with its SQL queries, and of the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
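
    The core schema-less idea, shredding an arbitrary hierarchical document into generic node rows so that one keyword query can match both context (element paths) and content (node text), can be sketched as follows. The table layout and function names are illustrative assumptions, not NETMARK's actual design:

```python
import sqlite3
import xml.etree.ElementTree as ET

def shred(xml_text):
    """Decompose an XML document into (path, text) rows -- no per-document schema."""
    rows = []
    def walk(node, path):
        p = path + "/" + node.tag
        if node.text and node.text.strip():
            rows.append((p, node.text.strip()))
        for child in node:
            walk(child, p)
    walk(ET.fromstring(xml_text), "")
    return rows

def load(con, xml_text):
    """Store every node of the document in one generic relational table."""
    con.execute("CREATE TABLE IF NOT EXISTS nodes (path TEXT, content TEXT)")
    con.executemany("INSERT INTO nodes VALUES (?, ?)", shred(xml_text))

def keyword_search(con, word):
    """Match the keyword against both context (element path) and content (text)."""
    like = "%" + word + "%"
    return con.execute(
        "SELECT path, content FROM nodes WHERE path LIKE ? OR content LIKE ?",
        (like, like)).fetchall()
```

    Because the `nodes` table never changes shape, any hierarchical document can be loaded without schema migration; the trade-off is that structure must be recovered from the path strings at query time.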

  3. An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The object-relational database management system is an integrated, hybrid approach that combines the best practices of the relational model, with its SQL queries, and of the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  4. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    The object-relational database management system is an integrated, hybrid approach that combines the best practices of the relational model, with its SQL queries, and of the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  5. MAGIC-SPP: a database-driven DNA sequence processing package with associated management tools

    PubMed Central

    Liang, Chun; Sun, Feng; Wang, Haiming; Qu, Junfeng; Freeman, Robert M; Pratt, Lee H; Cordonnier-Pratt, Marie-Michèle

    2006-01-01

    Background Processing raw DNA sequence data is an especially challenging task for relatively small laboratories and core facilities that produce 5000 or more DNA sequences per week from multiple projects in widely differing species. To meet this challenge, we have developed the flexible, scalable, and automated sequence processing package described here. Results MAGIC-SPP is a DNA sequence processing package consisting of an Oracle 9i relational database, a Perl pipeline, and user interfaces implemented either as JavaServer Pages (JSP) or as a Java graphical user interface (GUI). The database not only serves as a data repository, but also controls the processing of trace files. MAGIC-SPP includes an administrative interface, a laboratory information management system, and interfaces for exploring sequences, monitoring quality control, and troubleshooting problems related to sequencing activities. Its sequence trimming algorithm employs new features designed to improve performance with respect to concerns such as concatenated linkers, identification of the expected start position of a vector insert, and extension of the useful length of trimmed sequences by bridging short regions of low quality when the following high-quality segment is sufficiently long to justify doing so. Conclusion MAGIC-SPP has been designed to minimize human error while being robust, versatile, flexible and automated. It offers a unique combination of features that permit administration by a biologist with little or no informatics background. It is well suited to both individual research programs and core facilities. PMID:16522212
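
    The bridging behaviour described for the trimming algorithm, extending a trimmed read across a short low-quality gap when the following high-quality segment is long enough, can be sketched as below. The threshold, gap, and segment-length parameters are illustrative defaults, not the values MAGIC-SPP actually uses:

```python
def high_quality_segments(quals, threshold):
    """Maximal runs of base calls at or above the quality threshold."""
    segs, start = [], None
    for i, q in enumerate(quals):
        if q >= threshold and start is None:
            start = i
        elif q < threshold and start is not None:
            segs.append((start, i))
            start = None
    if start is not None:
        segs.append((start, len(quals)))
    return segs

def trim(quals, threshold=20, max_gap=5, min_bridge=20):
    """Choose the longest high-quality region, bridging a short low-quality gap
    when the following high-quality segment is long enough to justify it."""
    segs = high_quality_segments(quals, threshold)
    if not segs:
        return None
    merged = [segs[0]]
    for start, end in segs[1:]:
        prev_start, prev_end = merged[-1]
        if start - prev_end <= max_gap and end - start >= min_bridge:
            merged[-1] = (prev_start, end)   # bridge the low-quality gap
        else:
            merged.append((start, end))
    return max(merged, key=lambda seg: seg[1] - seg[0])
```

    For example, a read with 50 good bases, 3 poor bases, and 40 more good bases keeps all 93 positions, while a 10-base gap splits the read and only the longest segment survives.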

  6. Data management and database structure at the ARS Culture Collection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  7. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  8. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and database management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer-aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  9. Rail transit energy management program: Energy database. Volume 2. Final report

    SciTech Connect

    Uher, R.A.

    1995-03-01

    The Rail Transportation Energy Management Program (EMP) is a private/public partnership whose objective is to reduce rail transit energy cost and improve energy efficiency. The Energy Database (EDB) was set up under the program. The purpose of the EDB is to provide information to the members of the program, including rail transit energy and energy-cost data and the results of implementing energy-cost reduction strategies. The EDB also provides a means for the timely exchange of information among transit authorities and others associated with energy management. The database is presently set up on a personal computer and is accessed by users via an 800 telephone line.

  10. Environmental management activities

    SciTech Connect

    1997-07-01

    The Office of Environmental Management (EM) has been delegated the responsibility for the Department of Energy's (DOE's) cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem requires the identification of technologies and scientific expertise from domestic and foreign sources. Within the United States, operational DOE facilities, as well as the decontamination and decommissioning of inactive facilities, have produced significant amounts of radioactive, hazardous, and mixed wastes. In order to ensure worker safety and the protection of the public, DOE must: (1) assess, remediate, and monitor sites and facilities; (2) store, treat, and dispose of wastes from past and current operations; and (3) develop and implement innovative technologies for environmental restoration and waste management. The EM directive necessitates looking beyond domestic capabilities to technological solutions found outside US borders. Following the collapse of the Soviet regime, formerly restricted elite Soviet scientific expertise became available to the West. EM has established a cooperative technology development program with Russian scientific institutes that meets domestic cleanup objectives by: (1) identifying and accessing Russian EM-related technologies, thereby leveraging investments and providing cost savings; (2) improving access to technical information, scientific expertise, and technologies applicable to EM needs; and (3) increasing US private-sector opportunities in Russia in EM-related areas.

  11. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  12. Management of three-dimensional and anthropometric databases: Alexandria and Cleopatra

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; Robinette, Kathleen; Rioux, Marc

    2000-10-01

    This paper describes two systems for managing 3D and anthropometric databases, namely Alexandria and Cleopatra. Each system is made up of three parts: the crawler, the analyzer, and the search engine. The crawler retrieves content from the network, while the analyzer automatically describes the shape, scale, and color of each retrieved object and writes a compact descriptor. The search engine applies the query-by-example paradigm to find and retrieve similar or related objects from the database based on different aspects of 3D shape, scale, and color distribution. The descriptors are defined and the implementation of the system is detailed. The application of the system to the CAESAR anthropometric survey is discussed. Experimental results from the CAESAR database and from generic databases are presented.
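
    The query-by-example paradigm over compact descriptors can be sketched as follows. The normalized-histogram descriptor and Euclidean distance below are deliberate simplifications for illustration, not the actual shape and color descriptors used by Alexandria or Cleopatra:

```python
import math

def descriptor(values, bins=8, lo=0.0, hi=1.0):
    """Compact descriptor: a normalized histogram of some shape/color measure."""
    hist = [0] * bins
    for v in values:
        idx = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def query_by_example(example, database, k=3):
    """Rank stored objects by Euclidean distance between their descriptors."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    ranked = sorted(database.items(), key=lambda item: dist(example, item[1]))
    return [name for name, _ in ranked[:k]]
```

    Because every object is reduced to a fixed-length vector at ingest time, "find objects like this one" becomes a nearest-neighbour search over descriptors rather than a comparison of raw 3D data.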

  13. ``STANDARD LIBRARY'': A relational database for the management of electron microprobe standards

    NASA Astrophysics Data System (ADS)

    Diamond, Larryn W.; Schmatz, Dirk; Würsten, Felix

    1994-05-01

    Laboratory collections of well-characterized solid materials are an indispensable basis for the calibration of quantitative electron microprobe analyses. The STANDARD LIBRARY database has been designed to manage the wide variety of information needed to characterize such standards, and to provide a rapid way by which these data can be accessed. In addition to physical storage information, STANDARD LIBRARY includes a full set of chemical and mineralogic characterization variables, and a set of variables specific to microprobe calibration (instrumental setup, standard homogeneity, etc.). Application programs for STANDARD LIBRARY provide a series of interactive screen views for database search, retrieval, and editing operations (including inventories). Search and inventory results can be written as UNIX data files, some of which are formatted to be read directly by the software that controls CAMECA SX50™ electron microprobes. The application programs are coded in OSL for the INGRES™ database-management system, and run within any environment that supports INGRES™ (e.g. UNIX, VMS, DOS, etc.). STANDARD LIBRARY has been generalized, however, such that only the physical storage structure of the database is dependent on the selected database-management system.

  14. Information flow in the DAMA Project beyond database managers: Information flow managers

    SciTech Connect

    Russell, L.; Wolfson, O.; Yu, C.

    1996-03-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, the sharing of point-of-sale information, is being considered in the Demand Activated Manufacturing Project of the American Textile Partnership. A scenario is examined in which 100,000 retail outlets communicate over a period of days, providing the latest estimate of demand for sewn products across a chain of 26,000 suppliers through the use of bill-of-materials explosions at four levels of detail. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced to keep estimates of demand as current as possible.
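
    A bill-of-materials explosion of the kind the scenario relies on can be sketched as below; the product structure and component names are invented for illustration and are not from the DAMA project:

```python
def explode(bom, item, quantity, demand=None):
    """Roll demand for a finished item down the bill of materials, accumulating
    the required quantity of every component at every level."""
    if demand is None:
        demand = {}
    demand[item] = demand.get(item, 0) + quantity
    for component, per_unit in bom.get(item, []):
        explode(bom, component, quantity * per_unit, demand)
    return demand
```

    Point-of-sale demand at the top (e.g. 100 shirts) cascades into component requirements at each lower level of the supply chain, which is what lets the downstream suppliers in the scenario see current demand estimates.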

  15. The COMET initiative database: progress and activities update (2014).

    PubMed

    Gargon, Elizabeth; Williamson, Paula R; Altman, Doug G; Blazeby, Jane M; Clarke, Mike

    2015-01-01

    The COMET Initiative database is a repository of studies relevant to the development of core outcome sets (COS). Use of the website continues to increase, with more than 16,500 visits in 2014 (36 % increase over 2013), 12,257 unique visitors (47 % increase), 9780 new visitors (43 % increase) and a rise in the proportion of visits from outside the UK (8565 visits; 51 % of all visits). By December 2014, a total of 6588 searches had been completed, with 2383 in 2014 alone (11 % increase). The growing awareness of the need for COS is reflected in the website and database usage figures. PMID:26558998

  16. Environmental Management vitrification activities

    SciTech Connect

    Krumrine, P.H.

    1996-05-01

    Both the Mixed Waste and Landfill Stabilization Focus Areas, as part of the Office of Technology Development efforts within the Department of Energy's (DOE) Environmental Management (EM) Division, have been developing various vitrification technologies as a treatment approach for the large quantities of transuranic (TRU), TRU mixed and mixed low-level wastes that are stored in either landfills or above-ground storage facilities. The technologies being developed include joule-heated, plasma torch, plasma arc, induction, microwave, combustion, molten metal, and in situ methods. Related efforts are going into developing glass, ceramic, and slag waste-form windows of opportunity for the diverse quantities of heterogeneous wastes needing treatment. These studies examine both processing parameters and long-term performance parameters as a function of composition, to assure that developed technologies have the right chemistry for success.

  17. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

    We present Insight, an integrated database and analysis platform for epilepsy self-management research, as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies, with several new data management features and user-friendly functionalities. The features of Insight include: (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying, (2) visualization tools to support real-time exploration of data distribution across research studies, and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represent over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails Web technology and the open-source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a role-based access control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee, consisting of representatives of all current collaborating centers of the Network. New research studies are being continuously added to the Insight database, and the size as well as the unique coverage of the dataset allows investigators to conduct

  19. Functions and Relations: Some Applications from Database Management for the Teaching of Classroom Mathematics.

    ERIC Educational Resources Information Center

    Hauge, Sharon K.

    While functions and relations are important concepts in the teaching of mathematics, research suggests that many students lack an understanding and appreciation of these concepts. The present paper discusses an approach for teaching functions and relations that draws on the use of illustrations from database management. This approach has the…

  20. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    NASA Astrophysics Data System (ADS)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
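
    Mirroring a machine-readable collection index as plain HTML, as NOVA does so that internet search engines can crawl its Image Content Groups, can be sketched as follows. The element names (`icg`, `title`, `member`, `href`) are hypothetical, not NOVA's actual XML schema:

```python
import xml.etree.ElementTree as ET

def icg_to_html(icg_xml):
    """Mirror an Image Content Group index entry as plain HTML, so an internet
    search engine can crawl the collection index directly."""
    icg = ET.fromstring(icg_xml)
    title = icg.findtext("title", default="untitled")
    hrefs = [member.get("href") for member in icg.findall("member")]
    lines = ["<html><head><title>%s</title></head><body>" % title,
             "<h1>%s</h1>" % title, "<ul>"]
    lines += ['<li><a href="%s">%s</a></li>' % (h, h) for h in hrefs]
    lines += ["</ul>", "</body></html>"]
    return "\n".join(lines)
```

    The design point is that the HTML pages are generated from the same index that drives the archive itself, so a crawler following ordinary links ends up indexing the deep relational structure of the collection.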

  1. Information Technologies in Public Health Management: A Database on Biocides to Improve Quality of Life

    PubMed Central

    Roman, C; Scripcariu, L; Diaconescu, RM; Grigoriu, A

    2012-01-01

    Background Biocides for prolonging the shelf life of a large variety of materials have been used extensively over recent decades. Worldwide biocide consumption was estimated at about 12.4 billion dollars in 2011, and was expected to increase in 2012. As biocides are substances we come into contact with in our everyday lives, access to this type of information is of paramount importance in order to ensure an appropriate living environment. Consequently, a database where information may be quickly processed, sorted, and easily accessed according to different search criteria is the most desirable solution. The main aim of this work was to design and implement a relational database with complete information about biocides used in public health management to improve the quality of life. Methods: Design and implementation of a relational database for biocides, using the software “phpMyAdmin”. Results: A database which allows for the efficient collection, storage, and management of information, including the chemical properties and applications of a large number of biocides, as well as its adequate dissemination into the public health environment. Conclusion: The information contained in the database herein presented promotes the adequate use of biocides by means of information technologies, which in consequence may help achieve important improvements in our quality of life. PMID:23113190

  2. PRAIRIEMAP: A GIS database for prairie grassland management in western North America

    USGS Publications Warehouse

    ,

    2003-01-01

    The USGS Forest and Rangeland Ecosystem Science Center, Snake River Field Station (SRFS) maintains a database of spatial information, called PRAIRIEMAP, which is needed to address the management of prairie grasslands in western North America. We identify and collect spatial data for the region encompassing the historical extent of prairie grasslands (Figure 1). State and federal agencies, the primary entities responsible for management of prairie grasslands, need this information to develop proactive management strategies to prevent prairie-grassland wildlife species from being listed as Endangered Species, or to develop appropriate responses if listing does occur. Spatial data are an important component in documenting current habitat and other environmental conditions, which can be used to identify areas that have undergone significant changes in land cover and to identify underlying causes. Spatial data will also be a critical component guiding the decision processes for restoration of habitat in the Great Plains. As such, the PRAIRIEMAP database will facilitate analyses of large-scale and range-wide factors that may be causing declines in grassland habitat and populations of species that depend on it for their survival. Therefore, development of a reliable spatial database carries multiple benefits for land and wildlife management. The project consists of 3 phases: (1) identify relevant spatial data, (2) assemble, document, and archive spatial data on a computer server, and (3) develop and maintain the web site (http://prairiemap.wr.usgs.gov) for query and transfer of GIS data to managers and researchers.

  3. GSIMF : a web service based software and database management system for the generation grids.

    SciTech Connect

    Wang, N.; Ananthan, B.; Gieraltowski, G.; May, E.; Vaniachine, A.; Tech-X Corp.

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

  4. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
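
    The subdivision step described above, splitting a Public Land Survey System section into sixteen idealized quarter-quarter sections, amounts to overlaying a 4x4 grid on the section. The following is a simplified sketch (it assumes an idealized square section; the labeling scheme, function name, and coordinates are illustrative, not taken from the USGS system):

```python
# Quadrant labels by (row, col) within a 2x2 split: row 0 is north, col 0 is west.
QUADS = {(0, 0): "NW", (0, 1): "NE", (1, 0): "SW", (1, 1): "SE"}

def quarter_quarter_sections(x_origin, y_origin, section_width):
    """Subdivide an idealized square section into a 4x4 grid of sixteen
    quarter-quarter sections.

    (x_origin, y_origin) is the section's northwest corner; x increases
    eastward and y increases northward.  Returns a dict mapping labels
    such as 'NW of NE' (quarter-quarter within quarter) to bounding
    boxes (x_min, y_min, x_max, y_max).
    """
    cell = section_width / 4.0
    cells = {}
    for row in range(4):          # 0 = northernmost row of cells
        for col in range(4):      # 0 = westernmost column of cells
            quarter = QUADS[(row // 2, col // 2)]   # which quarter section
            sub = QUADS[(row % 2, col % 2)]         # which cell within it
            x_min = x_origin + col * cell
            y_max = y_origin - row * cell           # y decreases going south
            cells[f"{sub} of {quarter}"] = (x_min, y_max - cell,
                                            x_min + cell, y_max)
    return cells
```

    For a standard one-mile section the sixteen cells correspond to the familiar 40-acre quarter-quarter parcels used as the display grid.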

  5. The Coral Triangle Atlas: an integrated online spatial database system for improving coral reef management.

    PubMed

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the 'Coral Triangle Area' in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region.

  7. Career management: an active process.

    PubMed

    Mackowiak, J; Eckel, F M

    1985-03-01

    The self-assessment, goal-setting, and career-planning techniques of career management are discussed, along with the organization's role in the process. Career management is a planned process, initiated and carried out by an individual with the assistance of others. Because work and nonwork activities are so interrelated, career and life management planning can maximize a pharmacist's personal success. The career- and life-management process begins with the development of a personal definition of success. A self-assessment must be made of one's values, needs, interests, and activities. The next step of the process involves setting goals and establishing a plan or strategy to achieve them. Establishing a career path requires researching alternate career goals. Career competencies are identified that can increase an employee's chances of success. The employer shares the responsibility for career development through coaching, job structuring, and keeping the employee aware of constraints. Through the integration of the roles of the individual and the organization in the career-management process, employees can optimize their contribution to an organization. Pharmacists can successfully manage their careers by applying the techniques of self-assessment, goal setting, and career planning. PMID:3985018

  8. TheSNPpit—A High Performance Database System for Managing Large Scale SNP Data

    PubMed Central

    Groeneveld, Eildert; Lichtenberg, Helmut

    2016-01-01

    The fast development of high-throughput genotyping has opened up new possibilities in genetics while at the same time producing considerable data handling issues. TheSNPpit is a database system for managing large amounts of multi-panel SNP genotype data from any genotyping platform. With an increasing rate of genotyping in areas like animal and plant breeding as well as human genetics, hundreds of thousands of individuals already need to be managed. While the common database design with one row per SNP can manage hundreds of samples, this approach becomes progressively slower as the size of the data sets increases, until it finally fails completely once tens or even hundreds of thousands of individuals need to be managed. TheSNPpit implements three ideas to accommodate such large-scale experiments: highly compressed vector storage in a relational database, set-based data manipulation, and a very fast export written in C, with Perl as the base for the framework and PostgreSQL as the database backend. Its novel subset system allows the creation of named subsets based on the filtering of SNPs (by major allele frequency, no-calls, and chromosomes) and manually applied sample and SNP lists at negligible storage costs, thus avoiding the issue of proliferating file copies. The named subsets are exported for downstream analysis. PLINK ped and map files are processed as inputs and outputs. TheSNPpit allows management of different panel sizes in the same population of individuals when higher-density panels replace previous lower-density versions, as occurs in animal and plant breeding programs. A completely generalized procedure allows storage of phenotypes. TheSNPpit occupies only 2 bits to store a single SNP, implying a capacity of 4 million SNPs per 1 MB of disk storage. To investigate performance scaling, a database with more than 18.5 million samples has been created with 3.4 trillion SNPs from 12 panels ranging from 1000 through 20 million SNPs resulting in a
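
    The 2-bits-per-SNP figure quoted above implies straightforward bit packing: four genotype codes per byte. The following is a minimal sketch of such an encoding (an illustration of the storage idea only; the function names and genotype coding are invented, not taken from TheSNPpit):

```python
# Assumed genotype coding for this sketch: 0 = homozygous reference,
# 1 = heterozygous, 2 = homozygous alternate, 3 = no-call.
# Each code fits in 2 bits, so four codes pack into one byte.

def pack_genotypes(codes):
    """Pack a sequence of 2-bit genotype codes into bytes."""
    packed = bytearray((len(codes) + 3) // 4)
    for i, code in enumerate(codes):
        if not 0 <= code <= 3:
            raise ValueError("genotype code must be in 0..3")
        packed[i // 4] |= code << (2 * (i % 4))
    return bytes(packed)

def unpack_genotypes(packed, n):
    """Recover the first n genotype codes from packed bytes."""
    return [(packed[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

# At 2 bits per SNP, 10**6 bytes hold 4 million genotype calls, which is
# the 4-million-SNPs-per-MB capacity quoted in the abstract.
```

    At this density a 50,000-SNP panel for one sample occupies about 12.5 kB, the kind of footprint that makes millions of samples tractable in a relational store.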

  9. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    ), information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments related to the activity of landslides. Spatial objects representing the components of a landslide, in particular the scarps and the accumulation areas, are also stored. In addition, waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data are included. Examples of spatial queries to the database are intersections of raster and vector data for calculating slope gradients or aspects of landslide areas, for creating multiple overlaying sections for the comparison of slopes, and for computing distances to infrastructure or to the nearest receiving drainage, as well as queries on landslide magnitudes, distribution, and clustering, and on potential correlations with geomorphological or geological conditions. The data management concept in this study can be implemented for any academic, public, or private use, because it is independent of any obligatory licenses. The created spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping.

  10. Metadata Dictionary Database: A Proposed Tool for Academic Library Metadata Management

    ERIC Educational Resources Information Center

    Southwick, Silvia B.; Lampert, Cory

    2011-01-01

    This article proposes a metadata dictionary (MDD) be used as a tool for metadata management. The MDD is a repository of critical data necessary for managing metadata to create "shareable" digital collections. An operational definition of metadata management is provided. The authors explore activities involved in metadata management in…

  11. Development of a database management system for Coal Combustion By-Products (CCBs)

    SciTech Connect

    O'Leary, E.M.; Peck, W.D.; Pflughoeft-Hassett, D.F.

    1997-06-01

    Coal combustion by-products (CCBs) are produced in high volumes worldwide. Utilization of these materials is economically and environmentally advantageous and is expected to increase as disposal costs increase. The American Coal Ash Association (ACAA) is developing a database to contain characterization and utilization information on CCBs. This database will provide information for use by managers, marketers, operations personnel, and researchers that will aid in their decision making and long-term planning for issues related to CCBs. The comprehensive nature of the database and the interactive user application will enable ACAA members to efficiently and economically access a wealth of data on CCBs and will promote the technically sound, environmentally safe, and commercially competitive use of CCBs.

  12. Rhode Island Water Supply System Management Plan Database (WSSMP-Version 1.0)

    USGS Publications Warehouse

    Granato, Gregory E.

    2004-01-01

    In Rhode Island, the availability of water of sufficient quality and quantity to meet current and future environmental and economic needs is vital to life and the State's economy. Water suppliers, the Rhode Island Water Resources Board (RIWRB), and other State agencies responsible for water resources in Rhode Island need information about available resources, the water-supply infrastructure, and water use patterns. These decision makers need historical, current, and future water-resource information. In 1997, the State of Rhode Island formalized a system of Water Supply System Management Plans (WSSMPs) to characterize and document relevant water-supply information. All major water suppliers (those that obtain, transport, purchase, or sell more than 50 million gallons of water per year) are required to prepare, maintain, and carry out WSSMPs. An electronic database for this WSSMP information has been deemed necessary by the RIWRB for water suppliers and State agencies to consistently document, maintain, and interpret the information in these plans. Availability of WSSMP data in standard formats will allow water suppliers and State agencies to improve the understanding of water-supply systems and to plan for future needs or water-supply emergencies. In 2002, however, the Rhode Island General Assembly passed a law that classifies some of the WSSMP information as confidential to protect the water-supply infrastructure from potential terrorist threats. Therefore the WSSMP database was designed for an implementation method that will balance security concerns with the information needs of the RIWRB, suppliers, other State agencies, and the public. A WSSMP database was developed by the U.S. Geological Survey in cooperation with the RIWRB. The database was designed to catalog WSSMP information in a format that would accommodate synthesis of current and future information about Rhode Island's water-supply infrastructure. 
This report documents the design and implementation of

  13. Reef Ecosystem Services and Decision Support Database

    EPA Science Inventory

    This scientific and management information database utilizes systems thinking to describe the linkages between decisions, human activities, and provisioning of reef ecosystem goods and services. This database provides: (1) Hierarchy of related topics - Click on topics to navigat...

  14. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    PubMed

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha(-1) for total N and 2.2 kg ha(-1) for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.

  15. Watershed Data Management (WDM) Database for Salt Creek Streamflow Simulation, DuPage County, Illinois

    USGS Publications Warehouse

    Murphy, Elizabeth A.; Ishii, Audrey

    2006-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Department of Engineering, Stormwater Management Division, maintains a database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. The majority of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Illinois. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. This report describes a version of the WDM database that was quality-assured and quality-controlled annually to ensure the datasets were complete and accurate. This version of the WDM database contains data from January 1, 1997, through September 30, 2004, and is named SEP04.WDM. This report provides a record of time periods of poor data for each precipitation dataset and describes methods used to estimate the data for the periods when data were missing, flawed, or snowfall-affected. The precipitation dataset data-filling process was changed in 2001, and both processes are described. The other meteorologic and hydrologic datasets in the database are fully described in the annual U.S. Geological Survey Water Data Report for Illinois and, therefore, are described in less detail than the precipitation datasets in this report.

  16. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    NASA Technical Reports Server (NTRS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays, or binary large objects. We will discuss Kepler DB's APIs, implementation, usage, and deployment at the Kepler Science Operations Center.

  17. Somma-Vesuvius' activity: a mineral chemistry database

    NASA Astrophysics Data System (ADS)

    Redi, Daniele; Cannatelli, Claudia; Esposito, Rosario; Lima, Annamaria; Petrosino, Paola; De Vivo, Benedetto

    2016-08-01

    Clinopyroxene and olivine are ubiquitous phases in Somma-Vesuvius (SV) volcanics, and for the first time they were systematically studied in several products younger than 40 ka. In this manuscript, chemical compositions (major, trace, and rare earth elements) of a large set of olivine and clinopyroxene crystals from selected rock samples are presented and discussed. Fourteen pumice samples from Plinian pyroclastic deposits as well as three scoriae and eight lava samples from inter-Plinian deposits were collected. A representative number of olivine and clinopyroxene crystals (n ~ 50) were selected for each sample and analysed by electron microprobe and laser ablation inductively coupled plasma mass spectrometry, resulting in a large database that is now available to the scientific community. All studied eruptive products contain olivine and clinopyroxene crystals spanning a wide range of compositions. Olivines show Fo content varying from 91 to 68, while clinopyroxenes display Mg# ranging from 93 to 71. In samples younger than A.D. 79, the more evolved (Mg# 82-72) clinopyroxene crystals show clear Ca enrichment (~23.5-24.5 wt% CaO) with respect to those from older samples (pre-A.D. 79, ~21-23 wt% CaO). The results corroborate disequilibrium between olivine, clinopyroxene, and the hosting melt, and an increasing role of carbonate assimilation in SV magma evolution in the last 2 ka. The database produced here is intended as a shared product that makes mineral data available for further studies by researchers investigating the geochemical evolution of the SV system.

  18. Database system for management of health physics and industrial hygiene records.

    SciTech Connect

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-10-05

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection.

  19. Dynamic Tables: An Architecture for Managing Evolving, Heterogeneous Biomedical Data in Relational Database Management Systems

    PubMed Central

    Corwin, John; Silberschatz, Avi; Miller, Perry L.; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models. PMID:17068350
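
    The vertical (object-attribute-value) layout that the abstract contrasts with conventional relational designs can be illustrated with a small self-contained example; the table and attribute names here are invented for illustration and do not reflect the authors' engine:

```python
import sqlite3

# A sparse clinical record stored vertically: one row per
# (entity, attribute, value) triple, so entities need not share
# a fixed set of columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE eav (entity TEXT, attribute TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO eav VALUES (?, ?, ?)",
    [
        ("patient1", "age", "54"),
        ("patient1", "systolic_bp", "128"),
        ("patient2", "age", "61"),
        ("patient2", "genotype_rs123", "AG"),  # attribute patient1 lacks
    ],
)

# Pivoting back to one row per entity for the attributes of interest;
# missing attributes surface as NULL rather than forcing schema changes.
rows = conn.execute(
    """
    SELECT entity,
           MAX(CASE WHEN attribute = 'age' THEN value END) AS age,
           MAX(CASE WHEN attribute = 'systolic_bp' THEN value END) AS bp
    FROM eav GROUP BY entity ORDER BY entity
    """
).fetchall()
print(rows)  # [('patient1', '54', '128'), ('patient2', '61', None)]
```

    The pivot query hints at the paper's performance concern: reconstructing wide rows from a vertical store costs one conditional aggregate per attribute, which is what a purpose-built sparse column store avoids.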

  20. University Management of Research: A Data-Based Policy and Planning. AIR 1989 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Strubbe, J.

    The development of an appropriate research policy for a university as well as for the national and international levels can be accomplished only if quantitative data and qualitative evaluations (scientific contribution, results, goal-achievement) are made available to illustrate research activities. A database is described that would enable…

  1. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Management

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards, such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  2. Metabolonote: A Wiki-Based Database for Managing Hierarchical Metadata of Metabolome Analyses

    PubMed Central

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics – technology for comprehensive detection of small molecules in an organism – lags behind the other “omics” in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called “Togo Metabolome Data” (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers’ understanding and use of data but also submitters’ motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata, together with feedback from readers, also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata entries for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http

  3. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  4. Database Administrator

    ERIC Educational Resources Information Center

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  5. BDC: Operational Database Management for the SPOT/HELIOS Operations Control System

    NASA Astrophysics Data System (ADS)

    Guiral, P.; Teodomante, S.

    Since the operational database is essential to the executable environment of a satellite control system (e.g., for monitoring and commanding), the French Space Agency (CNES), as ground segment designer and satellite operator, allocates significant resources to the development of Operational Database Management tools. Such tools are necessary to generate and maintain the operations control system (OCS) data repository throughout the life of the space system. In this context, the objectives of this paper are firstly to present lessons learnt from the SPOT/Helios product line and secondly to point out the new challenges related to the increasing number of satellite systems to qualify and maintain during the upcoming years. "BDC", as a component of the SPOT/Helios operations control system, is an Operational Database Management tool designed and developed by CNES. This tool has been used since 1998 for SPOT4, was then upgraded for Helios 1A/1B and SPOT5, and is currently being customized for Helios 2A. We emphasize CNES's need for a tool enabling significant flexibility in handling data modifications during technical and operational qualification phases. This implies an evolution of the data exchanges between the satellite contractor, Astrium, and CNES, as well as constraints on the tool development process, leading to the choice of first developing a prototype and then industrializing it. After a brief description of the data, the tool is described technically, in particular its architecture and the design choices that allow reuse across different satellite lines. Keywords: satellite operations, operations control system, data management, relational database.

  6. KNApSAcK Metabolite Activity Database for retrieving the relationships between metabolites and biological activities.

    PubMed

    Nakamura, Yukiko; Afendi, Farit Mochamad; Parvin, Aziza Kawsar; Ono, Naoaki; Tanaka, Ken; Hirai Morita, Aki; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Kanaya, Shigehiko

    2014-01-01

    Databases (DBs) are required by various omics fields because the volume of molecular biology data is increasing rapidly. In this study, we provide instructions for users and describe the current status of our metabolite activity DB. To facilitate a comprehensive understanding of the interactions between the metabolites of organisms and the chemical-level contribution of metabolites to human health, we constructed a metabolite activity DB known as the KNApSAcK Metabolite Activity DB. It comprises 9,584 triplet relationships (metabolite-biological activity-target species), including 2,356 metabolites, 140 activity categories, 2,963 specific descriptions of biological activities and 778 target species. Approximately 46% of the activities described in the DB are related to chemical ecology, most of which are attributed to antimicrobial agents and plant growth regulators. The majority of the metabolites with antimicrobial activities are flavonoids and phenylpropanoids. The metabolites with plant growth regulatory effects include plant hormones. Over half of the DB contents are related to human health care and medicine. The five largest groups are toxins, anticancer agents, nervous system agents, cardiovascular agents and non-therapeutic agents, such as flavors and fragrances. The KNApSAcK Metabolite Activity DB is integrated within the KNApSAcK Family DBs to facilitate further systematized research in various omics fields, especially metabolomics, nutrigenomics and foodomics. The KNApSAcK Metabolite Activity DB could also be utilized for developing novel drugs and materials, as well as for identifying viable drug resources and other useful compounds.
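
    The triplet structure described above (metabolite, biological activity category, target species) lends itself to simple set-style filtering. A small sketch with invented example triplets (the real DB holds 9,584 such relationships; these entries are illustrative only):

```python
from collections import Counter

# Hypothetical triplets in the (metabolite, activity category, target
# species) form the KNApSAcK Metabolite Activity DB uses.
triplets = [
    ("quercetin", "antimicrobial", "Staphylococcus aureus"),
    ("quercetin", "anticancer", "Homo sapiens"),
    ("gibberellin A1", "plant growth regulator", "Oryza sativa"),
    ("capsaicin", "nervous system agent", "Homo sapiens"),
]

def by_activity(category):
    """Return the metabolites annotated with the given activity category."""
    return sorted({m for m, a, s in triplets if a == category})

# Tallying activity categories mirrors the abstract's breakdown of the DB
# into groups such as antimicrobial agents and plant growth regulators.
activity_counts = Counter(a for _, a, _ in triplets)
```

    Queries in the other directions (all activities of one metabolite, or all metabolites acting on one species) are the same one-line filters over the triple list.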

  7. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.

  8. Kal-Haiti a Research Database for Risks Management and Sustainable Reconstruction in Haiti

    NASA Astrophysics Data System (ADS)

    Giros, A.; Fontannaz, D.; Allenbach, B.; Treinsoutrot, D.; De Michele, M.

    2012-07-01

    Following the earthquake of 12 January 2010 in Haiti, the French Agence Nationale de la Recherche funded a project named KAL-Haiti, which aims at gathering remote sensing imagery as well as in-situ and exogenous data into a knowledge base. This database, conceived as a shareable resource, can serve as a basis for the reconstruction of the country and as a reference for scientific studies devoted to all phases of risk management. The project's main outcome will be a geo-referenced database containing a selection of remotely sensed imagery acquired before and after the disaster, supplemented with all relevant ancillary data and enriched with in-situ measurements and exogenous data. The resulting reference database is freely available for research and for reconstruction tasks. Users are strongly expected to become contributors by sharing their own data products, thus participating in the growth of the initial kernel. The database will also be enriched with new satellite images, monitoring the evolution of the Haitian situation over the next 10 years.

  9. Conceptual database modeling: a method for enabling end users (radiologists) to understand and develop their information management applications.

    PubMed

    Hawkins, H; Young, S K; Hubert, K C; Hallock, P

    2001-06-01

    As medical technology advances at a rapid pace, clinicians become further and further removed from the design of their own technological tools. This is particularly evident with information management. For radiologists, clinical histories, patient reports, and other pertinent information require sophisticated tools for data handling. However, as databases grow more powerful and sophisticated, systems require the expertise of programmers and information technology personnel. The radiologist, the clinician end-user, must maintain involvement in the development of system tools to ensure effective information management. Conceptual database modeling is a design method that serves to bridge the gap between the technological aspects of information management and its clinical applications. Conceptual database modeling involves developing information systems in simple language so that anyone can have input into the overall design. This presentation describes conceptual database modeling, using object role modeling, as a means by which end-users (clinicians) may participate in database development.

  10. A cohort and database study of airway management in patients undergoing thyroidectomy for retrosternal goitre.

    PubMed

    Gilfillan, N; Ball, C M; Myles, P S; Serpell, J; Johnson, W R; Paul, E

    2014-11-01

    Patients undergoing thyroid surgery with retrosternal goitre may raise concerns for the anaesthetist, especially airway management. We reviewed a multicentre prospective thyroid surgery database and extracted data for those patients with retrosternal goitre. Additionally, we reviewed the anaesthetic charts of patients with retrosternal goitre at our institution to identify the anaesthetic induction technique and airway management. Of 4572 patients in the database, 919 (20%) had a retrosternal goitre. Two cases of early postoperative tracheomalacia were reported, one in the retrosternal group. Despite some very large goitres, no patient required tracheostomy or cardiopulmonary bypass and there were no perioperative deaths. In the subset of 133 patients managed at our institution over six years, there were no major adverse anaesthetic outcomes and no patient had a failed airway or tracheomalacia. In the latter cohort, of 32 (24%) patients identified as having a potentially difficult airway, 17 underwent awake fibreoptic tracheal intubation, but two of these were abandoned and converted to intravenous induction and general anaesthesia. Eleven had inhalational induction; two of these were also abandoned and converted to intravenous induction and general anaesthesia. Of those suspected as having a difficult airway, 28 (87.5%) subsequently had direct laryngoscopy where the laryngeal inlet was clearly visible. We found no good evidence that thyroid surgery patients with retrosternal goitre, with or without symptoms and signs of tracheal compression, present the experienced anaesthetist with an airway that cannot be managed using conventional techniques. This does not preclude the need for multidisciplinary discussion and planning. PMID:25342401

  11. A Prescribed Fire Emission Factors Database for Land Management and Air Quality Applications

    NASA Astrophysics Data System (ADS)

    Lincoln, E.; Hao, W.; Baker, S.; Yokelson, R. J.; Burling, I. R.; Urbanski, S. P.; Miller, W.; Weise, D. R.; Johnson, T. J.

    2010-12-01

    Prescribed fire is a significant emissions source in the U.S. that needs to be adequately characterized in atmospheric transport/chemistry models. In addition, the Clean Air Act, its amendments, and air quality regulations require that prescribed fire managers estimate the quantity of emissions that a prescribed fire will produce. Several published papers contain a few emission factors for prescribed fire, and additional results are found in unpublished documents whose quality has to be assessed. In conjunction with three research projects developing detailed new emissions data and meteorological tools to assist prescribed fire managers, the Strategic Environmental Research and Development Program (SERDP) is supporting development of a database that contains emissions information related to prescribed burning. Ultimately, this database will be available on the Internet and will contain older emissions information that has been assessed, together with newer emissions information developed from both laboratory-scale and field measurements. The database currently contains emissions information from over 300 burns of different wildland vegetation types, including grasslands, shrublands, woodlands, forests, and tundra over much of North America. A summary of the compiled data will be presented, along with suggestions for additional categories.

  12. Advanced Scientific Computing Environment Team new scientific database management task. Progress report

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment comprising supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) that permit use of relational, hierarchical, object-oriented, GIS, and other databases. Reaching this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  13. Head-to-Head Evaluation of the Pro-Cite and Sci-Mate Bibliographic Database Management Systems.

    ERIC Educational Resources Information Center

    Saari, David S.; Foster, George A., Jr.

    1989-01-01

    Compares two full featured database management systems for bibliographic information in terms of programs and documentation; record creation and editing; online database citations; search procedures; access to references in external text files; sorting and printing functions; style sheets; indexes; and file operations. (four references) (CLB)

  14. The Use of SQL and Second Generation Database Management Systems for Data Processing and Information Retrieval in Libraries.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1989-01-01

    Describes Structured Query Language (SQL), the result of an American National Standards Institute effort to standardize language used to query computer databases and a common element in second generation database management systems. The discussion covers implementations of SQL, associated products, and techniques for its use in online catalogs,…

  15. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK), and we outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together the cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources; without such an intentionally collaborative process, this unique tool would not have been developed. The tool is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. PMID:25682266

  17. Activated charcoal. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-06-01

    The bibliography contains citations concerning theoretical aspects and industrial applications of activated charcoal. Topics include adsorption capacity and mechanism studies, kinetic and thermodynamic aspects, and description and evaluation of adsorptive abilities. Applications include use in water analyses and waste treatment, air pollution control and measurement, and in nuclear facilities. (Contains a minimum of 151 citations and includes a subject term index and title list.)

  18. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Each frame transform is updated by its owner: FM updates the site and saved frames of the surface tree, while several clients, including the ARM and RSM (Remote Sensing Mast) modules, update the rover frames they own. As the rover drives to a new area, a new site frame with an incremented site index can be created. Through the onboard centralized FM frame tree database, client modules can query the transform between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculation of coordinate entries for commands and thus simplifies flight operations significantly.
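
    A frame tree like the one FM maintains can be sketched in a few lines. This is an illustrative toy, not MSL flight code: transforms are reduced to 2-D translations, whereas real frames carry full 6-DOF poses, but the core idea is the same: accumulate each frame's transform up to the common root, then difference the two chains to obtain the transform between any pair of frames.

```python
# Toy frame tree: each frame stores a translation offset relative to
# its parent; transforms between arbitrary frames are found by
# composing offsets up to the root. (Real systems use full 6-DOF poses.)
class FrameTree:
    def __init__(self):
        self.parent = {}   # frame name -> parent frame name (None for root)
        self.offset = {}   # frame name -> (dx, dy) relative to parent

    def add(self, frame, parent=None, offset=(0.0, 0.0)):
        self.parent[frame] = parent
        self.offset[frame] = offset

    def _to_root(self, frame):
        """Cumulative offset from this frame up to the root."""
        x, y = 0.0, 0.0
        while frame is not None:
            dx, dy = self.offset[frame]
            x, y = x + dx, y + dy
            frame = self.parent[frame]
        return x, y

    def transform(self, src, dst):
        """Offset mapping coordinates in src into coordinates in dst."""
        sx, sy = self._to_root(src)
        dx, dy = self._to_root(dst)
        return sx - dx, sy - dy

tree = FrameTree()
tree.add("site")                           # root of the surface subtree
tree.add("rover", "site", (10.0, 5.0))     # rover pose in the site frame
tree.add("mast", "rover", (0.5, 0.0))      # mast mounted on the rover
print(tree.transform("mast", "site"))      # (10.5, 5.0)
```

    Because clients only ever update the frames they own, queries between any two frames stay consistent without cross-module coordination.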

  19. National information network and database system of hazardous waste management in China

    SciTech Connect

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  20. An updated active structure database of Taiwan for seismic hazard assessments

    NASA Astrophysics Data System (ADS)

    Shyu, J. B. H.; Chuang, Y. R.; Chen, Y. L.; Lee, Y.; Cheng, T. C. T.

    2014-12-01

    In order to build a complete seismogenic source model for assessing future seismic hazards in Taiwan, we have constructed an updated active structure database for the island. We reviewed existing active structure databases and obtained new information for structures that had not been thoroughly analyzed before. For example, the Central Geological Survey of Taiwan has published a comprehensive database of active faults in Taiwan, including all of the historically ruptured faults. Many other active structures, such as blind faults or folds that can be identified from geomorphic or structural analysis, have also been mapped and reported in several previous investigations. We have combined information from these existing databases to build an updated, digitized, three-dimensional active structure map for Taiwan. Furthermore, for detailed information on individual structures, such as long-term slip rates and potential recurrence intervals, we have collected data from existing publications and calculated values from the results of our own field surveys and investigations. We hope this updated database will provide significant constraints for seismic hazard assessment calculations in Taiwan and important information for engineers and hazard mitigation agencies.

  1. Integrated computer aided reservoir management (CARM) using Landmark's OpenWorks 3 database and Reservoir Management software

    SciTech Connect

    Ward, L.C.

    1995-08-01

    Multi-disciplinary asset teams in today's oil industry are facing an information revolution. To define and develop known reservoirs more accurately, to visualise reservoirs in three dimensions, and to communicate more effectively, they require access to a single common dataset and a flexible, comprehensive suite of reservoir description software that allows delineation and refinement of quantitative 3D reservoir models. Landmark's Computer Aided Reservoir Management (CARM) software provides the most complete integrated geo-information solution for data management, together with a suite of integrated Reservoir Management software covering 3D and 2D seismic interpretation, 3D geocellular modelling (Stratamodel), geological cross-section building, and deterministic and probabilistic petrophysical log analysis for 3D display. The OpenWorks 3 database provides a common framework not only for the integration of data between Landmark applications, but also with third-party applications. Thus, once the reservoir stratigraphic framework has been built in Stratamodel, it can be used as direct input for stochastic modelling in Odin's STORM and can also provide data directly to reservoir simulation applications. The key element of this integration is the OpenWorks 3 database, a production-oriented geoscience data model with over 500 tables and in excess of 2500 attributes. The OpenWorks 3 software permits seamless data transfer from one reservoir management application to another, and at every stage of reservoir management the latest updated interpretation is available to every team member. The goal of integrated reservoir management, to achieve effective exploitation of reserves, now draws on multi-disciplinary analysis by cross-functional teams, enabling the industry to maximise the return on "knowledge assets" and physical reserves.

  2. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    NASA Astrophysics Data System (ADS)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices for achieving compliance for an Oracle database in combination with an ERP system. We use an integrated approach covering both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures, as well as guidelines for the use of auditing features.

  3. Performance of online drug information databases as clinical decision support tools in infectious disease medication management.

    PubMed

    Polen, Hyla H; Zapantis, Antonia; Clauson, Kevin A; Clauson, Kevin Alan; Jebrock, Jennifer; Paris, Mark

    2008-01-01

    Infectious disease (ID) medication management is complex and clinical decision support tools (CDSTs) can provide valuable assistance. This study evaluated scope and completeness of ID drug information found in online databases by evaluating their ability to answer 147 question/answer pairs. Scope scores produced highest rankings (%) for: Micromedex (82.3), Lexi-Comp/American Hospital Formulary Service (81.0), and Medscape Drug Reference (81.0); lowest includes: Epocrates Online Premium (47.0), Johns Hopkins ABX Guide (45.6), and PEPID PDC (40.8). PMID:18999059

  4. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    SciTech Connect

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  5. Development of genome viewer (Web Omics Viewer) for managing databases of cucumber genome

    NASA Astrophysics Data System (ADS)

    Wojcieszek, M.; RóŻ, P.; Pawełkowicz, M.; Nowak, R.; Przybecki, Z.

    Cucumber is an important plant in both horticulture and science. Sequencing projects for the C. sativus genome enable new methodological approaches to further investigation of this species. Accessibility is crucial to fully exploit the information obtained about the detailed structure of genes, markers and other characteristic features such as contigs, scaffolds and chromosomes. A genome viewer is one of the tools providing a plain and easy way to present genome data to users and to administer databases. GBrowse, the main existing viewer, has several very useful features but is not simple to manage. Our group developed a new genome browser, Web Omics Viewer (WOV), which keeps this functionality while improving the utilization of and accessibility to cucumber genome data.

  6. System configuration management plan for the TWRS controlled baseline database system [TCBD

    SciTech Connect

    Spencer, S.G.

    1998-09-23

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCMP. LHMC, TWRS TCBD users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the users' requests for system requirements (new and existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO.
Once this occurs, the TCBD will be completed and

  7. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of earth-observing technology, the storage, management and publication of RS image data have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, the background server can hardly handle the heavy processing of the great volume of RS data stored at different nodes in a distributed environment, which places a heavy burden on it. Second, there is no unique, standard and rational organization of multi-sensor RS data for storage and management, so much information is lost or omitted at storage time. To address these two problems, this paper puts forward a framework for a parallel and distributed RS image data management and storage system, based on a parallel background server and a distributed data management system. Toward these goals, the paper studies the following key techniques and draws several conclusions. It proposes a solid index of "Pyramid, Block, Layer, Epoch" based on the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands and periods is achieved. For data storage, RS data is not divided into binary large objects stored in a conventional relational database system; instead, it is reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, the paper sets up a framework based on a parallel server composed of several commodity computers, under which the background processing is divided into two parts: the common web process and the parallel process.
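
    The "Pyramid, Block, Layer, Epoch" solid index described above can be pictured as a four-part key. In this sketch (field values and the tile payload are invented for illustration), each tile of multi-sensor imagery is addressed by resolution level, spatial block, spectral layer, and acquisition epoch:

```python
# Sketch of a four-part "solid index" for RS image tiles:
# (pyramid level, spatial block, spectral layer, acquisition epoch).
# Values are illustrative placeholders.
index = {}

def put_tile(pyramid, block, layer, epoch, tile):
    """Store a tile under its four-part solid-index key."""
    index[(pyramid, block, layer, epoch)] = tile

def get_tile(pyramid, block, layer, epoch):
    """Retrieve a tile, or None if no tile exists for this key."""
    return index.get((pyramid, block, layer, epoch))

put_tile(pyramid=2, block=(14, 7), layer="NIR", epoch="2009-06", tile=b"...")
print(get_tile(2, (14, 7), "NIR", "2009-06"))
```

    In a production system each key component would map to a storage partition, so that tiles for one resolution, region, band, or period can be fetched in parallel from different nodes.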

  8. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    SciTech Connect

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.; Ghosh, Dr. Joydeep

    2014-01-01

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
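
    The idea of serving relational rows as a graph can be illustrated with a toy example (table and field names are invented; this is not the paper's 3EG algorithm itself): a normalized visit table linking patients to providers becomes edges in an adjacency structure, after which queries such as "shared providers" reduce to simple set operations:

```python
# Toy relational-to-graph transformation: rows of a normalized "visit"
# table become edges between patient and provider nodes. Names and
# values are invented for illustration.
visits = [
    {"patient": "p1", "provider": "dr_a"},
    {"patient": "p1", "provider": "dr_b"},
    {"patient": "p2", "provider": "dr_a"},
]

graph = {}  # node -> set of adjacent nodes
for row in visits:
    graph.setdefault(row["patient"], set()).add(row["provider"])
    graph.setdefault(row["provider"], set()).add(row["patient"])

def shared_providers(patient_a, patient_b):
    """Providers seen by both patients -- a typical graph query."""
    return graph[patient_a] & graph[patient_b]

print(shared_providers("p1", "p2"))   # {'dr_a'}
```

    Expressed over the original normalized tables, the same query would need a self-join on the visit table; the graph form makes the relationship directly traversable.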

  9. Metadata-based generation and management of knowledgebases from molecular biological databases.

    PubMed

    Eccles, J R; Saldanha, J W

    1990-06-01

    Present-day knowledge-based systems (or expert systems) and databases constitute 'islands of computing' with little or no connection to each other. The use of software to provide a communication channel between the two, and to integrate their separate functions, is particularly attractive in certain data-rich domains where there are already pre-existing database systems containing the data required by the relevant knowledge-based system. Our evolving program, GENPRO, provides such a communication channel. The original methodology has been extended to provide interactive Prolog clause input with syntactic and semantic verification. This enables automatic generation of clauses from the source database, together with complete management of subsequent interfacing to the specified knowledge-based system. The particular data-rich domain used in this paper is protein structure, where processes which require reasoning (modelled by knowledge-based systems), such as the inference of protein topology, protein model-building and protein structure prediction, often require large amounts of raw data (i.e., facts about particular proteins) in the form of logic programming ground clauses. These are generated in the proper format by use of the concept of metadata. PMID:2397635
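
    The metadata-driven generation of ground clauses can be sketched as follows (GENPRO itself is not reproduced here; the table, columns, and rows are invented for illustration). Metadata describing a source table is used to render each database row as a Prolog fact:

```python
# Sketch of metadata-driven clause generation in the spirit of GENPRO:
# a metadata record describing a database table drives the rendering of
# its rows as Prolog ground clauses. Table and data are invented.
metadata = {"table": "helix", "columns": ["protein", "start", "end"]}
rows = [("myoglobin", 3, 18), ("myoglobin", 20, 35)]

def to_clauses(meta, rows):
    """Render each row as a Prolog fact named after the source table."""
    clauses = []
    for row in rows:
        args = ", ".join(str(value) for value in row)
        clauses.append(f"{meta['table']}({args}).")
    return clauses

for clause in to_clauses(metadata, rows):
    print(clause)   # e.g. helix(myoglobin, 3, 18).
```

    A real implementation would also validate each row against the column metadata (the "syntactic and semantic verification" the abstract mentions) before emitting the clause.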

  10. FEMA (Federal Emergency Management Agency) database requirements assessment and resource directory model. Final report 24 Aug 81-15 May 82

    SciTech Connect

    Tenopir, C.; Williams, M.E.

    1982-05-01

    Word-oriented databases (bibliographic, textual, directory, etc.) relevant to various units within the Federal Emergency Management Agency are identified and those of most potential relevance are analyzed. Subject profiles reflecting the interests of each major FEMA unit were developed and tested online on fifteen publicly available databases. The databases were then ranked by the number of citations pertinent to all aspects of emergency management and the number of pertinent citations per year of database coverage. Sample citations from the fifteen databases are included. A model Directory of Databases pertinent to emergency management was developed.

  11. Making the procedure manual come alive: A prototype relational database and dynamic website model for the management of nursing information.

    PubMed

    Peace, Jane; Brennan, Patricia Flatley

    2006-01-01

    The nursing procedural manual is an essential resource for clinical practice, yet ensuring its currency and availability at the point of care remains an unresolved information management challenge for nurses. While standard HTML-based web pages offer a significant advantage over paper compilations, employing emerging computer science tools offers even greater promise. This paper reports on the creation of a prototypical dynamic web-based nursing procedure manual driven by a relational database. We created a relational database in MySQL to manage, store, and link the procedure information, and developed PHP files to guide content retrieval, content management, and display on demand in a browser-viewable format. This database-driven dynamic website model is an important innovation to meet the challenge of content management and dissemination of nursing information.
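
    The retrieve-and-render pattern behind such a database-driven manual can be sketched in a few lines; this uses Python with SQLite rather than the authors' MySQL/PHP stack, and the one-table schema is a hypothetical simplification:

```python
import sqlite3

# Minimal sketch of a database-driven procedure manual: procedures live in
# a relational table and are rendered to a browser-viewable fragment on
# demand. The schema and content are hypothetical simplifications.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE procedure (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
conn.execute("INSERT INTO procedure (title, body) VALUES (?, ?)",
             ("Hand hygiene", "Wash hands for at least 20 seconds."))
conn.commit()

def render_procedure(conn, title):
    """Fetch one procedure by title and render it as an HTML fragment."""
    row = conn.execute("SELECT title, body FROM procedure WHERE title = ?",
                       (title,)).fetchone()
    if row is None:
        return "<p>Procedure not found.</p>"
    return f"<h1>{row[0]}</h1><p>{row[1]}</p>"

print(render_procedure(conn, "Hand hygiene"))
```

    Because pages are generated from the database at request time, updating a procedure in one place immediately updates what every reader sees, which is the currency problem the paper targets.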

  12. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  13. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    NASA Technical Reports Server (NTRS)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
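
    The selection logic (hard constraints filter the candidates, priorities rank the survivors) can be sketched as follows; the field names and limits are illustrative assumptions, not the Galileo mission's actual rules:

```python
# Toy sketch of rule-and-priority picture selection as described above:
# hard constraints eliminate infeasible candidates, then a desirability
# score ranks the rest. Field names and thresholds are hypothetical.

candidates = [
    {"id": "P1", "desirability": 0.9, "conflicts": True},   # violates a constraint
    {"id": "P2", "desirability": 0.7, "conflicts": False},
    {"id": "P3", "desirability": 0.4, "conflicts": False},
]

def select_pictures(candidates, max_pictures=2):
    """Drop candidates violating hard constraints, then take the most desirable."""
    feasible = [c for c in candidates if not c["conflicts"]]
    ranked = sorted(feasible, key=lambda c: c["desirability"], reverse=True)
    return [c["id"] for c in ranked[:max_pictures]]

print(select_pictures(candidates))  # → ['P2', 'P3']
```

    In the paper's architecture, the candidate records and their parameters come from the DBMS, while a Prolog expert system encodes the rules and priorities applied here.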

  14. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    NASA Astrophysics Data System (ADS)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing DICOM-RT structure sets in a database management system in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data-parsing routines, when additional information related to the geometry of the anatomy is desired.
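
    As an illustration of the kind of secondary geometric calculation such a paradigm delegates to the database layer, the planar area of a closed contour follows from the shoelace formula; the coordinates below are hypothetical, and a GIS extension like PostGIS provides such measurements as built-in functions:

```python
# The planar area of a closed contour via the shoelace formula -- an
# example of a secondary geometric quantity derivable from stored
# contour definitions. Coordinates are hypothetical.

def contour_area(points):
    """Unsigned area of a closed planar contour given as (x, y) vertices."""
    n = len(points)
    acc = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the contour
        acc += x1 * y2 - x2 * y1
    return abs(acc) / 2.0

# A 2 x 1 rectangular contour slice.
rect = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]
print(contour_area(rect))  # → 2.0
```

    Computing this inside the database avoids shipping the full contour record to the client just to derive a single scalar, which is the overhead the paper's paradigm eliminates.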

  15. A Toolkit for Active Object-Oriented Databases with Application to Interoperability

    NASA Technical Reports Server (NTRS)

    King, Roger

    1996-01-01

    In our original proposal we stated that our research would 'develop a novel technology that provides a foundation for collaborative information processing.' The essential ingredient of this technology is the notion of 'deltas,' which are first-class values representing collections of proposed updates to a database. The Heraclitus framework provides a variety of algebraic operators for building up, combining, inspecting, and comparing deltas. Deltas can be directly applied to the database to yield a new state, or used 'hypothetically' in queries against the state that would arise if the delta were applied. The central point here is that the step of elevating deltas to 'first-class' citizens in database programming languages will yield tremendous leverage on the problem of supporting updates in collaborative information processing. In short, our original intention was to develop the theoretical and practical foundation for a technology based on deltas in an object-oriented database context, develop a toolkit for active object-oriented databases, and apply this toward collaborative information processing.
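
    The core idea of deltas as first-class values can be sketched with dictionaries of proposed updates; the operator names below are assumptions for illustration, not the actual Heraclitus algebra:

```python
# Sketch of 'deltas' as first-class values: a delta is a mapping from
# keys to proposed new values. It can be composed with other deltas,
# applied to a state, or used in a hypothetical query without ever
# mutating the base state. Operator names are illustrative assumptions.

def apply_delta(state, delta):
    """Return the new state obtained by applying the proposed updates."""
    new_state = dict(state)
    new_state.update(delta)
    return new_state

def compose(first, second):
    """Combine two deltas; on conflicting keys the second delta wins."""
    merged = dict(first)
    merged.update(second)
    return merged

def when(state, delta, query):
    """Evaluate a query hypothetically, against the state that would arise."""
    return query(apply_delta(state, delta))

db = {"budget": 100, "owner": "alice"}
d1 = {"budget": 120}
d2 = {"owner": "bob"}

# Hypothetical query: would the budget exceed 110 if both deltas applied?
print(when(db, compose(d1, d2), lambda s: s["budget"] > 110))  # → True
print(db["budget"])  # → 100 (the base state is untouched)
```

    Because deltas are values, collaborators can exchange, inspect, and compare proposed updates before anyone commits them, which is the leverage the abstract claims for collaborative information processing.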

  17. Developing genomic knowledge bases and databases to support clinical management: current perspectives.

    PubMed

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward.

  18. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation briefly covers standard risk management procedures, then covers NASA's risk management tool, ePORT, in depth. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center, and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
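
    The standardized evaluation step common to such tools can be sketched as a likelihood-consequence matrix; the 1-5 scales and color thresholds below are generic illustrations, not NASA's actual ePORT criteria:

```python
# Generic risk-matrix scoring of the kind standardized reporting tools
# rely on: each risk gets a likelihood and a consequence on a 1-5 scale,
# and their product maps to a reporting color. Thresholds are
# illustrative assumptions, not ePORT's actual criteria.

def risk_rating(likelihood, consequence):
    """Map a 1-5 likelihood x 1-5 consequence score onto a reporting color."""
    score = likelihood * consequence
    if score >= 15:
        return "red"
    if score >= 6:
        return "yellow"
    return "green"

print(risk_rating(5, 4))  # → 'red'
print(risk_rating(2, 4))  # → 'yellow'
print(risk_rating(1, 3))  # → 'green'
```

    Fixing the scales and thresholds in one place is what makes reports comparable across projects of very different size and budget.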

  19. On the evaluation of fuzzy quantified queries in a database management system

    NASA Technical Reports Server (NTRS)

    Bosc, Patrick; Pivert, Olivier

    1992-01-01

    Many propositions to extend database management systems have been made in the last decade. Some of them aim at the support of a wider range of queries involving fuzzy predicates. Unfortunately, these queries are somewhat complex and the question of their efficiency is a subject under discussion. In this paper, we focus on a particular subset of queries, namely those using fuzzy quantified predicates. More precisely, we will consider the case where such predicates apply to individual elements as well as to sets of elements. Thanks to some interesting properties of alpha-cuts of fuzzy sets, we are able to show that the evaluation of these queries can be significantly improved with respect to a naive strategy based on exhaustive scans of sets or files.
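
    The role of alpha-cuts can be illustrated with a sigma-count evaluation of a fuzzy quantified query such as "most employees are young"; the membership functions and data below are illustrative assumptions, not the paper's formal treatment:

```python
# Evaluating a fuzzy quantified predicate ("most employees are young")
# via Zadeh's sigma-count, plus the alpha-cut observation: elements with
# zero membership cannot affect the result, so an indexed cut of the
# fuzzy set's support can replace an exhaustive scan. Membership
# functions and data are illustrative assumptions.

def mu_young(age):
    """Fuzzy predicate 'young': 1 below 25, 0 above 40, linear in between."""
    if age <= 25:
        return 1.0
    if age >= 40:
        return 0.0
    return (40 - age) / 15

def mu_most(proportion):
    """Fuzzy quantifier 'most': 0 below 0.3, 1 above 0.8, linear in between."""
    if proportion <= 0.3:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.3) / 0.5

ages = [22, 24, 28, 31, 45, 52]
sigma_count = sum(mu_young(a) for a in ages)
truth = mu_most(sigma_count / len(ages))

# Alpha-cut pre-filtering: only ages in the support of 'young' contribute,
# so the scan can be restricted to that (indexable) subset.
support = [a for a in ages if mu_young(a) > 0.0]
assert sum(mu_young(a) for a in support) == sigma_count
print(round(truth, 3))
```

    The efficiency gain in the paper comes from evaluating predicates like this over alpha-cuts retrieved via ordinary access paths instead of scanning entire sets or files.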

  20. Outcome Management in Cardiac Surgery Using the Society of Thoracic Surgeons National Database.

    PubMed

    Halpin, Linda S; Gallardo, Bret E; Speir, Alan M; Ad, Niv

    2016-09-01

    Health care reform has helped streamline patient care and reimbursement by encouraging providers to provide the best outcome for the best value. Institutions with cardiac surgery programs need a methodology to monitor and improve outcomes linked to reimbursement. The Society of Thoracic Surgeons National Database (STSND) is a tool for monitoring outcomes and improving care. This article identifies the purpose, goals, and reporting system of the STSND and ways these data can be used for benchmarking, linking outcomes to the effectiveness of treatment, and identifying factors associated with mortality and complications. We explain the methodology used at Inova Heart and Vascular Institute, Falls Church, Virginia, to perform outcome management by using the STSND and address our performance-improvement cycle through discussion of data collection, analysis, and outcome reporting. We focus on the revision of clinical practice and offer examples of how patient outcomes have been improved using this methodology. PMID:27568532

  2. An intelligent interactive visual database management system for Space Shuttle closeout image management

    NASA Technical Reports Server (NTRS)

    Ragusa, James M.; Orwig, Gary; Gilliam, Michael; Blacklock, David; Shaykhian, Ali

    1994-01-01

    Status is given of an applications investigation on the potential for using an expert system shell for classification and retrieval of high resolution, digital, color space shuttle closeout photography. This NASA funded activity has focused on the use of integrated information technologies to intelligently classify and retrieve still imagery from a large, electronically stored collection. A space shuttle processing problem is identified, a working prototype system is described, and commercial applications are identified. A conclusion reached is that the developed system has distinct advantages over the present manual system and cost efficiencies will result as the system is implemented. Further, commercial potential exists for this integrated technology.

  3. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-03-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complex-wide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and on the level and type of current on-site treatment at Department of Energy installations.

  4. [Cystic Fibrosis Cloud database: An information system for storage and management of clinical and microbiological data of cystic fibrosis patients].

    PubMed

    Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra

    2016-01-01

    The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and the microorganisms isolated from them. In this work we have designed and developed an on-line database based on an information system that allows users to store, manage and visualize data from clinical studies and microbiological analysis of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named Cystic Fibrosis Cloud database, is available on the http://servoy.infocomsa.com/cfc_database site and is composed of a main database and a web-based interface, which uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients. PMID:26895996

  6. Rainforests: Conservation and resource management. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1994-12-01

    The bibliography contains citations concerning conservation of rainforest ecology and management of natural resources. Topics include plant community structure and development, nutrient dynamics, rainfall characteristics and water budgets, and forest dynamics. Studies performed in specific forest areas are included. Effects of human activities are also considered. (Contains a minimum of 154 citations and includes a subject term index and title list.)

  7. Karlsruhe Database for Radioactive Wastes (KADABRA) - Accounting and Management System for Radioactive Waste Treatment - 12275

    SciTech Connect

    Himmerkus, Felix; Rittmeyer, Cornelia

    2012-07-01

    The data management system KADABRA was designed for the purposes of the Central Decontamination Department (HDB) of the Wiederaufarbeitungsanlage Karlsruhe Rueckbau- und Entsorgungs-GmbH (WAK GmbH), which is specialized in the treatment and conditioning of radioactive waste. The layout considers the major treatment processes of the HDB as well as regulatory and legal requirements. KADABRA is designed as an SAG ADABAS application on an IBM System z mainframe. The main function of the system is the data management of all processes related to treatment, transfer and storage of radioactive material within HDB. KADABRA records the relevant data concerning radioactive residues, interim products and waste products, as well as the production parameters relevant for final disposal. Analytical data from the laboratory and from non-destructive assay systems, which describe the chemical and radiological properties of residues, production batches, interim products and final waste products, can be linked to the respective dataset for documentation and declaration. The system enables the operator to trace the radioactive material through processing and storage. Information on the actual status of the material, as well as radiological data and storage position, can be obtained immediately on request. A variety of programs with access to the database allow the generation of individual reports on periodic or special request. KADABRA offers a high security standard and is constantly adapted to the recent requirements of the organization. (authors)

  8. Using non-local databases for the environmental assessment of industrial activities: The case of Latin America

    SciTech Connect

    Osses de Eicker, Margarita; Hischier, Roland; Hurni, Hans; Zah, Rainer

    2010-04-15

    Nine non-local databases were evaluated with respect to their suitability for the environmental assessment of industrial activities in Latin America. Three assessment methods were considered, namely Life Cycle Assessment (LCA), Environmental Impact Assessment (EIA) and air emission inventories. The analysis focused on data availability in the databases and the applicability of their international data to Latin American industry. The study showed that the European EMEP/EEA Guidebook and the U.S. EPA AP-42 database are the most suitable ones for air emission inventories, whereas the LCI database Ecoinvent is the most suitable one for LCA and EIA. Due to the data coverage in the databases, air emission inventories are easier to develop than LCA or EIA, which require more comprehensive information. One strategy to overcome the limitations of non-local databases for Latin American industry is the combination of validated data from international databases with newly developed local datasets.

  9. Data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Data Management System-1100 is designed to operate in conjunction with the UNIVAC 1100 Series Operating System on any 1100 Series computer. DMS-1100 is divided into the following four major software components: (1) Data Definition Languages (DDL); (2) Data Management Routine (DMR); (3) Data Manipulation Languages (DML); and (4) Data Base Utilities (DBU). These software components are described in detail.

  10. The web-enabled database of JRC-EC, a useful tool for managing European Gen IV materials data

    NASA Astrophysics Data System (ADS)

    Over, H. H.; Dietz, W.

    2008-06-01

    Materials and document databases are important tools to conserve knowledge and experimental materials data from European R&D projects. A web-enabled application guarantees fast access to these data. In combination with analysis tools, the experimental data are used for e.g. mechanical design, construction and lifetime predictions of complex components. The effective and efficient handling of large amounts of generic and detailed materials data, with regard to properties related to e.g. fabrication processes, joining techniques, irradiation or aging, is one of the basic elements of data management within ongoing nuclear safety and design related European research projects and networks. The paper describes the structure and functionality of Mat-DB and gives examples of how these tools can be used for the management and evaluation of materials data from European (national or multi-national) R&D activities or future reactor types, such as the EURATOM FP7 Generation IV reactor types or heavy liquid metal cooled reactors.

  11. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than the type of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC data base - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (use with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (help facility which informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, personnel register, and plan/actual cost.

  12. PARPs database: A LIMS system for protein-protein interaction data mining

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  13. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    EPA Science Inventory

    Managing the world’s largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical, basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a hierarchi...

  14. A Web-Based Multi-Database System Supporting Distributed Collaborative Management and Sharing of Microarray Experiment Information

    PubMed Central

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database Web-based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all multidisciplinary actors involved in spotted microarray experiments. PMID:17238488

  15. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    PubMed

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large scale alignment and less effective with the sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolutional rates among pairs of nucleotide positions using phylogenetic and evolutionary relationships of the organisms of aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with a Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times bigger and 50% better sensitivity than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534
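
    The covariation signal that drives base-pair prediction can be illustrated with mutual information between two alignment columns; the toy alignment is hypothetical, and the paper computes comparable statistics at far larger scale inside SQL Server:

```python
import math
from collections import Counter

# Mutual information between two alignment columns as a covariation
# signal: it is high when bases vary in a correlated way (as in
# Watson-Crick pairs) and near zero when they vary independently.
# The toy alignment is hypothetical.

def mutual_information(col_i, col_j):
    """MI (in bits) between two equal-length columns of aligned bases."""
    n = len(col_i)
    pi = Counter(col_i)                 # marginal counts for column i
    pj = Counter(col_j)                 # marginal counts for column j
    pij = Counter(zip(col_i, col_j))    # joint counts for the pair
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

# Columns that always covary as complementary pairs (G-C or A-U)...
paired_i = list("GGAAGGAA")
paired_j = list("CCUUCCUU")
# ...versus a column that varies independently of the first.
indep_j = list("CUCUCUCU")

print(round(mutual_information(paired_i, paired_j), 3))  # → 1.0
print(round(mutual_information(paired_i, indep_j), 3))   # → 0.0
```

    Expressing the counting steps as relational aggregations is what lets a database engine scale this computation across all column pairs of a large alignment.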

  17. Developing a comprehensive database management system for organization and evaluation of mammography datasets.

    PubMed

    Wu, Yirong; Rubin, Daniel L; Woods, Ryan W; Elezaby, Mai; Burnside, Elizabeth S

    2014-01-01

    We aimed to design and develop a comprehensive mammography database system (CMDB) to collect clinical datasets for outcome assessment and development of decision support tools. A Health Insurance Portability and Accountability Act (HIPAA) compliant CMDB was created to store multi-relational datasets of demographic risk factors and mammogram results using the Breast Imaging Reporting and Data System (BI-RADS) lexicon. The CMDB collected both biopsy pathology outcomes, in a breast pathology lexicon compiled by extending BI-RADS, and our institutional breast cancer registry. The audit results derived from the CMDB were in accordance with Mammography Quality Standards Act (MQSA) audits and national benchmarks. The CMDB has managed the challenges of multi-level organization demanded by the complexity of mammography practice and lexicon development in pathology. We foresee that the CMDB will be useful for efficient quality assurance audits and development of decision support tools to improve breast cancer diagnosis. Our procedure of developing the CMDB provides a framework to build a detailed data repository for breast imaging quality control and research, which has the potential to augment existing resources. PMID:25368510

  19. COMPARISON OF EXERCISE PARTICIPATION RATES FOR CHILDREN IN THE LITERATURE WITH THOSE IN EPA'S CONSOLIDATED HUMAN ACTIVITY DATABASE (CHAD)

    EPA Science Inventory

    CHAD contains over 22,000 person-days of human activity pattern survey data. Part of the database includes exercise participation rates for children 0-17 years old, as well as for adults. Analyses of this database indicate that approximately 34% of the 0-17 age group (herea...

  20. Structuring medication related activities for information management.

    PubMed

    Luukkonen, Irmeli; Mykkänen, Juha; Kivekäs, Eija; Saranto, Kaija

    2014-01-01

    Medication treatment and the related information management are central parts of a patient's health care. As a cross-organizational and cooperative process, medication information management is a complex domain for development activities. We studied medication activities and related information management in a regional project in order to produce a shared broad picture of its processes and to understand the main issues and needs for improvement. In this paper we provide a summary of the findings in a structured form, based on a six-dimensional framework for the design and analysis of activities and processes.

  1. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  2. Southern African Treatment Resistance Network (SATuRN) RegaDB HIV drug resistance and clinical management database: supporting patient management, surveillance and research in southern Africa.

    PubMed

    Manasa, Justen; Lessells, Richard; Rossouw, Theresa; Naidu, Kevindra; Van Vuuren, Cloete; Goedhals, Dominique; van Zyl, Gert; Bester, Armand; Skingsley, Andrew; Stott, Katharine; Danaviah, Siva; Chetty, Terusha; Singh, Lavanya; Moodley, Pravi; Iwuji, Collins; McGrath, Nuala; Seebregts, Christopher J; de Oliveira, Tulio

    2014-01-01

    Substantial amounts of data have been generated from patient management and academic exercises designed to better understand the human immunodeficiency virus (HIV) epidemic and design interventions to control it. A number of specialized databases have been designed to manage huge data sets from HIV cohort, vaccine, host genomic and drug resistance studies. Besides databases from cohort studies, most of the online databases contain limited curated data and are thus sequence repositories. HIV drug resistance has been shown to have a great potential to derail the progress made thus far through antiretroviral therapy. Thus, a lot of resources have been invested in generating drug resistance data for patient management and surveillance purposes. Unfortunately, most of the data currently available relate to subtype B even though >60% of the epidemic is caused by HIV-1 subtype C. A consortium of clinicians, scientists, public health experts and policy makers working in southern Africa came together and formed a network, the Southern African Treatment and Resistance Network (SATuRN), with the aim of increasing curated HIV-1 subtype C and tuberculosis drug resistance data. This article describes the HIV-1 data curation process using the SATuRN Rega database. The data curation is a manual and time-consuming process done by clinical, laboratory and data curation specialists. Access to the highly curated data sets is through applications that are reviewed by the SATuRN executive committee. Examples of research outputs from the analysis of the curated data include trends in the level of transmitted drug resistance in South Africa, analysis of the levels of acquired resistance among patients failing therapy and factors associated with the absence of genotypic evidence of drug resistance among patients failing therapy. All these studies have been important for informing first- and second-line therapy. This database is a free password-protected open source database available on

  3. Toxicity of ionic liquids: database and prediction via quantitative structure-activity relationship method.

    PubMed

    Zhao, Yongsheng; Zhao, Jihong; Huang, Ying; Zhou, Qing; Zhang, Xiangping; Zhang, Suojiang

    2014-08-15

    A comprehensive database on the toxicity of ionic liquids (ILs) is established. The database includes over 4000 pieces of data. Based on the database, the relationship between an IL's structure and its toxicity has been analyzed qualitatively. Furthermore, quantitative structure-activity relationship (QSAR) models are constructed to predict the toxicities (EC50 values) of various ILs toward the leukemia rat cell line IPC-81. Four parameters selected by the heuristic method (HM) are used to perform multiple linear regression (MLR) and support vector machine (SVM) studies. The squared correlation coefficients (R(2)) of the training sets for the two QSAR models are 0.918 and 0.959, and the root mean square errors (RMSE) are 0.258 and 0.179, respectively. For the test sets, the prediction R(2) and RMSE are 0.892 and 0.329 for the MLR model, and 0.958 and 0.234 for the SVM model, respectively. The nonlinear model developed with the SVM algorithm considerably outperformed MLR, which indicates that the SVM model is more reliable for predicting the toxicity of ILs. This study shows that increasing the relative number of O atoms in the molecules leads to a decrease in the toxicity of ILs.
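
    As a minimal sketch of the MLR half of such a QSAR workflow (the SVM half would normally use a dedicated library), the snippet below fits an ordinary least-squares model on synthetic descriptor data and reports R(2) and RMSE; the descriptors and values are invented, not the paper's dataset:

```python
import numpy as np

# Hypothetical descriptor matrix (rows: ILs, columns: 4 descriptors, as if
# chosen by a heuristic method) and toxicity values -- synthetic data only.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 4))
true_w = np.array([0.8, -0.5, 0.3, 1.1])
y = X @ true_w + 2.0 + rng.normal(scale=0.1, size=40)

# Multiple linear regression via ordinary least squares.
A = np.column_stack([X, np.ones(len(X))])  # add an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w

rmse = np.sqrt(np.mean((y - pred) ** 2))
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}, RMSE = {rmse:.3f}")
```

    In a real QSAR study the model is judged on a held-out test set, as in the abstract, rather than on the training fit shown here.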

  4. 15 years of zooming in and zooming out: Developing a new single scale national active fault database of New Zealand

    NASA Astrophysics Data System (ADS)

    Ries, William; Langridge, Robert; Villamor, Pilar; Litchfield, Nicola; Van Dissen, Russ; Townsend, Dougal; Lee, Julie; Heron, David; Lukovic, Biljana

    2014-05-01

    In New Zealand, we are currently reconciling multiple digital coverages of mapped active faults into a national coverage at a single scale (1:250,000). This seems at first glance to be a relatively simple task. However, the methods used to capture data, the scale of capture, and the initial purpose of the fault mapping have produced datasets with very different characteristics. The New Zealand digital active fault database (AFDB) was initially developed as a way of managing active fault locations and fault-related features within a computer-based spatial framework. The data contained within the AFDB come from a wide range of studies, from plate tectonic (1:500,000) to cadastral (1:2,000) scale. The database was designed to allow capture of field observations and remotely sourced data without a loss in data resolution. This approach has worked well as a method for compiling a centralised database of fault information, but not for providing a complete national coverage at a single scale. During the last 15 years other complementary projects have used and also contributed data to the AFDB, most notably the QMAP project (a national series of geological maps completed over 19 years that includes coverage of active and inactive faults at 1:250,000). AFDB linework and attributes were incorporated into this series, but simplification of linework and attributes occurred to maintain map clarity at 1:250,000 scale. Also, during this period, ongoing mapping of active faults has improved upon these data. Other projects of note that have used data from the AFDB include the National Seismic Hazard Model of New Zealand and the Global Earthquake Model (GEM). The main goal of the current project has been to provide the best digital spatial representation of each fault trace at 1:250,000 scale and combine this with the most up-to-date attributes. In some areas this has required a simplification of very fine detailed data and in some cases new mapping to provide a complete coverage

  5. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  6. A Parallel Relational Database Management System Approach to Relevance Feedback in Information Retrieval.

    ERIC Educational Resources Information Center

    Lundquist, Carol; Frieder, Ophir; Holmes, David O.; Grossman, David

    1999-01-01

    Describes a scalable, parallel, relational database-driven information retrieval engine. To support portability across a wide range of execution environments, all algorithms adhere to the SQL-92 standard. By incorporating relevance feedback algorithms, accuracy is enhanced over prior database-driven information retrieval efforts. Presents…

  7. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    PubMed

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a client/server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, querying and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. The system supports management not only of long-term follow-up of individuals, but also of grouped cases organized according to the aims of research. It can improve the efficiency and quality of clinical research when biospecimens are used in coordination. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  8. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced database management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL), which is capable of detecting and acting upon complex patterns of events that are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
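
    A toy analogue of an EPL-style rule, detecting an in-order sequence of events within a time window, might look as follows (the function names and event model are illustrative assumptions, not EPL's actual syntax):

```python
from collections import deque

def make_sequence_detector(pattern, window):
    """Detect `pattern` (a sequence of event types) occurring in order
    within `window` time units -- a toy analogue of one EPL rule."""
    recent = deque()

    def feed(event_type, t):
        recent.append((event_type, t))
        while recent and t - recent[0][1] > window:  # evict expired events
            recent.popleft()
        i = 0  # greedy in-order match over the retained events
        for etype, _ in recent:
            if etype == pattern[i]:
                i += 1
                if i == len(pattern):
                    return True
        return False

    return feed

# E.g. notify when overheating repeats three times within 10 time units.
detect = make_sequence_detector(["overheat", "overheat", "overheat"], window=10)
events = [("overheat", 0), ("ok", 3), ("overheat", 5), ("overheat", 8)]
results = [detect(e, t) for e, t in events]
print(results)  # [False, False, False, True]
```

    A real EPL rule compiles to CLIPS rules firing on DBMS trigger events, but the windowed in-order matching above is the essential pattern-detection step.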

  9. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    NASA Technical Reports Server (NTRS)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.

  10. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    PubMed Central

    Köhl, Karin I; Basler, Georg; Lüdemann, Alexander; Selbig, Joachim; Walther, Dirk

    2008-01-01

    Background For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, meta-analysis of several experiments in systems-biology-based approaches makes it necessary to store this information in a standardised manner, preferentially in relational databases. In the Golm Plant Database System, we devised a data management system based on a classical Laboratory Information Management System combined with web-based user interfaces for data entry and retrieval to collect this information in an academic environment. Results The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows such as genetic modification (transformation) and vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures, and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner-based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique names generated by the system

  11. Status Report for Remediation Decision Support Project, Task 1, Activity 1.B – Physical and Hydraulic Properties Database and Interpretation

    SciTech Connect

    Rockhold, Mark L.

    2008-09-26

    The objective of Activity 1.B of the Remediation Decision Support (RDS) Project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the objectives of Activity 1.B of the RDS Project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database maintained by PNNL, (2) transfer the physical and hydraulic property data from the Microsoft Access database files used by SoilVision® into HEIS, which has most recently been maintained by Fluor-Hanford, Inc., (3) develop a Virtual Library module for accessing these data from HEIS, and (4) write a User's Manual for the Virtual Library module. The development of the Virtual Library module was to be performed by a third party under subcontract to Fluor. The intent of these activities is to make the available physical and hydraulic property data more readily accessible and useable by technical staff and operable unit managers involved in waste site assessments and

  12. Participatory Management of Co-Curricular Activities.

    ERIC Educational Resources Information Center

    McLenighan, Harry

    This paper argues that, for both practical and philosophical reasons, high school activities ought to be managed by participatory principles. It further argues that the responsibility for bringing this about belongs to principals and activities directors through appropriate modeling and in-service education. In addition, obstacles to the…

  13. Recently active traces of the Bartlett Springs Fault, California: a digital database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2010-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Bartlett Springs Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale aerial photography. In a few places, evidence of fault creep and offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed online or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for the use and limitations of the map.

  14. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…
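
    The core idea, a privilege expressed as a query over database state rather than a static ACL entry, can be sketched with SQLite; the table and policy below are invented examples, not taken from the work itself:

```python
import sqlite3

# RDBAC sketch: instead of a static ACL entry, the privilege is itself a
# query over current database state ("managers may read their reports' rows").
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE employees (name TEXT, manager TEXT, salary INT);
    INSERT INTO employees VALUES
        ('alice', 'carol', 100), ('bob', 'carol', 90), ('dave', 'erin', 80);
""")

def visible_rows(user):
    # The policy is evaluated reflectively each time, so a change in the
    # reporting structure immediately changes who may see which rows.
    return con.execute(
        "SELECT name, salary FROM employees WHERE manager = ? ORDER BY name",
        (user,),
    ).fetchall()

print(visible_rows("carol"))  # [('alice', 100), ('bob', 90)]
print(visible_rows("erin"))   # [('dave', 80)]
```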

  15. A database about the tornadic activity in Catalonia (NE Spain) since 1994

    NASA Astrophysics Data System (ADS)

    Morales, M. E.; Arús, J.; Llasat, M. C.; Castán, S.

    2009-09-01

    Although tornadic activity is not the most important hazard in Spain, the damage that tornadoes and downbursts generate is considerable in urban areas, on some occasions causing casualties. In Spain, the oldest systematic works collecting data about tornadoes refer to the Balearic Islands, although some series of tornadoes in Spain have also been collected and analysed (Gayà, 2005). These series show a positive trend that is probably more related to a change in the perception level of the population than to climatic change. On some occasions it is difficult to separate the damage produced by the tornado itself from that produced by other associated hazards like heavy rain, hail or windstorms. This was the case in the September 2006 event, in which both flash floods and tornadoes were recorded. Similarly, damage produced by a downburst is sometimes confused with that produced by a tornado. Bearing all these facts in mind, a good systematic database of tornadoes is necessary before drawing conclusions that would otherwise not be well justified. This kind of database is not easy to build, because it requires detailed information about damage, meteorological observations and testimonies that have to be filtered by good quality control. After a general presentation of tornadoes and downbursts in the Mediterranean region, this contribution presents the database of events that have affected Catalonia during the period 1994-2009, starting with the tornado recorded at l'Espluga de Francolí on 31 August 1994. This database has been built on the basis of information from AEMET, the Consorcio de Compensación de Seguros (the Spanish insurance consortium for natural disasters), newspapers and field visits to the affected places.

  16. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New and occasional facility users will find these tools extremely useful for developing and maintaining high-quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as the database and PHP/HTML as the scripting languages (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and the agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database at any time through any web browser and on any operating system. Access can be secured using general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists the standards defined in the database and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE images can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, and allows calculation of a mineral composition based on a mineral formula, or calculation of a mineral formula based on a fixed
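
    A minimal sketch of the kind of relational layout such a lab database might use, shown here with SQLite for self-containment (table and column names are illustrative assumptions, not the actual De-MA schema):

```python
import sqlite3

# Toy schema linking standards, X-ray lines, and element setups, in the
# spirit of the tables described above (names are invented for illustration).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE standards (
        id INTEGER PRIMARY KEY, name TEXT, material_type TEXT, reference TEXT);
    CREATE TABLE xray_lines (
        id INTEGER PRIMARY KEY, element TEXT, line TEXT, energy_kev REAL);
    CREATE TABLE element_setups (
        id INTEGER PRIMARY KEY,
        standard_id INTEGER REFERENCES standards(id),
        xray_line_id INTEGER REFERENCES xray_lines(id),
        spectrometer INTEGER, crystal TEXT);
""")
con.execute("INSERT INTO standards VALUES (1, 'albite', 'silicate', 'ref-1')")
con.execute("INSERT INTO xray_lines VALUES (1, 'Na', 'Ka', 1.041)")
con.execute("INSERT INTO element_setups VALUES (1, 1, 1, 2, 'TAP')")

# An analysis setup resolves to its standard and X-ray line via joins.
row = con.execute("""
    SELECT s.name, x.element, e.crystal FROM element_setups e
    JOIN standards s ON s.id = e.standard_id
    JOIN xray_lines x ON x.id = e.xray_line_id""").fetchone()
print(row)  # ('albite', 'Na', 'TAP')
```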

  17. [Anemia management in haemodialysis. EuCliD database in Spain].

    PubMed

    Avilés, B; Coronel, F; Pérez-García, R; Marcelli, D; Orlandini, G; Ayala, J A; Rentero, R

    2002-01-01

    We present the results on anaemia management in Fresenius Medical Care Spain dialysis centres as reported by EuCliD (European Clinical Database), evaluating a population of 4,426 patients treated in Spain during the year 2001. To analyse the erythropoietin dose and the haemoglobin levels we divided the population into two groups according to time on dialysis treatment: patients treated for less than six months and patients on therapy between six months and four years. We compared our results with the evidence-based recommendations of the European Best Practice Guidelines (EBPG) and the US National Kidney Foundation (NKF-K/DOQI) guidelines. We also compared our results with those presented by the ESAM2 study of 2,618 patients on dialysis in Spain carried out in the second half of the year 2000. We observed that 70% of the population reaches a haemoglobin value higher than 11 g/dl, with a mean erythropoietin (rHu-EPO) dose of 111.9 IU/kg weight/week (n = 3,700; SD 74.9). However, for those patients on treatment for less than six months, the mean haemoglobin only reaches 10.65 g/dl (n = 222; SD 1.4). The rHu-EPO was administered subcutaneously in 70.2% of the patients. Regarding iron therapy, 86% of the patients received iron treatment, and the administration route was intravenous in 93% of the population. The ferritin levels were below 100 micrograms/dl in 10% of the patients, and 26.4% showed a transferrin saturation index (TSAT) below 20%. The erythropoietin resistance index (ERI), calculated as rHu-EPO dose/haemoglobin, has been used to evaluate the response to rHu-EPO according to different variables. It was observed that the following factors lead to higher rHu-EPO resistance: intravenous rHu-EPO as the administration route, the presence of hypoalbuminemia, increased C-reactive protein, transferrin saturation below 20%, and starting dialysis during the last six months.
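
    The erythropoietin resistance index used in the abstract is a simple ratio; a small helper, with the cohort's mean dose plugged in purely as an illustration:

```python
# Erythropoietin resistance index (ERI) as defined in the abstract:
# weekly weight-adjusted rHu-EPO dose divided by haemoglobin.
def eri(epo_dose_iu_kg_week, haemoglobin_g_dl):
    return epo_dose_iu_kg_week / haemoglobin_g_dl

# Cohort mean dose reported above (111.9 IU/kg/week) at the 11 g/dl target.
print(round(eri(111.9, 11.0), 2))  # 10.17
```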

  18. Identification of promiscuous ene-reductase activity by mining structural databases using active site constellations

    PubMed Central

    Steinkellner, Georg; Gruber, Christian C.; Pavkov-Keller, Tea; Binter, Alexandra; Steiner, Kerstin; Winkler, Christoph; Łyskowski, Andrzej; Schwamberger, Orsolya; Oberer, Monika; Schwab, Helmut; Faber, Kurt; Macheroux, Peter; Gruber, Karl

    2014-01-01

    The exploitation of catalytic promiscuity and the application of de novo design have recently opened the access to novel, non-natural enzymatic activities. Here we describe a structural bioinformatic method for predicting catalytic activities of enzymes based on three-dimensional constellations of functional groups in active sites (‘catalophores’). As a proof-of-concept we identify two enzymes with predicted promiscuous ene-reductase activity (reduction of activated C–C double bonds) and compare them with known ene-reductases, that is, members of the Old Yellow Enzyme family. Despite completely different amino acid sequences, overall structures and protein folds, high-resolution crystal structures reveal equivalent binding modes of typical Old Yellow Enzyme substrates and ligands. Biochemical and biocatalytic data show that the two enzymes indeed possess ene-reductase activity and reveal an inverted stereopreference compared with Old Yellow Enzymes for some substrates. This method could thus be a tool for the identification of viable starting points for the development and engineering of novel biocatalysts. PMID:24954722

  19. The U.S. Geological Survey mapping and cartographic database activities, 2006-2010

    USGS Publications Warehouse

    Craun, Kari J.; Donnelly, John P.; Allord, Gregory J.

    2011-01-01

    The U.S. Geological Survey (USGS) began systematic topographic mapping of the United States in the 1880s, beginning with scales of 1:250,000 and 1:125,000 in support of geological mapping. Responding to the need for higher resolution and more detail, the 1:62,500-scale, 15-minute topographic map series was begun at the beginning of the 20th century. Finally, in the 1950s the USGS adopted the 1:24,000-scale, 7.5-minute topographic map series to portray even more detail, completing coverage of the conterminous 48 states of the United States with this series in 1992. In 2001, the USGS developed the vision and concept of The National Map, a topographic database for the 21st century and the source for a new generation of topographic maps (http://nationalmap.gov/). In 2008, initial production of those maps began with a 1:24,000-scale digital product. In a separate but related project, the USGS began scanning the existing inventory of historical topographic maps at all scales to accompany the new topographic maps. The USGS also developed a digital database of The National Atlas of the United States. The digital version of the Atlas is now available on the Web and supports a mapping engine for small-scale maps of the United States and North America. These three efforts define the topographic mapping activities of the USGS during the last few years and are discussed below.

  20. BaAMPs: the database of biofilm-active antimicrobial peptides.

    PubMed

    Di Luca, Mariagrazia; Maccari, Giuseppe; Maisetta, Giuseppantonio; Batoni, Giovanna

    2015-01-01

    Antimicrobial peptides (AMPs) are increasingly being considered as novel agents against biofilms. The development of AMP-based anti-biofilm strategies strongly relies on the design of sequences optimized to target specific features of sessile bacterial/fungal communities. Although several AMP databases have been created and successfully exploited for AMP design, all of these use data collected on peptides tested against planktonic microorganisms. Here, an open-access, manually curated database of AMPs specifically assayed against microbial biofilms (BaAMPs) is presented for the first time. In collecting relevant data from the literature an effort was made to define a minimal standard set of essential information including, for each AMP, the microbial species and biofilm conditions against which it was tested, and the specific assay and peptide concentration used. The availability of these data in an organized framework will benefit anti-biofilm research and support the design of novel molecules active against biofilm. BaAMPs is accessible at http://www.baamps.it. PMID:25760404

  1. Watershed Data Management (WDM) database for Salt Creek streamflow simulation, DuPage County, Illinois, water years 2005-11

    USGS Publications Warehouse

    Bera, Maitreyee

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with the DuPage County Stormwater Management Division, maintains a USGS database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. Most of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. An earlier report describes in detail the development of the WDM database, including the processing of data from January 1, 1997, through September 30, 2004, in the SEP04.WDM database. SEP04.WDM was updated by appending data from October 1, 2004, through September 30, 2011 (water years 2005–11), and renamed SEP11.WDM. This report details the processing of meteorologic and hydrologic data in SEP11.WDM and provides a record of snow-affected periods and of the data used to fill missing-record periods for each precipitation site during water years 2005–11. The meteorologic data-filling methods are described in detail in Over and others (2010), and an update is provided in this report.
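One common way to fill a missing-record period at a precipitation site is to substitute values from a nearby backup gage, scaled by a long-term catch ratio between the two sites. The sketch below illustrates only that general idea; it is not the procedure of Over and others (2010), and the ratio and series are invented.

```python
def fill_missing(target, backup, ratio=1.0):
    """Fill None entries in an hourly precipitation series `target`
    with values from a nearby `backup` gage, scaled by `ratio`.
    Each filled value is flagged as estimated rather than observed."""
    filled, flags = [], []
    for t, b in zip(target, backup):
        if t is None:
            filled.append(round(ratio * b, 2) if b is not None else None)
            flags.append("estimated")
        else:
            filled.append(t)
            flags.append("observed")
    return filled, flags

# Hypothetical 4-hour window with two missing records at the target site.
series, flags = fill_missing([0.1, None, 0.0, None],
                             [0.1, 0.2, 0.0, None], ratio=0.95)
```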

  2. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    USGS Publications Warehouse

    Wang, Lizhu; Riseng, Catherine M.; Mason, Lacey; Werhrly, Kevin; Rutherford, Edward; McKenna, James E.; Castiglione, Chris; Johnson, Lucinda B.; Infante, Dana M.; Sowa, Scott P.; Robertson, Mike; Schaeffer, Jeff; Khoury, Mary; Gaiot, John; Hollenhurst, Tom; Brooks, Colin N.; Coscarelli, Mark

    2015-01-01

    Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a spatial classification framework and database — Great Lakes Aquatic Habitat Framework (GLAHF). GLAHF consists of catchments, coastal terrestrial, coastal margin, nearshore, and offshore zones that encompass the entire Great Lakes Basin. The catchments captured in the database as river pour points or coastline segments are attributed with data known to influence physicochemical and biological characteristics of the lakes from the catchments. The coastal terrestrial zone consists of 30-m grid cells attributed with data from the terrestrial region that has direct connection with the lakes. The coastal margin and nearshore zones consist of 30-m grid cells attributed with data describing the coastline conditions, coastal human disturbances, and moderately to highly variable physicochemical and biological characteristics. The offshore zone consists of 1.8-km grid cells attributed with data that are spatially less variable compared with the other aquatic zones. These spatial classification zones and their associated data are nested within lake sub-basins and political boundaries and allow the synthesis of information from grid cells to classification zones, within and among political boundaries, lake sub-basins, Great Lakes, or within the entire Great Lakes Basin. This spatially structured database could help the development of basin-wide management plans, prioritize locations for funding and specific management actions, track protection and restoration progress, and conduct research for science-based decision making.
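The nested design described above lets cell-level attributes be synthesized upward to classification zones and lake sub-basins. A minimal sketch of that aggregation follows; the zone names, sub-basin labels, and attribute values are invented for illustration.

```python
from collections import defaultdict

# Hypothetical 30-m grid-cell records: (zone, sub-basin, attribute value).
cells = [
    ("nearshore", "Lake Erie - West", 4.2),
    ("nearshore", "Lake Erie - West", 3.8),
    ("offshore",  "Lake Erie - West", 1.1),
    ("nearshore", "Lake Erie - Central", 2.0),
]

def summarize(cells):
    """Aggregate a cell-level attribute to (zone, sub-basin) means,
    mirroring GLAHF's synthesis from grid cells to classification zones."""
    sums = defaultdict(lambda: [0.0, 0])
    for zone, basin, value in cells:
        s = sums[(zone, basin)]
        s[0] += value
        s[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

zone_means = summarize(cells)
```

The same grouping key could be swapped for a political boundary or a whole lake, which is the point of nesting every cell inside several spatial units at once.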

  3. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: they are usually not open source, they provide interfaces only to a specific kind of database, they are platform-dependent, and they consume considerable CPU and memory. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries and is intended to provide a simple handler to different database technologies inside a local web browser. Exploiting fast-access libraries such as SQLAlchemy, the tool is easy to install and configure. Its design envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases to perform data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to the jQuery libraries, the tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users can create customized views for better data visualization. In this way, the performance of database servers is optimized by avoiding short connections and concurrent sessions. In addition, security is enforced, since users are not given the possibility to execute arbitrary SQL statements directly.
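A server-side table-to-JSON export of the kind described can be sketched as follows. The stdlib sqlite3 module stands in for the SQLAlchemy-backed connection jSPyDB actually uses (only the engine setup would differ); the table, helper name, and data are hypothetical.

```python
import json
import sqlite3

def export_json(conn, table):
    """Dump a table to JSON, in the spirit of jSPyDB's export feature.
    The table name is assumed to come from a trusted server-side list,
    not from user input."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]  # column names from the cursor
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

# Toy in-memory database playing the role of one configured back end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO runs (name) VALUES (?)",
                 [("alpha",), ("beta",)])
payload = export_json(conn, "runs")
```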

  4. A novel meta-analytic approach: mining frequent co-activation patterns in neuroimaging databases.

    PubMed

    Caspers, Julian; Zilles, Karl; Beierle, Christoph; Rottschy, Claudia; Eickhoff, Simon B

    2014-04-15

    In recent years, coordinate-based meta-analyses have become a powerful and widely used tool for studying co-activity across neuroimaging experiments, a development supported by the emergence of large-scale neuroimaging databases like BrainMap. However, the evaluation of co-activation patterns is constrained by the fact that previous coordinate-based meta-analysis techniques such as Activation Likelihood Estimation (ALE) and Multilevel Kernel Density Analysis (MKDA) reveal all brain regions that show convergent activity within a dataset without taking actual within-experiment co-occurrence patterns into account. To overcome this issue, we propose a novel meta-analytic approach named PaMiNI that combines two well-established data-mining techniques, Gaussian mixture modeling and the Apriori algorithm. PaMiNI thereby enables data-driven detection of frequent co-activation patterns within neuroimaging datasets. The feasibility of the method is demonstrated by several analyses on simulated data as well as a real application. The analyses of the simulated data show that PaMiNI identifies the brain regions underlying the simulated activation foci and perfectly separates the co-activation patterns of the experiments in the simulations. Furthermore, PaMiNI still yields good results when activation foci of distinct brain regions move closer together or are non-Gaussian distributed. For further evaluation, a real dataset on working-memory experiments is used, which was previously examined in an ALE meta-analysis and hence allows a cross-validation of both methods. In this latter analysis, PaMiNI revealed a fronto-parietal "core" network of working memory and furthermore indicated a left-lateralization in this network. Finally, to encourage widespread usage of this new method, the PaMiNI approach was implemented in a publicly available software system. PMID:24365675
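The Apriori half of this pipeline can be sketched on experiments whose activation foci have already been assigned brain-region labels (the Gaussian-mixture assignment step is omitted here); the region names, transactions, and support threshold below are invented.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: find all region sets reported together in at
    least `min_support` experiments. Each transaction is the set of
    region labels active in one experiment."""
    transactions = [frozenset(t) for t in transactions]
    items = sorted({i for t in transactions for i in t})

    def support(cand):
        return sum(1 for t in transactions if cand <= t)

    frequent = {}
    k_sets = [frozenset([i]) for i in items]  # candidate 1-item sets
    while k_sets:
        k_sets = [c for c in k_sets if support(c) >= min_support]
        for c in k_sets:
            frequent[c] = support(c)
        # Join frequent k-sets into candidate (k+1)-sets.
        k_sets = list({a | b for a, b in combinations(k_sets, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

# Four hypothetical experiments, each a set of labeled regions.
experiments = [
    {"IFG", "IPS", "preSMA"},
    {"IFG", "IPS"},
    {"IFG", "MTG"},
    {"IPS", "preSMA"},
]
patterns = apriori(experiments, min_support=2)
```

Frequent itemsets here play the role of co-activation patterns: a pair that survives the support threshold co-occurred within experiments, not merely converged across them.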

  5. Digital Database of Recently Active Traces of the Hayward Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.

    2006-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Hayward Fault Zone, California. The mapped traces represent the integration of three different types of data: (1) geomorphic expression, (2) creep (aseismic fault slip), and (3) trench exposures. This publication is a major revision of an earlier map (Lienkaemper, 1992); it brings the evidence for faulting up to date and makes the map available both as a digital database for use within a geographic information system (GIS) and for broader public access interactively using widely available viewing software. The pamphlet describes in detail the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for use of and limitations of the map. [Last revised Nov. 2008, a minor update for 2007 LiDAR and recent trench investigations; see version history below.]

  6. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  7. Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard J.; Draper, Jesse M.

    A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…

  8. Development of a grape genomics database using IBM DB2 content manager software

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A relational database was created for the North American Grapevine Genome project at the Viticultural Research Center, at Florida A&M University. The collaborative project with USDA, ARS researchers is an important resource for viticulture production of new grapevine varieties which will be adapted ...

  9. Earthquake Model of the Middle East (EMME) Project: Active Fault Database for the Middle East Region

    NASA Astrophysics Data System (ADS)

    Gülen, L.; Wp2 Team

    2010-12-01

    The Earthquake Model of the Middle East (EMME) Project is a regional project under the umbrella of the GEM (Global Earthquake Model) project (http://www.emme-gem.org/). The EMME project region includes Turkey, Georgia, Armenia, Azerbaijan, Syria, Lebanon, Jordan, Iran, Pakistan, and Afghanistan. The EMME and SHARE projects overlap, with Turkey serving as a bridge between the two. The Middle East is a tectonically and seismically very active part of the Alpine-Himalayan orogenic belt. Many major earthquakes have occurred in this region over the years, causing casualties in the millions. The EMME project will use a probabilistic seismic hazard assessment (PSHA) approach, and the existing source models will be revised or modified by incorporating newly acquired data. More importantly, the most distinguishing aspect of the EMME project from previous ones will be its dynamic character, accomplished by the design of a flexible and scalable database that permits continuous update, refinement, and analysis. A digital active fault map of the Middle East region is under construction in ArcGIS format. We are developing a database of fault parameters for active faults capable of generating earthquakes above a threshold magnitude of Mw≥5.5. Similar to the WGCEP-2007 and UCERF-2 projects, the EMME project database includes information on the geometry and rates of movement of faults in a “Fault Section Database”. The “Fault Section” concept has a physical significance: if one or more fault parameters change, a new fault section is defined along a fault zone. So far, over 3,000 fault sections have been defined and parameterized for the Middle East region. A separate “Paleo-Sites Database” includes information on the timing and amounts of fault displacement for major fault zones. A digital reference library that includes PDF files of the relevant papers and reports is also being prepared. Another task of WP-2 of the EMME project is to prepare
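The rule that a change in any fault parameter starts a new fault section can be sketched as follows; the segment records, parameter choices, and identifiers are hypothetical and do not reproduce the EMME schema.

```python
def to_sections(segments):
    """Group contiguous mapped segments of a fault zone into 'fault
    sections': a new section begins wherever the parameter tuple
    (slip rate, dip, rake here, for illustration) changes."""
    sections = []
    for seg in segments:
        params = (seg["slip_rate_mm_yr"], seg["dip"], seg["rake"])
        if sections and sections[-1]["params"] == params:
            sections[-1]["segments"].append(seg["id"])  # same section
        else:
            sections.append({"params": params, "segments": [seg["id"]]})
    return sections

# Invented fault zone: slip rate changes between the 2nd and 3rd segment.
zone = [
    {"id": "FZ-01", "slip_rate_mm_yr": 20.0, "dip": 90, "rake": 180},
    {"id": "FZ-02", "slip_rate_mm_yr": 20.0, "dip": 90, "rake": 180},
    {"id": "FZ-03", "slip_rate_mm_yr": 15.0, "dip": 90, "rake": 180},
]
sections = to_sections(zone)
```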

  10. A specialized database manager for interpretation of NMR spectra of synthetic glucides: JPD

    NASA Astrophysics Data System (ADS)

    Czaplicki, J.; Ponthus, C.

    1998-02-01

    The current communication presents a program, written specifically to create and handle a specialized database, containing NMR spectral patterns of various monosaccharidic units. The program's database format is compatible with that of the Aurelia/Amix Bruker software package. The software facilitates the search for J patterns included in the database and their comparison with an experimental spectrum, in order to identify the components of the studied system, including the contaminants.

  11. Recently Active Traces of the Berryessa Fault, California: A Digital Database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2012-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Berryessa section and parts of adjacent sections of the Green Valley Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale 2010 aerial photography and from 2007 and 2011 0.5- and 1.0-meter bare-earth LiDAR imagery (that is, high-resolution topographic data). In a few places, evidence of fault creep and of offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed online or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for use of and limitations of the map.

  12. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  13. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. 
The project is now into its eleventh month of Phase 1

  14. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  15. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  16. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  17. Chronic Fatigue Syndrome (CFS): Managing Activities and Exercise

    MedlinePlus

    Managing Activities and Exercise. On this page: Avoiding Extremes; Developing an Activity ...; recent manageable level of activity. Strength and conditioning exercises are an important component ...

  18. Third millenium ideal gas and condensed phase thermochemical database for combustion (with update from active thermochemical tables).

    SciTech Connect

    Burcat, A.; Ruscic, B.; Chemistry; Technion - Israel Inst. of Tech.

    2005-07-29

    The thermochemical database of species involved in combustion processes has been freely available for over 25 years. It was first published in print in 1984, approximately 8 years after it was first assembled, and contained 215 species at the time. This is the 7th printed edition and most likely the last in print in the present format, which involves substantial manual labor. The database currently contains more than 1,300 species, chiefly organic molecules and radicals, but also inorganic species connected to combustion and air pollution. Since 1991 the database has been freely available on the internet, at the Technion-IIT ftp server, where it is continuously expanded and corrected. The database is mirrored daily at an official mirror site and sporadically at about a dozen unofficial mirror and 'finger' sites. The present edition contains numerous corrections and many recalculations of provisional data by the G3//B3LYP method, a high-accuracy composite ab initio calculation. About 300 species are newly calculated and not yet published elsewhere. In anticipation of the full coupling, which is under development, the database has begun incorporating the available (as yet unpublished) values from Active Thermochemical Tables. The electronic version now also contains an XML file of the main database to allow transfer to other formats and to ease finding specific information of interest. The database is used by scientists, educators, engineers, and students at all levels dealing primarily with combustion and air pollution, jet engines, rocket propulsion, and fireworks, but also by researchers involved in upper-atmosphere kinetics, astrophysics, abrasion metallurgy, etc. This introductory article explains the database and the means to use it, its sources, the methods of calculation, and assessments of the accuracy of the data.
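The transfer-to-other-formats role of the XML file can be illustrated with a toy fragment; the element and attribute names below are invented for illustration and do not reproduce the actual schema of this database (the two heats of formation are standard literature values).

```python
import xml.etree.ElementTree as ET

# Hypothetical XML fragment in the spirit of a species database export.
doc = """<speciesDatabase>
  <species name="OH" formula="HO">
    <deltaHf298 units="kJ/mol">37.3</deltaHf298>
  </species>
  <species name="CH4" formula="CH4">
    <deltaHf298 units="kJ/mol">-74.6</deltaHf298>
  </species>
</speciesDatabase>"""

def heats_of_formation(xml_text):
    """Extract a species-name -> deltaHf(298 K) mapping as floats,
    the kind of targeted lookup an XML export makes easy."""
    root = ET.fromstring(xml_text)
    return {sp.get("name"): float(sp.findtext("deltaHf298"))
            for sp in root.iter("species")}

hf = heats_of_formation(doc)
```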

  19. Spatial database for the management of "urban geology" geothematic information: the case of Drama City, Greece

    NASA Astrophysics Data System (ADS)

    Pantelias, Eustathios; Zervakou, Alexandra D.; Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.

    2008-10-01

    The aggregation of population in big cities leads to the concentration of human activities and economic wealth, overconsumption of natural resources, and urban growth without planning or sustainable management. As a result, urban societies are exposed to various dangers and threats with economic, social, and ecological-environmental impacts on the urban surroundings. Problems associated with urban development are related to the geological conditions of cities and their surroundings, e.g., flooding, land subsidence, groundwater pollution, soil contamination, earthquakes, landslides, etc. For these reasons, no sustainable urban planning can be done without the support of geological information. The first systematic recording, codification, and documentation of "urban geology" geothematic information in Greece is being implemented by the Institute of Geological and Mineral Exploration (I.G.M.E.) in the framework of the project "Collection, codification and documentation of geothematic information for urban and suburban areas in Greece - pilot applications". Through this project, all geothematic information derived from geological mapping and from geotechnical, geochemical, and geophysical research and measurements in four pilot areas of Greece, Drama (North Greece), Nafplio & Sparti (Peloponnesus), and Thrakomakedones (Attica), is stored and processed in specially designed geodatabases in a GIS environment containing vector and raster data. For this GIS application an ArcGIS Personal Geodatabase is used. Data are classified in geothematic layers, grouped in geothematic datasets (e.g., Topography, Geology - Tectonics, Submarine Geology, Technical Geology, Hydrogeology, Soils, Radioactive elements), and processed to produce multifunctional geothematic maps. All compiled data constitute the essential base for land-use planning and environmental protection in the specific urban areas. With the termination of the project the produced geodatabase and other digital data

  20. Forest management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1995-04-01

    The bibliography contains citations concerning forest management practices. Planning that evaluates the sustainability of timber harvest, habitat availability, and recreation over long periods of time is covered. Topics include silviculture, tree diseases and pests, timber cutting methods, and watershed management. (Contains a minimum of 141 citations and includes a subject term index and title list.)

  1. Forest management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-05-01

    The bibliography contains citations concerning forest management practices. Planning that evaluates the sustainability of timber harvest, habitat availability, and recreation over long periods of time is covered. Topics include silviculture, tree diseases and pests, timber cutting methods, and watershed management. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  2. Sustainable forest management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1996-12-01

    The bibliography contains citations concerning developments in sustainable forestry management. Topics include international regulations, economics, strategies, land use rights, ecological impact, political developments, and evaluations of sustainable forestry resource management programs. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  3. A new database on contaminant exposure and effects in terrestrial vertebrates for natural resource managers

    USGS Publications Warehouse

    Rattner, B.A.; Pearson, J.L.; Garrett, L.J.; Erwin, R.M.; Walz, A.; Ottinger, M.A.; Barrett, H.R.

    1997-01-01

    The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior focuses on identifying and understanding the effects of contaminant stressors on biological resources under its stewardship. Despite the desire of many to continuously monitor the environmental health of our estuaries, much can be learned by summarizing existing temporal, geographic, and phylogenetic contaminant information. To this end, retrospective contaminant exposure and effects data for amphibians, reptiles, birds, and mammals residing within 30 km of Atlantic coast estuaries are being assembled through searches of published literature (e.g., Fisheries Review, Wildlife Review, BIOSIS Previews) and databases (e.g., US EPA Ecological Incident Information System; USGS Diagnostic and Epizootic Databases), and through compilation of summary data from unpublished reports of government natural resource agencies, private conservation groups, and universities. These contaminant exposure and effect data for terrestrial vertebrates (CEE-TV) are being summarized using Borland dBASE in a 96-field format, including species, collection time and site coordinates, sample matrix, contaminant concentration, biomarker and bioindicator responses, and source of information (N>1500 records). The CEE-TV database has been imported into the ARC/INFO geographic information system (GIS) for purposes of examining geographic coverage and trends and identifying critical data gaps. A preliminary risk assessment will be conducted to identify and characterize contaminants and other stressors potentially affecting terrestrial vertebrates that reside, migrate through, or reproduce in these estuaries. Evaluations are underway, using specific measurement and assessment endpoints, to rank and prioritize estuarine ecosystems in which terrestrial vertebrates are potentially at risk, for purposes of prediction and of focusing future biomonitoring efforts.

  4. ScriptWriter. A relational database to manage outpatient medical treatment.

    PubMed Central

    Tanner, T. B.

    1994-01-01

    ScriptWriter is database software designed to replicate the process of a physician writing a prescription. The software also includes standard demographic and progress note information; however the focus of the software is on automating the process of writing prescriptions. The software is especially adept at creating patient medication lists, generating medication histories and keeping track of medication expiration dates. Other strengths include its ability to organize patient assignments and assist in the generation of progress notes. The application is network capable and fully graphical. A psychiatric outpatient clinic is currently using the software. Practitioners in non-psychiatric settings can also benefit from the software. PMID:7949872

  5. COMPILATION AND MANAGEMENT OF ORP GLASS FORMULATION DATABASE, VSL-12R2470-1 REV 0

    SciTech Connect

    Kruger, Albert A.; Pasieka, Holly K.; Muller, Isabelle; Gilbo, Konstantin; Perez-Cardenas, Fernando; Joseph, Innocent; Pegg, Ian L.; Kot, Wing K.

    2012-12-13

    The present report describes the first steps in the development of a glass property-composition database for WTP LAW and HLW glasses that includes all of the data that were used in the development of the WTP baseline models and all of the data collected subsequently as part of WTP enhancement studies performed for ORP. The data were reviewed to identify some of the more significant gaps in the composition space that will need to be filled to support waste processing at Hanford. The WTP baseline models have been evaluated against the new data in terms of range of validity and prediction performance.

  6. Analysis of DOE international environmental management activities

    SciTech Connect

    Ragaini, R.C.

    1995-09-01

    The Department of Energy's (DOE) Strategic Plan (April 1994) states that DOE's long-term vision includes world leadership in environmental restoration and waste management activities. The activities of the DOE Office of Environmental Management (EM) can play a key role in DOE's goals of maintaining U.S. global competitiveness and ensuring the continuation of a world-class science and technology community. DOE's interest in attaining these goals stems partly from its participation in organizations like the Trade Policy Coordinating Committee (TPCC), with its National Environmental Export Promotion Strategy, which seeks to strengthen U.S. competitiveness and the building of public-private partnerships as part of U.S. industrial policy. The International Interactions Field Office task will build a communication network to facilitate efficient and effective communication between DOE Headquarters, Field Offices, and contractors. Under this network, Headquarters will provide the Field Offices with information on the Administration's policies and activities (such as the DOE Strategic Plan) and interagency activities, as well as relevant information from other field offices. Lawrence Livermore National Laboratory (LLNL) will, in turn, provide Headquarters with information on various international activities which, when appropriate, will be included in reports to groups like the TPCC and the EM Focus Areas. This task provides for the collection, review, and analysis of information on the more significant international environmental restoration and waste management initiatives and activities which have been used or are being considered at LLNL. Information gathering will focus on efforts and accomplishments in meeting the challenges of providing timely and cost-effective cleanup of environmentally damaged sites and facilities, especially through international technical exchanges and/or the implementation of foreign-developed technologies.

  7. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1997-05-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  8. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-02-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  9. The Neotoma Paleoecology Database

    NASA Astrophysics Data System (ADS)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and in developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community.
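The programmatic web-service access described above can be sketched as follows; the host, endpoint path, and parameter names here are illustrative assumptions, not the documented Neotoma API.

```python
# Sketch of querying a paleoecology web service for datasets.
# BASE_URL, the endpoint path, and the parameter names are invented
# for illustration; consult the actual Neotoma API docs for real calls.
from urllib.parse import urlencode

BASE_URL = "https://api.example.org/v2.0/data"  # hypothetical host

def build_dataset_query(taxon: str, age_young: int, age_old: int) -> str:
    """Build a REST query URL for datasets mentioning a taxon in an age range."""
    params = {"taxonname": taxon, "ageyoung": age_young, "ageold": age_old}
    return f"{BASE_URL}/datasets?{urlencode(params)}"

def extract_site_names(response: dict) -> list:
    """Pull site names out of a JSON response of the assumed shape."""
    return [d["site"]["sitename"] for d in response.get("data", [])]

url = build_dataset_query("Picea", 0, 15000)
```

Because the service returns live query results rather than a database dump, a client built this way always sees the most current data, which is the advantage the abstract highlights.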

  10. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Management Division, Hartford, CT; Notice of Negative Determination Regarding Application for Reconsideration... negative determination regarding workers' eligibility to apply for Trade Adjustment Assistance (TAA...). The negative determination was issued on August 19, 2011. The Department's Notice of determination...

  11. Wetlands legislation and management. (Latest citations from the Selected Water Resources Abstracts database). Published Search

    SciTech Connect

    Not Available

    1994-02-01

    The bibliography contains citations concerning federal and state legislation governing coastal and fresh water wetlands. Studies of regional regulations and management of specific sites are included. Topics such as reconciling environmental considerations with economic pressures and landowners' rights are covered. Wetlands restoration projects, conservation projects, and development plans are also presented. Many citations discuss wetlands management in relation to the Clean Water Act. (Contains 250 citations and includes a subject term index and title list.)

  12. Analysis, Repair, and Management of the Total Ozone Mapping Spectrometer Database

    NASA Technical Reports Server (NTRS)

    Sirovich, Lawrence

    1997-01-01

    In the ensuing period we were able to demonstrate that the origin of these filamentous patterns resulted from the action of the synoptic-scale vortical velocity field on the global-scale background gradient of ozone concentration in the meridional direction. Hyperbolic flow patterns between long-lived atmospheric vortices bring together air parcels from different latitudes, thus creating large gradients along the separatrices leaving the hyperbolic (stagnation) point. This result is further confirmed by the KL analysis of the ozone field in the equatorial region, where the background concentration gradient vanishes. The spectral slope in this region has been found to lie close to -1, in agreement with Batchelor's prediction. Another outcome of this result is that it at least provides indirect evidence about the kinetic energy spectrum of atmospheric turbulence in the range of scales approximately 200 to 2000 km. Namely, Batchelor's analysis is based on the assumption that the velocity field is large-scale, that is, the kinetic energy spectrum decays as O(k(sup -3)) or steeper. Since the scalar spectrum is confirmed, this also supports this form of the kinetic energy spectrum. The study of equatorial regions of TOMS data revealed the efficiency of the KL method in detecting and separating a wave-like measurement artifact inherently present in the dataset due to the imperfect correction for cross-track bias. Just two to three eigenfunctions represent the error, which makes it possible to enhance the data by reconstituting it with the subspace of artifactual eigenfunctions eliminated. This represents a highly efficient means for achieving an improved rendering of the data, and it has been implemented on the database. A wide range of techniques and algorithms has been developed for the repair and extension of the TOMS database.

  13. DBAASP v.2: an enhanced database of structure and antimicrobial/cytotoxic activity of natural and synthetic peptides

    PubMed Central

    Pirtskhalava, Malak; Gabrielian, Andrei; Cruz, Phillip; Griggs, Hannah L.; Squires, R. Burke; Hurt, Darrell E.; Grigolava, Maia; Chubinidze, Mindia; Gogoladze, George; Vishnepolsky, Boris; Alekseev, Vsevolod; Rosenthal, Alex; Tartakovsky, Michael

    2016-01-01

    Antimicrobial peptides (AMPs) are anti-infectives that may represent a novel and untapped class of biotherapeutics. Increasing interest in AMPs means that new peptides (natural and synthetic) are discovered faster than ever before. We describe herein a new version of the Database of Antimicrobial Activity and Structure of Peptides (DBAASPv.2, which is freely accessible at http://dbaasp.org). This iteration of the database reports chemical structures and empirically-determined activities (MICs, IC50, etc.) against more than 4200 specific target microbes for more than 2000 ribosomal, 80 non-ribosomal and 5700 synthetic peptides. Of these, the vast majority are monomeric, but nearly 200 of these peptides are found as homo- or heterodimers. More than 6100 of the peptides are linear, but about 515 are cyclic and more than 1300 have other intra-chain covalent bonds. More than half of the entries in the database were added after the resource was initially described, which reflects the recent sharp uptick of interest in AMPs. New features of DBAASPv.2 include: (i) user-friendly utilities and reporting functions, (ii) a ‘Ranking Search’ function to query the database by target species and return a ranked list of peptides with activity against that target and (iii) structural descriptions of the peptides derived from empirical data or calculated by molecular dynamics (MD) simulations. The three-dimensional structural data are critical components for understanding structure–activity relationships and for design of new antimicrobial drugs. We created more than 300 high-throughput MD simulations specifically for inclusion in DBAASP. The resulting structures are described in the database by novel trajectory analysis plots and movies. Another 200+ DBAASP entries have links to the Protein DataBank. All of the structures are easily visualized directly in the web browser. PMID:26578581
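The 'Ranking Search' idea described above can be sketched in miniature: given a target species, return the matching peptides ordered by measured activity (a lower MIC means a more active peptide). The peptide names, target labels, and MIC values below are invented for illustration and do not come from DBAASP.

```python
# Minimal sketch of a 'Ranking Search' over peptide activity records.
# All data here are invented; DBAASPv.2 stores far richer records
# (structure, targets, biomarker responses, MD trajectories, etc.).

records = [
    {"peptide": "pepA", "target": "E. coli",   "mic_ug_ml": 4.0},
    {"peptide": "pepB", "target": "E. coli",   "mic_ug_ml": 1.0},
    {"peptide": "pepC", "target": "S. aureus", "mic_ug_ml": 2.0},
]

def ranking_search(records, target):
    """Return peptides active against `target`, most active (lowest MIC) first."""
    hits = [r for r in records if r["target"] == target]
    return sorted(hits, key=lambda r: r["mic_ug_ml"])
```

In the real database the ranking would be computed server-side over thousands of empirically determined activity values per target species.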

  14. DBAASP v.2: an enhanced database of structure and antimicrobial/cytotoxic activity of natural and synthetic peptides.

    PubMed

    Pirtskhalava, Malak; Gabrielian, Andrei; Cruz, Phillip; Griggs, Hannah L; Squires, R Burke; Hurt, Darrell E; Grigolava, Maia; Chubinidze, Mindia; Gogoladze, George; Vishnepolsky, Boris; Alekseev, Vsevolod; Rosenthal, Alex; Tartakovsky, Michael

    2016-01-01

    Antimicrobial peptides (AMPs) are anti-infectives that may represent a novel and untapped class of biotherapeutics. Increasing interest in AMPs means that new peptides (natural and synthetic) are discovered faster than ever before. We describe herein a new version of the Database of Antimicrobial Activity and Structure of Peptides (DBAASPv.2, which is freely accessible at http://dbaasp.org). This iteration of the database reports chemical structures and empirically-determined activities (MICs, IC50, etc.) against more than 4200 specific target microbes for more than 2000 ribosomal, 80 non-ribosomal and 5700 synthetic peptides. Of these, the vast majority are monomeric, but nearly 200 of these peptides are found as homo- or heterodimers. More than 6100 of the peptides are linear, but about 515 are cyclic and more than 1300 have other intra-chain covalent bonds. More than half of the entries in the database were added after the resource was initially described, which reflects the recent sharp uptick of interest in AMPs. New features of DBAASPv.2 include: (i) user-friendly utilities and reporting functions, (ii) a 'Ranking Search' function to query the database by target species and return a ranked list of peptides with activity against that target and (iii) structural descriptions of the peptides derived from empirical data or calculated by molecular dynamics (MD) simulations. The three-dimensional structural data are critical components for understanding structure-activity relationships and for design of new antimicrobial drugs. We created more than 300 high-throughput MD simulations specifically for inclusion in DBAASP. The resulting structures are described in the database by novel trajectory analysis plots and movies. Another 200+ DBAASP entries have links to the Protein DataBank. All of the structures are easily visualized directly in the web browser. PMID:26578581

  15. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  16. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible.

  17. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness

    PubMed Central

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D.; Hockings, Marc

    2015-01-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  18. Railroad management planning. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1993-01-01

    The bibliography contains citations concerning railroad management techniques and their impact on operations. Topics include freight statistics, impacts on communities, and yard operations. Forecasts of future trends and government policies regarding railroad operations are also discussed. (Contains a minimum of 76 citations and includes a subject term index and title list.)

  19. Content-Based Management of Image Databases in the Internet Age

    ERIC Educational Resources Information Center

    Kleban, James Theodore

    2010-01-01

    The Internet Age has seen the emergence of richly annotated image data collections numbering in the billions of items. This work makes contributions in three primary areas which aid the management of this data: image representation, efficient retrieval, and annotation based on content and metadata. The contributions are as follows. First,…

  20. Inland wetlands legislation and management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-03-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  1. Inland wetlands legislation and management. (Latest citations from the NTIS Bibliographic database). Published Search

    SciTech Connect

    Not Available

    1993-11-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 250 citations and includes a subject term index and title list.)

  2. Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database

    SciTech Connect

    Wang, Jy-An John

    2010-08-01

    The behavior of materials under neutron irradiation in fission and/or fusion environments cannot be well understood without practical examination. An easily accessible material information system, backed by a large material database on capable computers, is necessary for the design of nuclear materials and for analyses or simulations of these phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data. The EDB contains power reactor pressure vessel surveillance data, material test reactor data, foreign reactor data (through bilateral agreements authorized by the NRC), and fracture toughness data. This report describes the lessons learned from building the EDB program and the associated database management activity regarding material database design methodology, architecture, and the embedded QA protocol. It also covers the development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM), compares the EDB and IAEA IDRPVM databases, and states the recommended database QA protocol and database infrastructure.

  3. Unterstützung der IT-Service-Management-Prozesse an der Technischen Universität München durch eine Configuration-Management-Database

    NASA Astrophysics Data System (ADS)

    Knittl, Silvia

    University processes in teaching and administration, owing to increasing integration and IT support, require so-called business alignment of IT and thus a more professional IT service management (ITSM). The IT Infrastructure Library (ITIL), with its description of processes proven in practice, has established itself as the de facto standard in ITSM. One such process is configuration management. It maps the IT infrastructure as configuration items and their relationships in a tool called a Configuration Management Database (CMDB) and thereby supports ITSM. This report describes the experiences gained from the prototypical introduction of a CMDB at the Technische Universität München.
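The core CMDB idea, configuration items (CIs) plus typed relationships between them, can be sketched as a simple in-memory graph. The class, field, and CI names below are illustrative assumptions, not taken from ITIL or any specific CMDB product.

```python
# Minimal sketch of a CMDB: configuration items and their relationships
# stored as an in-memory graph. Real CMDBs persist this in a database
# and support many relationship types, history, and federation.

class CMDB:
    def __init__(self):
        self.items = {}          # ci_id -> attribute dict
        self.relations = []      # (source_ci, relation_type, target_ci)

    def add_item(self, ci_id, **attrs):
        self.items[ci_id] = attrs

    def relate(self, source, relation, target):
        self.relations.append((source, relation, target))

    def depends_on(self, ci_id):
        """Return CIs the given CI directly depends on."""
        return [t for s, r, t in self.relations
                if s == ci_id and r == "depends_on"]

cmdb = CMDB()
cmdb.add_item("webmail", type="service")
cmdb.add_item("srv01", type="server", os="Linux")
cmdb.relate("webmail", "depends_on", "srv01")
```

A query like `depends_on("webmail")` is what lets ITSM processes such as incident and change management assess which services are affected when an infrastructure component fails or is modified.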

  4. Avibase – a database system for managing and organizing taxonomic concepts

    PubMed Central

    Lepage, Denis; Vaidya, Gaurav; Guralnick, Robert

    2014-01-01

    Abstract Scientific names of biological entities offer an imperfect resolution of the concepts that they are intended to represent. Often they are labels applied to entities ranging from entire populations to individual specimens representing those populations, even though such names only unambiguously identify the type specimen to which they were originally attached. Thus the real-life referents of names are constantly changing as biological circumscriptions are redefined and thereby alter the sets of individuals bearing those names. This problem is compounded by other characteristics of names that make them ambiguous identifiers of biological concepts, including emendations, homonymy and synonymy. Taxonomic concepts have been proposed as a way to address issues related to scientific names, but they have yet to receive broad recognition or implementation. Some efforts have been made towards building systems that address these issues by cataloguing and organizing taxonomic concepts, but most are still in conceptual or proof-of-concept stage. We present the on-line database Avibase as one possible approach to organizing taxonomic concepts. Avibase has been successfully used to describe and organize 844,000 species-level and 705,000 subspecies-level taxonomic concepts across every major bird taxonomic checklist of the last 125 years. The use of taxonomic concepts in place of scientific names, coupled with efficient resolution services, is a major step toward addressing some of the main deficiencies in the current practices of scientific name dissemination and use. PMID:25061375
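The separation of scientific names from taxonomic concepts, the core idea behind Avibase, can be sketched as a many-to-many mapping: one name may denote different circumscriptions in different checklists. The concept identifiers, species names, and checklist labels below are invented for illustration and are not real Avibase records.

```python
# Sketch of name-vs-concept resolution. A taxonomic concept pairs a
# scientific name with the checklist (circumscription) that defines it;
# the same name can therefore map to several distinct concepts.
# All identifiers and checklist names here are hypothetical.

concepts = [
    # (concept_id, scientific_name, checklist)
    ("avb:001", "Larus argentatus", "Checklist A 1998"),  # broad circumscription
    ("avb:002", "Larus argentatus", "Checklist B 2020"),  # narrower, after a split
    ("avb:003", "Larus smithsonianus", "Checklist B 2020"),
]

def concepts_for_name(name):
    """One scientific name can resolve to several concepts across checklists."""
    return [cid for cid, n, _ in concepts if n == name]
```

Resolving a name to its set of concepts, rather than treating the name itself as the identifier, is what lets a system track that a record labelled "Larus argentatus" under an older checklist may correspond to a different set of populations than the same name under a newer one.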

  5. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    SciTech Connect

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to distribution of that total by agency and maybe distribution of the total by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock has changed for the Department of Energy from 2000 through 2008.

  6. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  7. Management of Reclaimed Produced Water in California Enhanced with the Expanded U.S. Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Kharaka, Y. K.; Reidy, M. E.; Conaway, C. H.; Thordsen, J. J.; Rowan, E. L.; Engle, M.

    2015-12-01

    In California in 2014, every barrel of oil produced was accompanied by 16 barrels of water; approximately 3.2 billion barrels of water were co-produced with California oil that year. About half of California's produced water is used for steam and water injection in enhanced oil recovery. The other half (~215,000 acre-feet of water) is available for potential reuse. Concerns about the severe drought, groundwater depletion, and contamination have prompted petroleum operators and water districts to examine the recycling of produced water. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of reuse: water with low salinity can be reclaimed for use outside the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). Since a large proportion of California petroleum wells produce water with relatively low salinity (generally 10,000-40,000 mg/L TDS), reclaiming produced water could be an important drought mitigation strategy, especially in the parched southern San Joaquin Valley with its many oil fields. The USGS Produced Waters Geochemical Database, available at http://eerscmap.usgs.gov/pwapp, will facilitate studies on the management of produced water for reclamation in California. Expanding on the 2002 USGS database, we have located California wells more accurately and added new data for 300 wells in the Sacramento Valley, San Joaquin Valley, and Los Angeles Basin, for a total of ~1100 wells in California. In addition to the existing (2002) geochemical analyses of major ions and total dissolved solids, the new data include analyses of minor ions and stable isotopes. We have added an interactive web map application which allows the user to filter data on chosen fields (e.g. TDS < 35,000 mg/L). Using the web map application, as well as more in-depth investigation of the full data set, can provide critical insight for better management of produced waters in water
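As a rough sketch of the kind of salinity filter the web map application exposes (e.g. TDS < 35,000 mg/L), the selection can be expressed in a few lines. The record fields and well identifiers below are hypothetical, not the database's actual schema:

```python
# Hypothetical well records; "tds_mg_l" stands in for the database's
# total-dissolved-solids field.
wells = [
    {"api": "04-029-0001", "basin": "San Joaquin", "tds_mg_l": 18000},
    {"api": "04-030-0002", "basin": "Los Angeles", "tds_mg_l": 52000},
    {"api": "04-107-0003", "basin": "Sacramento", "tds_mg_l": 31000},
]

def reclaimable(wells, tds_limit=35000):
    """Return wells whose produced water falls under the TDS limit."""
    return [w for w in wells if w["tds_mg_l"] < tds_limit]

candidates = reclaimable(wells)
print([w["api"] for w in candidates])  # the two lower-salinity wells
```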

  8. The GTN-P Data Management System: A central database for permafrost monitoring parameters of the Global Terrestrial Network for Permafrost (GTN-P) and beyond

    NASA Astrophysics Data System (ADS)

    Lanckman, Jean-Pierre; Elger, Kirsten; Karlsson, Ævar Karl; Johannsson, Halldór; Lantuit, Hugues

    2013-04-01

    Permafrost is a direct indicator of climate change and has been identified as an Essential Climate Variable (ECV) by the global observing community. The monitoring of permafrost temperatures, active-layer thicknesses and other parameters has been performed for several decades, but it was brought together within the Global Terrestrial Network for Permafrost (GTN-P) only in the 1990s, including the development of measurement protocols to provide standardized data. GTN-P is the primary international observing network for permafrost, sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS), and managed by the International Permafrost Association (IPA). All GTN-P data are subject to an "open data policy" with free data access via the World Wide Web. The existing data, however, are far from homogeneous: they are not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data have not been used by as many researchers as the initiators of the programs intended. While the monitoring of many other ECVs has been tackled by organized international networks (e.g. FLUXNET), there is still no central database for all permafrost-related parameters. The European Union project PAGE21 created opportunities to develop this central database for permafrost monitoring parameters of GTN-P during the project and beyond. The database aims to be the one location where a researcher can find data, metadata, and information on all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with the GTN-P and the IPA.
The general framework of the GTN-P DMS is based on an object oriented model (OOM), open for as many parameters as possible, and

  9. Knowledge Management in Cardiac Surgery: The Second Tehran Heart Center Adult Cardiac Surgery Database Report

    PubMed Central

    Abbasi, Kyomars; Karimi, Abbasali; Abbasi, Seyed Hesameddin; Ahmadi, Seyed Hossein; Davoodi, Saeed; Babamahmoodi, Abdolreza; Movahedi, Namdar; Salehiomran, Abbas; Shirzad, Mahmood; Bina, Peyvand

    2012-01-01

    Background: The Adult Cardiac Surgery Databank (ACSD) of Tehran Heart Center was established in 2002 with a view to providing clinical prediction rules for outcomes of cardiac procedures, developing risk score systems, and devising clinical guidelines. This is a general analysis of the collected data. Methods: All the patients referred to Tehran Heart Center for any kind of heart surgery between 2002 and 2008 were included, and their demographic, medical, clinical, operative, and postoperative data were gathered. This report presents general information as well as in-hospital mortality rates regarding all the cardiac procedures performed in the above time period. Results: There were 24959 procedures performed: 19663 (78.8%) isolated coronary artery bypass grafting surgeries (CABGs); 1492 (6.0%) isolated valve surgeries; 1437 (5.8%) CABGs concomitant with other procedures; 832 (3.3%) CABGs combined with valve surgeries; 722 (2.9%) valve surgeries concomitant with other procedures; 545 (2.2%) surgeries other than CABG or valve surgery; and 267 (1.1%) CABGs concomitant with valve and other types of surgery. The overall mortality was 205 (1.04%), with the lowest mortality rate (0.47%) in the isolated CABGs and the highest (4.49%) in the CABGs concomitant with valve surgeries and other types of surgery. Meanwhile, the overall mortality rate was higher in the female patients than in the males (1.90% vs. 0.74%, respectively). Conclusion: Isolated CABG was the most prevalent procedure at our center with the lowest mortality rate. However, the overall mortality was more prevalent in our female patients. This database can serve as a platform for the participation of the other countries in the region in the creation of a regional ACSD. PMID:23304179

  10. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    ADMS+/- is an advanced database management system whose architecture integrates the ADMS+ mainframe database system with a large number of workstation database systems, designated ADMS-; no communications exist between these workstations. The use of this system radically decreases the response time of locally processed queries, since the workstation runs in single-user mode and no dynamic security checking is required for the downloaded portion of the database. The deferred update strategy used reduces overhead due to update synchronization in message traffic.

  11. Databases save time and improve the quality of the design, management and processing of ecopathological surveys.

    PubMed

    Sulpice, P; Bugnard, F; Calavas, D

    1994-01-01

    The example of an ecopathological survey on nursing-ewe mastitis shows that databases have four complementary functions: assistance in designing surveys; survey follow-up; data management and quality control; and data organization for statistical analysis. This is made possible by designing the database and the survey simultaneously, and by integrating computer science into the work of the task group that conducts the survey. This methodology saves time and improves data quality in ecopathological surveys.

  12. Development of a database for prompt gamma-ray neutron activation analysis: Summary report of the third research coordination meeting

    SciTech Connect

    Lindstrom, Richard M.; Firestone, Richard B.; Pavi, ???

    2003-04-01

    The main discussions and conclusions from the Third Co-ordination Meeting on the Development of a Database for Prompt Gamma-ray Neutron Activation Analysis are summarized in this report. All results were reviewed in detail, and the final version of the TECDOC and the corresponding software were agreed upon and approved for preparation. Actions were formulated with the aim of completing the final version of the TECDOC and associated software by May 2003.

  13. Managing attribute--value clinical trials data using the ACT/DB client-server database system.

    PubMed

    Nadkarni, P M; Brandt, C; Frawley, S; Sayward, F G; Einbinder, R; Zelterman, D; Schacter, L; Miller, P L

    1998-01-01

    ACT/DB is a client-server database application for storing clinical trials and outcomes data, which is currently undergoing initial pilot use. It stores most of its data in entity-attribute-value form. Such data are segregated according to data type to allow indexing by value when possible, and binary large object data are managed in the same way as other data. ACT/DB lets an investigator design a study rapidly by defining the parameters (or attributes) that are to be gathered, as well as their logical grouping for purposes of display and data entry. ACT/DB generates customizable data entry. The data can be viewed through several standard reports as well as exported as text to external analysis programs. ACT/DB is designed to encourage reuse of parameters across multiple studies and has facilities for dictionary search and maintenance. It uses a Microsoft Access client running on Windows 95 machines, which communicates with an Oracle server running on a UNIX platform. ACT/DB is being used to manage the data for seven studies in its initial deployment.
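A minimal sketch of the entity-attribute-value layout the abstract describes, with values segregated by data type so each table can be indexed by value. The tables and column names are illustrative, not ACT/DB's actual Access/Oracle schema:

```python
import sqlite3

# One EAV table per data type, each indexed on (attribute, value) so
# queries by value can use the index.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE eav_numeric (entity INTEGER, attribute TEXT, value REAL);
CREATE TABLE eav_text    (entity INTEGER, attribute TEXT, value TEXT);
CREATE INDEX idx_num ON eav_numeric (attribute, value);
CREATE INDEX idx_txt ON eav_text    (attribute, value);
""")

def store(entity, attribute, value):
    """Route a fact to the table matching its data type."""
    table = "eav_numeric" if isinstance(value, (int, float)) else "eav_text"
    db.execute(f"INSERT INTO {table} VALUES (?, ?, ?)",
               (entity, attribute, value))

store(1, "systolic_bp", 128.0)
store(1, "drug_name", "aspirin")
store(2, "systolic_bp", 141.0)

# Indexed query by value: patients with systolic BP over 130
rows = db.execute(
    "SELECT entity FROM eav_numeric WHERE attribute = ? AND value > ?",
    ("systolic_bp", 130),
).fetchall()
print(rows)  # [(2,)]
```

The type segregation is what makes "indexing by value when possible" work: a single mixed-type value column could not be usefully indexed for range queries.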

  14. Data storage management in a distributed database with deterministic limited communications windows between data storage nodes

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-05-01

    An orbital service model allows data to be collected, stored and used on different nodes comprising an ad-hoc system where provider craft supply services to consumer craft. Ad-hoc networks and provider-consumer relationships are commonly used in various applications on Earth. The deterministic movement of spacecraft, however, allows the ad-hoc network and service providing model to operate in a different way than would be typical in most terrestrial ad-hoc networks. While long periods of no direct node-to-node connectivity may exist, the periods of connectivity are pre-known based on orbital parameters. Additionally, paths for indirect connectivity can be identified and evaluated for cost effectiveness. This paper presents a data management approach for an orbital computing ad-hoc system. Algorithms for determining where data should be stored (identification of most useful point of storage, whether multiple copies are justified) and how movement should be affected (transfer scheduling, replication, etc.) are presented and evaluated.
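The placement idea, choosing the storage node whose pre-known contact windows deliver the data to its consumer at the lowest cost, can be sketched as follows. The schedule format and the cost metric (earliest full-delivery time) are simplifying assumptions for illustration, not the paper's actual algorithms:

```python
# Contact windows are deterministic (known from orbital parameters):
# each is (start_time_s, duration_s, rate_mb_per_s).

def transfer_cost(contacts, data_mb):
    """Earliest time the data could be fully delivered over known windows."""
    remaining = data_mb
    for start, duration, rate_mb_s in sorted(contacts):
        usable = min(duration, remaining / rate_mb_s)
        remaining -= usable * rate_mb_s
        if remaining <= 0:
            return start + usable
    return float("inf")  # undeliverable within the known windows

# Hypothetical contact schedules from each candidate storage node
# to the consumer craft.
schedules = {
    "node_a": [(100, 30, 2.0), (500, 60, 2.0)],
    "node_b": [(200, 120, 1.0)],
}

best = min(schedules, key=lambda n: transfer_cost(schedules[n], data_mb=100))
print(best)  # node_b delivers 100 MB at t=300, node_a only at t=520
```

Because the windows are deterministic, this evaluation can be done ahead of time, which is exactly what distinguishes the orbital case from a typical terrestrial ad-hoc network.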

  15. Curating and Preserving the Big Canopy Database System: an Active Curation Approach using SEAD

    NASA Astrophysics Data System (ADS)

    Myers, J.; Cushing, J. B.; Lynn, P.; Weiner, N.; Ovchinnikova, A.; Nadkarni, N.; McIntosh, A.

    2015-12-01

    Modern research is increasingly dependent upon highly heterogeneous data and on the associated cyberinfrastructure developed to organize, analyze, and visualize that data. However, due to the complexity and custom nature of such combined data-software systems, it can be very challenging to curate and preserve them for the long term at reasonable cost and in a way that retains their scientific value. In this presentation, we describe how this challenge was met in preserving the Big Canopy Database (CanopyDB) system using an agile approach and leveraging the Sustainable Environment - Actionable Data (SEAD) DataNet project's hosted data services. The CanopyDB system was developed over more than a decade at Evergreen State College to address the needs of forest canopy researchers. It is an early yet sophisticated exemplar of the type of system that has become common in biological research and science in general, including multiple relational databases for different experiments, a custom database generation tool used to create them, an image repository, and desktop and web tools to access, analyze, and visualize this data. SEAD provides secure project spaces with a semantic content abstraction (typed content with arbitrary RDF metadata statements and relationships to other content), combined with a standards-based curation and publication pipeline resulting in packaged research objects with Digital Object Identifiers. Using SEAD, our cross-project team was able to incrementally ingest CanopyDB components (images, datasets, software source code, documentation, executables, and virtualized services) and to iteratively define and extend the metadata and relationships needed to document them. We believe that both the process, and the richness of the resultant standards-based (OAI-ORE) preservation object, hold lessons for the development of best-practice solutions for preserving scientific data in association with the tools and services needed to derive value from it.

  16. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subject to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have since 1988 developed a database built by the field teams, which specifically records all floods (place, date, impacts, damage, etc.). First, this historical database is compared with two other databases, from the emergency services and the local newspaper, by georeferencing the events in a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two is not negligible and usefully complements knowledge of the impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared with the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration and depth of precipitation). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, floods recorded in the database are compared spatially, using a GIS, with flooding simulated from the sewer network model (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool.
These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand the flood and learn to cope with the flooding risk.

  17. Transfer of Physical and Hydraulic Properties Databases to the Hanford Environmental Information System - PNNL Remediation Decision Support Project, Task 1, Activity 6

    SciTech Connect

    Rockhold, Mark L.; Middleton, Lisa A.

    2009-03-31

    This report documents the requirements for transferring physical and hydraulic property data compiled by PNNL into the Hanford Environmental Information System (HEIS). The Remediation Decision Support (RDS) Project is managed by Pacific Northwest National Laboratory (PNNL) to support Hanford Site waste management and remedial action decisions by the U.S. Department of Energy and one of their current site contractors - CH2M-Hill Plateau Remediation Company (CHPRC). The objective of Task 1, Activity 6 of the RDS project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. These physical and hydraulic property data are used to estimate parameters for analytical and numerical flow and transport models that are used for site risk assessments and evaluation of remedial action alternatives. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the original objectives of this activity on the RDS project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database

  18. VIEWCACHE: An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Sellis, Timoleon

    1991-01-01

    The objective is to illustrate the concept of incremental access to distributed databases. An experimental database management system, ADMS, developed at the University of Maryland, College Park, uses VIEWCACHE, a database access method based on incremental search. VIEWCACHE is a pointer-based access method that provides a uniform interface for accessing distributed databases and catalogues. The compactness of the pointer structures formed during database browsing and the incremental access method allow the user to search and perform inter-database cross-referencing with no actual data movement between database sites. Once the search is complete, the set of collected pointers to the desired data is dereferenced.
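The pointer-based idea can be sketched in a few lines: browsing accumulates compact (site, key) pointers, cross-referencing joins pointers across sites, and only the final result set is dereferenced. The data and names here are illustrative, not VIEWCACHE's actual structures:

```python
# Two "sites", each holding its own records locally.
site_a = {101: {"star": "Vega", "mag": 0.03},
          102: {"star": "Sirius", "mag": -1.46}}
site_b = {7: {"star": "Vega", "spectral": "A0V"}}

def browse(site_name, site, predicate):
    """Return (site, key) pointers for matching records; no data moves."""
    return [(site_name, key) for key, rec in site.items() if predicate(rec)]

# Cross-reference bright stars at site A against site B by name,
# still only accumulating pointer pairs.
pointers = browse("a", site_a, lambda r: r["mag"] < 0.5)
joined = [(p, ("b", k)) for p in pointers
          for k, rec in site_b.items()
          if site_a[p[1]]["star"] == rec["star"]]

# Dereference once, at the very end of the search.
sites = {"a": site_a, "b": site_b}
result = [{**sites[s1][k1], **sites[s2][k2]}
          for (s1, k1), (s2, k2) in joined]
print(result)  # [{'star': 'Vega', 'mag': 0.03, 'spectral': 'A0V'}]
```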

  19. Database of Industrial Technological Information in Kanagawa : Networks for Technology Activities

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Shindo, Tadashi

    This system is a database that requires participation by its members, on the premise that all data in it are open. Aiming at free technological cooperation and exchange among industries, it was constructed by Kanagawa Prefecture in collaboration with enterprises located in the prefecture. The input data comprise 36 items that technologically characterize each enterprise, such as major products, special and advantageous technologies, technologies for which cooperation is sought, and facilities and equipment. They are written in natural language, including Kanji, in up to 2,000 characters, except for some coded items. Twenty-four search items can be accessed by natural language, so that extensive searching is possible in addition to interactive search procedures, including menu-based ones. The information service started in October 1986, covering data from 2,000 enterprises.

  20. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval across either the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on the requirements of repeat-track altimetry missions such as Topex and Jason; it was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
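The dual-domain retrieval the description mentions, pulling the same measurements either by time span or by spatial box, can be sketched as follows. The record layout is hypothetical, not the Stackfile format:

```python
# Hypothetical altimetry measurements: time (s), position, sea-surface height.
measurements = [
    {"t": 1000.0, "lat": 10.2, "lon": 245.1, "ssh_m": 0.31},
    {"t": 1001.0, "lat": 12.8, "lon": 246.0, "ssh_m": 0.28},
    {"t": 2000.0, "lat": 10.3, "lon": 245.2, "ssh_m": 0.35},
]

def by_time(data, t0, t1):
    """Retrieve across the time domain: measurements with t0 <= t < t1."""
    return [m for m in data if t0 <= m["t"] < t1]

def by_box(data, lat0, lat1, lon0, lon1):
    """Retrieve across the spatial domain: measurements inside the box."""
    return [m for m in data
            if lat0 <= m["lat"] <= lat1 and lon0 <= m["lon"] <= lon1]

print(len(by_time(measurements, 0, 1500)))          # 2 early measurements
print(len(by_box(measurements, 10, 11, 245, 246)))  # 2 near-repeat points
```

For a repeat-track mission, the spatial query naturally groups near-repeat points from different passes, which is what makes along-track "stacking" possible.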

  1. Active management of food allergy: an emerging concept.

    PubMed

    Anagnostou, Katherine; Stiefel, Gary; Brough, Helen; du Toit, George; Lack, Gideon; Fox, Adam T

    2015-04-01

    IgE-mediated food allergies are common and currently there is no cure. Traditionally, management has relied upon patient education, food avoidance and the provision of an emergency medication plan. Despite this, food allergy can significantly impact on quality of life. Therefore, in recent years, evolving research has explored alternative management strategies. A more active approach to management is being adopted, which includes early introduction of potentially allergenic foods, anticipatory testing, active monitoring, desensitisation to food allergens and active risk management. This review will discuss these areas in turn.

  2. A New Activity-Based Financial Cost Management Method

    NASA Astrophysics Data System (ADS)

    Qingge, Zhang

    The standard activity-based financial cost management model is a new model of financial cost management that builds on the standard cost system and activity-based costing, integrating the advantages of the two. By taking R&D expenses as the accounting starting point and after-sale service expenses as the terminal point, it covers the whole producing and operating process, the whole activity chain, and the value chain, providing more accurate and more adequate cost information to serve internal management and decision-making.

  3. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    1996-01-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  4. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1993-07-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 250 citations and includes a subject term index and title list.)

  5. Activated sludge process: Waste treatment. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1993-10-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 250 citations and includes a subject term index and title list.)

  6. Activated-sludge process: Waste treatment. (Latest citations from the biobusiness database). Published Search

    SciTech Connect

    Not Available

    1992-07-01

    The bibliography contains citations concerning the use of the activated sludge process in waste and wastewater treatment. Topics include biochemistry of the activated sludge process, effects of various pollutants on process activity, effects of environmental variables such as oxygen and water levels, and nutrient requirements of microorganisms employed in activated sludge processes. The citations also explore use of the process to treat specific wastes, such as halocarbons, metallic wastes, and petrochemical effluents; and wastes from pharmaceutical and dairy processes. (Contains 250 citations and includes a subject term index and title list.)

  7. US - Former Soviet Union environmental management activities

    SciTech Connect

    1995-09-01

    The Office of Environmental Management (EM) has been delegated the responsibility for US DOE's cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem require the identification of technologies and scientific expertise from domestic and foreign sources. This booklet makes comparisons and describes coordinated projects and workshops between the USA and the former Soviet Union.

  8. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  9. Databases for multilevel biophysiology research available at Physiome.jp

    PubMed Central

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications. PMID:26441671

  10. Polish activity within Orphanet Europe--state of art of database and services.

    PubMed

    Jezela-Stanek, Aleksandra; Karczmarewicz, Dorota; Chrzanowska, Krystyna H; Krajewska-Walasek, Małgorzata

    2015-01-01

    Orphanet is an international project aiming to help improve the diagnostic process, care, and treatment of patients with rare diseases, and to provide information on developments in research and new therapies. Orphanet is currently represented in 38 countries. The infrastructure and coordination activities are jointly funded by Inserm, the French Directorate General for Health, and the European Commission. Moreover, certain services are specially funded by other partners. Orphanet's activities in each country of the network are partially financed by national institutions and/or specific contracts. In this paper we present the Orphanet portal as well as the Polish national activity within this project. PMID:26982769

  11. Age-related patterns of vigorous-intensity physical activity in youth: The International Children's Accelerometry Database.

    PubMed

    Corder, Kirsten; Sharp, Stephen J; Atkin, Andrew J; Andersen, Lars B; Cardon, Greet; Page, Angie; Davey, Rachel; Grøntved, Anders; Hallal, Pedro C; Janz, Kathleen F; Kordas, Katarzyna; Kriemler, Susi; Puder, Jardena J; Sardinha, Luis B; Ekelund, Ulf; van Sluijs, Esther M F

    2016-12-01

    Physical activity declines during youth but most evidence reports on combined moderate and vigorous-intensity physical activity. We investigated how vigorous-intensity activity varies with age. Cross-sectional data from 24,025 participants (5.0-18.0 y; from 20 studies in 10 countries obtained 2008-2010) providing ≥ 1 day accelerometer data (International Children's Accelerometry Database (ICAD)). Linear regression was used to investigate age-related patterns in vigorous-intensity activity; models included age (exposure), adjustments for monitor wear-time and study. Moderate-intensity activity was examined for comparison. Interactions were used to investigate whether the age/vigorous-activity association differed by sex, weight status, ethnicity, maternal education and region. A 6.9% (95% CI 6.2, 7.5) relative reduction in mean vigorous-intensity activity with every year of age was observed; for moderate activity the relative reduction was 6.0% (5.6%, 6.4%). The age-related decrease in vigorous-intensity activity remained after adjustment for moderate activity. A larger age-related decrease in vigorous activity was observed for girls (- 10.7%) versus boys (- 2.9%), non-white (- 12.9% to - 9.4%) versus white individuals (- 6.1%), lowest maternal education (high school (- 2.0%)) versus college/university (ns) and for overweight/obese (- 6.1%) versus healthy-weight participants (- 8.1%). In addition to larger annual decreases in vigorous-intensity activity, overweight/obese individuals, girls and North Americans had comparatively lower average vigorous-intensity activity at 5.0-5.9 y. Age-related declines in vigorous-intensity activity during youth appear relatively greater than those of moderate activity. However, due to a higher baseline, absolute moderate-intensity activity decreases more than vigorous. Overweight/obese individuals, girls, and North Americans appear especially in need of vigorous-intensity activity promotion due to low levels at 5
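Because the reported declines are relative (per year of age), they compound multiplicatively. A short sketch, using the paper's point estimates, of the fraction of the age-5 level remaining by age 18:

```python
# Relative annual declines from the abstract: 6.9% for vigorous-intensity
# activity, 6.0% for moderate. Compounded over 13 years (age 5 to 18).
def remaining_fraction(annual_decline, years):
    """Fraction of the baseline level left after compounding the decline."""
    return (1 - annual_decline) ** years

vigorous = remaining_fraction(0.069, 13)
moderate = remaining_fraction(0.060, 13)
print(f"vigorous: {vigorous:.2f}, moderate: {moderate:.2f}")
# vigorous: 0.39, moderate: 0.45
```

This illustrates why a seemingly small difference in annual rates (6.9% vs. 6.0%) separates the two intensities noticeably by late adolescence, even though, from a higher baseline, moderate activity loses more in absolute terms.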

  13. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    NASA Technical Reports Server (NTRS)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  14. Quarterly Briefing Book on Environmental and Waste Management Activities

    SciTech Connect

    Brown, M.C.

    1991-06-01

    The purpose of the Quarterly Briefing Book on Environmental and Waste Management Activities is to provide managers and senior staff at the US Department of Energy-Richland Operations Office and its contractors with timely and concise information on Hanford Site environmental and waste management activities. Each edition updates the information on the topics in the previous edition, deletes those determined not to be of current interest, and adds new topics to keep up to date with changing environmental and waste management requirements and issues. Section A covers current waste management and environmental restoration issues. In Section B are writeups on national or site-wide environmental and waste management topics. Section C has writeups on program- and waste-specific environmental and waste management topics. Section D provides information on waste sites and inventories on the site. 15 figs., 4 tabs.

  15. Biofuel Database

    National Institute of Standards and Technology Data Gateway

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  16. hERGAPDbase: a database documenting hERG channel inhibitory potentials and APD-prolongation activities of chemical compounds.

    PubMed

    Hishigaki, Haretsugu; Kuhara, Satoru

    2011-01-01

    Drug-induced QT interval prolongation is one of the most common reasons for the withdrawal of drugs from the market. In the past decade, at least nine drugs, i.e. terfenadine, astemizole, grepafloxacin, terodiline, droperidol, lidoflazine, sertindole, levomethadyl and cisapride, have been removed from the market or their use has been severely restricted because of drug-induced QT interval prolongation. Therefore, this irregularity is a major safety concern in the case of drugs submitted for regulatory approval. The most common mechanism of drug-induced QT interval prolongation may be drug-related inhibition of the human ether-à-go-go-related gene (hERG) channel, which subsequently results in prolongation of the cardiac action potential duration (APD). hERGAPDbase is a database of electrophysiological experimental data documenting potential hERG channel inhibitory actions and the APD-prolongation activities of chemical compounds. All data entries are manually collected from scientific papers and curated by a person. With hERGAPDbase, we aim to provide useful information for chemical and pharmacological scientists and enable easy access to electrophysiological experimental data on chemical compounds. Database URL: http://www.grt.kyushu-u.ac.jp/hergapdbase/.

  18. An expressed sequence tag database of T-cell-enriched activated chicken splenocytes: sequence analysis of 5251 clones.

    PubMed

    Tirunagaru, V G; Sofer, L; Cui, J; Burnside, J

    2000-06-01

    The cDNA and gene sequences of many mammalian cytokines and their receptors are known. However, corresponding information on avian cytokines is limited due to the lack of cross-species activity at the functional level or strong homology at the molecular level. To improve the efficiency of identifying cytokines and novel chicken genes, a directionally cloned cDNA library from T-cell-enriched activated chicken splenocytes was constructed, and the partial sequence of 5251 clones was obtained. Sequence clustering indicates that 2357 (42%) of the clones are present as a single copy, and 2961 are distinct clones, demonstrating the high level of complexity of this library. Comparisons of the sequence data with known DNA sequences in GenBank indicate that approximately 25% of the clones match known chicken genes, 39% have similarity to known genes in other species, and 11% had no match to any sequence in the database. Several previously uncharacterized chicken cytokines and their receptors were present in our library. This collection provides a useful database for cataloging genes expressed in T cells and a valuable resource for future investigations of gene expression in avian immunology. A chicken EST Web site (http://udgenome.ags.udel.edu/chickest/chick.htm) has been created to provide access to the data, and a set of unique sequences has been deposited with GenBank (Accession Nos. AI979741-AI982511). Our new Web site (http://www.chickest.udel.edu) will be active as of March 3, 2000, and will also provide keyword-searching capabilities for BLASTX and BLASTN hits of all our clones. PMID:10860659

  19. International Project Management Committee: Overview and Activities

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward

    2010-01-01

    This slide presentation discusses the purpose and composition of the International Project Management Committee (IMPC). The IMPC was established by members of 15 space agencies, companies and professional organizations. The goal of the committee is to establish a means to share experiences and best practices with space project/program management practitioners at the global level. The space agencies that are involved are: AEB, DLR, ESA, ISRO, JAXA, KARI, and NASA. The industrial and professional organizational members are Comau, COSPAR, PMI, and Thales Alenia Space.

  20. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative database management systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Previous administrative database systems were applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited-capacity database systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  1. Activated carbon: Utilization excluding industrial waste treatment. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-06-01

    The bibliography contains citations concerning the commercial use and theoretical studies of activated carbon. Topics include performance evaluations in water treatment processes, preparation and regeneration techniques, materials recovery, and pore structure studies. Adsorption characteristics for specific materials are discussed. Studies pertaining specifically to industrial waste treatment are excluded. (Contains 250 citations and includes a subject term index and title list.)

  2. Essential Learnings in Environmental Education--A Database for Building Activities and Programs.

    ERIC Educational Resources Information Center

    Ballard, Melissa, Comp.; Pandya, Mamata, Comp.

    The purpose of this book is to provide building blocks for designing and reviewing environmental education programs and activities. This handbook provides 600 basic concepts needed to attain the environmental education goals outlined at the Tbilisi, USSR, conference and generally agreed to be the fundamental core of quality environmental…

  3. Data-Based Active Learning in the Principles of Macroeconomics Course: A Mock FOMC Meeting

    ERIC Educational Resources Information Center

    Whiting, Cathleen

    2006-01-01

    The author presents an active-learning exercise for the introductory macroeconomics class in which students participate in a mock Federal Open Market Committee (FOMC) meeting. Preparation involves data gathering and writing both a research report and a policy recommendation. An FOMC meeting is simulated in which students give their policy…

  4. Activated sludge treatment. (Latest citations from the Life Sciences Collection database). Published Search

    SciTech Connect

    Not Available

    1993-07-01

    The bibliography contains citations concerning the activated sludge process in treating industrial and domestic waste. Apparatus design, parameters for effectiveness, and organisms utilized in the various processes are among the topics discussed. Performance evaluations and applications of treatment processes for the purification and removal of unwanted substances from sewage and waste water are presented. (Contains 250 citations and includes a subject term index and title list.)

  5. Review of Medical Dispute Cases in the Pain Management in Korea: A Medical Malpractice Liability Insurance Database Study

    PubMed Central

    Moon, Hyun Seog

    2015-01-01

    Background Pain medicine often requires medico-legal involvement, even though diagnosis and treatments have improved considerably. Multiple guidelines for pain physicians contain many recommendations regarding interventional treatment. Unfortunately, no definite treatment guidelines exist because there is no complete consensus among individual guidelines. Pain intervention procedures are widely practiced and highly associated with adverse events and complications. However, a comprehensive, systemic review of medical-dispute cases (MDCs) in Korea has not yet been reported. The purpose of this article is to analyze the frequency and type of medical dispute activity undertaken by pain specialists in Korea. Methods Data on medical dispute cases were collected through the Korea Medical Association mutual aid and through a private medical malpractice liability insurance company. Data regarding the frequency and type of MDCs, along with brief case descriptions, were obtained. Results Pain in the lumbar region made up a major proportion of MDCs and compensation costs. Infection, nerve injury, and diagnosis-related cases were the most common types of MDCs. Only a small proportion of cases involved patient death or unconsciousness, but their compensation costs were the highest. Conclusions More systemic guidelines and recommendations on interventional pain management are needed, especially those focused on medico-legal cases. Complications arising from pain management procedures and treatments may be avoided by physicians who have the required knowledge and expertise regarding anatomy and pain intervention procedures and know how to recognize procedural aberrations as soon as they occur. PMID:26495080

  6. Databases for LDEF results

    NASA Technical Reports Server (NTRS)

    Bohnhoff-Hlavacek, Gail

    1992-01-01

    One of the objectives of the team supporting the LDEF Systems and Materials Special Investigative Groups is to develop databases of experimental findings. These databases identify the hardware flown, summarize results and conclusions, and provide a system for acknowledging investigators, tracing sources of data, and recording future design suggestions. To date, databases covering the optical experiments and thermal control materials (chromic acid anodized aluminum, silverized Teflon blankets, and paints) have been developed at Boeing. We used the FileMaker Pro software, the database manager for the Macintosh computer produced by the Claris Corporation. It is a flat, text-retrievable database that provides access to the data via an intuitive user interface, without tedious programming. Though this software is available only for the Macintosh computer at this time, copies of the databases can be saved to a format that is readable on a personal computer as well. Further, the data can be exported to more powerful relational databases. This summary describes the contents, capabilities, and use of the LDEF databases and explains how to get copies of the databases for your own research.

  7. Multiple objective optimization for active sensor management

    NASA Astrophysics Data System (ADS)

    Page, Scott F.; Dolia, Alexander N.; Harris, Chris J.; White, Neil M.

    2005-03-01

    The performance of a multi-sensor data fusion system is inherently constrained by the configuration of the given sensor suite. Intelligent or adaptive control of sensor resources has been shown to offer improved fusion performance in many applications. Common approaches to sensor management select sensor observation tasks that are optimal in terms of a measure of information. However, optimising for information alone is inherently sub-optimal as it does not take account of any other system requirements such as stealth or sensor power conservation. We discuss the issues relating to developing a suite of performance metrics for optimising multi-sensor systems and propose some candidate metrics. In addition, it may not always be necessary to maximise information gain; in some cases small increases in information gain may come at the cost of large sensor resource requirements. Additionally, the problems of sensor tasking and placement are usually treated separately, leading to a lack of coherency between sensor management frameworks. We propose a novel approach based on a high-level decentralized information-theoretic sensor management architecture that unifies the processes of sensor tasking and sensor placement into a single framework. Sensors are controlled using a minimax multiple-objective optimisation approach in order to address probability of target detection, sensor power consumption, and sensor survivability whilst maintaining a target estimation covariance threshold. We demonstrate the potential of the approach through simulation of a multi-sensor, target-tracking scenario and compare the results with a single-objective information-based approach.
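The minimax multiple-objective selection described above can be sketched as: among sensor configurations meeting the covariance threshold, pick the one whose worst normalized objective is smallest. All names and numbers below are invented for illustration, not taken from the paper:

```python
# Each candidate: (miss_probability, power_use, exposure, est_covariance),
# with the first three objectives normalized to [0, 1], lower is better.
candidates = {
    "radar_high": (0.05, 0.90, 0.80, 0.8),
    "radar_low":  (0.20, 0.40, 0.50, 1.4),
    "passive_ir": (0.30, 0.10, 0.05, 1.9),
}

COV_THRESHOLD = 2.0  # reject configurations whose estimate covariance exceeds this

def minimax_choice(cands, cov_max):
    """Minimax selection: minimize the worst (maximum) normalized objective,
    restricted to configurations satisfying the covariance constraint."""
    feasible = {k: v for k, v in cands.items() if v[3] <= cov_max}
    return min(feasible, key=lambda k: max(feasible[k][:3]))

print(minimax_choice(candidates, COV_THRESHOLD))  # passive_ir
```

Tightening the covariance threshold changes the answer: with `cov_max=1.0` only the high-power radar remains feasible, illustrating the trade-off between estimation quality and the stealth/power objectives.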

  8. A real-time framework for classifying, indexing and querying image database volcano activity

    NASA Astrophysics Data System (ADS)

    Aliotta, M. A.; Cannata, A.; Cassisi, C.; Montalto, P.; Prestifilippo, M.

    2013-12-01

    INGV-OE monitors explosive activity at Stromboli Volcano in order to analyze its eruptive dynamics. Since strombolian activity can be monitored using thermal camera frames, an efficient system able to retrieve information from a huge number of frames is needed. The aim of this work is the development of a novel system capable of fast data retrieval, based on similarity concepts. To this end, an indexing algorithm was developed. The concept is finding elements of a set that are close with respect to a query element, according to a similarity criterion. To accomplish this task, each video frame is processed using morphological image processing techniques to extract the image area of the explosion. Each closed curve, representing the explosion contour, is processed to extract useful features; these constitute the metric space in which similarity between objects can be evaluated by calculating distances with an appropriate distance function. This approach suffers from an intrinsic problem related to the number of distances to be computed on a huge amount of data. To overcome this drawback, an indexing algorithm was applied. The idea behind this concept is a tree data structure that minimizes the number of distances to be computed among objects. The proposed method allows fast query execution and finds similar objects using the distance among their features. Thus, we can group together explosions related to different kinds of activity using reference items. For instance, if we have a known image sequence showing a given explosion, we can easily and quickly find all sequences containing similar explosions. The developed framework is able both to classify each new explosion and to dynamically insert the corresponding object into the tree structure. Therefore, with our approach, we can cluster the entire data space, grouping objects with similar characteristics, and classify them.
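The core trick behind such metric-space indexes can be shown with a single pivot: storing each object's distance to the pivot lets a range query discard candidates by the triangle inequality before computing any distance to the query. A toy sketch (the feature vectors and `PivotIndex` class are hypothetical, not the authors' tree structure):

```python
import math

def dist(a, b):
    # Euclidean distance over (hypothetical) explosion-contour feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class PivotIndex:
    """Toy pivot-based metric index: each object's distance to one pivot is
    precomputed, so a range query can skip objects whose distance to the
    query provably exceeds the radius."""
    def __init__(self, objects):
        self.pivot = objects[0]
        self.entries = [(o, dist(o, self.pivot)) for o in objects]

    def range_query(self, q, r):
        dq = dist(q, self.pivot)
        hits, computed = [], 0
        for obj, dp in self.entries:
            if abs(dq - dp) > r:   # triangle inequality: obj cannot be within r
                continue
            computed += 1          # only now pay for a real distance computation
            if dist(q, obj) <= r:
                hits.append(obj)
        return hits, computed

features = [(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (5.1, 4.9), (10.0, 0.0)]
idx = PivotIndex(features)
hits, computed = idx.range_query((5.0, 5.1), 0.5)
print(hits, computed)  # two nearby contours found, only 2 of 5 distances computed
```

A tree of pivots (as in the paper's index) applies the same inequality recursively, pruning whole subtrees instead of single objects.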

  9. First Look: TRADEMARKSCAN Database.

    ERIC Educational Resources Information Center

    Fernald, Anne Conway; Davidson, Alan B.

    1984-01-01

    Describes database produced by Thomson and Thomson and available on Dialog which contains over 700,000 records representing all active federal trademark registrations and applications for registrations filed in United States Patent and Trademark Office. A typical record, special features, database applications, learning to use TRADEMARKSCAN, and…

  10. A Quality System Database

    NASA Technical Reports Server (NTRS)

    Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William

    2010-01-01

    A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.

  11. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus not only enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research.
Conclusion BioWarehouse embodies significant progress on the database integration problem for
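The enzyme-gap question mentioned above is a typical warehouse-style query: one SQL statement joining data that originated in different source databases. A sketch against an in-memory SQLite database; the table and column names are invented stand-ins, not the actual BioWarehouse schema:

```python
import sqlite3

# Hypothetical warehouse schema: one table of enzyme activities (from, say, a
# pathway database) and one of sequences (from a sequence database).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY, name TEXT);
CREATE TABLE protein_sequence (id INTEGER PRIMARY KEY, ec_number TEXT);
INSERT INTO enzyme_activity VALUES
  ('1.1.1.1', 'alcohol dehydrogenase'),
  ('2.7.1.1', 'hexokinase'),
  ('4.2.1.99', 'orphan activity');
INSERT INTO protein_sequence (ec_number) VALUES ('1.1.1.1'), ('2.7.1.1');
""")

# Enzyme activities with an assigned EC number but no sequence in any source
orphans = con.execute("""
  SELECT COUNT(*) FROM enzyme_activity e
  WHERE NOT EXISTS (SELECT 1 FROM protein_sequence s
                    WHERE s.ec_number = e.ec_number)
""").fetchone()[0]
total = con.execute("SELECT COUNT(*) FROM enzyme_activity").fetchone()[0]
print(f"{100 * orphans // total}% of activities lack a sequence")  # 33%
```

Because both tables live in one relational schema, the cross-source question is a single correlated subquery rather than a bespoke script per database, which is the point of the warehousing approach.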

  12. The uptake of active surveillance for the management of prostate cancer: A population-based analysis

    PubMed Central

    Richard, Patrick O.; Alibhai, Shabbir M.H.; Panzarella, Tony; Klotz, Laurence; Komisarenko, Maria; Fleshner, Neil E.; Urbach, David; Finelli, Antonio

    2016-01-01

    Introduction: Active surveillance (AS) is a strategy for the management of low-risk prostate cancer (PCa). However, few studies have assessed the uptake of AS at a population level and none of these were based on a Canadian population. Therefore, our objectives were to estimate the proportion of men being managed by AS in Ontario and to assess the factors associated with its uptake. Methods: This was a retrospective, population-based study using administrative databases from the province of Ontario to identify men ≤75 years diagnosed with localized PCa between 2002 and 2010. Descriptive statistics were used to estimate the proportion of men managed by AS, whereas mixed models were used to assess the factors associated with the uptake of AS. Results: 45 691 men met our inclusion criteria. Of these, 18% were managed by AS. Over time, the rates of AS increased significantly from 11% to 21% (p<0.001). Older age, residing in an urban centre, being diagnosed in the later years of the study period, having a neighborhood income in the highest quintile, and being managed by urologists were all associated with greater odds of receiving AS. Conclusions: There has been a steady increase in the uptake of AS between 2002 and 2010. However, only 18% of men diagnosed with localized PCa were managed by AS during the study period. The decisions to adopt AS were influenced by several individual and physician characteristics. The data suggest that there is significant opportunity for more widespread adoption of AS. PMID:27800055

  13. The Global Terrestrial Network for Permafrost Database: metadata statistics and prospective analysis on future permafrost temperature and active layer depth monitoring site distribution

    NASA Astrophysics Data System (ADS)

    Biskaborn, B. K.; Lanckman, J.-P.; Lantuit, H.; Elger, K.; Streletskiy, D. A.; Cable, W. L.; Romanovsky, V. E.

    2015-03-01

    The Global Terrestrial Network for Permafrost (GTN-P) provides the first dynamic database associated with the Thermal State of Permafrost (TSP) and the Circumpolar Active Layer Monitoring (CALM) programs, which extensively collect permafrost temperature and active layer thickness data from Arctic, Antarctic and Mountain permafrost regions. The purpose of the database is to establish an "early warning system" for the consequences of climate change in permafrost regions and to provide standardized thermal permafrost data to global models. In this paper we perform statistical analysis of the GTN-P metadata aiming to identify the spatial gaps in the GTN-P site distribution in relation to climate-effective environmental parameters. We describe the concept and structure of the Data Management System in regard to user operability, data transfer and data policy. We outline data sources and data processing including quality control strategies. Assessment of the metadata and data quality reveals 63% metadata completeness at active layer sites and 50% metadata completeness for boreholes. Voronoi Tessellation Analysis on the spatial sample distribution of boreholes and active layer measurement sites quantifies the distribution inhomogeneity and provides potential locations of additional permafrost research sites to improve the representativeness of thermal monitoring across areas underlain by permafrost. The depth distribution of the boreholes reveals that 73% are shallower than 25 m and 27% are deeper, reaching a maximum of 1 km depth. Comparison of the GTN-P site distribution with permafrost zones, soil organic carbon contents and vegetation types exhibits different local to regional monitoring situations on maps. Preferential slope orientation at the sites most likely causes a bias in the temperature monitoring and should be taken into account when using the data for global models. The distribution of GTN-P sites within zones of projected temperature change show a high
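The spatial-inhomogeneity idea above (quantified in the paper by Voronoi tessellation) can be approximated with a simpler nearest-neighbour spacing statistic: an even network has near-uniform spacing, a clustered one does not. The coordinates below are hypothetical, not GTN-P sites:

```python
import math
import statistics

def nn_distances(sites):
    """Distance from each site to its nearest neighbour (planar approximation;
    a crude stand-in for the paper's Voronoi tessellation analysis)."""
    return [min(math.dist(a, b) for j, b in enumerate(sites) if j != i)
            for i, a in enumerate(sites)]

# Hypothetical site coordinates (km): one dense cluster plus two isolated sites
sites = [(0, 0), (1, 0), (0, 1), (1, 1), (50, 50), (100, 0)]
d = nn_distances(sites)

# Coefficient of variation of spacing: near 0 for an even grid, large when
# the network is clustered and leaves big unmonitored gaps
cv = statistics.stdev(d) / statistics.mean(d)
print(round(cv, 2))
```

Sites in the sparse region (large nearest-neighbour distance) mark where adding a monitoring site would most improve representativeness, analogous to the paper's suggested new-site locations.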

  14. US EPA’s Watershed Management Research Activities

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s Urban Watershed Management Branch (UWMB) is responsible for developing and demonstrating methods to manage the risk to public health, property and the environment from wet-weather flows (WWF) in urban watersheds. The activities are prim...

  15. Guide to good practices for line and training manager activities

    SciTech Connect

    1998-06-01

    The purpose of this guide is to provide direction for line and training managers in carrying out their responsibilities for training and qualifying personnel and to verify that existing training activities are effective.

  16. Nonexercise activity thermogenesis in obesity management.

    PubMed

    Villablanca, Pedro A; Alegria, Jorge R; Mookadam, Farouk; Holmes, David R; Wright, R Scott; Levine, James A

    2015-04-01

    Obesity is linked to cardiovascular disease. The global increase in sedentary lifestyle is an important factor contributing to the rising prevalence of the obesity epidemic. Traditionally, counseling has focused on moderate- to vigorous-intensity exercise, with disappointing results. Nonexercise activity thermogenesis (NEAT) is an important component of daily energy expenditure. It represents the common daily activities, such as fidgeting, walking, and standing. These high-effect NEAT movements could result in up to an extra 2000 kcal of expenditure per day beyond the basal metabolic rate, depending on body weight and level of activity. Implementing NEAT during leisure-time and occupational activities could be essential to maintaining a negative energy balance. NEAT can be applied by being upright, ambulating, and redesigning workplace and leisure-time environments to promote NEAT. The benefits of NEAT include not only the extra calories expended but also the reduced occurrence of the metabolic syndrome, cardiovascular events, and all-cause mortality. We believe that to overcome the obesity epidemic and its adverse cardiovascular consequences, NEAT should be part of the current medical recommendations. The content of this review is based on a literature search of PubMed and the Google search engine between January 1, 1960, and October 1, 2014, using the search terms physical activity, obesity, energy expenditure, nonexercise activity thermogenesis, and NEAT. PMID:25841254

  17. Database of the Geology and Thermal Activity of Norris Geyser Basin, Yellowstone National Park

    USGS Publications Warehouse

    Flynn, Kathryn; Graham Wall, Brita; White, Donald E.; Hutchinson, Roderick A.; Keith, Terry E.C.; Clor, Laura; Robinson, Joel E.

    2008-01-01

    This dataset contains contacts, geologic units and map boundaries from Plate 1 of USGS Professional Paper 1456, 'The Geology and Remarkable Thermal Activity of Norris Geyser Basin, Yellowstone National Park, Wyoming.' The features are contained in the Annotation, basins_poly, contours, geology_arc, geology_poly, point_features, and stream_arc feature classes as well as a table of geologic units and their descriptions. This dataset was constructed to produce a digital geologic map as a basis for studying hydrothermal processes in Norris Geyser Basin. The original map does not contain registration tic marks. To create the geodatabase, the original scanned map was georegistered to USGS aerial photographs of the Norris Junction quadrangle collected in 1994. Manmade objects, i.e. roads, parking lots, and the visitor center, along with stream junctions and other hydrographic features, were used for registration.
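Georegistering a scanned map to reference imagery, as described above, comes down at minimum to fitting an affine transform from control-point pairs. A sketch that solves the transform exactly from three non-collinear points (the coordinates are made up, not the actual Norris Junction control points):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def cramer(m, v):
    """Solve the 3x3 linear system m @ x = v by Cramer's rule."""
    d = det3(m)
    sol = []
    for col in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][col] = v[r]
        sol.append(det3(mi) / d)
    return sol

def fit_affine(src, dst):
    """Return (a, b, c, d, e, f) such that x' = a*x + b*y + c and
    y' = d*x + e*y + f map the three src control points onto dst."""
    m = [[x, y, 1.0] for x, y in src]
    a, b, c = cramer(m, [p[0] for p in dst])
    d, e, f = cramer(m, [p[1] for p in dst])
    return a, b, c, d, e, f

# Hypothetical control points: scan pixel coordinates -> photo coordinates
coeffs = fit_affine([(0, 0), (1, 0), (0, 1)], [(10, 20), (12, 20), (10, 22)])
a, b, c, d, e, f = coeffs
print((a * 1 + b * 1 + c, d * 1 + e * 1 + f))  # scan point (1, 1) -> (12.0, 22.0)
```

With more than three control points (the usual GIS case), the same six coefficients are fitted by least squares instead of solved exactly, which averages out digitization error in each point.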

  18. An Interactive Geospatial Database and Visualization Approach to Early Warning Systems and Monitoring of Active Volcanoes: GEOWARN

    NASA Astrophysics Data System (ADS)

    Gogu, R. C.; Schwandner, F. M.; Hurni, L.; Dietrich, V. J.

    2002-12-01

    Large parts of southern and central Europe and the Pacific rim lie in tectonically, seismically, and volcanically extremely active zones. With the growth of population and tourism, vulnerability and risk towards natural hazards have expanded over large areas. Socio-economic aspects, land use, tourist and industrial planning, and environmental protection increasingly require natural hazard assessment. The availability of powerful and reliable satellite, geophysical, and geochemical information and warning systems is therefore increasingly vital. Moreover, once such systems have proven effective, they can be applied for similar purposes in other European areas and worldwide. Today's technologies have proven that early warning of volcanic activity can be achieved by monitoring measurable changes in geophysical and geochemical parameters. However, correlation between the different monitored data sets, which would improve any prediction, is scarce or missing. Visualisation of all spatial information and its integration into an "intelligent cartographic concept" is of paramount interest in order to develop 2-, 3- and 4-dimensional models for risk and emergency assessment as well as environmental and socio-economic planning. In the framework of the GEOWARN project, a database prototype for an Early Warning System (EWS) and monitoring of volcanic activity in case of hydrothermal-explosive and volcanic reactivation has been designed. The platform-independent, web-based, Java-programmed, interactive multidisciplinary multiparameter visualization software being developed at ETH allows expansion to other volcanoes, worldwide databases of volcanic unrest, and other types of natural hazard assessment. Within the project consortium, scientific data have been acquired at two pilot sites, Campi Flegrei (Italy) and Nisyros (Greece), including 2D and 3D topography and bathymetry, digital elevation models (DEM), and digital landscape models (DLM) derived from conventional

  19. Far-infrared Line Spectra of Active Galaxies from the Herschel/PACS Spectrometer: The Complete Database

    NASA Astrophysics Data System (ADS)

    Fernández-Ontiveros, Juan Antonio; Spinoglio, Luigi; Pereira-Santaella, Miguel; Malkan, Matthew A.; Andreani, Paola; Dasyra, Kalliopi M.

    2016-10-01

    We present a coherent database of spectroscopic observations of far-IR fine-structure lines from the Herschel/Photoconductor Array Camera and Spectrometer archive for a sample of 170 local active galactic nuclei (AGNs), plus a comparison sample of 20 starburst galaxies and 43 dwarf galaxies. Published Spitzer/IRS and Herschel/SPIRE line fluxes are included to extend our database to the full 10–600 μm spectral range. The observations are compared to a set of Cloudy photoionization models to estimate the relevant physical quantities through different diagnostic diagrams. We confirm the presence of a stratification of gas density in the emission regions of the galaxies, which increases with the ionization potential of the emission lines. The new [O IV] 25.9 μm/[O III] 88 μm versus [Ne III] 15.6 μm/[Ne II] 12.8 μm diagram is proposed as the best diagnostic to separate (1) AGN activity from any kind of star formation and (2) low-metallicity dwarf galaxies from starburst galaxies. Current stellar atmosphere models fail to reproduce the observed [O IV] 25.9 μm/[O III] 88 μm ratios, which are much higher than the predicted values. Finally, the ([Ne III] 15.6 μm + [Ne II] 12.8 μm)/([S IV] 10.5 μm + [S III] 18.7 μm) ratio is proposed as a promising metallicity tracer to be used in obscured objects, where optical lines fail to accurately measure the metallicity. The diagnostic power of mid- to far-infrared spectroscopy shown here for local galaxies will be of crucial importance for studying galaxy evolution during the dust-obscured phase at the peak of star formation and black hole accretion activity (1 < z < 4). This study will be addressed by future deep spectroscopic surveys with present and forthcoming facilities such as the James Webb Space Telescope, the Atacama Large Millimeter/submillimeter Array, and the Space Infrared Telescope for Cosmology and Astrophysics.

  20. Bibliometric analysis of nutrition and dietetics research activity in Arab countries using ISI Web of Science database.

    PubMed

    Sweileh, Waleed M; Al-Jabi, Samah W; Sawalha, Ansam F; Zyoud, Sa'ed H

    2014-01-01

    Reducing nutrition-related health problems in Arab countries requires an understanding of the performance of Arab countries in the field of nutrition and dietetics research. Assessment of research activity from a particular country or region can be achieved through bibliometric analysis. This study was carried out to investigate research activity in "nutrition and dietetics" in Arab countries. Original and review articles published from Arab countries in the "nutrition and dietetics" Web of Science category up until 2012 were retrieved and analyzed using the ISI Web of Science database. The total number of documents published in the "nutrition and dietetics" category from Arab countries was 2062. This constitutes 1% of worldwide research activity in the field. Annual research productivity showed a significant increase after 2005. Approximately 60% of published documents originated from three Arab countries, namely Egypt, the Kingdom of Saudi Arabia, and Tunisia. However, Kuwait had the highest research productivity per million inhabitants. The main research areas of published documents were "Food Science/Technology" and "Chemistry," which constituted 75% of published documents, compared with 25% for worldwide documents in nutrition and dietetics. A total of 329 (15.96%) nutrition-related diabetes, obesity, or cancer documents were published from Arab countries, compared with 21% of worldwide published documents. Interest in nutrition and dietetics research is relatively recent in Arab countries. The focus of nutrition research is mainly on food technology and chemistry, with less activity in nutrition-related health research. International cooperation in nutrition research will help Arab researchers implement nutrition research that leads to better national policies regarding nutrition.

  1. Active listening: The key of successful communication in hospital managers

    PubMed Central

    Jahromi, Vahid Kohpeima; Tabatabaee, Seyed Saeed; Abdar, Zahra Esmaeili; Rajabi, Mahboobeh

    2016-01-01

    Introduction: One of the important causes of medical errors and unintentional harm to patients is ineffective communication. An important part of this skill, and one that is often forgotten, is listening. The objective of this study was to determine whether managers in hospitals listen actively. Methods: This study was conducted between May and June 2014 among three levels of managers at teaching hospitals in Kerman, Iran. Active listening skill among hospital managers was measured by the self-made Active Listening Skill Scale (ALSS), which consists of the key elements of active listening and has five subscales, i.e., Avoiding Interruption, Maintaining Interest, Postponing Evaluation, Organizing Information, and Showing Interest. The data were analyzed with IBM SPSS software, version 20, using the Pearson product-moment correlation coefficient, the chi-squared test, and multiple linear regression. Results: The mean score of active listening in hospital managers was 2.32 out of 3. The highest score (2.27) was obtained by the first-level managers, and the top managers got the lowest score (2.16). Hospital managers were best at showing interest and worst at avoiding interruptions. The area of employment was a significant predictor of avoiding interruption, and the managers' gender was a strong predictor of skill in maintaining interest (p < 0.05). The type of management and education can predict postponing evaluation, and the length of employment can predict showing interest (p < 0.05). Conclusion: There is a need to develop strategies that create more awareness among hospital managers concerning their active listening skills. PMID:27123221

  2. Intrauterine resuscitation: active management of fetal distress.

    PubMed

    Thurlow, J A; Kinsella, S M

    2002-04-01

    Acute fetal distress in labour is a condition of progressive fetal asphyxia with hypoxia and acidosis. It is usually diagnosed by finding characteristic features in the fetal heart rate pattern, wherever possible supported by fetal scalp pH measurement. Intrauterine resuscitation consists of applying specific measures with the aim of increasing oxygen delivery to the placenta and umbilical blood flow, in order to reverse hypoxia and acidosis. These measures include initial left lateral recumbent positioning followed by right lateral or knee-elbow if necessary, rapid intravenous infusion of a litre of non-glucose crystalloid, maternal oxygen administration at the highest practical inspired percentage, inhibition of uterine contractions usually with subcutaneous or intravenous terbutaline 250 microg, and intra-amniotic infusion of warmed crystalloid solution. Specific manoeuvres for umbilical cord prolapse are also described. Intrauterine resuscitation may be used as part of the obstetric management of labour, while preparing for caesarean delivery for fetal distress, or at the time of establishment of regional analgesia during labour in the compromised fetus. The principles may also be applied during inter-hospital transfers of sick or labouring parturients. PMID:15321562

  4. Activity-based costing management in a private practice setting.

    PubMed

    Carlomagno, M; Draper, V

    1997-01-01

    Activity-based costing is a method of calculating the cost of a service by focusing on operations. It gives operations and financial managers quick and tangible cost information. While this method has been used more in manufacturing, it is gaining acceptance in medical practice. This article describes activity-based costing and illustrates how to start using it in a practice.

  5. Activity-Based Costing: A Cost Management Tool.

    ERIC Educational Resources Information Center

    Turk, Frederick J.

    1993-01-01

    In college and university administration, overhead costs are often charged to programs indiscriminately, whereas the support activities that underlie those costs remain unanalyzed. It is time for institutions to decrease ineffective use of resources. Activity-based management attributes costs more accurately and can improve efficiency. (MSE)

  6. The Impact of Environment and Occupation on the Health and Safety of Active Duty Air Force Members: Database Development and De-Identification.

    PubMed

    Erich, Roger; Eaton, Melinda; Mayes, Ryan; Pierce, Lamar; Knight, Andrew; Genovesi, Paul; Escobar, James; Mychalczuk, George; Selent, Monica

    2016-08-01

    Preparing data for medical research can be challenging, detail-oriented, and time consuming. Transcription errors, missing or nonsensical data, and records not applicable to the study population may hamper progress and, if unaddressed, can lead to erroneous conclusions. In addition, study data may be housed in multiple disparate databases and complex formats, and merging methods may be incomplete, making it difficult to obtain temporally synchronized data elements. We created a comprehensive database to explore the general hypothesis that environmental and occupational factors influence health outcomes and risk-taking behavior among active duty Air Force personnel. Several databases containing demographics, medical records, health survey responses, and safety incident reports were cleaned, validated, and linked to form a comprehensive, relational database. The final step involved removing and transforming personally identifiable information to form a Health Insurance Portability and Accountability Act compliant limited database. Initial data consisted of over 62.8 million records containing 221 variables. When completed, approximately 23.9 million clean and valid records with 214 variables remained. With a clean, robust database, future analysis aims to identify high-risk career fields for targeted interventions or uncover potential protective factors in low-risk career fields. PMID:27483519
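
    A pipeline like the one described — linking disparate record sources on a shared key, dropping missing or nonsensical values, then stripping personally identifiable fields to form a limited dataset — can be sketched in plain Python. The field names, age bounds, and sample records below are hypothetical, not the study's actual schema:

```python
# Sketch of a clean / merge / de-identify pipeline (hypothetical fields).

def merge_records(demographics, surveys, key="member_id"):
    """Inner-join two record lists on a shared key."""
    by_key = {row[key]: row for row in surveys}
    return [
        {**demo, **by_key[demo[key]]}
        for demo in demographics
        if demo[key] in by_key
    ]

def clean(records, required=("age", "career_field")):
    """Drop records with missing or nonsensical required values."""
    return [
        r for r in records
        if all(r.get(f) not in (None, "") for f in required)
        and isinstance(r.get("age"), int) and 17 < r["age"] < 70
    ]

def de_identify(records, pii=("member_id", "name")):
    """Remove direct identifiers to form a limited dataset."""
    return [{k: v for k, v in r.items() if k not in pii} for r in records]

demographics = [
    {"member_id": 1, "name": "A", "age": 25, "career_field": "pilot"},
    {"member_id": 2, "name": "B", "age": None, "career_field": "medic"},
]
surveys = [{"member_id": 1, "risk_score": 3}, {"member_id": 2, "risk_score": 5}]

limited = de_identify(clean(merge_records(demographics, surveys)))
print(limited)  # [{'age': 25, 'career_field': 'pilot', 'risk_score': 3}]
```

    The record with a missing age is dropped during cleaning, and the identifiers are removed only as the final step, mirroring the order described in the abstract.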

  7. Using Sales Management Students to Manage Professional Selling Students in an Innovative Active Learning Project

    ERIC Educational Resources Information Center

    Young, Joyce A.; Hawes, Jon M.

    2013-01-01

    This paper describes an application of active learning within two different courses: professional selling and sales management. Students assumed the roles of sales representatives and sales managers for an actual fund-raiser--a golf outing--sponsored by a student chapter of the American Marketing Association. The sales project encompassed an…

  8. Maize databases

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter is a succinct overview of maize data held in the species-specific database MaizeGDB (the Maize Genomics and Genetics Database), and selected multi-species data repositories, such as Gramene/Ensembl Plants, Phytozome, UniProt and the National Center for Biotechnology Information (NCBI), ...

  9. TRANSFORMATION OF DEVELOPMENTAL NEUROTOXICITY DATA INTO STRUCTURE-SEARCHABLE TOXML DATABASE IN SUPPORT OF STRUCTURE-ACTIVITY RELATIONSHIP (SAR) WORKFLOW.

    EPA Science Inventory

    Early hazard identification of new chemicals is often difficult due to lack of data on the novel material for toxicity endpoints, including neurotoxicity. At present, there are no structure searchable neurotoxicity databases. A working group was formed to construct a database to...

  10. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Kofoed, K. B.; Copenhaver, W.; Laney, C. M.; Gaylord, A. G.; Collins, J. A.; Tweedie, C. E.

    2014-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Recent advances include the addition of more than 2000 new research sites, provision of differential global positioning system (dGPS) and unmanned aerial vehicle (UAV) support to visiting scientists, surveying of over 80 miles of coastline to document rates of erosion, training of local GIS personnel to make better use of science in local decision making, deployment of and near-real-time connectivity to a wireless micrometeorological sensor network, links to Barrow area datasets housed at national data archives, and substantial upgrades to the BAID website and web mapping applications.

  11. Status Report on Transfer of Physical and Hydraulic Properties Databases to the Hanford Environmental Information System - PNNL Remediation Decision Support Project, Task 1, Activity 6

    SciTech Connect

    Rockhold, Mark L.; Middleton, Lisa A.; Cantrell, Kirk J.

    2009-06-30

    This document provides a status report on efforts to transfer physical and hydraulic property data from PNNL to CHPRC for incorporation into HEIS. The Remediation Decision Support (RDS) Project is managed by Pacific Northwest National Laboratory (PNNL) to support Hanford Site waste management and remedial action decisions by the U.S. Department of Energy and their contractors. The objective of Task 1, Activity 6 of the RDS project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. These physical and hydraulic property data are used to estimate parameters for analytical and numerical flow and transport models that are used for site risk assessments and evaluation of remedial action alternatives. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the original objectives of this activity on the RDS project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database maintained by PNNL, (2) transfer the physical and hydraulic property data from the Microsoft

  12. [Active career management needed for female doctors].

    PubMed

    Maas, Angela H E M; ter Braak, Edith W M T; Verbon, Annelies

    2015-01-01

    For more than 15 years two-thirds of medical students have been women. Despite this, they represent a minority (16-25%) of professors in academic medicine. There is still a major gender gap to the disadvantage of women in leading positions in academia, with women earning only 80% of the salary of their male counterparts and having fewer opportunities for scientific grants. Recent studies have shown that career ambition among men and women in medicine is comparable. However, successful women more often doubt their own achievements than men do. This is known as the 'imposter phenomenon' and acts as a barrier to career progression. Female leadership should be more actively promoted and encouraged to establish the diversity and creativity that we need in our current healthcare system. PMID:26959735

  14. Management of Water for Unconventional Oil and Gas Operations Enhanced with the Expanded U.S.Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Thordsen, J. J.; Thomas, B.; Reidy, M. E.; Engle, M.; Kharaka, Y. K.; Rowan, E. L.

    2014-12-01

    Increases in hydraulic fracturing practices for shale gas and tight oil reservoirs have dramatically increased petroleum production in the USA, but have also made the issue of water management from these operations a high priority. Hydraulic fracturing requires ~10,000 to 50,000 m³ of water per well for injection, in addition to water used to drill the well. Initially much of the water used for hydraulic fracturing was fresh water, but attitudes and operations are changing in response to costs and concerns. Concerns about groundwater depletion and contamination have prompted operators to increase the amount of produced water that can be recycled for hydraulic fracturing and to find suitable locations for salt-water injection. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water recycling. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g., irrigation, municipal uses, and industrial operations). The updated and expanded USGS Produced Waters Database available at http://eerscmap.usgs.gov/pwapp/ will facilitate and enhance studies on management of water, including produced water, for unconventional oil and gas drilling and production. The USGS database contains > 160,000 samples. Expanding on the 2002 database, we have filled in state and regional gaps with information from conventional and unconventional wells and have increased the number of constituents to include minor and trace chemicals, isotopes, and time series data. We currently have produced water data from 5,200 tight gas wells, 4,500 coal-bed methane (CBM) wells, 3,500 shale gas wells, and 700 tight oil wells. These numbers will increase as we continue to receive positive responses from oil companies, state oil and gas commissions, and scientists wanting to contribute their data. This database is an important resource for a wide range of interested parties. Scientists from universities, government agencies, public

  15. Passive and active adaptive management: Approaches and an example

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    Adaptive management is a framework for resource conservation that promotes iterative, learning-based decision making. Yet there remains considerable confusion about what adaptive management entails and how to actually make resource decisions adaptively. A key but somewhat ambiguous distinction in adaptive management is between active and passive forms of adaptive decision making. The objective of this paper is to illustrate some approaches to active and passive adaptive management with a simple example involving the drawdown of water impoundments on a wildlife refuge. The approaches are illustrated for the drawdown example and contrasted in terms of objectives, costs, and potential learning rates. Some key challenges to the actual practice of adaptive management are discussed, and tradeoffs between implementation costs and long-term benefits are highlighted. © 2010 Elsevier Ltd.
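
    The learning component common to both forms of adaptive management can be sketched as iterative Bayesian weighting of competing system models; the two models and the likelihood values below are invented purely for illustration:

```python
# Passive adaptive management sketch: belief weights on two competing models
# of how an impoundment drawdown affects habitat are updated as monitoring
# data arrive. In passive AM the action is chosen to optimize the management
# objective under current beliefs; learning happens as a by-product.

def bayes_update(weights, likelihoods):
    """Bayes' rule: reweight each model by how well it predicted the observation."""
    posterior = [w * l for w, l in zip(weights, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Prior: equal belief in a "fast response" and a "slow response" model.
weights = [0.5, 0.5]

# Likelihood of each year's observed outcome under each model (invented numbers).
annual_likelihoods = [(0.8, 0.3), (0.7, 0.4), (0.9, 0.2)]
for lik in annual_likelihoods:
    weights = bayes_update(weights, lik)

print([round(w, 3) for w in weights])  # [0.955, 0.045]: fast-response model favored
```

    Active adaptive management would go one step further and choose each drawdown level partly for its expected information gain, accepting short-term cost for faster discrimination between the models.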

  16. USER'S GUIDE FOR THE MUNICIPAL SOLID WASTE LIFE-CYCLE DATABASE

    EPA Science Inventory

    The report describes how to use the municipal solid waste (MSW) life cycle database, a software application with Microsoft Access interfaces that provides environmental data for energy production, materials production, and MSW management activities and equipment. The basic datab...

  17. Identification of new ABA- and MEJA-activated sugarcane bZIP genes by data mining in the SUCEST database.

    PubMed

    Schlögl, Paulo Sérgio; Nogueira, Fábio Tebaldi S; Drummond, Rodrigo; Felix, Juliana M; De Rosa, Vicente E; Vicentini, Renato; Leite, Adilson; Ulian, Eugênio C; Menossi, Marcelo

    2008-02-01

    Sugarcane is generally propagated by cuttings of the stalk containing one or more lateral buds, which will develop into a new plant. The transition from the dormant to the active stage constitutes a complex phenomenon characterized by changes in the accumulation of phytohormones and several other physiological aspects. Abscisic acid (ABA) and methyl jasmonate (MeJA) are major signaling molecules that influence plant development and stress responses. These plant regulators modulate gene expression with the participation of many transcription factors. Basic leucine zipper proteins (bZIPs) form a large family of transcription factors involved in a variety of plant physiological processes, such as development and responses to stress. Query sequences consisting of the full-length protein sequences of each of the Arabidopsis bZIP families were used to screen the sugarcane EST database (SUCEST), and 86 sugarcane assembled sequences (SAS) coding for bZIPs were identified. cDNA arrays and RNA gel blots were used to study the expression of these sugarcane bZIP genes during early plantlet development and in response to ABA and MeJA. Six bZIP genes were found to be differentially expressed during development. ABA and MeJA modulated the expression of eight sugarcane bZIP genes. Our findings provide novel insights into the expression of this large family of transcription factors in sugarcane.

  18. Genome databases

    SciTech Connect

    Courteau, J.

    1991-10-11

    Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  19. SU-E-J-129: A Strategy to Consolidate the Image Database of a VERO Unit Into a Radiotherapy Management System

    SciTech Connect

    Yan, Y; Medin, P; Yordy, J; Zhao, B; Jiang, S

    2014-06-01

    Purpose: To present a strategy to integrate the imaging database of a VERO unit with a treatment management system (TMS) to improve clinical workflow and consolidate image data to facilitate clinical quality control and documentation. Methods: A VERO unit is equipped with both kV and MV imaging capabilities for IGRT treatments and has its own imaging database behind a firewall. It has been a challenge to transfer images from this unit to a TMS in a radiation therapy clinic so that registered images can be reviewed remotely with an approval or rejection record. In this study, a software system, iPump-VERO, was developed to connect the VERO unit and the TMS in our clinic. The patient database folder on the VERO unit was mapped to a read-only folder on a file server outside the VERO firewall. The application runs on a regular computer with read access to the patient database folder. It finds the latest registered images and fuses them in one of six predefined patterns before sending them via a DICOM connection to the TMS. The residual image registration errors are overlaid on the fused image to facilitate image review. Results: The fused images of either registered kV planar images or CBCT images are fully DICOM compatible. A sentinel module senses newly registered images using negligible computing resources on the VERO ExacTrac imaging computer. It takes a few seconds to fuse registered images and send them to the TMS. The whole process is automated, without any human intervention. Conclusion: Transferring images over a DICOM connection is the easiest way to consolidate images from various sources in a TMS. The attending physician does not have to go to the VERO treatment console to review image registration prior to delivery. It is a useful tool for a busy clinic with a VERO unit.
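
    The sentinel pattern described — polling a read-only mapping of the patient database for newly registered images and forwarding the newest — can be sketched with the Python standard library. The `.reg` file suffix and file names here are hypothetical, and a real implementation would read and send DICOM objects with a library such as pydicom rather than plain files:

```python
import os
import tempfile

def latest_registered(folder, suffix=".reg"):
    """Return the most recently modified registration file in folder, or None."""
    candidates = [
        os.path.join(folder, name)
        for name in os.listdir(folder)
        if name.endswith(suffix)
    ]
    return max(candidates, key=os.path.getmtime) if candidates else None

# Demo: a stand-in for the read-only mapped patient database folder.
with tempfile.TemporaryDirectory() as db:
    for name, mtime in (("fx01.reg", 1000), ("fx02.reg", 2000)):
        path = os.path.join(db, name)
        open(path, "w").close()
        os.utime(path, (mtime, mtime))  # set deterministic timestamps for the demo
    newest = latest_registered(db)
    print(os.path.basename(newest))  # fx02.reg
```

    A production sentinel would run this check in a loop, remember which files it has already forwarded, and hand each new registration to the fusion and DICOM-send steps.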

  20. Open systems and databases

    SciTech Connect

    Martire, G.S.; Nuttall, D.J.H.

    1993-05-01

    This paper is part of a series of papers invited by the IEEE Power Control Center Working Group concerning the changing designs of modern control centers. Papers invited by the Working Group discuss the following issues: benefits of openness, criteria for evaluating open EMS systems, hardware design, configuration management, security, project management, databases, SCADA, inter- and intra-system communications, and man-machine interfaces. The goal of this paper is to provide an introduction to the issues pertaining to open systems and databases. The intent is to assist understanding of some of the underlying factors that affect the choices that must be made when selecting a database system for use in a control room environment. This paper describes and compares the major database information models in common use for database systems and provides an overview of SQL. A case for the control center community to follow the workings of the non-formal standards bodies is presented, along with possible uses and benefits of commercially available databases within the control center. The reasons behind the emergence of industry-supported standards organizations such as the Open Software Foundation (OSF) and SQL Access are also presented.
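
    As a concrete instance of the relational information model and declarative SQL that the paper surveys, here is a minimal in-memory sketch using SQLite; the tables and values are invented for illustration, not drawn from the paper:

```python
import sqlite3

# Two related tables in the relational model: substations and their alarms.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE substation (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE alarm (
        id INTEGER PRIMARY KEY,
        substation_id INTEGER REFERENCES substation(id),
        severity TEXT
    );
    INSERT INTO substation VALUES (1, 'North'), (2, 'South');
    INSERT INTO alarm VALUES (1, 1, 'high'), (2, 1, 'low'), (3, 2, 'high');
""")

# A declarative SQL query: join the tables and count high-severity alarms
# per substation, leaving the access strategy to the database engine.
rows = con.execute("""
    SELECT s.name, COUNT(*) AS n
    FROM substation s JOIN alarm a ON a.substation_id = s.id
    WHERE a.severity = 'high'
    GROUP BY s.name
    ORDER BY s.name
""").fetchall()
print(rows)  # [('North', 1), ('South', 1)]
```

    The point of the relational model is visible here: the query states *what* is wanted (a join, a filter, a grouping) and not *how* to traverse the stored records.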

  1. Mechanisms and Management of Stress Fractures in Physically Active Persons

    PubMed Central

    Romani, William A.; Gieck, Joe H.; Perrin, David H.; Saliba, Ethan N.; Kahler, David M.

    2002-01-01

    Objective: To describe the anatomy of bone and the physiology of bone remodeling as a basis for the proper management of stress fractures in physically active people. Data Sources: We searched PubMed for the years 1965 through 2000 using the key words stress fracture, bone remodeling, epidemiology, and rehabilitation. Data Synthesis: Bone undergoes a normal remodeling process in physically active persons. Increased stress leads to an acceleration of this remodeling process, a subsequent weakening of bone, and a higher susceptibility to stress fracture. When a stress fracture is suspected, appropriate management of the injury should begin immediately. Effective management includes a cyclic process of activity and rest that is based on the remodeling process of bone. Conclusions/Recommendations: Bone continuously remodels itself to withstand the stresses involved with physical activity. Stress fractures occur as the result of increased remodeling and a subsequent weakening of the outer surface of the bone. Once a stress fracture is suspected, a cyclic management program that incorporates the physiology of bone remodeling should be initiated. The cyclic program should allow the physically active person to remove the source of the stress to the bone, maintain fitness, promote a safe return to activity, and permit the bone to heal properly. PMID:16558676

  2. A Ranking Analysis of the Management Schools in Greater China (2000-2010): Evidence from the SSCI Database

    ERIC Educational Resources Information Center

    Hou, Mingjun; Fan, Peihua; Liu, Heng

    2014-01-01

    The authors rank the management schools in Greater China (including Mainland China, Hong Kong, Taiwan, and Macau) based on their academic publications in the Social Sciences Citation Index management and business journals from 2000 to 2010. Following K. Ritzberger's (2008) and X. Yu and Z. Gao's (2010) ranking method, the authors develop…

  3. ePlantLIBRA: A composition and biological activity database for bioactive compounds in plant food supplements.

    PubMed

    Plumb, J; Lyons, J; Nørby, K; Thomas, M; Nørby, E; Poms, R; Bucchini, L; Restani, P; Kiely, M; Finglas, P

    2016-02-15

    The newly developed ePlantLIBRA database is a comprehensive and searchable database, with up-to-date coherent and validated scientific information on plant food supplement (PFS) bioactive compounds, with putative health benefits as well as adverse effects, and contaminants and residues. It is the only web-based database available compiling peer reviewed publications and case studies on PFS. A user-friendly, efficient and flexible interface has been developed for searching, extracting, and exporting the data, including links to the original references. Data from over 570 publications have been quality evaluated and entered covering 70 PFS or their botanical ingredients.

  4. ePlantLIBRA: A composition and biological activity database for bioactive compounds in plant food supplements.

    PubMed

    Plumb, J; Lyons, J; Nørby, K; Thomas, M; Nørby, E; Poms, R; Bucchini, L; Restani, P; Kiely, M; Finglas, P

    2016-02-15

    The newly developed ePlantLIBRA database is a comprehensive and searchable database, with up-to-date coherent and validated scientific information on plant food supplement (PFS) bioactive compounds, with putative health benefits as well as adverse effects, and contaminants and residues. It is the only web-based database available compiling peer reviewed publications and case studies on PFS. A user-friendly, efficient and flexible interface has been developed for searching, extracting, and exporting the data, including links to the original references. Data from over 570 publications have been quality evaluated and entered covering 70 PFS or their botanical ingredients. PMID:26433297

  5. Assessment of global disease activity in RA patients monitored in the METEOR database: the patient's versus the rheumatologist's opinion.

    PubMed

    Gvozdenović, Emilia; Koevoets, Rosanne; Wolterbeek, Ron; van der Heijde, Désirée; Huizinga, Tom W J; Allaart, Cornelia F; Landewé, Robert B M

    2014-04-01

    The objectives of this study were to compare the patient's (PtGDA) and physician's (PhGDA) assessment of global disease activity and to identify factors that might influence these differences as well as factors that may influence the patient's and the physician's scores separately. Anonymous data were used from 2,117 Dutch patients included in the Measurement of Efficacy of Treatment in the Era of Rheumatology (METEOR) database. PtGDA and PhGDA were scored independently on a 100-mm visual analog scale (VAS) with 0 and 100 as extremes. Agreement (intraclass correlation coefficient, ICC) was calculated and a Bland-Altman plot was created to visualize the differences between PtGDA and PhGDA. Linear mixed model analysis was used to model PtGDA and PhGDA. Logistic repeated measurements were used to model the difference between PtGDA and PhGDA (PtGDA > PhGDA versus PtGDA ≤ PhGDA). Patient gender, physician gender, age, swollen joint count (SJC), tender joint count, VAS pain, disease duration, and erythrocyte sedimentation rate (ESR) were considered as possible determinants in both models. Mean (standard deviation) age was 57 (15) years and 67% of the patients were female. Agreement between PtGDA and PhGDA was moderate (ICC, 0.57). Patients scored on average 11 units higher (worse) than rheumatologists (95% limits of agreement, -25.2 to 47.6). Patient's perception of pain (VAS) was positively associated with a PtGDA being higher than PhGDA. Conversely, ESR and SJC were positively associated with a PtGDA being lower than or equal to the PhGDA. Patients rate global disease activity consistently higher than their rheumatologists. Patients base their judgment primarily on the level of pain, physicians on the level of SJC and ESR.
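
    The Bland-Altman agreement statistics used in the study (mean paired difference and 95% limits of agreement, mean ± 1.96 SD) are simple to compute. A stdlib sketch on made-up PtGDA/PhGDA pairs, not the METEOR data:

    ```python
    from statistics import mean, stdev

    def bland_altman_limits(a, b):
        """Mean paired difference and 95% limits of agreement (mean ± 1.96 SD)."""
        diffs = [x - y for x, y in zip(a, b)]
        bias = mean(diffs)
        spread = 1.96 * stdev(diffs)
        return bias, bias - spread, bias + spread

    # Hypothetical paired global-disease-activity scores on a 0-100 VAS.
    ptgda = [60, 45, 70, 30, 55]
    phgda = [50, 40, 55, 25, 40]
    bias, lower, upper = bland_altman_limits(ptgda, phgda)
    # bias == 10: on average the patient scores 10 units higher than the physician
    ```

    Points in the Bland-Altman plot are then (mean of each pair, difference of each pair), with horizontal lines at `bias`, `lower`, and `upper`.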

  6. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  7. The spectral database Specchio: Data management, data sharing and initial processing of field spectrometer data within the Dimensions of Biodiversity project

    NASA Astrophysics Data System (ADS)

    Hueni, A.; Schweiger, A. K.

    2015-12-01

    Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.

  8. WAX ActiveLibrary: a tool to manage information overload.

    PubMed

    Hanka, R; O'Brien, C; Heathfield, H; Buchan, I E

    1999-11-01

    WAX Active-Library (Cambridge Centre for Clinical Informatics) is a knowledge management system that seeks to support doctors' decision making through the provision of electronic books containing a wide range of clinical knowledge and locally based information. WAX has been piloted in several regions in the United Kingdom and formally evaluated in 17 GP surgeries based in Cambridgeshire. The evaluation has provided evidence that WAX Active-Library significantly improves GPs' access to relevant information sources and, by increasing appropriate patient management and referrals, might also lead to an improvement in clinical outcomes.

  9. Prognostic and health management of active assets in nuclear power plants

    SciTech Connect

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; Rusaw, Richard; Bickford, Randall

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models along with a simulated fault data stream were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.
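
    The asset-fault-signature idea — ranking candidate faults by how well their stored symptom signatures match an observation — can be sketched with a simple set-similarity score. The signatures and symptoms below are invented placeholders, not the FW-PHM Suite's actual models:

    ```python
    def jaccard(a, b):
        """Similarity between two symptom sets: |A ∩ B| / |A ∪ B|."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 0.0

    def diagnose(observed, signature_db):
        """Rank candidate faults by signature match against the observation."""
        scores = {fault: jaccard(observed, symptoms)
                  for fault, symptoms in signature_db.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Toy signature database for a generator step-up transformer.
    signatures = {
        "winding insulation degradation": {"high_h2", "elevated_temp",
                                           "partial_discharge"},
        "cooling system fault": {"elevated_temp", "low_oil_flow"},
    }
    ranked = diagnose({"elevated_temp", "partial_discharge"}, signatures)
    ```

    A prognostic model would then take the top-ranked fault and estimate remaining useful life from its degradation trend, which is the role of the Remaining Useful Life Database in the suite.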

  10. Prognostic and health management of active assets in nuclear power plants

    DOE PAGES

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; Rusaw, Richard; Bickford, Randall

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models along with a simulated fault data stream were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  11. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  12. Hazard Analysis Database Report

    SciTech Connect

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  13. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.

  14. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  15. Redis database administration tool

    SciTech Connect

    Martinez, J. J.

    2013-02-13

    MyRedis is a product of the Lorenz subproject under the ASC Scientific Data Management effort. MyRedis is a web-based utility designed to allow easy administration of instances of Redis databases. It can be used to view and manipulate data as well as run commands directly against a variety of different Redis hosts.

  16. Use of relational database management system by clinicians to create automated MICU progress note from existent data sources.

    PubMed

    Delaney, D P; Zibrak, J D; Samore, M; Peterson, M

    1997-01-01

    We designed and built an application called MD Assist that compiles data from several hospital databases to create reports used for daily house officer rounding in the medical intensive care unit (MICU). After rounding, the report becomes the objective portion of the daily "SOAP" MICU progress note. All data used in the automated note was available in digital format residing in an institution-wide Sybase data repository which had been built to fulfill data needs of the parent enterprise. From initial design of target output through actual creation and implementation in the MICU, MD Assist was created by physicians with only consultative help from information systems (IS). This project demonstrated a method for rapidly developing time-saving, clinically useful applications using a comprehensive clinical data repository.
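
    Compiling the objective portion of a note from several source tables is, at its core, a set of queries joined into a report. A minimal sqlite3 sketch, with invented table names and values standing in for the hospital's Sybase repository:

    ```python
    import sqlite3

    # Toy stand-ins for the separate hospital data sources.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE vitals (mrn TEXT, hr INTEGER, map INTEGER);
        CREATE TABLE labs   (mrn TEXT, name TEXT, value REAL);
        INSERT INTO vitals VALUES ('123', 92, 74);
        INSERT INTO labs VALUES ('123', 'K', 4.1), ('123', 'Cr', 1.2);
    """)

    def objective_section(mrn):
        """Compile the objective portion of a SOAP note from separate tables."""
        hr, mp = conn.execute(
            "SELECT hr, map FROM vitals WHERE mrn = ?", (mrn,)).fetchone()
        labs = conn.execute(
            "SELECT name, value FROM labs WHERE mrn = ? ORDER BY name",
            (mrn,)).fetchall()
        lab_text = ", ".join(f"{n} {v}" for n, v in labs)
        return f"O: HR {hr}, MAP {mp}. Labs: {lab_text}."

    note = objective_section('123')
    ```

    The real application queried an existing enterprise repository rather than local tables, but the report-building pattern is the same.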

  17. Management of Hypertension: Adapting New Guidelines for Active Patients.

    ERIC Educational Resources Information Center

    Tanji, Jeffrey L.; Batt, Mark E.

    1995-01-01

    Discusses recent guidelines on hypertension from the National Heart, Lung, and Blood Institute and details the latest management protocols for patients with high blood pressure. The article helps physicians interpret the guidelines for treating active patients, highlighting diagnosis, step care revision, pharmacology, and sports participation…

  18. Draft position paper on knowledge management in space activities

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne; Moura, Denis

    2003-01-01

    Like other fields of industry, space activities are facing the challenge of Knowledge Management, and in 2002 the International Academy of Astronautics decided to establish a Study Group to analyse the problem and issue general guidelines. This communication presents the draft position paper of this group, to be discussed during the 2003 IAF Congress.

  19. Moodog: Tracking Student Activity in Online Course Management Systems

    ERIC Educational Resources Information Center

    Zhang, Hangjin; Almeroth, Kevin

    2010-01-01

    Many universities are currently using Course Management Systems (CMSes) to conduct online learning, for example, by distributing course materials or submitting homework assignments. However, most CMSes do not include comprehensive activity tracking and analysis capabilities. This paper describes a method to track students' online learning…

  20. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.

    2012-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience are diverse and include research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL) where non-proprietary BAID data can be freely downloaded. BAID has been used to: Optimize research site choice; Reduce duplication of science effort; Discover complementary and potentially detrimental research activities in an area of scientific interest; Re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; Exchange knowledge across disciplines and generations; Facilitate communication between western science and traditional ecological knowledge; Provide local residents access to science data that facilitates adaptation to arctic change; (and) Educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1. 
Collecting data on research

  1. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    SciTech Connect

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W.; Senkpeil, Ryan R.; Tlatov, Andrey G.; Nagovitsyn, Yury A.; Pevtsov, Alexei A.; Chapman, Gary A.; Cookson, Angela M.; Yeates, Anthony R.; Watson, Fraser T.; Balmaceda, Laura A.; DeLuca, Edward E.; Martens, Petrus C. H.

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions—where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
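
    A composite distribution of the kind described — a Weibull body blending into a log-normal tail — can be mimicked by a two-component mixture sampler. The mixing fraction and distribution parameters below are arbitrary placeholders for illustration, not the fitted values from the paper:

    ```python
    import random

    def sample_composite(n, frac_weibull=0.7, seed=42):
        """Draw n samples from a Weibull / log-normal mixture."""
        rng = random.Random(seed)
        out = []
        for _ in range(n):
            if rng.random() < frac_weibull:
                # Weibull(scale alpha, shape beta): the small-flux component.
                out.append(rng.weibullvariate(1.0, 0.8))
            else:
                # Log-normal(mu, sigma of underlying normal): the large-flux tail.
                out.append(rng.lognormvariate(2.0, 1.0))
        return out

    fluxes = sample_composite(1000)
    ```

    Fitting such a mixture to observed areas or fluxes (e.g. by maximum likelihood over both components plus the mixing fraction) is the inverse problem the paper solves across its 11 databases.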

  2. Active surveillance for the management of localized prostate cancer: Guideline recommendations

    PubMed Central

    Morash, Chris; Tey, Rovena; Agbassi, Chika; Klotz, Laurence; McGowan, Tom; Srigley, John; Evans, Andrew

    2015-01-01

    Introduction: The objective is to provide guidance on the role of active surveillance (AS) as a management strategy for low-risk prostate cancer patients and to ensure that AS is offered to appropriate patients assessed by a standardized protocol. Prostate cancer is often a slowly progressive or sometimes non-progressive indolent disease diagnosed at an early stage with localized tumours that are unlikely to cause morbidity or death. Standard active treatments for prostate cancer include radiotherapy (RT) or radical prostatectomy (RP), but the harms from overdiagnosis and overtreatment are of significant concern. AS is increasingly being considered as a management strategy to avoid or delay the potential harms caused by unnecessary radical treatment. Methods: A literature search of MEDLINE, EMBASE, the Cochrane library, guideline databases and relevant meeting proceedings was performed and a systematic review of identified evidence was synthesized to make recommendations relating to the role of AS in the management of localized prostate cancer. Results: No existing guidelines or reviews were suitable for use in the synthesis of evidence for the recommendations, but 59 reports of primary studies were identified. Due to studies being either non-comparative or heterogeneous, pooled meta-analyses were not conducted. Conclusion: The working group concluded that for patients with low-risk (Gleason score ≤6) localized prostate cancer, AS is the preferred disease management strategy. Active treatment (RP or RT) is appropriate for patients with intermediate-risk (Gleason score 7) localized prostate cancer. For select patients with low-volume Gleason 3+4=7 localized prostate cancer, AS can be considered. PMID:26225165

  3. Management and climate contributions to satellite-derived active fire trends in the contiguous United States

    NASA Astrophysics Data System (ADS)

    Lin, Hsiao-Wen; McCarty, Jessica L.; Wang, Dongdong; Rogers, Brendan M.; Morton, Douglas C.; Collatz, G. James; Jin, Yufang; Randerson, James T.

    2014-04-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001-2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001-2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems.
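
    Annual trends such as the reported 3.4% per year increase are commonly estimated by regressing log fire counts on year. A sketch with synthetic counts (the data are fabricated to grow at exactly 3.4% per year; `statistics.linear_regression` requires Python 3.10+):

    ```python
    import math
    from statistics import linear_regression

    def pct_trend_per_year(years, counts):
        """Percent change per year from a log-linear least-squares fit."""
        slope, _intercept = linear_regression(
            years, [math.log(c) for c in counts])
        return (math.exp(slope) - 1.0) * 100.0

    # Synthetic active-fire detections growing 3.4% per year over 2001-2010.
    years = list(range(2001, 2011))
    counts = [1000 * 1.034 ** (y - 2001) for y in years]
    trend = pct_trend_per_year(years, counts)  # ≈ 3.4
    ```

    On real detection counts the fit would be noisy, and the paper's trend would come with a significance test; the log-linear slope is only the point estimate.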

  4. Management and climate contributions to satellite-derived active fire trends in the contiguous United States

    PubMed Central

    Lin, Hsiao-Wen; McCarty, Jessica L; Wang, Dongdong; Rogers, Brendan M; Morton, Douglas C; Collatz, G James; Jin, Yufang; Randerson, James T

    2014-01-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001–2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001–2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems. Key Points: Wildland, cropland, and prescribed fires had different trends and patterns. Sensitivity to climate varied with fire type. Intensity of air quality regulation influenced cropland burning trends. PMID:26213662

  5. The compatibility of general managers' activities and intentions in managing change in the NHS.

    PubMed

    Spurgeon, P; Barwell, F

    1990-03-01

    As Hales (1986) has observed, the problem of much of the managerial research to date has been the reluctance to ask why managers behave in the way they do. The behaviour of general managers in tackling organisational change in the NHS needs to be viewed not only with respect to what is done but also with respect to how personal and organisational objectives are construed. In other words, the implementation of organisational change ultimately rests on how general managers perceive the nature of this change and their role in structuring their own personal and organisational objectives into appropriate activities. Examining the compatibility of managerial activities and the underlying values and intentions which support them is of critical importance in any cognitively-based approach. These intentions provide an important link between perceptions (i.e. how the organisation is construed) and behaviour (i.e. what activities managers choose to perform). Understanding the conceptual frameworks which underpin managerial activities could have profound implications for assessing the performance of general managers. PMID:10104281

  7. Drinking Water Database

    NASA Technical Reports Server (NTRS)

    Murray, ShaTerea R.

    2004-01-01

    This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team (CS&AT). This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project, which covers Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what, if anything, they would change. After collecting this feedback we both took some courses in Microsoft Access in order to fix the problems. Next we looked at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.

  8. Management of hypertension by reduction in sympathetic activity.

    PubMed

    Mathias, C J

    1991-04-01

    The sympathetic nervous system may initiate or maintain hypertension, and a range of approaches that reduce sympathetic activity is often of value in management. These may include nonpharmacological methods, such as the various forms of behavioral therapy (e.g., meditation, relaxation, and biofeedback techniques); weight reduction and avoidance of particular foods and agents that stimulate sympathetic activity (including caffeine and alcohol); and regular physical exercise. Pharmacological therapy includes centrally acting drugs such as alpha-methyldopa, clonidine, and reserpine; ganglionic blockers such as hexamethonium; agents acting on sympathetic nerve terminals such as guanethidine and debrisoquine; and drugs that may act at multiple sites, such as the beta-adrenergic blockers. The role of reducing sympathetic activity in the current management of hypertension and its complications is considered in this overview.

  9. Solubility Database

    National Institute of Standards and Technology Data Gateway

    SRD 106 IUPAC-NIST Solubility Database (Web, free access)   These solubilities are compiled from 18 volumes of the International Union of Pure and Applied Chemistry (IUPAC)-NIST Solubility Data Series. The database includes liquid-liquid, solid-liquid, and gas-liquid systems. Typical solvents and solutes include water, seawater, heavy water, inorganic compounds, and a variety of organic compounds such as hydrocarbons, halogenated hydrocarbons, alcohols, acids, esters, and nitrogen compounds. There are over 67,500 solubility measurements and over 1800 references.

  10. SECONDARY WASTE MANAGEMENT STRATEGY FOR EARLY LOW ACTIVITY WASTE TREATMENT

    SciTech Connect

    Crawford, TW

    2008-07-17

    This study evaluates parameters relevant to River Protection Project secondary waste streams generated during Early Low Activity Waste operations and recommends a strategy for secondary waste management that considers groundwater impact, cost, and programmatic risk. The recommended strategy for managing River Protection Project secondary waste is focused on improvements in the Effluent Treatment Facility. Baseline plans to build a Solidification Treatment Unit adjacent to the Effluent Treatment Facility should be enhanced to improve solid waste performance and mitigate corrosion of tanks and piping supporting the Effluent Treatment Facility evaporator. This approach provides a life-cycle benefit to solid waste performance and reduction of groundwater contaminants.

  11. Waste management and technologies analytical database project for Los Alamos National Laboratory/Department of Energy. Final report, June 7, 1993--June 15, 1994

    SciTech Connect

    1995-04-17

    The Waste Management and Technologies Analytical Database System (WMTADS), supported by the Department of Energy's (DOE) Office of Environmental Management (EM), Office of Technology Development (EM-50), was developed and based at the Los Alamos National Laboratory (LANL), Los Alamos, New Mexico, to collect, identify, organize, track, update, and maintain information on existing, available, developing, and planned technologies to characterize, treat, and handle mixed, hazardous, and radioactive waste for storage and disposal, in support of EM strategies and goals and of focus area projects. WMTADS was developed as a centralized source of online information on technologies for environmental management processes. It can be accessed by computer, modem, phone line, and communications software through a Local Area Network (LAN) and by server connectivity on the Internet; with file transfer protocol (FTP), files can also be transferred from the server to the user's computer, and the system can be browsed over the World Wide Web (WWW) using Mosaic.

  12. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the option to estimate GFR by the plasma sampling method as well as by SrCrM. We used Microsoft Windows(®) as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access(®) as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine are done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate GFR by various plasma sampling methods and blood parameters, and a good system for storing the raw and processed data for future analysis. PMID:26097422
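
    Of the serum-creatinine formulas this record names, Cockcroft-Gault is compact enough to sketch in code. The function below is a generic illustration of the published formula, not the authors' Visual Basic implementation; the function and parameter names are hypothetical.

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, is_female):
    """Estimate creatinine clearance (mL/min) with the Cockcroft-Gault formula.

    CrCl = ((140 - age) * weight) / (72 * SCr), multiplied by 0.85 for females.
    """
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    if is_female:
        crcl *= 0.85
    return crcl
```

    For example, a 40-year-old, 72 kg male with a serum creatinine of 1.0 mg/dL gives an estimate of 100.0 mL/min.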

  15. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    SciTech Connect

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.

    1994-10-01

    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (Kd values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.

  16. Nuclear Science References Database

    SciTech Connect

    Pritychenko, B.; Běták, E.; Singh, B.; Totans, J.

    2014-06-15

    The Nuclear Science References (NSR) database, together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr).

  17. Database-assisted promoter analysis.

    PubMed

    Hehl, R; Wingender, E

    2001-06-01

    The analysis of regulatory sequences is greatly facilitated by database-assisted bioinformatic approaches. The TRANSFAC database contains information on transcription factors and their origins, functional properties and sequence-specific binding activities. Software tools enable us to screen the database with a given DNA sequence for interacting transcription factors. If a regulatory function is already attributed to this sequence then the database-assisted identification of binding sites for proteins or protein classes and subsequent experimental verification might establish functionally relevant sites within this sequence. The binding transcription factors and interacting factors might already be present in the database.
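
    At its core, screening a DNA sequence for candidate transcription-factor binding sites means matching factor motifs against the sequence. The sketch below is a minimal, hypothetical illustration using IUPAC consensus strings; it does not use the actual TRANSFAC data model or software tools, and the motif in the usage example is a textbook TATA-box consensus chosen for illustration.

```python
# IUPAC ambiguity codes map each consensus letter to the bases it allows.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "W": "AT", "S": "CG",
    "K": "GT", "M": "AC", "N": "ACGT",
}

def scan_for_sites(sequence, motifs):
    """Return (factor, position, site) for every consensus match in sequence.

    `motifs` maps a factor name to an IUPAC consensus string; the factor
    names and consensus strings supplied by the caller are illustrative.
    """
    hits = []
    seq = sequence.upper()
    for factor, consensus in motifs.items():
        m = len(consensus)
        for i in range(len(seq) - m + 1):
            window = seq[i:i + m]
            if all(base in IUPAC[c] for base, c in zip(window, consensus)):
                hits.append((factor, i, window))
    return hits

# Example: scan for a TATA-box-like consensus (TATAWAW).
print(scan_for_sites("GGTATAAATGC", {"TATA-box": "TATAWAW"}))
```

    Real database-assisted tools replace these consensus strings with curated position weight matrices and score thresholds, but the matching loop has the same shape.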

  18. BAID: The Barrow Area Information Database - An Interactive Web Mapping Portal and Cyberinfrastructure Showcasing Scientific Activities in the Vicinity of Barrow, Arctic Alaska.

    NASA Astrophysics Data System (ADS)

    Escarzaga, S. M.; Cody, R. P.; Kassin, A.; Barba, M.; Gaylord, A. G.; Manley, W. F.; Mazza Ramsay, F. D.; Vargas, S. A., Jr.; Tarin, G.; Laney, C. M.; Villarreal, S.; Aiken, Q.; Collins, J. A.; Green, E.; Nelson, L.; Tweedie, C. E.

    2015-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940's and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Additionally, data are described with metadata that meet Federal Geographic Data Committee standards. Recent advances include the addition of more than 2000 new research sites, the addition of a query builder user interface allowing rich and complex queries, and provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists. Recent field surveys include over 80 miles of coastline to document rates of erosion and the collection of high-resolution sonar data for bathymetric mapping of Elson Lagoon and the near shore region of the Chukchi Sea. A network of five climate stations has been deployed across the peninsula to serve as a wireless net for the research community and to deliver near real time climatic data to the user community. Local GIS personnel have also been trained to make better use of scientific data for local decision making.
Links to Barrow area datasets are housed at national data archives and substantial upgrades have

  19. JDD, Inc. Database

    NASA Technical Reports Server (NTRS)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and the contract management of these various services for the center. As safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues relating to job safety (Environmental Protection Agency issues, workers' compensation, and safety and health training). My summer assignment was not considered "groundbreaking research" like many other summer interns have done in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (training field index, and which employees were present at or absent from these courses). Once I completed this phase of the database, I decided to expand it and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day to day operations and been adding the

  20. DESIGN AND PERFORMANCE OF A XENOBIOTIC METABOLISM DATABASE MANAGER FOR METABOLIC SIMULATOR ENHANCEMENT AND CHEMICAL RISK ANALYSIS

    EPA Science Inventory

    A major uncertainty that has long been recognized in evaluating chemical toxicity is accounting for metabolic activation of chemicals resulting in increased toxicity. In silico approaches to predict chemical metabolism and to subsequently screen and prioritize chemicals for risk ...

  1. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  2. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. PMID:23749755
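
    Among the database management systems the unit compares, a relational engine is the quickest to demonstrate. The sketch below uses Python's built-in sqlite3 module with a made-up insertional-mutagenesis strain catalog; the table schema, strain names, and loci are illustrative assumptions, not taken from the unit.

```python
import sqlite3

# In-memory relational database: no server needed, handy for small catalogs.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE strains (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        insertion_locus TEXT
    )
""")
con.executemany(
    "INSERT INTO strains (name, insertion_locus) VALUES (?, ?)",
    [("mut-101", "unc-22"), ("mut-102", "dpy-10"), ("mut-103", "unc-22")],
)

# Declarative query: every strain carrying an insertion at a given locus.
rows = con.execute(
    "SELECT name FROM strains WHERE insertion_locus = ? ORDER BY name",
    ("unc-22",),
).fetchall()
print([name for (name,) in rows])  # ['mut-101', 'mut-103']
```

    The same catalog kept as flat files would require a hand-written scan for each such question; the relational model trades setup effort for ad hoc queryability, which is the trade-off the unit's guidelines turn on.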

  4. Waste Management Policy Framework to Mitigate Terrorist Intrusion Activities

    SciTech Connect

    Redus, Kenneth, S.

    2003-02-26

    A policy-directed framework is developed to support US Department of Energy (DOE) counterterrorism efforts, specifically against terrorist intrusion activities that affect Environmental Management (EM) programs. The framework is called the Security Effectiveness and Resource Allocation Definition Forecasting and Control System (SERAD-FACS). Use of SERAD-FACS allows trade-offs between resources, technologies, risk, and Research and Development (R&D) efforts to mitigate such intrusion attempts. Core to SERAD-FACS are (1) an understanding of the perspectives and time horizons of key decisionmakers and organizations, (2) a determination of site vulnerabilities and accessibilities, and (3) quantification of the measures that describe the risk associated with a compromise of EM assets. The innovative utility of SERAD-FACS is illustrated for three integrated waste management and security strategies. EM program risks, time delays, and security effectiveness are examined to demonstrate the significant cost and schedule impact terrorist activities can have on cleanup efforts in the DOE complex.

  5. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    ERIC Educational Resources Information Center

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  6. United States-Russia: Environmental management activities, Summer 1998

    SciTech Connect

    1998-09-01

    A Joint Coordinating Committee for Environmental Restoration and Waste Management (JCCEM) was formed between the US and Russia. This report describes the areas of research being studied under JCCEM, namely: Efficient separations; Contaminant transport and site characterization; Mixed wastes; High level waste tank remediation; Transuranic stabilization; Decontamination and decommissioning; and Emergency response. Other sections describe: Administrative framework for cooperation; Scientist exchange; Future actions; Non-JCCEM DOE-Russian activities; and JCCEM publications.

  7. Weight Management for Athletes and Active Individuals: A Brief Review.

    PubMed

    Manore, Melinda M

    2015-11-01

    Weight management for athletes and active individuals is unique because of their high daily energy expenditure; thus, the emphasis is usually placed on changing the diet side of the energy balance equation. When dieting for weight loss, active individuals also want to preserve lean tissue, which means that energy restriction cannot be too severe or lean tissue is lost. First, this brief review addresses the issues of weight management in athletes and active individuals and factors to consider when determining a weight-loss goal. Second, the concept of dynamic energy balance is reviewed, including two mathematical models developed to improve weight-loss predictions based on changes in diet and exercise. These models are now available on the Internet. Finally, dietary strategies for weight loss/maintenance that can be successfully used with active individuals are given. Emphasis is placed on teaching the benefits of consuming a low-ED diet (e.g., high-fiber, high-water, low-fat foods), which allows for the consumption of a greater volume of food to increase satiety while reducing energy intake. Health professionals and sport dietitians need to understand dynamic energy balance and be prepared with effective and evidence-based dietary approaches to help athletes and active individuals achieve their body-weight goals. PMID:26553496

  9. Drinking Water Treatability Database (Database)

    EPA Science Inventory

    The drinking Water Treatability Database (TDB) will provide data taken from the literature on the control of contaminants in drinking water, and will be housed on an interactive, publicly-available USEPA web site. It can be used for identifying effective treatment processes, rec...

  10. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Dezaki, Kyoko; Saeki, Makoto

    Rapid progress in information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been engaged in building databases of patent information, technical reports, and other documents accumulated inside the company. Two systems have resulted: TOPICS, an in-house patent information management system, and TOMATIS, a management and technical information system built on personal computers and general-purpose relational database software. These systems aim to compile databases of patent and technological management information generated internally and externally, at low labor effort and low cost, and to provide comprehensive information company-wide. This paper introduces the outline of these systems and how they are actually used.

  11. FishTraits Database

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2009-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Descriptions of attributes are available here. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.

  12. MAKER2: an annotation pipeline and genome-database management tool for second-generation genome projects

    PubMed Central

    2011-01-01

    Background Second-generation sequencing technologies are precipitating major shifts with regards to what kinds of genomes are being sequenced and how they are annotated. While the first generation of genome projects focused on well-studied model organisms, many of today's projects involve exotic organisms whose genomes are largely terra incognita. This complicates their annotation, because unlike first-generation projects, there are no pre-existing 'gold-standard' gene-models with which to train gene-finders. Improvements in genome assembly and the wide availability of mRNA-seq data are also creating opportunities to update and re-annotate previously published genome annotations. Today's genome projects are thus in need of new genome annotation tools that can meet the challenges and opportunities presented by second-generation sequencing technologies. Results We present MAKER2, a genome annotation and data management tool designed for second-generation genome projects. MAKER2 is a multi-threaded, parallelized application that can process second-generation datasets of virtually any size. We show that MAKER2 can produce accurate annotations for novel genomes where training-data are limited, of low quality or even non-existent. MAKER2 also provides an easy means to use mRNA-seq data to improve annotation quality; and it can use these data to update legacy annotations, significantly improving their quality. We also show that MAKER2 can evaluate the quality of genome annotations, and identify and prioritize problematic annotations for manual review. Conclusions MAKER2 is the first annotation engine specifically designed for second-generation genome projects. MAKER2 scales to datasets of any size, requires little in the way of training data, and can use mRNA-seq data to improve annotation quality. It can also update and manage legacy genome annotation datasets. PMID:22192575

  13. Automating The Work at The Skin and Allergy Private Clinic : A Case Study on Using an Imaging Database to Manage Patients Records

    NASA Astrophysics Data System (ADS)

    Alghalayini, Mohammad Abdulrahman

Today, many institutions and organizations face a serious problem caused by the tremendously increasing volume of documents, which in turn aggravates storage and retrieval problems as space and efficiency requirements grow. The problem becomes more complex over time as the size and number of documents in an organization increase; there is therefore a growing worldwide demand to address it. This demand can be met by converting the large volume of paper documents to images, using a process that enables document-imaging specialists to select the most suitable image type and scanning resolution when document images need to be stored. Such a document management process, where applied, solves the problem of image storage type and size to some extent. In this paper, we present a case study of an applied process for managing the registration of new patients in a private clinic and for optimizing the follow-up of registered patients once their information records are stored in an imaging database system; through this automation approach, we streamline the work process and maximize the efficiency of the Skin and Allergy Clinic's tasks.


  14. Database management research for the Human Genome Project. Final progress report for period: 02/01/99 - 06/14/00

    SciTech Connect

    Bult, Carol J.

    1999-11-01

    The MouseBLAST server allows researchers to search a sequence within mouse/rodent sequence databases to find matching sequences that may be associated with mouse genes. Query results may be linked to gene detail records in the Mouse Genome Database (MGD). Searches are performed using WU-BLAST 2.0. All sequence databases are updated on a weekly basis.
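A search service of this kind typically returns tabular hit lists that must be filtered before hits are linked to gene records. The sketch below parses the common 12-column BLAST tabular format and keeps confident hits; the field layout follows that convention, but the thresholds and sample rows are illustrative assumptions, not MouseBLAST's actual pipeline (which used WU-BLAST 2.0):

```python
# Parse 12-column BLAST tabular output (query, subject, %identity,
# alignment length, mismatches, gap opens, q.start, q.end, s.start,
# s.end, e-value, bit score) and keep confident hits only.

def parse_blast_tabular(lines, max_evalue=1e-5, min_identity=90.0):
    hits = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment rows
        f = line.split("\t")
        hit = {
            "query": f[0],
            "subject": f[1],
            "identity": float(f[2]),
            "evalue": float(f[10]),
            "bitscore": float(f[11]),
        }
        if hit["evalue"] <= max_evalue and hit["identity"] >= min_identity:
            hits.append(hit)
    # Best hits first, as a curator linking results to gene records would want.
    return sorted(hits, key=lambda h: h["bitscore"], reverse=True)

if __name__ == "__main__":
    sample = [
        "q1\tNM_0001\t98.5\t500\t7\t0\t1\t500\t1\t500\t1e-120\t450",
        "q1\tNM_0002\t75.0\t300\t70\t3\t1\t300\t1\t300\t1e-30\t120",
    ]
    for h in parse_blast_tabular(sample):
        print(h["subject"], h["evalue"])
```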

  15. Carbon sink activity and GHG budget of managed European grasslands

    NASA Astrophysics Data System (ADS)

    Klumpp, Katja; Herfurth, Damien; Soussana, Jean-Francois; Fluxnet Grassland Pi's, European

    2013-04-01

In agriculture, a large proportion (89%) of the greenhouse gas (GHG) emission saving potential may be achieved by means of soil C sequestration. Recent assessments of the carbon sink activity of European ecosystems, however, have often called the existence of C-storing grasslands into question: although a net sink of C was observed, the uncertainty surrounding the estimate was larger than the sink itself (Janssens et al., 2003; Schulze et al., 2009). Then again, some of these estimates were based on a small number of measurements, and on models. Not surprisingly, there is still a paucity of studies demonstrating the existence of grassland systems where C sequestration exceeds (in CO2 equivalents) the methane emissions from the enteric fermentation of ruminants and the nitrous oxide emissions from managed soils. Grasslands are heavily relied upon for food and forage production. A key component of the carbon sink activity in grasslands is thus the impact of changes in management practices, or the effects of past and recent management such as intensification, as well as of climate (and climate variation). We analysed data (i.e. flux, ecological, management and soil organic carbon) from a network of 36 European grassland flux observation sites. These sites covered different types and intensities of management, and offered the opportunity to understand grassland carbon cycling and trade-offs between C sinks and CH4 and N2O emissions. For some sites, assessments of carbon sink activity were compared using two methods: repeated soil inventory, and determination of the ecosystem C budget by continuous measurement of CO2 exchange in combination with quantification of other C imports and exports (net C storage, NCS). In general, grasslands were a potential sink of C, with 60±12 g C/m2.yr (median; min -456; max 645). 
Grazed sites had a higher NCS than cut sites (median 99 vs 67 g C/m2.yr), while permanent grassland sites tended to have a lower NCS than temporary sown grasslands (median 64 vs
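The second method mentioned in the abstract, the ecosystem C budget, amounts to summing the net CO2 exchange with lateral carbon imports and exports. A schematic sketch follows; the sign convention, flux terms and numbers are illustrative assumptions, not the study's exact accounting or site data:

```python
def net_c_storage(nee_co2, c_import, c_export):
    """Net carbon storage (NCS) in g C / m2 / yr.

    nee_co2  : net ecosystem CO2 exchange, counted as C gained by the plot
    c_import : lateral C imports, e.g. manure or slurry applications
    c_export : lateral C exports, e.g. harvested forage, milk, CH4-C
    Positive NCS means the grassland is acting as a C sink.
    """
    return nee_co2 + c_import - c_export

# Illustrative values only, chosen to land near the reported median:
ncs = net_c_storage(nee_co2=150.0, c_import=20.0, c_export=110.0)
print(ncs)  # 60.0 g C / m2 / yr
```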

  16. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Senoo, Tetsuo

As computer and communication technologies have progressed, many corporations have placed the construction and use of their own databases at the center of their information activities, and aim to develop those activities in new directions. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes which in-house databases should be constructed and utilized, from the viewpoints of the requirements to be met, the types and forms of information handled, indexing, use type and frequency, evaluation method, and so on. The author outlines Matsushita's information system, called MATIS (Matsushita Technical Information System), as a concrete example, and describes the present status and some points to keep in mind when constructing and utilizing the REP, BOOK and SYMP databases.

  17. Waste management activities and carbon emissions in Africa

    SciTech Connect

    Couth, R.; Trois, C.

    2011-01-15

This paper summarizes research into waste management activities and carbon emissions from territories in sub-Saharan Africa, with the main objective of quantifying the emission reductions (ERs) that can be gained through viable improvements to waste management in Africa. It demonstrates that data on waste and carbon emissions are poor and generally inadequate for prediction models. The paper shows that the amount of waste produced and its composition are linked to national Gross Domestic Product (GDP). Waste production per person is around half that of developed countries, with a mean of around 230 kg/hd/yr. Sub-Saharan territories produce waste with a biogenic carbon content of around 56% (+/-25%), which is approximately 40% greater than in developed countries. This waste is disposed of in uncontrolled dumps that produce large amounts of methane gas. Greenhouse gas (GHG) emissions from waste will rise with increasing urbanization and can only be controlled through funding mechanisms from developed countries.

  18. Prescribing physical activity to prevent and manage gestational diabetes

    PubMed Central

    Colberg, Sheri R; Castorino, Kristin; Jovanovič, Lois

    2013-01-01

    Gestational diabetes mellitus (GDM) is the most prevalent metabolic disorder during pregnancy. Women diagnosed with GDM have a substantially greater risk of developing type 2 diabetes within 5-10 years after delivery, and the risk is increased by excess body weight. Uncontrolled hyperglycemia during pregnancy is potentially harmful to both mother and fetus, resulting in a greater need for Caesarian-section deliveries, delivery of larger infants with more excess body fat, a greater risk of infant death and stillbirth, and an elevated risk of infant hypoglycemia immediately after birth. Fortunately, engaging in physical activity prior to and during pregnancy may lower the risk of developing GDM. Pregnant women should also be advised how to safely increase their physical activity during pregnancy and the postpartum period. An initial approach to becoming more physically active can simply be to encourage women to incorporate more unstructured physical activity into daily living, both before and during pregnancy. Giving women an appropriate exercise prescription can encourage them to participate in physical activity safely and effectively throughout pregnancy to prevent and/or manage GDM. Engaging in 30 min of moderate intensity physical activity on most, if not all, days of the week has been adopted as a recommendation for all pregnant women. PMID:24379915

  19. Pollution effects on fisheries — potential management activities

    NASA Astrophysics Data System (ADS)

    Sindermann, C. J.

    1980-03-01

Management of ocean pollution must be based on the best available scientific information, with adequate consideration of economic, social, and political realities. Unfortunately, the best available scientific information about pollution effects on fisheries is often fragmentary, and often conjectural; therefore a primary concern of management should be a critical review and assessment of available factual information about effects of pollutants on fish and shellfish stocks. A major problem in any such review and assessment is the separation of pollutant effects from the effects of all the other environmental factors that influence survival and well-being of marine animals. Data from long-term monitoring of resource abundance, and from monitoring of all determinant environmental variables, will be required for analyses that lead to resolution of the problem. Information must also be acquired about fluxes of contaminants through resource-related ecosystems, and about contaminant effects on resource species as demonstrated in field and laboratory experiments. Other possible management activities include: (1) encouragement of continued efforts to document clearly the localized and general effects of pollution on living resources; (2) continued pressure to identify and use reliable biological indicators of environmental degradation (indicators of choice at present are: unusually high levels of genetic and other anomalies in the earliest life history stages; presence of pollution-associated disease signs, particularly fin erosion and ulcers, in fish; and biochemical/physiological changes); and (3) major efforts to reduce inputs of pollutants clearly demonstrated to be harmful to living resources, from point sources as well as ocean dumping. Such pollution management activities, based on continuous efforts in stock assessment, environmental assessment, and experimental studies, can help to ensure that rational decisions will be made about uses and abuses of coastal

  20. Management plan for Facility Effluent Monitoring Plan activities

    SciTech Connect

    Nickels, J.M.; Pratt, D.R.

    1991-08-01

The DOE/RL 89-19, United States Department of Energy-Richland Operations Office Environmental Protection Implementation Plan (1989), requires the Hanford Site to prepare an Environmental Monitoring Plan (EMP) by November 9, 1991. The DOE/EH-0173T, Environmental Regulatory Guide for Radiological Effluent Monitoring and Environmental Surveillance (1991), provides additional guidance and requires implementation of the EMP within 36 months of the effective date of the rule. DOE Order 5400.1, General Environmental Protection Program, requires each US Department of Energy (DOE) site, facility, or activity that uses, generates, releases, or manages significant quantities of hazardous materials to prepare an EMP. This EMP is to identify and discuss two major activities: (1) effluent monitoring and (2) environmental surveillance. At the Hanford Site, the site-wide EMP will consist of the following elements: (1) a conceptual plan addressing effluent monitoring and environmental surveillance; (2) the Pacific Northwest Laboratory (PNL) site-wide environmental surveillance program; (3) the Westinghouse Hanford Company (Westinghouse Hanford) effluent monitoring program, consisting of the near-field operations environmental monitoring activities and abstracts of each Facility Effluent Monitoring Plan (FEMP). This management plan addresses the third of these three elements of the EMP, the FEMPs.

  1. Energy management and control of active distribution systems

    NASA Astrophysics Data System (ADS)

    Shariatzadeh, Farshid

Advancements in communication, control, computation and information technologies have driven the transition to the next generation of active power distribution systems. Novel control techniques and management strategies are required to achieve an efficient, economic and reliable grid. The focus of this work is the energy management and control of active distribution systems (ADS) with integrated renewable energy sources (RESs) and demand response (DR). Here, ADS means an automated distribution system with remotely operated controllers and distributed energy resources (DERs). DERs, the active part of the next-generation distribution system, include distributed generation (DG), RESs, energy storage systems (ESS), plug-in hybrid electric vehicles (PHEVs) and DR. Integration of DR and RESs into ADS is critical to realizing the vision of sustainability. The objective of this dissertation is the development of a management architecture to control and operate ADS in the presence of DR and RESs. One of the most challenging issues in operating ADS is the inherent uncertainty of DR and RESs, as well as the conflicting objectives of DERs and electric utilities. An ADS can consist of different layers, such as a system layer and a building layer, and coordination between these layers is essential. To address these challenges, a multi-layer energy management and control architecture with robust algorithms is proposed in this work. The first layer of the proposed multi-layer architecture is implemented at the system level: a developed AC optimal power flow (AC-OPF) generates a fair price for all DR and non-DR loads, which is used as a control signal for the second layer. The second layer controls DR loads in buildings using a developed look-ahead robust controller. A load aggregator collects information from all buildings and sends the aggregated load to the system optimizer. Because the two management layers operate on different time scales, a time coordination scheme is developed. Robust and deterministic controllers
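The two-layer idea in this abstract, a system-level price signal feeding a building-level demand-response rule, can be caricatured in a few lines. The merit-order pricing and the threshold curtailment rule below are deliberate simplifications for illustration, not the dissertation's AC-OPF or look-ahead robust controller:

```python
def clearing_price(demand_mw, generators):
    """System layer: price = marginal cost of the last unit dispatched.

    generators: list of (capacity_mw, marginal_cost) pairs, any order.
    """
    remaining = demand_mw
    for cap, cost in sorted(generators, key=lambda g: g[1]):  # cheapest first
        remaining -= cap
        if remaining <= 0:
            return cost
    raise ValueError("demand exceeds total capacity")

def dr_response(load_mw, flexible_mw, price, threshold):
    """Building layer: curtail the flexible share of load when price is high."""
    return load_mw - flexible_mw if price > threshold else load_mw

gens = [(50, 20.0), (50, 35.0), (30, 80.0)]        # (MW, $/MWh)
p = clearing_price(demand_mw=90, generators=gens)  # 35.0 $/MWh: second unit is marginal
print(p, dr_response(load_mw=10, flexible_mw=2, price=p, threshold=30.0))
```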

  2. Annual Review of Database Developments 1991.

    ERIC Educational Resources Information Center

    Basch, Reva

    1991-01-01

    Review of developments in databases highlights a new emphasis on accessibility. Topics discussed include the internationalization of databases; databases that deal with finance, drugs, and toxic waste; access to public records, both personal and corporate; media online; reducing large files of data to smaller, more manageable files; and…

  3. Development of soybean gene expression database (SGED)

    Technology Transfer Automated Retrieval System (TEKTRAN)

Large volumes of microarray expression data are a challenge for analysis. To address this problem, a web-based database, the Soybean Gene Expression Database (SGED), was built using PERL/CGI, C and an ORACLE database management system. SGED contains three components. The Data Mining component serves as a repos...

  4. Data Extraction and Management in Networks of Observational Health Care Databases for Scientific Research: A Comparison of EU-ADR, OMOP, Mini-Sentinel and MATRICE Strategies

    PubMed Central

    Gini, Rosa; Schuemie, Martijn; Brown, Jeffrey; Ryan, Patrick; Vacchi, Edoardo; Coppola, Massimo; Cazzola, Walter; Coloma, Preciosa; Berni, Roberto; Diallo, Gayo; Oliveira, José Luis; Avillach, Paul; Trifirò, Gianluca; Rijnbeek, Peter; Bellentani, Mariadonata; van Der Lei, Johan; Klazinga, Niek; Sturkenboom, Miriam

    2016-01-01

Introduction: We see increased use of existing observational data in order to achieve fast and transparent production of empirical evidence in health care research. Multiple databases are often used to increase power, to assess rare exposures or outcomes, or to study diverse populations. For privacy and sociological reasons, original data on individual subjects cannot be shared, requiring a distributed network approach in which data processing is performed prior to data sharing. Case Descriptions and Variation Among Sites: We created a conceptual framework distinguishing three steps in local data processing: (1) data reorganization into a data structure common across the network; (2) derivation of study variables not present in the original data; and (3) application of the study design to transform longitudinal data into aggregated data sets for statistical analysis. We applied this framework to four case studies to identify similarities and differences in the United States and Europe: Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR), the Observational Medical Outcomes Partnership (OMOP), the Food and Drug Administration's (FDA's) Mini-Sentinel, and the Italian network, the Integration of Content Management Information on the Territory of Patients with Complex Diseases or with Chronic Conditions (MATRICE). Findings: The national networks (OMOP, Mini-Sentinel, MATRICE) all adopted shared procedures for local data reorganization. The multinational EU-ADR network needed locally defined procedures to reorganize its heterogeneous data into a common structure. Derivation of new data elements was centrally defined in all networks, but the procedure was not shared in EU-ADR. Application of study design was a common and shared procedure in all the case studies. Computer procedures were embodied in different programming languages, including SAS, R, SQL, Java, and C++. Conclusion: Using our conceptual framework
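The three local processing steps in the framework, reorganization into a common structure, derivation of study variables, and aggregation for sharing, can be sketched on toy records. The field names and the derived variable below are illustrative inventions, not any network's actual common data model:

```python
from collections import Counter

# Step 1: reorganize heterogeneous source rows into a common structure.
def reorganize(raw_rows):
    return [{"person_id": r["id"], "birth_year": r["yob"],
             "drug": r["atc_code"]} for r in raw_rows]

# Step 2: derive a study variable that is absent from the original data.
def derive(rows, study_year=2016):
    for r in rows:
        r["age_band"] = "65+" if study_year - r["birth_year"] >= 65 else "<65"
    return rows

# Step 3: apply the study design -> aggregated counts safe to share
#         across the network (no individual-level data leaves the site).
def aggregate(rows):
    return Counter((r["drug"], r["age_band"]) for r in rows)

raw = [{"id": 1, "yob": 1940, "atc_code": "C07AB02"},
       {"id": 2, "yob": 1980, "atc_code": "C07AB02"}]
print(aggregate(derive(reorganize(raw))))
```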

  5. MANAGING ENGINEERING ACTIVITIES FOR THE PLATEAU REMEDIATION CONTRACT - HANFORD

    SciTech Connect

    KRONVALL CM

    2011-01-14

    In 2008, the primary Hanford clean-up contract transitioned to the CH2MHill Plateau Remediation Company (CHPRC). Prior to transition, Engineering resources assigned to remediation/Decontamination and Decommissioning (D&D) activities were a part of a centralized engineering organization and matrixed to the performing projects. Following transition, these resources were reassigned directly to the performing project, with a loose matrix through a smaller Central Engineering (CE) organization. The smaller (10 FTE) central organization has retained responsibility for the overall technical quality of engineering for the CHPRC, but no longer performs staffing and personnel functions. As the organization has matured, there are lessons learned that can be shared with other organizations going through or contemplating performing a similar change. Benefits that have been seen from the CHPRC CE organization structure include the following: (1) Staff are closely aligned with the 'Project/facility' that they are assigned to support; (2) Engineering priorities are managed to be consistent with the 'Project/facility' priorities; (3) Individual Engineering managers are accountable for identifying staffing needs and the filling of staffing positions; (4) Budget priorities are managed within the local organization structure; (5) Rather than being considered a 'functional' organization, engineering is considered a part of a line, direct funded organization; (6) The central engineering organization is able to provide 'overview' activities and maintain independence from the engineering organizations in the field; and (7) The central engineering organization is able to maintain a stable of specialized experts that are able to provide independent reviews of field projects and day-to-day activities.

  6. TESS: a geometric hashing algorithm for deriving 3D coordinate templates for searching structural databases. Application to enzyme active sites.

    PubMed

    Wallace, A C; Borkakoti, N; Thornton, J M

    1997-11-01

    It is well established that sequence templates such as those in the PROSITE and PRINTS databases are powerful tools for predicting the biological function and tertiary structure for newly derived protein sequences. The number of X-ray and NMR protein structures is increasing rapidly and it is apparent that a 3D equivalent of the sequence templates is needed. Here, we describe an algorithm called TESS that automatically derives 3D templates from structures deposited in the Brookhaven Protein Data Bank. While a new sequence can be searched for sequence patterns, a new structure can be scanned against these 3D templates to identify functional sites. As examples, 3D templates are derived for enzymes with an O-His-O "catalytic triad" and for the ribonucleases and lysozymes. When these 3D templates are applied to a large data set of nonidentical proteins, several interesting hits are located. This suggests that the development of a 3D template database may help to identify the function of new protein structures, if unknown, as well as to design proteins with specific functions.
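The core of a geometric-hashing search like the one this abstract describes is to index templates by quantized geometric invariants so that a new structure can be scanned by table lookup. The toy sketch below uses binned pairwise distances as the invariant; real 3D-template matching hashes much richer residue and atom information:

```python
import itertools
import math
from collections import defaultdict

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def signature(coords, bin_size=0.5):
    """Quantized, sorted pairwise distances: invariant under
    rotation and translation of the coordinate set."""
    d = sorted(dist(p, q) for p, q in itertools.combinations(coords, 2))
    return tuple(round(x / bin_size) for x in d)

def build_table(templates):
    """templates: name -> list of 3D coordinates (e.g. key catalytic atoms)."""
    table = defaultdict(list)
    for name, coords in templates.items():
        table[signature(coords)].append(name)
    return table

def scan(table, coords):
    """Look up a query site against the template hash table."""
    return table.get(signature(coords), [])

templates = {"triad": [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 4.0, 0.0)]}
table = build_table(templates)
# Same geometry, translated: should still hit the template.
query = [(1.0, 1.0, 1.0), (4.0, 1.0, 1.0), (1.0, 5.0, 1.0)]
print(scan(table, query))
```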

  7. Active Piezoelectric Structures for Tip Clearance Management Assessed

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Managing blade tip clearance in turbomachinery stages is critical to developing advanced subsonic propulsion systems. Active casing structures with embedded piezoelectric actuators appear to be a promising solution. They can control static and dynamic tip clearance, compensate for uneven deflections, and accomplish electromechanical coupling at the material level. In addition, they have a compact design. To assess the feasibility of this concept and assist the development of these novel structures, the NASA Lewis Research Center developed in-house computational capabilities for composite structures with piezoelectric actuators and sensors, and subsequently used them to simulate candidate active casing structures. The simulations indicated the potential of active casings to modify the blade tip clearance enough to improve stage efficiency. They also provided valuable design information, such as preliminary actuator configurations (number and location) and the corresponding voltage patterns required to compensate for uneven casing deformations. An active ovalization of a casing with four discrete piezoceramic actuators attached on the outer surface is shown. The center figure shows the predicted radial displacements along the hoop direction that are induced when electrostatic voltage is applied at the piezoceramic actuators. This work, which has demonstrated the capabilities of in-house computational models to analyze and design active casing structures, is expected to contribute toward the development of advanced subsonic engines.

  8. Tracing thyroid hormone-disrupting compounds: database compilation and structure-activity evaluation for an effect-directed analysis of sediment.

    PubMed

    Weiss, Jana M; Andersson, Patrik L; Zhang, Jin; Simon, Eszter; Leonards, Pim E G; Hamers, Timo; Lamoree, Marja H

    2015-07-01

    A variety of anthropogenic compounds has been found to be capable of disrupting the endocrine systems of organisms, in laboratory studies as well as in wildlife. The most widely described endpoint is estrogenicity, but other hormonal disturbances, e.g., thyroid hormone disruption, are gaining more and more attention. Here, we present a review and chemical characterization, using principal component analysis, of organic compounds that have been tested for their capacity to bind competitively to the thyroid hormone transport protein transthyretin (TTR). The database contains 250 individual compounds and technical mixtures, of which 144 compounds are defined as TTR binders. Almost one third of these compounds (n = 52) were even more potent than the natural hormone thyroxine (T4). The database was used as a tool to assist in the identification of thyroid hormone-disrupting compounds (THDCs) in an effect-directed analysis (EDA) study of a sediment sample. Two compounds could be confirmed to contribute to the detected TTR-binding potency in the sediment sample, i.e., triclosan and nonylphenol technical mixture. They constituted less than 1% of the TTR-binding potency of the unfractionated extract. The low rate of explained activity may be attributed to the challenges related to identification of unknown contaminants in combination with the limited knowledge about THDCs in general. This study demonstrates the need for databases containing compound-specific toxicological properties. In the framework of EDA, such a database could be used to assist in the identification and confirmation of causative compounds focusing on thyroid hormone disruption. PMID:25986900
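The potency classification in the database, binders ranked against the natural ligand T4, reduces to a relative-potency calculation. The sketch below uses an IC50-based relative potency, one common convention; the function names and numeric values are illustrative, not entries from the database:

```python
def relative_potency(ic50_t4, ic50_compound):
    """RP > 1 means the compound displaces T4 from TTR more potently
    than the natural hormone itself (lower IC50 = more potent)."""
    return ic50_t4 / ic50_compound

def classify(compounds, ic50_t4):
    """compounds: name -> IC50 (same units as ic50_t4, e.g. microM)."""
    out = {}
    for name, ic50 in compounds.items():
        rp = relative_potency(ic50_t4, ic50)
        out[name] = "more potent than T4" if rp > 1.0 else "binder"
    return out

# Illustrative values only:
print(classify({"compound_A": 0.05, "compound_B": 0.4}, ic50_t4=0.1))
```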

  9. The AMMA database

    NASA Astrophysics Data System (ADS)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

The AMMA project includes aircraft, ground-based and ocean measurements, intensive use of satellite data and diverse modelling studies. The AMMA database therefore aims to store a great amount and a large variety of data, and to provide the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data in West Africa from 1850 (operational networks and previous scientific programs); - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations, processed in the same way as the satellite products. Before accessing the data, any user has to sign the AMMA data and publication policy. This chart only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Some collaboration between data producers and users, and mention of the AMMA project in any publication, are also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris and OMP, Toulouse). Users can access the data of both data centres through a single web portal. This website is composed of different modules: - Registration: forms to register and to read and sign the data use chart when a user visits for the first time; - Data access interface: a user-friendly tool for building a data extraction request by selecting various criteria such as location, time, parameters... The request can
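Remapping satellite products onto a regular latitude/longitude grid, as the database does before storing them in NetCDF, can be sketched as a nearest-cell binning step. The grid spacing and averaging rule below are illustrative assumptions; the abstract does not specify the AMMA processing chain at this level of detail:

```python
from collections import defaultdict

def regrid(points, lat0, lon0, dlat, dlon, nlat, nlon):
    """points: iterable of (lat, lon, value) swath observations.

    Returns {(i, j): mean value} on a regular nlat x nlon grid whose
    cell (i, j) spans [lat0 + i*dlat, lat0 + (i+1)*dlat), and likewise
    in longitude. Observations falling outside the grid are dropped.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, val in points:
        i = int((lat - lat0) // dlat)
        j = int((lon - lon0) // dlon)
        if 0 <= i < nlat and 0 <= j < nlon:
            sums[(i, j)] += val
            counts[(i, j)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Two nearby observations average into one cell; the third lies off-grid.
obs = [(10.1, -1.9, 300.0), (10.2, -1.8, 302.0), (35.0, 50.0, 280.0)]
grid = regrid(obs, lat0=0.0, lon0=-10.0, dlat=0.5, dlon=0.5, nlat=40, nlon=40)
print(grid)
```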

  10. Research Data Management and Libraries: Relationships, Activities, Drivers and Influences

    PubMed Central

    Pinfield, Stephen; Cox, Andrew M.; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a ‘jurisdictional’ driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. Institutions may usefully benchmark their activities against

  11. Research data management and libraries: relationships, activities, drivers and influences.

    PubMed

    Pinfield, Stephen; Cox, Andrew M; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a 'jurisdictional' driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. 
Institutions may usefully benchmark their activities against the

  13. Cross exploitation of geo-databases and earth observation data for stakes characterization in the framework of multi-risk analysis and management: RASOR examples

    NASA Astrophysics Data System (ADS)

    Tholey, Nadine; Yesou, Herve; Maxant, Jerome; Montabord, Myldred; Studer, Mathias; Faivre, Robin; Rudari, Roberto; de Fraipont, Paul

    2016-04-01

    In the context of risk analysis and management, information is needed on the landscape under investigation, especially for vulnerability assessment purposes, where land use and stakes characterization is of prime importance for the knowledge and description of exposure elements in modelling scenarios. Such thematic information over at-risk areas can be extracted from available global-, regional- or local-scale open-source databases (e.g. ESA-Globcover, Natural Earth, Copernicus core services, OSM, …) or derived from the exploitation of EO satellite images at high and very high spatial resolution (e.g. SPOT, soon Sentinel2, Pleiades, WorldView, …) over territories where this type of information is not available or not sufficiently up to date. However, EO data processing and the derived results also highlight the gap between what would be needed for a complete representation of vulnerability, i.e. a functional description of the land use and a structural description of the buildings including their functional use, and what is reasonably accessible by exploiting EO data, i.e. a biophysical description of the land cover at different spatial resolutions, from decametric scales to sub-metric ones, especially for urban-block and building information. The potential and limits of these multi-scale, multi-source geo-information layers will be illustrated by examples relating to different types of landscape and urban settlement in Asia (Indonesia), Europe (Greece), and the Caribbean (Haiti), exploited within the framework of the RASOR (Rapid Analysis and Spatialisation Of Risk) project (European Commission FP7), which is developing a platform to perform multi-hazard risk analysis to support the full cycle of disaster management.

  14. The CEBAF Element Database

    SciTech Connect

    Theodore Larrieu, Christopher Slominski, Michele Joyce

    2011-03-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on the fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from the original C++ source code into native libraries for scripting languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous.
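
    The "introspective" schema described above is in the spirit of an entity-attribute-value (EAV) design, in which element types and properties are data rows rather than table columns, so defining a new kind of element requires only inserts, never DDL changes. A minimal sketch follows; the table and column names are illustrative assumptions, not the actual CED schema:

    ```python
    import sqlite3

    # In-memory sketch of an introspective (EAV-style) element database:
    # types and properties are rows, so a new kind of element needs no
    # ALTER TABLE. All names here are hypothetical, not the CED schema.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE element_type  (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE property      (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE element       (id INTEGER PRIMARY KEY, name TEXT UNIQUE,
                                type_id INTEGER REFERENCES element_type(id));
    CREATE TABLE element_value (element_id  INTEGER REFERENCES element(id),
                                property_id INTEGER REFERENCES property(id),
                                value TEXT);
    """)

    # Defining a new type and property "on the fly" is just ordinary inserts.
    conn.execute("INSERT INTO element_type (name) VALUES ('Quadrupole')")
    conn.execute("INSERT INTO property (name) VALUES ('Length')")
    conn.execute("INSERT INTO element (name, type_id) VALUES ('MQA1S01', 1)")
    conn.execute("INSERT INTO element_value VALUES (1, 1, '0.5')")

    row = conn.execute("""
        SELECT e.name, t.name, p.name, v.value
        FROM element e
        JOIN element_type t  ON e.type_id = t.id
        JOIN element_value v ON v.element_id = e.id
        JOIN property p      ON v.property_id = p.id
    """).fetchone()
    ```

    The trade-off of such a design is extra joins and weaker static typing, which moves validation into the API layer; that is consistent with the abstract's emphasis on routing all access through a single well-documented API.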

  15. Hidradenitis Suppurativa Management in the United States: An Analysis of the National Ambulatory Medical Care Survey and MarketScan Medicaid Databases

    PubMed Central

    Davis, Scott A.; Lin, Hsien-Chang; Balkrishnan, Rajesh; Feldman, Steven R.

    2015-01-01

    Purpose To present nationally representative data demonstrating how frequently hidradenitis suppurativa (HS) occurs in specific groups and how it is currently managed. Methods We analyzed data from the 1990–2009 National Ambulatory Medical Care Survey (NAMCS) and the 2003–2007 MarketScan Medicaid databases for patients with a diagnosis of HS (ICD-9-CM code 705.83). Visits per 100,000 population of each race and ethnicity were calculated using the 2000 US Census data for specific demographics. Results There were 164,000 patient visits (95% CI: 128,000–200,000) annually with a diagnosis of HS in the NAMCS, and 17,270 HS patients were found in the MarketScan Medicaid database over the 5-year period. Antibiotics were the most common treatment, followed by pain medications, topical steroids, and isotretinoin. Prescriptions of biologics and systemic methotrexate, cyclosporine, and acitretin were not observed in the NAMCS. Physicians prescribed medications in 74% of visits and used procedures in 11% of visits. African Americans, females, and young adults had higher numbers of visits for HS. Conclusions Our data showing a maximum of 0.06% of the population being treated for HS in a given year are consistent with the low estimates of HS prevalence. Compared with current prescribing patterns, more frequent prescription of biologics and systemic treatments may yield better outcomes. PMID:27172455
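
    The population-rate arithmetic described in the Methods can be checked directly: dividing the NAMCS visit estimate by the 2000 US Census total reproduces the stated upper bound of roughly 0.06% of the population treated in a given year. A sketch, where the census total is the published 2000 figure:

    ```python
    # Rough check of the stated "maximum of 0.06%" figure: annual HS visits
    # (NAMCS estimate) divided by the 2000 US Census population total.
    annual_hs_visits = 164_000          # point estimate; 95% CI 128,000-200,000
    us_population_2000 = 281_421_906    # published 2000 US Census total

    share = annual_hs_visits / us_population_2000   # fraction of population
    visits_per_100k = share * 100_000               # about 58 per 100,000
    ```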

  16. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    NASA Astrophysics Data System (ADS)

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A. F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.
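
    The status-based protocol described, recording presence or absence of a phenophase at each visit plus an optional intensity class, can be sketched as a simple observation record. The field names below are illustrative, not the USA-NPN data standard:

    ```python
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    # Hypothetical record for status-based phenology monitoring: each visit
    # stores a yes/no phenophase status plus an optional intensity class.
    @dataclass
    class PhenophaseObservation:
        site_id: str
        species: str
        phenophase: str            # e.g. "open flowers"
        observed_on: date
        status: Optional[bool]     # True = present, False = absent, None = not assessed
        intensity: Optional[str]   # e.g. an abundance class such as "25-49%"

    observations = [
        PhenophaseObservation("S1", "Acer rubrum", "open flowers",
                              date(2014, 4, 2), False, None),
        PhenophaseObservation("S1", "Acer rubrum", "open flowers",
                              date(2014, 4, 16), True, "25-49%"),
    ]

    # Repeated status records let phenophase onset be bracketed between the
    # last "absent" visit and the first "present" visit.
    first_present = min(o.observed_on for o in observations if o.status)
    last_absent = max(o.observed_on for o in observations
                      if o.status is False and o.observed_on < first_present)
    onset_window_days = (first_present - last_absent).days
    ```

    Tracking absence explicitly, rather than only recording events, is what allows the onset window to be bounded and presence-absence data to be compared across sites and historical data sets.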

  17. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications.

    PubMed

    Denny, Ellen G; Gerst, Katharine L; Miller-Rushing, Abraham J; Tierney, Geraldine L; Crimmins, Theresa M; Enquist, Carolyn A F; Guertin, Patricia; Rosemartin, Alyssa H; Schwartz, Mark D; Thomas, Kathryn A; Weltzin, Jake F

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.

  18. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    USGS Publications Warehouse

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A.F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-01-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species’ phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological “status”, or the ability to track presence–absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.

  19. Education and the activation, course, and management of anger.

    PubMed

    Schieman, S

    2000-03-01

    Using data from the 1996 General Social Survey, I examine education's association with the activation, course, and management of anger. I argue that education--as a source of stratification (status) and as a personal resource (human capital)--organizes the conditions that influence anger-related processes. In analyses of anger activation, education is associated with lower odds of family-related anger. The well educated have fewer children and more income--factors associated with a lower risk of family anger. Conversely, education is associated with higher odds of work-related anger, but income and personal control account for that association. In analyses of the course of anger, I document a nonlinear association between education and anger duration. Adjustment for the sense of control--which is negatively associated with anger duration--sharpens that parabolic association. Education is positively associated with perceived appropriateness of anger and negatively associated with the display of anger. In both cases, adjustment for control accounts for education's effect. The sense of control also suppresses education's significant positive effect on anger processing. In analyses of anger management, education increases the odds of cognitive flexibility and problem solving, but its effect on communication depends on the sense of control. In sum, education organizes personal and social circumstances that influence anger-related processes.

  20. Active microbial soil communities in different agricultural managements

    NASA Astrophysics Data System (ADS)

    Landi, S.; Pastorelli, R.

    2009-04-01

    We studied the composition of the active eubacterial microflora by RNA extraction from soil (bulk and rhizosphere) under managements with different environmental impacts, in a hilly basin in Gallura (Sardinia). We contrasted a grassy vineyard, in which the soil had been in continuous contact with plant roots for a long period of time, with a traditionally tilled vineyard. Moreover, we compared permanent grassland, in which plants had been present for some years, with temporary grassland, in which varying plants had been present only during the respective growing seasons. Molecular analysis of the total population was carried out by Denaturing Gradient Gel Electrophoresis (DGGE) separation of amplified cDNA fragments obtained from 16S rRNA. In the vineyards, UPGMA (Unweighted Pair Group Method with Arithmetic Mean) analysis yielded separate clusters according to soil management. In spring both clusters showed similarity over 70%, while in autumn the similarity increased to 84% and 90% for the grassy and conventionally tilled vineyard, respectively. Permanent and temporary grassland joined in a single cluster in spring, while in autumn a partial separation was evident. The grassy vineyard and the permanent and temporary grasslands showed higher richness and Shannon-Wiener diversity index values than the conventionally tilled vineyard, although the differences were not significant. In conclusion, the expected rhizosphere effect was visible: grass cover positively influenced the diversity of the active microbial population.
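
    The Shannon-Wiener diversity index used in the abstract is H' = -sum(p_i ln p_i), where p_i is the relative abundance of taxon (here, DGGE band) i. A minimal computation, with invented band counts:

    ```python
    import math

    # Shannon-Wiener diversity index H' = -sum(p_i * ln p_i), with p_i the
    # relative abundance of band i. Band counts below are invented.
    def shannon_index(abundances):
        total = sum(abundances)
        proportions = [a / total for a in abundances if a > 0]
        return -sum(p * math.log(p) for p in proportions)

    even_community = shannon_index([10, 10, 10, 10])   # equals ln(4)
    skewed_community = shannon_index([37, 1, 1, 1])    # one dominant band
    ```

    For a fixed number of bands, H' is maximal when abundances are even and falls as one band dominates, which is why it complements simple band richness.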

  1. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

    Background Gene expression databases are key resources for microarray data management and analysis, and the importance of proper annotation of their content is well understood. Public repositories exist, as do microarray database systems that can be implemented by single laboratories. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common, highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic cell and macrophage functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object-based, MIAME-compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamic definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows precise control of the visibility of the database content to different subgroups in the community and facilitates export of its content to public repositories. It provides an interactive user interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of the genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analysing it. This fills a gap between a local

  2. The Slovenian food composition database.

    PubMed

    Korošec, Mojca; Golob, Terezija; Bertoncelj, Jasna; Stibilj, Vekoslava; Seljak, Barbara Koroušić

    2013-10-01

    The preliminary Slovenian food composition database was created in 2003, through the application of the Data management and Alimenta nutritional software. In the subsequent projects, data on the composition of meat and meat products of Slovenian origin were gathered from analyses, and low-quality data of the preliminary database were discarded. The first volume of the Slovenian food composition database was published in 2006, in both electronic and paper versions. When Slovenia joined the EuroFIR NoE, the LanguaL indexing system was adopted. The Optijed nutritional software was developed, and later upgraded to the OPEN platform. This platform serves as an electronic database that currently comprises 620 foods, and as the Slovenian node in the EuroFIR virtual information platform. With the assimilation of the data on the compositions of foods of plant origin obtained within the latest project, the Slovenian database provides a good source for food compositional values of consistent and compatible quality.

  3. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research. PMID:27215009
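
    One common form of the "verifying data" step in the workflow described is double data entry with field-level discrepancy reporting: two operators key the same records independently, and mismatching fields are flagged for review against the source documents. A sketch, with an invented record layout:

    ```python
    # Double data entry with field-level discrepancy reporting, a common
    # way to implement the data-verification step in a clinical database.
    def compare_entries(first_pass, second_pass):
        """Return (record_id, field, first_value, second_value) per mismatch."""
        discrepancies = []
        for record_id, fields in first_pass.items():
            for field, first_value in fields.items():
                second_value = second_pass[record_id][field]
                if first_value != second_value:
                    discrepancies.append((record_id, field, first_value, second_value))
        return discrepancies

    entry_a = {"P001": {"age": 54, "sbp": 132}, "P002": {"age": 61, "sbp": 118}}
    entry_b = {"P001": {"age": 54, "sbp": 123}, "P002": {"age": 61, "sbp": 118}}
    to_review = compare_entries(entry_a, entry_b)   # one transposed-digit mismatch
    ```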

  4. Simulated Medication Therapy Management Activities in a Pharmacotherapy Laboratory Course

    PubMed Central

    Thorpe, Joshua M.; Trapskin, Kari

    2011-01-01

    Objective. To measure the impact of medication therapy management (MTM) learning activities on students’ confidence and intention to provide MTM using the Theory of Planned Behavior. Design. An MTM curriculum combining lecture instruction and active-learning strategies was incorporated into a required pharmacotherapy laboratory course. Assessment. A validated survey instrument was developed to evaluate student confidence and intent to engage in MTM services using the domains comprising the Theory of Planned Behavior. Confidence scores improved significantly from baseline for all items (p < 0.00), including identification of billable services, documentation, and electronic billing. Mean scores improved significantly for all Theory of Planned Behavior items within the constructs of perceived behavioral control and subjective norms (p < 0.05). At baseline, 42% of students agreed or strongly agreed that they had knowledge and skills to provide MTM. This percentage increased to 82% following completion of the laboratory activities. Conclusion. Implementation of simulated MTM activities in a pharmacotherapy laboratory significantly increased knowledge scores, confidence measures, and scores on Theory of Planned Behavior constructs related to perceived behavioral control and subjective norms. Despite these improvements, intention to engage in future MTM services remained unchanged. PMID:21829269

  5. Management practices that concentrate visitor activities: Camping impact management at Isle Royale National Park, USA

    USGS Publications Warehouse

    Marion, J.L.; Farrell, T.A.

    2002-01-01

    This study assessed campsite conditions and the effectiveness of campsite impact management strategies at Isle Royale National Park, USA. Protocols for assessing indicators of vegetation and soil conditions were developed and applied to 156 campsites and 88 shelters within 36 backcountry campgrounds. The average site was 68 m2 and 83% of sites lost vegetation over areas less than 47 m2. Results reveal that management actions to spatially concentrate camping activities and reduce camping disturbance have been highly successful. Comparisons of disturbed area/overnight stay among other protected areas reinforces this assertion. These reductions in area of camping disturbance are attributed to a designated site camping policy, limitation on site numbers, construction of sites in sloping terrain, use of facilities, and an ongoing program of campsite maintenance. Such actions are most appropriate in higher use backcountry and wilderness settings.

  6. Conception and development of a bibliographic database of blood nutrient fluxes across organs and tissues in ruminants: data gathering and management prior to meta-analysis.

    PubMed

    Vernet, Jean; Ortigues-Marty, Isabelle

    2006-01-01

    In the organism, nutrient exchanges among tissues and organs are subject to numerous sources of physiological or nutritional variation, and the contribution of individual factors needs to be quantified before establishing general response laws. To achieve this, meta-analysis of data from publications is a useful tool. The objective of this work was to develop a bibliographic database of nutrient fluxes across organs and tissues of ruminant animals (Flora) under Access, using the Merise method. The most important criteria for Flora were the ease of relating the various pieces of information, the exhaustiveness and accuracy of the data input, a complete description of the diets, consideration of the methodological procedures for measurement and analysis of blood nutrients, and the traceability of the information. The conceptual data model was built in six parts. The first part describes the authors and source of the publication, and the person in charge of data input. It clearly separates and identifies the experiments, the groups of animals and the treatments within a publication. The second part is concerned with feeds, diets and their chemical composition and nutritional value. The third and fourth parts describe the infusion of any substrates and the methods employed, respectively. The fifth part is devoted to the results of blood flows and nutrient fluxes. The sixth part gathers miscellaneous experimental information. All these parts are inter-connected. To model this database, the Merise method was used and 26 entities and 32 relationships were created. At the physical level, 93 tables were created, corresponding, for the majority, to entities and relationships of the data model. They were divided into reference tables (n = 65) and data tables (n = 28). Data processing was developed in Flora and included the control of the data, generic calculation of unknown data from given data, the automation of the estimation of the missing data or the chemical

  7. Database Marketplace 2002: The Database Universe.

    ERIC Educational Resources Information Center

    Tenopir, Carol; Baker, Gayle; Robinson, William

    2002-01-01

    Reviews the database industry over the past year, including new companies and services, company closures, popular database formats, popular access methods, and changes in existing products and services. Lists 33 firms and their database services; 33 firms and their database products; and 61 company profiles. (LRW)

  8. The RIKEN integrated database of mammals

    PubMed Central

    Masuya, Hiroshi; Makita, Yuko; Kobayashi, Norio; Nishikata, Koro; Yoshida, Yuko; Mochizuki, Yoshiki; Doi, Koji; Takatsuki, Terue; Waki, Kazunori; Tanaka, Nobuhiko; Ishii, Manabu; Matsushima, Akihiro; Takahashi, Satoshi; Hijikata, Atsushi; Kozaki, Kouji; Furuichi, Teiichi; Kawaji, Hideya; Wakana, Shigeharu; Nakamura, Yukio; Yoshiki, Atsushi; Murata, Takehide; Fukami-Kobayashi, Kaoru; Mohan, Sujatha; Ohara, Osamu; Hayashizaki, Yoshihide; Mizoguchi, Riichiro; Obata, Yuichi; Toyoda, Tetsuro

    2011-01-01

    The RIKEN integrated database of mammals (http://scinets.org/db/mammal) is the official undertaking to integrate its mammalian databases produced from multiple large-scale programs that have been promoted by the institute. The database integrates not only RIKEN’s original databases, such as FANTOM, the ENU mutagenesis program, the RIKEN Cerebellar Development Transcriptome Database and the Bioresource Database, but also imported data from public databases, such as Ensembl, MGI and biomedical ontologies. Our integrated database has been implemented on the infrastructure of publication medium for databases, termed SciNetS/SciNeS, or the Scientists’ Networking System, where the data and metadata are structured as a semantic web and are downloadable in various standardized formats. The top-level ontology-based implementation of mammal-related data directly integrates the representative knowledge and individual data records in existing databases to ensure advanced cross-database searches and reduced unevenness of the data management operations. Through the development of this database, we propose a novel methodology for the development of standardized comprehensive management of heterogeneous data sets in multiple databases to improve the sustainability, accessibility, utility and publicity of the data of biomedical information. PMID:21076152

  9. The RIKEN integrated database of mammals.

    PubMed

    Masuya, Hiroshi; Makita, Yuko; Kobayashi, Norio; Nishikata, Koro; Yoshida, Yuko; Mochizuki, Yoshiki; Doi, Koji; Takatsuki, Terue; Waki, Kazunori; Tanaka, Nobuhiko; Ishii, Manabu; Matsushima, Akihiro; Takahashi, Satoshi; Hijikata, Atsushi; Kozaki, Kouji; Furuichi, Teiichi; Kawaji, Hideya; Wakana, Shigeharu; Nakamura, Yukio; Yoshiki, Atsushi; Murata, Takehide; Fukami-Kobayashi, Kaoru; Mohan, Sujatha; Ohara, Osamu; Hayashizaki, Yoshihide; Mizoguchi, Riichiro; Obata, Yuichi; Toyoda, Tetsuro

    2011-01-01

    The RIKEN integrated database of mammals (http://scinets.org/db/mammal) is the official undertaking to integrate its mammalian databases produced from multiple large-scale programs that have been promoted by the institute. The database integrates not only RIKEN's original databases, such as FANTOM, the ENU mutagenesis program, the RIKEN Cerebellar Development Transcriptome Database and the Bioresource Database, but also imported data from public databases, such as Ensembl, MGI and biomedical ontologies. Our integrated database has been implemented on the infrastructure of publication medium for databases, termed SciNetS/SciNeS, or the Scientists' Networking System, where the data and metadata are structured as a semantic web and are downloadable in various standardized formats. The top-level ontology-based implementation of mammal-related data directly integrates the representative knowledge and individual data records in existing databases to ensure advanced cross-database searches and reduced unevenness of the data management operations. Through the development of this database, we propose a novel methodology for the development of standardized comprehensive management of heterogeneous data sets in multiple databases to improve the sustainability, accessibility, utility and publicity of the data of biomedical information.

  10. Open Geoscience Database

    NASA Astrophysics Data System (ADS)

    Bashev, A.

    2012-04-01

    treatment could be conducted in other programs after extracting the filtered data into a *.csv file. This makes the database understandable for non-experts. The database employs an open data format (*.csv) and widespread tools: PHP as the programming language, MySQL as the database management system, JavaScript for interaction with Google Maps, and jQuery UI for creating the user interface. The database is multilingual: association tables connect translations with elements of the database. In total, the development required about 150 hours. The database still has several problems. The main problem is the reliability of the data: it really needs an expert system for estimating reliability, but the elaboration of such a system would take more resources than the database itself. The second problem is stream selection - how to select the stations that are connected with each other (for example, belong to one water stream) and indicate their sequence. Some problems we have already solved. For example, the "same station" problem (sometimes the distance between stations is smaller than the error of their positions): when a new station is added, our application automatically searches for existing stations near that place. We have also solved the problem of object and parameter types (how to regard "EC" and "electrical conductivity" as the same parameter), again using association tables. Currently the interface is in English and Russian, but it can easily be translated into other languages: just contact us, and we will send you the list of terms and phrases for translation. The main advantage of the database is that it is totally open: everybody can see and extract the data from the database and use them for non-commercial purposes at no charge. Registered users can contribute to the database without getting paid. We hope that it will be widely used, first of all for education purposes, but
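
    The "same station" check described, searching for an existing station closer than the positional error before accepting a new one, can be sketched with a great-circle distance test. The station coordinates and the 100 m threshold below are illustrative assumptions:

    ```python
    import math

    # "Same station" check: before adding a station, look for an existing
    # one closer than the positional error. Data here are illustrative.
    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon points."""
        r = 6_371_000  # mean Earth radius, metres
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def find_nearby(stations, lat, lon, max_error_m=100):
        """Return an existing station within the position error, or None."""
        for station in stations:
            if haversine_m(station["lat"], station["lon"], lat, lon) <= max_error_m:
                return station
        return None

    stations = [{"name": "Oka-1", "lat": 55.7000, "lon": 36.9000}]
    duplicate = find_nearby(stations, 55.7003, 36.9004)  # tens of metres away
    ```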

  11. NUREBASE: database of nuclear hormone receptors.

    PubMed

    Duarte, Jorge; Perrière, Guy; Laudet, Vincent; Robinson-Rechavi, Marc

    2002-01-01

    Nuclear hormone receptors are an abundant class of ligand activated transcriptional regulators, found in varying numbers in all animals. Based on our experience of managing the official nomenclature of nuclear receptors, we have developed NUREBASE, a database containing protein and DNA sequences, reviewed protein alignments and phylogenies, taxonomy and annotations for all nuclear receptors. The reviewed NUREBASE is completed by NUREBASE_DAILY, automatically updated every 24 h. Both databases are organized under a client/server architecture, with a client written in Java which runs on any platform. This client, named FamFetch, integrates a graphical interface allowing selection of families, and manipulation of phylogenies and alignments. NUREBASE sequence data is also accessible through a World Wide Web server, allowing complex queries. All information on accessing and installing NUREBASE may be found at http://www.ens-lyon.fr/LBMC/laudet/nurebase.html.

  12. Active traffic management on road networks: a macroscopic approach.

    PubMed

    Kurzhanskiy, Alex A; Varaiya, Pravin

    2010-10-13

    Active traffic management (ATM) is the ability to dynamically manage recurrent and non-recurrent congestion based on prevailing traffic conditions in order to maximize the effectiveness and efficiency of road networks. It is a continuous process of (i) obtaining and analysing traffic measurement data, (ii) operations planning, i.e. simulating various scenarios and control strategies, (iii) implementing the most promising control strategies in the field, and (iv) maintaining a real-time decision support system that filters current traffic measurements to predict the traffic state in the near future, and to suggest the best available control strategy for the predicted situation. ATM relies on a fast and trusted traffic simulator for the rapid quantitative assessment of a large number of control strategies for the road network under various scenarios, in a matter of minutes. The open-source macrosimulation tool Aurora ROAD NETWORK MODELER is a good candidate for this purpose. The paper describes the underlying dynamical traffic model and what it takes to prepare the model for simulation; covers the traffic performance measures and evaluation of scenarios as part of operations planning; introduces the framework within which the control strategies are modelled and evaluated; and presents the algorithm for real-time traffic state estimation and short-term prediction.

  13. Active Management of Flap-Edge Trailing Vortices

    NASA Technical Reports Server (NTRS)

    Greenblatt, David; Yao, Chung-Sheng; Vey, Stefan; Paschereit, Oliver C.; Meyer, Robert

    2008-01-01

    The vortex hazard produced by large airliners, and by the increasingly larger airliners entering service, combined with projected rapid increases in the demand for air transportation, is expected to act as a major impediment to increased air traffic capacity. Significant reduction in the vortex hazard is possible, however, by employing active vortex alleviation techniques that reduce the wake severity by dynamically modifying its vortex characteristics, provided that the techniques do not degrade performance or compromise safety and ride quality. With this as background, a series of experiments was performed, initially at NASA Langley Research Center and subsequently at the Berlin University of Technology in collaboration with the German Aerospace Center. The investigations demonstrated the basic mechanism for managing trailing vortices using retrofitted devices that are decoupled from conventional control surfaces. The basic premise for managing vortices advanced here is rooted in the erstwhile forgotten hypothesis of Albert Betz, as extended and verified ingeniously by Coleman duPont Donaldson and his collaborators. Using these devices, vortices may be perturbed at arbitrarily long wavelengths down to wavelengths less than a typical airliner wingspan, and the oscillatory loads on the wings, and hence the vehicle, are small. Significant flexibility in the specific device has been demonstrated using local passive and active separation control as well as local circulation control via Gurney flaps. The method is now in a position to be tested in a wind tunnel with a longer test section on a scaled airliner configuration. Alternatively, the method can be tested directly in a towing tank, on a model aircraft, a light aircraft or a full-scale airliner. The authors believe that this method will have significant appeal from an industry perspective due to its retrofit potential with little to no impact on cruise (devices tucked away in the cove or retracted); low operating power

  14. The Ribonuclease P Database.

    PubMed

    Brown, J W

    1999-01-01

    Ribonuclease P is responsible for the 5'-maturation of tRNA precursors. Ribonuclease P is a ribonucleoprotein, and in bacteria (and some Archaea) the RNA subunit alone is catalytically active in vitro, i.e. it is a ribozyme. The Ribonuclease P Database is a compilation of ribonuclease P sequences, sequence alignments, secondary structures, three-dimensional models and accessory information, available via the World Wide Web at the following URL: http://www.mbio.ncsu.edu/RNaseP/home.html

  15. Status of and prospects for advanced tokamak regimes from multi-machine comparisons using the 'International Tokamak Physics Activity' database

    NASA Astrophysics Data System (ADS)

    Litaudon, X.; Barbato, E.; Bécoulet, A.; Doyle, E. J.; Fujita, T.; Gohil, P.; Imbeaux, F.; Sauter, O.; Sips, G.; ITPA Group on Transport; Internal ITB Physics; Connor, J. W.; Doyle, E. J.; Esipchuk, Yu; Fujita, T.; Fukuda, T.; Gohil, P.; Kinsey, J.; Kirneva, N.; Lebedev, S.; Litaudon, X.; Mukhovatov, V.; Rice, J.; Synakowski, E.; Toi, K.; Unterberg, B.; Vershkov, V.; Wakatani, M.; International ITB Database Working Group; Aniel, T.; Baranov, Yu F.; Barbato, E.; Bécoulet, A.; Behn, R.; Bourdelle, C.; Bracco, G.; Budny, R. V.; Buratti, P.; Doyle, E. J.; Esipchuk, Yu; Esposito, B.; Ide, S.; Field, A. R.; Fujita, T.; Fukuda, T.; Gohil, P.; Gormezano, C.; Greenfield, C.; Greenwald, M.; Hahm, T. S.; Hoang, G. T.; Hobirk, J.; Hogeweij, D.; Ide, S.; Isayama, A.; Imbeaux, F.; Joffrin, E.; Kamada, Y.; Kinsey, J.; Kirneva, N.; Litaudon, X.; Luce, T. C.; Murakami, M.; Parail, V.; Peng, Y.-K. M.; Ryter, F.; Sakamoto, Y.; Shirai, H.; Sips, G.; Suzuki, T.; Synakowski, E.; Takenaga, H.; Takizuka, T.; Tala, T.; Wade, M. R.; Weiland, J.

    2004-05-01

    Advanced tokamak regimes obtained in ASDEX Upgrade, DIII-D, FT-U, JET, JT-60U, TCV and Tore Supra experiments are assessed both in terms of their fusion performance and their capability for ultimately reaching steady state, using data from the international internal transport barrier database. These advanced modes of tokamak operation are characterized by improved core confinement and a current profile modified relative to the relaxed Ohmically driven one. The present results obtained in these experiments are studied in view of their prospects for achieving either long pulses ('hybrid' scenario with inductive and non-inductive current drive) or, ultimately, steady-state purely non-inductive current drive operation in next-step devices such as ITER. A new operational diagram for advanced tokamak operation is proposed, in which the figure of merit characterizing fusion performance and confinement, H × β_N/q_95², is drawn versus the fraction of the plasma current driven by the bootstrap effect. In this diagram, present-day advanced tokamak regimes have now reached the operational domain required for non-inductive ITER current drive operation, with typically 50% of the plasma current driven by the bootstrap effect (Green et al 2003 Plasma Phys. Control. Fusion 45 587). In addition, the existence domain of the advanced mode regimes is also mapped in terms of dimensionless plasma physics quantities such as normalized Larmor radius, normalized collisionality, Mach number and ratio of ion to electron temperature. The gap between present-day and future advanced tokamak experiments is quantitatively assessed in terms of these dimensionless parameters. A preliminary version of this study was presented at the 29th EPS Conf. on Plasma Phys. and Control. Fusion (Montreux, Switzerland, 17-21 June 2002) [1].
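
    As a quick illustration, the figure of merit is a direct function of its three inputs. The numeric values below are invented for demonstration and are not data from the ITPA database.

```python
# Illustrative only: compute the proposed confinement/performance
# figure of merit G = H * beta_N / q95**2, the quantity plotted
# against bootstrap current fraction in the operational diagram.
def fusion_figure_of_merit(h: float, beta_n: float, q95: float) -> float:
    """Return H * beta_N / q95**2 for the given plasma parameters."""
    return h * beta_n / q95 ** 2

# Made-up example values, not measurements from any experiment:
g = fusion_figure_of_merit(h=1.5, beta_n=2.8, q95=4.5)
```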

  16. A Sandia telephone database system

    SciTech Connect

    Nelson, S.D.; Tolendino, L.F.

    1991-08-01

    Sandia National Laboratories, Albuquerque, may soon have more responsibility for the operation of its own telephone system. The processes that constitute providing telephone service can all be improved through the use of a central data information system. We studied these processes, determined the requirements for a database system, then designed the first stages of a system that meets our needs for work order handling, trouble reporting, and ISDN hardware assignments. The design was based on an extensive set of applications that have been used for five years to manage the Sandia secure data network. The system utilizes an Ingres database management system and is programmed using the Application-By-Forms tools.

  17. Activation and waste management considerations of fusion materials

    NASA Astrophysics Data System (ADS)

    Cheng, E. T.; Saji, G.

    1994-09-01

    Inconel-625 (Ni625), SS316, Ti-6Al-4V (Ti64), ferritic steel (FS), reduced activity ferritic steel (RAFS), manganese steel (Mn-steel), and V5Cr5Ti (V55) were examined with respect to waste management for a near-term experimental D-T fueled fusion power reactor. Activation calculations for these materials were performed assuming one year of continuous operation at 1 MW/m² wall loading. The results show that blanket components made of V55, Ti64, Mn-steel, and FS can be transferred to an on-site dry storage facility after 10 years of post-discharge cooling. To transport the discharged blanket components to a permanent disposal site, the cooling time needed can be within 10 years for Ti64 and V55, provided that the impurities (mainly Ni, Nb and Mo) are controlled to an acceptable level. The RAFS and Mn-steel will need about 30 y of cooling time because of their Fe and Mn contents. Ni625, 316SS, and FS, however, will require more than 50000 y of cooling time because of their Nb and Mo contents. The RAFS, Mn-steel, Ti64 and V55 can qualify as shallow-land wastes if the impurity levels of Nb and Mo are kept below 10 ppm.

  18. Physical activity, genetic, and nutritional considerations in childhood weight management.

    PubMed

    Bar-Or, O; Foreyt, J; Bouchard, C; Brownell, K D; Dietz, W H; Ravussin, E; Salbe, A D; Schwenger, S; St Jeor, S; Torun, B

    1998-01-01

    Almost one-quarter of U.S. children are now obese, a dramatic increase of over 20% in the past decade. It is intriguing that the increase in prevalence has been occurring while overall fat consumption has been declining. Body mass and composition are influenced by genetic factors, but the actual heritability of juvenile obesity is not known. A low physical activity (PA) is characteristic of obese children and adolescents, and it may be one cause of juvenile obesity. There is little evidence, however, that overall energy expenditure is low among the obese. There is a strong association between the prevalence of obesity and the extent of TV viewing. Enhanced PA can reduce body fat and blood pressure and improve lipoprotein profile in obese individuals. Its effect on body composition, however, is slower than with low-calorie diets. The three main dietary approaches are: protein sparing modified fast, balanced hypocaloric diets, and comprehensive behavioral lifestyle programs. To achieve long-standing control of overweight, one should combine changes in eating and activity patterns, using behavior modification techniques. However, the onus is also on society to reduce incentives for a sedentary lifestyle and over-consumption of food. To address the key issues related to childhood weight management, the American College of Sports Medicine convened a Scientific Roundtable in Indianapolis. PMID:9475638

  19. Remediation activities at the Fernald Environmental Management Project (FEMP)

    SciTech Connect

    Walsh, T.J.; Danner, R.

    1996-07-01

    The Fernald Environmental Management Project (FEMP) is a United States Department of Energy (DOE) facility located in southwestern Ohio. The facility began manufacturing uranium products in the early 1950s and continued processing uranium ore concentrates until 1989. The facility used a variety of chemical and metallurgical processes to produce uranium metals for use at other DOE sites across the country. Because the facility manufactured uranium metals for over thirty years, various amounts of radiological contamination exist at the site. Because of the chemical and metallurgical processes employed at the site, some hazardous wastes as defined by the Resource Conservation and Recovery Act (RCRA) were also generated there. In 1989, the FEMP was placed on the National Priorities List (NPL), requiring cleanup of the facility's radioactive and chemical contamination under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). This paper discusses the proposed remediation activities at the five Operable Units (OUs) designated at the FEMP. In addition, it examines the ongoing CERCLA response actions and RCRA closure activities at the facility.

  20. Object-oriented structures supporting remote sensing databases

    NASA Technical Reports Server (NTRS)

    Wichmann, Keith; Cromp, Robert F.

    1995-01-01

    Object-oriented databases show promise for modeling the complex interrelationships pervasive in scientific domains. To examine the utility of this approach, we have developed an Intelligent Information Fusion System based on this technology, and applied it to the problem of managing an active repository of remotely-sensed satellite scenes. The design and implementation of the system is compared and contrasted with conventional relational database techniques, followed by a presentation of the underlying object-oriented data structures used to enable fast indexing into the data holdings.

  1. 76 FR 70486 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... Office of Natural Resources Revenue Agency Information Collection Activities: Submitted for Office of Management and Budget Review; Comment Request AGENCY: Office of Natural Resources Revenue, Interior. ACTION... information from Indian beneficiaries. The ONRR performs the minerals revenue management functions for...

  2. 77 FR 43355 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... Office of Natural Resources Revenue Agency Information Collection Activities: Submitted for Office of Management and Budget Review, Comment Request AGENCY: Office of Natural Resources Revenue, Interior. ACTION... information from Indian beneficiaries. ONRR performs the minerals revenue management functions for...

  3. MEROPS: the peptidase database.

    PubMed

    Rawlings, N D; Barrett, A J

    1999-01-01

    The MEROPS database (http://www.bi.bbsrc.ac.uk/Merops/Merops.htm) provides a catalogue and structure-based classification of peptidases (i.e. all proteolytic enzymes). This is a large group of proteins (approximately 2% of all gene products) that is of particular importance in medicine and biotechnology. An index of the peptidases by name or synonym gives access to a set of files termed PepCards each of which provides information on a single peptidase. Each card file contains information on classification and nomenclature, and hypertext links to the relevant entries in online databases for human genetics, protein and nucleic acid sequence data and tertiary structure. Another index provides access to the PepCards by organism name so that the user can retrieve all known peptidases from a particular species. The peptidases are classified into families on the basis of statistically significant similarities between the protein sequences in the part termed the 'peptidase unit' that is most directly responsible for activity. Families that are thought to have common evolutionary origins and are known or expected to have similar tertiary folds are grouped into clans. The MEROPS database provides sets of files called FamCards and ClanCards describing the individual families and clans. Each FamCard document provides links to other databases for sequence motifs and secondary and tertiary structures, and shows the distribution of the family across the major kingdoms of living creatures. Release 3.03 of MEROPS contains 758 peptidases, 153 families and 22 clans. We suggest that the MEROPS database provides a model for a way in which a system of classification for a functional group of proteins can be developed and used as an organizational framework around which to assemble a variety of related information.
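
    The clan/family/peptidase hierarchy can be pictured as a small nested mapping. The entries below are simplified illustrations, not actual MEROPS records.

```python
# Toy sketch of the hierarchical classification: clans group families,
# and families group individual peptidases. Identifiers are simplified
# for illustration only.
merops = {
    "CA": {"C1": ["papain", "cathepsin B"]},    # clan -> family -> members
    "SA": {"S1": ["chymotrypsin", "trypsin"]},
}

def family_of(peptidase):
    """Return the family containing the named peptidase, or None."""
    for families in merops.values():
        for family, members in families.items():
            if peptidase in members:
                return family
    return None
```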

  4. Activated carbon: Utilization in sewage and industrial waste treatment. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    Not Available

    1994-04-01

    The bibliography contains citations concerning the use of activated carbon in treating sewage and industrial wastes. The citations include engineering studies, site evaluations, and regeneration techniques. References to air pollution are excluded. (Contains 250 citations and includes a subject term index and title list.)

  5. Activated carbon: Utilization in sewage and industrial waste treatment. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1995-11-01

    The bibliography contains citations concerning the use of activated carbon in treating sewage and industrial wastes. The citations include engineering studies, site evaluations, and regeneration techniques. References to air pollution are excluded. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  6. Activated carbon: Utilization in sewage and industrial waste treatment. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1992-12-01

    The bibliography contains citations concerning the use of activated carbon in treating sewage and industrial wastes. The citations include engineering studies, site evaluations, and regeneration techniques. References to air pollution are excluded. (Contains 250 citations and includes a subject term index and title list.)

  7. Activated carbon: Utilization excluding industrial waste treatment. (Latest citations from the EI Compendex*plus database). Published Search

    SciTech Connect

    Not Available

    1993-11-01

    The bibliography contains citations concerning the commercial use and theoretical studies of activated carbon. Topics include performance evaluations in water treatment processes, preparation and regeneration techniques, materials recovery, and pore structure studies. Adsorption characteristics for specific materials are discussed. Studies pertaining specifically to industrial waste treatment are excluded. (Contains 250 citations and includes a subject term index and title list.)

  8. Risk management activities at the DOE Class A reactor facilities

    SciTech Connect

    Sharp, D.A.; Hill, D.J.; Linn, M.A.; Atkinson, S.A.; Hu, J.P.

    1993-12-31

    The probabilistic risk assessment (PRA) and risk management group of the Association for Excellence in Reactor Operation (AERO) develops risk management initiatives and standards to improve operation and increase safety of the DOE Class A reactor facilities. Principal risk management applications that have been implemented at each facility are reviewed. The status of a program to develop guidelines for risk management programs at reactor facilities is presented.

  9. Risk management activities at the DOE Class A reactor facilities

    SciTech Connect

    Sharp, D.A.; Hill, D.J.; Linn, M.A.; Atkinson, S.A.; Hu, J.P.

    1993-01-01

    The probabilistic risk assessment (PRA) and risk management group of the Association for Excellence in Reactor Operation (AERO) develops risk management initiatives and standards to improve operation and increase safety of the DOE Class A reactor facilities. Principal risk management applications that have been implemented at each facility are reviewed. The status of a program to develop guidelines for risk management programs at reactor facilities is presented.

  10. Overlap in Bibliographic Databases.

    ERIC Educational Resources Information Center

    Hood, William W.; Wilson, Concepcion S.

    2003-01-01

    Examines the topic of Fuzzy Set Theory to determine the overlap of coverage in bibliographic databases. Highlights include examples of comparisons of database coverage; frequency distribution of the degree of overlap; records with maximum overlap; records unique to one database; intra-database duplicates; and overlap in the top ten databases.…

  11. Issues for Active State Management of the JTPA Title III Grant: A Guide for State Planners and Managers.

    ERIC Educational Resources Information Center

    Reesman, Cilla J.

    This technical assistance guide presents the various options available to state planners and managers in considering five elements of active grant management. Each element is treated in a separate chapter. Chapter 1 addresses issues surrounding the setting of policies that ensure that Title III grants complement state agendas. Chapter 2 concerns…

  12. EPA U.S. NATIONAL MARKAL DATABASE: DATABASE DOCUMENTATION

    EPA Science Inventory

    This document describes in detail the U.S. Energy System database developed by EPA's Integrated Strategic Assessment Work Group for use with the MARKAL model. The group is part of the Office of Research and Development and is located in the National Risk Management Research Labor...

  13. Disparities in rheumatoid arthritis disease activity according to gross domestic product in 25 countries in the QUEST–RA database

    PubMed Central

    Sokka, T; Kautiainen, H; Pincus, T; Toloza, S; da Rocha Castelar Pinheiro, G; Lazovskis, J; Hetland, M L; Peets, T; Immonen, K; Maillefert, J F; Drosos, A A; Alten, R; Pohl, C; Rojkovich, B; Bresnihan, B; Minnock, P; Cazzato, M; Bombardieri, S; Rexhepi, S; Rexhepi, M; Andersone, D; Stropuviene, S; Huisman, M; Sierakowski, S; Karateev, D; Skakic, V; Naranjo, A; Baecklund, E; Henrohn, D; Gogus, F; Badsha, H; Mofti, A; Taylor, P; McClinton, C; Yazici, Y

    2009-01-01

    Objective: To analyse associations between the clinical status of patients with rheumatoid arthritis (RA) and the gross domestic product (GDP) of their resident country. Methods: The Quantitative Standard Monitoring of Patients with Rheumatoid Arthritis (QUEST–RA) cohort includes clinical and questionnaire data from 6004 patients who were seen in usual care at 70 rheumatology clinics in 25 countries as of April 2008, including 18 European countries. Demographic variables, clinical characteristics, RA disease activity measures, including the disease activity score in 28 joints (DAS28), and treatment-related variables were analysed according to GDP per capita, including 14 “high GDP” countries with GDP per capita greater than US$24 000 and 11 “low GDP” countries with GDP per capita less than US$11 000. Results: Disease activity DAS28 ranged between 3.1 and 6.0 among the 25 countries and was significantly associated with GDP (r = −0.78, 95% CI −0.56 to −0.90, r² = 61%). Disease activity levels differed substantially between “high GDP” and “low GDP” countries at much greater levels than according to whether patients were currently taking or not taking methotrexate, prednisone and/or biological agents. Conclusions: The clinical status of patients with RA was correlated significantly with GDP among 25 mostly European countries according to all disease measures, associated only modestly with the current use of antirheumatic medications. The burden of arthritis appears substantially greater in “low GDP” than in “high GDP” countries. These findings may alert healthcare professionals and designers of health policy towards improving the clinical status of patients with RA in all countries. PMID:19643759

  14. Update on Activities of CEOS Disaster Management Support Group

    NASA Astrophysics Data System (ADS)

    Wood, H. M.; Lauritson, L.

    The Committee on Earth Observation Satellites (CEOS) Disaster Management Support Group (DMSG) has supported natural and technological disaster management on a worldwide basis by fostering improved utilization of existing and planned Earth Observation (EO) satellite data. The DMSG has focused on developing and refining recommendations for the application of satellite data to selected hazard areas: drought, earthquake, fire, flood, ice, landslide, oil spill, and volcanic hazards. Particular emphasis was placed on working closely with space agencies, international and regional organizations, and commercial organizations on the implementation of these recommendations. The DMSG is in its last year, with its primary focus on documenting its work and migrating ongoing activities to other fora. With over 300 participants from more than 140 organizations, the DMSG has found strong support among CEOS space agencies and the Integrated Global Observing Strategy (IGOS), as well as an enthusiastic reception from numerous international, regional, and national emergency managers, and distinct interest from the commercial sector. In addition, the group has worked to give full support to the work of the United Nations Committee on the Peaceful Uses of Outer Space (COPUOS) in pursuit of decisions taken at UNISPACE III and the United Nations International Strategy on Disaster Reduction (ISDR). In conjunction with the IGOS, several of the DMSG hazard teams (earthquake, landslide, and solid Earth dimensions of volcanoes) are joining in the effort to develop an IGOS Geohazards theme team. Cooperation with organizations such as IGOS, COPUOS, and ISDR will, it is hoped, lead to other bodies picking up many of the ongoing DMSG activities. Since the inception of this ad hoc working group and its predecessor project, the DMSG has developed and refined recommendations for the application of satellite data by bringing together experts from eight hazard areas to identify user needs, as well as

  15. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods of heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones that are computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback for both adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions of the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
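
    The query-feedback idea can be pictured by fitting a least-squares line to observed (value, selectivity) pairs and refining the estimate as real executions report actual results. This is an illustrative sketch under invented names, not the authors' implementation.

```python
# Illustrative sketch of query feedback: each completed query reports
# the selectivity it actually observed, and an ordinary least-squares
# line over that feedback replaces a static, off-line estimate.
feedback = []  # accumulated (predicate_value, observed_selectivity) pairs

def record_feedback(value, selectivity):
    """Store the selectivity actually observed for a predicate value."""
    feedback.append((value, selectivity))

def estimate_selectivity(value):
    """Least-squares estimate over accumulated feedback, clamped to [0, 1]."""
    n = len(feedback)
    if n < 2:
        return 0.5  # uninformed prior before enough feedback exists
    sx = sum(v for v, _ in feedback)
    sy = sum(s for _, s in feedback)
    sxx = sum(v * v for v, _ in feedback)
    sxy = sum(v * s for v, s in feedback)
    denom = n * sxx - sx * sx
    if denom == 0:
        return sy / n  # all feedback at one value: fall back to the mean
    slope = (n * sxy - sx * sy) / denom
    intercept = (sy - slope * sx) / n
    return min(1.0, max(0.0, slope * value + intercept))

record_feedback(10, 0.1)
record_feedback(20, 0.2)
record_feedback(30, 0.3)
```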

  16. Clinical Genomic Database

    PubMed Central

    Solomon, Benjamin D.; Nguyen, Anh-Dao; Bear, Kelly A.; Wolfsberg, Tyra G.

    2013-01-01

    Technological advances have greatly increased the availability of human genomic sequencing. However, the capacity to analyze genomic data in a clinically meaningful way lags behind the ability to generate such data. To help address this obstacle, we reviewed all conditions with genetic causes and constructed the Clinical Genomic Database (CGD) (http://research.nhgri.nih.gov/CGD/), a searchable, freely Web-accessible database of conditions based on the clinical utility of genetic diagnosis and the availability of specific medical interventions. The CGD currently includes a total of 2,616 genes organized clinically by affected organ systems and interventions (including preventive measures, disease surveillance, and medical or surgical interventions) that could be reasonably warranted by the identification of pathogenic mutations. To aid independent analysis and optimize new data incorporation, the CGD also includes all genetic conditions for which genetic knowledge may affect the selection of supportive care, informed medical decision-making, prognostic considerations, reproductive decisions, and allow avoidance of unnecessary testing, but for which specific interventions are not otherwise currently available. For each entry, the CGD includes the gene symbol, conditions, allelic conditions, clinical categorization (for both manifestations and interventions), mode of inheritance, affected age group, description of interventions/rationale, links to other complementary databases, including databases of variants and presumed pathogenic mutations, and links to PubMed references (>20,000). The CGD will be regularly maintained and updated to keep pace with scientific discovery. Further content-based expert opinions are actively solicited. Eventually, the CGD may assist the rapid curation of individual genomes as part of active medical care. PMID:23696674

  17. A survey of paediatric HIV programmatic and clinical management practices in Asia and sub-Saharan Africa—the International epidemiologic Databases to Evaluate AIDS (IeDEA)

    PubMed Central

    2013-01-01

    Introduction There are limited data on paediatric HIV care and treatment programmes in low-resource settings. Methods A standardized survey was completed by International epidemiologic Databases to Evaluate AIDS paediatric cohort sites in the regions of Asia-Pacific (AP), Central Africa (CA), East Africa (EA), Southern Africa (SA) and West Africa (WA) to understand operational resource availability and paediatric management practices. Data were collected through January 2010 using a secure, web-based software program (REDCap). Results A total of 64,552 children were under care at 63 clinics (AP, N=10; CA, N=4; EA, N=29; SA, N=10; WA, N=10). Most were in urban settings (N=41, 65%) and received funding from governments (N=51, 81%), PEPFAR (N=34, 54%), and/or the Global Fund (N=15, 24%). The majority were combined adult–paediatric clinics (N=36, 57%). Prevention of mother-to-child transmission was integrated at 35 (56%) sites; 89% (N=56) had access to DNA PCR for infant diagnosis. African (N=40/53) but not Asian sites recommended exclusive breastfeeding up until 4–6 months. Regular laboratory monitoring included CD4 (N=60, 95%), and viral load (N=24, 38%). Although 42 (67%) sites had the ability to conduct acid-fast bacilli (AFB) smears, 23 (37%) sites could conduct AFB cultures and 18 (29%) sites could conduct tuberculosis drug susceptibility testing. Loss to follow-up was defined as >3 months of lost contact for 25 (40%) sites, >6 months for 27 sites (43%) and >12 months for 6 sites (10%). Telephone calls (N=52, 83%) and outreach worker home visits to trace children lost to follow-up (N=45, 71%) were common. Conclusions In general, there was a high level of patient and laboratory monitoring within this multiregional paediatric cohort consortium that will facilitate detailed observational research studies. Practices will continue to be monitored as the WHO/UNAIDS Treatment 2.0 framework is implemented. PMID:23336728

  18. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents the most complete set, to our knowledge, of security guidelines for the development and operation of medical database systems.

  19. Databases of the marine metagenomics.

    PubMed

    Mineta, Katsuhiko; Gojobori, Takashi

    2016-02-01

    The metagenomic data obtained from marine environments are highly useful for understanding marine microbial communities. In comparison with the conventional amplicon-based approach to metagenomics, the recent shotgun sequencing-based approach has become a powerful tool that provides an efficient way of grasping the diversity of an entire microbial community at a sampling point in the sea. However, this approach accelerates the accumulation of metagenome data as well as an increase in data complexity. Moreover, when the metagenomic approach is used to monitor changes in marine environments over time at multiple seawater locations, metagenomics data accumulate at an enormous speed. Because this situation has started to become a reality at many marine research institutions and stations all over the world, it is clear that data management and analysis will be confronted by so-called Big Data issues, such as how a database can be constructed efficiently and how useful knowledge should be extracted from a vast amount of data. In this review, we summarize all the major marine metagenome databases that are currently publicly available, noting that no database is devoted exclusively to marine metagenomes and that only six metagenome databases include marine metagenome data, an unexpectedly small number. We also extend our explanation to what we call reference databases, which will be useful both for constructing a marine metagenome database and for complementing it with important information. Finally, we point out a number of challenges to be conquered in constructing a marine metagenome database.

  20. An Evaluation of Database Solutions to Spatial Object Association

    SciTech Connect

    Kumar, V S; Kurc, T; Saltz, J; Abdulla, G M; Kohn, S; Matarazzo, C

    2008-06-24

    Object association is a common problem encountered in many applications. Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two datasets based on their positions in a common spatial coordinate system--one of the datasets may correspond to a catalog of objects observed over time in a multi-dimensional domain; the other dataset may consist of objects observed in a snapshot of the domain at a time point. The use of database management systems to solve the object association problem provides portability across different platforms and also greater flexibility. Increasing dataset sizes in today's applications, however, have made object association a data/compute-intensive problem that requires targeted optimizations for efficient execution. In this work, we investigate how database-based crossmatch algorithms can be deployed on different database system architectures and evaluate the deployments to understand the impact of architectural choices on crossmatch performance and associated trade-offs. We investigate the execution of two crossmatch algorithms on (1) a parallel database system with active disk style processing capabilities, (2) a high-throughput network database (MySQL Cluster), and (3) shared-nothing databases with replication. We have conducted our study in the context of a large-scale astronomy application with real use-case scenarios.
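
    The crossmatch operation the abstract describes can be sketched in a few lines. This is a hedged illustration, not the paper's database-based algorithms: a naive nested-loop match on invented catalogs, shown only to make the problem's semantics concrete.

```python
import math

def crossmatch(catalog_a, catalog_b, tolerance):
    """Naive spatial crossmatch: pair each object in catalog_a with
    objects in catalog_b that lie within `tolerance` in a shared
    coordinate system. Real systems push this join into the database
    with spatial indexes; this O(n*m) loop only shows the semantics."""
    matches = []
    for ida, (xa, ya) in catalog_a.items():
        for idb, (xb, yb) in catalog_b.items():
            if math.hypot(xa - xb, ya - yb) <= tolerance:
                matches.append((ida, idb))
    return matches

# Hypothetical catalogs: a time-series catalog and a snapshot.
catalog_a = {"a1": (1.0, 1.0), "a2": (5.0, 5.0)}
catalog_b = {"b1": (1.1, 0.9), "b2": (9.0, 9.0)}
print(crossmatch(catalog_a, catalog_b, 0.5))  # [('a1', 'b1')]
```

    In a database deployment this loop becomes a spatial join, which is exactly where the architectural choices the paper evaluates (parallel, clustered, shared-nothing) come into play.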

  1. FLOPROS: an evolving global database of flood protection standards

    NASA Astrophysics Data System (ADS)

    Scussolini, Paolo; Aerts, Jeroen C. J. H.; Jongman, Brenden; Bouwer, Laurens M.; Winsemius, Hessel C.; de Moel, Hans; Ward, Philip J.

    2016-05-01

    With projected changes in climate, population and socioeconomic activity located in flood-prone areas, the global assessment of flood risk is essential to inform climate change policy and disaster risk management. Whilst global flood risk models exist for this purpose, the accuracy of their results is greatly limited by the lack of information on the current standard of protection to floods, with studies either neglecting this aspect or resorting to crude assumptions. Here we present a first global database of FLOod PROtection Standards, FLOPROS, which comprises information in the form of the flood return period associated with protection measures, at different spatial scales. FLOPROS comprises three layers of information, and combines them into one consistent database. The design layer contains empirical information about the actual standard of existing protection already in place; the policy layer contains information on protection standards from policy regulations; and the model layer uses a validated modelling approach to calculate protection standards. The policy layer and the model layer can be considered adequate proxies for actual protection standards included in the design layer, and serve to increase the spatial coverage of the database. Based on this first version of FLOPROS, we suggest a number of strategies to further extend and increase the resolution of the database. Moreover, as the database is intended to be continually updated, while flood protection standards are changing with new interventions, FLOPROS requires input from the flood risk community. We therefore invite researchers and practitioners to contribute information to this evolving database by corresponding with the authors.

  2. Nuclear Concrete Materials Database Phase I Development

    SciTech Connect

    Ren, Weiju; Naus, Dan J

    2012-05-01

    The FY 2011 accomplishments in Phase I development of the Nuclear Concrete Materials Database to support the Light Water Reactor Sustainability Program are summarized. The database has been developed using the ORNL materials database infrastructure established for the Gen IV Materials Handbook to achieve cost reduction and development efficiency. In this Phase I development, the database has been successfully designed and constructed to manage documents in the Portable Document Format generated from the Structural Materials Handbook that contains nuclear concrete materials data and related information. The completion of the Phase I database has established a solid foundation for Phase II development, in which a digital database will be designed and constructed to manage nuclear concrete materials data in various digitized formats to facilitate electronic and mathematical processing for analysis, modeling, and design applications.

  3. Heterogeneous distributed databases: A case study

    NASA Technical Reports Server (NTRS)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX11/780 and has been implemented using VAX DBMS, a CODASYL based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy. Different customer bases are supported by each database. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  4. A real-time framework for fast data retrieval in an image database of volcano activity scenarios

    NASA Astrophysics Data System (ADS)

    Aliotta, Marco Antonio; Cannata, Andrea; Cassisi, Carmelo; Ciancitto, Francesco; Montalto, Placido; Prestifilippo, Michele

    2015-04-01

    Explosive activity at Stromboli Volcano (Aeolian Islands) is continuously monitored by INGV-OE in order to analyze its eruptive dynamics and specific scenarios. In particular, the images acquired from thermal cameras represent a big collection of data. In order to extract useful information from thermal image sequences, we need an efficient way to explore and retrieve information from a huge amount of data. In this work, a novel framework capable of fast data retrieval, using the "metric space" concept, is shown. In the light of it, we implemented an indexing algorithm related to similarity laws. The focal point is finding objects of a set that are "close" in relation to a given query, according to a similarity criterion. To perform this task, we applied morphological image processing techniques to each video frame in order to map the shape area of each explosion into a closed curve, representing the explosion contour itself. To constitute a metric space, we chose a certain number of features obtained from parameters related to this closed curve and used them as objects of this metric space where similarity can be evaluated, using an appropriate "metric" function to calculate the distances. Unfortunately, this approach has to deal with an intrinsic issue involving the complexity and the number of distance functions to be calculated on a large amount of data. To overcome this drawback, we used a novel abstract data structure called "K-Pole Tree", having the property of minimizing the number of distances to be calculated among objects. Our method allows for fast retrieval of similar objects using a Euclidean distance function among the features of the metric space. Thus, we can cluster explosions related to different kinds of volcanic activity, using "pivot" items. For example, given a known image sequence related to a particular type of explosion, it is possible to quickly and easily find all the image sequences that contain only similar
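
    The metric-space retrieval idea above can be illustrated with a generic pivot-based range query. This sketch is not the paper's K-Pole Tree; it only shows the underlying principle that structure exploits: precomputed distances to a pivot let the triangle inequality discard candidates without computing their distance to the query. The feature vectors are invented.

```python
import math

def dist(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def pivot_range_query(objects, pivot, query, radius):
    """Generic pivot-based range search in a metric space (not the
    paper's K-Pole Tree). By the triangle inequality, if
    |d(q,p) - d(o,p)| > r then d(q,o) > r, so the object can be
    discarded without evaluating dist(q, o)."""
    d_qp = dist(query, pivot)
    results = []
    for obj in objects:
        d_op = dist(obj, pivot)
        if abs(d_qp - d_op) > radius:
            continue  # pruned by the triangle inequality
        if dist(query, obj) <= radius:
            results.append(obj)
    return results

# Hypothetical 2-feature descriptors of explosion contours.
explosions = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8)]
print(pivot_range_query(explosions, pivot=(0.0, 0.0),
                        query=(0.12, 0.22), radius=0.1))
```

    Tree structures such as the K-Pole Tree organize many pivots hierarchically so that whole subtrees, not single objects, are pruned at once.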

  5. Web-Based Self-Management in Chronic Care: A Study of Change in Patient Activation

    ERIC Educational Resources Information Center

    Solomon, Michael R.

    2010-01-01

    Web-based self-management interventions (W-SMIs) are designed to help a large number of chronically ill people become more actively engaged in their health care. Despite the potential to engage more patients in self-managing their health, the use of W-SMIs by patients and their clinicians is low. Using a self-management conceptual model based on…

  6. 77 FR 45698 - Agency Information Collection Activities: Submission for the Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ... COMMISSION Agency Information Collection Activities: Submission for the Office of Management and Budget Review; Comment Request AGENCY: Nuclear Regulatory Commission. ACTION: Notice of the Office of Management.... Nuclear Regulatory Commission (NRC) has recently submitted to the Office of Management and Budget...

  7. The Astrobiology Habitable Environments Database (AHED)

    NASA Astrophysics Data System (ADS)

    Lafuente, B.; Stone, N.; Downs, R. T.; Blake, D. F.; Bristow, T.; Fonda, M.; Pires, A.

    2015-12-01

    The Astrobiology Habitable Environments Database (AHED) is a central, high quality, long-term searchable repository for archiving and collaborative sharing of astrobiologically relevant data, including morphological, textural and contextual images, chemical, biochemical, isotopic, sequencing, and mineralogical information. The aim of AHED is to foster long-term innovative research by supporting integration and analysis of diverse datasets in order to: 1) help understand and interpret planetary geology; 2) identify and characterize habitable environments and pre-biotic/biotic processes; 3) interpret returned data from present and past missions; 4) provide a citable database of NASA-funded published and unpublished data (after an agreed-upon embargo period). AHED uses the online open-source software "The Open Data Repository's Data Publisher" (ODR - http://www.opendatarepository.org) [1], which provides a user-friendly interface that research teams or individual scientists can use to design, populate and manage their own database according to the characteristics of their data and the need to share data with collaborators or the broader scientific community. This platform can also be used as a laboratory notebook. The database will have the capability to import and export in a variety of standard formats. Advanced graphics will be implemented including 3D graphing, multi-axis graphs, error bars, and similar scientific data functions together with advanced online tools for data analysis (e.g. the statistical package R). A permissions system will be put in place so that as data are being actively collected and interpreted, they will remain proprietary. A citation system will allow research data to be used and appropriately referenced by other researchers after the data are made public. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, Mars Science Laboratory Investigations. [1] Nate et al. (2015) AGU, submitted.

  8. Chronic pain management in the active-duty military

    NASA Astrophysics Data System (ADS)

    Jamison, David; Cohen, Steven P.

    2012-06-01

    As in the general population, chronic pain is a prevalent and burdensome affliction in active-duty military personnel. Painful conditions in military members can be categorized broadly in terms of whether they arise directly from combat injuries (gunshot, fragmentation wound, blast impact) or whether they result from non-combat injuries (sprains, herniated discs, motor vehicle accidents). Both combat-related and non-combat-related causes of pain can further be classified as either acute or chronic. Here we discuss the state of pain management as it relates to the military population in both deployed and non-deployed settings. The term non-battle injury (NBI) is commonly used to refer to those conditions not directly associated with the combat actions of war. In the history of warfare, NBI have far outstripped battle-related injuries in terms not only of morbidity, but also mortality. It was not until improvements in health care and field medicine were applied in World War I that battle-related deaths finally outnumbered those attributed to disease and pestilence. However, NBI have been the leading cause of morbidity and hospital admission in every major conflict since the Korean War. Pain remains a leading cause of presentation to military medical facilities, both in and out of theater. The absence of pain services is associated with a low return-to-duty rate among the deployed population. The most common pain complaints involve the low-back and neck, and studies have suggested that earlier treatment is associated with more significant improvement and a higher return-to-duty rate. It is recognized that military medicine is often at the forefront of medical innovation, and that many fields of medicine have reaped benefit from the conduct of war.

  9. Fostering Intuition in Management Education: Activities and Resources

    ERIC Educational Resources Information Center

    Sadler-Smith, Eugene; Burke, Lisa A.

    2009-01-01

    In business, there is little doubt that managers use their intuitions when making decisions. But in spite of the fact that intuition and rationality are two parallel systems of knowing, intuition is often considered the antithesis of rationality and is overlooked, disregarded, or acted on covertly by managers. What is also clear is that intuition…

  10. Education & Recycling: Educator's Waste Management Resource and Activity Guide 1994.

    ERIC Educational Resources Information Center

    California State Dept. of Conservation. Sacramento. Div. of Recycling.

    This activity guide for grades K-12 reinforces the concepts of recycling, reducing, and reusing through a series of youth-oriented activities. The guide incorporates a video-based activity, multiple session classroom activities, and activities requiring group participation and student conducted research. Constructivist learning theory was…

  11. HLLV avionics requirements study and electronic filing system database development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This final report provides a summary of achievements and activities performed under Contract NAS8-39215. The contract's objective was to explore a new way of delivering, storing, accessing, and archiving study products and information and to define top level system requirements for Heavy Lift Launch Vehicle (HLLV) avionics that incorporate Vehicle Health Management (VHM). This report includes technical objectives, methods, assumptions, recommendations, sample data, and issues as specified by DPD No. 772, DR-3. The report is organized into two major subsections, one specific to each of the two tasks defined in the Statement of Work: the Index Database Task and the HLLV Avionics Requirements Task. The Index Database Task resulted in the selection and modification of a commercial database software tool to contain the data developed during the HLLV Avionics Requirements Task. All summary information is addressed within each task's section.

  12. Database Reports Over the Internet

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data is stored in a DBMS (Database Management System). The client asks for the information from the database using an HTML (Hyper Text Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page, others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (Personal Computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
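
    The query step described above can be sketched with standard SQL. This is a minimal illustration, not the project's Java servlet code: the `test_reports` table and its columns are invented, and SQLite stands in for Microsoft Access, but the parameterized SELECT is the same kind of statement a servlet would issue before filling the PDF form template.

```python
import sqlite3

# Build a throwaway in-memory database with a hypothetical
# test-report table standing in for the Access database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_reports (id INTEGER PRIMARY KEY, "
             "test_name TEXT, result TEXT)")
conn.executemany("INSERT INTO test_reports (test_name, result) VALUES (?, ?)",
                 [("vibration", "pass"), ("thermal", "fail")])

# A servlet would run a parameterized query like this, then merge
# each row into the PDF form template before returning it.
rows = conn.execute("SELECT test_name, result FROM test_reports "
                    "WHERE result = ?", ("pass",)).fetchall()
print(rows)  # [('vibration', 'pass')]
```

    Parameterized placeholders (`?`) keep user-entered form values out of the SQL text itself, which matters when the query terms come from a web form as they do here.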

  13. Evaluation of community-based nurse case management activities for symptomatic HIV/AIDS clients.

    PubMed

    Wright, J; Henry, S B; Holzemer, W L; Falknor, P

    1993-01-01

    The purpose of this study was to evaluate case management activities performed by nurse case managers in the California Pilot Care and Waiver Projects for HIV/AIDS patients. Nurse case managers, social workers, and site directors completed a 62-item survey. Significant differences appeared in ratings among the groups on five items. The nurse case managers responding to the survey indicated that a wide variety of nursing skills are used to provide case management services to persons living with AIDS and AIDS-related complex. This survey validates the interdisciplinary case management model in a community-based HIV population.

  14. A UNIMARC Bibliographic Format Database for ABCD

    ERIC Educational Resources Information Center

    Megnigbeto, Eustache

    2012-01-01

    Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD.…

  15. Educational Use of Databases in CALL

    ERIC Educational Resources Information Center

    Beaudoin, Martin

    2004-01-01

    This article presents the idea that databases are very useful tools for teaching languages over the Internet. Databases in Computer Assisted Language Learning (CALL) are commonly used in three ways: for reference sources such as dictionaries, in the management of large websites, and for data processing such as language tests and learners'…

  16. Using activity-based management to control costs & achieve organization goals.

    PubMed

    Antos, J; Elwell, D

    1998-08-01

    Activity-based management (ABM) is a management process that focuses on improving costs and outcomes. It derives useful information based on the way people think (their activities) rather than traditional expense categories. ABM supports outcomes, quality, teams, re-engineering, empowerment, and continuous improvement. It is a process that providers may want to adopt in light of new Medicare reimbursement practices.
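
    The core accounting mechanism behind ABM, activity-based costing, is easy to show in miniature. This sketch is a hedged illustration with invented activities, costs, and driver volumes, not an example from the article: overhead is assigned to activities and then traced to services via cost drivers, rather than spread across traditional expense categories.

```python
# Hypothetical activity pools and cost-driver volumes.
activity_costs = {"patient_intake": 50000.0, "lab_processing": 120000.0}
driver_volumes = {"patient_intake": 2500, "lab_processing": 8000}

def cost_per_driver_unit(activity):
    """Activity rate = total activity cost / total driver volume."""
    return activity_costs[activity] / driver_volumes[activity]

def cost_of_service(usage):
    """Cost a service by the driver units it consumes per activity,
    summing rate * units across every activity it touches."""
    return sum(cost_per_driver_unit(a) * n for a, n in usage.items())

# A service consuming 2 intake units and 3 lab units:
print(cost_of_service({"patient_intake": 2, "lab_processing": 3}))  # 85.0
```

    Because each service's cost is built from the activities it actually consumes, managers can see which activities drive cost and target them for improvement, which is the point of ABM.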

  17. Databases: Beyond the Basics.

    ERIC Educational Resources Information Center

    Whittaker, Robert

    This presented paper offers an elementary description of database characteristics and then provides a survey of databases that may be useful to the teacher and researcher in Slavic and East European languages and literatures. The survey focuses on commercial databases that are available, usable, and needed. Individual databases discussed include:…

  18. The Genomes OnLine Database (GOLD) v.5: a metadata management system based on a four level (meta)genome project classification

    SciTech Connect

    Reddy, Tatiparthi B. K.; Thomas, Alex D.; Stamatis, Dimitri; Bertsch, Jon; Isbandi, Michelle; Jansson, Jakob; Mallajosyula, Jyothi; Pagani, Ioanna; Lobos, Elizabeth A.; Kyrpides, Nikos C.

    2014-10-27

    The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Within this paper, we report version 5 (v.5) of the database. The newly designed database schema and web user interface supports several new features including the implementation of a four level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. Lastly, GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards.

  19. The Genomes OnLine Database (GOLD) v.5: a metadata management system based on a four level (meta)genome project classification

    PubMed Central

    Reddy, T.B.K.; Thomas, Alex D.; Stamatis, Dimitri; Bertsch, Jon; Isbandi, Michelle; Jansson, Jakob; Mallajosyula, Jyothi; Pagani, Ioanna; Lobos, Elizabeth A.; Kyrpides, Nikos C.

    2015-01-01

    The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Here we report version 5 (v.5) of the database. The newly designed database schema and web user interface supports several new features including the implementation of a four level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards. PMID:25348402

  20. Human Mitochondrial Protein Database

    National Institute of Standards and Technology Data Gateway

    SRD 131 Human Mitochondrial Protein Database (Web, free access)   The Human Mitochondrial Protein Database (HMPDb) provides comprehensive data on mitochondrial and human nuclear encoded proteins involved in mitochondrial biogenesis and function. This database consolidates information from SwissProt, LocusLink, Protein Data Bank (PDB), GenBank, Genome Database (GDB), Online Mendelian Inheritance in Man (OMIM), Human Mitochondrial Genome Database (mtDB), MITOMAP, Neuromuscular Disease Center and Human 2-D PAGE Databases. This database is intended as a tool to aid not only in studying the mitochondrion but also in studying the associated diseases.