Science.gov

Sample records for active database management

  1. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  2. TWRS technical baseline database manager definition document

    SciTech Connect

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  3. Interconnecting heterogeneous database management systems

    NASA Technical Reports Server (NTRS)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  4. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  5. Database Design for Preservation Project Management: The California Newspaper Project.

    ERIC Educational Resources Information Center

    Hayman, Lynne M.

    1997-01-01

    Describes a database designed to manage a serials preservation project in which issues from multiple repositories are gathered and collated for preservation microfilming. Management information, added to bibliographic and holdings records, supports the production of reports tracking preservation activity. (Author)

  6. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  7. Construction of file database management

    SciTech Connect

    MERRILL,KYLE J.

    2000-03-01

    This work created a database for tracking data analysis files from multiple lab techniques and equipment stored on a central file server. Experimental details appropriate for each file type are pulled from the file header and stored in a searchable database. The database also stores specific location and self-directory structure for each data file. Queries can be run on the database according to file type, sample type or other experimental parameters. The database was constructed in Microsoft Access and Visual Basic was used for extraction of information from the file header.
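
    The record above describes a common pattern: header metadata is extracted from instrument data files and stored in a searchable table keyed by file location. The original was built in Microsoft Access with Visual Basic; the following is a minimal sketch of the same idea using Python's standard-library sqlite3, with hypothetical file types and header fields.

      import sqlite3

      # In-memory stand-in for the central metadata database; the real system
      # used Microsoft Access with Visual Basic header-extraction routines.
      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE data_files (
              path        TEXT PRIMARY KEY,  -- location on the central file server
              file_type   TEXT,              -- e.g. 'XRD', 'SEM' (hypothetical techniques)
              sample_type TEXT,
              acquired    TEXT               -- acquisition date read from the header
          )""")

      def register_file(path, header):
          """Store the location and header-derived details of one data file."""
          conn.execute("INSERT OR REPLACE INTO data_files VALUES (?, ?, ?, ?)",
                       (path, header.get("file_type"), header.get("sample_type"),
                        header.get("acquired")))

      # Query by file type or sample type, as described in the abstract.
      register_file("/server/run42.xrd", {"file_type": "XRD", "sample_type": "powder",
                                          "acquired": "2000-02-14"})
      for row in conn.execute("SELECT path FROM data_files WHERE file_type = ?", ("XRD",)):
          print(row[0])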

  8. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  9. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  10. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  11. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  12. Central Asia Active Fault Database

    NASA Astrophysics Data System (ADS)

    Mohadjer, Solmaz; Ehlers, Todd A.; Kakar, Najibullah

    2014-05-01

    The ongoing collision of the Indian subcontinent with Asia controls active tectonics and seismicity in Central Asia. This motion is accommodated by faults that have historically caused devastating earthquakes and continue to pose serious threats to the population at risk. Despite international and regional efforts to assess seismic hazards in Central Asia, little attention has been given to development of a comprehensive database for active faults in the region. To address this issue and to better understand the distribution and level of seismic hazard in Central Asia, we are developing a publicly available database for active faults of Central Asia (including but not limited to Afghanistan, Tajikistan, Kyrgyzstan, northern Pakistan and western China) using ArcGIS. The database is designed to allow users to store, map and query important fault parameters such as fault location, displacement history, rate of movement, and other data relevant to seismic hazard studies including fault trench locations, geochronology constraints, and seismic studies. Data sources integrated into the database include previously published maps and scientific investigations as well as strain rate measurements and historic and recent seismicity. In addition, high resolution Quickbird, Spot, and Aster imagery are used for selected features to locate and measure offset of landforms associated with Quaternary faulting. These features are individually digitized and linked to attribute tables that provide a description for each feature. Preliminary observations include inconsistent and sometimes inaccurate information for faults documented in different studies. For example, the Darvaz-Karakul fault, which roughly defines the western margin of the Pamir, has been mapped with differences in location of up to 12 kilometers. The sense of motion for this fault ranges from unknown to thrust and strike-slip in three different studies despite documented left-lateral displacements of Holocene and late
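
    As a rough illustration of the attribute tables and queries described above (fault parameters such as location, sense of motion and slip rate, linked to digitized features), here is a minimal sketch in Python's sqlite3. The real database is built in ArcGIS; the field names and example rows below are hypothetical, chosen to echo the abstract's point that different studies can disagree about the same fault.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE faults (
              name            TEXT,  -- fault name as digitized from imagery or maps
              country         TEXT,
              sense_of_motion TEXT,  -- e.g. 'thrust', 'strike-slip', 'unknown'
              slip_rate_mm_yr REAL,  -- rate of movement, where constrained
              source          TEXT   -- published map or study the trace was taken from
          )""")

      # Hypothetical rows: the abstract notes that different studies can report
      # different locations and senses of motion for the same fault.
      conn.executemany("INSERT INTO faults VALUES (?, ?, ?, ?, ?)", [
          ("Darvaz-Karakul", "Tajikistan", "strike-slip", 10.0, "Study A"),
          ("Darvaz-Karakul", "Tajikistan", "thrust",      None, "Study B"),
      ])

      # Query pattern for hazard work: all published interpretations of one fault.
      for row in conn.execute(
              "SELECT source, sense_of_motion, slip_rate_mm_yr FROM faults WHERE name = ?",
              ("Darvaz-Karakul",)):
          print(row)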

  13. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
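
    The cross-referencing step described above (orbital characteristics keyed by the NORAD catalog number and joined with ownership and status metadata) can be pictured with a small sketch; the tables, field names and sample values below are hypothetical stand-ins for the Access implementation of SAM-D.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          -- Orbital characteristics derived from two-line element (TLE) sets.
          CREATE TABLE orbits (norad_id INTEGER PRIMARY KEY,
                               inclination_deg REAL, period_min REAL);
          -- Descriptive metadata about the same objects, keyed by the same number.
          CREATE TABLE metadata (norad_id INTEGER PRIMARY KEY,
                                 status TEXT, country TEXT, category TEXT);
      """)
      conn.execute("INSERT INTO orbits VALUES (25544, 51.6, 92.9)")  # illustrative values
      conn.execute("INSERT INTO metadata VALUES (25544, 'active', 'multinational', 'station')")

      # The NORAD number is the universal join key between technical data and
      # metadata, which is the core of SAM-D's cross-referencing.
      print(conn.execute("""
          SELECT o.norad_id, o.inclination_deg, o.period_min, m.status, m.country
          FROM orbits o JOIN metadata m USING (norad_id)""").fetchone())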

  14. The land management and operations database (LMOD)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for land management and operations reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  15. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a lethal threat and a research focus of network information security. New viruses continue to emerge, the number of viruses keeps growing, and virus classification is becoming increasingly complex. Because agencies capture samples at different times, virus naming cannot be unified. Although each agency has its own virus database, communication between agencies is lacking, virus information is often incomplete, and sample information may be sparse. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and completely describe virus characteristics, and then presents a computer virus database design scheme that provides information integrity, storage security and manageability.
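
    One way to picture the naming problem described above is a schema that keeps agency-assigned names as aliases of a single virus record. The sketch below, in Python's sqlite3, is purely illustrative; the table and field names are assumptions, not the design proposed in the paper.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          -- One row per virus; names assigned by different agencies are kept as
          -- aliases so the records can be reconciled later.
          CREATE TABLE virus (id INTEGER PRIMARY KEY, family TEXT, first_captured TEXT);
          CREATE TABLE virus_alias (virus_id INTEGER REFERENCES virus(id),
                                    agency TEXT, name TEXT);
          CREATE TABLE sample (virus_id INTEGER REFERENCES virus(id),
                               sha256 TEXT, size_bytes INTEGER);
      """)
      conn.execute("INSERT INTO virus VALUES (1, 'worm', '2011-03-01')")
      conn.executemany("INSERT INTO virus_alias VALUES (1, ?, ?)",
                       [("Agency A", "W32.Example.A"), ("Agency B", "Worm.Example.gen")])

      # All names that different agencies have assigned to the same virus.
      print(conn.execute("SELECT agency, name FROM virus_alias WHERE virus_id = 1").fetchall())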

  16. Personal Database Management System I TRIAS

    NASA Astrophysics Data System (ADS)

    Yamamoto, Yoneo; Kashihara, Akihiro; Kawagishi, Keisuke

    The current paper provides TRIAS (TRIple Associative System), a database management system for personal use. To implement TRIAS, we have developed an associative database whose format is (e, a, v): e for entity, a for attribute, v for value. ML-TREE is used to construct the (e, a, v) store; ML-TREE is a variant of the B+-tree, a multiway balanced tree. The paper focuses mainly on the usage of the associative database, demonstrating how to use its basic commands, primary functions and applications.
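
    A toy version of the (e, a, v) associative format described above can be written in a few lines of Python. TRIAS indexes its triples with ML-TREE; the sketch below scans a plain in-memory list instead, and the entity and attribute names are illustrative only.

      # Minimal (entity, attribute, value) store illustrating the associative format.
      class TripleStore:
          def __init__(self):
              self.triples = []  # plain list; TRIAS uses an ML-TREE index instead

          def add(self, e, a, v):
              self.triples.append((e, a, v))

          def query(self, e=None, a=None, v=None):
              """Return triples matching the given components (None acts as a wildcard)."""
              return [t for t in self.triples
                      if (e is None or t[0] == e)
                      and (a is None or t[1] == a)
                      and (v is None or t[2] == v)]

      db = TripleStore()
      db.add("note:1", "title", "meeting minutes")
      db.add("note:1", "author", "Yamamoto")
      db.add("note:2", "author", "Yamamoto")
      print(db.query(e="note:1"))                 # every attribute of one entity
      print(db.query(a="author", v="Yamamoto"))   # every entity with a given value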

  17. Organizing a breast cancer database: data management.

    PubMed

    Yi, Min; Hunt, Kelly K

    2016-06-01

    Developing and organizing a breast cancer database can provide data and serve as valuable research tools for those interested in the etiology, diagnosis, and treatment of cancer. Depending on the research setting, the quality of the data can be a major issue. Assuring that the data collection process does not contribute inaccuracies can help to assure the overall quality of subsequent analyses. Data management is work that involves the planning, development, implementation, and administration of systems for the acquisition, storage, and retrieval of data while protecting it by implementing high security levels. A properly designed database provides you with access to up-to-date, accurate information. Database design is an important component of application design. If you take the time to design your databases properly, you'll be rewarded with a solid application foundation on which you can build the rest of your application.

  18. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  19. A Database Management System for Interlibrary Loan.

    ERIC Educational Resources Information Center

    Chang, Amy

    1990-01-01

    Discusses the increasing complexity of dealing with interlibrary loan requests and describes a database management system for interlibrary loans used at Texas Tech University. System functions are described, including file control, records maintenance, and report generation, and the impact on staff productivity is discussed. (CLB)

  20. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  1. Producing an Index with Your Microcomputer Database Manager.

    ERIC Educational Resources Information Center

    Jonassen, David

    1985-01-01

    Describes a procedure for using commonly available database management systems to produce indexes on microcomputers. Production steps discussed include creation of the database, data entry, database sort, formatting, and editing. (Author/MBR)

  2. How Should We Manage All Those Databases?

    SciTech Connect

    Langley, K E

    1998-10-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  3. How Should we Manage all These Databases?

    SciTech Connect

    Langley, K.E.

    1998-11-01

    In an organization where there are many DBAs working with many instances and databases on many machines with many developers - how do you manage all of this without total chaos? This paper will outline how the central Database Support organization at Lockheed Martin Energy Systems in Oak Ridge, TN manages more than 250 instances on more than 90 systems with a variety of operating systems. This discussion will include how tasks and responsibilities are divided between System DBAs, Application Project DBAs, and developers. The use of standards as well as local routines to maintain the systems will be discussed. Information on the type of communications used to keep the different groups informed and up-to-date will also be presented.

  4. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
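
    The structured event logging and publish-to-subscribers behavior described above can be sketched compactly. The real SMDB relies on Oracle triggers, queuing and JMS; the sketch below uses Python's sqlite3 and an in-process queue as stand-ins, and the event fields shown are assumptions.

      import json
      import queue
      import sqlite3

      conn = sqlite3.connect(":memory:")
      # Structured classification of events, loosely following the abstract's
      # description of SMDB's logging tables.
      conn.execute("""
          CREATE TABLE event_log (
              id        INTEGER PRIMARY KEY AUTOINCREMENT,
              source    TEXT,   -- application or service that raised the event
              severity  TEXT,   -- e.g. 'INFO', 'WARNING', 'ERROR'
              message   TEXT,
              logged_at TEXT DEFAULT CURRENT_TIMESTAMP
          )""")

      subscribers = queue.Queue()  # stands in for the JMS topic used by the real system

      def log_event(source, severity, message):
          """Persist an event and publish it to subscribed monitoring applications."""
          conn.execute("INSERT INTO event_log (source, severity, message) VALUES (?, ?, ?)",
                       (source, severity, message))
          subscribers.put(json.dumps({"source": source, "severity": severity,
                                      "message": message}))

      log_event("scheduler", "INFO", "antenna configuration prepared")
      print(conn.execute("SELECT source, severity, message FROM event_log").fetchall())
      print(subscribers.get_nowait())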

  5. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  6. The fundamentals of object-oriented database management systems.

    PubMed

    Plateau, D

    1993-01-01

    The purpose of this document is to characterize the two technologies (database and object-oriented technologies) which constitute the foundation of object-oriented database management systems. The O2 Object-Oriented DataBase Management System is then described as an example of this type of system.

  7. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  8. Integrated Space Asset Management Database and Modeling

    NASA Astrophysics Data System (ADS)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interfaces for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off-the-shelf information sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  9. Resident database interfaces to the DAVID system, a heterogeneous distributed database management system

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha

    1988-01-01

    A methodology for building interfaces of resident database management systems to a heterogeneous distributed database management system under development at NASA, the DAVID system, was developed. The feasibility of that methodology was demonstrated by construction of the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented. The work performed and the results are summarized.

  10. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to developing GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
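
    The abstract notes that SQL transactions and data logs are used to keep the KFD consistent and recoverable. A minimal sketch of that pattern, using Python's sqlite3 and a hypothetical feature schema (not the actual KFD tables), follows.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE karst_features (id INTEGER PRIMARY KEY, type TEXT,
                                       county TEXT, lat REAL, lon REAL);
          -- Data log kept for consistency and recovery, as the abstract describes.
          CREATE TABLE data_log (feature_id INTEGER, action TEXT,
                                 logged_at TEXT DEFAULT CURRENT_TIMESTAMP);
      """)

      def add_feature(feature_type, county, lat, lon):
          """Insert a feature and its log entry in one transaction so the two
          tables never fall out of step."""
          with conn:  # commits on success, rolls back if any statement fails
              cur = conn.execute(
                  "INSERT INTO karst_features (type, county, lat, lon) VALUES (?, ?, ?, ?)",
                  (feature_type, county, lat, lon))
              conn.execute("INSERT INTO data_log (feature_id, action) VALUES (?, 'insert')",
                           (cur.lastrowid,))

      add_feature("sinkhole", "Winona", 43.98, -91.64)  # hypothetical example row
      print(conn.execute("SELECT * FROM karst_features").fetchall())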

  11. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
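
    A minimal sketch of the task-record-plus-report idea described above, using Python's sqlite3 and CSV export; the table layout, field names and sample tasks are hypothetical, and the deployed system was a web application rather than a script.

      import csv
      import io
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE tasks (               -- hypothetical subset of the fields described
              id INTEGER PRIMARY KEY, title TEXT, team TEXT,
              requirement TEXT,              -- the requirement the task is linked to
              due TEXT, status TEXT)""")
      conn.executemany(
          "INSERT INTO tasks (title, team, requirement, due, status) VALUES (?, ?, ?, ?, ?)",
          [("Fill TBD on lander mass", "Structures", "REQ-101", "2007-06-01", "open"),
           ("Ascent animation",        "M&S",        "REQ-245", "2007-07-15", "done")])

      def export_filtered(status, stream):
          """Write one filtered report (data filter + spreadsheet export) as CSV."""
          writer = csv.writer(stream)
          writer.writerow(["id", "title", "team", "requirement", "due"])
          writer.writerows(conn.execute(
              "SELECT id, title, team, requirement, due FROM tasks WHERE status = ?",
              (status,)))

      buf = io.StringIO()
      export_filtered("open", buf)
      print(buf.getvalue())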

  12. CURRENT STATUS OF THE IAEA'S NET ENABLED WASTE MANAGEMENT DATABASE

    SciTech Connect

    Csullog, G.W.; Pozdniakov, I.; Bellag, M.J.

    2003-02-27

    The International Atomic Energy Agency's Net Enabled Waste Management Database (NEWMDB) contains information on national radioactive waste management programs and organizations, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. The NEWMDB, which was launched on the Internet July 6, 2001, is the successor to the Agency's Waste Management Database (WMDB), which was in use during the 1990s. The NEWMDB's first data collection cycle took place from July 2001 to March 2002. Agency Member State participation in the first data collection cycle was low--only 22 submissions were received. However, the first data collection cycle demonstrated that: the NEWMDB could be used to collect information on national radioactive waste management programs and radioactive waste inventories annually; the NEWMDB data can support the routine reporting of status and trends in radioactive waste management based on quantitative data; the NEWMDB can support the compilation of a consolidated, international radioactive waste inventory based on a unified waste classification scheme; the data needed to compute an indicator of sustainable development for radioactive waste management are available at the national level; NEWMDB data can be used to assess the development and implementation of national systems for radioactive waste management; and the NEWMDB can support the reporting requirements of the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management. Agency Member States that had not made data submissions in the first cycle were asked to submit data during an extension of the first cycle (July 2002--January 2003). When this paper was written, the Agency had conducted two of three international workshops to provide training for future NEWMDB data collection cycles and to compile lessons learned from the first data collection cycle. A third workshop was scheduled for January 2003. This paper provides

  13. Teaching Tip: Active Learning via a Sample Database: The Case of Microsoft's Adventure Works

    ERIC Educational Resources Information Center

    Mitri, Michel

    2015-01-01

    This paper describes the use and benefits of Microsoft's Adventure Works (AW) database to teach advanced database skills in a hands-on, realistic environment. Database management and querying skills are a key element of a robust information systems curriculum, and active learning is an important way to develop these skills. To facilitate active…

  14. EADB: An Estrogenic Activity Database for Assessing ...

    EPA Pesticide Factsheets

    Endocrine-active chemicals can potentially have adverse effects on both humans and wildlife. They can interfere with the body’s endocrine system through direct or indirect interactions with many protein targets. Estrogen receptors (ERs) are one of the major targets, and many endocrine disruptors are estrogenic and affect the normal estrogen signaling pathways. However, ERs can also serve as therapeutic targets for various medical conditions, such as menopausal symptoms, osteoporosis, and ER-positive breast cancer. Because of the decades-long interest in the safety and therapeutic utility of estrogenic chemicals, a large number of chemicals have been assayed for estrogenic activity, but these data exist in various sources and different formats that restrict the ability of regulatory and industry scientists to utilize them fully for assessing risk-benefit. To address this issue, we have developed an Estrogenic Activity Database (EADB; http://www.fda.gov/ScienceResearch/BioinformaticsTools/EstrogenicActivityDatabaseEADB/default.htm) and made it freely available to the public. EADB contains 18,114 estrogenic activity data points collected for 8212 chemicals tested in 1284 binding, reporter gene, cell proliferation, and in vivo assays in 11 different species. The chemicals cover a broad chemical structure space and the data span a wide range of activities. A set of tools allow users to access EADB and evaluate potential endocrine activity of

  15. An authoritative global database for active submarine hydrothermal vent fields

    NASA Astrophysics Data System (ADS)

    Beaulieu, Stace E.; Baker, Edward T.; German, Christopher R.; Maffei, Andrew

    2013-11-01

    The InterRidge Vents Database is available online as the authoritative reference for locations of active submarine hydrothermal vent fields. Here we describe the revision of the database to an open source content management system and conduct a meta-analysis of the global distribution of known active vent fields. The number of known active vent fields has almost doubled in the past decade (521 as of year 2009), with about half visually confirmed and others inferred active from physical and chemical clues. Although previously known mainly from mid-ocean ridges (MORs), active vent fields at MORs now comprise only half of the total known, with about a quarter each now known at volcanic arcs and back-arc spreading centers. Discoveries in arc and back-arc settings resulted in an increase in known vent fields within exclusive economic zones, consequently reducing the proportion known in high seas to one third. The increase in known vent fields reflects a number of factors, including increased national and commercial interests in seafloor hydrothermal deposits as mineral resources. The purpose of the database now extends beyond academic research and education and into marine policy and management, with at least 18% of known vent fields in areas granted or pending applications for mineral prospecting and 8% in marine protected areas.

  16. Teaching Database Management System Use in a Library School Curriculum.

    ERIC Educational Resources Information Center

    Cooper, Michael D.

    1985-01-01

    Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…

  17. The role of databases in areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A database is a comprehensive collection of related data organized for convenient access, generally in a computer. The evolution of computer software and the need to distinguish the specialized computer systems for storing and manipulating data stimulated development of database management systems...

  18. Improving Recall Using Database Management Systems: A Learning Strategy.

    ERIC Educational Resources Information Center

    Jonassen, David H.

    1986-01-01

    Describes the use of microcomputer database management systems to facilitate the instructional uses of learning strategies relating to information processing skills, especially recall. Two learning strategies, cross-classification matrixing and node acquisition and integration, are highlighted. (Author/LRW)

  19. Integrated Electronic Health Record Database Management System: A Proposal.

    PubMed

    Schiza, Eirini C; Panos, George; David, Christiana; Petkov, Nicolai; Schizas, Christos N

    2015-01-01

    eHealth has attained significant importance as a new mechanism for health management and medical practice. However, the technological growth of eHealth is still limited by the technical expertise needed to develop appropriate products. Researchers are constantly in the process of developing and testing new software for building and handling Clinical Medical Records, now renamed Electronic Health Record (EHR) systems; EHRs take full advantage of technological developments and at the same time provide increased diagnostic and treatment capabilities to doctors. A step to be considered for facilitating this aim is to involve the doctor more actively in building the fundamental steps for creating the EHR system and database. A global clinical patient record database management system can be created electronically by simulating real-life medical practice health record taking and by utilizing and analyzing the recorded parameters. This proposed approach demonstrates the effective implementation of a universal classic medical record in electronic form, a procedure by which clinicians are led to utilize algorithms and intelligent systems for their differential diagnosis, final diagnosis and treatment strategies.

  20. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  1. Development of a Relational Database for Learning Management Systems

    ERIC Educational Resources Information Center

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-Based Distance Education Systems have a great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  2. Expansion of the MANAGE database with forest and drainage studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  3. Comparing Text, Document, and Relational Database Management Systems.

    ERIC Educational Resources Information Center

    Reid, Clifford

    1990-01-01

    Discusses the critical need for tools to manage, access, and disseminate text and image information ranging from free-form ASCII data to scanned documents stored on optical media. Current types of integrated text/image databases are described and the advancement of technology toward systems that manage large collections of documents made up of…

  4. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists that take advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscience investigations have led toward building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  5. TRENDS: The aeronautical post-test database management system

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  6. DOE technology information management system database study report

    SciTech Connect

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  7. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    DBMS with several features particularly useful to scientists and engineers. RIM5 interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  8. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 1970s E.F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that now made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.

  9. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes the unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of analyzing users' information needs and the rationale for the use of classifiers.

  10. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Employment and Training Administration Hartford Financial Services, Inc., Corporate/EIT/CTO Database...) applicable to workers and former workers Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management...

  11. Database Design and Management in Engineering Optimization.

    DTIC Science & Technology

    1988-02-01

    [The abstract for this record is garbled in the source. The recoverable fragments cite Sreekanta Murthy, T., Shyy, Y.-K. and Arora, J. S., reference the MIDAS (Management of Information for...) database system developed for educational and research purposes, and note that the design of MIDAS is directly influenced by current structural optimization applications.]

  12. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
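
    The trigger-based, in-database reaction described above can be illustrated with a small example. The sketch below uses Python's sqlite3 rather than the DBMS used in the study, and the sensor table, detection rule and timestamps are hypothetical; it shows only the bed-exit detection idea, not the in-database machine learning.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          -- Readings from a hypothetical bed occupancy sensor: 1 = occupied, 0 = empty.
          CREATE TABLE bed_sensor (ts TEXT, occupied INTEGER);
          CREATE TABLE bed_exit_events (ts TEXT);

          -- Active-database element: the trigger reacts inside the DBMS whenever a
          -- new reading arrives, with no application code in the loop.
          CREATE TRIGGER detect_bed_exit
          AFTER INSERT ON bed_sensor
          WHEN NEW.occupied = 0
          BEGIN
              -- Record an exit only if the most recent earlier reading was 'occupied'.
              INSERT INTO bed_exit_events
              SELECT NEW.ts
              WHERE (SELECT occupied FROM bed_sensor
                     WHERE ts < NEW.ts ORDER BY ts DESC LIMIT 1) = 1;
          END;
      """)

      conn.executemany("INSERT INTO bed_sensor VALUES (?, ?)", [
          ("2014-08-12T02:10", 1),
          ("2014-08-12T02:15", 0),  # the 1 -> 0 transition fires the trigger
      ])
      print(conn.execute("SELECT * FROM bed_exit_events").fetchall())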

  13. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  14. Development of the ageing management database of PUSPATI TRIGA reactor

    SciTech Connect

    Ramli, Nurhayati; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum; Maskin, Mazleha; Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal

    2016-01-22

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have been seen to be the prominent issues. In addressing the ageing issues, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all RTP major Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  15. Integration of Information Retrieval and Database Management Systems.

    ERIC Educational Resources Information Center

    Deogun, Jitender S.; Raghavan, Vijay V.

    1988-01-01

    Discusses the motivation for integrating information retrieval and database management systems, and proposes a probabilistic retrieval model in which records in a file may be composed of attributes (formatted data items) and descriptors (content indicators). The details and resolutions of difficulties involved in integrating such systems are…

  16. Use of Knowledge Bases in Education of Database Management

    ERIC Educational Resources Information Center

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject-matter which aid educating the database management. You can follow the order of the course from the beginning when some topics appearance and raise in elementary school, through the topics accomplish in secondary…

  17. Interface between astrophysical datasets and distributed database management systems (DAVID)

    NASA Technical Reports Server (NTRS)

    Iyengar, S. S.

    1988-01-01

    This is a status report on the progress of the DAVID (Distributed Access View Integrated Database Management System) project being carried out at Louisiana State University, Baton Rouge, Louisiana. The objective is to implement an interface between Astrophysical datasets and DAVID. Discussed are design details and implementation specifics between DAVID and astrophysical datasets.

  18. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  19. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  20. EADB: an estrogenic activity database for assessing potential endocrine activity.

    PubMed

    Shen, Jie; Xu, Lei; Fang, Hong; Richard, Ann M; Bray, Jeffrey D; Judson, Richard S; Zhou, Guangxu; Colatsky, Thomas J; Aungst, Jason L; Teng, Christina; Harris, Steve C; Ge, Weigong; Dai, Susie Y; Su, Zhenqiang; Jacobs, Abigail C; Harrouk, Wafa; Perkins, Roger; Tong, Weida; Hong, Huixiao

    2013-10-01

    Endocrine-active chemicals can potentially have adverse effects on both humans and wildlife. They can interfere with the body's endocrine system through direct or indirect interactions with many protein targets. Estrogen receptors (ERs) are one of the major targets, and many endocrine disruptors are estrogenic and affect the normal estrogen signaling pathways. However, ERs can also serve as therapeutic targets for various medical conditions, such as menopausal symptoms, osteoporosis, and ER-positive breast cancer. Because of the decades-long interest in the safety and therapeutic utility of estrogenic chemicals, a large number of chemicals have been assayed for estrogenic activity, but these data exist in various sources and different formats that restrict the ability of regulatory and industry scientists to utilize them fully for assessing risk-benefit. To address this issue, we have developed an Estrogenic Activity Database (EADB; http://www.fda.gov/ScienceResearch/BioinformaticsTools/EstrogenicActivityDatabaseEADB/default.htm) and made it freely available to the public. EADB contains 18,114 estrogenic activity data points collected for 8212 chemicals tested in 1284 binding, reporter gene, cell proliferation, and in vivo assays in 11 different species. The chemicals cover a broad chemical structure space and the data span a wide range of activities. A set of tools allow users to access EADB and evaluate potential endocrine activity of chemicals. As a case study, a classification model was developed using EADB for predicting ER binding of chemicals.

  1. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods in which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
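
    The subset-generation step described above (returning only the cases that match a researcher's search criteria) can be sketched briefly. The production system uses MySQL with an HTML/PHP front end; the sketch below uses Python's sqlite3, and the metadata fields and example cases are hypothetical.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE image_metadata (  -- hypothetical searchable fields
              case_id  TEXT,             -- anonymized identifier, not the clinical one
              modality TEXT,             -- 'SFM', 'FFDM', 'US' or 'MRI'
              finding  TEXT,             -- e.g. 'mass', 'calcification'
              truth    TEXT              -- reference label used in CAD studies
          )""")
      conn.executemany("INSERT INTO image_metadata VALUES (?, ?, ?, ?)", [
          ("A001", "US",   "mass", "benign"),
          ("A002", "FFDM", "mass", "malignant"),
          ("A003", "US",   "mass", "malignant"),
      ])

      def subset(**criteria):
          """Return the case IDs matching the given search criteria, e.g. the
          ultrasound mass cases needed for one CAD project."""
          where = " AND ".join(f"{column} = ?" for column in criteria)
          rows = conn.execute(f"SELECT case_id FROM image_metadata WHERE {where}",
                              tuple(criteria.values()))
          return [case_id for (case_id,) in rows]

      print(subset(modality="US", finding="mass"))  # -> ['A001', 'A003']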

  2. Database system for analysing and managing coiled tubing drilling data

    NASA Astrophysics Data System (ADS)

    Suh, J.; Choi, Y.; Park, H.; Choe, J.

    2009-05-01

    This study presents a prototype database system for analysing and managing petrophysical data from coiled tubing drilling in the oil and gas industry. The characteristics of coiled tubing drilling data from cores were analyzed and categorized according to the whole drilling process, and data modeling, including object relation and class diagrams, was carried out to design the database schema, in particular the relationships between tables and the key index fields that establish those relationships. The database system, called DrillerGeoDB, consists of 22 tables classified into 4 groups: project information, stratum information, drilling/logging information, and operation evaluation information. DrillerGeoDB delivers the results of each process as a spreadsheet (e.g., MS-Excel) by applying logging-theory algorithms and statistical functions for cost evaluation. This presentation describes the details of the system development and implementation.
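
    The full 22-table schema is not given in the abstract; the sketch below only illustrates the four described table groups and the spreadsheet-style export, with sqlite3 and CSV as stand-ins and hypothetical table and column names.

    ```python
    # Minimal sketch under assumed table names (the actual 22-table schema is not
    # published in the abstract): one table per DrillerGeoDB group plus a CSV
    # export that stands in for the MS-Excel spreadsheet output.
    import csv
    import sqlite3

    con = sqlite3.connect("drillergeodb.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS project   (project_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE IF NOT EXISTS stratum   (stratum_id INTEGER PRIMARY KEY,
                                          project_id INTEGER REFERENCES project(project_id),
                                          lithology TEXT);
    CREATE TABLE IF NOT EXISTS logging   (log_id INTEGER PRIMARY KEY,
                                          stratum_id INTEGER REFERENCES stratum(stratum_id),
                                          depth_m REAL, porosity REAL);
    CREATE TABLE IF NOT EXISTS operation (op_id INTEGER PRIMARY KEY,
                                          project_id INTEGER REFERENCES project(project_id),
                                          cost_usd REAL);
    """)

    # Export one group's results as a spreadsheet-readable file.
    with open("logging_report.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["depth_m", "porosity"])
        writer.writerows(con.execute("SELECT depth_m, porosity FROM logging ORDER BY depth_m"))
    ```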

  3. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  4. Concierge: personal database software for managing digital research resources.

    PubMed

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).

  5. Database Design Learning: A Project-Based Approach Organized through a Course Management System

    ERIC Educational Resources Information Center

    Dominguez, Cesar; Jaime, Arturo

    2010-01-01

    This paper describes an active method for database design learning through practical tasks development by student teams in a face-to-face course. This method integrates project-based learning, and project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…

  6. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

    Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of the standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.

  7. The Oil and Natural Gas Knowledge Management Database from NETL

    DOE Data Explorer

    The Knowledge Management Database (KMD) Portal provides four options for searching the documents and data that NETL-managed oil and gas research has produced over the years for DOE's Office of Fossil Energy. The information covers both historical and ongoing DOE oil and gas research and development (R&D). The Document Repository, the CD/DVD Library, the Project Summaries from 1990 to the present, and the Oil and Natural Gas Program Reference Shelf provide a wide range of flexibility and coverage.

  8. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  9. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization makes it possible to perform operations such as query and visualization on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer checks the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range or to follow real-time signal acquisition, according to a user data access policy.
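
    TSDSystem's actual schema and code are not published in the abstract; the sketch below only illustrates two of the described ideas, querying standardized series over a common time scale and detecting stalled loaders via heartbeats, with sqlite3 standing in for the relational backend and all names hypothetical.

    ```python
    # Minimal sketch, not the TSDSystem code: heterogeneous measurements stored on
    # a common (UTC epoch) time scale, a range query, and a naive heartbeat check
    # for stalled loaders. Table and column names are hypothetical.
    import sqlite3, time

    con = sqlite3.connect("tsdsystem.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS series    (series_id INTEGER PRIMARY KEY, station TEXT,
                                          channel TEXT, sampling_hz REAL);
    CREATE TABLE IF NOT EXISTS samples   (series_id INTEGER REFERENCES series(series_id),
                                          t_utc REAL, value REAL);
    CREATE TABLE IF NOT EXISTS heartbeat (loader TEXT PRIMARY KEY, last_seen REAL);
    """)

    def query_range(series_id, t0, t1):
        """Return (t, value) pairs for one series over a common time window."""
        return con.execute("SELECT t_utc, value FROM samples "
                           "WHERE series_id = ? AND t_utc BETWEEN ? AND ? ORDER BY t_utc",
                           (series_id, t0, t1)).fetchall()

    def stalled_loaders(max_silence_s=300):
        """Loaders whose heartbeat is older than max_silence_s (acquisition warning)."""
        cutoff = time.time() - max_silence_s
        return [r[0] for r in con.execute(
            "SELECT loader FROM heartbeat WHERE last_seen < ?", (cutoff,))]
    ```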

  10. Overview of the LDEF MSIG databasing activities

    NASA Technical Reports Server (NTRS)

    Funk, Joan G.

    1995-01-01

    The Long Duration Exposure Facility (LDEF) and the accompanying experiments were composed of and contained a wide variety of materials, representing the largest collection of materials flown in low earth orbit (LEO) and retrieved for ground-based analysis to date. The results and implications of the mechanical, thermal, optical, and electrical data from these materials are the foundation on which future LEO spacecraft and missions will be built. The LDEF Materials Special Investigation Group (MSIG) has been charged with establishing and developing databases to document these materials and their performance to assure not only that the data are archived for future generations but also that the data are available to the spacecraft user community in an easily accessed, user-friendly form. This paper gives an overview of the current LDEF Materials Databases, their capabilities, and availability. An overview of the philosophy and format of a developing handbook on LEO effects on materials is also described.

  11. Computerized database management system for breast cancer patients.

    PubMed

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new, enhanced computerized database management system. MySQL is selected as the database management system for storing the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that can control the MySQL database is developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is 50 to 59 years. Results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
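
    The abstract does not show the schema or queries behind the embedded calculation tool; the sketch below merely illustrates the kind of aggregation described (incidence counts by ethnicity and age band), with sqlite3 standing in for MySQL and a hypothetical patients table.

    ```python
    # Minimal sketch of the kind of aggregation described (incidence by ethnicity
    # and age band); this is not the authors' code, and the patients table and its
    # columns are hypothetical. sqlite3 stands in for the MySQL backend.
    import sqlite3

    con = sqlite3.connect("breast_cancer.db")
    con.execute("""CREATE TABLE IF NOT EXISTS patients (
        patient_id INTEGER PRIMARY KEY, ethnicity TEXT, age_at_diagnosis INTEGER,
        breastfeeding INTEGER, hormone_therapy INTEGER, family_history INTEGER)""")

    rows = con.execute("""
        SELECT ethnicity,
               (age_at_diagnosis / 10) * 10 AS age_band,   -- e.g. 50 means 50-59
               COUNT(*) AS cases
        FROM patients
        GROUP BY ethnicity, age_band
        ORDER BY cases DESC""").fetchall()
    for ethnicity, age_band, cases in rows:
        print(f"{ethnicity:10s} {age_band}-{age_band + 9}: {cases}")
    ```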

  12. Digital map and spatial database requirements for advanced traffic management systems

    SciTech Connect

    Lerner-Lam, E.; Smith, W.T.; Francisca, J.R.; Rathi, A.

    1993-12-31

    Advanced Traffic Management Systems (ATMS) depend on good-quality digital maps and spatial databases. Concerns over the availability of digital maps and spatial databases for ATMSs in the United States were initially raised in early meetings of the IVHS America ATMS Committee. While there has been little argument regarding the important role of the private sector in providing "value-added" data for sale to public and private parties, the IVHS community has since been engaged in a lively debate over the appropriate roles of the public and private sectors in providing "base data" for the nation's Intelligent Vehicle and Highway Systems. This paper summarizes the activities of the ATMS Committee over the past 1 1/2 years and offers recommendations for next steps to be taken toward laying the foundations for efficient and effective deployment of digital map and spatial database resources for use in advanced traffic management systems.

  13. Student Activities. Managing Liability.

    ERIC Educational Resources Information Center

    Bennett, Barbara; And Others

    This monograph suggests ways that college or university administrations can undertake a systematic and careful review of the risks posed by students' activities. Its purpose is to provide guidance in integrating the risk management process into a school's existing approaches to managing student organizations and activities. It is noted that no…

  14. DBAASP: database of antimicrobial activity and structure of peptides.

    PubMed

    Gogoladze, Giorgi; Grigolava, Maia; Vishnepolsky, Boris; Chubinidze, Mindia; Duroux, Patrice; Lefranc, Marie-Paule; Pirtskhalava, Malak

    2014-08-01

    The Database of Antimicrobial Activity and Structure of Peptides (DBAASP) is a manually curated database for those peptides for which antimicrobial activity against particular targets has been evaluated experimentally. The database is a depository of complete information on: the chemical structure of peptides; target species; target object of cell; peptide antimicrobial/haemolytic/cytotoxic activities; and experimental conditions at which activities were estimated. The DBAASP search page allows the user to search peptides according to their structural characteristics, complexity type (monomer, dimer and two-peptide), source, synthesis type (ribosomal, nonribosomal and synthetic) and target species. The database prediction algorithm provides a tool for rational design of new antimicrobial peptides. DBAASP is accessible at http://www.biomedicine.org.ge/dbaasp/.

  15. Survey of standards applicable to a database management system

    NASA Technical Reports Server (NTRS)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  16. Management Guidelines for Database Developers' Teams in Software Development Projects

    NASA Astrophysics Data System (ADS)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has been growing continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to the team or which practices should be used during its work. Therefore, in this paper we develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be very useful for other companies that use a DBD team and could contribute to increasing the efficiency of these teams in their work on software development projects.

  17. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  18. Interdisciplinary Database Activities for Fifth Graders at Tomas Rivera.

    ERIC Educational Resources Information Center

    Ennis, Demetria

    1997-01-01

    Describes the practical application of interdisciplinary activities in a fifth-grade classroom. Twenty-eight students participated in a two-month unit which integrated database activities with mathematics, art, English, U.S. history, and public speaking. Illustrates that student leaders were able to understand and construct search strategies,…

  19. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S CONSOLIDATED HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from 12 U.S. studies related to human activities into one comprehensive data system that can be accessed via the Internet. The data system is called the Consolidated Human Activity Database (CHAD), and it is ...

  20. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S COMPREHENSIVE HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from nine U.S. studies related to human activities into one comprehensive data system that can be accessed via the world-wide web. The data system is called CHAD-Consolidated Human Activity Database-and it is ...

  1. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
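
    Neither the GOPDb code nor the NAMS integration is published here; the sketch below only illustrates the described pattern of tying write permissions to user roles and logging every change for revision tracking, with hypothetical role names and tables and a sqlite3 stand-in backend.

    ```python
    # Minimal sketch, not the GOPDb implementation: tie write permissions to user
    # roles and append every change to a revision log. Role names and the log
    # structure are hypothetical; NAMS authentication is outside this sketch.
    import sqlite3, time

    WRITE_ROLES = {"editor", "admin"}        # hypothetical roles with write access

    con = sqlite3.connect("gopdb.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS timeline_steps (step_id INTEGER PRIMARY KEY, description TEXT);
    CREATE TABLE IF NOT EXISTS change_log (ts REAL, username TEXT, step_id INTEGER, new_value TEXT);
    """)

    def update_step(username, role, step_id, new_description):
        """Apply a write only for authorized roles, recording it in the change log."""
        if role not in WRITE_ROLES:
            raise PermissionError(f"role '{role}' has read-only access")
        con.execute("UPDATE timeline_steps SET description = ? WHERE step_id = ?",
                    (new_description, step_id))
        con.execute("INSERT INTO change_log VALUES (?, ?, ?, ?)",
                    (time.time(), username, step_id, new_description))
        con.commit()
    ```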

  2. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    The National Levee Database (INLED) gathers data on Italian levees and historical breach failures to be exploited in the framework of an operational procedure addressed to the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. In terms of its structure, INLED is a dynamic geospatial database with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database is aimed at providing the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy. References: Apel H., Merz B. & Thieken A.H., Quantification of uncertainties in flood risk assessments, Int J River Basin Manag 2008, 6(2), 149-162. Camici S., Barbetta S. & Moramarco T., Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy), Journal of Flood Risk Management, in press. Colleselli F., Geotechnical problems related to river and channel embankments, Rotterdam, the Netherlands: Springer, 1994. H.R. Wallingford Consultants (HRWC), Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R., Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy, J Hydrol Eng 2014, 19(4), 717-731.

  3. The Development of a Standard Database System for Republic of Korea Army’s Personnel Management.

    DTIC Science & Technology

    1983-06-01

    for ROK Army personnel management? Which data items should be incorporated in a database? Which technique should be applied to design databases using a...

  4. Management of the life and death of an earth-science database: some examples from geotherm

    USGS Publications Warehouse

    Bliss, J.D.

    1986-01-01

    Productive earth-science databases require managers who are familiar with and skilled at using available software developed specifically for database management. There also should be a primary user with a clearly understood mission. The geologic phenomenon addressed by the database must be sufficiently understood, and adequate appropriate data must be available to construct a useful database. The database manager, in concert with the primary user, must ensure that data of adequate quality are available in the database, as well as prepare for mechanisms of releasing the data when the database is terminated. The primary user needs to be held accountable along with the database manager to ensure that a useful database will be created. Quality of data and maintenance of database relevancy to the user's mission are important issues during the database's lifetime. Products prepared at termination may be used more than the operational database and thus are of critical importance. These concepts are based, in part, on both the shortcomings and successes of GEOTHERM, a comprehensive system of databases and software used to store, locate, and evaluate the geology, geochemistry, and hydrology of geothermal systems. © 1986.

  5. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    NASA Astrophysics Data System (ADS)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm that consists of four steps. In conclusion, an analysis of password encryption in the Oracle Database is presented.

  6. A Conceptual Model and Database to Integrate Data and Project Management

    NASA Astrophysics Data System (ADS)

    Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.

    2015-12-01

    database and build it in a way that is modular and can be changed or expanded to meet user needs. Our hope is that others, especially those managing large collaborative research grants, will be able to use our project model and database design to enhance the value of their project and data management both during and following the active research period.

  7. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department that has occurred during the last 5 years resulted in a reliable, high performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  8. Database of Active Structures From the Indo-Asian Collision

    NASA Astrophysics Data System (ADS)

    Styron, Richard; Taylor, Michael; Okoronkwo, Kelechi

    2010-05-01

    The ongoing collision of India and Asia has produced a vast system of folds and faults, many of which are active today, as evidenced by such recent deadly earthquakes as the 12 May 2008 Sichuan quake [Parsons et al., 2008]. Understanding these events requires knowledge of the region’s geologic structures. Taylor and Yin [2009] have assembled HimaTibetMap-1.0, a multiformat, comprehensive database of first-order active structures in central Asia that may aid researchers, educators, and students in their studies of Indo-Asian tectonics. For example, this database may be used by seismologists, geodesists, and modelers to identify structures in particular locations that contribute to active deformation, or it may be used by teachers to illustrate concepts such as continental collision or distributed deformation of continents.

  9. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    SciTech Connect

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population, which is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is of high priority, which requires a broad understanding of the locations of such day care centers. As concentrations of at risk population, the spatial location of day care centers is critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high resolution population distribution and dynamics models and databases.

  10. Enhancing Disaster Management: Development of a Spatial Database of Day Care Centers in the USA

    DOE PAGES

    Singh, Nagendra; Tuttle, Mark A.; Bhaduri, Budhendra L.

    2015-07-30

    Children under the age of five constitute around 7% of the total U.S. population and represent a segment of the population, which is totally dependent on others for day-to-day activities. A significant proportion of this population spends time in some form of day care arrangement while their parents are away from home. Accounting for those children during emergencies is of high priority, which requires a broad understanding of the locations of such day care centers. As concentrations of at risk population, the spatial location of day care centers is critical for any type of emergency preparedness and response (EPR). However, until recently, the U.S. emergency preparedness and response community did not have access to a comprehensive spatial database of day care centers at the national scale. This paper describes an approach for the development of the first comprehensive spatial database of day care center locations throughout the USA utilizing a variety of data harvesting techniques to integrate information from widely disparate data sources followed by geolocating for spatial precision. In the context of disaster management, such spatially refined demographic databases hold tremendous potential for improving high resolution population distribution and dynamics models and databases.

  11. Health technology management: a database analysis as support of technology managers in hospitals.

    PubMed

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt to new improvements in medical equipment. Multidisciplinary approaches that consider the interaction of different technologies, their use and user skills are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria for technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services, based exclusively on the maintenance database of the CE department at the Careggi Hospital in Florence, Italy.

  12. The evolution of a health hazard assessment database management system for military weapons, equipment, and materiel.

    PubMed

    Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C

    2002-04-01

    During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.

  13. Database Design and Implementation of Game Management System for Rescue Robot Contest

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hitoshi; Kojima, Atsuhiro; Koeda, Masanao

    The Rescue Robot Contest is one of the robot contests concerning lifesaving in urban disasters. In this contest, the loads of the rescue dummies simulating disaster victims, together with the contest progression status, are presented to the audience and to the team members operating the robots. This presentation is important both for evaluating robot activity and for production effect. For these purposes, an original game management system for the Rescue Robot Contest was constructed and operated, with a database system employed as its base. The system's role is both to record all data and events and to perform real-time processing for the presentation. In this paper, the design and implementation of the tables and built-in functions of the database that forms the foundation of this system are presented. For real-time processing, embedded functions and trigger functions are implemented; these functions write the latest records into dedicated tables that store only the most recent data for quick access.
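
    The paper's actual tables and trigger functions are not reproduced in the abstract; the sketch below illustrates the described "latest record" pattern with a SQLite trigger and hypothetical table names, standing in for the contest's production RDBMS.

    ```python
    # Minimal sketch of the "latest record" trigger pattern described in the
    # abstract; table names are hypothetical and SQLite stands in for the contest's
    # actual RDBMS. Every inserted event also refreshes a one-row-per-team table
    # that the presentation layer can poll quickly.
    import sqlite3, time

    con = sqlite3.connect("rescue_contest.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS score_events  (event_id INTEGER PRIMARY KEY,
                                              team_id INTEGER, score INTEGER, ts REAL);
    CREATE TABLE IF NOT EXISTS latest_scores (team_id INTEGER PRIMARY KEY,
                                              score INTEGER, ts REAL);
    CREATE TRIGGER IF NOT EXISTS keep_latest AFTER INSERT ON score_events
    BEGIN
        INSERT OR REPLACE INTO latest_scores (team_id, score, ts)
        VALUES (NEW.team_id, NEW.score, NEW.ts);
    END;
    """)

    con.execute("INSERT INTO score_events (team_id, score, ts) VALUES (?, ?, ?)",
                (3, 120, time.time()))
    print(con.execute("SELECT * FROM latest_scores").fetchall())
    ```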

  14. A Vibroacoustic Database Management Center for Shuttle and expendable launch vehicle payloads

    NASA Technical Reports Server (NTRS)

    Thomas, Valerie C.

    1987-01-01

    A Vibroacoustic Database Management Center has recently been established at the Jet Propulsion Laboratory (JPL). The center uses the Vibroacoustic Payload Environment Prediction System (VAPEPS) computer program to maintain a database of flight and ground-test data and structural parameters for both Shuttle and expendable launch-vehicle payloads. Given the launch-vehicle environment, the VAPEPS prediction software, which employs Statistical Energy Analysis (SEA) methods, can be used with or without the database to establish the vibroacoustic environment for new payload components. This paper summarizes the VAPEPS program and describes the functions of the Database Management Center at JPL.

  15. Database Design Methodology and Database Management System for Computer-Aided Structural Design Optimization.

    DTIC Science & Technology

    1984-12-01

    1983). Several researchers, including Lillehagen and Dokkar (1982), Grabowski, Eigener and Ranch (1978), and Eberlein and Wedekind (1982), have worked on database... Proceedings of the International Federation of Information Processing, pp. 335-366. Eberlein, W. and Wedekind, H., 1982, "A Methodology for Embedding Design

  16. The Cronus Distributed DBMS (Database Management System) Project

    DTIC Science & Technology

    1989-10-01

    simplicity of the data modeling they offer (all data are structured as simple tables, or relations, of fields), the flexibility of dynamically manipulating the...single centralized database nor access to application-specific data facilities are adequate to provide an integrated view of dispersed data. To address...system using Cronus client identities; - reconcile the data model presented by Cronus, an object model, with the data model presented by the database

  17. EADB: An Estrogenic Activity Database for Assessing Potential Endocrine Activity

    EPA Science Inventory

    Endocrine-active chemicals can potentially have adverse effects on both humans and wildlife. They can interfere with the body’s endocrine system through direct or indirect interactions with many protein targets. Estrogen receptors (ERs) are one of the major targets, and many ...

  18. Expert systems identify fossils and manage large paleontological databases

    SciTech Connect

    Beightol, D.S. ); Conrad, M.A.

    1988-02-01

    EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.

  19. Use of an administrative database to determine clinical management and outcomes in congenital heart disease.

    PubMed

    Gutgesell, Howard P; Hillman, Diane G; McHugh, Kimberly E; Dean, Peter; Matherne, G Paul

    2011-10-01

    We review our 16-year experience using the large, multi-institutional database of the University HealthSystem Consortium to study management and outcomes in congenital heart surgery for hypoplastic left heart syndrome, transposition of the great arteries, and neonatal coarctation. The advantages, limitations, and use of administrative databases by others to study congenital heart surgery are reviewed.

  20. Database Program To Manage Slides and Images for Teaching and Presentations.

    ERIC Educational Resources Information Center

    Byers, John A.

    1999-01-01

    Describes a computer program that manages a collection of pictures such as photographic slides, overheads, or computer images in one or more databases. Discusses organizing the database, searching, and keeping track of the time needed to present the images. (Author/LRW)

  1. GAS CHROMATOGRAPHIC RETENTION PARAMETERS DATABASE FOR REFRIGERANT MIXTURE COMPOSITION MANAGEMENT

    EPA Science Inventory

    Composition management of mixed refrigerant systems is a challenging problem in the laboratory, manufacturing facilities, and large refrigeration machinery. This issue of composition management is especially critical for the maintenance of machinery that utilizes zeotropic mixture...

  2. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  3. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Disciplined specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  4. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    PubMed

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  5. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big-data science. Through more profound analyses of the characteristics, pathogenesis, and other core issues of gastric cancer, precision medicine improves its diagnosis and treatment. The cancer clinical database is important for promoting the development of precision medicine; therefore, close attention must be paid to its construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. To ensure good data quality, the design and management of the database should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data. The construction and management of clinical databases must also be strengthened and innovated.

  6. Military services fitness database: development of a computerized physical fitness and weight management database for the U.S. Army.

    PubMed

    Williamson, Donald A; Bathalon, Gaston P; Sigrist, Lori D; Allen, H Raymond; Friedl, Karl E; Young, Andrew J; Martin, Corby K; Stewart, Tiffany M; Burrell, Lolita; Han, Hongmei; Hubbard, Van S; Ryan, Donna

    2009-01-01

    The Department of Defense (DoD) has mandated development of a system to collect and manage data on the weight, percent body fat (%BF), and fitness of all military personnel. This project aimed to (1) develop a computerized weight and fitness database to track individuals and Army units over time allowing cross-sectional and longitudinal evaluations and (2) test the computerized system for feasibility and integrity of data collection over several years of usage. The computer application, the Military Services Fitness Database (MSFD), was designed for (1) storage and tracking of data related to height, weight, %BF for the Army Weight Control Program (AWCP) and Army Physical Fitness Test (APFT) scores and (2) generation of reports using these data. A 2.5-year pilot test of the MSFD indicated that it monitors population and individual trends of changing body weight, %BF, and fitness in a military population.

  7. Military Services Fitness Database: Development of a Computerized Physical Fitness and Weight Management Database for the U.S. Army

    PubMed Central

    Williamson, Donald A.; Bathalon, Gaston P.; Sigrist, Lori D.; Allen, H. Raymond; Friedl, Karl E.; Young, Andrew J.; Martin, Corby K.; Stewart, Tiffany M.; Burrell, Lolita; Han, Hongmei; Hubbard, Van S.; Ryan, Donna

    2009-01-01

    The Department of Defense (DoD) has mandated development of a system to collect and manage data on the weight, percent body fat (%BF), and fitness of all military personnel. This project aimed to (1) develop a computerized weight and fitness database to track individuals and Army units over time allowing cross-sectional and longitudinal evaluations and (2) test the computerized system for feasibility and integrity of data collection over several years of usage. The computer application, the Military Services Fitness Database (MSFD), was designed for (1) storage and tracking of data related to height, weight, %BF for the Army Weight Control Program (AWCP) and Army Physical Fitness Test (APFT) scores and (2) generation of reports using these data. A 2.5-year pilot test of the MSFD indicated that it monitors population and individual trends of changing body weight, %BF, and fitness in a military population. PMID:19216292

  8. IUEAGN: A database of ultraviolet spectra of active galactic nuclei

    NASA Technical Reports Server (NTRS)

    Pike, G.; Edelson, R.; Shull, J. M.; Saken, J.

    1993-01-01

    In 13 years of operation, IUE has gathered approximately 5000 spectra of almost 600 Active Galactic Nuclei (AGN). In order to undertake AGN studies which require large amounts of data, we are consistently reducing this entire archive and creating a homogeneous, easy-to-use database. First, the spectra are extracted using the Optimal extraction algorithm. Continuum fluxes are then measured across predefined bands, and line fluxes are measured with a multi-component fit. These results, along with source information such as redshifts and positions, are placed in the IUEAGN relational database. Analysis algorithms, statistical tests, and plotting packages run within the structure, and this flexible database can accommodate future data when they are released. This archival approach has already been used to survey line and continuum variability in six bright Seyfert 1s and rapid continuum variability in 14 blazars. Among the results that could only be obtained using a large archival study is evidence that blazars show a positive correlation between degree of variability and apparent luminosity, while Seyfert 1s show an anti-correlation. This suggests that beaming dominates the ultraviolet properties for blazars, while thermal emission from an accretion disk dominates for Seyfert 1s. Our future plans include a survey of line ratios in Seyfert 1s, to be fitted with photoionization models to test the models and determine the range of temperatures, densities and ionization parameters. We will also include data from IRAS, Einstein, EXOSAT, and ground-based telescopes to measure multi-wavelength correlations and broadband spectral energy distributions.

  9. PACSY, a relational database management system for protein structure and chemical shift analysis.

    PubMed

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
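
    The published PACSY schema is not reproduced in this abstract; the sketch below only illustrates the described style of query, joining coordinate-derived and chemical-shift tables through key identification numbers, with hypothetical table and column names and sqlite3 standing in for a MySQL or PostgreSQL server.

    ```python
    # Minimal sketch of a PACSY-style query joining structural and chemical-shift
    # information by key identification numbers. The table and column names here
    # are hypothetical, not the published PACSY schema; sqlite3 stands in for the
    # MySQL/PostgreSQL server.
    import sqlite3

    con = sqlite3.connect("pacsy_like.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS coord_db (key_id INTEGER PRIMARY KEY, pdb_id TEXT,
                                         res_num INTEGER, atom TEXT, sasa REAL);
    CREATE TABLE IF NOT EXISTS shift_db (key_id INTEGER REFERENCES coord_db(key_id),
                                         shift_ppm REAL);
    """)

    # Example: chemical shifts of solvent-exposed CA atoms (SASA above a cutoff).
    rows = con.execute("""
        SELECT c.pdb_id, c.res_num, s.shift_ppm
        FROM coord_db c JOIN shift_db s ON s.key_id = c.key_id
        WHERE c.atom = 'CA' AND c.sasa > ?""", (50.0,)).fetchall()
    ```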

  10. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend, among other factors, on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability, afford easy data storage and sharing, and provide for a more complete risk assessment that combines different analyses while avoiding any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The key requirement is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.

  11. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    SciTech Connect

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI mitigates the user having to know in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; backup and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; generate or read into a table CSV data; and similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for smaller databases that might be typically needed in a small research group situation. PylotDB can also be used as a learning tool for database applications in general.
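
    PylotDB's own source is not shown here; the sketch below illustrates one of the listed operations, generating statistics on a numeric field, using only the Python standard library, with a hypothetical database, table, and field, and sqlite3 standing in for MySQL.

    ```python
    # Minimal sketch of one operation PylotDB wraps behind its UI: pull a numeric
    # field from a table and report basic statistics. This is not PylotDB code;
    # sqlite3 stands in for MySQL and the table/field names are hypothetical.
    import sqlite3
    import statistics

    def field_stats(db_path, table, field):
        """Basic descriptive statistics for one numeric column of one table."""
        con = sqlite3.connect(db_path)
        values = [row[0] for row in
                  con.execute(f"SELECT {field} FROM {table} WHERE {field} IS NOT NULL")]
        if not values:
            return {"count": 0}
        return {"count": len(values),
                "mean": statistics.mean(values),
                "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
                "min": min(values),
                "max": max(values)}

    # Example usage (hypothetical database and column):
    # print(field_stats("experiments.db", "runs", "wall_time_s"))
    ```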

  12. Experience with a Commercial Microcomputer Database Management System

    PubMed Central

    Covvey, H.D.; Craven, N.H.; Colman, F.

    1984-01-01

    The designer of an application system to support a medical database must review a large number of potential packages on which to base his or her work. The number of packages, and their widely differing functional capabilities, costs and performance characteristics, make the selection process difficult. Once a product is selected, its limitations must be faced, and some of these will only be found after selection. While using such packages is superior to developing one de novo, failing to recognize their problems and limitations can lead to disappointment, delays and even project failure.

  13. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    SciTech Connect

    Wolery, T W; Sutton, M

    2011-09-19

    they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).

  14. Management of CD-ROM Databases in ARL Libraries. SPEC Kit 169.

    ERIC Educational Resources Information Center

    Welch, C. Brigid, Ed.

    This kit is based on 73 responses to a survey conducted by the Association of Research Libraries (ARL) Office of Management Services to obtain information on the management of CD-ROM database installations in ARL libraries. The survey sought information in the areas of CD-ROM funding, instruction and publicity, organization, equipment, security,…

  15. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review within such a system. Based on an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
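    As an illustration of the kind of relational design described above, the sketch below sets up a student table and a linked evaluation table and computes a simple per-student average. It uses Python's built-in sqlite3 module; the table and field names are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of a student/evaluation schema; not the paper's design.
# Uses Python's built-in sqlite3 so the example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student (
    student_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    department TEXT NOT NULL
);
CREATE TABLE evaluation (
    eval_id    INTEGER PRIMARY KEY,
    student_id INTEGER NOT NULL REFERENCES student(student_id),
    item       TEXT NOT NULL,   -- e.g. 'coursework', 'conduct'
    score      REAL NOT NULL
);
""")
conn.execute("INSERT INTO student VALUES (1, 'Li Wei', 'Physics')")
conn.executemany(
    "INSERT INTO evaluation (student_id, item, score) VALUES (?, ?, ?)",
    [(1, "coursework", 92.5), (1, "conduct", 88.0)],
)

# Aggregate scores per student, the kind of statistic used in performance review.
for name, avg in conn.execute("""
    SELECT s.name, AVG(e.score)
    FROM student s JOIN evaluation e ON e.student_id = s.student_id
    GROUP BY s.student_id
"""):
    print(name, round(avg, 1))
```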

  16. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative database, are discussed.

  17. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  18. Pro-Active Behavior Management.

    ERIC Educational Resources Information Center

    McCormack, James E., Jr.

    The paper outlines the basic tactics in pro-active behavior management, a behavior modification approach for use with severely handicapped students which reorders the staff/student relationship by focusing on positive interaction. Pro-active behavior management is noted to involve interruption of established behavior chains, environmental…

  19. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.

  20. Managing Geological Profiles in Databases for 3D Visualisation

    NASA Astrophysics Data System (ADS)

    Jarna, A.; Grøtan, B. O.; Henderson, I. H. C.; Iversen, S.; Khloussy, E.; Nordahl, B.; Rindstad, B. I.

    2016-10-01

    Geology and all geological structures are three-dimensional in space. GIS and databases are common tools used by geologists to interpret and communicate geological data. The NGU (Geological Survey of Norway) is the national institution for the study of bedrock, mineral resources, surficial deposits, groundwater, and marine geology. 3D geology is usually described by geological profiles, or vertical sections through a map, which show the rock structure below the surface. The goal is to gradually expand the usability of existing and new geological profiles, make them more readily available in retail applications, and simplify the entry and registration of profiles. The project target is to develop a methodology for the acquisition, modification, and use of data and its presentation on the web, by creating a user interface directly linked to NGU's webpage. This will allow users to visualise profiles in a 3D model.

  1. A database to manage flood risk in Catalonia

    NASA Astrophysics Data System (ADS)

    Echeverria, S.; Toldrà, R.; Verdaguer, I.

    2009-09-01

    We call priority action spots those local sites where heavy rain, increased river flow, sea storms and other flooding phenomena can cause human casualties or severe damage to property. Some examples are campsites, car parks, roads, chemical factories… In order to keep the risk at these spots to a minimum, both a prevention programme and an emergency response programme are required. The flood emergency plan of Catalonia (INUNCAT), prepared in 2005, already included a listing of priority action spots compiled by the Catalan Water Agency (ACA), which was drawn up taking into account past experience, hydraulic studies and information available from several knowledgeable sources. However, since land use evolves with time, this listing of priority action spots has become outdated and incomplete. A new database is being built. Not only does this new database update and expand the previous listing, but it also adds to each entry information regarding prevention measures and emergency response: which spots are the most hazardous, under which weather conditions problems arise, which ones should have their access closed as soon as these conditions are forecast or actually occur, which ones should be evacuated, who is in charge of the preventive actions or emergency response, and so on. Carrying out this programme has to be done with the help and collaboration of all the organizations involved, foremost the local authorities in the areas at risk. In order to achieve this goal, a suitable geographical information system is necessary, one that can be easily used by all actors involved in this project. The best option has turned out to be the Spatial Data Infrastructure of Catalonia (IDEC), a platform to share spatial data on the Internet involving the Generalitat de Catalunya, Localret (a consortium of local authorities that promotes information technology) and other institutions.

  2. Design and Performance of a Xenobiotic Metabolism Database Manager for Building Metabolic Pathway Databases

    EPA Science Inventory

    A major challenge for scientists and regulators is accounting for the metabolic activation of chemicals that may lead to increased toxicity. Reliable forecasting of chemical metabolism is a critical factor in estimating a chemical’s toxic potential. Research is underway to develo...

  3. Flight Deck Interval Management Display. [Elements, Information and Annunciations Database User Guide

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  4. An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  5. An Extensible "SCHEMA-LESS" Database Framework for Managing High-Throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  6. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to address the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  7. Data management and database structure at the ARS Culture Collection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  8. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  9. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    PubMed Central

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.

  10. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences.
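    To make the contrast concrete, the sketch below stores and retrieves one simplified dbSNP-style annotation in each of the two main models compared in the study: a relational table in MySQL and a document collection in MongoDB. This is illustrative only, not the authors' benchmark code; the drivers (mysql-connector-python, pymongo), the credentials, and the "variants" table/collection are assumptions.

```python
# Illustrative contrast between relational and document storage of a
# simplified variant annotation; not the study's benchmark code.
# Assumes mysql-connector-python and pymongo, and that a MySQL table
# variants(rsid, chrom, pos, ref, alt) already exists.
import mysql.connector
from pymongo import MongoClient

annotation = {"rsid": "rs12345", "chrom": "1", "pos": 1014143, "ref": "C", "alt": "T"}

# Relational model: fixed schema, one row per variant.
sql_conn = mysql.connector.connect(
    host="localhost", user="user", password="pw", database="snp"
)
cur = sql_conn.cursor()
cur.execute(
    "INSERT INTO variants (rsid, chrom, pos, ref, alt) VALUES (%s, %s, %s, %s, %s)",
    (annotation["rsid"], annotation["chrom"], annotation["pos"],
     annotation["ref"], annotation["alt"]),
)
sql_conn.commit()

# Document model: schema-free, one document per variant.
mongo = MongoClient("localhost", 27017)
mongo.snp.variants.insert_one(annotation)

# Equivalent point queries by rsid in each system.
cur.execute("SELECT rsid, chrom, pos, ref, alt FROM variants WHERE rsid = %s", ("rs12345",))
print(cur.fetchone())
print(mongo.snp.variants.find_one({"rsid": "rs12345"}))
```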

  11. Application of a Database System for Korean Military Personnel Management.

    DTIC Science & Technology

    1987-03-01

    However, in the early and middle 1970s, computer memory capacities were such that millions of characters could be stored in them, and storage technology ... technologically possible (and is at least approaching economic feasibility), the concept of data management is now emerging in the business community... Technological advances are making it possible to store data in a way that is radically different from most of the contemporary methods now in use. This new

  12. The Evaluation System Design of GIS-Based Oil and Gas Resources Carbon Emission Database Management

    NASA Astrophysics Data System (ADS)

    Zhu, Wenju; Bi, Jiantao; Wang, Xingxing; Zhu, Zuojia; Pang, Wenqi

    2014-03-01

    Due to the importance of research on carbon budgets in natural processes, it is critical to be able to effectively manage and process all types of data in order to measure carbon emissions. For this purpose, data produced in oil and gas exploration and in natural processes are the focus of this research. Various tools are used, including Oracle 11g for data storage and ArcEngine combined with Microsoft Visual C# (among other languages, C++), to build a Database Storage Management Platform with GIS functions. Taking the IPCC algorithms as the most important reference and combining them with actual conditions, a new calculation model for carbon emissions from oil and gas resources was constructed. This model analyzes and predicts future carbon emissions from oil and gas production. By incorporating the new calculation model into the Database Storage Management Platform, an Intelligent Prediction Database Platform containing the model was established.
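    For orientation, inventory models of this kind generally start from the basic IPCC relationship in which emissions are the product of activity data and an emission factor. The sketch below shows only that baseline relation with placeholder numbers; the paper's actual model adds oil- and gas-specific terms that are not reproduced here.

```python
# Baseline IPCC-style relation: emissions = activity data x emission factor.
# The values below are placeholders, not real emission factors, and the
# paper's model extends this with oil- and gas-specific terms not shown here.
def co2_emissions(activity_data, emission_factor):
    """activity_data: e.g. fuel burned or gas flared; emission_factor: CO2 mass per unit of activity."""
    return activity_data * emission_factor

sources = {  # source: (activity data, emission factor) -- illustrative only
    "flaring": (1.2e4, 2.0),
    "fuel_combustion": (3.5e3, 2.7),
}
total = sum(co2_emissions(a, f) for a, f in sources.values())
print(f"estimated CO2 emissions: {total:.1f} (arbitrary units)")
```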

  13. Is Library Database Searching a Language Learning Activity?

    ERIC Educational Resources Information Center

    Bordonaro, Karen

    2010-01-01

    This study explores how non-native speakers of English think of words to enter into library databases when they begin the process of searching for information in English. At issue is whether or not language learning takes place when these students use library databases. Language learning in this study refers to the use of strategies employed by…

  14. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  15. Online system for managing watermarked-images database

    NASA Astrophysics Data System (ADS)

    Chareyron, Gaël; Da Rugna, Jerome; Konik, Hubert; Trémeau, Alain

    2004-12-01

    Universities, governmental administrations, photography agencies and many other companies or individuals need a framework to manage their multimedia documents and the copyright or authenticity attached to their images. We propose a web-based interface able to perform many operations: storage, image navigation, copyright insertion, and authenticity verification. When the owner of a photograph wants to store the document and publish it on the Internet, he uses the interface to add his images and set the Internet sharing rules. The user can choose, for example, the watermarking method or the viewing resolution, setting the parameters visually so as to obtain the best trade-off between quality and protection. We also propose an authenticity module that allows online verification of documents: any user on the Internet who knows the encoding key will be able to verify whether a watermarked image has been altered. Finally, we give some practical examples of our system. In this study, we combine the latest technologies in image protection and navigation to offer a complete scheme for managing published images, allowing a single system to provide both the security and the publication of the images.

  16. Online system for managing watermarked-images database

    NASA Astrophysics Data System (ADS)

    Chareyron, Gael; Da Rugna, Jerome; Konik, Hubert; Tremeau, Alain

    2005-01-01

    Universities, governmental administrations, photography agencies and many other companies or individuals need a framework to manage their multimedia documents and the copyright or authenticity attached to their images. We propose a web-based interface able to perform many operations: storage, image navigation, copyright insertion, and authenticity verification. When the owner of a photograph wants to store the document and publish it on the Internet, he uses the interface to add his images and set the Internet sharing rules. The user can choose, for example, the watermarking method or the viewing resolution, setting the parameters visually so as to obtain the best trade-off between quality and protection. We also propose an authenticity module that allows online verification of documents: any user on the Internet who knows the encoding key will be able to verify whether a watermarked image has been altered. Finally, we give some practical examples of our system. In this study, we combine the latest technologies in image protection and navigation to offer a complete scheme for managing published images, allowing a single system to provide both the security and the publication of the images.

  17. Aerial image databases for pipeline rights-of-way management

    NASA Astrophysics Data System (ADS)

    Jadkowski, Mark A.

    1996-03-01

    Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with less people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.

  18. IOOS Data Management Activities

    DTIC Science & Technology

    2010-06-01

    Atmospheric Administration (NOAA) has been assigned the role of lead federal agency in this endeavor. Technically, IOOS includes or interfaces with existing... The NOAA IOOS office has established a limited-scope data-management prototype and is developing an architecture to extend and augment that prototype... component of the Global Earth Observation System of Systems (GEOSS). In 2007, the US National Oceanic and Atmospheric Administration (NOAA

  19. Environmental management activities

    SciTech Connect

    1997-07-01

    The Office of Environmental Management (EM) has been delegated the responsibility for the Department of Energy's (DOE's) cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem requires the identification of technologies and scientific expertise from domestic and foreign sources. Within the United States, operational DOE facilities, as well as the decontamination and decommissioning of inactive facilities, have produced significant amounts of radioactive, hazardous, and mixed wastes. In order to ensure worker safety and the protection of the public, DOE must: (1) assess, remediate, and monitor sites and facilities; (2) store, treat, and dispose of wastes from past and current operations; and (3) develop and implement innovative technologies for environmental restoration and waste management. The EM directive necessitates looking beyond domestic capabilities to technological solutions found outside US borders. Following the collapse of the Soviet regime, formerly restricted elite Soviet scientific expertise became available to the West. EM has established a cooperative technology development program with Russian scientific institutes that meets domestic cleanup objectives by: (1) identifying and accessing Russian EM-related technologies, thereby leveraging investments and providing cost savings; (2) improving access to technical information, scientific expertise, and technologies applicable to EM needs; and (3) increasing US private sector opportunities in Russia in EM-related areas.

  20. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative data bases are identified and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are especially stringent for either the scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  1. The Consolidated Human Activity Database — Master Version (CHAD-Master) Technical Memorandum

    EPA Pesticide Factsheets

    This technical memorandum contains information about the Consolidated Human Activity Database -- Master version, including CHAD contents, inventory of variables: Questionnaire files and Event files, CHAD codes, and references.

  2. Information flow in the DAMA Project beyond database managers: Information flow managers

    SciTech Connect

    Russell, L.; Wolfson, O.; Yu, C.

    1996-03-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point-of-sale information, is being considered in the Demand Activated Manufacturing Project of the American Textile Partnership project. A scenario is examined in which 100,000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26,000 suppliers through the use of bill-of-materials explosions at four levels of detail. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced to keep estimates of demand as current as possible.

  3. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  4. Management of three-dimensional and anthropometric databases: Alexandria and Cleopatra

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; Robinette, Kathleen; Rioux, Marc

    2000-10-01

    This paper describes two systems for managing 3D and anthropometric databases, namely Alexandria and Cleopatra. Each system is made up of three parts: the crawler, the analyzer, and the search engine. The crawler retrieves content from the network, while the analyzer automatically describes the shape, scale, and color of each retrieved object and writes a compact descriptor. The search engine applies the query-by-example paradigm to find and retrieve similar or related objects from the database based on different aspects of 3D shape, scale, and color distribution. The descriptors are defined and the implementation of the system is detailed. The application of the system to the CAESAR anthropometric survey is discussed. Experimental results from the CAESAR database and from generic databases are presented.

  5. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

    We present Insight as an integrated database and analysis platform for epilepsy self-management research as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies with several new data management features and user-friendly functionalities. The features of Insight include, (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying, (2) visualization tools to support real time exploration of data distribution across research studies, and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represents over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails Web technology and open source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a Role-based Access Control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee consisting of representatives of all current collaborating centers of the Managing Epilepsy Well Network. New research studies are being continuously added to the Insight database and the size as well as the unique coverage of the dataset allows investigators to conduct

  6. Environmental Management vitrification activities

    SciTech Connect

    Krumrine, P.H.

    1996-05-01

    Both the Mixed Waste and Landfill Stabilization Focus Areas, as part of the Office of Technology Development efforts within the Department of Energy's (DOE) Environmental Management (EM) Division, have been developing various vitrification technologies as a treatment approach for the large quantities of transuranic (TRU), TRU mixed, and mixed low-level wastes that are stored in either landfills or above-ground storage facilities. The technologies being developed include joule heated, plasma torch, plasma arc, induction, microwave, combustion, molten metal, and in situ methods. Related efforts are going into developing glass, ceramic, and slag waste-form windows of opportunity for the diverse quantities of heterogeneous wastes needing treatment. These studies look at both processing parameters and long-term performance parameters as a function of composition, to ensure that the developed technologies have the right chemistry for success.

  7. Functions and Relations: Some Applications from Database Management for the Teaching of Classroom Mathematics.

    ERIC Educational Resources Information Center

    Hauge, Sharon K.

    While functions and relations are important concepts in the teaching of mathematics, research suggests that many students lack an understanding and appreciation of these concepts. The present paper discusses an approach for teaching functions and relations that draws on the use of illustrations from database management. This approach has the…

  8. The Subject-Object Relationship Interface Model in Database Management Systems.

    ERIC Educational Resources Information Center

    Yannakoudakis, Emmanuel J.; Attar-Bashi, Hussain A.

    1989-01-01

    Describes a model that displays structures necessary to map between the conceptual and external levels in database management systems, using an algorithm that maps the syntactic representations of tuples onto semantic representations. A technique for translating tuples into natural language sentences is introduced, and a system implemented in…

  9. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
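    The quarter-quarter subdivision mentioned above is easy to picture: each survey section is split into a 4 x 4 grid of idealized cells. The sketch below is a hypothetical helper, not part of the USGS system, that returns those sixteen cells for a section given as a bounding box in projected coordinates.

```python
# Hypothetical helper (not part of the USGS system) that splits one Public
# Land Survey System section, given as a bounding box, into sixteen
# idealized quarter-quarter cells arranged in a 4 x 4 grid.
def quarter_quarter_sections(x_min, y_min, x_max, y_max):
    """Return 16 (label, (x0, y0, x1, y1)) pairs covering the section."""
    dx = (x_max - x_min) / 4.0
    dy = (y_max - y_min) / 4.0
    cells = []
    for row in range(4):
        for col in range(4):
            label = f"QQ-{row * 4 + col + 1:02d}"   # simple sequential labels
            cells.append((label, (x_min + col * dx, y_min + row * dy,
                                  x_min + (col + 1) * dx, y_min + (row + 1) * dy)))
    return cells

# Example: a nominal one-mile-square section in projected metres.
for label, bbox in quarter_quarter_sections(0.0, 0.0, 1609.0, 1609.0)[:3]:
    print(label, bbox)
```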

  10. The Coral Triangle Atlas: An Integrated Online Spatial Database System for Improving Coral Reef Management

    PubMed Central

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the ‘Coral Triangle Area’ in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region. PMID:24941442

  11. The Coral Triangle Atlas: an integrated online spatial database system for improving coral reef management.

    PubMed

    Cros, Annick; Ahamad Fatan, Nurulhuda; White, Alan; Teoh, Shwu Jiau; Tan, Stanley; Handayani, Christian; Huang, Charles; Peterson, Nate; Venegas Li, Ruben; Siry, Hendra Yusran; Fitriana, Ria; Gove, Jamison; Acoba, Tomoko; Knight, Maurice; Acosta, Renerio; Andrew, Neil; Beare, Doug

    2014-01-01

    In this paper we describe the construction of an online GIS database system, hosted by WorldFish, which stores bio-physical, ecological and socio-economic data for the 'Coral Triangle Area' in South-east Asia and the Pacific. The database has been built in partnership with all six (Timor-Leste, Malaysia, Indonesia, The Philippines, Solomon Islands and Papua New Guinea) of the Coral Triangle countries, and represents a valuable source of information for natural resource managers at the regional scale. Its utility is demonstrated using biophysical data, data summarising marine habitats, and data describing the extent of marine protected areas in the region.

  12. Metadata Dictionary Database: A Proposed Tool for Academic Library Metadata Management

    ERIC Educational Resources Information Center

    Southwick, Silvia B.; Lampert, Cory

    2011-01-01

    This article proposes a metadata dictionary (MDD) be used as a tool for metadata management. The MDD is a repository of critical data necessary for managing metadata to create "shareable" digital collections. An operational definition of metadata management is provided. The authors explore activities involved in metadata management in…

  13. TheSNPpit—A High Performance Database System for Managing Large Scale SNP Data

    PubMed Central

    Groeneveld, Eildert; Lichtenberg, Helmut

    2016-01-01

    The fast development of high-throughput genotyping has opened up new possibilities in genetics while at the same time producing considerable data-handling issues. TheSNPpit is a database system for managing large amounts of multi-panel SNP genotype data from any genotyping platform. With an increasing rate of genotyping in areas like animal and plant breeding as well as human genetics, hundreds of thousands of individuals already need to be managed. While the common database design with one row per SNP can manage hundreds of samples, this approach becomes progressively slower as the size of the data sets increases, until it finally fails completely once tens or even hundreds of thousands of individuals need to be managed. TheSNPpit has implemented three ideas to accommodate such large-scale experiments: highly compressed vector storage in a relational database, set-based data manipulation, and a very fast export written in C, with Perl as the base for the framework and PostgreSQL as the database backend. Its novel subset system allows the creation of named subsets based on the filtering of SNPs (by major allele frequency, no-calls, and chromosomes) and manually applied sample and SNP lists, at negligible storage costs, thus avoiding the issue of proliferating file copies. The named subsets are exported for downstream analysis. PLINK ped and map files are processed as inputs and outputs. TheSNPpit allows management of different panel sizes in the same population of individuals when higher-density panels replace previous lower-density versions, as occurs in animal and plant breeding programs. A completely generalized procedure allows storage of phenotypes. TheSNPpit occupies only 2 bits to store a single SNP, implying a capacity of 4 million SNPs per 1 MB of disk storage. To investigate performance scaling, a database with more than 18.5 million samples has been created, with 3.4 trillion SNPs from 12 panels ranging from 1,000 through 20 million SNPs, resulting in a

  14. TheSNPpit-A High Performance Database System for Managing Large Scale SNP Data.

    PubMed

    Groeneveld, Eildert; Lichtenberg, Helmut

    2016-01-01

    The fast development of high-throughput genotyping has opened up new possibilities in genetics while at the same time producing considerable data-handling issues. TheSNPpit is a database system for managing large amounts of multi-panel SNP genotype data from any genotyping platform. With an increasing rate of genotyping in areas like animal and plant breeding as well as human genetics, hundreds of thousands of individuals already need to be managed. While the common database design with one row per SNP can manage hundreds of samples, this approach becomes progressively slower as the size of the data sets increases, until it finally fails completely once tens or even hundreds of thousands of individuals need to be managed. TheSNPpit has implemented three ideas to accommodate such large-scale experiments: highly compressed vector storage in a relational database, set-based data manipulation, and a very fast export written in C, with Perl as the base for the framework and PostgreSQL as the database backend. Its novel subset system allows the creation of named subsets based on the filtering of SNPs (by major allele frequency, no-calls, and chromosomes) and manually applied sample and SNP lists, at negligible storage costs, thus avoiding the issue of proliferating file copies. The named subsets are exported for downstream analysis. PLINK ped and map files are processed as inputs and outputs. TheSNPpit allows management of different panel sizes in the same population of individuals when higher-density panels replace previous lower-density versions, as occurs in animal and plant breeding programs. A completely generalized procedure allows storage of phenotypes. TheSNPpit occupies only 2 bits to store a single SNP, implying a capacity of 4 million SNPs per 1 MB of disk storage. To investigate performance scaling, a database with more than 18.5 million samples has been created, with 3.4 trillion SNPs from 12 panels ranging from 1,000 through 20 million SNPs, resulting in a
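    The 2-bits-per-SNP figure quoted above is worth unpacking: four genotype codes fit in one byte, so 4 million SNPs occupy roughly 1 MB. The sketch below demonstrates that packing in plain Python; it is not TheSNPpit's C implementation, and the genotype coding (0/1/2 allele counts, 3 for a no-call) is an assumption made for illustration.

```python
# Sketch of 2-bit genotype packing (four genotypes per byte); not TheSNPpit's
# actual C code. Genotype coding 0/1/2 = allele counts, 3 = no-call is assumed.
def pack_genotypes(genotypes):
    """Pack a sequence of genotype codes (0-3) into a bytearray, 4 per byte."""
    packed = bytearray((len(genotypes) + 3) // 4)
    for i, g in enumerate(genotypes):
        packed[i // 4] |= (g & 0b11) << (2 * (i % 4))
    return packed

def unpack_genotypes(packed, n):
    """Recover the first n genotype codes from a packed bytearray."""
    return [(packed[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

geno = [0, 1, 2, 3, 2, 1]
assert unpack_genotypes(pack_genotypes(geno), len(geno)) == geno
print(len(pack_genotypes([0] * 4_000_000)), "bytes for 4 million SNPs")  # ~1 MB
```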

  15. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    ), information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments related to the activity of landslides. Furthermore, spatial objects are stored which represent the components of a landslide, in particular the scarps and the accumulation areas. In addition, waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data are included. Examples of spatial queries to the database are intersections of raster and vector data for calculating slope gradients or aspects of landslide areas and for creating multiple overlaid sections for the comparison of slopes, as well as distances to the infrastructure or to the nearest receiving drainage. Further queries retrieve information on landslide magnitudes, distribution and clustering, as well as potential correlations with geomorphological or geological conditions. The data management concept in this study can be implemented for any academic, public or private use, because it is independent of any obligatory licenses. The created spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping. Obe, R.O., Hsu, L.S., 2011. PostGIS in Action. Manning Publications, Stamford, 492 pp.
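    One of the spatial queries mentioned above, the distance from each landslide to the nearest piece of infrastructure, can be illustrated as below. This is a hedged sketch, not the project's actual schema: it assumes a PostGIS-enabled PostgreSQL database reached through psycopg2, and the table and column names are hypothetical (ST_Distance and ST_DWithin are standard PostGIS functions).

```python
# Illustrative sketch (not the project's actual schema) of a nearest-distance
# spatial query; assumes PostGIS-enabled PostgreSQL reachable via psycopg2,
# with hypothetical "landslide" and "road" tables carrying geometry columns.
import psycopg2

conn = psycopg2.connect(dbname="landslides", user="gis", password="secret", host="localhost")
cur = conn.cursor()
cur.execute("""
    SELECT l.id,
           MIN(ST_Distance(l.geom, r.geom)) AS dist_to_infrastructure_m
    FROM   landslide AS l
    JOIN   road      AS r
      ON   ST_DWithin(l.geom, r.geom, 5000)   -- only consider roads within 5 km
    GROUP  BY l.id
    ORDER  BY dist_to_infrastructure_m;
""")
for landslide_id, distance in cur.fetchall():
    print(landslide_id, round(distance, 1))
cur.close()
conn.close()
```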

  16. A pilot GIS database of active faults of Mt. Etna (Sicily): A tool for integrated hazard evaluation

    NASA Astrophysics Data System (ADS)

    Barreca, Giovanni; Bonforte, Alessandro; Neri, Marco

    2013-02-01

    A pilot GIS-based system has been implemented for the assessment and analysis of hazard related to active faults affecting the eastern and southern flanks of Mt. Etna. The system structure was developed in the ArcGIS® environment and consists of different thematic datasets that include spatially referenced arc features and an associated database. Arc-type features, georeferenced to the WGS84 Ellipsoid, UTM zone 33 projection, represent the five main fault systems that develop in the analysed region. The backbone of the GIS-based system is the large amount of information which was collected from the literature and then stored and properly geocoded in a digital database. This consists of thirty-five alphanumeric fields which include all fault parameters available from the literature, such as location, kinematics, landform, slip rate, etc. Although the system has been implemented according to the most common procedures used by GIS developers, the architecture and content of the database represent a pilot backbone for the digital storage of fault parameters, providing a powerful tool for modelling hazard related to the active tectonics of Mt. Etna. The database collects, organises and shares all currently available scientific information about the active faults of the volcano. Furthermore, thanks to the strong effort spent on defining the fields of the database, the structure proposed in this paper is open to the collection of further data coming from future improvements in the knowledge of the fault systems. By layering additional user-specific geographic information and managing the proposed database (topological querying), a great diversity of hazard and vulnerability maps can be produced by the user. This is a proposed backbone for a comprehensive geographical database of fault systems, universally applicable to other sites.

  17. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will finally store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components: an Animal Management System (AMS), a Sample Tracking System (STS), and a Result Documentation System (RDS). MouseNet(c) provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data-locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.

  18. Rhode Island Water Supply System Management Plan Database (WSSMP-Version 1.0)

    USGS Publications Warehouse

    Granato, Gregory E.

    2004-01-01

    In Rhode Island, the availability of water of sufficient quality and quantity to meet current and future environmental and economic needs is vital to life and the State's economy. Water suppliers, the Rhode Island Water Resources Board (RIWRB), and other State agencies responsible for water resources in Rhode Island need information about available resources, the water-supply infrastructure, and water use patterns. These decision makers need historical, current, and future water-resource information. In 1997, the State of Rhode Island formalized a system of Water Supply System Management Plans (WSSMPs) to characterize and document relevant water-supply information. All major water suppliers (those that obtain, transport, purchase, or sell more than 50 million gallons of water per year) are required to prepare, maintain, and carry out WSSMPs. An electronic database for this WSSMP information has been deemed necessary by the RIWRB for water suppliers and State agencies to consistently document, maintain, and interpret the information in these plans. Availability of WSSMP data in standard formats will allow water suppliers and State agencies to improve the understanding of water-supply systems and to plan for future needs or water-supply emergencies. In 2002, however, the Rhode Island General Assembly passed a law that classifies some of the WSSMP information as confidential to protect the water-supply infrastructure from potential terrorist threats. Therefore the WSSMP database was designed for an implementation method that will balance security concerns with the information needs of the RIWRB, suppliers, other State agencies, and the public. A WSSMP database was developed by the U.S. Geological Survey in cooperation with the RIWRB. The database was designed to catalog WSSMP information in a format that would accommodate synthesis of current and future information about Rhode Island's water-supply infrastructure. This report documents the design and implementation of

  19. Watershed Data Management (WDM) Database for Salt Creek Streamflow Simulation, DuPage County, Illinois

    USGS Publications Warehouse

    Murphy, Elizabeth A.; Ishii, Audrey

    2006-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Department of Engineering, Stormwater Management Division, maintains a database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. The majority of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Illinois. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. This report describes a version of the WDM database that was quality-assured and quality-controlled annually to ensure the datasets were complete and accurate. This version of the WDM database contains data from January 1, 1997, through September 30, 2004, and is named SEP04.WDM. This report provides a record of time periods of poor data for each precipitation dataset and describes methods used to estimate the data for the periods when data were missing, flawed, or snowfall-affected. The precipitation dataset data-filling process was changed in 2001, and both processes are described. The other meteorologic and hydrologic datasets in the database are fully described in the annual U.S. Geological Survey Water Data Report for Illinois and, therefore, are described in less detail than the precipitation datasets in this report.

  20. Reef Ecosystem Services and Decision Support Database

    EPA Science Inventory

    This scientific and management information database utilizes systems thinking to describe the linkages between decisions, human activities, and provisioning of reef ecosystem goods and services. This database provides: (1) Hierarchy of related topics - Click on topics to navigat...

  1. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    NASA Technical Reports Server (NTRS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  2. Database system for management of health physics and industrial hygiene records.

    SciTech Connect

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring, J. L.

    1999-10-05

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection.

  3. Dynamic Tables: An Architecture for Managing Evolving, Heterogeneous Biomedical Data in Relational Database Management Systems

    PubMed Central

    Corwin, John; Silberschatz, Avi; Miller, Perry L.; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute–value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models. PMID:17068350
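    The "vertical" layout at issue here is the entity-attribute-value (EAV) pattern, shown in miniature below against a conventional wide table. The sketch uses SQLite and hypothetical clinical attributes; it illustrates the schema pattern the paper addresses, not the authors' sparse column-store engine.

```python
# Entity-attribute-value (EAV, "vertical") layout versus a conventional wide
# table; illustrates the pattern the paper addresses, not the authors' engine.
import sqlite3

conn = sqlite3.connect(":memory:")

# Conventional design: one column per attribute. Sparse data leaves many NULLs
# and every new attribute requires an ALTER TABLE.
conn.execute(
    "CREATE TABLE patient_wide (patient_id INTEGER PRIMARY KEY, heart_rate REAL, genotype TEXT)"
)

# EAV design: one row per (entity, attribute, value). New attributes need no
# schema change, at the cost of more complex queries.
conn.execute("CREATE TABLE patient_eav (patient_id INTEGER, attribute TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO patient_eav VALUES (?, ?, ?)",
    [(1, "heart_rate", "72"), (1, "genotype", "APOE e4"), (2, "heart_rate", "65")],
)

# Reassembling a conventional-looking row from the EAV table requires a pivot.
rows = conn.execute("""
    SELECT patient_id,
           MAX(CASE WHEN attribute = 'heart_rate' THEN value END) AS heart_rate,
           MAX(CASE WHEN attribute = 'genotype'   THEN value END) AS genotype
    FROM patient_eav
    GROUP BY patient_id
""").fetchall()
print(rows)
```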

  4. SNPpy - Database Management for SNP Data from Genome Wide Association Studies

    PubMed Central

    Mitha, Faheem; Herodotou, Herodotos; Borisov, Nedyalko; Jiang, Chen; Yoder, Josh; Owzar, Kouros

    2011-01-01

    Background We describe SNPpy, a hybrid script database system using the Python SQLAlchemy library coupled with the PostgreSQL database to manage genotype data from Genome-Wide Association Studies (GWAS). This system makes it possible to merge study data with HapMap data and merge across studies for meta-analyses, including data filtering based on the values of phenotype and Single-Nucleotide Polymorphism (SNP) data. SNPpy and its dependencies are open source software. Results The current version of SNPpy offers utility functions to import genotype and annotation data from two commercial platforms. We use these to import data from two GWAS studies and the HapMap Project. We then export these individual datasets to standard data format files that can be imported into statistical software for downstream analyses. Conclusions By leveraging the power of relational databases, SNPpy offers integrated management and manipulation of genotype and phenotype data from GWAS studies. The analysis of these studies requires merging across GWAS datasets as well as patient and marker selection. To this end, SNPpy enables the user to filter the data and output the results as standardized GWAS file formats. It does low level and flexible data validation, including validation of patient data. SNPpy is a practical and extensible solution for investigators who seek to deploy central management of their GWAS data. PMID:22039405
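
    A rough sketch of the general pattern the abstract describes, a Python/SQLAlchemy layer over a relational database used to store and filter genotype records; an in-memory SQLite engine stands in for PostgreSQL here, and the table and column names are hypothetical rather than SNPpy's actual schema.

        # General SQLAlchemy pattern for storing and filtering genotype records;
        # table and column names are hypothetical, and SQLite stands in for PostgreSQL.
        from sqlalchemy import Column, Integer, String, create_engine
        from sqlalchemy.orm import Session, declarative_base

        Base = declarative_base()

        class Genotype(Base):
            __tablename__ = "genotype"
            id = Column(Integer, primary_key=True)
            patient_id = Column(String)
            rsid = Column(String)       # SNP identifier
            call = Column(String)       # e.g. "AA", "AG", "GG"

        engine = create_engine("sqlite:///:memory:")
        Base.metadata.create_all(engine)

        with Session(engine) as session:
            session.add_all([
                Genotype(patient_id="P1", rsid="rs123", call="AG"),
                Genotype(patient_id="P2", rsid="rs123", call="GG"),
            ])
            session.commit()
            # Filter on SNP and genotype call, analogous to SNPpy-style data filtering.
            hits = (session.query(Genotype)
                    .filter(Genotype.rsid == "rs123", Genotype.call == "GG")
                    .all())
            print([g.patient_id for g in hits])   # ['P2']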

  5. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapid searching of unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats in using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  6. Designing an efficient electroencephalography system using database with embedded images management approach.

    PubMed

    Yu, Tzu-Yi; Ho, Hsu-Hua

    2014-01-01

    Many diseases associated with mental deterioration among aged patients can be effectively treated using neurological treatments. Research shows that electroencephalography (EEG) can be used as an independent prognostic indicator of morbidity and mortality. Unfortunately, EEG data are typically inaccessible to modern software. It is therefore important to design a comprehensive approach to integrate EEG results into institutional medical systems. A customized EEG system utilizing a database management approach was designed to bridge the gap between the commercial EEG software and hospital data management platforms. Practical and useful medical findings derived from statistical analysis of large amounts of EEG data are also discussed.

  7. Representing and querying conceptual graphs with relational database management systems is possible.

    PubMed Central

    Schadow, G.; Barnes, M. R.; McDonald, C. J.

    2001-01-01

    This is an experimental study on the feasibility of maintaining medical concept dictionaries in production-grade relational database management systems (RDBMS). In the past, RDBMS did not support transitive relational structures and had therefore been unsuitable for managing knowledge bases. The revised SQL-99 standard, however, may change this. In this paper we show that modern RDBMS that support recursive queries are capable of querying transitive relationships in a generic data model. We show a simple but efficient indexed representation of transitive closure. We could confirm that even challenging combined transitive relationships can be queried in SQL. PMID:11825256
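
    The capability the paper builds on, recursive SQL queries over a generic parent/child table, can be illustrated with a recursive common table expression; the sketch below uses Python's sqlite3 (which supports WITH RECURSIVE) and an invented is-a table rather than the authors' concept dictionary schema.

        # Transitive closure of an "is-a" hierarchy via a recursive CTE,
        # illustrating the SQL-99 style recursion discussed above; data are invented.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE isa (child TEXT, parent TEXT)")
        con.executemany("INSERT INTO isa VALUES (?, ?)", [
            ("viral hepatitis", "hepatitis"),
            ("hepatitis", "liver disease"),
            ("liver disease", "disease"),
        ])

        ancestors = con.execute("""
            WITH RECURSIVE anc(concept) AS (
                SELECT parent FROM isa WHERE child = ?
                UNION
                SELECT isa.parent FROM isa JOIN anc ON isa.child = anc.concept
            )
            SELECT concept FROM anc
        """, ("viral hepatitis",)).fetchall()
        print([row[0] for row in ancestors])  # ['hepatitis', 'liver disease', 'disease']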

  8. Reengineering a database for clinical trials management: lessons for system architects.

    PubMed

    Brandt, C A; Nadkarni, P; Marenco, L; Karras, B T; Lu, C; Schacter, L; Fisk, J M; Miller, P L

    2000-10-01

    This paper describes the process of enhancing Trial/DB, a database system for clinical studies management. The system's enhancements have been driven by the need to maximize the effectiveness of developer personnel in supporting numerous and diverse users, of study designers in setting up new studies, and of administrators in managing ongoing studies. Trial/DB was originally designed to work over a local area network within a single institution, and basic architectural changes were necessary to make it work over the Internet efficiently as well as securely. Further, as its use spread to diverse communities of users, changes were made to let the processes of study design and project management adapt to the working styles of the principal investigators and administrators for each study. The lessons learned in the process should prove instructive for system architects as well as managers of electronic patient record systems.

  9. Metabolonote: a wiki-based database for managing hierarchical metadata of metabolome analyses.

    PubMed

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics - technology for comprehensive detection of small molecules in an organism - lags behind the other "omics" in terms of publication and dissemination of experimental data. Among the reasons for this are difficulty precisely recording information about complicated analytical experiments (metadata), existence of various databases with their own metadata descriptions, and low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called "Togo Metabolome Data" (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are published currently. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.

  10. Metabolonote: A Wiki-Based Database for Managing Hierarchical Metadata of Metabolome Analyses

    PubMed Central

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics – technology for comprehensive detection of small molecules in an organism – lags behind the other “omics” in terms of publication and dissemination of experimental data. Among the reasons for this are difficulty precisely recording information about complicated analytical experiments (metadata), existence of various databases with their own metadata descriptions, and low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called “Togo Metabolome Data” (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers’ understanding and use of data but also submitters’ motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are published currently. Metabolonote and related tools are available free of cost at http

  11. BDC: Operational Database Management for the SPOT/HELIOS Operations Control System

    NASA Astrophysics Data System (ADS)

    Guiral, P.; Teodomante, S.

    Since the operational database is essential to the executable environment of a satellite control system (e.g., for monitoring and commanding), the French Space Agency (CNES), as ground segment designer and satellite operator, allocates substantial resources to the development of operational database management tools. Such a tool is needed to generate and maintain the operations control system (OCS) data repository throughout the life of the relevant space system. In this context, the objectives of this paper are, first, to present lessons learnt from the SPOT/Helios product line and, second, to point out the new challenges related to the increasing number of satellite systems to qualify and maintain over the coming years. "BDC", a component of the SPOT/Helios operations control system, is an operational database management tool designed and developed by CNES. The tool has been used since 1998 for SPOT4, was then upgraded for Helios 1A/1B and SPOT5, and is currently being customized for Helios 2A. We emphasize CNES's need for a tool that offers significant flexibility in handling data modification during the technical and operational qualification phases. This implies an evolution of the data exchanges between the satellite contractor, Astrium, and CNES, as well as constraints on the tool development process, leading to the choice of first developing a prototype and then industrializing it. After a brief description of the data, the tool is technically described, in particular its architecture and the design choices that allow reusability across different satellite lines. Keywords: Satellite operations, Operations Control System, Data management, Relational Database.

  12. Somma-Vesuvius' activity: a mineral chemistry database

    NASA Astrophysics Data System (ADS)

    Redi, Daniele; Cannatelli, Claudia; Esposito, Rosario; Lima, Annamaria; Petrosino, Paola; De Vivo, Benedetto

    2016-08-01

    Clinopyroxene and olivine are ubiquitous phases in Somma-Vesuvius (SV) volcanics and for the first time they were systematically studied in several products younger than 40 ka. In this manuscript chemical compositions (major, trace and rare earth elements) of a large set of olivine and clinopyroxene crystals from selected rock samples are presented and discussed. Fourteen pumice samples from Plinian pyroclastic deposits as well as three scoriae and eight lava samples from inter-Plinian deposits were collected. A representative number of olivine and clinopyroxene crystals (n ~ 50) were selected for each sample and analysed by electron microprobe and laser ablation inductively coupled plasma mass spectrometer, resulting in a large database, which is now available to the scientific community. All studied eruptive products contain olivine and clinopyroxene crystals spanning a wide range of compositions. Olivines show Fo content varying from 91 to 68, while clinopyroxenes display Mg# ranging from 93 to 71. In samples younger than A.D. 79, the more evolved (Mg#82-72) clinopyroxene crystals show clear Ca enrichment (~23.5-24.5 wt% CaO) with respect to those from older samples (before-A.D. 79, ~23-21 wt% CaO). The results corroborate disequilibrium between olivine, clinopyroxene and the hosting melt, and an increasing role of carbonate assimilation in SV magma evolution in the last 2 ka. The database produced here is intended as a shared product that makes the mineral data available for further studies by researchers investigating the geochemical evolution of the SV system.

  13. Somma-Vesuvius' activity: a mineral chemistry database

    NASA Astrophysics Data System (ADS)

    Redi, Daniele; Cannatelli, Claudia; Esposito, Rosario; Lima, Annamaria; Petrosino, Paola; De Vivo, Benedetto

    2017-02-01

    Clinopyroxene and olivine are ubiquitous phases in Somma-Vesuvius (SV) volcanics and for the first time they were systematically studied in several products younger than 40 ka. In this manuscript chemical compositions (major, trace and rare earth elements) of a large set of olivine and clinopyroxene crystals from selected rock samples are presented and discussed. Fourteen pumice samples from Plinian pyroclastic deposits as well as three scoriae and eight lava samples from inter-Plinian deposits were collected. A representative number of olivine and clinopyroxene crystals (n ~ 50) were selected for each sample and analysed by electron microprobe and laser ablation inductively coupled plasma mass spectrometer, resulting in a large database, which is now available to the scientific community. All studied eruptive products contain olivine and clinopyroxene crystals spanning a wide range of compositions. Olivines show Fo content varying from 91 to 68, while clinopyroxenes display Mg# ranging from 93 to 71. In samples younger than A.D. 79, the more evolved (Mg#82-72) clinopyroxene crystals show clear Ca enrichment (~23.5-24.5 wt% CaO) with respect to those from older samples (before-A.D. 79, ~23-21 wt% CaO). The results corroborate disequilibrium between olivine, clinopyroxene and the hosting melt, and an increasing role of carbonate assimilation in SV magma evolution in the last 2 ka. The database produced here is intended as a shared product that makes the mineral data available for further studies by researchers investigating the geochemical evolution of the SV system.

  14. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  15. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
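
    A fault signature, as defined above, is a structured record of the evidence an expert would use to detect and confirm a specific fault type; the sketch below is a purely hypothetical illustration of such a structure, with invented field names and example content, not the FW-PHM AFS schema.

        # Minimal, hypothetical sketch of a structured fault signature record;
        # field names and example content are illustrative, not the FW-PHM schema.
        from dataclasses import dataclass, field

        @dataclass
        class FaultSignature:
            asset_type: str                 # kind of asset the signature applies to
            fault_type: str                 # the specific fault being characterized
            symptoms: list[str] = field(default_factory=list)        # observable evidence
            confirming_tests: list[str] = field(default_factory=list)

        sig = FaultSignature(
            asset_type="generator step-up transformer",
            fault_type="winding insulation degradation",
            symptoms=["rising dissolved-gas levels", "elevated hot-spot temperature"],
            confirming_tests=["dissolved gas analysis", "insulation resistance test"],
        )

        def matches(sig: FaultSignature, observed: set[str]) -> bool:
            """Crude detection rule: all catalogued symptoms are present."""
            return set(sig.symptoms) <= observed

        print(matches(sig, {"rising dissolved-gas levels",
                            "elevated hot-spot temperature", "audible buzzing"}))  # True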

  16. A cohort and database study of airway management in patients undergoing thyroidectomy for retrosternal goitre.

    PubMed

    Gilfillan, N; Ball, C M; Myles, P S; Serpell, J; Johnson, W R; Paul, E

    2014-11-01

    Patients undergoing thyroid surgery with retrosternal goitre may raise concerns for the anaesthetist, especially airway management. We reviewed a multicentre prospective thyroid surgery database and extracted data for those patients with retrosternal goitre. Additionally, we reviewed the anaesthetic charts of patients with retrosternal goitre at our institution to identify the anaesthetic induction technique and airway management. Of 4572 patients in the database, 919 (20%) had a retrosternal goitre. Two cases of early postoperative tracheomalacia were reported, one in the retrosternal group. Despite some very large goitres, no patient required tracheostomy or cardiopulmonary bypass and there were no perioperative deaths. In the subset of 133 patients managed at our institution over six years, there were no major adverse anaesthetic outcomes and no patient had a failed airway or tracheomalacia. In the latter cohort, of 32 (24%) patients identified as having a potentially difficult airway, 17 underwent awake fibreoptic tracheal intubation, but two of these were abandoned and converted to intravenous induction and general anaesthesia. Eleven had inhalational induction; two of these were also abandoned and converted to intravenous induction and general anaesthesia. Of those suspected as having a difficult airway, 28 (87.5%) subsequently had direct laryngoscopy where the laryngeal inlet was clearly visible. We found no good evidence that thyroid surgery patients with retrosternal goitre, with or without symptoms and signs of tracheal compression, present the experienced anaesthetist with an airway that cannot be managed using conventional techniques. This does not preclude the need for multidisciplinary discussion and planning.

  17. An integrated proteome database for two-dimensional electrophoresis data analysis and laboratory information management system.

    PubMed

    Cho, Sang Yun; Park, Kang-Sik; Shim, Jung Eun; Kwon, Min-Seok; Joo, Kil Hong; Lee, Won Suk; Chang, Joon; Kim, Hoguen; Chung, Hyun Cheol; Kim, Hyun Ok; Paik, Young-Ki

    2002-09-01

    We describe an integrated proteome database, termed Yonsei Proteome Research Center Proteome Database (YPRC-PDB), which can store, retrieve and analyze various information including two-dimensional electrophoresis (2-DE) images and associated spot information that were obtained during studies of hepatocellular carcinoma (HCC). YPRC-PDB is also designed to perform as a laboratory information management system that manages sample information, clinical background, conditions of both sample preparation and 2-DE, and entire sets of experimental results. It also features a query system and data-mining applications, which can automatically analyze expression-level changes of a specific protein and link directly to clinical information. The user interface is web-based, so that the results from other laboratories can be shared effectively. In particular, the master gel image query is equipped with a graphic tool that can easily identify the relationship between the specific pathological stage of HCC and expression levels of a potential marker protein on the master gel image. Thus, YPRC-PDB is a versatile integrated database suitable for subsequent analyses. The information in YPRC-PDB is updated easily and it is available to authorized users on the World Wide Web (http://yprcpdb.proteomix.org/~damduck/).

  18. Database Administrator

    ERIC Educational Resources Information Center

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  19. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance.

  20. The Use of SQL and Second Generation Database Management Systems for Data Processing and Information Retrieval in Libraries.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1989-01-01

    Describes Structured Query Language (SQL), the result of an American National Standards Institute effort to standardize language used to query computer databases and a common element in second generation database management systems. The discussion covers implementations of SQL, associated products, and techniques for its use in online catalogs,…

  1. Head-to-Head Evaluation of the Pro-Cite and Sci-Mate Bibliographic Database Management Systems.

    ERIC Educational Resources Information Center

    Saari, David S.; Foster, George A., Jr.

    1989-01-01

    Compares two full featured database management systems for bibliographic information in terms of programs and documentation; record creation and editing; online database citations; search procedures; access to references in external text files; sorting and printing functions; style sheets; indexes; and file operations. (four references) (CLB)

  2. Database Management for Item Banking and Test Development: An Application of dBase II for the Microcomputer.

    ERIC Educational Resources Information Center

    Bowers, John J.

    The background and results of an effort to use dBASE II, a microcomputer database management package, to establish, maintain, and update an item bank useful in a complex test development process are presented. The paper explores some of the perspectives and considerations in designing such a database which make the test development process easier,…

  3. Managing database under the DPSIR tool for the implementation of European Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Cinnirella, S.; Trombino, G.; Pesenti, E.; Algieri, A.; Pirrone, N.

    2003-04-01

    With the purpose of establishing European legislation to protect inland surface waters, transitional waters, coastal waters and groundwater, the European Parliament and the Council of the European Union adopted the Water Framework Directive 2000/60/EC (WFD). The holistic approach of the WFD to the management of waters was adopted to protect water bodies considered as interlinked systems from the spring to the sea. With the above in mind, the EUROCAT project was started two years ago, funded by the European Commission as part of FP5, with the aim of defining, for different catchment-coastal zone continuums in Europe, possible policy responses to mitigate environmental pressures driven by the main economic activities in each area. In order to account for different aspects of the spatial interactions between the watershed and the coastal zone, the Drivers-Pressures-State-Impact-Response (DPSIR) framework was developed for the Po Catchment-Adriatic Coastal Zone continuum and applied as a policy tool to evaluate important aspects to be considered during the implementation of the WFD. The DPSIR framework includes ecological as well as socio-economic indicators that represent the Drivers that create Pressures, which are gradually integrated as the Status of the system is assessed. Evaluation of the Impact for different Driver and Pressure factors allows the optimal strategy (or strategies) for achieving the objectives and goals of the WFD to be identified. The aim of this paper is to present and discuss a multi-layer database of socio-economic, physical and ecological indicators that has been used to run different biogeochemical and socio-economic models, including one for assessing nutrient fluxes in the catchment (MONERIS, Behrendt, 1996), one for the nutrient cycle in surface and deep seawaters of the Adriatic Sea (the WASP model) and socio-economic models (i.e., DEFINITE) for different socio-economic and environmental scenarios including the Business As Usual (BAU) and

  4. National information network and database system of hazardous waste management in China

    SciTech Connect

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  5. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
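
    The central operation, querying the transform between any two frames by composing each frame's transform to its parent up through the tree, can be sketched as follows; the frame names and the 4x4 homogeneous-matrix representation are assumptions made for illustration, not the flight-software data layout.

        # Sketch of a frame tree that answers "transform from frame A to frame B"
        # by composing each frame's transform to its parent up to the common root.
        # Frame names and the homogeneous-matrix representation are illustrative.
        import numpy as np

        class FrameTree:
            def __init__(self):
                # frame -> (parent, 4x4 transform mapping frame coords to parent coords)
                self.frames = {"root": (None, np.eye(4))}

            def add(self, name, parent, T_parent_from_frame):
                self.frames[name] = (parent, T_parent_from_frame)

            def _root_from(self, name):
                T = np.eye(4)
                while name is not None:
                    parent, T_parent_from_name = self.frames[name]
                    T = T_parent_from_name @ T
                    name = parent
                return T  # maps coordinates in 'name' frame to root coordinates

            def transform(self, src, dst):
                """4x4 matrix mapping points in the src frame to the dst frame."""
                return np.linalg.inv(self._root_from(dst)) @ self._root_from(src)

        def translation(x, y, z):
            T = np.eye(4)
            T[:3, 3] = [x, y, z]
            return T

        tree = FrameTree()
        tree.add("site", "root", translation(100.0, 50.0, 0.0))
        tree.add("rover", "site", translation(2.0, 3.0, 0.0))
        tree.add("mast_camera", "rover", translation(0.0, 0.0, 1.5))
        # The camera origin expressed in site coordinates:
        print(tree.transform("mast_camera", "site") @ np.array([0.0, 0.0, 0.0, 1.0]))
        # approximately [2. 3. 1.5 1.]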

  6. A Method to Calculate and Analyze Residents' Evaluations by Using a Microcomputer Data-Base Management System.

    ERIC Educational Resources Information Center

    Mills, Myron L.

    1988-01-01

    A system developed for more efficient evaluation of graduate medical students' progress uses numerical scoring and a microcomputer database management system as an alternative to manual methods to produce accurate, objective, and meaningful summaries of resident evaluations. (Author/MSE)

  7. KNApSAcK Metabolite Activity Database for retrieving the relationships between metabolites and biological activities.

    PubMed

    Nakamura, Yukiko; Afendi, Farit Mochamad; Parvin, Aziza Kawsar; Ono, Naoaki; Tanaka, Ken; Hirai Morita, Aki; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Kanaya, Shigehiko

    2014-01-01

    Databases (DBs) are required by various omics fields because the volume of molecular biology data is increasing rapidly. In this study, we provide instructions for users and describe the current status of our metabolite activity DB. To facilitate a comprehensive understanding of the interactions between the metabolites of organisms and the chemical-level contribution of metabolites to human health, we constructed a metabolite activity DB known as the KNApSAcK Metabolite Activity DB. It comprises 9,584 triplet relationships (metabolite-biological activity-target species), including 2,356 metabolites, 140 activity categories, 2,963 specific descriptions of biological activities and 778 target species. Approximately 46% of the activities described in the DB are related to chemical ecology, most of which are attributed to antimicrobial agents and plant growth regulators. The majority of the metabolites with antimicrobial activities are flavonoids and phenylpropanoids. The metabolites with plant growth regulatory effects include plant hormones. Over half of the DB contents are related to human health care and medicine. The five largest groups are toxins, anticancer agents, nervous system agents, cardiovascular agents and non-therapeutic agents, such as flavors and fragrances. The KNApSAcK Metabolite Activity DB is integrated within the KNApSAcK Family DBs to facilitate further systematized research in various omics fields, especially metabolomics, nutrigenomics and foodomics. The KNApSAcK Metabolite Activity DB could also be utilized for developing novel drugs and materials, as well as for identifying viable drug resources and other useful compounds.
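
    The triplet relationship described above (metabolite, biological activity, target species) maps naturally onto a single relational table; the sketch below illustrates that shape and a simple retrieval query with Python's sqlite3, using invented rows rather than KNApSAcK data.

        # Hypothetical illustration of metabolite-activity-target triplets and a
        # simple retrieval query; the rows shown are invented examples.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute(
            "CREATE TABLE activity (metabolite TEXT, activity TEXT, target_species TEXT)"
        )
        con.executemany("INSERT INTO activity VALUES (?, ?, ?)", [
            ("quercetin", "antimicrobial", "Staphylococcus aureus"),
            ("quercetin", "antioxidant", "Homo sapiens"),
            ("abscisic acid", "plant growth regulator", "Arabidopsis thaliana"),
        ])

        # Retrieve all recorded activities and target species for one metabolite.
        for row in con.execute(
            "SELECT activity, target_species FROM activity WHERE metabolite = ?",
            ("quercetin",),
        ):
            print(row)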

  8. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    SciTech Connect

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  9. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    NASA Astrophysics Data System (ADS)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices on how to achieve compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.

  10. System configuration management plan for the TWRS controlled baseline database system [TCBD

    SciTech Connect

    Spencer, S.G.

    1998-09-23

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCW. LHMC, TWRS TCBD users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the users' requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO. Once this occurs, the TCBD will be completed and

  11. Development of an Epidemiological Database Management, Extraction, and Analysis System (EPISYS)

    DTIC Science & Technology

    1992-07-01

    Naval Health Research Center technical document describing the development of an Epidemiological Database Management, Extraction, and Analysis System (EPISYS). Prepared by M. R. White; Technical Document 93-6F; approved for public release, distribution unlimited. Naval Health Research Center, P.O. Box 85122, San Diego, California 92186-5122.

  12. Performance of online drug information databases as clinical decision support tools in infectious disease medication management.

    PubMed

    Polen, Hyla H; Zapantis, Antonia; Clauson, Kevin A; Clauson, Kevin Alan; Jebrock, Jennifer; Paris, Mark

    2008-11-06

    Infectious disease (ID) medication management is complex, and clinical decision support tools (CDSTs) can provide valuable assistance. This study evaluated the scope and completeness of ID drug information found in online databases by testing their ability to answer 147 question/answer pairs. Scope scores produced the highest rankings (%) for Micromedex (82.3), Lexi-Comp/American Hospital Formulary Service (81.0), and Medscape Drug Reference (81.0); the lowest were for Epocrates Online Premium (47.0), Johns Hopkins ABX Guide (45.6), and PEPID PDC (40.8).

  13. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    SciTech Connect

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.; Ghosh, Dr. Joydeep

    2014-01-01

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
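
    One way to picture the relational-to-graph idea and a shared-providers style query is sketched below with the networkx library; the node labels, edge semantics, and query are simplified assumptions for illustration and do not reproduce the paper's 3EG transformation.

        # Hypothetical sketch: turn patient/provider visit rows into a graph and ask
        # which providers two patients share; this simplifies, not reproduces, 3EG.
        import networkx as nx

        # Rows as they might come from a normalized visits table: (patient, provider).
        visits = [("patient:1", "provider:A"),
                  ("patient:1", "provider:B"),
                  ("patient:2", "provider:B"),
                  ("patient:2", "provider:C")]

        G = nx.Graph()
        for patient, provider in visits:
            G.add_node(patient, kind="patient")
            G.add_node(provider, kind="provider")
            G.add_edge(patient, provider)   # one edge per visit relationship

        # "Shared providers" becomes a neighborhood intersection instead of a join.
        shared = set(G.neighbors("patient:1")) & set(G.neighbors("patient:2"))
        print(shared)   # {'provider:B'}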

  14. Activated charcoal. (Latest citations from the Compendex database). Published Search

    SciTech Connect

    Not Available

    1993-06-01

    The bibliography contains citations concerning theoretical aspects and industrial applications of activated charcoal. Topics include adsorption capacity and mechanism studies, kinetic and thermodynamic aspects, and description and evaluation of adsorptive abilities. Applications include use in water analyses and waste treatment, air pollution control and measurement, and in nuclear facilities. (Contains a minimum of 151 citations and includes a subject term index and title list.)

  15. A database of radionuclide activity and metal concentrations for the Alligator Rivers Region uranium province.

    PubMed

    Doering, Che; Bollhöfer, Andreas

    2016-10-01

    This paper presents a database of radionuclide activity and metal concentrations for the Alligator Rivers Region (ARR) uranium province in the Australian wet-dry tropics. The database contains 5060 sample records and 57,473 concentration values. The data are for animal, plant, soil, sediment and water samples collected by the Environmental Research Institute of the Supervising Scientist (ERISS) as part of its statutory role to undertake research and monitoring into the impacts of uranium mining on the environment of the ARR. Concentration values are provided in the database for 11 radionuclides ((227)Ac, (40)K, (210)Pb, (210)Po, (226)Ra, (228)Ra, (228)Th, (230)Th, (232)Th, (234)U, (238)U) and 26 metals (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Fe, Hg, K, Mg, Mn, Na, Ni, P, Pb, Rb, S, Sb, Se, Sr, Th, U, V, Zn). Potential uses of the database are discussed.

  16. Improving machine operation management efficiency via improving the vehicle park structure and using the production operation information database

    NASA Astrophysics Data System (ADS)

    Koptev, V. Yu

    2017-02-01

    This work presents the results of studying the basic interconnected criteria of individual equipment units in a transport-network machine fleet, as they depend on production and mining factors, with the aim of improving transport system management. Justifying the selection of a control system requires new methodologies and models, augmented with stability and transport-flow criteria and accounting for the dynamics of mining work on mining sites. A necessary condition is accounting for the technical and operating parameters related to vehicle operation. Modern open-pit mining dispatching systems must include this kind of information database. An algorithm for forming a machine fleet is presented, based on solving a multi-variant task of defining reasonable operating features for a machine working as part of a complex. The proposals in this work may apply to mining machines (drilling equipment, excavators), construction equipment (bulldozers, cranes, pile-drivers), city transport, and other types of production activities that use a machine fleet.

  17. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much of that effort goes into support work to make the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  18. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation briefly covers standard risk management procedures and then covers NASA's risk management tool, ePORT, in detail. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's or project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  19. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    NASA Technical Reports Server (NTRS)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.

  20. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    NASA Astrophysics Data System (ADS)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
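
    To make the idea concrete, the sketch below performs the kind of secondary geometric calculation the paradigm targets (area, centroid, and overlap of contours); the shapely library stands in for the PostGIS geometry engine the authors use, and the contours are invented toy data.

        # Secondary geometric calculations on contour polygons, in the spirit of the
        # PostGIS-backed paradigm above; shapely stands in for the database-side
        # geometry engine, and the contours are invented toy data.
        from shapely.geometry import Polygon

        target = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])    # a target-volume contour
        organ = Polygon([(3, 3), (6, 3), (6, 6), (3, 6)])     # a nearby organ contour

        print("target area:", target.area)                    # 16.0
        print("target centroid:", target.centroid.coords[0])  # (2.0, 2.0)
        print("overlap area:", target.intersection(organ).area)  # 1.0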

  1. Developing a database management system to support birth defects surveillance in Florida.

    PubMed

    Salemi, Jason L; Hauser, Kimberlea W; Tanner, Jean Paul; Sampat, Diana; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2010-01-01

    The value of any public health surveillance program is derived from the ways in which data are managed and used to improve the public's health. Although birth defects surveillance programs vary in their case volume, budgets, staff, and objectives, the capacity to operate efficiently and maximize resources remains critical to long-term survival. The development of a fully-integrated relational database management system (DBMS) can enrich a surveillance program's data and improve efficiency. To build upon the Florida Birth Defects Registry--a statewide registry relying solely on linkage of administrative datasets and unconfirmed diagnosis codes-the Florida Department of Health provided funding to the University of South Florida to develop and pilot an enhanced surveillance system in targeted areas with a more comprehensive approach to case identification and diagnosis confirmation. To manage operational and administrative complexities, a DBMS was developed, capable of managing transmission of project data from multiple sources, tracking abstractor time during record reviews, offering tools for defect coding and case classification, and providing reports to DBMS users. Since its inception, the DBMS has been used as part of our surveillance projects to guide the receipt of over 200 case lists and review of 12,924 fetuses and infants (with associated maternal records) suspected of having selected birth defects in over 90 birthing and transfer facilities in Florida. The DBMS has provided both anticipated and unexpected benefits. Automation of the processes for managing incoming case lists has reduced clerical workload considerably, while improving accuracy of working lists for field abstraction. Data quality has improved through more effective use of internal edits and comparisons with values for other data elements, while simultaneously increasing abstractor efficiency in completion of case abstraction. We anticipate continual enhancement to the DBMS in the future

  2. Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.

    PubMed

    Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles

    2016-12-19

    The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing.

  3. On the evaluation of fuzzy quantified queries in a database management system

    NASA Technical Reports Server (NTRS)

    Bosc, Patrick; Pivert, Olivier

    1992-01-01

    Many propositions to extend database management systems have been made in the last decade. Some of them aim at the support of a wider range of queries involving fuzzy predicates. Unfortunately, these queries are somewhat complex and the question of their efficiency is a subject under discussion. In this paper, we focus on a particular subset of queries, namely those using fuzzy quantified predicates. More precisely, we will consider the case where such predicates apply to individual elements as well as to sets of elements. Thanks to some interesting properties of alpha-cuts of fuzzy sets, we are able to show that the evaluation of these queries can be significantly improved with respect to a naive strategy based on exhaustive scans of sets or files.
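
    A toy illustration of the alpha-cut idea mentioned above: when only answers with satisfaction of at least alpha matter, the crisp alpha-cut of a fuzzy predicate can pre-filter records before any grading is done. The membership function and data below are invented, and this is not the authors' evaluation strategy.

        # Toy alpha-cut evaluation of a fuzzy predicate "well-paid"; only records that
        # survive the crisp alpha-cut are fully graded. Data and membership function
        # are invented for illustration.
        def well_paid(salary):
            """Fuzzy membership in [0, 1]: 0 below 30k, 1 above 60k, linear between."""
            return min(1.0, max(0.0, (salary - 30_000) / 30_000))

        employees = [("ann", 28_000), ("bob", 45_000), ("eve", 70_000)]
        alpha = 0.4

        # Alpha-cut: the crisp condition equivalent to well_paid(salary) >= alpha.
        threshold = 30_000 + alpha * 30_000
        candidates = [(name, s) for name, s in employees if s >= threshold]

        graded = [(name, well_paid(s)) for name, s in candidates]
        print(graded)   # [('bob', 0.5), ('eve', 1.0)]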

  4. Maryland power plant siting program radioecology database management system: format for coding radioecology data. Final report

    SciTech Connect

    Domotor, S.L.

    1986-01-01

    The Radioecology Laboratory of the State of Maryland Power Plant Siting Program (PPSP) conducts routine radiological monitoring programs designed to assess the environmental impact of radionuclides released by nuclear power plants affecting Maryland. The PPSP radioecology database management system was initiated to store existing and future monitoring data collected by PPSP and its subcontractors in a computer file format. From these files, SAS (Statistical Analysis System) datasets are created for qualitative and quantitative analysis of monitoring data, for modeling studies through incorporation of this data, or for predicting environmental impact. The system was designed to accommodate both gamma and beta radionuclide analyses from water, sediment, soil, air, foodstuff, and aquatic and terrestrial flora and fauna sample types. Plant releases of radionuclides and physical and chemical environmental parameters can also be stored.

  5. C&RE-SLC: Database for conservation and renewable energy activities

    SciTech Connect

    Cavallo, J.D.; Tompkins, M.M.; Fisher, A.G.

    1992-08-01

    The Western Area Power Administration (Western) requires all its long-term power customers to implement programs that promote the conservation of electric energy or facilitate the use of renewable energy resources. The hope is that these measures could significantly reduce the amount of environmental damage associated with electricity production. As part of preparing the environmental impact statement for Western's Electric Power Marketing Program, Argonne National Laboratory constructed a database of the conservation and renewable energy activities in which Western's Salt Lake City customers are involved. The database provides information on types of conservation and renewable energy activities and allows for comparisons of activities being conducted at different utilities in the Salt Lake City region. Sorting the database allows Western's Salt Lake City customers to be classified so the various activities offered by different classes of utilities can be identified; for example, comparisons can be made between municipal utilities and cooperatives or between large and small customers. The information included in the database was collected from customer planning documents in the files of Western's Salt Lake City office.

  6. C&RE-SLC: Database for conservation and renewable energy activities

    SciTech Connect

    Cavallo, J.D.; Tompkins, M.M.; Fisher, A.G.

    1992-08-01

    The Western Area Power Administration (Western) requires all its long-term power customers to implement programs that promote the conservation of electric energy or facilitate the use of renewable energy resources. The hope is that these measures could significantly reduce the amount of environmental damage associated with electricity production. As part of preparing the environmental impact statement for Western's Electric Power Marketing Program, Argonne National Laboratory constructed a database of the conservation and renewable energy activities in which Western's Salt Lake City customers are involved. The database provides information on types of conservation and renewable energy activities and allows for comparisons of activities being conducted at different utilities in the Salt Lake City region. Sorting the database allows Western's Salt Lake City customers to be classified so the various activities offered by different classes of utilities can be identified; for example, comparisons can be made between municipal utilities and cooperatives or between large and small customers. The information included in the database was collected from customer planning documents in the files of Western's Salt Lake City office.

  7. C&RE-SLC: Database for conservation and renewable energy activities

    NASA Astrophysics Data System (ADS)

    Cavallo, J. D.; Tompkins, M. M.; Fisher, A. G.

    1992-08-01

    The Western Area Power Administration (Western) requires all its long-term power customers to implement programs that promote the conservation of electric energy or facilitate the use of renewable energy resources. The hope is that these measures could significantly reduce the amount of environmental damage associated with electricity production. As part of preparing the environmental impact statement for Western's Electric Power Marketing Program, Argonne National Laboratory constructed a database of the conservation and renewable energy activities in which Western's Salt Lake City customers are involved. The database provides information on types of conservation and renewable energy activities and allows for comparisons of activities being conducted at different utilities in the Salt Lake City region. Sorting the database allows Western's Salt Lake City customers to be classified so the various activities offered by different classes of utilities can be identified; for example, comparisons can be made between municipal utilities and cooperatives or between large and small customers. The information included in the database was collected from customer planning documents in the files of Western's Salt Lake City office.

  8. An intelligent interactive visual database management system for Space Shuttle closeout image management

    NASA Technical Reports Server (NTRS)

    Ragusa, James M.; Orwig, Gary; Gilliam, Michael; Blacklock, David; Shaykhian, Ali

    1994-01-01

    Status is given of an applications investigation on the potential for using an expert system shell for classification and retrieval of high resolution, digital, color space shuttle closeout photography. This NASA funded activity has focused on the use of integrated information technologies to intelligently classify and retrieve still imagery from a large, electronically stored collection. A space shuttle processing problem is identified, a working prototype system is described, and commercial applications are identified. A conclusion reached is that the developed system has distinct advantages over the present manual system and cost efficiencies will result as the system is implemented. Further, commercial potential exists for this integrated technology.

  9. Image Databases.

    ERIC Educational Resources Information Center

    Pettersson, Rune

    Different kinds of pictorial databases are described with respect to aims, user groups, search possibilities, storage, and distribution. Some specific examples are given for databases used for the following purposes: (1) labor markets for artists; (2) document management; (3) telling a story; (4) preservation (archives and museums); (5) research;…

  10. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-03-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complexwide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and the level and type of current on-site treatment at Department of Energy installations.

  11. A Toolkit for Active Object-Oriented Databases with Application to Interoperability

    NASA Technical Reports Server (NTRS)

    King, Roger

    1996-01-01

    In our original proposal we stated that our research would 'develop a novel technology that provides a foundation for collaborative information processing.' The essential ingredient of this technology is the notion of 'deltas,' which are first-class values representing collections of proposed updates to a database. The Heraclitus framework provides a variety of algebraic operators for building up, combining, inspecting, and comparing deltas. Deltas can be directly applied to the database to yield a new state, or used 'hypothetically' in queries against the state that would arise if the delta were applied. The central point here is that the step of elevating deltas to 'first-class' citizens in database programming languages will yield tremendous leverage on the problem of supporting updates in collaborative information processing. In short, our original intention was to develop the theoretical and practical foundation for a technology based on deltas in an object-oriented database context, develop a toolkit for active object-oriented databases, and apply this toward collaborative information processing.
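
    The abstract's central idea, deltas as first-class values that can be combined, inspected, and queried hypothetically before being applied, can be illustrated with a small sketch. The following Python fragment is not Heraclitus code; the class and method names (Delta, smash, when) are invented for illustration, and the "database" is simply a dictionary.

    # Illustrative sketch (not the Heraclitus implementation): deltas as
    # first-class values over a simple key-value "database" state.

    class Delta:
        """A collection of proposed updates: key -> new value (None = delete)."""

        def __init__(self, changes=None):
            self.changes = dict(changes or {})

        def smash(self, other):
            """Combine two deltas; on conflict the right-hand delta wins."""
            merged = dict(self.changes)
            merged.update(other.changes)
            return Delta(merged)

        def apply(self, state):
            """Produce the new database state that results from this delta."""
            new_state = dict(state)
            for key, value in self.changes.items():
                if value is None:
                    new_state.pop(key, None)
                else:
                    new_state[key] = value
            return new_state

        def when(self, state, query):
            """Hypothetical query: evaluate `query` against the state that
            would arise if the delta were applied, without applying it."""
            return query(self.apply(state))


    if __name__ == "__main__":
        db = {"budget": 100, "owner": "alice"}
        d1 = Delta({"budget": 120})          # one collaborator's proposal
        d2 = Delta({"owner": "bob"})         # another collaborator's proposal
        combined = d1.smash(d2)
        # Inspect the hypothetical outcome before committing anything.
        print(combined.when(db, lambda s: s["budget"] > 110))   # True
        print(db)                                               # unchanged
        print(combined.apply(db))                               # new state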

  13. Rainforests: Conservation and resource management. (Latest citations from the Biobusiness database). Published Search

    SciTech Connect

    Not Available

    1994-12-01

    The bibliography contains citations concerning conservation of rainforest ecology and management of natural resources. Topics include plant community structure and development, nutrient dynamics, rainfall characteristics and water budgets, and forest dynamics. Studies performed in specific forest areas are included. Effects of human activities are also considered. (Contains a minimum of 154 citations and includes a subject term index and title list.)

  14. Karlsruhe Database for Radioactive Wastes (KADABRA) - Accounting and Management System for Radioactive Waste Treatment - 12275

    SciTech Connect

    Himmerkus, Felix; Rittmeyer, Cornelia

    2012-07-01

    The data management system KADABRA was designed according to the purposes of the Central Decontamination Department (HDB) of the Wiederaufarbeitungsanlage Karlsruhe Rueckbau- und Entsorgungs-GmbH (WAK GmbH), which is specialized in the treatment and conditioning of radioactive waste. The layout considers the major treatment processes of the HDB as well as regulatory and legal requirements. KADABRA is designed as an SAG ADABAS application on an IBM System z mainframe. The main function of the system is the data management of all processes related to treatment, transfer and storage of radioactive material within HDB. KADABRA records the relevant data concerning radioactive residues, interim products and waste products as well as the production parameters relevant for final disposal. Analytical data from the laboratory and nondestructive assay systems, that describe the chemical and radiological properties of residues, production batches, interim products as well as final waste products, can be linked to the respective dataset for documentation and declaration. The system enables the operator to trace the radioactive material through processing and storage. Information on the actual status of the material as well as radiological data and storage position can be obtained immediately on request. A variety of programs that access the database allow the generation of individual reports on periodic or special request. KADABRA offers a high security standard and is constantly adapted to the recent requirements of the organization. (authors)

  15. Data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Data Management System-1100 is designed to operate in conjunction with the UNIVAC 1100 Series Operating System on any 1100 Series computer. DMS-1100 is divided into the following four major software components: (1) Data Definition Languages (DDL); (2) Data Management Routine (DMR); (3) Data Manipulation Languages (DML); and (4) Data Base Utilities (DBU). These software components are described in detail.

  16. [Cystic Fibrosis Cloud database: An information system for storage and management of clinical and microbiological data of cystic fibrosis patients].

    PubMed

    Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra

    2016-01-01

    The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and the microorganisms isolated from them. In this work we have designed and developed an on-line database based on an information system that allows users to store, manage and visualize data from clinical studies and microbiological analysis of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named Cystic Fibrosis Cloud database, is available on the http://servoy.infocomsa.com/cfc_database site and is composed of a main database and a web-based interface, which uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients.

  17. Air Compliance Complaint Database (ACCD)

    EPA Pesticide Factsheets

    THIS DATA ASSET NO LONGER ACTIVE: This is metadata documentation for the Region 7 Air Compliance Complaint Database (ACCD) which logs all air pollution complaints received by Region 7. It contains information about the complaint along with how the complaint was addressed. The Air and Waste Management Division is the primary managing entity for this database. This work falls under objectives for EPA's 2003-2008 Strategic Plan (Goal 1) for Clean Air & Global Climate Change, which are to achieve healthier outdoor air.

  18. Drycleaner Database - Region 7

    EPA Pesticide Factsheets

    THIS DATA ASSET NO LONGER ACTIVE: This is metadata documentation for the Region 7 Drycleaner Database (R7DryClnDB) which tracks all Region 7 drycleaners who notify Region 7 subject to Maximum Achievable Control Technology (MACT) standards. The Air and Waste Management Division is the primary managing entity for this database. This work falls under objectives for EPA's 2003-2008 Strategic Plan (Goal 4) for Healthy Communities & Ecosystems, which are to reduce chemical and/or pesticide risks at facilities.

  19. The Design of a “Functional” Database System and its Use in the Management of the Critically Ill

    PubMed Central

    Pollizzi, Joseph A.

    1983-01-01

    This paper reports on the design and implementation of the database used in the Quantitative Sentinal, a commercial patient data management system. This system was designed and has been in clinical use at the Maryland Institute for Emergency Medical Services Systems, in Baltimore, Maryland, for over 5 years. The system has also been installed at several other medical facilities throughout the United States and Canada. The report emphasizes a “model” for clinical information which describes clinical data as a collection of discrete value functions with certain properties. A database design based on the “functional” model is described. Typical data accesses of the database are examined in relationship to sample clinical questions. Within the implementation discussion, a special emphasis on database redundancy is also provided.
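
    As a rough illustration of the "functional" model described above, in which clinical data are treated as discrete value functions, the sketch below represents one clinical variable as a mapping from observation time to value and answers the typical query "what was the value at time t". It is a toy under stated assumptions, not the system's actual design; the class name ClinicalFunction and its methods are invented.

    # Minimal sketch (not the system described above): clinical data modeled as
    # discrete value functions, i.e. per-variable mappings from time to value.

    from bisect import bisect_right
    from datetime import datetime


    class ClinicalFunction:
        """One clinical variable as a discrete function of time."""

        def __init__(self, name):
            self.name = name
            self.times = []    # sorted observation times
            self.values = []   # value observed at each time

        def record(self, t, value):
            i = bisect_right(self.times, t)
            self.times.insert(i, t)
            self.values.insert(i, value)

        def value_at(self, t):
            """Most recent observation at or before time t (None if none)."""
            i = bisect_right(self.times, t)
            return self.values[i - 1] if i else None


    if __name__ == "__main__":
        hr = ClinicalFunction("heart_rate")
        hr.record(datetime(2024, 1, 1, 8, 0), 88)
        hr.record(datetime(2024, 1, 1, 9, 0), 95)
        print(hr.value_at(datetime(2024, 1, 1, 8, 30)))  # 88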

  20. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than the type of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC data base - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (use with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (help facility which informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, personnel register, and plan/actual cost.

  1. PARPs database: A LIMS systems for protein-protein interaction data mining or laboratory information management system

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background: In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description: We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion: Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  2. The Use of INFO, a Database Management System, in Teaching Library and Information Studies at Manchester Polytechnic.

    ERIC Educational Resources Information Center

    Rowley, J. E.; And Others

    1988-01-01

    Outlines the courses offered by the Department of Library and Information Studies at Manchester Polytechnic, and describes the use of a database management system to teach aspects of information science. Details of a number of specific applications are given and future developments are discussed. (CLB)

  3. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    EPA Science Inventory

    Managing the world’s largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a hierarchi...

  4. A Web-Based Multi-Database System Supporting Distributed Collaborative Management and Sharing of Microarray Experiment Information

    PubMed Central

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database Web based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all multidisciplinary actors involved in spotted microarray experiments. PMID:17238488

  5. A Web-based multi-database system supporting distributed collaborative management and sharing of microarray experiment information.

    PubMed

    Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco

    2006-01-01

    We developed MicroGen, a multi-database Web based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all multidisciplinary actors involved in spotted microarray experiments.

  6. Using non-local databases for the environmental assessment of industrial activities: The case of Latin America

    SciTech Connect

    Osses de Eicker, Margarita; Hischier, Roland; Hurni, Hans; Zah, Rainer

    2010-04-15

    Nine non-local databases were evaluated with respect to their suitability for the environmental assessment of industrial activities in Latin America. Three assessment methods were considered, namely Life Cycle Assessment (LCA), Environmental Impact Assessment (EIA) and air emission inventories. The analysis focused on data availability in the databases and the applicability of their international data to Latin American industry. The study showed that the European EMEP/EEA Guidebook and the U.S. EPA AP-42 database are the most suitable ones for air emission inventories, whereas the LCI database Ecoinvent is the most suitable one for LCA and EIA. Due to the data coverage in the databases, air emission inventories are easier to develop than LCA or EIA, which require more comprehensive information. One strategy to overcome the limitations of non-local databases for Latin American industry is the combination of validated data from international databases with newly developed local datasets.

  7. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a MapServer interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.

  8. Second Research Coordination Meeting on Reference Database for Neutron Activation Analysis -- Summary Report

    SciTech Connect

    Firestone, Richard B.; Kellett, Mark A.

    2008-03-19

    The second meeting of the Co-ordinated Research Project on "Reference Database for Neutron Activation Analysis" was held at the IAEA, Vienna, from 7-9 May 2007. A summary of the presentations made by participants is given, along with reports on specifically assigned tasks and subsequent discussions. In order to meet the overall objectives of this CRP, the outputs have been reiterated and new task assignments made.

  9. Developing a comprehensive database management system for organization and evaluation of mammography datasets.

    PubMed

    Wu, Yirong; Rubin, Daniel L; Woods, Ryan W; Elezaby, Mai; Burnside, Elizabeth S

    2014-01-01

    We aimed to design and develop a comprehensive mammography database system (CMDB) to collect clinical datasets for outcome assessment and development of decision support tools. A Health Insurance Portability and Accountability Act (HIPAA) compliant CMDB was created to store multi-relational datasets of demographic risk factors and mammogram results using the Breast Imaging Reporting and Data System (BI-RADS) lexicon. The CMDB collected both biopsy pathology outcomes, in a breast pathology lexicon compiled by extending BI-RADS, and our institutional breast cancer registry. The audit results derived from the CMDB were in accordance with Mammography Quality Standards Act (MQSA) audits and national benchmarks. The CMDB has managed the challenges of multi-level organization demanded by the complexity of mammography practice and lexicon development in pathology. We foresee that the CMDB will be useful for efficient quality assurance audits and development of decision support tools to improve breast cancer diagnosis. Our procedure of developing the CMDB provides a framework to build a detailed data repository for breast imaging quality control and research, which has the potential to augment existing resources.

  10. Developing a Comprehensive Database Management System for Organization and Evaluation of Mammography Datasets

    PubMed Central

    Wu, Yirong; Rubin, Daniel L; Woods, Ryan W; Elezaby, Mai; Burnside, Elizabeth S

    2014-01-01

    We aimed to design and develop a comprehensive mammography database system (CMDB) to collect clinical datasets for outcome assessment and development of decision support tools. A Health Insurance Portability and Accountability Act (HIPAA) compliant CMDB was created to store multi-relational datasets of demographic risk factors and mammogram results using the Breast Imaging Reporting and Data System (BI-RADS) lexicon. The CMDB collected both biopsy pathology outcomes, in a breast pathology lexicon compiled by extending BI-RADS, and our institutional breast cancer registry. The audit results derived from the CMDB were in accordance with Mammography Quality Standards Act (MQSA) audits and national benchmarks. The CMDB has managed the challenges of multi-level organization demanded by the complexity of mammography practice and lexicon development in pathology. We foresee that the CMDB will be useful for efficient quality assurance audits and development of decision support tools to improve breast cancer diagnosis. Our procedure of developing the CMDB provides a framework to build a detailed data repository for breast imaging quality control and research, which has the potential to augment existing resources. PMID:25368510

  11. Comparison of air emissions from various operating scenarios using an environmental database management system

    SciTech Connect

    Rosen, N.

    1997-12-31

    In their continuing effort to reduce air emissions, chemical and petroleum processing facilities must be able to predict, analyze, and compare emissions which result from a variety of operating scenarios. Will the use of a more expensive, yet cleaner fuel improve air emissions enough to warrant the extra cost? What are the threshold levels of production that will push a facility's air emissions out of compliance with regulated limits? Which raw materials have the most prominent effect on the facility's air emissions? Accurately determining the answers to such questions will help a facility determine which emission reduction alternatives are the most efficient and cost-effective. The use of an environmental data management system can make the analysis of different source operating scenarios a painless and efficient task. Within one database, a facility can store all possible operating scenario information, as well as all regulated emissions limits. The system will then process and calculate the air emissions quickly and accurately. Using statistical analysis tools, graphing capabilities, and reports embedded in the system, the facility can easily compare the pros and cons of each operating scenario.
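
    The scenario-comparison idea described above can be sketched in a few lines: store emission factors and regulated limits together, compute emissions for each operating scenario, and flag exceedances. Everything in the sketch (the emission factors, the limits, the function names) is invented for illustration and has no connection to the system in the citation.

    # Hypothetical sketch of the scenario-comparison idea: store operating
    # scenarios and regulated limits side by side, then flag exceedances.

    EMISSION_FACTORS = {        # lb of pollutant per unit of fuel burned (made up)
        "natural_gas": {"NOx": 0.10, "SO2": 0.01},
        "fuel_oil":    {"NOx": 0.20, "SO2": 0.15},
    }
    LIMITS = {"NOx": 250.0, "SO2": 100.0}   # regulated annual limits (made up)

    def annual_emissions(fuel, fuel_use):
        """Annual emissions for one operating scenario."""
        return {p: f * fuel_use for p, f in EMISSION_FACTORS[fuel].items()}

    def compare(scenarios):
        """Return per-scenario emissions and whether each stays in compliance."""
        results = {}
        for name, (fuel, fuel_use) in scenarios.items():
            totals = annual_emissions(fuel, fuel_use)
            results[name] = {
                "emissions": totals,
                "compliant": all(totals[p] <= LIMITS[p] for p in totals),
            }
        return results

    if __name__ == "__main__":
        scenarios = {
            "cheap fuel, full production": ("fuel_oil", 2000),
            "clean fuel, full production": ("natural_gas", 2000),
        }
        for name, result in compare(scenarios).items():
            print(name, result)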

  12. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA

    PubMed Central

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R.

    2010-01-01

    With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large scale alignment and less effective with the sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolutional rates among pairs of nucleotide positions using phylogenetic and evolutionary relationships of the organisms of aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with a Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times bigger and 50% better sensitivity than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534
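
    The study scores pairs of alignment columns for coordinated variation using phylogenetically weighted event counts stored in a relational database; the sketch below only illustrates the simpler underlying notion of covariation scoring, using mutual information between columns of a toy RNA alignment. It is not the authors' method, and the function names and alignment are invented.

    # Simplified stand-in for covariation analysis: mutual information between
    # columns of an RNA alignment, used to flag column pairs that vary together.

    from collections import Counter
    from itertools import combinations
    from math import log2

    def mutual_information(col_i, col_j):
        n = len(col_i)
        pi, pj = Counter(col_i), Counter(col_j)
        pij = Counter(zip(col_i, col_j))
        return sum((c / n) * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
                   for (a, b), c in pij.items())

    def covarying_pairs(alignment, threshold=0.5):
        """Score every column pair of an aligned set of sequences."""
        cols = list(zip(*alignment))
        scored = []
        for i, j in combinations(range(len(cols)), 2):
            mi = mutual_information(cols[i], cols[j])
            if mi >= threshold:
                scored.append((i, j, mi))
        return scored

    if __name__ == "__main__":
        alignment = ["GCAU", "GCAU", "CGAU", "CGGU", "AUGU"]  # toy example
        for i, j, mi in covarying_pairs(alignment, threshold=0.3):
            print(f"columns {i} and {j} covary (MI = {mi:.2f})")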

  13. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    PubMed

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the needs of managing medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on the Client/Server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, querying and summarizing of cases and related biospecimen information, and can automatically synthesize case records based on the database. It supports management not only of long-term follow-up of individual patients but also of grouped cases organized according to the aims of research. The system can improve the efficiency and quality of clinical research in which biospecimens are used coordinately. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  14. Trends in active pharmaceutical ingredient salt selection based on analysis of the Orange Book database.

    PubMed

    Paulekuhn, G Steffen; Dressman, Jennifer B; Saal, Christoph

    2007-12-27

    The Orange Book database published by the U.S. Food and Drug Administration (FDA) was analyzed for the frequency of occurrence of different counterions used for the formation of pharmaceutical salts. The data obtained from the present analysis of the Orange Book are compared to reviews of the Cambridge Structural Database (CSD) and of the Martindale "The Extra Pharmacopoeia". As well as showing overall distributions of counterion usage, results are broken down into 5-year increments to identify trends in counterion selection. Chloride ions continue to be the most frequently utilized anionic counterions for the formation of salts as active pharmaceutical ingredients (APIs), while sodium ions are most widely utilized for the formation of salts starting from acidic molecules. A strong trend toward a wider variety of counterions over the past decade is observed. This trend can be explained by a stronger need to improve the physicochemical properties of research and development compounds.
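
    A frequency analysis of this kind reduces to counting counterions per approval period. The sketch below shows only that counting step on a few invented records; the field names (counterion, approval_year) and the example rows are assumptions, not Orange Book data.

    # Hypothetical sketch of the frequency analysis: count counterions used in
    # approved salt forms, broken down into 5-year approval increments.

    from collections import Counter, defaultdict

    records = [   # invented example rows, not Orange Book content
        {"api": "drug A hydrochloride", "counterion": "chloride", "approval_year": 1996},
        {"api": "drug B sodium",        "counterion": "sodium",   "approval_year": 2003},
        {"api": "drug C besylate",      "counterion": "besylate", "approval_year": 2004},
    ]

    def counts_by_increment(records, width=5):
        """Tally counterion usage per `width`-year approval window."""
        buckets = defaultdict(Counter)
        for r in records:
            start = (r["approval_year"] // width) * width
            buckets[f"{start}-{start + width - 1}"][r["counterion"]] += 1
        return buckets

    if __name__ == "__main__":
        for period, counts in sorted(counts_by_increment(records).items()):
            print(period, counts.most_common())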

  15. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  16. COMPARISON OF EXERCISE PARTICIPATION RATES FOR CHILDREN IN THE LITERATURE WITH THOSE IN EPA'S CONSOLIDATED HUMAN ACTIVITY DATABASE (CHAD)

    EPA Science Inventory

    CHAD contains over 22,000 person-days of human activity pattern survey data. Part of the database includes exercise participation rates for children 0-17 years old, as well as for adults. Analyses of this database indicate that approximately 34% of the 0-17 age group (herea...

  17. Engineering design optimization capability using object-oriented programming method with database management system

    SciTech Connect

    Lee, Hueihuang.

    1989-01-01

    Recent advances in computer hardware and software techniques offer opportunities to create large-scale engineering design systems that were once thought to be impossible or impractical. Incorporating existing software systems into an integrated engineering design system and creating new capabilities in the integrated system are challenging problems in the area of engineering software design. The creation of such a system is a large and complex project. Furthermore, the engineering design system usually needs to be modified and extended quite often because of continuing developments in engineering theories and practice. Confronted with such a massive, complex, and volatile project, program developers have been attempting to devise systematic approaches to complete the software system and maintain its understandability, modifiability, reliability, and efficiency (Ross, Goodenough, and Irvine, 1975). Considerable efforts have been made toward achieving these goals. They include the discipline of software engineering, database management techniques, and software design methodologies. Among the software design methods, the object-oriented approach has been very successful in recent years. This is reflected in the support for the object-oriented programming paradigm in popular programming languages such as Ada (1983) and C++ (Stroustrup, 1986). A new RQP algorithm based on an augmented Lagrangian is implemented in the system in a relatively short time. These capabilities demonstrate the extendibility of IDESIGN-10. The process of developing the new RQP algorithm is presented in depth as a complete demonstration of object-oriented programming in engineering applications. A preliminary evaluation of the algorithm shows that it has potential for engineering applications; however, it needs to be further developed.
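
    The extendibility claim, that a new optimization algorithm such as the augmented-Lagrangian RQP can be dropped into the system quickly, rests on the object-oriented idea of programming against a common interface. The sketch below illustrates that idea only; the class names are invented and none of this is IDESIGN-10 code, whose actual RQP step is far more involved.

    # Sketch of the extendibility argument (class names invented): a new
    # optimization algorithm is added by implementing a common interface,
    # leaving the rest of the design system untouched.

    from abc import ABC, abstractmethod

    class Optimizer(ABC):
        """Interface every optimization algorithm in the system implements."""

        @abstractmethod
        def step(self, x, grad):
            """Return an improved design point given current point and gradient."""

    class GradientDescent(Optimizer):
        def __init__(self, lr=0.1):
            self.lr = lr

        def step(self, x, grad):
            return [xi - self.lr * gi for xi, gi in zip(x, grad)]

    class AugmentedLagrangianRQP(Optimizer):
        """Placeholder for a newly added algorithm; only the interface matters."""

        def step(self, x, grad):
            # A real RQP step would solve a quadratic subproblem here.
            return [xi - 0.05 * gi for xi, gi in zip(x, grad)]

    def run(optimizer, x, gradient, iterations=50):
        """The design system drives any Optimizer through the same loop."""
        for _ in range(iterations):
            x = optimizer.step(x, gradient(x))
        return x

    if __name__ == "__main__":
        def gradient(x):
            return [2 * xi for xi in x]   # gradient of sum(x_i^2)

        print(run(GradientDescent(), [3.0, -4.0], gradient))
        print(run(AugmentedLagrangianRQP(), [3.0, -4.0], gradient))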

  18. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    NASA Technical Reports Server (NTRS)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.

  19. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    PubMed Central

    Köhl, Karin I; Basler, Georg; Lüdemann, Alexander; Selbig, Joachim; Walther, Dirk

    2008-01-01

    Background: For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, meta-analysis of several experiments in systems biology based approaches makes it necessary to store this information in a standardised manner, preferentially in relational databases. In the Golm Plant Database System, we devised a data management system based on a classical Laboratory Information Management System combined with web-based user interfaces for data entry and retrieval to collect this information in an academic environment. Results: The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows like genetic modification (transformation), vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique names generated by the system

  20. Role-Based Access Control for Loosely Coupled Distributed Database Management Systems

    DTIC Science & Technology

    2002-03-01

    Topics covered include the functionality and implementation options that the Hypersonic database provides, understanding the RBAC policy of the application, mapping the application policy, and storage of the...

  1. Laboratory database to manage electrophoresis and chromatography separations and the associated samples.

    PubMed

    Cole, K D

    2000-12-01

    A database was developed to store, organize, and retrieve the data associated with electrophoresis and chromatography separations. It allows laboratories to store extensive data on separation techniques (analytical and preparative). The data for gel electrophoresis includes gel composition, staining methods, electric fields, analysis, and samples loaded. The database stores data on chromatography conditions, the samples used, and the fractions collected. The data structure of this database was designed to maintain the link between samples (including fractions) from chromatography separations and their analysis by gel electrophoresis. The database will allow laboratories to organize and maintain a large amount of separation and sample data in a uniform data environment. It will facilitate the retrieval of the separation history of important samples and the separation conditions used.
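
    The linkage the abstract describes, chromatography runs producing fractions that later appear as samples loaded on electrophoresis gels, maps naturally onto a small relational schema. The SQLite sketch below is a guess at such a schema; the table and column names are invented, not taken from the paper.

    # Illustrative SQLite schema (table and column names invented) for linking
    # chromatography runs, their fractions, and the gel lanes those fractions
    # were later analysed on.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE chromatography_run (
        run_id      INTEGER PRIMARY KEY,
        column_type TEXT,
        buffer      TEXT
    );
    CREATE TABLE fraction (
        fraction_id INTEGER PRIMARY KEY,
        run_id      INTEGER REFERENCES chromatography_run(run_id),
        number      INTEGER
    );
    CREATE TABLE gel (
        gel_id      INTEGER PRIMARY KEY,
        gel_percent REAL,
        stain       TEXT
    );
    CREATE TABLE gel_lane (
        gel_id      INTEGER REFERENCES gel(gel_id),
        lane        INTEGER,
        fraction_id INTEGER REFERENCES fraction(fraction_id)
    );
    """)

    conn.execute("INSERT INTO chromatography_run VALUES (1, 'anion exchange', 'Tris pH 8')")
    conn.execute("INSERT INTO fraction VALUES (10, 1, 7)")
    conn.execute("INSERT INTO gel VALUES (5, 12.0, 'Coomassie')")
    conn.execute("INSERT INTO gel_lane VALUES (5, 3, 10)")

    # Retrieve the separation history of a fraction: which run produced it and
    # which gel lanes it was later analysed on.
    rows = conn.execute("""
    SELECT f.fraction_id, r.column_type, g.gel_id, gl.lane
    FROM fraction f
    JOIN chromatography_run r ON r.run_id = f.run_id
    JOIN gel_lane gl          ON gl.fraction_id = f.fraction_id
    JOIN gel g                ON g.gel_id = gl.gel_id
    """).fetchall()
    print(rows)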

  2. The Design and Implementation of the Ariel Active Database Rule System

    DTIC Science & Technology

    1991-10-01

    In the Ariel rule execution model described, a sequence of database states S0, S1, ..., Sn is connected by a sequence of transitions T1, T2[R1], T2[R2], ..., Tn, where T1 is a user-issued transition and Ti[Rj] is a transition induced by an execution of rule Rj.

  3. Design and utilization of a Flight Test Engineering Database Management System at the NASA Dryden Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Knighton, Donna L.

    1992-01-01

    A Flight Test Engineering Database Management System (FTE DBMS) was designed and implemented at the NASA Dryden Flight Research Facility. The X-29 Forward Swept Wing Advanced Technology Demonstrator flight research program was chosen for the initial system development and implementation. The FTE DBMS greatly assisted in planning and 'mass production' card preparation for an accelerated X-29 research program. Improved Test Plan tracking and maneuver management for a high flight-rate program were proven, and flight rates of up to three flights per day, two times per week were maintained.

  4. Automation and high through-put for a DNA database laboratory: development of a laboratory information management system.

    PubMed

    Steinlechner, M; Parson, W

    2001-06-01

    Automation and high through-put production of DNA profiles has become a necessity in every DNA database unit. In our laboratory we developed a Laboratory Information Management System (LIMS) controlled workflow architecture, which comprises a robotic DNA extraction- and pipetting-system and a capillary electrophoresis unit. This allows a through-put of 4,000 samples per person per year. Improved sample handling and data management, full sample- and batch-histories, and software-aided supervision of result data, with a consequent average turn-around time of 8 days, are the main features of our new system.

  5. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed
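
    Beyond the documentation tables, the abstract mentions an agenda for scheduling instrument time. A minimal version of the reservation check such a system needs is sketched below; it is plain Python rather than the PHP/MySQL implementation described, and the instrument name and booking structure are invented.

    # Hypothetical sketch of the reservation logic a lab-scheduling database
    # needs: reject a new booking if it overlaps an existing one on the same
    # instrument.

    from datetime import datetime

    bookings = {"EMP-1": []}   # instrument -> list of (start, end, user)

    def reserve(instrument, start, end, user):
        """Add a booking, refusing any overlap with an existing booking."""
        for s, e, _ in bookings[instrument]:
            if start < e and s < end:           # intervals overlap
                raise ValueError(f"{instrument} already booked {s} to {e}")
        bookings[instrument].append((start, end, user))
        bookings[instrument].sort()

    if __name__ == "__main__":
        reserve("EMP-1", datetime(2024, 5, 1, 9), datetime(2024, 5, 1, 12), "alice")
        try:
            reserve("EMP-1", datetime(2024, 5, 1, 11), datetime(2024, 5, 1, 14), "bob")
        except ValueError as err:
            print(err)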

  6. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced data base management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL) which is capable of detecting and acting upon complex patterns of events which are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
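
    EPL's rule syntax is not reproduced here, but the core capability, detecting a temporal pattern of events within a time window (the overheating and bank-transaction examples above), can be illustrated with a small matcher. The sketch below is a greedy simplification written in Python; the function name and the event log are invented.

    # Minimal illustration (not EPL syntax) of detecting a sequence of database
    # events that occurs, in order, within a time window.

    from datetime import datetime, timedelta

    def matches_pattern(events, pattern, window):
        """Greedy detection: do the event types in `pattern` occur in order,
        with all matched events inside `window`? Matching restarts whenever
        the window is exceeded."""
        start, idx = None, 0
        for timestamp, event_type in sorted(events):
            if start is not None and timestamp - start > window:
                start, idx = None, 0          # window expired; start over
            if event_type == pattern[idx]:
                start = start or timestamp
                idx += 1
                if idx == len(pattern):
                    return True
        return False

    if __name__ == "__main__":
        log = [
            (datetime(2024, 1, 1, 10, 0), "open_account"),
            (datetime(2024, 1, 1, 10, 5), "large_deposit"),
            (datetime(2024, 1, 1, 10, 9), "full_withdrawal"),
        ]
        pattern = ["open_account", "large_deposit", "full_withdrawal"]
        print(matches_pattern(log, pattern, timedelta(minutes=15)))   # True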

  7. [Anemia management in haemodialysis. EuCliD database in Spain].

    PubMed

    Avilés, B; Coronel, F; Pérez-García, R; Marcelli, D; Orlandini, G; Ayala, J A; Rentero, R

    2002-01-01

    We present the results on anaemia management in Fresenius Medical Care Spain dialysis centres as reported by EuCliD (European Clinical Database), evaluating a population of 4,426 patients treated in Spain during the year 2001. To analyse the erythropoietin dose and the haemoglobin levels we divided the population into two groups according to time on dialysis treatment: patients treated for less than six months and patients between six months and four years on therapy. We compared our results with the evidence-based recommendation guidelines: the European Best Practice Guidelines (EBPG) and the US National Kidney Foundation guidelines (NKF-K/DOQI). We also compared our results with those presented by the ESAM2 study on 2,618 patients on dialysis in Spain carried out in the second half of the year 2000. We observed that 70% of the population reaches a haemoglobin value higher than 11 g/dl, with a mean erythropoietin (rHu-EPO) dose of 111.9 IU/kg body weight/week (n = 3,700; SD 74.9). However, for those patients on treatment for less than six months, the mean haemoglobin only reaches 10.65 g/dl (n = 222; SD 1.4). The rHu-EPO was administered subcutaneously in 70.2% of the patients. Regarding iron therapy, 86% of the patients received iron treatment and the administration route was intravenous in 93% of the population. The ferritin levels were below 100 micrograms/dl in 10% of the patients and 26.4% showed a transferrin saturation index (TSAT) below 20%. The erythropoietin resistance index (ERI), expressed as rHu-EPO dose/haemoglobin, was used to evaluate the response to rHu-EPO according to different variables. The following factors were observed to lead to a higher rHu-EPO resistance: the intravenous rHu-EPO administration route, the presence of hypoalbuminaemia, increased C-reactive protein, transferrin saturation below 20%, and starting dialysis during the last six months.

  8. Platelet actively cooled thermal management devices

    NASA Astrophysics Data System (ADS)

    Mueggenburg, H. H.; Hidahl, J. W.; Kessler, E. L.; Rousar, D. C.

    1992-07-01

    An overview of 28 years of design and development history for actively cooled platelet thermal management devices is presented. Platelet devices are created by bonding together thin metal sheets (platelets) which contain chemically etched coolant passages. The bonding process produces an intricate and precise matrix of coolant passages and structural walls contained within a monolithic structure. Thirteen specific applications for platelet thermal management devices are described. These devices are cooled using convective, film, and transpiration cooling techniques. Platelet thermal management devices have been fabricated from a variety of metals, cooled with a variety of fluids, and operated at heat fluxes up to 200 Btu/sq in.-sec.

  9. Watershed Data Management (WDM) database for Salt Creek streamflow simulation, DuPage County, Illinois, water years 2005-11

    USGS Publications Warehouse

    Bera, Maitreyee

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with the DuPage County Stormwater Management Division, maintains a USGS database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. Most of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. An earlier report describes in detail the WDM database development, including the processing of data from January 1, 1997, through September 30, 2004, in the SEP04.WDM database. SEP04.WDM has been updated with appended data from October 1, 2004, through September 30, 2011 (water years 2005–11), and renamed SEP11.WDM. This report details the processing of meteorologic and hydrologic data in SEP11.WDM. It provides a record of snow-affected periods and the data used to fill missing-record periods for each precipitation site during water years 2005–11. The meteorologic data-filling methods are described in detail in Over and others (2010), and an update is provided in this report.

  10. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    USGS Publications Warehouse

    Wang, Lizhu; Riseng, Catherine M.; Mason, Lacey; Werhrly, Kevin; Rutherford, Edward; McKenna, James E.; Castiglione, Chris; Johnson, Lucinda B.; Infante, Dana M.; Sowa, Scott P.; Robertson, Mike; Schaeffer, Jeff; Khoury, Mary; Gaiot, John; Hollenhurst, Tom; Brooks, Colin N.; Coscarelli, Mark

    2015-01-01

    Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a spatial classification framework and database — Great Lakes Aquatic Habitat Framework (GLAHF). GLAHF consists of catchments, coastal terrestrial, coastal margin, nearshore, and offshore zones that encompass the entire Great Lakes Basin. The catchments captured in the database as river pour points or coastline segments are attributed with data known to influence physicochemical and biological characteristics of the lakes from the catchments. The coastal terrestrial zone consists of 30-m grid cells attributed with data from the terrestrial region that has direct connection with the lakes. The coastal margin and nearshore zones consist of 30-m grid cells attributed with data describing the coastline conditions, coastal human disturbances, and moderately to highly variable physicochemical and biological characteristics. The offshore zone consists of 1.8-km grid cells attributed with data that are spatially less variable compared with the other aquatic zones. These spatial classification zones and their associated data are nested within lake sub-basins and political boundaries and allow the synthesis of information from grid cells to classification zones, within and among political boundaries, lake sub-basins, Great Lakes, or within the entire Great Lakes Basin. This spatially structured database could help the development of basin-wide management plans, prioritize locations for funding and specific management actions, track protection and restoration progress, and conduct research for science-based decision making.

  11. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…

  12. Status Report for Remediation Decision Support Project, Task 1, Activity 1.B – Physical and Hydraulic Properties Database and Interpretation

    SciTech Connect

    Rockhold, Mark L.

    2008-09-26

    The objective of Activity 1.B of the Remediation Decision Support (RDS) Project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the objectives of Activity 1.B of the RDS Project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database maintained by PNNL, (2) transfer the physical and hydraulic property data from the Microsoft Access database files used by SoilVision® into HEIS, which has most recently been maintained by Fluor-Hanford, Inc., (3) develop a Virtual Library module for accessing these data from HEIS, and (4) write a User's Manual for the Virtual Library module. The development of the Virtual Library module was to be performed by a third party under subcontract to Fluor. The intent of these activities is to make the available physical and hydraulic property data more readily accessible and useable by technical staff and operable unit managers involved in waste site assessments and

  13. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .NET, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written using Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.
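
    The database-independence the abstract attributes to SQLAlchemy can be sketched in a few lines: the same inspection and query code runs against whichever backend the connection URL points to. The snippet below assumes SQLAlchemy (1.4 or later) is installed and uses an in-memory SQLite database so it stays self-contained; it is not jSPyDB code.

    # Backend-independent access via SQLAlchemy: only the connection URL
    # changes per database technology; the inspection and query code does not.

    from sqlalchemy import create_engine, inspect, text

    def summarize(engine, query):
        """List tables and run a query against any SQLAlchemy engine."""
        with engine.connect() as conn:
            tables = inspect(engine).get_table_names()
            rows = conn.execute(text(query)).fetchall()
        return tables, rows

    if __name__ == "__main__":
        # In-memory SQLite keeps the example self-contained; pointing the URL
        # at e.g. "mysql+pymysql://user:pw@host/db" exercises the same code path.
        engine = create_engine("sqlite://")
        with engine.begin() as conn:
            conn.execute(text("CREATE TABLE runs (id INTEGER, status TEXT)"))
            conn.execute(text("INSERT INTO runs VALUES (1, 'ok'), (2, 'failed')"))
        print(summarize(engine, "SELECT * FROM runs"))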

  14. What is the role of recombinant activated protein C in the management of sepsis?

    PubMed

    Alvarado, Juan; Castro, Ricardo

    2016-12-20

    During an episode of sepsis, the systemic inflammatory response triggers a series of procoagulant mechanisms. It has been suggested that the use of activated protein C could play a role in the management of this pathology, but there is no consensus. Searching in the Epistemonikos database, which is maintained by screening multiple databases, we identified seven systematic reviews covering 35 primary studies addressing the question of this article, including six randomized trials. We extracted data, combined the evidence using meta-analysis and generated a summary of findings table following the GRADE approach. We concluded that activated protein C in sepsis probably does not decrease the mortality rate and increases the rate of hemorrhagic events.

  15. A Data-Based Financial Management Information System (FMIS) for Administrative Sciences Department

    DTIC Science & Technology

    1990-12-01

    Financial Management Information System that would result in improved management of financial assets, better use of clerical skills, and more detailed...develops and implements a personal computer-based Management Information System for the management of the many funding accounts controlled by the...different software programs, into a single all-encompassing Management Information System. The system was written using dBASE IV and is currently operational.

  16. Recently active traces of the Bartlett Springs Fault, California: a digital database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2010-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Bartlett Springs Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale aerial photography. In a few places, evidence of fault creep and offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed on-line or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for use of and limitations of the map.

  17. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  18. Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard J.; Draper, Jesse M.

    A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…

  19. Design and Application of an Object Oriented Graphical Database Management System for Synthetic Environments

    DTIC Science & Technology

    1991-12-01

    development of the Database Generation System was also successful. In addition to providing an excellent platform for testing GDMS functionality, it proved to...Format. Several geometries are available for modeling objects in three-space: procedural models, fractals, grammar-based models, particle systems

  20. Performance Measurement of Three Commercial Object-Oriented Database Management Systems

    DTIC Science & Technology

    1993-12-01

    have the limitation. Itasca customer support told us that the problem would be cleared up in the next release. To solve the problem, a dummy superclass... [Figure: benchmark results for the large remote database, showing elapsed time in seconds (0-250) for ObjectStore and Matisse, cold and warm.]

  1. Identification of promiscuous ene-reductase activity by mining structural databases using active site constellations

    PubMed Central

    Steinkellner, Georg; Gruber, Christian C.; Pavkov-Keller, Tea; Binter, Alexandra; Steiner, Kerstin; Winkler, Christoph; Łyskowski, Andrzej; Schwamberger, Orsolya; Oberer, Monika; Schwab, Helmut; Faber, Kurt; Macheroux, Peter; Gruber, Karl

    2014-01-01

    The exploitation of catalytic promiscuity and the application of de novo design have recently opened the access to novel, non-natural enzymatic activities. Here we describe a structural bioinformatic method for predicting catalytic activities of enzymes based on three-dimensional constellations of functional groups in active sites (‘catalophores’). As a proof-of-concept we identify two enzymes with predicted promiscuous ene-reductase activity (reduction of activated C–C double bonds) and compare them with known ene-reductases, that is, members of the Old Yellow Enzyme family. Despite completely different amino acid sequences, overall structures and protein folds, high-resolution crystal structures reveal equivalent binding modes of typical Old Yellow Enzyme substrates and ligands. Biochemical and biocatalytic data show that the two enzymes indeed possess ene-reductase activity and reveal an inverted stereopreference compared with Old Yellow Enzymes for some substrates. This method could thus be a tool for the identification of viable starting points for the development and engineering of novel biocatalysts. PMID:24954722

  2. The U.S. Geological Survey mapping and cartographic database activities, 2006-2010

    USGS Publications Warehouse

    Craun, Kari J.; Donnelly, John P.; Allord, Gregory J.

    2011-01-01

    The U.S. Geological Survey (USGS) began systematic topographic mapping of the United States in the 1880s, beginning with scales of 1:250,000 and 1:125,000 in support of geological mapping. Responding to the need for higher resolution and more detail, the 1:62,500-scale, 15-minute topographic map series was begun early in the 20th century. Finally, in the 1950s the USGS adopted the 1:24,000-scale, 7.5-minute topographic map series to portray even more detail, completing the coverage of the conterminous 48 states of the United States with this series in 1992. In 2001, the USGS developed the vision and concept of The National Map, a topographic database for the 21st century and the source for a new generation of topographic maps (http://nationalmap.gov/). In 2008, the initial production of those maps began with a 1:24,000-scale digital product. In a separate but related project, the USGS began scanning the existing inventory of historical topographic maps at all scales to accompany the new topographic maps. The USGS also had developed a digital database of The National Atlas of the United States. The digital version of the Atlas is now available on the Web and supports a mapping engine for small-scale maps of the United States and North America. These three efforts define the topographic mapping activities of the USGS during the last few years and are discussed below.

  3. Digital Database of Recently Active Traces of the Hayward Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.

    2006-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Hayward Fault Zone, California. The mapped traces represent the integration of the following three different types of data: (1) geomorphic expression, (2) creep (aseismic fault slip), and (3) trench exposures. This publication is a major revision of an earlier map (Lienkaemper, 1992), which both brings the evidence for faulting up to date and makes it available both as a digital database for use within a geographic information system (GIS) and, for broader public access, interactively using widely available viewing software. The pamphlet describes in detail the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for use of and limitations of the map. [Last revised Nov. 2008, a minor update for 2007 LiDAR and recent trench investigations; see version history below.]

  4. Soil Characterization Database for the Area 5 Radioactive Waste Management Site, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    Y. J. Lee; R. D. Van Remortel; K. E. Snyder

    2005-01-01

    Soils were characterized in an investigation at the Area 5 Radioactive Waste Management Site at the U.S. Department of Energy Nevada Test Site in Nye County, Nevada. Data from the investigation are presented in four parameter groups: sample and site characteristics, U.S. Department of Agriculture (USDA) particle size fractions, chemical parameters, and American Society for Testing Materials-Unified Soil Classification System (ASTM-USCS) particle size fractions. Spreadsheet workbooks based on these parameter groups are presented to evaluate data quality, conduct database updates, and set data structures and formats for later extraction and analysis. This document does not include analysis or interpretation of presented data.

  5. Soil Characterization Database for the Area 3 Radioactive Waste Management Site, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    R. D. Van Remortel; Y. J. Lee; K. E. Snyder

    2005-01-01

    Soils were characterized in an investigation at the Area 3 Radioactive Waste Management Site at the U.S. Department of Energy Nevada Test Site in Nye County, Nevada. Data from the investigation are presented in four parameter groups: sample and site characteristics, U.S. Department of Agriculture (USDA) particle size fractions, chemical parameters, and American Society for Testing Materials-Unified Soil Classification System (ASTM-USCS) particle size fractions. Spreadsheet workbooks based on these parameter groups are presented to evaluate data quality, conduct database updates, and set data structures and formats for later extraction and analysis. This document does not include analysis or interpretation of presented data.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  7. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  8. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  9. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  10. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  11. There must be a better way! Managing a corporate web site dynamically from a database

    SciTech Connect

    j.z. cohen

    1998-10-21

    This document is a set of slides available from http://www1.y12.org/lmes_sti/html/ycsdinf-98-8/index.htm that describes limitations of static web pages for conveying information, a plan for overcoming these limitations by generating web pages dynamically from a database, expected advantages and disadvantages of this method, a design for a system using the method, and future plans.
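
    As a concrete illustration of the idea in the slides, the short Python sketch below renders an HTML index page from rows in a database instead of from hand-maintained static files. The documents table and its columns are invented for the example.

      import sqlite3
      from html import escape

      def render_index(conn):
          """Build an HTML index page from rows in a (hypothetical) documents table."""
          rows = conn.execute("SELECT title, url FROM documents ORDER BY title").fetchall()
          items = "\n".join(
              '<li><a href="{0}">{1}</a></li>'.format(escape(url), escape(title))
              for title, url in rows
          )
          return "<html><body><h1>Document index</h1><ul>\n" + items + "\n</ul></body></html>"

      if __name__ == "__main__":
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE documents (title TEXT, url TEXT)")
          conn.executemany("INSERT INTO documents VALUES (?, ?)",
                           [("Annual report", "reports/annual.html"), ("Site map", "map.html")])
          print(render_index(conn))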

  12. IChart: A Graphical Tool to View and Manipulate Force Management Structure Databases

    DTIC Science & Technology

    2008-09-01

    while reconnect may be used to open a new connection to the current database. Invoking loadTable with the name of a table causes an SQL query statement...multiple lines, each of which is composed of a keyword, a colon or equal sign separator, and a value. Whitespace is allowed around the separator for...separated by dashes. The time has the fields separated by colons. IChart currently has no way of allowing the user to manipulate the time, so the file
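
    The configuration format described in the snippet (one keyword per line, a colon or equal sign as separator, optional whitespace around it, dates with dash-separated fields and times with colon-separated fields) can be parsed with a few lines of Python. The field names in the sample below are illustrative only, not taken from IChart.

      import re
      from datetime import datetime

      PAIR = re.compile(r"^\s*(\w+)\s*[:=]\s*(.*?)\s*$")

      def parse_config(text):
          """Return a dict of keyword/value pairs from 'keyword: value' or 'keyword = value' lines."""
          pairs = {}
          for line in text.splitlines():
              match = PAIR.match(line)
              if match:
                  pairs[match.group(1).lower()] = match.group(2)
          return pairs

      if __name__ == "__main__":
          sample = "database = forcemgmt\ndate: 2008-09-01\ntime: 12:30:00\n"
          cfg = parse_config(sample)
          # Dates use dash-separated fields and times use colon-separated fields, as described.
          when = datetime.strptime(cfg["date"] + " " + cfg["time"], "%Y-%m-%d %H:%M:%S")
          print(cfg, when)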

  13. Design and Implementation of the Digital Engineering Laboratory Distributed Database Management System.

    DTIC Science & Technology

    1984-12-01

    [Figure: two configurations of DBMSs 1...n connected to a communication channel, (a) with and (b) without intermediate communication modules.] ...problems. Dawson of Mitre Corporation (2) discusses using distributed databases for a field-deployable, tactical air control system. The Worldwide...Military Command and Control System is heavily dependent on networking capabilities, and in an article Coles of Mitre Corporation (1) discusses current

  14. Chronic Fatigue Syndrome (CFS): Managing Activities and Exercise

    MedlinePlus

    ... Fatigue Syndrome (CFS): Managing Activities and Exercise. On this page: Avoiding Extremes; Developing an Activity ... recent manageable level of activity. Strength and conditioning exercises are an important component ...

  15. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    Not Available

    1994-10-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 250 citations and includes a subject term index and title list.)

  16. Sustainable forest management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1996-12-01

    The bibliography contains citations concerning developments in sustainable forestry management. Topics include international regulations, economics, strategies, land use rights, ecological impact, political developments, and evaluations of sustainable forestry resource management programs. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  17. Woodlot management. (Latest citations from the Cab abstracts database). Published Search

    SciTech Connect

    1995-01-01

    The bibliography contains citations concerning worldwide woodlot and forest management. Topics cover private forest ecology, environmental policies, legislation, and land use. Timber harvesting, logging, timber valuation, farm management and forest inventories are examined. Forestry economics, including forecasting and planning, are also included. (Contains a minimum of 199 citations and includes a subject term index and title list.)

  18. Analysis of medical disputes regarding chronic pain management in the 2009–2016 period using the Korean Society of Anesthesiologists database

    PubMed Central

    Lee, Jin Young; Jung, Da Woon; Yang, Jae Young; Kim, Dae Yoon

    2017-01-01

    Background The active involvement of anesthesiologists in chronic pain management has been associated with an increase in the number of related medical dispute cases. Methods Using the Korean Society of Anesthesiologists Legislation Committee database covering case files from July 2009 to June 2016, we explored injuries and liability characteristics in a subset of cases involving chronic pain management. Results During the study period, 58 cases were eligible for final analysis. There were 27 cases related to complex regional pain syndrome (CRPS), many of them involving problems with financial compensation (24/27, 88.9%). The CRPS cases showed male dominance (22 males, 5 females). In a disproportionately large number of these cases, the causative injury occurred during military training (n = 5). Two cases were associated with noninvasive pain management, and 29 cases with invasive procedures. Of the latter group, procedures involving the spine (both neuraxial and non-neuraxial procedures) resulted in more severe complications than other procedures (P = 0.007). Seven of the patients who underwent invasive procedures died. The most common type of invasive procedure was a lumbosacral procedure (16/29, 55.2%). More specifically, the most common damaging events were inadvertent intravascular or intrathecal injection of local anesthetics (n = 6). Conclusions Several characteristics of medical disputes related to chronic pain management were identified: the prevalence of injury benefit claims in CRPS patients, higher severity of complications in procedures performed at the spine or cervical region, and the preventability of inadvertent intravascular or intrathecal injection of local anesthetics. PMID:28367290

  19. Spatial database for the management of "urban geology" geothematic information: the case of Drama City, Greece

    NASA Astrophysics Data System (ADS)

    Pantelias, Eustathios; Zervakou, Alexandra D.; Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.

    2008-10-01

    The aggregation of population in big cities leads to the concentration of human activities, economic wealth, overconsumption of natural resources, and urban growth without planning and sustainable management. As a result, urban societies are exposed to various dangers and threats with economic, social, and ecological-environmental impacts on the urban surroundings. Problems associated with urban development are related to the geological conditions of cities and their surroundings, e.g., flooding, land subsidence, groundwater pollution, soil contamination, earthquakes, landslides, etc. For these reasons, no sustainable urban planning can be done without geological information support. The first systematic recording, codification, and documentation of "urban geology" geothematic information in Greece is being implemented by the Institute of Geological and Mineral Exploration (I.G.M.E.) in the frame of the project "Collection, codification and documentation of geothematic information for urban and suburban areas in Greece - pilot applications". Through the implementation of this project, all geothematic information derived from geological mapping and from geotechnical, geochemical, and geophysical research and measurements in four pilot areas of Greece - Drama (northern Greece), Nafplio and Sparti (Peloponnese), and Thrakomakedones (Attica) - is stored and processed in specially designed geodatabases in a GIS environment containing vector and raster data. For this GIS application the ArcGIS Personal Geodatabase is used. Data are classified into geothematic layers, grouped into geothematic datasets (e.g., Topography, Geology - Tectonics, Submarine Geology, Technical Geology, Hydrogeology, Soils, Radioactive elements, etc.), and processed in order to produce multifunctional geothematic maps. All compiled data constitute the essential base for land use planning and environmental protection in specific urban areas. With the termination of the project the produced geodatabase and other digital data

  20. Analysis of DOE international environmental management activities

    SciTech Connect

    Ragaini, R.C.

    1995-09-01

    The Department of Energy's (DOE) Strategic Plan (April 1994) states that DOE's long-term vision includes world leadership in environmental restoration and waste management activities. The activities of the DOE Office of Environmental Management (EM) can play a key role in DOE's goals of maintaining U.S. global competitiveness and ensuring the continuation of a world class science and technology community. DOE's interest in attaining these goals stems partly from its participation in organizations like the Trade Policy Coordinating Committee (TPCC), with its National Environmental Export Promotion Strategy, which seeks to strengthen U.S. competitiveness and the building of public-private partnerships as part of U.S. industrial policy. The International Interactions Field Office task will build a communication network which will facilitate efficient and effective communication between DOE Headquarters, Field Offices, and contractors. Under this network, Headquarters will provide the Field Offices with information on the Administration's policies and activities (such as the DOE Strategic Plan), interagency activities, as well as relevant information from other field offices. Lawrence Livermore National Laboratory (LLNL) will, in turn, provide Headquarters with information on various international activities which, when appropriate, will be included in reports to groups like the TPCC and the EM Focus Areas. This task provides for the collection, review, and analysis of information on the more significant international environmental restoration and waste management initiatives and activities which have been used or are being considered at LLNL. Information gathering will focus on efforts and accomplishments in meeting the challenges of providing timely and cost effective cleanup of its environmentally damaged sites and facilities, especially through international technical exchanges and/or the implementation of foreign-developed technologies.

  1. A new database on contaminant exposure and effects in terrestrial vertebrates for natural resource managers

    USGS Publications Warehouse

    Rattner, B.A.; Pearson, J.L.; Garrett, L.J.; Erwin, R.M.; Walz, A.; Ottinger, M.A.; Barrett, H.R.

    1997-01-01

    The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior is focused on identifying and understanding effects of contaminant stressors on biological resources under its stewardship. Despite the desire of many to continuously monitor the environmental health of our estuaries, much can be learned by summarizing existing temporal, geographic, and phylogenetic contaminant information. To this end, retrospective contaminant exposure and effects data for amphibians, reptiles, birds, and mammals residing within 30 km of Atlantic coast estuaries are being assembled through searches of published literature (e.g., Fisheries Review, Wildlife Review, BIOSIS Previews) and databases (e.g., US EPA Ecological Incident Information System; USGS Diagnostic and Epizootic Databases), and compilation of summary data from unpublished reports of government natural resource agencies, private conservation groups, and universities. These contaminant exposure and effect data for terrestrial vertebrates (CEE-TV) are being summarized using Borland dBASE in a 96-field format, including species, collection time and site coordinates, sample matrix, contaminant concentration, biomarker and bioindicator responses, and source of information (N>1500 records). This CEE-TV database has been imported into the ARC/INFO geographic information system (GIS) for purposes of examining geographic coverage and trends and identifying critical data gaps. A preliminary risk assessment will be conducted to identify and characterize contaminants and other stressors potentially affecting terrestrial vertebrates that reside, migrate through, or reproduce in these estuaries. Evaluations are underway, using specific measurement and assessment endpoints, to rank and prioritize estuarine ecosystems in which terrestrial vertebrates are potentially at risk for purposes of prediction and focusing future biomonitoring efforts.
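
    A sketch of the kind of record the 96-field CEE-TV format summarizes (species, collection time, site coordinates, sample matrix, contaminant concentration, source); the field names and the simple screening filter below are illustrative and are not the actual database layout.

      from dataclasses import dataclass

      @dataclass
      class CeeTvRecord:
          species: str
          collected: str            # e.g. "1995-06"
          latitude: float
          longitude: float
          matrix: str               # e.g. egg, liver, whole body
          contaminant: str
          concentration_ppm: float
          source: str

      def exceeds_screening(records, contaminant, threshold_ppm):
          """Return records for one contaminant whose concentration exceeds a screening value."""
          return [r for r in records
                  if r.contaminant == contaminant and r.concentration_ppm > threshold_ppm]

      if __name__ == "__main__":
          recs = [CeeTvRecord("Pandion haliaetus", "1995-06", 38.9, -76.4, "egg",
                              "DDE", 4.2, "unpublished agency report"),
                  CeeTvRecord("Rana sphenocephala", "1996-04", 37.2, -76.5, "whole body",
                              "DDE", 0.3, "published literature")]
          print(len(exceeds_screening(recs, "DDE", 1.0)))  # 1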

  2. COMPILATION AND MANAGEMENT OF ORP GLASS FORMULATION DATABASE, VSL-12R2470-1 REV 0

    SciTech Connect

    Kruger, Albert A.; Pasieka, Holly K.; Muller, Isabelle; Gilbo, Konstantin; Perez-Cardenas, Fernando; Joseph, Innocent; Pegg, Ian L.; Kot, Wing K.

    2012-12-13

    The present report describes the first steps in the development of a glass property-composition database for WTP LAW and HLW glasses that includes all of the data that were used in the development of the WTP baseline models and all of the data collected subsequently as part of WTP enhancement studies performed for ORP. The data were reviewed to identify some of the more significant gaps in the composition space that will need to be filled to support waste processing at Hanford. The WTP baseline models have been evaluated against the new data in terms of range of validity and prediction performance.

  3. Earthquake Model of the Middle East (EMME) Project: Active Fault Database for the Middle East Region

    NASA Astrophysics Data System (ADS)

    Gülen, L.; Wp2 Team

    2010-12-01

    The Earthquake Model of the Middle East (EMME) Project is a regional project of the umbrella GEM (Global Earthquake Model) project (http://www.emme-gem.org/). The EMME project region includes Turkey, Georgia, Armenia, Azerbaijan, Syria, Lebanon, Jordan, Iran, Pakistan, and Afghanistan. The EMME and SHARE projects overlap, and Turkey becomes a bridge connecting the two projects. The Middle East region is a tectonically and seismically very active part of the Alpine-Himalayan orogenic belt. Many major earthquakes have occurred in this region over the years, causing casualties in the millions. The EMME project will use a PSHA approach, and the existing source models will be revised or modified by the incorporation of newly acquired data. More importantly, the most distinguishing aspect of the EMME project from the previous ones will be its dynamic character. This very important characteristic is accomplished by the design of a flexible and scalable database that will permit continuous update, refinement, and analysis. A digital active fault map of the Middle East region is under construction in ArcGIS format. We are developing a database of fault parameters for active faults that are capable of generating earthquakes above a threshold magnitude of Mw≥5.5. Similar to the WGCEP-2007 and UCERF-2 projects, the EMME project database includes information on the geometry and rates of movement of faults in a “Fault Section Database”. The “Fault Section” concept has a physical significance, in that if one or more fault parameters change, a new fault section is defined along a fault zone. So far over 3,000 Fault Sections have been defined and parameterized for the Middle East region. A separate “Paleo-Sites Database” includes information on the timing and amounts of fault displacement for major fault zones. A digital reference library that includes the PDF files of the relevant papers and reports is also being prepared. Another task of the WP-2 of the EMME project is to prepare
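
    A toy sketch of the "Fault Section" concept described above: walking along a mapped fault zone and starting a new section whenever one of the tracked parameters changes. The parameter names and values are invented for illustration.

      def split_into_sections(points):
          """points: mapped points along a fault zone, each a dict of tracked fault parameters."""
          sections, current = [], []
          for pt in points:
              if current and any(pt[k] != current[-1][k] for k in ("slip_rate_mm_yr", "dip_deg")):
                  sections.append(current)   # a parameter changed, so close the current section
                  current = []
              current.append(pt)
          if current:
              sections.append(current)
          return sections

      if __name__ == "__main__":
          zone = [{"slip_rate_mm_yr": 4.0, "dip_deg": 90},
                  {"slip_rate_mm_yr": 4.0, "dip_deg": 90},
                  {"slip_rate_mm_yr": 2.5, "dip_deg": 75}]   # parameter change -> new section
          print(len(split_into_sections(zone)))              # 2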

  4. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1997-05-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  5. Planning and management of water resource programs. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-02-01

    The bibliography contains citations concerning planning and management of water resource programs and projects at the local, regional, state, and national levels. The studies of water quality, drinking water, industrial water, and irrigation are presented. Topics include groundwater and surface water management, flood control, waste water treatment, hydroelectric power generation, sanitation and toxic hazards, models and risk assessment, and remote sensing technology. Worldwide water management is covered. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  6. Recently Active Traces of the Berryessa Fault, California: A Digital Database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2012-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Berryessa section and parts of adjacent sections of the Green Valley Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale 2010 aerial photography and from 2007 and 2011 0.5- and 1.0-meter bare-earth LiDAR imagery (that is, high-resolution topographic data). In a few places, evidence of fault creep and offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed on-line or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance for use of and limitations of the map.

  7. A database and model to support proactive management of sediment-related sewer blockages.

    PubMed

    Rodríguez, Juan Pablo; McIntyre, Neil; Díaz-Granados, Mario; Maksimović, Cedo

    2012-10-01

    Due to increasing customer and political pressures, and more stringent environmental regulations, sediment and other blockage issues are now a high priority when assessing sewer system operational performance. Blockages caused by sediment deposits reduce sewer system reliability and demand remedial action at considerable operational cost. Consequently, procedures are required for identifying which parts of the sewer system are in most need of proactive removal of sediments. This paper presents an exceptionally long (7.5 years) and spatially detailed (9658 grid squares--0.03 km² each--covering a population of nearly 7.5 million) data set obtained from a customer complaints database in Bogotá (Colombia). The sediment-related blockage data are modelled using homogeneous and non-homogeneous Poisson process models. In most of the analysed areas the inter-arrival time between blockages can be represented by the homogeneous process, but there are a considerable number of areas (up to 34%) for which there is strong evidence of non-stationarity. In most of these cases, the mean blockage rate increases over time, signifying a continual deterioration of the system despite repairs, this being particularly marked for pipe and gully pot related blockages. The physical properties of the system (mean pipe slope, diameter and pipe length) have a clear but weak influence on observed blockage rates. The Bogotá case study illustrates the potential value of customer complaints databases and formal analysis frameworks for proactive sewerage maintenance scheduling in large cities.
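
    A minimal sketch (not the authors' code) of the two models mentioned above: the maximum-likelihood rate of a homogeneous Poisson process fitted to the blockage record of one grid square, plus a simple Laplace trend test whose positive values suggest an increasing blockage rate, i.e. the non-stationary, deteriorating case. The event times below are invented.

      import math

      def homogeneous_rate(event_days, observation_days):
          """Maximum-likelihood rate (events per day) for a homogeneous Poisson process."""
          return len(event_days) / observation_days

      def laplace_trend_statistic(event_days, observation_days):
          """Approximately standard normal under the homogeneous model; >0 suggests deterioration."""
          n = len(event_days)
          mean_time = sum(event_days) / n
          return (mean_time - observation_days / 2) / (observation_days * math.sqrt(1.0 / (12 * n)))

      if __name__ == "__main__":
          blockages = [30, 200, 400, 900, 1500, 2100, 2500]   # report days for one grid square
          span = 2738                                          # about 7.5 years of records
          print(homogeneous_rate(blockages, span), laplace_trend_statistic(blockages, span))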

  8. Wetlands legislation and management. (Latest citations from the Selected Water Resources Abstracts database). Published Search

    SciTech Connect

    Not Available

    1994-02-01

    The bibliography contains citations concerning federal and state legislation governing coastal and fresh water wetlands. Studies of regional regulations and management of specific sites are included. Topics such as reconciling environmental considerations with economic pressures and landowners' rights are covered. Wetlands restoration projects, conservation projects, and development plans are also presented. Many citations discuss wetlands management in relation to the Clean Water Act. (Contains 250 citations and includes a subject term index and title list.)

  9. 77 FR 31615 - Improving Mail Management Policies, Procedures, and Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ... ADMINISTRATION Improving Mail Management Policies, Procedures, and Activities AGENCY: Office of Governmentwide... Services Administration (GSA) has issued Federal Management Regulation (FMR) Bulletin G-03 which provides guidance to Executive Branch agencies for improving mail management policies, procedures, and...

  10. Third millenium ideal gas and condensed phase thermochemical database for combustion (with update from active thermochemical tables).

    SciTech Connect

    Burcat, A.; Ruscic, B.; Chemistry; Technion - Israel Inst. of Tech.

    2005-07-29

    The thermochemical database of species involved in combustion processes has been available for free use for over 25 years. It was first published in print in 1984, approximately 8 years after it was first assembled, and contained 215 species at the time. This is the 7th printed edition and most likely will be the last one in print in the present format, which involves substantial manual labor. The database currently contains more than 1300 species, specifically organic molecules and radicals, but also inorganic species connected to combustion and air pollution. Since 1991 the database has been freely available on the internet, at the Technion-IIT ftp server, and it is continuously expanded and corrected. The database is mirrored daily at an official mirror site, and at random at about a dozen unofficial mirror and 'finger' sites. The present edition contains numerous corrections and many recalculations of provisional data by the G3//B3LYP method, a high-accuracy composite ab initio calculation. About 300 species are newly calculated and are not yet published elsewhere. In anticipation of the full coupling, which is under development, the database has started incorporating the available (as yet unpublished) values from Active Thermochemical Tables. The electronic version now also contains an XML file of the main database to allow transfer to other formats and to ease finding specific information of interest. The database is used by scientists, educators, engineers and students at all levels, dealing primarily with combustion and air pollution, jet engines, rocket propulsion, and fireworks, but also by researchers involved in upper atmosphere kinetics, astrophysics, abrasion metallurgy, etc. This introductory article contains explanations of the database and the means to use it, its sources, ways of calculation, and assessments of the accuracy of the data.

  11. Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2014-01-01

    We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demands greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.
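
    A toy sketch of the kind of exercise summarized above: learn a classification rule from volunteer-labelled galaxies and apply it to new photometric measurements. It assumes scikit-learn is available; the features, labels, and values are invented.

      from sklearn.tree import DecisionTreeClassifier

      # Features: [colour index, concentration index]; labels: 0 = spiral, 1 = elliptical.
      X_train = [[0.40, 2.1], [0.50, 2.3], [1.10, 3.0], [1.20, 3.2], [0.45, 2.0], [1.15, 2.9]]
      y_train = [0, 0, 1, 1, 0, 1]

      # Fit a small, interpretable rule and apply it to two unseen galaxies.
      clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)
      print(clf.predict([[0.55, 2.2], [1.05, 3.1]]))   # expected: [0 1]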

  12. Design of a Graphics User Interface for a Database Management System.

    DTIC Science & Technology

    1986-06-01

    LIST OF REFERENCES: (BOG 84) Boguraev, B. and Jones, K., "A... Programming: Concepts of Operating and Database Systems, Addison-Wesley, 1975. Moriconi, M. and Hare, D., "Visualizing Program Designs Through PegaSys"

  13. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness.

    PubMed

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D; Hockings, Marc

    2015-11-05

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible.

  14. Measuring impact of protected area management interventions: current and future use of the Global Database of Protected Area Management Effectiveness

    PubMed Central

    Coad, Lauren; Leverington, Fiona; Knights, Kathryn; Geldmann, Jonas; Eassom, April; Kapos, Valerie; Kingston, Naomi; de Lima, Marcelo; Zamora, Camilo; Cuardros, Ivon; Nolte, Christoph; Burgess, Neil D.; Hockings, Marc

    2015-01-01

    Protected areas (PAs) are at the forefront of conservation efforts, and yet despite considerable progress towards the global target of having 17% of the world's land area within protected areas by 2020, biodiversity continues to decline. The discrepancy between increasing PA coverage and negative biodiversity trends has resulted in renewed efforts to enhance PA effectiveness. The global conservation community has conducted thousands of assessments of protected area management effectiveness (PAME), and interest in the use of these data to help measure the conservation impact of PA management interventions is high. Here, we summarize the status of PAME assessment, review the published evidence for a link between PAME assessment results and the conservation impacts of PAs, and discuss the limitations and future use of PAME data in measuring the impact of PA management interventions on conservation outcomes. We conclude that PAME data, while designed as a tool for local adaptive management, may also help to provide insights into the impact of PA management interventions from the local-to-global scale. However, the subjective and ordinal characteristics of the data present significant limitations for their application in rigorous scientific impact evaluations, a problem that should be recognized and mitigated where possible. PMID:26460133

  15. Analysis, Repair, and Management of the Total Ozone Mapping Spectrometer Database

    NASA Technical Reports Server (NTRS)

    Sirovich, Lawrence

    1997-01-01

    In the ensuing period we were able to demonstrate that these filamentous patterns originate from the action of the synoptic-scale vortical velocity field on the global-scale background gradient of ozone concentration in the meridional direction. Hyperbolic flow patterns between long-lived atmospheric vortices bring together air parcels from different latitudes, thus creating large gradients along the separatrices leaving the hyperbolic (stagnation) point. This result is further confirmed by the KL analysis of the ozone field in the equatorial region, where the background concentration gradient vanishes. The spectral slope in this region has been found to lie close to -1, in agreement with Batchelor's prediction. Another outcome of this result is that it at least provides indirect evidence about the kinetic energy spectrum of the atmospheric turbulence in the range of scales approximately 200 to 2000 km. Namely, Batchelor's analysis is based on the assumption that the velocity field is large-scale, that is, the kinetic energy spectrum decays as O(k^-3) or steeper. Since the scalar spectrum is confirmed, this also supports this form of the kinetic energy spectrum. The study of equatorial regions of TOMS data revealed the efficiency of the KL method in detecting and separating a wave-like measurement artifact inherently present in the dataset due to the non-perfect correction for cross-track bias. Just two to three eigenfunctions represent the error, which makes it possible to enhance the data by reconstructing it with the subspace of artifactual eigenfunctions eliminated. This represents a highly efficient means for achieving an improved rendering of the data. This has been implemented on the database. A wide range of techniques and algorithms have been developed for the repair and extension of the TOMS database.
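
    A compact sketch of the artifact-removal step described above: take the singular value decomposition of the data matrix (the discrete analogue of the KL expansion), drop the two or three modes identified with the cross-track-bias artifact, and reconstruct the field from the remaining modes. The data here are synthetic and the mode indices are placeholders.

      import numpy as np

      def remove_modes(data, bad_modes):
          """data: 2-D array (e.g. time x longitude); bad_modes: indices of artifact eigenfunctions."""
          u, s, vt = np.linalg.svd(data, full_matrices=False)
          keep = [i for i in range(len(s)) if i not in set(bad_modes)]
          return (u[:, keep] * s[keep]) @ vt[keep, :]   # reconstruction without the artifact modes

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          field = rng.normal(size=(50, 80))             # stand-in for a gridded ozone field
          cleaned = remove_modes(field, bad_modes=[1, 2])
          print(field.shape, cleaned.shape)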

  16. Management of patients with active caries.

    PubMed

    Milgrom, Peter

    2014-07-01

    This paper reports on a mechanism to manage caries as a disease and to medically intervene in the disease process to halt progression. The goal of this paper is to provide this alternative to a surgical-only approach. The management of caries begins with assessing lesion activity and the potential for arrest. This requires a clinical and radiological assessment and evaluation of risk. Hopeless teeth are extracted and large cavities filled to reduce infection. Risk reduction strategies are employed so efforts to arrest lesions can be successful. Teeth with lesions in the enamel or outer third of the dentin should be sealed, not restored, as restorations can weaken teeth and can be traumatic to pulps.

  17. The Neotoma Paleoecology Database

    NASA Astrophysics Data System (ADS)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community
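
    A generic sketch of the programmatic access pattern described above: call a JSON-returning web service and decode the payload. The URL and parameter names are placeholders, not the documented Neotoma endpoints; the project site should be consulted for the real API.

      import json
      from urllib.parse import urlencode
      from urllib.request import urlopen

      def fetch_json(base_url, **params):
          """Call a JSON-returning web service and return the decoded payload."""
          query = urlencode(params)
          with urlopen(base_url + "?" + query, timeout=30) as response:
              return json.load(response)

      # Hypothetical call (endpoint and parameter names are placeholders):
      # datasets = fetch_json("https://example.org/api/datasets", taxon="Picea", ageold=21000)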

  18. Supporting IT Service Management Processes at the Technische Universität München with a Configuration Management Database

    NASA Astrophysics Data System (ADS)

    Knittl, Silvia

    University processes in teaching and administration, with their increasing integration and IT support, require a so-called business alignment of IT and, with it, a more professional IT service management (ITSM). The IT Infrastructure Library (ITIL), with its description of processes proven in practice, has established itself as the de facto standard in ITSM. One such process is configuration management. It represents the IT infrastructure as configuration items and their relationships in a tool called a Configuration Management Database (CMDB) and thereby supports ITSM. This report describes the experience gained from the prototype introduction of a CMDB at the Technische Universität München.
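
    A minimal sketch of the configuration management idea described above: configuration items (CIs) and typed relationships between them held in a small in-memory structure, with a trivial impact query. The item names and relationship types are illustrative.

      from collections import defaultdict

      class Cmdb:
          def __init__(self):
              self.items = {}                      # ci_name -> attributes
              self.relations = defaultdict(list)   # ci_name -> [(relation, other_ci)]

          def add_ci(self, name, **attrs):
              self.items[name] = attrs

          def relate(self, source, relation, target):
              self.relations[source].append((relation, target))

          def dependents(self, name):
              """CIs that directly depend on the given CI (simple impact analysis)."""
              return [src for src, rels in self.relations.items()
                      for rel, tgt in rels if tgt == name and rel == "depends_on"]

      if __name__ == "__main__":
          cmdb = Cmdb()
          cmdb.add_ci("mail-service", type="service")
          cmdb.add_ci("srv01", type="server")
          cmdb.relate("mail-service", "depends_on", "srv01")
          print(cmdb.dependents("srv01"))   # ['mail-service']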

  19. Reliability of the TTC approach: learning from inclusion of pesticide active substances in the supporting database.

    PubMed

    Feigenbaum, Alexandre; Pinalli, Roberta; Giannetto, Marco; Barlow, Susan

    2015-01-01

    Data on pesticide active substances were used to assess the reliability of the Threshold of Toxicological Concern (TTC) approach. Pesticides were chosen as a robust test because of their potential for toxicity. A total of 328 pesticide substances were classified on the basis of their chemical structure, according to the generic scheme proposed by the European Food Safety Authority. Forty-three carbamates and organophosphates were allocated to the group for neurotoxicity alerts, and 279 substances to Cramer structural Class III. For Class III, the 5th percentile value calculated from the cumulative distribution curve of the no-observed-effect levels (0.20 mg/kg bw per day) was slightly higher than that determined by Munro (0.15 mg/kg bw per day) from his original database. The difference is explained by the inclusion of carbamates and organophosphates in Munro's Class III. Consideration of the acceptable daily intakes and their underlying toxicity data showed that the TTC approach is conservative for 96.2% of the substances. Overall, this analysis gives added support to the utility of the generic scheme of application of the TTC approach for hazard assessment of substances for which few or no experimental toxicity data are available. A convenient alternative to the Cramer decision tree is proposed.
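
    A small sketch of the calculation behind the percentile figures quoted above: take the 5th percentile of a set of no-observed-effect levels for a structural class and, as in the usual TTC derivation, divide by a 100-fold uncertainty factor to obtain a human exposure threshold. The NOEL values below are invented for illustration.

      import statistics

      # Hypothetical NOELs for one structural class, in mg/kg bw per day.
      noels_mg_per_kg_bw = [0.3, 0.8, 1.5, 2.0, 5.0, 7.5, 12.0, 30.0, 75.0, 150.0]

      # 5th percentile of the cumulative NOEL distribution (inclusive interpolation).
      fifth_percentile = statistics.quantiles(noels_mg_per_kg_bw, n=20, method="inclusive")[0]

      # Apply a 100-fold uncertainty factor to derive a threshold of toxicological concern.
      ttc_mg_per_kg_bw = fifth_percentile / 100

      print(round(fifth_percentile, 3), round(ttc_mg_per_kg_bw, 5))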

  20. Inland wetlands legislation and management. (Latest citations from the NTIS bibliographic database). Published Search

    SciTech Connect

    1996-03-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  1. Inland wetlands legislation and management. (Latest citations from the NTIS Bibliographic database). Published Search

    SciTech Connect

    Not Available

    1993-11-01

    The bibliography contains citations concerning Federal and state laws and management programs for the protection and use of inland wetlands. The use of wetlands to control highway runoff and community wastewater is discussed. Wetlands protection programs, restoration projects, resource planning, and wetlands identification methods are cited. (Contains 250 citations and includes a subject term index and title list.)

  2. Railroad management planning. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1993-01-01

    The bibliography contains citations concerning railroad management techniques and their impact on operations. Topics include freight statistics, impacts on communities, and yard operations. Forecasts of future trends and government policies regarding railroad operations are also discussed. (Contains a minimum of 76 citations and includes a subject term index and title list.)

  3. Content-Based Management of Image Databases in the Internet Age

    ERIC Educational Resources Information Center

    Kleban, James Theodore

    2010-01-01

    The Internet Age has seen the emergence of richly annotated image data collections numbering in the billions of items. This work makes contributions in three primary areas which aid the management of this data: image representation, efficient retrieval, and annotation based on content and metadata. The contributions are as follows. First,…

  4. Database Management Systems: A Case Study of Faculty of Open Education

    ERIC Educational Resources Information Center

    Kamisli, Zehra

    2004-01-01

    We live in the information and the microelectronic age, where technological advancements become a major determinant of our lifestyle. Such advances in technology cannot possibly be made or sustained without concurrent advancement in management systems (5). The impact of computer technology on organizations and society is increasing as new…

  5. A Relational/Object-Oriented Database Management System: R/OODBMS

    DTIC Science & Technology

    1992-09-01

    Corporation produces GemStone [BMO89][HW91], a disk-based storage manager designed for commercial and engineering markets [BOS91].

  6. The Clinical Next‐Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification

    PubMed Central

    Nishio, Shin‐ya

    2017-01-01

    ABSTRACT Recent advances in next‐generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease‐specific databases. Here, we report a new database development tool, named the “Clinical NGS Database,” for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two‐feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity‐based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. PMID:28008688

  7. The Clinical Next-Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification.

    PubMed

    Nishio, Shin-Ya; Usami, Shin-Ichi

    2017-03-01

    Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity-based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database.
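    The statistical approach described above reduces, at its core, to an odds ratio computed from a 2x2 table of variant carriers versus non-carriers in cases and controls. The sketch below shows that calculation; the counts are invented, and the continuity correction is a common convention rather than necessarily the one used by the Clinical NGS Database.

        # Sketch: odds ratio for a variant from a 2x2 case/control table.
        # Counts are invented; a Haldane-Anscombe 0.5 correction guards against
        # division by zero when any cell is empty.
        def odds_ratio(case_carriers, case_noncarriers,
                       control_carriers, control_noncarriers):
            a, b, c, d = (case_carriers, case_noncarriers,
                          control_carriers, control_noncarriers)
            if 0 in (a, b, c, d):
                a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
            return (a * d) / (b * c)

        # e.g. 12 of 40 autosomal-recessive families carry the variant,
        # versus 3 of 500 controls (all numbers hypothetical)
        print(round(odds_ratio(12, 28, 3, 497), 1))   # -> 71.0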

  8. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  9. The Recovery of a Clinical Database Management System after Destruction by Fire *

    PubMed Central

    Covvey, H.D.; McAlister, N.H.; Greene, J.; Wigle, E.D.

    1981-01-01

    In August 1980 a fire in the Cardiovascular Unit at Toronto General Hospital severely damaged the physical plant and rendered all on-site equipment unrecoverable. Among the hardware items in the fire was the computer which supports our cardiovascular database system. Within hours after the fire it was determined that the computer was no longer serviceable. Beyond off-site back-up tapes, there was the possibility that recent records on the computer had suffered a similar fate. Immediate procedures were instituted to obtain a replacement computer system and to clean media to permit data recovery. Within 2 months a partial system was supporting all users, and all data was recovered and being used. The destructive potential of a fire is rarely seriously considered relative to computer equipment in our clinical environments. Full-replacement value insurance; an excellent equipment supplier with the capacity to respond to an emergency; backup and recovery procedures with off-site storage; and dedicated staff are key hedges against disaster.

  10. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    SciTech Connect

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to distribution of that total by agency and maybe distribution of the total by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock has changed for the Department of Energy from 2000 through 2008.

  11. Avibase – a database system for managing and organizing taxonomic concepts

    PubMed Central

    Lepage, Denis; Vaidya, Gaurav; Guralnick, Robert

    2014-01-01

    Abstract Scientific names of biological entities offer an imperfect resolution of the concepts that they are intended to represent. Often they are labels applied to entities ranging from entire populations to individual specimens representing those populations, even though such names only unambiguously identify the type specimen to which they were originally attached. Thus the real-life referents of names are constantly changing as biological circumscriptions are redefined and thereby alter the sets of individuals bearing those names. This problem is compounded by other characteristics of names that make them ambiguous identifiers of biological concepts, including emendations, homonymy and synonymy. Taxonomic concepts have been proposed as a way to address issues related to scientific names, but they have yet to receive broad recognition or implementation. Some efforts have been made towards building systems that address these issues by cataloguing and organizing taxonomic concepts, but most are still in conceptual or proof-of-concept stage. We present the on-line database Avibase as one possible approach to organizing taxonomic concepts. Avibase has been successfully used to describe and organize 844,000 species-level and 705,000 subspecies-level taxonomic concepts across every major bird taxonomic checklist of the last 125 years. The use of taxonomic concepts in place of scientific names, coupled with efficient resolution services, is a major step toward addressing some of the main deficiencies in the current practices of scientific name dissemination and use. PMID:25061375

  12. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    DTIC Science & Technology

    2002-01-01

    Management System. df Document Frequency. GIF Graphics Interchange Format Image File Format. HSV Hue , Saturation, Value Color Space. idf Inverse Document...is used to verify the safety criteria. A tuple is only returned if it successfully passes the safety test t.score > c, thus every returned tuple was...exploring the streams A and B in rank order. We test the maximum possible similarity score c for the current OABB against the set Saux of candidate tuples, if

  13. Consolidated Human Activity Database (CHAD) for use in human exposure and health studies and predictive models

    EPA Pesticide Factsheets

    EPA scientists have compiled detailed data on human behavior from 22 separate exposure and time-use studies into CHAD. The database includes more than 54,000 individual study days of detailed human behavior.

  14. Management of Reclaimed Produced Water in California Enhanced with the Expanded U.S. Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Kharaka, Y. K.; Reidy, M. E.; Conaway, C. H.; Thordsen, J. J.; Rowan, E. L.; Engle, M.

    2015-12-01

    In California, in 2014, every barrel of oil produced also produced 16 barrels of water. Approximately 3.2 billion barrels of water were co-produced with California oil in 2014. Half of California's produced water is generally used for steam and water injection for enhanced oil recovery. The other half (~215,000 acre-feet of water) is available for potential reuse. Concerns about the severe drought, groundwater depletion, and contamination have prompted petroleum operators and water districts to examine the recycling of produced water. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water reuse. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). Since a great proportion of California petroleum wells have produced water with relatively low salinity (generally 10,000-40,000 mg/L TDS), reclaiming produced water could be important as a drought mitigation strategy, especially in the parched southern San Joaquin Valley with many oil fields. The USGS Produced Waters Geochemical Database, available at http://eerscmap.usgs.gov/pwapp, will facilitate studies on the management of produced water for reclamation in California. Expanding on the USGS 2002 database, we have more accurately located California wells. We have added new data for 300 wells in the Sacramento Valley, San Joaquin Valley and the Los Angeles Basin for a total of ~ 1100 wells in California. In addition to the existing (2002) geochemical analyses of major ions and total dissolved solids, the new data also include geochemical analyses of minor ions and stable isotopes. We have added an interactive web map application which allows the user to filter data on chosen fields (e.g. TDS < 35,000 mg/L). Using the web map application as well as more in-depth investigation on the full data set can provide critical insight for better management of produced waters in water
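    As a small illustration of the kind of filtering the web map application supports (e.g. TDS < 35,000 mg/L), the sketch below applies the same cut to a tabular extract with pandas. The file name and column names are assumptions about how an exported copy of the data might be laid out, not the published schema.

        # Sketch: filter a produced-waters table for relatively low-salinity samples.
        # File and column names are assumptions for illustration.
        import pandas as pd

        df = pd.read_csv("produced_waters_california.csv")   # hypothetical export
        low_salinity = df[df["TDS_mg_L"] < 35_000]            # assumed column name
        print(len(low_salinity), "wells below 35,000 mg/L TDS")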

  15. Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database

    SciTech Connect

    Wang, Jy-An John

    2010-08-01

    Materials behavior caused by neutron irradiation in fission and/or fusion environments cannot be well understood without practical examination. An easily accessible materials information system with a large materials database, supported by capable computers, is necessary for the design of nuclear materials and for analyses or simulations of these phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data. The EDB contains power reactor pressure vessel surveillance data, materials test reactor data, foreign reactor data (obtained through bilateral agreements authorized by the NRC), and fracture toughness data. This report describes the lessons learned from building the EDB program and the associated database management activity with respect to materials database design methodology, architecture, and the embedded QA protocol. The report also covers the development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM) and a comparison of the EDB and IAEA IDRPVM databases, and it states the recommended database QA protocol and database infrastructure.

  16. The GTN-P Data Management System: A central database for permafrost monitoring parameters of the Global Terrestrial Network for Permafrost (GTN-P) and beyond

    NASA Astrophysics Data System (ADS)

    Lanckman, Jean-Pierre; Elger, Kirsten; Karlsson, Ævar Karl; Johannsson, Halldór; Lantuit, Hugues

    2013-04-01

    Permafrost is a direct indicator of climate change and has been identified as Essential Climate Variable (ECV) by the global observing community. The monitoring of permafrost temperatures, active-layer thicknesses and other parameters has been performed for several decades already, but it was brought together within the Global Terrestrial Network for Permafrost (GTN-P) in the 1990's only, including the development of measurement protocols to provide standardized data. GTN-P is the primary international observing network for permafrost sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS), and managed by the International Permafrost Association (IPA). All GTN-P data was outfitted with an "open data policy" with free data access via the World Wide Web. The existing data, however, is far from being homogeneous: it is not yet optimized for databases, there is no framework for data reporting or archival and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data has not been used by as many researchers as intended by the initiators of the programs. While the monitoring of many other ECVs has been tackled by organized international networks (e.g. FLUXNET), there is still no central database for all permafrost-related parameters. The European Union project PAGE21 created opportunities to develop this central database for permafrost monitoring parameters of GTN-P during the duration of the project and beyond. The database aims to be the one location where the researcher can find data, metadata, and information of all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats were developed in cooperation with the GTN-P and the IPA. The general framework of the GTN-P DMS is based on an object oriented model (OOM), open for as many parameters as possible, and
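    To illustrate the kind of object-oriented model the DMS is described as being built on, the sketch below defines a minimal site/parameter/measurement hierarchy in Python. The class and field names are invented for illustration and are not the actual GTN-P schema.

        # Minimal sketch of an object-oriented model for permafrost monitoring data.
        # Class and field names are illustrative, not the actual GTN-P schema.
        from dataclasses import dataclass, field
        from datetime import date
        from typing import List

        @dataclass
        class Measurement:
            when: date
            value: float
            unit: str                  # e.g. "degC" or "cm"

        @dataclass
        class Parameter:
            name: str                  # e.g. "ground temperature", "active-layer thickness"
            depth_m: float
            series: List[Measurement] = field(default_factory=list)

        @dataclass
        class Site:
            site_id: str
            latitude: float
            longitude: float
            parameters: List[Parameter] = field(default_factory=list)

        site = Site("EX-001", 78.9, 11.9)
        alt = Parameter("active-layer thickness", 0.0)
        alt.series.append(Measurement(date(2012, 8, 15), 92.0, "cm"))
        site.parameters.append(alt)
        print(site.site_id, alt.name, alt.series[0].value, alt.series[0].unit)

    Keeping every parameter as a typed object under its site is what lets one location serve as the single place where a researcher finds data, metadata, and related information together.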

  17. DBAASP v.2: an enhanced database of structure and antimicrobial/cytotoxic activity of natural and synthetic peptides.

    PubMed

    Pirtskhalava, Malak; Gabrielian, Andrei; Cruz, Phillip; Griggs, Hannah L; Squires, R Burke; Hurt, Darrell E; Grigolava, Maia; Chubinidze, Mindia; Gogoladze, George; Vishnepolsky, Boris; Alekseev, Vsevolod; Rosenthal, Alex; Tartakovsky, Michael

    2016-01-04

    Antimicrobial peptides (AMPs) are anti-infectives that may represent a novel and untapped class of biotherapeutics. Increasing interest in AMPs means that new peptides (natural and synthetic) are discovered faster than ever before. We describe herein a new version of the Database of Antimicrobial Activity and Structure of Peptides (DBAASPv.2, which is freely accessible at http://dbaasp.org). This iteration of the database reports chemical structures and empirically-determined activities (MICs, IC50, etc.) against more than 4200 specific target microbes for more than 2000 ribosomal, 80 non-ribosomal and 5700 synthetic peptides. Of these, the vast majority are monomeric, but nearly 200 of these peptides are found as homo- or heterodimers. More than 6100 of the peptides are linear, but about 515 are cyclic and more than 1300 have other intra-chain covalent bonds. More than half of the entries in the database were added after the resource was initially described, which reflects the recent sharp uptick of interest in AMPs. New features of DBAASPv.2 include: (i) user-friendly utilities and reporting functions, (ii) a 'Ranking Search' function to query the database by target species and return a ranked list of peptides with activity against that target and (iii) structural descriptions of the peptides derived from empirical data or calculated by molecular dynamics (MD) simulations. The three-dimensional structural data are critical components for understanding structure-activity relationships and for design of new antimicrobial drugs. We created more than 300 high-throughput MD simulations specifically for inclusion in DBAASP. The resulting structures are described in the database by novel trajectory analysis plots and movies. Another 200+ DBAASP entries have links to the Protein DataBank. All of the structures are easily visualized directly in the web browser.

  18. DBAASP v.2: an enhanced database of structure and antimicrobial/cytotoxic activity of natural and synthetic peptides

    PubMed Central

    Pirtskhalava, Malak; Gabrielian, Andrei; Cruz, Phillip; Griggs, Hannah L.; Squires, R. Burke; Hurt, Darrell E.; Grigolava, Maia; Chubinidze, Mindia; Gogoladze, George; Vishnepolsky, Boris; Alekseev, Vsevolod; Rosenthal, Alex; Tartakovsky, Michael

    2016-01-01

    Antimicrobial peptides (AMPs) are anti-infectives that may represent a novel and untapped class of biotherapeutics. Increasing interest in AMPs means that new peptides (natural and synthetic) are discovered faster than ever before. We describe herein a new version of the Database of Antimicrobial Activity and Structure of Peptides (DBAASPv.2, which is freely accessible at http://dbaasp.org). This iteration of the database reports chemical structures and empirically-determined activities (MICs, IC50, etc.) against more than 4200 specific target microbes for more than 2000 ribosomal, 80 non-ribosomal and 5700 synthetic peptides. Of these, the vast majority are monomeric, but nearly 200 of these peptides are found as homo- or heterodimers. More than 6100 of the peptides are linear, but about 515 are cyclic and more than 1300 have other intra-chain covalent bonds. More than half of the entries in the database were added after the resource was initially described, which reflects the recent sharp uptick of interest in AMPs. New features of DBAASPv.2 include: (i) user-friendly utilities and reporting functions, (ii) a ‘Ranking Search’ function to query the database by target species and return a ranked list of peptides with activity against that target and (iii) structural descriptions of the peptides derived from empirical data or calculated by molecular dynamics (MD) simulations. The three-dimensional structural data are critical components for understanding structure–activity relationships and for design of new antimicrobial drugs. We created more than 300 high-throughput MD simulations specifically for inclusion in DBAASP. The resulting structures are described in the database by novel trajectory analysis plots and movies. Another 200+ DBAASP entries have links to the Protein DataBank. All of the structures are easily visualized directly in the web browser. PMID:26578581
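    The 'Ranking Search' idea can be illustrated offline: given activity records of (peptide, target organism, MIC), filter by target and sort by potency. The records below are invented examples, and this is only a conceptual sketch of the query, not DBAASP's implementation.

        # Conceptual sketch of a "ranking search": return peptides with activity
        # against a chosen target, ordered by potency (lowest MIC first).
        # The records are invented examples, not DBAASP entries.
        records = [
            {"peptide": "AMP-01", "target": "Escherichia coli",      "mic_ug_ml": 8.0},
            {"peptide": "AMP-02", "target": "Staphylococcus aureus", "mic_ug_ml": 2.0},
            {"peptide": "AMP-03", "target": "Escherichia coli",      "mic_ug_ml": 1.5},
            {"peptide": "AMP-04", "target": "Escherichia coli",      "mic_ug_ml": 32.0},
        ]

        def ranking_search(target):
            hits = [r for r in records if r["target"] == target]
            return sorted(hits, key=lambda r: r["mic_ug_ml"])

        for hit in ranking_search("Escherichia coli"):
            print(hit["peptide"], hit["mic_ug_ml"])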

  19. Principles and techniques in the design of ADMS+. [advanced data-base management system

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Kang, Hyunchul

    1986-01-01

    'ADMS+/-' is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of work station data base systems, designated ADMS-; no communications exist between these work stations. The use of this system radically decreases the response time of locally processed queries, since the work station runs in a single-user mode, and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy used reduces overhead due to update synchronization in message traffic.

  20. Assessing urology and nephrology research activity in Arab countries using ISI web of science bibliometric database

    PubMed Central

    2014-01-01

    Background Bibliometric analysis is increasingly being used for research assessment. The main objective of this study was to assess research output in the "Urology and Nephrology" subject category from the Arab countries. Original scientific articles or reviews published from the 21 Arab countries in the "Urology and Nephrology" subject category were screened using the ISI Web of Science database. Research productivity was evaluated based on a methodology developed and used in other bibliometric studies by analyzing the annual productivity, names of journals, citations, the top 10 active institutions and authors, and country contributions to urology and nephrology research. Results Three thousand and seventy-six documents in the "urology and nephrology" subject category were retrieved from 104 journals. This represents 1.4% of the global research output in "urology and nephrology". Four hundred and two documents (12.66%) were published in the journal Annales d'Urologie. The h-index of the retrieved documents was 57. The total number of citations, at the time of data analysis, was 30401, with an average of 9.57 citations per document. Egypt, with a total of 1284 publications (40.43%), ranked first among the Arab countries in the "urology and nephrology" subject category. Mansoura University in Egypt was the most productive institution, with a total of 561 (15.33%) documents. Arab researchers collaborated most with researchers from the United States of America (226; 7.12%) in urology and nephrology research. Conclusion The present data reveal a good contribution of some Arab countries to the field of "urology and nephrology". More efforts are needed by some other Arab countries to bridge the gap in urology and nephrology research. Overall, the quality of urology/nephrology research is considered relatively high as measured by h-index. Cooperation in urology/nephrology research should be encouraged in the Arab world to bridge the gap with developed countries. PMID:24758477
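    Since the abstract reports an h-index of 57 for the retrieved set, the sketch below shows the standard way such an index is computed from a list of per-document citation counts; the counts here are invented.

        # Sketch: compute an h-index from per-document citation counts.
        # A set has h-index h if h documents have at least h citations each.
        def h_index(citations):
            counts = sorted(citations, reverse=True)
            h = 0
            for rank, cites in enumerate(counts, start=1):
                if cites >= rank:
                    h = rank
                else:
                    break
            return h

        print(h_index([10, 8, 5, 4, 3]))   # -> 4 (invented example)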

  1. Boston Harbor Watershed: Using Database and GIS Tools to Communicate and Manage Water Quality Issues.

    NASA Astrophysics Data System (ADS)

    Smith, J. P.

    2001-05-01

    In the Boston Harbor Watershed, which contains 45 municipalities, 293 square miles, and a population of over 1,000,000, the sheer number and complexity of water quality issues make each one hard to identify, difficult to address, and economically unfeasible to act on individually. A large volume of water quality data exists in many forms, most of which are neither easily accessible, in a standard format, nor readily available in a central location. Therefore, the major challenges facing watershed water quality efforts are the collection of useable and relevant data, the analysis of this data to identify trends and problems, making the data accessible to all parties involved, and the effective prioritization and communication of water quality concerns to develop an overall watershed-based water quality strategy. A Geographic Information System (GIS) provides an invaluable tool for compiling water quality data in a central location, while visually communicating issues in a spatial format in order to prioritize issues with limited resources. This report describes the initiation of a multi-year project by the University of Massachusetts, Boston - Environmental, Coastal, and Ocean Sciences (ECOS) Department, in conjunction with the Urban Harbors Institute (UHI), to coordinate with key stakeholders in the watershed area to discuss known and perceived water quality problems and then develop an action plan to address these concerns. The action plan will include: (1) collecting and prioritizing existing water quality data and developing protocols and standards for future data integration; (2) compiling all data into a central database for common accessibility; (3) analyzing existing data to find long and short-term trends, establishing water quality objectives, and prioritizing causes of water quality impairment; and (4) developing a GIS application to visually communicate spatial data relating to water quality. The project will rely on continual feedback and communication with

  2. ENABLE (Exportable Notation and Bookmark List Engine): an Interface to Manage Tumor Measurement Data from PACS to Cancer Databases.

    PubMed

    Goyal, Nikhil; Apolo, Andrea B; Berman, Eliana D; Bagheri, Mohammad Hadi; Levine, Jason E; Glod, John W; Kaplan, Rosandra N; Machado, Laura B; Folio, Les R

    2017-01-10

    Oncologists evaluate therapeutic response in cancer trials based on tumor quantification following selected "target" lesions over time. At our cancer center, a majority of oncologists use Response Evaluation Criteria in Solid Tumors (RECIST) v1.1 quantifying tumor progression based on lesion measurements on imaging. Currently, our oncologists handwrite tumor measurements, followed by multiple manual data transfers; however, our Picture Archiving Communication System (PACS) (Carestream Health, Rochester, NY) has the ability to export tumor measurements, making it possible to manage tumor metadata digitally. We developed an interface, "Exportable Notation and Bookmark List Engine" (ENABLE), which produces prepopulated RECIST v1.1 worksheets and compiles cohort data and data models from PACS measurement data, thus eliminating handwriting and manual data transcription. We compared RECIST v1.1 data from eight patients (16 computed tomography exams) enrolled in an IRB-approved therapeutic trial with ENABLE outputs: 10 data fields with a total of 194 data points. All data in ENABLE's output matched with the existing data. Seven staff were taught how to use the interface with a 5-min explanatory instructional video. All were able to use ENABLE successfully without additional guidance. We additionally assessed 42 metastatic genitourinary cancer patients with available RECIST data within PACS to produce a best response waterfall plot. ENABLE manages tumor measurements and associated metadata exported from PACS, producing forms and data models compatible with cancer databases, obviating handwriting and the manual re-entry of data. Automation should reduce transcription errors and improve efficiency and the auditing process.

  3. Development of a database for prompt gamma-ray neutron activation analysis: Summary report of the third research coordination meeting

    SciTech Connect

    Lindstrom, Richard M.; Firestone, Richard B.; Pavi, ???

    2003-04-01

    The main discussions and conclusions from the Third Co-ordination Meeting on the Development of a Database for Prompt Gamma-ray Neutron Activation Analysis are summarized in this report. All results were reviewed in detail, and the final version of the TECDOC and the corresponding software were agreed upon and approved for preparation. Actions were formulated with the aim of completing the final version of the TECDOC and associated software by May 2003.

  4. Virtual screening for environmental pollutants: structure-activity relationships applied to a database of industrial chemicals.

    PubMed

    Oberg, Tomas

    2006-04-01

    The current risk paradigm calls for individual consideration and evaluation of each separate environmental pollutant, but this does not reflect accurately the cumulative impact of anthropogenic chemicals. In the present study, previously validated structure-activity relationships were used to estimate simultaneously the baseline toxicity and atmospheric persistence of approximately 50,000 compounds. The results from this virtual screening indicate fairly stable statistical distributions among small anthropogenic compounds. The baseline toxicity was not changed much by halogen substitution, but a distinct increase seemed to occur in the environmental persistence with increased halogenation. The ratio of the atmospheric half-lives to the median lethal concentrations provides a continuous scale with which to rank and summarize the incremental environmental impacts in a mixture-exposure situation. Halogenated compounds as a group obtained a high ranking in this data set, with well-known pollutants at the very top: DDT metabolites and derivatives, polychlorinated biphenyls, diphenyl ethers and dibenzofurans, chlorinated paraffins, chlorinated benzenes and derivatives, hydrochlorofluorocarbons, and dichlorononylphenol. Environmentally friendly chemicals that obtained the lowest rank are nearly all hydroxylated and water-soluble. Virtual screening can assist with "green chemistry" in designing safe and degradable products and enable assessment of the efficiency in chemicals risk management.
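    The ranking scale described above is essentially a ratio of an estimated atmospheric half-life to a median lethal concentration for each compound. A sketch of that ranking is shown below; the compound names and values are invented placeholders, not results from the study.

        # Sketch: rank compounds by atmospheric half-life / LC50, so that persistent
        # and toxic compounds score highest. Values are invented placeholders.
        compounds = {
            "compound A": {"half_life_h": 2400.0, "lc50_mg_l": 0.5},
            "compound B": {"half_life_h": 12.0,   "lc50_mg_l": 5.0},
            "compound C": {"half_life_h": 600.0,  "lc50_mg_l": 50.0},
        }

        scores = {name: p["half_life_h"] / p["lc50_mg_l"]
                  for name, p in compounds.items()}
        for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: {score:.1f}")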

  5. [The new features of the bibliography database manager EndNote 6.0 and 7.0].

    PubMed

    Reiss, M; Reiss, G

    2005-03-09

    The bibliography database manager EndNote has undergone rapid technical development since 1999, with a new, improved version published every year. Because version 7.0 was released recently (June 2003), we describe some aspects of the new version and examine which users would benefit from each version. The use of the software packages EndNote 6.0 and 7.0 for Windows is described. The main reasons for adopting EndNote 6.0 are its clearly improved functions and features: organizing a variety of charts, equations, or pictures, and the use of Microsoft templates. Version 7.0 can be recommended especially to scientists who work extensively with a Palm device; it also makes it possible to work not only with Microsoft Word but with other word processors, and to create a bibliography with topic headings. Altogether, EndNote 6.0 and 7.0 provide an excellent combination of features and ease of use. Versions 6.0 and 7.0 are most useful for people who use EndNote every day, and EndNote 7.0 is particularly recommended for Palm users.

  6. Radioactive Waste Management: Underground Repository Method

    SciTech Connect

    Rudrapati Sandesh Kumar; Payal Shirvastava

    2002-07-01

    Finding a solution for nuclear waste is a key issue, not only for the protection of the environment but also for the future of the nuclear industry. Ten years from now, when the first decisions on replacing existing nuclear power plants will have to be made, the general public will need to know the solution for nuclear waste before accepting new nuclear plants. In other words, an acceptable solution for the management of nuclear waste is a prerequisite for a renewal of nuclear power. Most existing wastes are being stored in safe conditions awaiting a permanent solution, with some exceptions in the former Eastern Bloc. Temporary surface or shallow storage is a well-known technique widely used all over the world. The author of this paper has made a significant research effort in the direction of the underground repository, which appears to be a good solution. Trying to transform dangerous long-lived radionuclides into less harmful short-lived or stable elements is a logical idea. It is indeed possible to incinerate or transmute heavy atoms of long-lived elements in fast breeder reactors or even in pressurised or boiling water reactors. There are also new types of reactors which could be used, namely accelerator-driven systems. High-level and long-lived wastes (spent fuel and vitrified waste) contain a mixture of high-activity (heat-producing) short-lived nuclides and low-activity long-lived alpha-emitting nuclides. To avoid any temperature-induced alteration of the engineered or geological barrier surrounding the waste underground, it is necessary to store the packages on the surface for several decades (50 years or more) to allow a sufficient temperature decrease before disposing of them underground. In all cases, surface (or shallow) storage is needed as a temporary solution. This paper gives a detailed and comprehensive view of the Deep Geological Repository, providing a pragmatic picture of the means to make this method, a

  7. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subject to recurring floods with numerous consequences. From the perspective of preventing and managing this risk, the local authorities, in partnership with multidisciplinary researchers, have since 1988 developed a database built by field teams that specifically records all floods (places, dates, impacts, damage, etc.). First, this historical database is compared with two other databases, those of the emergency services and the local newspaper, by georeferencing the events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two is not negligible and is a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared with the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Next, the number of floods in the study area is compared with rainfall characteristics (intensity, duration, and depth of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses and their permeability and the types of local sewer network and urban water management. Finally, floods observed in the database are compared spatially, using a GIS, with flooding simulated from sewer network modeling (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flood risk.

  8. New perspectives in toxicological information management, and the role of ISSTOX databases in assessing chemical mutagenicity and carcinogenicity.

    PubMed

    Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre

    2013-07-01

    Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for the toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature, and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is that of combining information technology with the chemical structure as identifier of the molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that other more classical databases cannot allow. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). Then it describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanitá. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.

  9. Idaho Senior Center Activities, Activity Participation Level, and Managers' Perceptions of Activity Success.

    ERIC Educational Resources Information Center

    Girvan, James T.; Harris, Frances

    A survey completed by managers of 77 senior centers in Idaho revealed that meals, blood pressure screening, and games and trips were the most successful activities offered. Alzheimer's support groups, library books for loan, and exercise classes were the least successful. Possible reasons for the success or failure of these activities were…

  10. Transfer of Physical and Hydraulic Properties Databases to the Hanford Environmental Information System - PNNL Remediation Decision Support Project, Task 1, Activity 6

    SciTech Connect

    Rockhold, Mark L.; Middleton, Lisa A.

    2009-03-31

    This report documents the requirements for transferring physical and hydraulic property data compiled by PNNL into the Hanford Environmental Information System (HEIS). The Remediation Decision Support (RDS) Project is managed by Pacific Northwest National Laboratory (PNNL) to support Hanford Site waste management and remedial action decisions by the U.S. Department of Energy and one of their current site contractors - CH2M-Hill Plateau Remediation Company (CHPRC). The objective of Task 1, Activity 6 of the RDS project is to compile all available physical and hydraulic property data for sediments from the Hanford Site, to port these data into the Hanford Environmental Information System (HEIS), and to make the data web-accessible to anyone on the Hanford Local Area Network via the so-called Virtual Library. These physical and hydraulic property data are used to estimate parameters for analytical and numerical flow and transport models that are used for site risk assessments and evaluation of remedial action alternatives. In past years efforts were made by RDS project staff to compile all available physical and hydraulic property data for Hanford sediments and to transfer these data into SoilVision®, a commercial geotechnical software package designed for storing, analyzing, and manipulating soils data. Although SoilVision® has proven to be useful, its access and use restrictions have been recognized as a limitation to the effective use of the physical and hydraulic property databases by the broader group of potential users involved in Hanford waste site issues. In order to make these data more widely available and useable, a decision was made to port them to HEIS and to make them web-accessible via a Virtual Library module. In FY08 the original objectives of this activity on the RDS project were to: (1) ensure traceability and defensibility of all physical and hydraulic property data currently residing in the SoilVision® database

  11. Secure Database Management Study.

    DTIC Science & Technology

    1978-12-01

    3.1.2 McCauley's Security Atoms. Additional concepts are introduced in (MCCAES) which enhance the security aspects of the attribute-bar id model... significance. Physical access remains as an overriding factor of system security. Without protection of the physical operating environment (computer

  12. Curating and Preserving the Big Canopy Database System: an Active Curation Approach using SEAD

    NASA Astrophysics Data System (ADS)

    Myers, J.; Cushing, J. B.; Lynn, P.; Weiner, N.; Ovchinnikova, A.; Nadkarni, N.; McIntosh, A.

    2015-12-01

    Modern research is increasingly dependent upon highly heterogeneous data and on the associated cyberinfrastructure developed to organize, analyze, and visualize that data. However, due to the complexity and custom nature of such combined data-software systems, it can be very challenging to curate and preserve them for the long term at reasonable cost and in a way that retains their scientific value. In this presentation, we describe how this challenge was met in preserving the Big Canopy Database (CanopyDB) system using an agile approach and leveraging the Sustainable Environment - Actionable Data (SEAD) DataNet project's hosted data services. The CanopyDB system was developed over more than a decade at Evergreen State College to address the needs of forest canopy researchers. It is an early yet sophisticated exemplar of the type of system that has become common in biological research and science in general, including multiple relational databases for different experiments, a custom database generation tool used to create them, an image repository, and desktop and web tools to access, analyze, and visualize this data. SEAD provides secure project spaces with a semantic content abstraction (typed content with arbitrary RDF metadata statements and relationships to other content), combined with a standards-based curation and publication pipeline resulting in packaged research objects with Digital Object Identifiers. Using SEAD, our cross-project team was able to incrementally ingest CanopyDB components (images, datasets, software source code, documentation, executables, and virtualized services) and to iteratively define and extend the metadata and relationships needed to document them. We believe that both the process, and the richness of the resultant standards-based (OAI-ORE) preservation object, hold lessons for the development of best-practice solutions for preserving scientific data in association with the tools and services needed to derive value from it.
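    SEAD's content abstraction (typed content carrying arbitrary RDF metadata statements and relationships) can be approximated with a few RDF triples. The sketch below uses the rdflib library to attach simple Dublin Core-style statements to a dataset identifier; the identifiers and property choices are illustrative assumptions, not the actual SEAD vocabulary or the CanopyDB records.

        # Sketch: describe a curated data object with RDF statements, in the spirit
        # of SEAD's metadata model. Identifiers and properties are illustrative only.
        from rdflib import Graph, Literal, URIRef
        from rdflib.namespace import DCTERMS, RDF

        g = Graph()
        dataset = URIRef("https://example.org/canopydb/dataset/42")       # placeholder ID
        source_code = URIRef("https://example.org/canopydb/software/dbgen")

        g.add((dataset, RDF.type, URIRef("http://purl.org/dc/dcmitype/Dataset")))
        g.add((dataset, DCTERMS.title, Literal("CanopyDB experiment tables (example)")))
        g.add((dataset, DCTERMS.relation, source_code))   # link data to its software

        print(g.serialize(format="turtle"))

    Packaging such statements alongside the content is what lets a later curation pipeline emit a standards-based (OAI-ORE) research object with a persistent identifier.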

  13. A lake-centric geospatial database to guide research and inform management decisions in an Arctic watershed in northern Alaska experiencing climate and land-use changes.

    PubMed

    Jones, Benjamin M; Arp, Christopher D; Whitman, Matthew S; Nigro, Debora; Nitze, Ingmar; Beaver, John; Gädeke, Anne; Zuck, Callie; Liljedahl, Anna; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido

    2017-03-25

    Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analyzing the geospatial database relative to fish and bird survey data shows relations to lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.
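    As an example of how such a lake-centric attribute table might be queried to guide management decisions, the sketch below selects deep, hydrologically connected lakes with pandas. The file name, column names, and thresholds are assumptions for illustration, not the published dataset's schema.

        # Sketch: select lakes from a lake-centric attribute table that are deep and
        # stream-connected. Column names and thresholds are illustrative assumptions.
        import pandas as pd

        lakes = pd.read_csv("arctic_watershed_lakes.csv")   # hypothetical export
        candidates = lakes[(lakes["max_depth_m"] >= 2.0) &
                           (lakes["connected"])]            # assumed boolean flag
        print(f"{len(candidates)} of {len(lakes)} lakes are deep and connected")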

  14. The development of an information system and installation of an Internet web database for the purposes of the occupational health and safety management system.

    PubMed

    Mavrikakis, I; Mantas, J; Diomidous, M

    2007-01-01

    This paper is based on research into the possible structure of an information system for the purposes of occupational health and safety management. We initiated a questionnaire in order to gauge the potential interest of prospective users in the subject of occupational health and safety. Capturing this potential interest is vital both for the software analysis cycle and for development according to previous models. The evaluation of the results tends to create pilot applications among different enterprises. Documentation and process improvement, ascertained quality of services, operational support, and occupational health and safety advice are the basics of these applications. Communication and codified information among interested parties is the other target of the survey regarding health issues. Computer networks can offer such services. The network will consist of certain nodes responsible for informing executives on occupational health and safety. A web database has been installed for inserting and searching documents. The submission of files to a server and the answering of questionnaires through the web help the experts perform their activities. Based on the requirements of enterprises, we have constructed a web file server. We submit files so that users can retrieve the files they need. Access is limited to authorized users. Digital watermarks authenticate and protect digital objects.

  15. The Pilot Contracting Activities Program: A Management Review

    DTIC Science & Technology

    1988-12-01

    Naval Postgraduate School, Monterey, California. Thesis: The Pilot Contracting Activities Program: A Management Review, by Robert John Palmquist. Approved for public release; distribution is unlimited. Submitted for the degree of Master of Science in Management from the Naval Postgraduate School, December 1988.

  16. VIEWCACHE: An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nick; Sellis, Timoleon

    1991-01-01

    The objective is to illustrate the concept of incremental access to distributed databases. An experimental database management system, ADMS, which has been developed at the University of Maryland, in College Park, uses VIEWCACHE, a database access method based on incremental search. VIEWCACHE is a pointer-based access method that provides a uniform interface for accessing distributed databases and catalogues. The compactness of the pointer structures formed during database browsing and the incremental access method allow the user to search and do inter-database cross-referencing with no actual data movement between database sites. Once the search is complete, the set of collected pointers pointing to the desired data are dereferenced.
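    The core idea of VIEWCACHE, collecting pointers to qualifying records during browsing and dereferencing them only after the search completes, can be sketched in a few lines. The record layout and filter below are invented for illustration and are not the ADMS implementation.

        # Sketch of a pointer-based view cache: browsing collects only record IDs
        # (pointers); actual rows are fetched in one pass when the search is done.
        # Data layout and predicate are invented for illustration.
        remote_table = {
            101: {"site": "A", "temp": 12.5},
            102: {"site": "B", "temp": -3.0},
            103: {"site": "C", "temp": 7.25},
        }

        def build_view_cache(table, predicate):
            """Scan once, but keep only pointers (keys) to matching rows."""
            return [rid for rid, row in table.items() if predicate(row)]

        def dereference(table, pointer_cache):
            """Materialize the cached view by fetching the rows the pointers name."""
            return [table[rid] for rid in pointer_cache]

        cache = build_view_cache(remote_table, lambda r: r["temp"] > 0)
        print(dereference(remote_table, cache))

    Because only compact pointers are kept during cross-referencing, no bulk data needs to move between database sites until the final dereference step.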

  17. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing project associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  18. US - Former Soviet Union environmental management activities

    SciTech Connect

    1995-09-01

    The Office of Environmental Management (EM) has been delegated the responsibility for US DOE's cleanup of the nuclear weapons complex. The nature and magnitude of the waste management and environmental remediation problem require the identification of technologies and scientific expertise from domestic and foreign sources. This booklet makes comparisons and describes coordinated projects and workshops between the USA and the former Soviet Union.

  19. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., Gravity Recovery And Climate Experiment -- GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
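    The spatial-versus-temporal retrieval described here can be illustrated with a toy index keyed both by repeat-track pass number and by cycle (time). The keys and record contents are invented and do not reflect the tool's actual storage format.

        # Toy sketch: index altimetry-like measurements both by pass (spatial) and by
        # cycle (temporal) so either axis can be retrieved efficiently. Invented data.
        from collections import defaultdict

        by_pass = defaultdict(list)    # pass number  -> measurements
        by_cycle = defaultdict(list)   # repeat cycle -> measurements

        def ingest(cycle, pass_number, ssh_m):
            record = {"cycle": cycle, "pass": pass_number, "ssh_m": ssh_m}
            by_pass[pass_number].append(record)
            by_cycle[cycle].append(record)

        ingest(1, 132, 0.43)
        ingest(2, 132, 0.45)
        ingest(2, 133, 0.38)

        print(len(by_pass[132]), "records on pass 132")   # across-time view
        print(len(by_cycle[2]), "records in cycle 2")     # across-space view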

  20. Database of Industrial Technological Information in Kanagawa : Networks for Technology Activities

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Shindo, Tadashi

    This system is one of the databases that require participation by its members and whose premise is that all data in it are open. Aiming at free technological cooperation and exchange among industries, it was constructed by Kanagawa Prefecture in collaboration with enterprises located there. The input data comprise 36 items that technologically characterize each enterprise, such as major products, special and advantageous technologies, technologies wanted for cooperation, and facilities and equipment. They are expressed in up to 2,000 characters and written in natural language, including Kanji, except for some coded items. Twenty-four search items are accessed by natural language, so that in addition to interactive search procedures, including menu-based ones, extensive searching is possible. The information service started in October 1986, covering data from 2,000 enterprises.

  1. Quarterly Briefing Book on Environmental and Waste Management Activities

    SciTech Connect

    Brown, M.C.

    1991-06-01

    The purpose of the Quarterly Briefing Book on Environmental and Waste Management Activities is to provide managers and senior staff at the US Department of Energy-Richland Operations Office and its contractors with timely and concise information on Hanford Site environmental and waste management activities. Each edition updates the information on the topics in the previous edition, deletes those determined not to be of current interest, and adds new topics to keep up to date with changing environmental and waste management requirements and issues. Section A covers current waste management and environmental restoration issues. In Section B are writeups on national or site-wide environmental and waste management topics. Section C has writeups on program- and waste-specific environmental and waste management topics. Section D provides information on waste sites and inventories on the site. 15 figs., 4 tabs.

  2. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  3. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  4. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    NASA Technical Reports Server (NTRS)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  5. International Project Management Committee: Overview and Activities

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward

    2010-01-01

    This slide presentation discusses the purpose and composition of the International Project Management Committee (IMPC). The IMPC was established by members of 15 space agencies, companies and professional organizations. The goal of the committee is to establish a means to share experiences and best practices with space project/program management practitioners at the global level. The space agencies that are involved are: AEB, DLR, ESA, ISRO, JAXA, KARI, and NASA. The industrial and professional organizational members are Comau, COSPAR, PMI, and Thales Alenia Space.

  6. Contextualizing Solar Cycle 24: Report on the Development of a Homogenous Database of Bipolar Active Regions Spanning Four Cycles

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, A.; Werginz, Z. A.; DeLuca, M. D.; Vargas-Acosta, J. P.; Longcope, D. W.; Harvey, J. W.; Martens, P.; Zhang, J.; Vargas-Dominguez, S.; DeForest, C. E.; Lamb, D. A.

    2015-12-01

    The solar cycle can be understood as a process that alternates the large-scale magnetic field of the Sun between poloidal and toroidal configurations. Although the process that transitions the solar cycle between toroidal and poloidal phases is still not fully understood, theoretical studies and observational evidence suggest that this process is driven by the emergence and decay of bipolar magnetic regions (BMRs) at the photosphere. Furthermore, the emergence of BMRs at the photosphere is the main driver behind solar variability and solar activity in general, making the study of their properties doubly important for heliospheric physics. However, in spite of their critical role, there is still no unified catalog of BMRs spanning multiple instruments and covering the entire period of systematic measurement of the solar magnetic field (i.e. 1975 to present). In this presentation we discuss an ongoing project to address this deficiency by applying our Bipolar Active Region Detection (BARD) code to full disk magnetograms measured by the 512 (1975-1993) and SPMG (1992-2003) instruments at the Kitt Peak Vacuum Telescope (KPVT), SOHO/MDI (1996-2011) and SDO/HMI (2010-present). First we will discuss the results of our revitalization of 512 and SPMG KPVT data, then we will discuss how our BARD code operates, and finally report the results of our cross-calibration. The corrected and improved KPVT magnetograms will be made available through the National Solar Observatory (NSO) and Virtual Solar Observatory (VSO), including updated synoptic maps produced by running the corrected KPVT magnetograms through the SOLIS pipeline. The homogeneous active region database will be made public by the end of 2017 once it has reached a satisfactory level of quality and maturity. The Figure shows all bipolar active regions present in our database (as of Aug 2015) colored according to the sign of their leading polarity. Marker size is indicative of the total active region flux. Anti

  7. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative Data Base Management Systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Previous uses of administrative data base systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  8. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  9. Biofuel Database

    National Institute of Standards and Technology Data Gateway

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  10. Polish activity within Orphanet Europe--state of art of database and services.

    PubMed

    Jezela-Stanek, Aleksandra; Karczmarewicz, Dorota; Chrzanowska, Krystyna H; Krajewska-Walasek, Małgorzata

    2015-01-01

    Orphanet is an international project aiming to help improve the diagnostic process, care, and treatment of patients with rare diseases, and to provide information on developments in research and new therapies. Orphanet is currently represented in 38 countries. The infrastructure and coordination activities are jointly funded by Inserm, the French Directorate General for Health, and the European Commission. Moreover, certain services are specially funded by other partners. Orphanet's activities in each country of the network are partially financed by national institutions and/or specific contracts. In this paper we present the Orphanet portal as well as the Polish national activity within this project.

  11. US EPA’s Watershed Management Research Activities

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s Urban Watershed Management Branch (UWMB) is responsible for developing and demonstrating methods to manage the risk to public health, property and the environment from wet-weather flows (WWF) in urban watersheds. The activities are prim...

  12. Guide to good practices for line and training manager activities

    SciTech Connect

    1998-06-01

    The purpose of this guide is to provide direction for line and training managers in carrying out their responsibilities for training and qualifying personnel and to verify that existing training activities are effective.

  13. Australian Management Education for International Business Activity.

    ERIC Educational Resources Information Center

    Gniewosz, Gerhard

    2000-01-01

    As Australian corporations have increased overseas activity, there has been a significant increase in international business degrees at the undergraduate and graduate levels. The curriculum is balanced between business-technical knowledge courses and cultural knowledge courses. (SK)

  14. Preliminary Tritium Management Design Activities at ORNL

    SciTech Connect

    Harrison, Thomas J.; Felde, David K.; Logsdon, Randall J.; McFarlane, Joanna; Qualls, A. L.

    2016-09-01

    Interest in salt-cooled and salt-fueled reactors has increased over the last decade (Forsberg et al. 2016). Several private companies and universities in the United States, as well as governments in other countries, are developing salt reactor designs and/or technology. Two primary issues for the development and deployment of many salt reactor concepts are (1) the prevention of tritium generation and (2) the management of tritium to prevent release to the environment (Holcomb 2013). In 2016, the US Department of Energy (DOE) initiated a research project under the Advanced Reactor Technology Program to (1) experimentally assess the feasibility of proposed methods for tritium mitigation and (2) to perform an engineering demonstration of the most promising methods. This document describes results from the first year’s efforts to define, design, and build an experimental apparatus to test potential methods for tritium management. These efforts are focused on producing a final design document as the basis for the apparatus and its scheduled completion consistent with available budget and approvals for facility use.

  15. Essential Learnings in Environmental Education--A Database for Building Activities and Programs.

    ERIC Educational Resources Information Center

    Ballard, Melissa, Comp.; Pandya, Mamata, Comp.

    The purpose of this book is to provide building blocks for designing and reviewing environmental education programs and activities. This handbook provides 600 basic concepts needed to attain the environmental education goals outlined at the Tbilisi, USSR, conference and generally agreed to be the fundamental core of quality environmental…

  16. Development of Novel Repellents Using Structure - Activity Modeling of Compounds in the USDA Archival Database

    DTIC Science & Technology

    2011-01-01

    used in efforts to develop QSAR models. Measurement of Repellent Efficacy. Screening for Repellency of Compounds with Unknown Toxicology. In screening...CPT) were used to develop Quantitative Structure-Activity Relationship (QSAR) models to predict repellency. Successful prediction of novel...acylpiperidine QSAR models employed 4 descriptors to describe the relationship between structure and repellent duration. The ANN model of the carboxamides did not

  17. Implemented data mining and signal management systems on spontaneous reporting systems' databases and their availability to the scientific community - a systematic review.

    PubMed

    de Almeida Vieira Lima, Luis Miguel; Nunes, Nuno Goncalo Sales Craveiro; da Silva Dias, Pedro Goncalo Pires; Marques, Francisco Jorge Batel

    2012-04-01

    Adverse drug reactions' spontaneous reporting systems are an important element in worldwide pharmacovigilance, gathering potentially useful information for post-marketing drug safety surveillance. Data mining and signal management systems, providing the capability of reading and interpreting these systems' raw data (data that has not been subjected to processing or any other manipulation), improve its analysis process. In order for this analysis to be possible, both data mining and signal management systems and raw data should be available to researchers and the scientific community. The purpose of this work was to provide an overview of the spontaneous reporting systems databases reported in literature as having implemented a data mining and signal management system and the implementation itself, evidencing their availability to researchers. A systematic review was carried out, concluding that they are freely provided to researchers within institutions responsible for maintaining the spontaneous reporting systems, but not to most researchers within the scientific community.

  18. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the database integration problem for
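
    Because BioWarehouse collapses its component databases into one relational schema, the kind of multi-database question mentioned above (e.g., which enzyme activities have no associated sequence) reduces to a single SQL join. The sketch below illustrates the idea with a toy in-memory SQLite database; the table and column names are invented for illustration and are not the actual BioWarehouse schema.

```python
# Minimal sketch (not the actual BioWarehouse schema): once several source
# databases are loaded into one relational schema, a "multi-database" question
# such as "which enzyme activities have no associated sequence?" becomes a
# single SQL query. Table and column names here are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE protein_sequence (id INTEGER PRIMARY KEY, ec_number TEXT);
    INSERT INTO enzyme_activity VALUES ('1.1.1.1', 'alcohol dehydrogenase'),
                                       ('9.9.9.9', 'hypothetical activity');
    INSERT INTO protein_sequence VALUES (1, '1.1.1.1');
""")

# Enzyme activities (EC numbers) with no sequence in any loaded source database.
cur.execute("""
    SELECT a.ec_number, a.name
    FROM enzyme_activity AS a
    LEFT JOIN protein_sequence AS s ON s.ec_number = a.ec_number
    WHERE s.id IS NULL
""")
print(cur.fetchall())   # -> [('9.9.9.9', 'hypothetical activity')]
```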

  19. Description of data base management activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    ARC's current and future data processing needs have been identified and defined. Requirements for improved data processing capabilities are listed: (1) Centralized control of system data files. Establish a DBA (data base administrator) position with responsibility for management of the Center's numerous data bases; (2) Programmer tools to improve efficiency of performance; (3) Direct and timely access to information. Presently, the user submits query requests to the data processing department, where they are prioritized with other queries and then batch processed using a report generator; (4) On-line data entry. With the merger of the Dryden facility with ARC, on-line data entry, editing, and updates have become mandatory for timely operation and reporting. A DBMS software package was purchased to meet the above requirements.

  20. A Quality System Database

    NASA Technical Reports Server (NTRS)

    Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William

    2010-01-01

    A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.

  1. Active listening: The key of successful communication in hospital managers

    PubMed Central

    Jahromi, Vahid Kohpeima; Tabatabaee, Seyed Saeed; Abdar, Zahra Esmaeili; Rajabi, Mahboobeh

    2016-01-01

    Introduction One of the important causes of medical errors and unintentional harm to patients is ineffective communication. An important part of this skill, though often forgotten, is listening. The objective of this study was to determine whether managers in hospitals listen actively. Methods This study was conducted between May and June 2014 among three levels of managers at teaching hospitals in Kerman, Iran. Active listening skill among hospital managers was measured by the self-made Active Listening Skill Scale (ALSS), which consists of the key elements of active listening and has five subscales, i.e., Avoiding Interruption, Maintaining Interest, Postponing Evaluation, Organizing Information, and Showing Interest. The data were analyzed by IBM-SPSS software, version 20, and the Pearson product-moment correlation coefficient, the chi-squared test, and multiple linear regressions. Results The mean score of active listening in hospital managers was 2.32 out of 3. The highest score (2.27) was obtained by the first-level managers, and the top managers got the lowest score (2.16). Hospital managers were best in showing interest and worst in avoiding interruptions. The area of employment was a significant predictor of avoiding interruption, and the managers' gender was a strong predictor of skill in maintaining interest (p < 0.05). The type of management and education can predict postponing evaluation, and the length of employment can predict showing interest (p < 0.05). Conclusion There is a necessity for the development of strategies to create more awareness among the hospital managers concerning their active listening skills. PMID:27123221

  2. Using Sales Management Students to Manage Professional Selling Students in an Innovative Active Learning Project

    ERIC Educational Resources Information Center

    Young, Joyce A.; Hawes, Jon M.

    2013-01-01

    This paper describes an application of active learning within two different courses: professional selling and sales management. Students assumed the roles of sales representatives and sales managers for an actual fund-raiser--a golf outing--sponsored by a student chapter of the American Marketing Association. The sales project encompassed an…

  3. Design and Implementation of a Database Management System to Support Administrative Activities Onboard Hellenic Navy Vessels

    DTIC Science & Technology

    1994-09-01

    ...well. SPAS testing strategy will be discussed at this point. 1. Paradox Testing: Paradox is designed to allow the developer to conduct testing through... [The remainder of this excerpt is OCR residue from a scanned disciplinary-record form: Offence #, Name, Date, Apology, Punishment, Start Date, End Date, Reporting Officer.]

  4. The Global Terrestrial Network for Permafrost Database: metadata statistics and prospective analysis on future permafrost temperature and active layer depth monitoring site distribution

    NASA Astrophysics Data System (ADS)

    Biskaborn, B. K.; Lanckman, J.-P.; Lantuit, H.; Elger, K.; Streletskiy, D. A.; Cable, W. L.; Romanovsky, V. E.

    2015-03-01

    The Global Terrestrial Network for Permafrost (GTN-P) provides the first dynamic database associated with the Thermal State of Permafrost (TSP) and the Circumpolar Active Layer Monitoring (CALM) programs, which extensively collect permafrost temperature and active layer thickness data from Arctic, Antarctic and Mountain permafrost regions. The purpose of the database is to establish an "early warning system" for the consequences of climate change in permafrost regions and to provide standardized thermal permafrost data to global models. In this paper we perform statistical analysis of the GTN-P metadata aiming to identify the spatial gaps in the GTN-P site distribution in relation to climate-effective environmental parameters. We describe the concept and structure of the Data Management System in regard to user operability, data transfer and data policy. We outline data sources and data processing including quality control strategies. Assessment of the metadata and data quality reveals 63% metadata completeness at active layer sites and 50% metadata completeness for boreholes. Voronoi Tessellation Analysis on the spatial sample distribution of boreholes and active layer measurement sites quantifies the distribution inhomogeneity and provides potential locations of additional permafrost research sites to improve the representativeness of thermal monitoring across areas underlain by permafrost. The depth distribution of the boreholes reveals that 73% are shallower than 25 m and 27% are deeper, reaching a maximum of 1 km depth. Comparison of the GTN-P site distribution with permafrost zones, soil organic carbon contents and vegetation types exhibits different local to regional monitoring situations on maps. Preferential slope orientation at the sites most likely causes a bias in the temperature monitoring and should be taken into account when using the data for global models. The distribution of GTN-P sites within zones of projected temperature change show a high
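
    The Voronoi Tessellation Analysis mentioned above can be outlined in a few lines: build the tessellation from the site coordinates and flag the sites whose cells are largest, since those sit in the most sparsely monitored areas. The sketch below is a planar toy with made-up coordinates; the actual GTN-P analysis works with projected polar coordinates and the real site locations.

```python
# Minimal planar sketch of a Voronoi-based gap analysis, assuming made-up site
# coordinates; the real GTN-P analysis uses projected polar coordinates.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
sites = rng.uniform(0, 1000, size=(50, 2))      # hypothetical site locations (km)
vor = Voronoi(sites)

def cell_area(region):
    """Shoelace area of a bounded Voronoi cell."""
    pts = vor.vertices[region]
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

areas = {}
for site_idx, region_idx in enumerate(vor.point_region):
    region = vor.regions[region_idx]
    if region and -1 not in region:              # skip unbounded cells at the hull
        areas[site_idx] = cell_area(region)

# Sites whose Voronoi cells are largest sit in the most sparsely monitored areas.
worst = sorted(areas, key=areas.get, reverse=True)[:5]
print("least-supported sites:", worst)
```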

  5. 76 FR 14924 - Takes of Marine Mammals Incidental to Specified Activities; Russian River Estuary Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... Specified Activities; Russian River Estuary Management Activities AGENCY: National Marine Fisheries Service... incidental to Russian River estuary management activities. Pursuant to the Marine Mammal Protection Act (MMPA... Channel Adaptive Management Plan. NMFS' Environmental Assessment (2010) and associated Finding of...

  6. Activity-Based Costing: A Cost Management Tool.

    ERIC Educational Resources Information Center

    Turk, Frederick J.

    1993-01-01

    In college and university administration, overhead costs are often charged to programs indiscriminately, whereas the support activities that underlie those costs remain unanalyzed. It is time for institutions to decrease ineffective use of resources. Activity-based management attributes costs more accurately and can improve efficiency. (MSE)

  7. Database of the Geology and Thermal Activity of Norris Geyser Basin, Yellowstone National Park

    USGS Publications Warehouse

    Flynn, Kathryn; Graham Wall, Brita; White, Donald E.; Hutchinson, Roderick A.; Keith, Terry E.C.; Clor, Laura; Robinson, Joel E.

    2008-01-01

    This dataset contains contacts, geologic units and map boundaries from Plate 1 of USGS Professional Paper 1456, 'The Geology and Remarkable Thermal Activity of Norris Geyser Basin, Yellowstone National Park, Wyoming.' The features are contained in the Annotation, basins_poly, contours, geology_arc, geology_poly, point_features, and stream_arc feature classes as well as a table of geologic units and their descriptions. This dataset was constructed to produce a digital geologic map as a basis for studying hydrothermal processes in Norris Geyser Basin. The original map does not contain registration tic marks. To create the geodatabase, the original scanned map was georegistered to USGS aerial photographs of the Norris Junction quadrangle collected in 1994. Manmade objects, i.e. roads, parking lots, and the visitor center, along with stream junctions and other hydrographic features, were used for registration.

  8. Public databases of plant natural products for computational drug discovery.

    PubMed

    Tung, Chun-Wei

    2014-01-01

    Plant natural products have been intensively investigated during the past decades with a considerable amount of generated data. Databases are subsequently developed to facilitate the management and analysis of accumulated information including plant species, chemical compounds, structures and bioactivities. With the support of databases, the screening of novel bioactivities for plant natural products can benefit from advanced computational methods to accelerate the progress of drug discovery. This overview describes the contents of publicly available databases useful for computational research of plant natural products. Based on the databases, quantitative structure-activity relationship models and protein-ligand docking methods can be developed and applied to analyze and screen bioactive compounds. More public and structured databases with unique contents, search functions and links to major databases are needed for efficiently exploring the chemical space of plant natural products.

  9. Passive and active adaptive management: approaches and an example.

    PubMed

    Williams, Byron K

    2011-05-01

    Adaptive management is a framework for resource conservation that promotes iterative learning-based decision making. Yet there remains considerable confusion about what adaptive management entails, and how to actually make resource decisions adaptively. A key but somewhat ambiguous distinction in adaptive management is between active and passive forms of adaptive decision making. The objective of this paper is to illustrate some approaches to active and passive adaptive management with a simple example involving the drawdown of water impoundments on a wildlife refuge. The approaches are illustrated for the drawdown example, and contrasted in terms of objectives, costs, and potential learning rates. Some key challenges to the actual practice of AM are discussed, and tradeoffs between implementation costs and long-term benefits are highlighted.

  10. Passive and active adaptive management: Approaches and an example

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    Adaptive management is a framework for resource conservation that promotes iterative learning-based decision making. Yet there remains considerable confusion about what adaptive management entails, and how to actually make resource decisions adaptively. A key but somewhat ambiguous distinction in adaptive management is between active and passive forms of adaptive decision making. The objective of this paper is to illustrate some approaches to active and passive adaptive management with a simple example involving the drawdown of water impoundments on a wildlife refuge. The approaches are illustrated for the drawdown example, and contrasted in terms of objectives, costs, and potential learning rates. Some key challenges to the actual practice of AM are discussed, and tradeoffs between implementation costs and long-term benefits are highlighted. © 2010 Elsevier Ltd.

  11. Far-infrared Line Spectra of Active Galaxies from the Herschel/PACS Spectrometer: The Complete Database

    NASA Astrophysics Data System (ADS)

    Fernández-Ontiveros, Juan Antonio; Spinoglio, Luigi; Pereira-Santaella, Miguel; Malkan, Matthew A.; Andreani, Paola; Dasyra, Kalliopi M.

    2016-10-01

    We present a coherent database of spectroscopic observations of far-IR fine-structure lines from the Herschel/Photoconductor Array Camera and Spectrometer archive for a sample of 170 local active galactic nuclei (AGNs), plus a comparison sample of 20 starburst galaxies and 43 dwarf galaxies. Published Spitzer/IRS and Herschel/SPIRE line fluxes are included to extend our database to the full 10-600 μm spectral range. The observations are compared to a set of Cloudy photoionization models to estimate the above physical quantities through different diagnostic diagrams. We confirm the presence of a stratification of gas density in the emission regions of the galaxies, which increases with the ionization potential of the emission lines. The new [O IV] 25.9 μm/[O III] 88 μm versus [Ne III] 15.6 μm/[Ne II] 12.8 μm diagram is proposed as the best diagnostic to separate (1) AGN activity from any kind of star formation and (2) low-metallicity dwarf galaxies from starburst galaxies. Current stellar atmosphere models fail to reproduce the observed [O IV] 25.9 μm/[O III] 88 μm ratios, which are much higher when compared to the predicted values. Finally, the ([Ne III] 15.6 μm + [Ne II] 12.8 μm)/([S IV] 10.5 μm + [S III] 18.7 μm) ratio is proposed as a promising metallicity tracer to be used in obscured objects, where optical lines fail to accurately measure the metallicity. The diagnostic power of mid- to far-infrared spectroscopy shown here for local galaxies will be of crucial importance to study galaxy evolution during the dust-obscured phase at the peak of the star formation and black hole accretion activity (1 < z < 4). This study will be addressed by future deep spectroscopic surveys with present and forthcoming facilities such as the James Webb Space Telescope, the Atacama Large Millimeter/submillimeter Array, and the Space Infrared Telescope for Cosmology and Astrophysics.

  12. An Interactive Geospatial Database and Visualization Approach to Early Warning Systems and Monitoring of Active Volcanoes: GEOWARN

    NASA Astrophysics Data System (ADS)

    Gogu, R. C.; Schwandner, F. M.; Hurni, L.; Dietrich, V. J.

    2002-12-01

    Large parts of southern and central Europe and the Pacific rim are situated in tectonically, seismically, and volcanologically extremely active zones. With the growth of population and tourism, vulnerability and risk towards natural hazards have expanded over large areas. Socio-economic aspects, land use, tourist and industrial planning, as well as environmental protection, increasingly require natural hazard assessment. The availability of powerful and reliable satellite, geophysical and geochemical information and warning systems is therefore increasingly vital. Besides, once such systems have proven to be effective, they can be applied for similar purposes in other European areas and worldwide. Technologies today have proven that early warning of volcanic activity can be achieved by monitoring measurable changes in geophysical and geochemical parameters. Correlation between different monitored data sets, which would improve any prediction, is very scarce or missing. Visualisation of all spatial information and its integration into an "intelligent cartographic concept" is of paramount interest in order to develop 2-, 3- and 4-dimensional models to approach risk and emergency assessment as well as environmental and socio-economic planning. In the framework of the GEOWARN project, a database prototype for an Early Warning System (EWS) and monitoring of volcanic activity in case of hydrothermal-explosive and volcanic reactivation has been designed. The platform-independent, web-based, Java-programmed, interactive multidisciplinary multiparameter visualization software being developed at ETH allows expansion and utilization to other volcanoes, world-wide databases of volcanic unrest, or other types of natural hazard assessment. Within the project consortium, scientific data have been acquired at two pilot sites: Campi Flegrei (Italy) and Nisyros (Greece), including 2- and 3-D topography and bathymetry, elevation (DEM) and landscape models (DLM) derived from conventional

  13. Management of Water for Unconventional Oil and Gas Operations Enhanced with the Expanded U.S.Geological Survey Produced Waters Geochemical Database

    NASA Astrophysics Data System (ADS)

    Gans, K. D.; Blondes, M. S.; Thordsen, J. J.; Thomas, B.; Reidy, M. E.; Engle, M.; Kharaka, Y. K.; Rowan, E. L.

    2014-12-01

    Increases in hydraulic fracturing practices for shale gas and tight oil reservoirs have dramatically increased petroleum production in the USA, but have also made the issue of water management from these operations a high priority. Hydraulic fracturing requires ~ 10,000 to 50,000 m3 of water per well for injection in addition to water used to drill the well. Initially much of the water used for hydraulic fracturing was fresh water, but attitudes and operations are changing in response to costs and concerns. Concerns about groundwater depletion and contamination have prompted operators to increase the amount of produced water that can be recycled for hydraulic fracturing and to find suitable locations for salt-water injection. Knowledge of the geochemistry of produced waters is valuable in determining the feasibility of produced water recycling. Water with low salinity can be reclaimed for use outside of the petroleum industry (e.g. irrigation, municipal uses, and industrial operations). The updated and expanded USGS Produced Waters Database available at http://eerscmap.usgs.gov/pwapp/ will facilitate and enhance studies on management of water, including produced water, for unconventional oil and gas drilling and production. The USGS database contains > 160,000 samples. Expanding on the 2002 database, we have filled in state and regional gaps with information from conventional and unconventional wells and have increased the number of constituents to include minor and trace chemicals, isotopes, and time series data. We currently have produced water data from 5,200 tight gas wells, 4,500 coal-bed methane (CBM) wells, 3,500 shale gas wells, and 700 tight oil wells. These numbers will increase as we continue to receive positive responses from oil companies, state oil and gas commissions, and scientists wanting to contribute their data. This database is an important resource for a wide range of interested parties. Scientists from universities, government agencies, public

  14. Mechanisms and Management of Stress Fractures in Physically Active Persons

    PubMed Central

    Romani, William A.; Gieck, Joe H.; Perrin, David H.; Saliba, Ethan N.; Kahler, David M.

    2002-01-01

    Objective: To describe the anatomy of bone and the physiology of bone remodeling as a basis for the proper management of stress fractures in physically active people. Data Sources: We searched PubMed for the years 1965 through 2000 using the key words stress fracture, bone remodeling, epidemiology, and rehabilitation. Data Synthesis: Bone undergoes a normal remodeling process in physically active persons. Increased stress leads to an acceleration of this remodeling process, a subsequent weakening of bone, and a higher susceptibility to stress fracture. When a stress fracture is suspected, appropriate management of the injury should begin immediately. Effective management includes a cyclic process of activity and rest that is based on the remodeling process of bone. Conclusions/Recommendations: Bone continuously remodels itself to withstand the stresses involved with physical activity. Stress fractures occur as the result of increased remodeling and a subsequent weakening of the outer surface of the bone. Once a stress fracture is suspected, a cyclic management program that incorporates the physiology of bone remodeling should be initiated. The cyclic program should allow the physically active person to remove the source of the stress to the bone, maintain fitness, promote a safe return to activity, and permit the bone to heal properly. PMID:16558676

  15. Bibliometric analysis of nutrition and dietetics research activity in Arab countries using ISI Web of Science database.

    PubMed

    Sweileh, Waleed M; Al-Jabi, Samah W; Sawalha, Ansam F; Zyoud, Sa'ed H

    2014-01-01

    Reducing nutrition-related health problems in Arab countries requires an understanding of the performance of Arab countries in the field of nutrition and dietetics research. Assessment of research activity from a particular country or region could be achieved through bibliometric analysis. This study was carried out to investigate research activity in "nutrition and dietetics" in Arab countries. Original and review articles published from Arab countries in "nutrition and dietetics" Web of Science category up until 2012 were retrieved and analyzed using the ISI Web of Science database. The total number of documents published in "nutrition and dietetics" category from Arab countries was 2062. This constitutes 1% of worldwide research activity in the field. Annual research productivity showed a significant increase after 2005. Approximately 60% of published documents originated from three Arab countries, particularly Egypt, Kingdom of Saudi Arabia, and Tunisia. However, Kuwait has the highest research productivity per million inhabitants. Main research areas of published documents were in "Food Science/Technology" and "Chemistry" which constituted 75% of published documents compared with 25% for worldwide documents in nutrition and dietetics. A total of 329 (15.96%) nutrition - related diabetes or obesity or cancer documents were published from Arab countries compared with 21% for worldwide published documents. Interest in nutrition and dietetics research is relatively recent in Arab countries. Focus of nutrition research is mainly toward food technology and chemistry with lesser activity toward nutrition-related health research. International cooperation in nutrition research will definitely help Arab researchers in implementing nutrition research that will lead to better national policies regarding nutrition.

  16. A Ranking Analysis of the Management Schools in Greater China (2000-2010): Evidence from the SSCI Database

    ERIC Educational Resources Information Center

    Hou, Mingjun; Fan, Peihua; Liu, Heng

    2014-01-01

    The authors rank the management schools in Greater China (including Mainland China, Hong Kong, Taiwan, and Macau) based on their academic publications in the Social Sciences Citation Index management and business journals from 2000 to 2010. Following K. Ritzberger's (2008) and X. Yu and Z. Gao's (2010) ranking method, the authors develop six…

  17. Maize databases

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter is a succinct overview of maize data held in the species-specific database MaizeGDB (the Maize Genomics and Genetics Database), and selected multi-species data repositories, such as Gramene/Ensembl Plants, Phytozome, UniProt and the National Center for Biotechnology Information (NCBI), ...

  18. SU-E-J-129: A Strategy to Consolidate the Image Database of a VERO Unit Into a Radiotherapy Management System

    SciTech Connect

    Yan, Y; Medin, P; Yordy, J; Zhao, B; Jiang, S

    2014-06-01

    Purpose: To present a strategy to integrate the imaging database of a VERO unit with a treatment management system (TMS) to improve clinical workflow and consolidate image data to facilitate clinical quality control and documentation. Methods: A VERO unit is equipped with both kV and MV imaging capabilities for IGRT treatments. It has its own imaging database behind a firewall. It has been a challenge to transfer images on this unit to a TMS in a radiation therapy clinic so that registered images can be reviewed remotely with an approval or rejection record. In this study, a software system, iPump-VERO, was developed to connect VERO and a TMS in our clinic. The patient database folder on the VERO unit was mapped to a read-only folder on a file server outside the VERO firewall. The application runs on a regular computer with read access to the patient database folder. It finds the latest registered images and fuses them in one of six predefined patterns before sending them via a DICOM connection to the TMS. The residual image registration errors are overlaid on the fused image to facilitate image review. Results: The fused images of either registered kV planar images or CBCT images are fully DICOM compatible. A sentinel module is built to sense newly registered images with negligible computing resources from the VERO ExacTrac imaging computer. It takes a few seconds to fuse registered images and send them to the TMS. The whole process is automated without any human intervention. Conclusion: Transferring images over a DICOM connection is the easiest way to consolidate images from various sources in your TMS. Technically, the attending does not have to go to the VERO treatment console to review image registration prior to delivery. It is a useful tool for a busy clinic with a VERO unit.
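
    The workflow described above (a sentinel watching a read-only copy of the imaging database, fusing the latest registered images, and pushing them to the TMS) can be sketched as a simple polling loop. The folder path, the fusion routine, and the DICOM transfer below are placeholders, not the actual iPump-VERO implementation.

```python
# Minimal sketch of a "sentinel" that watches a read-only image folder and hands
# newly registered images to a fuse-and-send step. The path, the fusion routine
# and the DICOM transfer are placeholders, not the actual iPump-VERO code.
import time
from pathlib import Path

WATCH_DIR = Path(r"\\fileserver\vero_patients")   # hypothetical read-only share
POLL_SECONDS = 5

def fuse_and_send(image_paths):
    """Placeholder: fuse registered images into one of the predefined layouts
    and push the result to the TMS over DICOM (e.g. with a DICOM toolkit)."""
    print("would fuse and send:", [p.name for p in image_paths])

def watch():
    seen = set()
    while True:
        current = set(WATCH_DIR.rglob("*.dcm"))                  # registered image files
        new = sorted(current - seen, key=lambda p: p.stat().st_mtime)
        if new:
            fuse_and_send(new)
            seen |= set(new)
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    watch()
```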

  19. Managing Rock and Paleomagnetic Data Flow with the MagIC Database: from Measurement and Analysis to Comprehensive Archive and Visualization

    NASA Astrophysics Data System (ADS)

    Koppers, A. A.; Minnett, R. C.; Tauxe, L.; Constable, C.; Donadini, F.

    2008-12-01

    The Magnetics Information Consortium (MagIC) is commissioned to implement and maintain an online portal to a relational database populated by rock and paleomagnetic data. The goal of MagIC is to archive all measurements and derived properties for studies of paleomagnetic directions (inclination, declination) and intensities, and for rock magnetic experiments (hysteresis, remanence, susceptibility, anisotropy). Organizing data for presentation in peer-reviewed publications or for ingestion into databases is a time-consuming task, and to facilitate these activities, three tightly integrated tools have been developed: MagIC-PY, the MagIC Console Software, and the MagIC Online Database. A suite of Python scripts is available to help users port their data into the MagIC data format. They allow the user to add important metadata, perform basic interpretations, and average results at the specimen, sample and site levels. These scripts have been validated for use as Open Source software under the UNIX, Linux, PC and Macintosh© operating systems. We have also developed the MagIC Console Software program to assist in collating rock and paleomagnetic data for upload to the MagIC database. The program runs in Microsoft Excel© on both Macintosh© computers and PCs. It performs routine consistency checks on data entries, and assists users in preparing data for uploading into the online MagIC database. The MagIC website is hosted under EarthRef.org at http://earthref.org/MAGIC/ and has two search nodes, one for paleomagnetism and one for rock magnetism. Both nodes provide query building based on location, reference, methods applied, material type and geological age, as well as a visual FlashMap interface to browse and select locations. Users can also browse the database by data type (inclination, intensity, VGP, hysteresis, susceptibility) or by data compilation to view all contributions associated with previous databases, such as PINT, GMPDB or TAFI or other user

  20. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Kofoed, K. B.; Copenhaver, W.; Laney, C. M.; Gaylord, A. G.; Collins, J. A.; Tweedie, C. E.

    2014-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Recent advances include the addition of more than 2000 new research sites, provision of differential global positioning system (dGPS) and Unmanned Aerial Vehicle (UAV) support to visiting scientists, surveying of over 80 miles of coastline to document rates of erosion, training of local GIS personnel to better make use of science in local decision making, deployment of and near-real-time connectivity to a wireless micrometeorological sensor network, links to Barrow area datasets housed at national data archives, and substantial upgrades to the BAID website and web mapping applications.

  1. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  2. The Impact of Environment and Occupation on the Health and Safety of Active Duty Air Force Members: Database Development and De-Identification.

    PubMed

    Erich, Roger; Eaton, Melinda; Mayes, Ryan; Pierce, Lamar; Knight, Andrew; Genovesi, Paul; Escobar, James; Mychalczuk, George; Selent, Monica

    2016-08-01

    Preparing data for medical research can be challenging, detail oriented, and time consuming. Transcription errors, missing or nonsensical data, and records not applicable to the study population may hamper progress and, if unaddressed, can lead to erroneous conclusions. In addition, study data may be housed in multiple disparate databases and complex formats. Merging methods may be incomplete to obtain temporally synchronized data elements. We created a comprehensive database to explore the general hypothesis that environmental and occupational factors influence health outcomes and risk-taking behavior among active duty Air Force personnel. Several databases containing demographics, medical records, health survey responses, and safety incident reports were cleaned, validated, and linked to form a comprehensive, relational database. The final step involved removing and transforming personally identifiable information to form a Health Insurance Portability and Accountability Act compliant limited database. Initial data consisted of over 62.8 million records containing 221 variables. When completed, approximately 23.9 million clean and valid records with 214 variables remained. With a clean, robust database, future analysis aims to identify high-risk career fields for targeted interventions or uncover potential protective factors in low-risk career fields.
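
    The link-then-de-identify step described above can be outlined with pandas: derive a one-way pseudonymous key so the source extracts can be merged without retaining the identifier, then drop direct identifiers before analysis. Column names and values below are invented; the actual study linked several Air Force data systems under a HIPAA limited-data-set rule set.

```python
# Minimal sketch of the link-then-de-identify step, assuming invented column
# names and toy records; not the study's actual schema or rule set.
import hashlib
import pandas as pd

def pseudonym(ssn: str) -> str:
    """One-way pseudonymous key so records can be linked without storing the SSN."""
    return hashlib.sha256(ssn.encode()).hexdigest()[:16]

demographics = pd.DataFrame({"ssn": ["111-22-3333"], "name": ["Doe, J."],
                             "birth_year": [1985], "career_field": ["1A8X1"]})
safety = pd.DataFrame({"ssn": ["111-22-3333"], "incident_date": ["2014-06-01"],
                       "incident_class": ["C"]})

for df in (demographics, safety):
    df["person_key"] = df["ssn"].map(pseudonym)

linked = demographics.merge(safety, on="person_key", how="left")

# Drop direct identifiers; keep indirect fields allowed in a limited data set.
deidentified = linked.drop(columns=["ssn_x", "ssn_y", "name"])
print(deidentified.columns.tolist())
```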

  3. WAX ActiveLibrary: a tool to manage information overload.

    PubMed

    Hanka, R; O'Brien, C; Heathfield, H; Buchan, I E

    1999-11-01

    WAX Active-Library (Cambridge Centre for Clinical Informatics) is a knowledge management system that seeks to support doctors' decision making through the provision of electronic books containing a wide range of clinical knowledge and locally based information. WAX has been piloted in several regions in the United Kingdom and formally evaluated in 17 GP surgeries based in Cambridgeshire. The evaluation has provided evidence that WAX Active-Library significantly improves GPs' access to relevant information sources and by increasing appropriate patient management and referrals this might also lead to an improvement in clinical outcomes.

  4. The spectral database Specchio: Data management, data sharing and initial processing of field spectrometer data within the Dimensions of Biodiversity project

    NASA Astrophysics Data System (ADS)

    Hueni, A.; Schweiger, A. K.

    2015-12-01

    Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.

  5. BATMAN (Battle-Management Assessment System) and ROBIN (Raid Originator Bogie Ingress): Rationale, Software Design, and Database Descriptions

    DTIC Science & Technology

    1989-04-01

    background color of the screen when entering a new user's name and social security number. value: a string of the form "color offset". See buttoncolor above. userbgcolor: the color of empty slots in the "list of users" panel. value: a string of the form "color offset". See buttoncolor above. userfgcolor: the color of users' names in the "list of users" panel. value: a string of the form "color offset

  6. TRANSFORMATION OF DEVELOPMENTAL NEUROTOXICITY DATA INTO STRUCTURE-SEARCHABLE TOXML DATABASE IN SUPPORT OF STRUCTURE-ACTIVITY RELATIONSHIP (SAR) WORKFLOW.

    EPA Science Inventory

    Early hazard identification of new chemicals is often difficult due to lack of data on the novel material for toxicity endpoints, including neurotoxicity. At present, there are no structure searchable neurotoxicity databases. A working group was formed to construct a database to...

  7. Prognostic and health management of active assets in nuclear power plants

    SciTech Connect

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; Rusaw, Richard; Bickford, Randall

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.

  8. Prognostic and health management of active assets in nuclear power plants

    DOE PAGES

    Agarwal, Vivek; Lybeck, Nancy; Pham, Binh T.; ...

    2015-06-04

    This study presents the development of diagnostic and prognostic capabilities for active assets in nuclear power plants (NPPs). The research was performed under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program. Idaho National Laboratory researched, developed, implemented, and demonstrated diagnostic and prognostic models for generator step-up transformers (GSUs). The Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software developed by the Electric Power Research Institute was used to perform diagnosis and prognosis. As part of the research activity, Idaho National Laboratory implemented 22 GSU diagnostic models in the Asset Fault Signature Database and two well-established GSU prognostic models for the paper winding insulation in the Remaining Useful Life Database of the FW-PHM Suite. The implemented models, along with a simulated fault data stream, were used to evaluate the diagnostic and prognostic capabilities of the FW-PHM Suite. Knowledge of the operating condition of plant assets gained from diagnosis and prognosis is critical for the safe, productive, and economical long-term operation of the current fleet of NPPs. This research addresses some of the gaps in the current state of technology development and enables effective application of diagnostics and prognostics to nuclear plant assets.
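
    The prognostic side of the workflow described above can be illustrated, in very reduced form, by fitting a trend to a degradation indicator and extrapolating to an end-of-life threshold. The sketch below is a generic textbook illustration with synthetic numbers, not the GSU paper-winding-insulation models held in the FW-PHM Remaining Useful Life Database.

```python
# Generic remaining-useful-life (RUL) sketch: fit a linear trend to a synthetic
# degradation indicator and extrapolate to a failure threshold. A textbook
# illustration only, not the FW-PHM Suite's prognostic models.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(0, 12)                             # hypothetical inspection times
indicator = 1.0 - 0.05 * years + rng.normal(0, 0.01, years.size)  # e.g. normalized insulation condition
THRESHOLD = 0.4                                      # hypothetical end-of-life criterion

slope, intercept = np.polyfit(years, indicator, 1)   # linear degradation trend
t_fail = (THRESHOLD - intercept) / slope             # time when the trend crosses the threshold
print(f"estimated remaining useful life: {t_fail - years[-1]:.1f} years")
```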

  9. A Software for managing afterhours activities in research user facilities

    DOE PAGES

    Camino, Fernando E.

    2016-10-13

    Here, we present an afterhours activity management program for shared facilities, which handles the processes required for afterhours access (request, approval, extension, etc.). It implements the concept of permitted afterhours activities, which consists of a list of well-defined activities that each user can perform afterhours. The program provides an easy and unambiguous way for users to know which activities they are allowed to perform afterhours. In addition, the program can enhance its safety efficacy by interacting with lab and instrument access control systems commonly present in user facilities.

  10. Genome databases

    SciTech Connect

    Courteau, J.

    1991-10-11

    Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  11. Efficient Management of Complex Striped Files in Active Storage

    SciTech Connect

    Piernas Canovas, Juan; Nieplocha, Jaroslaw

    2008-08-25

    Active Storage provides an opportunity for reducing the bandwidth requirements between the storage and compute elements of current supercomputing systems, and leveraging the processing power of the storage nodes used by some modern file systems. To achieve both objectives, Active Storage allows certain processing tasks to be performed directly on the storage nodes, near the data they manage. However, Active Storage must also support key requirements of scientific applications. In particular, Active Storage must be able to support striped files and files with complex formats (e.g., netCDF). In this paper, we describe how these important requirements can be addressed. The experimental results on a Lustre file system not only show that our proposal can reduce the network traffic to near zero and scale the performance with the number of storage nodes, but also that it provides an efficient treatment of striped files and can manage files with complex data structures.
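
    The core idea above, performing the reduction on the node that already holds each stripe so that only small per-stripe results cross the network, can be sketched as a map/reduce over stripe offsets. In the toy below, local worker processes stand in for storage nodes; the stripe size, file, and reduction are all hypothetical.

```python
# Conceptual sketch of the Active Storage idea: each "storage node" (here a
# local worker process) reduces its own stripe of a large file, and only the
# small per-stripe results cross the "network" back to the caller.
from multiprocessing import Pool

STRIPE_SIZE = 1 << 20   # 1 MiB stripes, a stand-in for file-system striping

def reduce_stripe(args):
    path, offset = args
    with open(path, "rb") as f:          # runs where the stripe is stored
        f.seek(offset)
        data = f.read(STRIPE_SIZE)
    return len(data), sum(data)          # tiny result instead of raw bytes

def active_sum(path, size):
    offsets = [(path, off) for off in range(0, size, STRIPE_SIZE)]
    with Pool() as pool:
        partial = pool.map(reduce_stripe, offsets)
    nbytes = sum(n for n, _ in partial)
    total = sum(s for _, s in partial)
    return nbytes, total

if __name__ == "__main__":
    import os
    path = "example.bin"
    with open(path, "wb") as f:
        f.write(os.urandom(3 * STRIPE_SIZE + 123))
    print(active_sum(path, os.path.getsize(path)))
```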

  12. Moodog: Tracking Student Activity in Online Course Management Systems

    ERIC Educational Resources Information Center

    Zhang, Hangjin; Almeroth, Kevin

    2010-01-01

    Many universities are currently using Course Management Systems (CMSes) to conduct online learning, for example, by distributing course materials or submitting homework assignments. However, most CMSes do not include comprehensive activity tracking and analysis capabilities. This paper describes a method to track students' online learning…

  13. Centers Made Simple: A Management and Activity Guide

    ERIC Educational Resources Information Center

    Reynolds, Laureen

    2005-01-01

    Drawing from her own experiences, teacher and author Laureen Reynolds shares management strategies, helpful suggestions, and quick tips to take the mystery out of setting up centers and demonstrates how to make the most of valuable classroom time. The activities outlined in this book are designed to appeal to students of all abilities, and range…

  14. Draft position paper on knowledge management in space activities

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne; Moura, Denis

    2003-01-01

    As other fields of industry, space activities are facing the challenge of Knowledge Management and the International Academy of Astronautics decided to settle in 2002 a Study Group to analyse the problem and issue general guidelines. This communication presents the draft position paper of this group in view to be discussed during the 2003 IAF Congress.

  15. Management of Hypertension: Adapting New Guidelines for Active Patients.

    ERIC Educational Resources Information Center

    Tanji, Jeffrey L.; Batt, Mark E.

    1995-01-01

    Discusses recent guidelines on hypertension from the National Heart, Lung, and Blood Institute and details the latest management protocols for patients with high blood pressure. The article helps physicians interpret the guidelines for treating active patients, highlighting diagnosis, step care revision, pharmacology, and sports participation…

  16. ePlantLIBRA: A composition and biological activity database for bioactive compounds in plant food supplements.

    PubMed

    Plumb, J; Lyons, J; Nørby, K; Thomas, M; Nørby, E; Poms, R; Bucchini, L; Restani, P; Kiely, M; Finglas, P

    2016-02-15

    The newly developed ePlantLIBRA database is a comprehensive and searchable database, with up-to-date coherent and validated scientific information on plant food supplement (PFS) bioactive compounds, with putative health benefits as well as adverse effects, and contaminants and residues. It is the only web-based database available compiling peer reviewed publications and case studies on PFS. A user-friendly, efficient and flexible interface has been developed for searching, extracting, and exporting the data, including links to the original references. Data from over 570 publications have been quality evaluated and entered covering 70 PFS or their botanical ingredients.

  17. Assessment of global disease activity in RA patients monitored in the METEOR database: the patient's versus the rheumatologist's opinion.

    PubMed

    Gvozdenović, Emilia; Koevoets, Rosanne; Wolterbeek, Ron; van der Heijde, Désirée; Huizinga, Tom W J; Allaart, Cornelia F; Landewé, Robert B M

    2014-04-01

    The objectives of this study were to compare the patient's (PtGDA) and physician's (PhGDA) assessment of global disease activity and to identify factors that might influence these differences as well as factors that may influence the patient's and the physician's scores separately. Anonymous data were used from 2,117 Dutch patients included in the Measurement of efficacy of Treatment in the Era of Rheumatology database. PtGDA and PhGDA were scored independently on a 100-mm visual analog scale (VAS) with 0 and 100 as extremes. Agreement, expressed as the intraclass correlation coefficient (ICC), was calculated and a Bland-Altman plot was created to visualize the differences between PtGDA and PhGDA. Linear mixed model analysis was used to model PtGDA and PhGDA. Logistic repeated measurements were used to model the difference in PtGDA and PhGDA (PtGDA > PhGDA versus PtGDA ≤ PhGDA). Patient gender, physician gender, age, swollen joint count (SJC), tender joint count, VAS pain, disease duration, and erythrocyte sedimentation rate (ESR) were considered as possible determinants in both models. Mean (standard deviation) age was 57 (15) years and 67% of the patients were female. Agreement between PtGDA and PhGDA was moderate (ICC, 0.57). Patients scored on average 11 units higher (worse) than rheumatologists (95% limits of agreement, -25.2 to 47.6). The patient's perception of pain (VAS) was positively associated with a PtGDA being higher than PhGDA. Similarly, ESR and swollen joint counts were positively associated with a PtGDA being lower or equal to the PhGDA. Patients rate global disease activity consistently higher than their rheumatologists. Patients base their judgment primarily on the level of pain, physicians on the level of SJC and ESR.
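
    A short sketch of the Bland-Altman agreement calculation referred to above, using made-up PtGDA/PhGDA pairs in place of the METEOR data; the 95% limits of agreement are the mean difference ± 1.96 standard deviations of the differences.

        import numpy as np

        # Hypothetical paired 0-100 VAS scores; the METEOR data themselves are not reproduced here.
        ptgda = np.array([65, 40, 72, 55, 30, 80, 45])   # patient's global disease activity
        phgda = np.array([50, 35, 60, 40, 28, 62, 38])   # physician's global disease activity

        diff = ptgda - phgda
        mean_diff = diff.mean()            # systematic difference (patients tend to score higher)
        sd_diff = diff.std(ddof=1)
        loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)   # 95% limits of agreement

        print(f"mean difference: {mean_diff:.1f} VAS units")
        print(f"95% limits of agreement: {loa[0]:.1f} to {loa[1]:.1f}")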

  18. Use of relational database management system by clinicians to create automated MICU progress note from existent data sources.

    PubMed Central

    Delaney, D. P.; Zibrak, J. D.; Samore, M.; Peterson, M.

    1997-01-01

    We designed and built an application called MD Assist that compiles data from several hospital databases to create reports used for daily house officer rounding in the medical intensive care unit (MICU). After rounding, the report becomes the objective portion of the daily "SOAP" MICU progress note. All data used in the automated note was available in digital format residing in an institution wide Sybase data repository which had been built to fulfill data needs of the parent enterprise. From initial design of target output through actual creation and implementation in the MICU, MD Assist was created by physicians with only consultative help from information systems (IS). This project demonstrated a method for rapidly developing time saving, clinically useful applications using a comprehensive clinical data repository. PMID:9357578
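
    A minimal sketch of the general approach (combining several clinical source tables into one rounding report). The table and column names are invented for illustration, and SQLite stands in for the institution-wide Sybase repository named in the abstract.

        import sqlite3

        # Hypothetical schema standing in for the institution-wide data repository.
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE labs   (mrn TEXT, name TEXT, value REAL, drawn_at TEXT);
            CREATE TABLE vitals (mrn TEXT, name TEXT, value REAL, taken_at TEXT);
            INSERT INTO labs   VALUES ('001', 'creatinine', 1.4,  '2024-01-01 04:00'),
                                      ('001', 'wbc',        12.3, '2024-01-01 04:00');
            INSERT INTO vitals VALUES ('001', 'heart_rate', 96,   '2024-01-01 06:00'),
                                      ('001', 'map',        72,   '2024-01-01 06:00');
        """)

        # Compile one "objective" section per patient by combining the source tables.
        rows = conn.execute("""
            SELECT mrn, name, value FROM labs
            UNION ALL
            SELECT mrn, name, value FROM vitals
            ORDER BY mrn, name
        """).fetchall()

        for mrn, name, value in rows:
            print(f"MRN {mrn}  {name:<12} {value}")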

  19. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  20. Active surveillance for the management of localized prostate cancer: Guideline recommendations

    PubMed Central

    Morash, Chris; Tey, Rovena; Agbassi, Chika; Klotz, Laurence; McGowan, Tom; Srigley, John; Evans, Andrew

    2015-01-01

    Introduction: The objective is to provide guidance on the role of active surveillance (AS) as a management strategy for low-risk prostate cancer patients and to ensure that AS is offered to appropriate patients assessed by a standardized protocol. Prostate cancer is often a slowly progressive or sometimes non-progressive indolent disease diagnosed at an early stage with localized tumours that are unlikely to cause morbidity or death. Standard active treatments for prostate cancer include radiotherapy (RT) or radical prostatectomy (RP), but the harms from overdiagnosis and overtreatment are of significant concern. AS is increasingly being considered as a management strategy to avoid or delay the potential harms caused by unnecessary radical treatment. Methods: A literature search of MEDLINE, EMBASE, the Cochrane library, guideline databases and relevant meeting proceedings was performed and a systematic review of identified evidence was synthesized to make recommendations relating to the role of AS in the management of localized prostate cancer. Results: No existing guidelines or reviews were suitable for use in the synthesis of evidence for the recommendations, but 59 reports of primary studies were identified. Due to studies being either non-comparative or heterogeneous, pooled meta-analyses were not conducted. Conclusion: The working group concluded that for patients with low-risk (Gleason score ≤6) localized prostate cancer, AS is the preferred disease management strategy. Active treatment (RP or RT) is appropriate for patients with intermediate-risk (Gleason score 7) localized prostate cancer. For select patients with low-volume Gleason 3+4=7 localized prostate cancer, AS can be considered. PMID:26225165

  1. Redis database administration tool

    SciTech Connect

    Martinez, J. J.

    2013-02-13

    MyRedis is a product of the Lorenz subproject under the ASC Scientific Data Management effort. MyRedis is a web-based utility designed to allow easy administration of instances of Redis databases. It can be used to view and manipulate data as well as run commands directly against a variety of different Redis hosts.
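
    A small sketch of the kind of operation such an administration utility performs, using the redis-py client; the connection details are placeholders and a reachable Redis server is assumed. MyRedis itself is a web application whose internals are not described in the abstract.

        import redis

        # Placeholder connection details; an admin tool would let the user pick the host.
        r = redis.Redis(host="localhost", port=6379, db=0)

        # Inspect the instance and view/manipulate data, as an admin UI would.
        info = r.info()
        print(info["redis_version"], info["used_memory_human"])
        print("keys:", r.dbsize())

        r.set("example:key", "value")        # manipulate data
        print(r.get("example:key"))          # b'value'
        for key in r.scan_iter(match="example:*"):
            print(key)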

  2. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  3. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
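
    A brief sketch of querying an Impala-backed table from Python with the impyla DB-API client; the host, port, and table below are placeholders rather than the CERN accelerator-log schema.

        from impala.dbapi import connect

        # Placeholder connection details and table name; real deployments differ.
        conn = connect(host="impala-coordinator.example.org", port=21050)
        cur = conn.cursor()

        # A typical scan-heavy analytics query over a large logging table.
        cur.execute("""
            SELECT variable_name, COUNT(*) AS n, AVG(value) AS mean_value
            FROM accelerator_log
            WHERE log_time >= '2015-01-01'
            GROUP BY variable_name
            ORDER BY n DESC
            LIMIT 10
        """)
        for row in cur.fetchall():
            print(row)
        cur.close()
        conn.close()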

  4. CDC's Emergency Management Program activities - worldwide, 2003-2012.

    PubMed

    2013-09-06

    In 2003, recognizing the increasing frequency and complexity of disease outbreaks and disasters and a greater risk for terrorism, CDC established the Emergency Operations Center (EOC), bringing together CDC staff members who respond to public health emergencies to enhance communication and coordination. To complement the physical EOC environment, CDC implemented the Incident Management System (IMS), a staffing structure and set of standard operational protocols and services to support and monitor CDC program-led responses to complex public health emergencies. The EOC and IMS are key components of CDC's Emergency Management Program (EMP), which applies emergency management principles to public health practice. To enumerate activities conducted by the EMP during 2003-2012, CDC analyzed data from daily reports and activity logs. The results of this analysis determined that, during 2003-2012, the EMP fully activated the EOC and IMS on 55 occasions to support responses to infectious disease outbreaks, natural disasters, national security events (e.g., conventions, presidential addresses, and international summits), mass gatherings (e.g., large sports and social events), and man-made disasters. On 109 other occasions, the EMP was used to support emergency responses that did not require full EOC activation, and the EMP also conducted 30 exercises and drills. This report provides an overview of those 194 EMP activities.

  5. Management and climate contributions to satellite-derived active fire trends in the contiguous United States.

    PubMed

    Lin, Hsiao-Wen; McCarty, Jessica L; Wang, Dongdong; Rogers, Brendan M; Morton, Douglas C; Collatz, G James; Jin, Yufang; Randerson, James T

    2014-04-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001-2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001-2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems.
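
    A small sketch of how a percent-per-year trend like those quoted above can be estimated from annual detection counts: fit a line to the log of the counts and convert the slope. The counts below are invented and are not the MODIS record.

        import numpy as np

        years = np.arange(2001, 2011)
        # Hypothetical annual cropland active-fire counts (not the MODIS data).
        counts = np.array([900, 950, 930, 1010, 1040, 1080, 1100, 1150, 1170, 1220])

        # Slope of log(counts) vs. year gives a multiplicative (percent-per-year) trend.
        slope, intercept = np.polyfit(years, np.log(counts), 1)
        trend_pct_per_year = (np.exp(slope) - 1) * 100
        print(f"trend: {trend_pct_per_year:+.1f}% per year")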

  6. Management and climate contributions to satellite-derived active fire trends in the contiguous United States

    PubMed Central

    Lin, Hsiao-Wen; McCarty, Jessica L; Wang, Dongdong; Rogers, Brendan M; Morton, Douglas C; Collatz, G James; Jin, Yufang; Randerson, James T

    2014-01-01

    Fires in croplands, plantations, and rangelands contribute significantly to fire emissions in the United States, yet are often overshadowed by wildland fires in efforts to develop inventories or estimate responses to climate change. Here we quantified decadal trends, interannual variability, and seasonality of Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations of active fires (thermal anomalies) as a function of management type in the contiguous U.S. during 2001–2010. We used the Monitoring Trends in Burn Severity database to identify active fires within the perimeter of large wildland fires and land cover maps to identify active fires in croplands. A third class of fires defined as prescribed/other included all residual satellite active fire detections. Large wildland fires were the most variable of all three fire types and had no significant annual trend in the contiguous U.S. during 2001–2010. Active fires in croplands, in contrast, increased at a rate of 3.4% per year. Cropland and prescribed/other fire types combined were responsible for 77% of the total active fire detections within the U.S. and were most abundant in the south and southeast. In the west, cropland active fires decreased at a rate of 5.9% per year, likely in response to intensive air quality policies. Potential evaporation was a dominant regulator of the interannual variability of large wildland fires, but had a weaker influence on the other two fire types. Our analysis suggests it may be possible to modify landscape fire emissions within the U.S. by influencing the way fires are used in managed ecosystems. Key Points: wildland, cropland, and prescribed fires had different trends and patterns; sensitivity to climate varied with fire type; and the intensity of air quality regulation influenced cropland burning trends. PMID:26213662

  7. SECONDARY WASTE MANAGEMENT STRATEGY FOR EARLY LOW ACTIVITY WASTE TREATMENT

    SciTech Connect

    TW, CRAWFORD

    2008-07-17

    This study evaluates parameters relevant to River Protection Project secondary waste streams generated during Early Low Activity Waste operations and recommends a strategy for secondary waste management that considers groundwater impact, cost, and programmatic risk. The recommended strategy for managing River Protection Project secondary waste is focused on improvements in the Effluent Treatment Facility. Baseline plans to build a Solidification Treatment Unit adjacent to the Effluent Treatment Facility should be enhanced to improve solid waste performance and mitigate corrosion of tanks and piping supporting the Effluent Treatment Facility evaporator. This approach provides a life-cycle benefit to solid waste performance and reduction of groundwater contaminants.

  8. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.

    2012-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940's and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL) where non-proprietary BAID data can be freely downloaded. BAID has been used to: Optimize research site choice; Reduce duplication of science effort; Discover complementary and potentially detrimental research activities in an area of scientific interest; Re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; Exchange knowledge across disciplines and generations; Facilitate communication between western science and traditional ecological knowledge; Provide local residents access to science data that facilitates adaptation to arctic change; and Educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1. Collecting data on research

  9. Experiment Databases

    NASA Astrophysics Data System (ADS)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Next to running machine learning algorithms based on inductive queries, much can be learned by immediately querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queriable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
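
    A toy sketch, using SQLite, of the kind of queriable store of experiment meta-data argued for above; the schema and records are invented for illustration.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE experiment
                        (algorithm TEXT, dataset TEXT, params TEXT, accuracy REAL)""")
        conn.executemany("INSERT INTO experiment VALUES (?, ?, ?, ?)", [
            ("random_forest", "iris",  "trees=100", 0.95),
            ("svm",           "iris",  "C=1.0",     0.96),
            ("random_forest", "adult", "trees=100", 0.86),
            ("svm",           "adult", "C=1.0",     0.84),
        ])

        # Query the combined results of many prior runs instead of re-running them.
        # (SQLite returns the row on which MAX(accuracy) occurs for the bare column.)
        best = conn.execute("""
            SELECT dataset, algorithm, MAX(accuracy)
            FROM experiment
            GROUP BY dataset
        """).fetchall()
        print(best)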

  10. Drinking Water Database

    NASA Technical Reports Server (NTRS)

    Murray, ShaTerea R.

    2004-01-01

    This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team, or CS&AT. This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project. This project is carried out on Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what, if anything, they would change. After collecting this data we both had to take some courses in Microsoft Access in order to fix the problems. Next we began looking at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.

  11. 3D geo-database research: Retrospective and future directions

    NASA Astrophysics Data System (ADS)

    Breunig, Martin; Zlatanova, Sisi

    2011-07-01

    3D geo-database research is a promising field to support challenging applications such as 3D urban planning, environmental monitoring, infrastructure management, and early warning or disaster management and response. In these fields, interdisciplinary research in GIScience and related fields is needed to support the modelling, analysis, management, and integration of large geo-referenced data sets, which describe human activities and geophysical phenomena. Geo-databases may serve as platforms to integrate 2D maps, 3D geo-scientific models, and other geo-referenced data. However, current geo-databases do not provide sufficient 3D data modelling and data handling techniques. New 3D geo-databases are needed to handle surface and volume models. This article first presents a 25-year retrospective of geo-database research. Data modelling, standards, and indexing of geo-data are discussed in detail. New directions for the development of 3D geo-databases to open new fields for interdisciplinary research are addressed. Two scenarios in the fields of early warning and emergency response demonstrate the combined management of human and geophysical phenomena. The article concludes with a critical outlook on open research problems.

  12. 17 CFR 240.3b-15 - Definition of ancillary portfolio management securities activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... portfolio management securities activities. 240.3b-15 Section 240.3b-15 Commodity and Securities Exchanges... ancillary portfolio management securities activities. (a) The term ancillary portfolio management securities... of incidental trading activities for portfolio management purposes; and (3) Are limited to...

  13. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    SciTech Connect

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W.; Senkpeil, Ryan R.; Tlatov, Andrey G.; Nagovitsyn, Yury A.; Pevtsov, Alexei A.; Chapman, Gary A.; Cookson, Angela M.; Yeates, Anthony R.; Watson, Fraser T.; Balmaceda, Laura A.; DeLuca, Edward E.; Martens, Petrus C. H.

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
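
    A rough sketch of fitting the two components with scipy, assuming a synthetic flux sample and the 10^21/10^22 Mx split described above; the databases and proportionality constants from the study are not reproduced, and fluxes are expressed in units of 10^20 Mx to keep the fits numerically well behaved.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Synthetic fluxes in units of 10^20 Mx: a small-flux and a large-flux population.
        small = stats.weibull_min.rvs(0.7, scale=2.0, size=4000, random_state=rng)   # ~2e20 Mx
        large = stats.lognorm.rvs(0.9, scale=300.0, size=1000, random_state=rng)     # ~3e22 Mx
        fluxes = np.concatenate([small, large])

        # Split roughly as in the composite distribution: Weibull below 10^21 Mx (10 units),
        # log-normal above 10^22 Mx (100 units); the transition region is ignored here.
        low, high = fluxes[fluxes < 10.0], fluxes[fluxes > 100.0]

        c, _, scale_w = stats.weibull_min.fit(low, floc=0)
        s, _, scale_ln = stats.lognorm.fit(high, floc=0)
        print(f"Weibull shape={c:.2f}, scale={scale_w:.2f} (x1e20 Mx)")
        print(f"log-normal sigma={s:.2f}, scale={scale_ln:.1f} (x1e20 Mx)")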

  14. Solubility Database

    National Institute of Standards and Technology Data Gateway

    SRD 106 IUPAC-NIST Solubility Database (Web, free access)   These solubilities are compiled from 18 volumes of the International Union for Pure and Applied Chemistry (IUPAC)-NIST Solubility Data Series. The database includes liquid-liquid, solid-liquid, and gas-liquid systems. Typical solvents and solutes include water, seawater, heavy water, inorganic compounds, and a variety of organic compounds such as hydrocarbons, halogenated hydrocarbons, alcohols, acids, esters and nitrogen compounds. There are over 67,500 solubility measurements and over 1800 references.

  15. United States-Russia: Environmental management activities, Summer 1998

    SciTech Connect

    1998-09-01

    A Joint Coordinating Committee for Environmental Restoration and Waste Management (JCCEM) was formed between the US and Russia. This report describes the areas of research being studied under JCCEM, namely: Efficient separations; Contaminant transport and site characterization; Mixed wastes; High level waste tank remediation; Transuranic stabilization; Decontamination and decommissioning; and Emergency response. Other sections describe: Administrative framework for cooperation; Scientist exchange; Future actions; Non-JCCEM DOE-Russian activities; and JCCEM publications.

  16. Briefing book on environmental and waste management activities

    SciTech Connect

    Quayle, T.A.

    1993-04-01

    The purpose of the Briefing Book is to provide current information on Environmental Restoration and Waste Management Activities at the Hanford Site. Each edition updates the information in the previous edition by deleting those sections determined not to be of current interest and adding new topics to keep up to date with the changing requirements and issues. This edition covers the period from October 15, 1992 through April 15, 1993.

  17. DESIGN AND PERFORMANCE OF A XENOBIOTIC METABOLISM DATABASE MANAGER FOR METABOLIC SIMULATOR ENHANCEMENT AND CHEMICAL RISK ANALYSIS

    EPA Science Inventory

    A major uncertainty that has long been recognized in evaluating chemical toxicity is accounting for metabolic activation of chemicals resulting in increased toxicity. In silico approaches to predict chemical metabolism and to subsequently screen and prioritize chemicals for risk ...

  18. Weight Management for Athletes and Active Individuals: A Brief Review.

    PubMed

    Manore, Melinda M

    2015-11-01

    Weight management for athletes and active individuals is unique because of their high daily energy expenditure; thus, the emphasis is usually placed on changing the diet side of the energy balance equation. When dieting for weight loss, active individuals also want to preserve lean tissue, which means that energy restriction cannot be too severe or lean tissue is lost. First, this brief review addresses the issues of weight management in athletes and active individuals and factors to consider when determining a weight-loss goal. Second, the concept of dynamic energy balance is reviewed, including two mathematical models developed to improve weight-loss predictions based on changes in diet and exercise. These models are now available on the Internet. Finally, dietary strategies for weight loss/maintenance that can be successfully used with active individuals are given. Emphasis is placed on teaching the benefits of consuming a low-energy-density (low-ED) diet (e.g., high-fiber, high-water, low-fat foods), which allows for the consumption of a greater volume of food to increase satiety while reducing energy intake. Health professionals and sport dietitians need to understand dynamic energy balance and be prepared with effective and evidence-based dietary approaches to help athletes and active individuals achieve their body-weight goals.

  19. JDD, Inc. Database

    NASA Technical Reports Server (NTRS)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas, and the NASA John H. Glenn Research Center at Lewis Field in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and the contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues involving job safety (Environmental Protection Agency issues, workers' compensation, and safety and health training). My summer assignment was not considered "groundbreaking research" like that of many other summer interns in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to the numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (training field index, and the employees who were present at these training courses and who were absent). Once I completed this phase of the database, I decided to expand the database and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day-to-day operations and been adding the

  20. MSDsite: a database search and retrieval system for the analysis and viewing of bound ligands and active sites.

    PubMed

    Golovin, Adel; Dimitropoulos, Dimitris; Oldfield, Tom; Rachedi, Abdelkrim; Henrick, Kim

    2005-01-01

    The three-dimensional environments of ligand binding sites have been derived from the parsing and loading of the PDB entries into a relational database. For each bound molecule the biological assembly of the quaternary structure has been used to determine all contact residues and a fast interactive search and retrieval system has been developed. Prosite pattern and short sequence search options are available together with a novel graphical query generator for inter-residue contacts. The database and its query interface are accessible from the Internet through a web server located at: http://www.ebi.ac.uk/msd-srv/msdsite.

  1. Nuclear Science References Database

    SciTech Connect

    Pritychenko, B.; Běták, E.; Singh, B.; Totans, J.

    2014-06-15

    The Nuclear Science References (NSR) database together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr)

  2. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  3. BAID: The Barrow Area Information Database - An Interactive Web Mapping Portal and Cyberinfrastructure Showcasing Scientific Activities in the Vicinity of Barrow, Arctic Alaska.

    NASA Astrophysics Data System (ADS)

    Escarzaga, S. M.; Cody, R. P.; Kassin, A.; Barba, M.; Gaylord, A. G.; Manley, W. F.; Mazza Ramsay, F. D.; Vargas, S. A., Jr.; Tarin, G.; Laney, C. M.; Villarreal, S.; Aiken, Q.; Collins, J. A.; Green, E.; Nelson, L.; Tweedie, C. E.

    2015-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940's and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Additionally, data are described with metadata that meet Federal Geographic Data Committee standards. Recent advances include the addition of more than 2000 new research sites, the addition of a query builder user interface allowing rich and complex queries, and provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists. Recent field surveys include over 80 miles of coastline to document rates of erosion and the collection of high-resolution sonar data for bathymetric mapping of Elson Lagoon and the near-shore region of the Chukchi Sea. A network of five climate stations has been deployed across the peninsula to serve as a wireless net for the research community and to deliver near real time climatic data to the user community. Local GIS personnel have also been trained to make better use of scientific data for local decision making. Links to Barrow area datasets are housed at national data archives and substantial upgrades have

  4. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    ERIC Educational Resources Information Center

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  5. MPD3: a useful medicinal plants database for drug designing.

    PubMed

    Mumtaz, Arooj; Ashfaq, Usman Ali; Ul Qamar, Muhammad Tahir; Anwar, Farooq; Gulzar, Faisal; Ali, Muhammad Amjad; Saari, Nazamid; Pervez, Muhammad Tariq

    2017-06-01

    Medicinal plants are the main natural pools for the discovery and development of new drugs. In the modern era of computer-aided drug designing (CADD), there is a need for prompt efforts to design and construct a useful database management system that allows proper data storage, retrieval and management through a user-friendly interface. An inclusive database having information about the classification, activity and ready-to-dock library of medicinal plants' phytochemicals is therefore required to assist researchers in the field of CADD. The present work was designed to merge activities of phytochemicals from medicinal plants, their targets and literature references into a single comprehensive database named Medicinal Plants Database for Drug Designing (MPD3). The newly designed online and downloadable MPD3 contains information about more than 5000 phytochemicals from around 1000 medicinal plants with 80 different activities, more than 900 literature references and 200 plus targets. The designed database is deemed to be very useful for researchers engaged in medicinal plants research, CADD and drug discovery/development, with ease of operation and increased efficiency. The designed MPD3 is a comprehensive database which provides most of the information related to medicinal plants on a single platform. MPD3 is freely available at: http://bioinform.info.

  6. MAKER2: an annotation pipeline and genome-database management tool for second-generation genome projects

    PubMed Central

    2011-01-01

    Background Second-generation sequencing technologies are precipitating major shifts with regards to what kinds of genomes are being sequenced and how they are annotated. While the first generation of genome projects focused on well-studied model organisms, many of today's projects involve exotic organisms whose genomes are largely terra incognita. This complicates their annotation, because unlike first-generation projects, there are no pre-existing 'gold-standard' gene-models with which to train gene-finders. Improvements in genome assembly and the wide availability of mRNA-seq data are also creating opportunities to update and re-annotate previously published genome annotations. Today's genome projects are thus in need of new genome annotation tools that can meet the challenges and opportunities presented by second-generation sequencing technologies. Results We present MAKER2, a genome annotation and data management tool designed for second-generation genome projects. MAKER2 is a multi-threaded, parallelized application that can process second-generation datasets of virtually any size. We show that MAKER2 can produce accurate annotations for novel genomes where training-data are limited, of low quality or even non-existent. MAKER2 also provides an easy means to use mRNA-seq data to improve annotation quality; and it can use these data to update legacy annotations, significantly improving their quality. We also show that MAKER2 can evaluate the quality of genome annotations, and identify and prioritize problematic annotations for manual review. Conclusions MAKER2 is the first annotation engine specifically designed for second-generation genome projects. MAKER2 scales to datasets of any size, requires little in the way of training data, and can use mRNA-seq data to improve annotation quality. It can also update and manage legacy genome annotation datasets. PMID:22192575

  7. Comparative analysis of benign prostatic hyperplasia management by urologists and nonurologists: A Korean nationwide health insurance database study

    PubMed Central

    Park, Juhyun; Lee, Young Ju; Lee, Jeong Woo; Yoo, Tag Keun; Chung, Jae Il; Yun, Seok-Joong; Hong, Jun Hyuk; Seo, Seong Il; Cho, Sung Yong

    2015-01-01

    Purpose To compare the current management of benign prostatic hyperplasia (BPH) by urologists and nonurologists by use of Korean nationwide health insurance data. Materials and Methods We obtained patient data from the national health insurance system. New patients diagnosed with BPH in 2009 were divided into two groups depending on whether they were diagnosed by a urologist (U group) or by a nonurologist (NU group). Results A total of 390,767 individuals were newly diagnosed with BPH in 2009. Of these, 240,907 patients (61.7%) were in the U group and 149,860 patients (38.3%) were in the NU group. The rate of all initial evaluation tests, except serum creatinine, was significantly lower in the NU group. The initial prescription rate was higher in the U group, whereas the prescription period was longer in the NU group. Regarding the initial drugs prescribed, the use of alpha-blockers was common in both groups. However, the U group was prescribed combination therapy of an alpha-blocker and 5-alpha-reductase inhibitor as the second choice, whereas the NU group received monotherapy with a 5-alpha-reductase inhibitor. During the 1-year follow-up, the incidence of surgery was significantly different between the U group and the NU group. Conclusions There are distinct differences in the diagnosis and treatment of BPH by urologists and nonurologists in Korea. These differences may have adverse consequences for BPH patients. Urological societies should take a leadership role in the management of BPH and play an educational role for nonurologists as well as urologists. PMID:25763128

  8. Automating The Work at The Skin and Allergy Private Clinic : A Case Study on Using an Imaging Database to Manage Patients Records

    NASA Astrophysics Data System (ADS)

    Alghalayini, Mohammad Abdulrahman

    Today, many institutions and organizations face a serious problem due to the tremendously increasing volume of documents, which in turn creates storage and retrieval problems as space and efficiency requirements continue to grow. This problem becomes more complex with time and with the growth in the size and number of documents in an organization; there is therefore a growing worldwide demand to address it. This demand and challenge can be met by converting the large volume of paper documents to images, using a process that enables specialized document-imaging staff to select the most suitable image type and scanning resolution when documents need to be stored as images. This document management process, if applied, addresses the problem of image storage type and size to some extent. In this paper, we present a case study of an applied process that manages the registration of new patients in a private clinic and optimizes the follow-up of registered patients once their information records are stored in an imaging database system; through this automation approach, we optimize the work process and maximize the efficiency of the Skin and Allergy Clinic's tasks.

  9. Carbon sink activity and GHG budget of managed European grasslands

    NASA Astrophysics Data System (ADS)

    Klumpp, Katja; Herfurth, Damien; Soussana, Jean-Francois; Fluxnet Grassland Pi's, European

    2013-04-01

    In agriculture, a large proportion (89%) of the greenhouse gas (GHG) emission saving potential may be achieved by means of soil C sequestration. Recent demonstrations of the carbon sink activity of European ecosystems, however, have often questioned the existence of C-storing grasslands: although a net sink of C was observed, the uncertainty surrounding this estimate was larger than the sink itself (Janssens et al., 2003; Schulze et al., 2009). Then again, some of these estimates were based on a small number of measurements and on models. Not surprisingly, there is still a paucity of studies demonstrating the existence of grassland systems where C sequestration exceeds (in CO2 equivalents) the methane emissions from the enteric fermentation of ruminants and the nitrous oxide emissions from managed soils. Grasslands are heavily relied upon for food and forage production. A key component of the carbon sink activity in grasslands is thus the impact of changes in management practices, or the effects of past and recent management such as intensification, as well as of climate and its variation. We analysed data (i.e. flux, ecological, management and soil organic carbon) from a network of 36 European grassland flux observation sites. These sites covered different types and intensities of management, and offered the opportunity to understand grassland carbon cycling and trade-offs between C sinks and CH4 and N2O emissions. For some sites, assessments of carbon sink activity were compared using two methods: repeated soil inventory, and determination of the ecosystem C budget by continuous measurement of CO2 exchange in combination with quantification of other C imports and exports (net C storage, NCS). In general, grasslands were a potential sink of C, with 60±12 g C/m2.yr (median; min -456; max 645). Grazed sites had a higher NCS than cut sites (median 99 vs 67 g C/m2.yr), while permanent grassland sites tended to have a lower NCS than temporary sown grasslands (median 64 vs

  10. Expenditures associated with dose titration at initiation of therapy in patients with major depressive disorder: a retrospective analysis of a large managed care claims database.

    PubMed

    Camacho, Fabian; Kong, Meg C; Sheehan, David V; Balkrishnan, Rajesh

    2010-08-01

    OBJECTIVE: Although selective serotonin reuptake inhibitors (SSRIs) are considered cost-effective medications for patients with major depressive disorder (MDD), significant dosage adjustments are often necessary when treatment is initiated. Our study was conducted to examine whether dose titration for SSRIs at initiation of therapy was associated with a greater use of health care resources and higher costs. STUDY DESIGN: A retrospective database analysis was conducted. METHODS: A nationally representative cohort of individuals with MDD was identified in a large managed care claims database between January 1, 2004, and December 31, 2006. A study-specific titration algorithm was used to identify patients who underwent dose titration, compared with those who did not, within the first eight weeks of initiating SSRI therapy. We calculated propensity scores and identified a 1:1 matched cohort of titration versus non-titration patients. We used univariate and multivariate statistical tests to compare the mean number of therapeutic days, health care service utilization, and expenditures between the two groups during the first eight weeks (56 days) of treatment and six months (180 days) after treatment began. RESULTS: Over the first eight weeks, the titration cohort had a 32% decrease in the adjusted mean number of therapeutic days (38 vs. 56, respectively; P < 0.001), a 50% increase in depression-related outpatient visits (1.8 vs. 1.2; P < 0.001), a 38% increase in depression-related outpatient costs ($137 vs. $81; P ≤ 0.001), an increase in antidepressant pharmacy costs ($139 vs. $61; P < 0.001), and a 64% increase in psychiatric visits (0.69 vs. 0.42; P = 0.001), compared with the matched non-titration cohort. These differences were consistent among individual SSRI groups as well as during the six-month period. CONCLUSION: Patients undergoing dose titration of SSRIs at the beginning of therapy consumed more medical resources and spent more days receiving a
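
    A compact sketch of the propensity-score step described above: a logistic regression of titration status on covariates, followed by 1:1 nearest-neighbour matching on the estimated scores (with replacement, for brevity). The data are simulated and the covariates are placeholders for the claims variables.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(0)
        n = 500
        X = rng.normal(size=(n, 3))    # placeholder covariates (age, comorbidity score, ...)
        titrated = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)   # treatment flag

        # Step 1: propensity score = estimated P(titration | covariates).
        ps = LogisticRegression().fit(X, titrated).predict_proba(X)[:, 1]

        # Step 2: 1:1 nearest-neighbour match of each titrated patient to a non-titrated one.
        treated_idx = np.where(titrated == 1)[0]
        control_idx = np.where(titrated == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(ps[control_idx].reshape(-1, 1))
        _, match = nn.kneighbors(ps[treated_idx].reshape(-1, 1))
        matched_controls = control_idx[match.ravel()]

        print(len(treated_idx), "titrated patients matched to",
              len(set(matched_controls)), "distinct controls")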

  11. FishTraits Database

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2009-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Descriptions of attributes are available here. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.

  12. Waste management activities and carbon emissions in Africa

    SciTech Connect

    Couth, R.; Trois, C.

    2011-01-15

    This paper summarizes research into waste management activities and carbon emissions from territories in sub-Saharan Africa with the main objective of quantifying emission reductions (ERs) that can be gained through viable improvements to waste management in Africa. It demonstrates that data on waste and carbon emissions is poor and generally inadequate for prediction models. The paper shows that the amount of waste produced and its composition are linked to national Gross Domestic Product (GDP). Waste production per person is around half that in developed countries with a mean around 230 kg/hd/yr. Sub-Saharan territories produce waste with a biogenic carbon content of around 56% (+/-25%), which is approximately 40% greater than developed countries. This waste is disposed in uncontrolled dumps that produce large amounts of methane gas. Greenhouse gas (GHG) emissions from waste will rise with increasing urbanization and can only be controlled through funding mechanisms from developed countries.

  13. Waste management activities and carbon emissions in Africa.

    PubMed

    Couth, R; Trois, C

    2011-01-01

    This paper summarizes research into waste management activities and carbon emissions from territories in sub-Saharan Africa with the main objective of quantifying emission reductions (ERs) that can be gained through viable improvements to waste management in Africa. It demonstrates that data on waste and carbon emissions is poor and generally inadequate for prediction models. The paper shows that the amount of waste produced and its composition are linked to national Gross Domestic Product (GDP). Waste production per person is around half that in developed countries with a mean around 230 kg/hd/yr. Sub-Saharan territories produce waste with a biogenic carbon content of around 56% (+/-25%), which is approximately 40% greater than developed countries. This waste is disposed in uncontrolled dumps that produce large amounts of methane gas. Greenhouse gas (GHG) emissions from waste will rise with increasing urbanization and can only be controlled through funding mechanisms from developed countries.

  14. 76 FR 23306 - Takes of Marine Mammals Incidental to Specified Activities; Russian River Estuary Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ... of a lagoon outlet channel. The latter activity, an alternative management technique conducted to... ``lagoon management period''). All estuary management activities are conducted by SCWA in accordance with a... habitat for ESA-listed salmonids. During the lagoon management period only, this involves construction...

  15. Pollution effects on fisheries — potential management activities

    NASA Astrophysics Data System (ADS)

    Sindermann, C. J.

    1980-03-01

    Management of ocean pollution must be based on the best available scientific information, with adequate consideration of economic, social, and political realities. Unfortunately, the best available scientific information about pollution effects on fisheries is often fragmentary, and often conjectural; therefore a primary concern of management should be a critical review and assessment of available factual information about effects of pollutants on fish and shellfish stocks. A major problem in any such review and assessment is the separation of pollutant effects from the effects of all the other environmental factors that influence survival and well-being of marine animals. Data from long-term monitoring of resource abundance, and from monitoring of all determinant environmental variables, will be required for analyses that lead to resolution of the problem. Information must also be acquired about fluxes of contaminants through resource-related ecosystems, and about contaminant effects on resource species as demonstrated in field and laboratory experiments. Other possible management activities include: (1) encouragement of continued efforts to document clearly the localized and general effects of pollution on living resources; (2) continued pressure to identify and use reliable biological indicators of environmental degradation (indicators of choice at present are: unusually high levels of genetic and other anomalies in the earliest life history stages; presence of pollution-associated disease signs, particularly fin erosion and ulcers, in fish; and biochemical/physiological changes); and (3) major efforts to reduce inputs of pollutants clearly demonstrated to be harmful to living resources, from point sources as well as ocean dumping. Such pollution management activities, based on continuous efforts in stock assessment, environmental assessment, and experimental studies, can help to insure that rational decisions will be made about uses and abuses of coastal

  16. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Senoo, Tetsuo

    As computer technology, communication technology, and other technologies have progressed, many corporations have come to place the construction and use of their own databases at the center of their information activities and aim to develop those activities in new directions. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes what kinds of in-house databases should be constructed and how they should be used, from the viewpoints of the requirements to be met, the types and forms of information to be handled, indexing, use type and frequency, evaluation methods, and so on. The author outlines an information system of Matsushita called MATIS (Matsushita Technical Information System) as a concrete example, and describes the present status of, and some points to keep in mind when, constructing and using the REP, BOOK and SYMP databases.

  17. Energy management and control of active distribution systems

    NASA Astrophysics Data System (ADS)

    Shariatzadeh, Farshid

    Advancements in communication, control, computation and information technologies have driven the transition to the next generation of active power distribution systems. Novel control techniques and management strategies are required to achieve an efficient, economic and reliable grid. The focus of this work is energy management and control of active distribution systems (ADS) with integrated renewable energy sources (RESs) and demand response (DR). Here, ADS means an automated distribution system with remotely operated controllers and distributed energy resources (DERs). DERs, as active parts of the next-generation distribution system, include distributed generation (DG), RESs, energy storage systems (ESS), plug-in hybrid electric vehicles (PHEVs) and DR. Integration of DR and RESs into ADS is critical to realize the vision of sustainability. The objective of this dissertation is the development of a management architecture to control and operate ADS in the presence of DR and RES. One of the most challenging issues in operating ADS is the inherent uncertainty of DR and RES, as well as the conflicting objectives of DERs and electric utilities. An ADS can consist of different layers, such as a system layer and a building layer, and coordination between these layers is essential. To address these challenges, a multi-layer energy management and control architecture with robust algorithms is proposed in this work. The first layer of the proposed multi-layer architecture is implemented at the system level: a developed AC optimal power flow (AC-OPF) generates a fair price for all DR and non-DR loads, which is used as a control signal for the second layer. The second layer controls DR loads in buildings using a developed look-ahead robust controller. A load aggregator collects information from all buildings and sends the aggregated load to the system optimizer. Because these two management layers operate on different time scales, a time coordination scheme is developed. Robust and deterministic controllers
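
    To make the layered coordination concrete, the sketch below shows a toy version of the interaction the abstract describes: a system-layer optimizer publishes a price signal, building-level controllers shed flexible (DR) load in response, and an aggregator feeds the total back. Every function, parameter and number here is invented for illustration; this is not the dissertation's AC-OPF or look-ahead robust controller.

```python
# Toy two-layer price/demand-response coordination loop (all values hypothetical).

from dataclasses import dataclass

@dataclass
class Building:
    base_load_kw: float       # non-flexible load
    flexible_load_kw: float   # DR-eligible load
    price_threshold: float    # $/kWh above which the building starts shedding load

def system_layer_price(total_load_kw: float, capacity_kw: float) -> float:
    """Stand-in for the system-layer optimizer: price rises as loading nears capacity."""
    utilization = min(total_load_kw / capacity_kw, 0.99)
    return 0.05 / (1.0 - utilization)

def building_layer_load(b: Building, price: float) -> float:
    """Building controller: shed flexible load in proportion to the price excess."""
    if price <= b.price_threshold:
        return b.base_load_kw + b.flexible_load_kw
    shed = min((price - b.price_threshold) / b.price_threshold, 1.0)
    return b.base_load_kw + b.flexible_load_kw * (1.0 - shed)

buildings = [Building(80, 40, 0.12), Building(120, 60, 0.10), Building(60, 30, 0.15)]
capacity_kw = 400.0
load = sum(b.base_load_kw + b.flexible_load_kw for b in buildings)  # aggregator's initial estimate

for _ in range(50):                       # damped fixed-point iteration between the two layers
    price = system_layer_price(load, capacity_kw)
    response = sum(building_layer_load(b, price) for b in buildings)
    load = 0.5 * load + 0.5 * response    # relaxation avoids price/load oscillation

print(f"price ~ {price:.3f} $/kWh, aggregated load ~ {load:.1f} kW")
```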

  18. Volcanic disasters and incidents: A new database

    NASA Astrophysics Data System (ADS)

    Witham, C. S.

    2005-12-01

    A new database on human mortality and morbidity, and civil evacuations arising from volcanic activity is presented. The aim is to quantify the human impacts of volcanic phenomena during the 20th Century. Data include numbers of deaths, injuries, evacuees and people made homeless, and the nature of the associated volcanic phenomena. The database has been compiled from a wide range of sources, and discrepancies between these are indicated where they arise. The quality of the data varies according to the source and the impacts reported. Data for homelessness are particularly poor and effects from ashfall and injuries appear to be under-reported. Of the 491 events included in the database, ~53% resulted in deaths, although the total death toll of 91,724 is dominated by the disasters at Mt Pelée and Nevado del Ruiz. Pyroclastic density currents account for the largest proportion of deaths, and lahars for the most injuries incurred. The Philippines, Indonesia, and Southeast Asia, as a region, were the worst affected, and middle-income countries experienced greater human impacts than low- or high-income countries. Compilation of the database has highlighted a number of problems with the completeness and accuracy of the existing CRED EM-DAT disaster database that includes volcanic events. This database is used by a range of organisations involved with risk management. The new database is intended as a resource for future analysis and will be made available via the Internet. It is hoped that it will be maintained and expanded.

  19. MANAGING ENGINEERING ACTIVITIES FOR THE PLATEAU REMEDIATION CONTRACT - HANFORD

    SciTech Connect

    KRONVALL CM

    2011-01-14

    In 2008, the primary Hanford clean-up contract transitioned to the CH2MHill Plateau Remediation Company (CHPRC). Prior to transition, Engineering resources assigned to remediation/Decontamination and Decommissioning (D&D) activities were a part of a centralized engineering organization and matrixed to the performing projects. Following transition, these resources were reassigned directly to the performing project, with a loose matrix through a smaller Central Engineering (CE) organization. The smaller (10 FTE) central organization has retained responsibility for the overall technical quality of engineering for the CHPRC, but no longer performs staffing and personnel functions. As the organization has matured, there are lessons learned that can be shared with other organizations going through or contemplating performing a similar change. Benefits that have been seen from the CHPRC CE organization structure include the following: (1) Staff are closely aligned with the 'Project/facility' that they are assigned to support; (2) Engineering priorities are managed to be consistent with the 'Project/facility' priorities; (3) Individual Engineering managers are accountable for identifying staffing needs and the filling of staffing positions; (4) Budget priorities are managed within the local organization structure; (5) Rather than being considered a 'functional' organization, engineering is considered a part of a line, direct funded organization; (6) The central engineering organization is able to provide 'overview' activities and maintain independence from the engineering organizations in the field; and (7) The central engineering organization is able to maintain a stable of specialized experts that are able to provide independent reviews of field projects and day-to-day activities.

  20. Data Extraction and Management in Networks of Observational Health Care Databases for Scientific Research: A Comparison of EU-ADR, OMOP, Mini-Sentinel and MATRICE Strategies

    PubMed Central

    Gini, Rosa; Schuemie, Martijn; Brown, Jeffrey; Ryan, Patrick; Vacchi, Edoardo; Coppola, Massimo; Cazzola, Walter; Coloma, Preciosa; Berni, Roberto; Diallo, Gayo; Oliveira, José Luis; Avillach, Paul; Trifirò, Gianluca; Rijnbeek, Peter; Bellentani, Mariadonata; van Der Lei, Johan; Klazinga, Niek; Sturkenboom, Miriam

    2016-01-01

    Introduction: We see increased use of existing observational data in order to achieve fast and transparent production of empirical evidence in health care research. Multiple databases are often used to increase power, to assess rare exposures or outcomes, or to study diverse populations. For privacy and sociological reasons, original data on individual subjects can’t be shared, requiring a distributed network approach where data processing is performed prior to data sharing. Case Descriptions and Variation Among Sites: We created a conceptual framework distinguishing three steps in local data processing: (1) data reorganization into a data structure common across the network; (2) derivation of study variables not present in original data; and (3) application of study design to transform longitudinal data into aggregated data sets for statistical analysis. We applied this framework to four case studies to identify similarities and differences in the United States and Europe: Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR), Observational Medical Outcomes Partnership (OMOP), the Food and Drug Administration’s (FDA’s) Mini-Sentinel, and the Italian network—the Integration of Content Management Information on the Territory of Patients with Complex Diseases or with Chronic Conditions (MATRICE). Findings: National networks (OMOP, Mini-Sentinel, MATRICE) all adopted shared procedures for local data reorganization. The multinational EU-ADR network needed locally defined procedures to reorganize its heterogeneous data into a common structure. Derivation of new data elements was centrally defined in all networks but the procedure was not shared in EU-ADR. Application of study design was a common and shared procedure in all the case studies. Computer procedures were embodied in different programming languages, including SAS, R, SQL, Java, and C++. Conclusion: Using our conceptual framework
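
    The three local processing steps in this framework (reorganize into a structure common across the network, derive study variables, apply the study design so that only aggregates leave the site) can be sketched as a minimal pipeline. The field names, codes and records below are hypothetical and chosen only to illustrate the shape of each step; none of them come from the case-study networks.

```python
# Hypothetical three-step local processing pipeline for a distributed network study.

from collections import Counter
from datetime import date

# Step 1: map a site-specific record layout onto a common data structure.
def reorganize(source_record):
    return {
        "person_id": source_record["pat_ref"],
        "birth_date": date.fromisoformat(source_record["dob"]),
        "drug_code": source_record["atc"],
        "event_code": source_record.get("icd10"),
        "event_date": date.fromisoformat(source_record["visit_date"]),
    }

# Step 2: derive variables not present in the original data (an age band and an
# exposure flag for one illustrative drug class).
def derive(record):
    age = (record["event_date"] - record["birth_date"]).days // 365
    record["age_band"] = "<65" if age < 65 else "65+"
    record["exposed"] = record["drug_code"].startswith("C10AA")  # e.g. statins
    return record

# Step 3: apply the study design, reducing longitudinal records to an aggregate
# table (exposure x age band counts) that can be shared across the network.
def aggregate(records):
    return Counter((r["exposed"], r["age_band"]) for r in records)

site_data = [
    {"pat_ref": "A1", "dob": "1950-02-01", "atc": "C10AA05", "icd10": "I21", "visit_date": "2015-06-01"},
    {"pat_ref": "B2", "dob": "1980-07-15", "atc": "N02BE01", "icd10": None, "visit_date": "2015-06-03"},
]

shared_output = aggregate([derive(reorganize(r)) for r in site_data])
print(shared_output)  # only aggregated counts leave the site, no individual-level data
```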

  1. Active Piezoelectric Structures for Tip Clearance Management Assessed

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Managing blade tip clearance in turbomachinery stages is critical to developing advanced subsonic propulsion systems. Active casing structures with embedded piezoelectric actuators appear to be a promising solution. They can control static and dynamic tip clearance, compensate for uneven deflections, and accomplish electromechanical coupling at the material level. In addition, they have a compact design. To assess the feasibility of this concept and assist the development of these novel structures, the NASA Lewis Research Center developed in-house computational capabilities for composite structures with piezoelectric actuators and sensors, and subsequently used them to simulate candidate active casing structures. The simulations indicated the potential of active casings to modify the blade tip clearance enough to improve stage efficiency. They also provided valuable design information, such as preliminary actuator configurations (number and location) and the corresponding voltage patterns required to compensate for uneven casing deformations. An active ovalization of a casing with four discrete piezoceramic actuators attached on the outer surface is shown. The center figure shows the predicted radial displacements along the hoop direction that are induced when electrostatic voltage is applied at the piezoceramic actuators. This work, which has demonstrated the capabilities of in-house computational models to analyze and design active casing structures, is expected to contribute toward the development of advanced subsonic engines.

  2. Annual Review of Database Developments 1991.

    ERIC Educational Resources Information Center

    Basch, Reva

    1991-01-01

    Review of developments in databases highlights a new emphasis on accessibility. Topics discussed include the internationalization of databases; databases that deal with finance, drugs, and toxic waste; access to public records, both personal and corporate; media online; reducing large files of data to smaller, more manageable files; and…

  3. Research Data Management and Libraries: Relationships, Activities, Drivers and Influences

    PubMed Central

    Pinfield, Stephen; Cox, Andrew M.; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a ‘jurisdictional’ driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. Institutions may usefully benchmark their activities against

  4. Research data management and libraries: relationships, activities, drivers and influences.

    PubMed

    Pinfield, Stephen; Cox, Andrew M; Smith, Jen

    2014-01-01

    The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a 'jurisdictional' driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. Institutions may usefully benchmark their activities against the

  5. Active microbial soil communities in different agricultural managements

    NASA Astrophysics Data System (ADS)

    Landi, S.; Pastorelli, R.

    2009-04-01

    We studied the composition of the active eubacterial microflora by RNA extraction from soil (bulk and rhizosphere) under managements with different environmental impacts, in a hilly basin in Gallura (Sardinia). We contrasted a grassy vineyard, in which the soil had been in continuous contact with plant roots for a long period of time, with a traditionally tilled vineyard. Moreover, we compared permanent grassland, in which plants had been present for some years, with temporary grassland, in which varying plants had been present only during the respective growing seasons. Molecular analysis of the total population was carried out by electrophoretic separation, using Denaturing Gradient Gel Electrophoresis (DGGE), of amplified cDNA fragments obtained from 16S rRNA. In the vineyards, UPGMA (Unweighted Pair Group Method with Arithmetic mean) analysis produced separate clusters depending on soil management. In spring both clusters showed similarity above 70%, while in autumn the similarity increased to 84% and 90% for the grassy and conventionally tilled vineyard respectively. Permanent and temporary grassland joined in a single cluster in spring, while in autumn a partial separation was evident. The grassy vineyard, permanent grassland and temporary grassland showed higher richness and Shannon-Wiener diversity index values than the conventionally tilled vineyard, although the differences were not significant. In conclusion, the expected effect of the rhizosphere was visible: the grass cover positively influenced the diversity of the active microbial population.
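
    The richness and Shannon-Wiener diversity values compared above are simple functions of the band profiles; a minimal calculation, with invented band intensities rather than the authors' data, looks like this:

```python
# Richness and Shannon-Wiener diversity from DGGE band intensities (toy profiles).

import math

def richness(band_intensities):
    """Number of detected bands (operational taxonomic units)."""
    return sum(1 for x in band_intensities if x > 0)

def shannon_wiener(band_intensities):
    """H' = -sum(p_i * ln p_i) over relative band intensities p_i."""
    total = sum(band_intensities)
    props = [x / total for x in band_intensities if x > 0]
    return -sum(p * math.log(p) for p in props)

# Hypothetical band-intensity profiles for two managements.
grassy_vineyard = [12, 8, 5, 5, 3, 2, 2, 1]
tilled_vineyard = [20, 15, 4, 1]

for name, profile in [("grassy", grassy_vineyard), ("tilled", tilled_vineyard)]:
    print(name, "richness:", richness(profile), "H':", round(shannon_wiener(profile), 2))
```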

  6. Cross exploitation of geo-databases and earth observation data for stakes characterization in the framework of multi-risk analysis and management: RASOR examples

    NASA Astrophysics Data System (ADS)

    Tholey, Nadine; Yesou, Herve; Maxant, Jerome; Montabord, Myldred; Studer, Mathias; Faivre, Robin; Rudari, Roberto; de Fraipont, Paul

    2016-04-01

    In the context of risk analysis and management, information is needed on the landscape under investigation, especially for vulnerability assessment, where land use and stakes characterization is of prime importance for the knowledge and description of exposure elements in modelling scenarios. Such thematic information over at-risk areas can be extracted from available global, regional or local open databases (e.g. ESA-Globcover, Natural Earth, Copernicus core services, OSM, …) or derived from the exploitation of EO satellite images at high and very high spatial resolution (e.g. SPOT, soon Sentinel-2, Pleiades, WorldView, …) over territories where this type of information is not available or not sufficiently up to date. However, EO data processing and the derived results also highlight the gap between what would be needed for a complete representation of vulnerability, i.e. a functional description of the land use and a structural description of the buildings including their functional use, and what is reasonably accessible by exploiting EO data, i.e. a biophysical description of the land cover at spatial resolutions from decametric to sub-metric scales, especially for urban-block and building information. The potential and limits of this multi-scale, multi-source geo-information will be illustrated by examples related to different types of landscapes and urban settlements in Asia (Indonesia), Europe (Greece), and the Caribbean (Haiti), exploited within the framework of the RASOR (Rapid Analysis and Spatialisation Of Risk) project (European Commission FP7), which is developing a platform to perform multi-hazard risk analysis to support the full cycle of disaster management.

  7. Management practices that concentrate visitor activities: Camping impact management at Isle Royale National Park, USA

    USGS Publications Warehouse

    Marion, J.L.; Farrell, T.A.

    2002-01-01

    This study assessed campsite conditions and the effectiveness of campsite impact management strategies at Isle Royale National Park, USA. Protocols for assessing indicators of vegetation and soil conditions were developed and applied to 156 campsites and 88 shelters within 36 backcountry campgrounds. The average site was 68 m2 and 83% of sites lost vegetation over areas less than 47 m2. Results reveal that management actions to spatially concentrate camping activities and reduce camping disturbance have been highly successful. Comparisons of disturbed area/overnight stay among other protected areas reinforce this assertion. These reductions in area of camping disturbance are attributed to a designated site camping policy, limitation on site numbers, construction of sites in sloping terrain, use of facilities, and an ongoing program of campsite maintenance. Such actions are most appropriate in higher use backcountry and wilderness settings.

  8. [Development of a magnetic resonance imaging safety management system for metallic biomedical products using a magnetic resonance compatibility database and inquiry-based patient records].

    PubMed

    Fujiwara, Yasuhiro; Kata, Tomomi; Fujimoto, Shinichi; Yachida, Takuya; Kanamoto, Masayuki; Nanbu, Yousuke; Seki, Kouichirou; Kosaka, Nobuyuki; Kimura, Hirohiko; Adachi, Toshiki

    2014-12-01

    Several incidents involving magnetic resonance imaging (MRI) examinations of patients with unchecked MR-unsafe metallic products have been reported. To improve patient safety, we developed a new MRI safety management system for metallic biomedical products and evaluated its efficiency in clinical practice. Our system was integrated into the picture archiving and communication system (PACS) and comprised an MR compatibility database and inquiry-based patient records of internal metallic biomedical products, enabling hospital staff to check MR compatibility by product name. A total of 6,637 biomedical implants and devices were listed in this system, including product names and their MR compatibilities. Furthermore, MRI histories for each patient at our hospital were also recorded. Using this system, it was possible to confirm the MR compatibility of the patients' metallic biomedical products effectively and to reduce the number of unchecked internal products through systematic patient inquiry. In conclusion, our new system enhanced metallic biomedical product checking procedures, and improved patient safety during clinical MRI examinations.
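
    The core of such a system is a join between per-patient records of internal metallic products and a product-level MR-compatibility table. The sketch below uses a hypothetical, greatly simplified schema (not the authors' PACS-integrated system) to show how a pre-scan check by product name might be expressed:

```python
# Simplified MR-compatibility lookup: product table + per-patient implant records.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE product (name TEXT PRIMARY KEY, mr_status TEXT);   -- 'MR Safe' / 'MR Conditional' / 'MR Unsafe'
CREATE TABLE patient_implant (patient_id TEXT, product_name TEXT);
""")
con.executemany("INSERT INTO product VALUES (?, ?)", [
    ("ExampleStent A", "MR Conditional"),
    ("ExampleClip B", "MR Unsafe"),
])
con.executemany("INSERT INTO patient_implant VALUES (?, ?)", [
    ("P001", "ExampleStent A"),
    ("P001", "ExampleClip B"),
])

def pre_scan_check(patient_id):
    """List each recorded implant with its MR status; unknown products are flagged."""
    return con.execute("""
        SELECT i.product_name,
               COALESCE(p.mr_status, 'NOT IN DATABASE - manual check required')
        FROM patient_implant i
        LEFT JOIN product p ON p.name = i.product_name
        WHERE i.patient_id = ?
    """, (patient_id,)).fetchall()

for product_name, status in pre_scan_check("P001"):
    print(product_name, "->", status)
```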

  9. Activation of AMP-activated kinase as a strategy for managing autosomal dominant polycystic kidney disease.

    PubMed

    McCarty, Mark F; Barroso-Aranda, Jorge; Contreras, Francisco

    2009-12-01

    There is evidence that overactivity of both mammalian target of rapamycin (mTOR) and cystic fibrosis transmembrane conductance regulator (CFTR) contributes importantly to the progressive expansion of renal cysts in autosomal dominant polycystic kidney disease (ADPKD). Recent research has established that AMP-activated kinase (AMPK) can suppress the activity of each of these proteins. Clinical AMPK activators such as metformin and berberine may thus have potential in the clinical management of ADPKD. The traditional use of berberine in diarrhea associated with bacterial infections may reflect, in part, the inhibitory impact of AMPK on chloride extrusion by small intestinal enterocytes.

  10. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    USGS Publications Warehouse

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A.F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-01-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species’ phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological “status”, or the ability to track presence–absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resource managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.

  11. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    NASA Astrophysics Data System (ADS)

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A. F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resource managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.

  12. Atomic Databases

    NASA Astrophysics Data System (ADS)

    Mendoza, Claudio

    2000-10-01

    Atomic and molecular data are required in a variety of fields ranging from the traditional astronomy, atmospherics and fusion research to fast growing technologies such as lasers, lighting, low-temperature plasmas, plasma assisted etching and radiotherapy. In this context, there are some research groups, both theoretical and experimental, scattered round the world that attend to most of this data demand, but the implementation of atomic databases has grown independently out of sheer necessity. In some cases the latter has been associated with the data production process or with data centers involved in data collection and evaluation; but sometimes it has been the result of individual initiatives that have been quite successful. In any case, the development and maintenance of atomic databases call for a number of skills and an entrepreneurial spirit that are not usually associated with most physics researchers. In the present report we present some of the highlights in this area in the past five years and discuss what we think are some of the main issues that have to be addressed.

  13. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.
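
    As a minimal illustration of the workflow the article describes (establishing a database, inputting data, and verifying it before analysis), the following sketch uses SQLite with an invented table and checks; it is not taken from the article.

```python
# Toy clinical-research database: establish structure, input data, verify data.

import sqlite3

con = sqlite3.connect(":memory:")

# Establish: define the structure once, with basic integrity constraints.
con.execute("""
CREATE TABLE subject (
    subject_id  TEXT PRIMARY KEY,
    age_years   INTEGER CHECK (age_years BETWEEN 0 AND 120),
    systolic_bp INTEGER
)""")

# Input: load records (a constraint violation would be rejected at entry).
con.executemany("INSERT INTO subject VALUES (?, ?, ?)", [
    ("S01", 54, 132),
    ("S02", 61, None),   # missing value, caught by the check below
])

# Verify: simple completeness and range checks before analysis.
missing = con.execute("SELECT subject_id FROM subject WHERE systolic_bp IS NULL").fetchall()
out_of_range = con.execute("SELECT subject_id FROM subject WHERE systolic_bp NOT BETWEEN 70 AND 250").fetchall()
print("missing systolic_bp:", missing, "| out of range:", out_of_range)
```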

  14. Activities identification for activity-based cost/management applications of the diagnostics outpatient procedures.

    PubMed

    Alrashdan, Abdalla; Momani, Amer; Ababneh, Tamador

    2012-01-01

    One of the most challenging problems facing healthcare providers is determining the actual cost of their procedures, which is important for internal accounting and price justification to insurers. The objective of this paper is to find suitable categories to identify diagnostic outpatient medical procedures and translate them from a functional orientation to a process orientation. A hierarchical task tree is developed based on a classification schema of procedural activities. Each procedure is seen as a process consisting of a number of activities. This provides a powerful foundation for activity-based cost/management implementation and enough information to distinguish value-added from non-value-added activities, which assists in process improvement and may eventually lead to cost reduction. Work measurement techniques are used to determine the standard time of each activity at the lowest level of the task tree. A real case study at a private hospital is presented to demonstrate the proposed methodology.
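
    Once standard times are attached to the activities at the lowest level of the task tree, the cost of a procedure is simply the sum of activity times multiplied by resource cost rates. The sketch below illustrates that arithmetic with entirely made-up activities, times and rates:

```python
# Activity-based costing of one diagnostic procedure (all activities, times and rates invented).

from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    std_time_min: float   # standard time from work measurement
    rate_per_min: float   # cost rate of the resource performing the activity
    value_added: bool

    @property
    def cost(self):
        return self.std_time_min * self.rate_per_min

procedure = [
    Activity("register patient",   4.0, 0.6, True),
    Activity("wait for room",     12.0, 0.0, False),   # non-value-added
    Activity("prepare equipment",  6.0, 0.9, True),
    Activity("perform scan",      15.0, 2.5, True),
    Activity("report findings",   10.0, 3.0, True),
]

total_cost = sum(a.cost for a in procedure)
nva_time = sum(a.std_time_min for a in procedure if not a.value_added)
print(f"procedure cost: {total_cost:.2f}, non-value-added time: {nva_time} min")
```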

  15. The CEBAF Element Database

    SciTech Connect

    Theodore Larrieu, Christopher Slominski, Michele Joyce

    2011-03-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on the fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from the original C++ source code into native libraries for scripting languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous.
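
    The "introspective" schema idea, in which new element types and properties are ordinary rows rather than schema changes, can be illustrated with a simplified entity-attribute-value layout. This sketch is not the CED's actual Oracle schema; the table names, the element name and the values are invented.

```python
# Minimal entity-attribute-value sketch: new types/properties are data, not DDL.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE element_type (type_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE property_def (prop_id INTEGER PRIMARY KEY, type_id INTEGER, name TEXT);
CREATE TABLE element      (elem_id INTEGER PRIMARY KEY, type_id INTEGER, name TEXT);
CREATE TABLE element_prop (elem_id INTEGER, prop_id INTEGER, value TEXT);
""")

# Defining a new element type and its properties is data entry, not a schema change.
con.execute("INSERT INTO element_type VALUES (1, 'Quadrupole')")
con.executemany("INSERT INTO property_def VALUES (?, 1, ?)",
                [(1, 'length_m'), (2, 'field_gradient')])
con.execute("INSERT INTO element VALUES (10, 1, 'EXAMPLE_QUAD_01')")   # hypothetical element name
con.executemany("INSERT INTO element_prop VALUES (10, ?, ?)", [(1, '0.3'), (2, '4.2')])

rows = con.execute("""
    SELECT e.name, p.name, ep.value
    FROM element e
    JOIN element_prop ep ON ep.elem_id = e.elem_id
    JOIN property_def p ON p.prop_id = ep.prop_id
""").fetchall()
print(rows)
```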

  16. Tracing thyroid hormone-disrupting compounds: database compilation and structure-activity evaluation for an effect-directed analysis of sediment.

    PubMed

    Weiss, Jana M; Andersson, Patrik L; Zhang, Jin; Simon, Eszter; Leonards, Pim E G; Hamers, Timo; Lamoree, Marja H

    2015-07-01

    A variety of anthropogenic compounds has been found to be capable of disrupting the endocrine systems of organisms, in laboratory studies as well as in wildlife. The most widely described endpoint is estrogenicity, but other hormonal disturbances, e.g., thyroid hormone disruption, are gaining more and more attention. Here, we present a review and chemical characterization, using principal component analysis, of organic compounds that have been tested for their capacity to bind competitively to the thyroid hormone transport protein transthyretin (TTR). The database contains 250 individual compounds and technical mixtures, of which 144 compounds are defined as TTR binders. Almost one third of these compounds (n = 52) were even more potent than the natural hormone thyroxine (T4). The database was used as a tool to assist in the identification of thyroid hormone-disrupting compounds (THDCs) in an effect-directed analysis (EDA) study of a sediment sample. Two compounds could be confirmed to contribute to the detected TTR-binding potency in the sediment sample, i.e., triclosan and nonylphenol technical mixture. They constituted less than 1% of the TTR-binding potency of the unfractionated extract. The low rate of explained activity may be attributed to the challenges related to identification of unknown contaminants in combination with the limited knowledge about THDCs in general. This study demonstrates the need for databases containing compound-specific toxicological properties. In the framework of EDA, such a database could be used to assist in the identification and confirmation of causative compounds focusing on thyroid hormone disruption.
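
    A minimal example of the kind of chemical-domain characterization mentioned above (principal component analysis over a compound-by-descriptor matrix) is shown below; the descriptor values are invented placeholders, not data from the published database.

```python
# PCA of a compound-by-descriptor matrix to map the chemical domain of a database.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = compounds, columns = physicochemical descriptors (e.g. logKow, MW, H-bond counts)
descriptors = np.array([
    [4.8, 290.4, 2, 0],
    [6.1, 485.7, 0, 4],
    [3.2, 220.1, 1, 1],
    [7.4, 643.0, 0, 6],
    [5.5, 366.5, 1, 2],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(descriptors))
print(scores)  # PC1/PC2 coordinates for each compound
```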

  17. The Student-Designed Database.

    ERIC Educational Resources Information Center

    Thomas, Rick

    1988-01-01

    This discussion of the design of data files for databases to be created by secondary school students uses AppleWorks software as an example. Steps needed to create and use a database are explained, the benefits of group activity are described, and other possible projects are listed. (LRW)

  18. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

    Background Gene expression databases are key resources for microarray data management and analysis, and the importance of a proper annotation of their content is well understood. Public repositories as well as microarray database systems that can be implemented by single laboratories exist. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common, highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic and macrophage cell functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object-based, MIAME-compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamic definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows precise control of the visibility of the database content to different subgroups in the community and facilitates exports of its content to public repositories. It provides an interactive user interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of the genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analysing it. This fills a gap between a local

  19. A family of ring system-based structural fragments for use in structure-activity studies: database mining and recursive partitioning.

    PubMed

    Nilakantan, Ramaswamy; Nunn, David S; Greenblatt, Lynne; Walker, Gary; Haraki, Kevin; Mobilio, Dominick

    2006-01-01

    In earlier work from our laboratory, we have described the use of the ring system and ring scaffold as descriptors. We showed that these descriptors could be used for fast compound clustering, novelty determination, compound acquisition, and combinatorial library design. Here we extend the concept to a whole family of structural descriptors with the ring system as the centerpiece. We show how this simple idea can be used to build powerful search tools for mining chemical databases in useful ways. We have also built recursive partition trees using these fragments as descriptors. We will discuss how these trees can help in analyzing complex structure-activity data.
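
    As a rough illustration of ring-system extraction (the authors' exact fragment and scaffold definitions differ), the sketch below merges rings that share atoms into ring systems and emits each one as a SMILES fragment; it assumes RDKit is available.

```python
# Extract ring systems (fused/spiro rings merged) from a SMILES string using RDKit.

from rdkit import Chem

def ring_systems(smiles):
    mol = Chem.MolFromSmiles(smiles)
    systems = []
    for ring in (set(r) for r in mol.GetRingInfo().AtomRings()):
        overlapping = [s for s in systems if s & ring]          # rings sharing atoms
        merged = ring.union(*overlapping)
        systems = [s for s in systems if not (s & ring)] + [merged]
    return [Chem.MolFragmentToSmiles(mol, atomsToUse=sorted(s)) for s in systems]

print(ring_systems("CC1CCC2=CC=CC=C2C1"))    # one fused bicyclic ring system
print(ring_systems("c1ccccc1CCN1CCOCC1"))    # two separate ring systems
```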

  20. Database Handling Software and Scientific Applications.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Discusses the general characteristics of database management systems and file systems. Also gives a basic framework for evaluating such software and suggests characteristics that should be considered when buying software for specific scientific applications. A list of vendor addresses for popular database management systems is included. (JN)

  1. Active traffic management on road networks: a macroscopic approach.

    PubMed

    Kurzhanskiy, Alex A; Varaiya, Pravin

    2010-10-13

    Active traffic management (ATM) is the ability to dynamically manage recurrent and non-recurrent congestion based on prevailing traffic conditions in order to maximize the effectiveness and efficiency of road networks. It is a continuous process of (i) obtaining and analysing traffic measurement data, (ii) operations planning, i.e. simulating various scenarios and control strategies, (iii) implementing the most promising control strategies in the field, and (iv) maintaining a real-time decision support system that filters current traffic measurements to predict the traffic state in the near future, and to suggest the best available control strategy for the predicted situation. ATM relies on a fast and trusted traffic simulator for the rapid quantitative assessment of a large number of control strategies for the road network under various scenarios, in a matter of minutes. The open-source macrosimulation tool Aurora ROAD NETWORK MODELER is a good candidate for this purpose. The paper describes the underlying dynamical traffic model and what it takes to prepare the model for simulation; covers the traffic performance measures and evaluation of scenarios as part of operations planning; introduces the framework within which the control strategies are modelled and evaluated; and presents the algorithm for real-time traffic state estimation and short-term prediction.
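
    Macroscopic simulators of this kind typically advance cell densities with a first-order traffic flow model. The sketch below is a generic cell-transmission-model step with arbitrary parameters, shown only to illustrate the style of dynamics involved; it is not Aurora ROAD NETWORK MODELER code.

```python
# One-dimensional cell transmission model: advance densities on a chain of road cells.

def ctm_step(densities, dt_h, cell_km, vf=100.0, w=20.0, rho_jam=150.0, q_max=2000.0):
    """Advance vehicle densities (veh/km) by one time step of length dt_h hours."""
    n = len(densities)
    flows = []
    for i in range(n + 1):
        demand = q_max if i == 0 else min(vf * densities[i - 1], q_max)          # upstream sending flow
        supply = q_max if i == n else min(w * (rho_jam - densities[i]), q_max)   # downstream receiving flow
        flows.append(min(demand, supply))                                        # veh/h across boundary i
    return [rho + dt_h / cell_km * (flows[i] - flows[i + 1])
            for i, rho in enumerate(densities)]

rho = [20.0, 30.0, 140.0, 60.0]          # a queue building in the third cell
for _ in range(10):
    rho = ctm_step(rho, dt_h=5 / 3600, cell_km=0.5)   # 5-second step, 500 m cells
print([round(r, 1) for r in rho])
```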

  2. Activating patients with chronic disease for self-management: comparison of self-managing patients with those managing by frequent readmissions to hospital.

    PubMed

    Kirby, Sue E; Dennis, Sarah M; Bazeley, Pat; Harris, Mark F

    2013-01-01

    Understanding the factors that activate people to self-manage chronic disease is important in improving uptake levels. If the many frequent hospital users who present with acute exacerbations of chronic disease were to self-manage at home, some hospital admissions would be avoided. Patient interview and demographic, psychological, clinical and service utilisation data were compared for two groups of patients with chronic disease: those attending self-management services and those who managed by using hospital services. Data were analysed to see whether there were differences that might explain the two different approaches to managing their conditions. The two groups were similar in terms of comorbidity, age, sex, home services, home support and educational level. Self-managing patients were activated by their clinician, accepted their disease, changed their identity, confronted emotions and learnt the skills to self-manage and avoid hospital. Patients who frequently used hospital services to manage their chronic disease were often in denial about their chronic disease, hung on to their identity and expressed little emotional response. However, they reported a stronger sense of coherence and rated their health more highly than self-managing patients. This study shed light on the process of patient activation for self-management. A better understanding of the process of patient activation would encourage clinicians who come into contact with frequently readmitted chronic disease patients to be more proactive in supporting self-management.

  3. Active Management of Flap-Edge Trailing Vortices

    NASA Technical Reports Server (NTRS)

    Greenblatt, David; Yao, Chung-Sheng; Vey, Stefan; Paschereit, Oliver C.; Meyer, Robert

    2008-01-01

    The vortex hazard produced by large airliners and increasingly larger airliners entering service, combined with projected rapid increases in the demand for air transportation, is expected to act as a major impediment to increased air traffic capacity. Significant reduction in the vortex hazard is possible, however, by employing active vortex alleviation techniques that reduce the wake severity by dynamically modifying its vortex characteristics, provided that the techniques do not degrade performance or compromise safety and ride quality. With this as background, a series of experiments was performed, initially at NASA Langley Research Center and subsequently at the Berlin University of Technology in collaboration with the German Aerospace Center. The investigations demonstrated the basic mechanism for managing trailing vortices using retrofitted devices that are decoupled from conventional control surfaces. The basic premise for managing vortices advanced here is rooted in the erstwhile forgotten hypothesis of Albert Betz, as extended and verified ingeniously by Coleman duPont Donaldson and his collaborators. Using these devices, vortices may be perturbed at arbitrarily long wavelengths down to wavelengths less than a typical airliner wingspan, and the oscillatory loads on the wings, and hence the vehicle, are small. Significant flexibility in the specific device has been demonstrated using local passive and active separation control as well as local circulation control via Gurney flaps. The method is now in a position to be tested in a wind tunnel with a longer test section on a scaled airliner configuration. Alternatively, the method can be tested directly in a towing tank, on a model aircraft, a light aircraft or a full-scale airliner. The authors believe that this method will have significant appeal from an industry perspective due to its retrofit potential with little to no impact on cruise (devices tucked away in the cove or retracted); low operating power

  4. Open Geoscience Database

    NASA Astrophysics Data System (ADS)

    Bashev, A.

    2012-04-01

    treatment could be conducted in other programs after extracting the filtered data into a *.csv file. This makes the database understandable for non-experts. The database employs an open data format (*.csv) and widespread tools: PHP as the programming language, MySQL as the database management system, JavaScript for interaction with GoogleMaps, and jQuery UI for the user interface. The database is multilingual: association tables connect translated terms with elements of the database. In total the development required about 150 hours. The database still has several problems. The main problem is the reliability of the data; properly estimating reliability would need an expert system, but elaborating such a system would take more resources than the database itself. The second problem is stream selection - how to select the stations that are connected with each other (for example, belong to one water stream) and indicate their sequence. Currently the interface is available in English and Russian, but it can easily be translated into other languages. Some problems have already been solved. For example, the "same station" problem (sometimes the distance between stations is smaller than the positioning error): when a new station is added to the database, the application automatically looks for existing stations near that place. The problem of object and parameter typing (how to regard "EC" and "electrical conductivity" as the same parameter) has also been solved, using "associative tables". If you would like to see the interface in your language, just contact us and we will send you the list of terms and phrases for translation. The main advantage of the database is that it is totally open: everybody can view and extract the data and use them for non-commercial purposes at no charge. Registered users can contribute to the database without being paid. We hope that it will be widely used, first of all for education purposes, but
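
    The "associative tables" device described above (so that "EC" and "electrical conductivity" resolve to the same parameter) can be sketched as follows; the real system uses PHP and MySQL, and this schema is only an illustration.

```python
# Parameter aliases resolved through an associative table before querying measurements.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parameter (param_id INTEGER PRIMARY KEY, canonical_name TEXT, unit TEXT);
CREATE TABLE parameter_alias (alias TEXT PRIMARY KEY, param_id INTEGER);
CREATE TABLE measurement (station TEXT, param_id INTEGER, value REAL);
""")
con.execute("INSERT INTO parameter VALUES (1, 'electrical conductivity', 'uS/cm')")
con.executemany("INSERT INTO parameter_alias VALUES (?, 1)",
                [("EC",), ("electrical conductivity",), ("электропроводность",)])
con.execute("INSERT INTO measurement VALUES ('station-42', 1, 310.0)")

def measurements_for(alias):
    """Resolve any alias to the canonical parameter before querying."""
    return con.execute("""
        SELECT m.station, p.canonical_name, m.value, p.unit
        FROM parameter_alias a
        JOIN parameter p ON p.param_id = a.param_id
        JOIN measurement m ON m.param_id = p.param_id
        WHERE a.alias = ?
    """, (alias,)).fetchall()

print(measurements_for("EC"))                         # same rows either way
print(measurements_for("electrical conductivity"))
```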

  5. Prevalence, Characteristics, Management, and Outcome of Pulmonary Tuberculosis in HIV-Infected Children in the TREAT Asia Pediatric HIV Observational Database (TApHOD)

    PubMed Central

    Sudjaritruk, Tavitiya; Maleesatharn, Alan; Prasitsuebsai, Wasana; Fong, Siew Moy; Le, Ngoc Oanh; Le, Thanh Thuy Thi; Lumbiganon, Pagakrong; Kumarasamy, Nagalingeswaran; Kurniati, Nia; Hansudewechakul, Rawiwan; Yusoff, Nik Khairulddin Nik; Razali, Kamarul Azahar Mohd; Kariminia, Azar; Sohn, Annette H.

    2013-01-01

    Abstract A multicenter, retrospective, observational study was conducted to determine prevalence, characteristics, management, and outcome of pulmonary tuberculosis (PTB) in Asian HIV-infected children in the TREAT Asia Pediatric HIV Observational Database (TApHOD). Data on PTB episodes diagnosed during the period between 12 months before antiretroviral therapy (ART) initiation and December 31, 2009 were extracted. A total of 2678 HIV-infected children were included in TApHOD over a 13-year period; 457 developed PTB, giving a period prevalence of 17.1% (range 5.7–33.0% per country). There were a total of 484 PTB episodes; 27 children had 2 episodes each. There were 21 deaths (4.3%). One third of episodes (n=175/484) occurred after ART initiation at a median of 14.1 months (interquartile range [IQR] 2.5–28.8 months). The median (IQR) CD4+ values were 9.0% (3.0–16.0%) and 183.5 (37.8–525.0) cells/mm3 when PTB was diagnosed. Most episodes (n=424/436, 97.3%) had abnormal radiographic findings compatible with PTB, whereas half (n=267/484, 55.2%) presented with clinical characteristics of PTB. One third of those tested (n=42/122, 34.4%) had bacteriological evidence of PTB. Of the 156 episodes (32.2%) that were accompanied with extrapulmonary TB, pleuritis was the most common manifestation (81.4%). After treatment completion, most episodes (n=396/484, 81.9%) were recorded as having positive outcomes (cured, treatment completed and child well, and improvement). The prevalence of PTB among Asian HIV-infected children in our cohort was high. Children with persistent immunosuppression remain vulnerable to PTB even after ART initiation. PMID:24206012

  6. Physical activity, genetic, and nutritional considerations in childhood weight management.

    PubMed

    Bar-Or, O; Foreyt, J; Bouchard, C; Brownell, K D; Dietz, W H; Ravussin, E; Salbe, A D; Schwenger, S; St Jeor, S; Torun, B

    1998-01-01

    Almost one-quarter of U.S. children are now obese, a dramatic increase of over 20% in the past decade. It is intriguing that the increase in prevalence has been occurring while overall fat consumption has been declining. Body mass and composition are influenced by genetic factors, but the actual heritability of juvenile obesity is not known. A low physical activity (PA) is characteristic of obese children and adolescents, and it may be one cause of juvenile obesity. There is little evidence, however, that overall energy expenditure is low among the obese. There is a strong association between the prevalence of obesity and the extent of TV viewing. Enhanced PA can reduce body fat and blood pressure and improve lipoprotein profile in obese individuals. Its effect on body composition, however, is slower than with low-calorie diets. The three main dietary approaches are: protein sparing modified fast, balanced hypocaloric diets, and comprehensive behavioral lifestyle programs. To achieve long-standing control of overweight, one should combine changes in eating and activity patterns, using behavior modification techniques. However, the onus is also on society to reduce incentives for a sedentary lifestyle and over-consumption of food. To address the key issues related to childhood weight management, the American College of Sports Medicine convened a Scientific Roundtable in Indianapolis.

  7. 77 FR 43355 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... Office of Natural Resources Revenue Agency Information Collection Activities: Submitted for Office of Management and Budget Review, Comment Request AGENCY: Office of Natural Resources Revenue, Interior. ACTION... information from Indian beneficiaries. ONRR performs the minerals revenue management functions for...

  8. 76 FR 70486 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... Office of Natural Resources Revenue Agency Information Collection Activities: Submitted for Office of Management and Budget Review; Comment Request AGENCY: Office of Natural Resources Revenue, Interior. ACTION... information from Indian beneficiaries. The ONRR performs the minerals revenue management functions for...

  9. 78 FR 10179 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ... CONTACT: Daniel Gittleson, Office of Information Management, Food and Drug Administration, 1350 Piccard Dr... HUMAN SERVICES Food and Drug Administration Agency Information Collection Activities; Submission for Office of Management and Budget Review; Comment Request; Guidance for Industry and Food and...

  10. Bridging the gap between finance and clinical operations with activity-based cost management.

    PubMed

    Storfjell, J L; Jessup, S

    1996-12-01

    Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers can determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change and improvement. The authors describe the ABCM process applied to nursing management situations.

  11. Risk management activities at the DOE Class A reactor facilities

    SciTech Connect

    Sharp, D.A.; Hill, D.J.; Linn, M.A.; Atkinson, S.A.; Hu, J.P.

    1993-01-01

    The probabilistic risk assessment (PRA) and risk management group of the Association for Excellence in Reactor Operation (AERO) develops risk management initiatives and standards to improve operation and increase safety of the DOE Class A reactor facilities. Principal risk management applications that have been implemented at each facility are reviewed. The status of a program to develop guidelines for risk management programs at reactor facilities is presented.

  12. Risk management activities at the DOE Class A reactor facilities

    SciTech Connect

    Sharp, D.A.; Hill, D.J.; Linn, M.A.; Atkinson, S.A.; Hu, J.P.

    1993-12-31

    The probabilistic risk assessment (PRA) and risk management group of the Association for Excellence in Reactor Operation (AERO) develops risk management initiatives and standards to improve operation and increase safety of the DOE Class A reactor facilities. Principal risk management applications that have been implemented at each facility are reviewed. The status of a program to develop guidelines for risk management programs at reactor facilities is presented.

  13. Object-oriented structures supporting remote sensing databases

    NASA Technical Reports Server (NTRS)

    Wichmann, Keith; Cromp, Robert F.

    1995-01-01

    Object-oriented databases show promise for modeling the complex interrelationships pervasive in scientific domains. To examine the utility of this approach, we have developed an Intelligent Information Fusion System based on this technology, and applied it to the problem of managing an active repository of remotely-sensed satellite scenes. The design and implementation of the system is compared and contrasted with conventional relational database techniques, followed by a presentation of the underlying object-oriented data structures used to enable fast indexing into the data holdings.

  14. 78 FR 34669 - Incidental Take Permit and Environmental Assessment for Forest Management Activities, Southern...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... management activities by Potlatch Forest Holdings, Inc. (Applicant) that would take the endangered red... harvesting activities on Potlatch lands in Arkansas; and provisioning and maintenance activities associated... managed under a separate HCP adjacent to Felsenthal NWR, Potlatch lands, and up to eight isolated...

  15. Rule-Based Statistical Calculations on a Database Abstract.

    DTIC Science & Technology

    1983-06-01

    of chapter 6 has been submitted to the Second LBL Workshop on Statistical Database Management, 1983. Thanks to all referees for suggestions regarding...Boral, and David J. DeWitt. A Framework for Research in Database Management for Statistical Analysis. In Proceedings, pages 69-78. ACM SIGMOD...E. Denning. A Security Model for the Statistical Database Problem. In Proceedings, Second LBL Workshop on Statistical Database Management, September

  16. Risk factors associated with the surgical management of craniopharyngiomas in pediatric patients: analysis of 1961 patients from a national registry database.

    PubMed

    Bakhsheshian, Joshua; Jin, Diana L; Chang, Ki-Eun; Strickland, Ben A; Donoho, Dan A; Cen, Steven; Mack, William J; Attenello, Frank; Christian, Eisha A; Zada, Gabriel

    2016-12-01

    OBJECTIVE Patient demographic characteristics, hospital volume, and admission status have been shown to impact surgical outcomes of sellar region tumors in adults; however, the data available following the resection of craniopharyngiomas in the pediatric population remain limited. The authors sought to identify potential risk factors associated with outcomes following surgical management of pediatric craniopharyngiomas. METHODS The Nationwide Inpatient Sample database and Kids' Inpatient Database were analyzed to include admissions for pediatric patients (≤ 18 years) who underwent a transcranial or transsphenoidal craniotomy for resection of a craniopharyngioma. Patient-level factors, including age, race, comorbidities, and insurance type, as well as hospital factors were collected. Outcomes analyzed included mortality rate, endocrine and nonendocrine complications, hospital charges, and length of stay. A multivariate model controlling for variables analyzed was constructed to examine significant independent risk factors. RESULTS Between 2000 and 2011, 1961 pediatric patients were identified who underwent a transcranial (71.2%) or a transsphenoidal (28.8%) craniotomy for resection of a craniopharyngioma. A major predilection for age was observed with the selection of a transcranial (23.4% in < 7-year-olds, 28.1% in 7- to 12-year-olds, and 19.7% in 13- to 18-year-olds) versus transsphenoidal (2.9% in < 7-year-olds, 7.4% in 7- to 12-year-olds, and 18.4% in 13- to 18-year-olds) approach. No significant outcomes were associated with a particular surgical approach, except that 7- to 12-year-old patients had a higher risk of nonendocrine complications (relative risk [RR] 2.42, 95% CI 1.04-5.65, p = 0.04) with the transsphenoidal approach when compared with 13- to 18-year-old patients. The overall inpatient mortality rate was 0.5% and the most common postoperative complication was diabetes insipidus (64.2%). There were no independent factors associated with inpatient
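
    The abstract reports adjusted relative risks from a multivariate model; as background, the sketch below shows only how a crude relative risk and its 95% confidence interval are computed from a 2x2 table, with invented counts that will not reproduce the published estimate.

```python
# Crude relative risk with a 95% confidence interval from a 2x2 table (hypothetical counts).

import math

def relative_risk(a, b, c, d):
    """a,b = events/non-events in the exposed group; c,d = events/non-events in the reference group."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo, hi = (math.exp(math.log(rr) + z * se_log_rr) for z in (-1.96, 1.96))
    return rr, lo, hi

# e.g. complications in one age group vs another, both treated transsphenoidally (counts invented)
rr, lo, hi = relative_risk(a=12, b=48, c=9, d=91)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```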

  17. Moisture Management in an Active Sportswear: Techniques and Evaluation—A Review Article

    NASA Astrophysics Data System (ADS)

    Senthilkumar, Mani; Sampath, M. B.; Ramachandran, T.

    2013-07-01

    Moisture management is an important property of any fabric intended for active sportswear, as it largely determines the comfort of that fabric. Every human being sweats during different kinds of activity, and an important feature of any fabric is how effectively it transports this moisture away from the body so as to keep the wearer comfortable. This paper reviews the concept of moisture management, various production techniques, and the evaluation of moisture management characteristics in fabrics for active sportswear.

  18. Toward Phase IV, Populating the WOVOdat Database

    NASA Astrophysics Data System (ADS)

    Ratdomopurbo, A.; Newhall, C. G.; Schwandner, F. M.; Selva, J.; Ueda, H.

    2009-12-01

    One of the challenges for volcanologists is the fact that more and more people are likely to live on volcanic slopes. Information about volcanic activity during unrest should be accurate and rapidly distributed. As unrest may lead to eruption, evacuation may be necessary to minimize damage and casualties. The decision to evacuate people is usually based on the interpretation of monitoring data. Over the past several decades, volcano monitoring has used increasingly sophisticated instruments, and a huge volume of data is collected in order to understand the state of activity and behaviour of a volcano. WOVOdat, the World Organization of Volcano Observatories (WOVO) Database of Volcanic Unrest, will provide context within which scientists can interpret the state of their own volcano, during and between crises. After a decision during the 2000 IAVCEI General Assembly to create WOVOdat, development has passed through several phases: Concept Development (Phase I, 2000-2002), Database Design (Phase II, 2003-2006) and Pilot Testing (Phase III, 2007-2008). For WOVOdat to become operational, two steps remain: Database Population (Phase IV) and Enhancement and Maintenance (Phase V). Since January 2009, the WOVOdat project has been hosted by the Earth Observatory of Singapore for at least a 5-year period. According to the original plan from 2002, this 5-year period will be used to complete Phase IV. Because the WOVOdat design has not yet been tested for all types of data, 2009 is still reserved for building the back-end relational database management system (RDBMS) of WOVOdat and testing it with more complex data. Fine-tuning of the WOVOdat RDBMS design is being done with each new upload of observatory data. The next and main phase of WOVOdat development will be data population, managing data transfer from multiple observatory formats to the WOVOdat format. Data population will depend on two important things: the availability of SQL database in volcano

  19. The SpaceInn-SISMA Database: Characterization of a Large Sample of Variable and Active Stars by Means of Harps Spectra

    NASA Astrophysics Data System (ADS)

    Rainer, M.; Poretti, E.; Mistò, A.; Panzera, M. R.; Molinaro, M.; Cepparo, F.; Roth, M.; Michel, E.; Monteiro, M. J. P. F. G.

    2016-12-01

    We created a large database of physical parameters and variability indicators by fully reducing and analyzing the large number of spectra taken to complement the asteroseismic observations of the COnvection, ROtation and planetary Transits (CoRoT) satellite. 7103 spectra of 261 stars obtained with the ESO echelle spectrograph HARPS have been stored in the VO-compliant database Spectroscopic Indicators in a SeisMic Archive (SISMA), along with the CoRoT photometric data of the 72 CoRoT asteroseismic targets. The remaining stars belong to the same variable classes as the CoRoT targets and were observed to better characterize the properties of such classes. Several useful variability indicators (mean line profiles, indices of differential rotation, activity, and emission lines), together with v sin i and radial-velocity measurements, have been extracted from the spectra. The atmospheric parameters Teff, log g, and [Fe/H] have been computed following a homogeneous procedure. As a result, we fully characterize a sample of new and known variable stars by computing several spectroscopic indicators, also providing some cases of simultaneous photometry and spectroscopy.

  20. Strategies for Teaching Students to Process Information Using Databases.

    ERIC Educational Resources Information Center

    Rooze, Gene E.

    The database management system, or computerized database, is an important tool for teaching thinking in the social studies. But what the teacher does to teach students about databases, to have them use prepared databases, organize their data, direct their research, and draw conclusions using this teaching device is just as important. The Direct…

  1. A survey of paediatric HIV programmatic and clinical management practices in Asia and sub-Saharan Africa—the International epidemiologic Databases to Evaluate AIDS (IeDEA)

    PubMed Central

    2013-01-01

    Introduction There are limited data on paediatric HIV care and treatment programmes in low-resource settings. Methods A standardized survey was completed by International epidemiologic Databases to Evaluate AIDS paediatric cohort sites in the regions of Asia-Pacific (AP), Central Africa (CA), East Africa (EA), Southern Africa (SA) and West Africa (WA) to understand operational resource availability and paediatric management practices. Data were collected through January 2010 using a secure, web-based software program (REDCap). Results A total of 64,552 children were under care at 63 clinics (AP, N=10; CA, N=4; EA, N=29; SA, N=10; WA, N=10). Most were in urban settings (N=41, 65%) and received funding from governments (N=51, 81%), PEPFAR (N=34, 54%), and/or the Global Fund (N=15, 24%). The majority were combined adult–paediatric clinics (N=36, 57%). Prevention of mother-to-child transmission was integrated at 35 (56%) sites; 89% (N=56) had access to DNA PCR for infant diagnosis. African (N=40/53) but not Asian sites recommended exclusive breastfeeding up until 4–6 months. Regular laboratory monitoring included CD4 (N=60, 95%), and viral load (N=24, 38%). Although 42 (67%) sites had the ability to conduct acid-fast bacilli (AFB) smears, 23 (37%) sites could conduct AFB cultures and 18 (29%) sites could conduct tuberculosis drug susceptibility testing. Loss to follow-up was defined as >3 months of lost contact for 25 (40%) sites, >6 months for 27 sites (43%) and >12 months for 6 sites (10%). Telephone calls (N=52, 83%) and outreach worker home visits to trace children lost to follow-up (N=45, 71%) were common. Conclusions In general, there was a high level of patient and laboratory monitoring within this multiregional paediatric cohort consortium that will facilitate detailed observational research studies. Practices will continue to be monitored as the WHO/UNAIDS Treatment 2.0 framework is implemented. PMID:23336728

  2. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones that are computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the value distributions and selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
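
    A minimal sketch of the query-feedback idea described above, assuming a single numeric range predicate: after each execution the observed selectivity is recorded and a least-squares curve fit refines later estimates. The class name, polynomial degree, and default estimate are hypothetical, and the polynomial fit stands in for the least-squares/spline regression the abstract mentions.

```python
# Hedged sketch of selectivity estimation refined by query feedback.
# All names and parameters here are illustrative assumptions.
import numpy as np


class FeedbackSelectivityEstimator:
    def __init__(self, degree=2):
        self.degree = degree
        self.bounds = []      # predicate constants seen so far
        self.observed = []    # actual fraction of rows returned
        self.coeffs = None

    def record(self, bound, rows_returned, table_rows):
        self.bounds.append(bound)
        self.observed.append(rows_returned / table_rows)
        if len(self.bounds) > self.degree:                # enough points to fit
            self.coeffs = np.polyfit(self.bounds, self.observed, self.degree)

    def estimate(self, bound, default=0.1):
        if self.coeffs is None:
            return default                                # static fallback estimate
        sel = float(np.polyval(self.coeffs, bound))
        return min(max(sel, 0.0), 1.0)                    # clamp to a valid fraction


est = FeedbackSelectivityEstimator()
for b, rows in [(10, 120), (50, 800), (90, 1900), (120, 2400)]:
    est.record(b, rows, table_rows=10_000)
print(round(est.estimate(70), 3))
```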

  3. Data-Based Teacher Development.

    ERIC Educational Resources Information Center

    Borg, Simon

    1998-01-01

    Describes how data from English language teaching (ELT) classroom research can be exploited in teacher development activities. The contribution data-based activities can make to teacher development is outlined, and examples that illustrate the principles underlying their design are presented. A case is made for using such activities to facilitate…

  4. 78 FR 23746 - Takes of Marine Mammals Incidental to Specified Activities; Russian River Estuary Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... breaches, as well as construction and maintenance of a lagoon outlet channel. The latter activity, an... 15 through October 15 (hereafter, the ``lagoon management period''). All estuary management... lagoon management period only, this involves construction and maintenance of a lagoon outlet channel...

  5. 78 FR 66344 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... Agency Information Collection Activities; Submission to the Office of Management and Budget for Review... to their performance in the classroom. The study will examine data from a teacher survey and data..., Information and Records Management Services, Office of Management. BILLING CODE 4000-01-P...

  6. Web-Based Self-Management in Chronic Care: A Study of Change in Patient Activation

    ERIC Educational Resources Information Center

    Solomon, Michael R.

    2010-01-01

    Web-based self-management interventions (W-SMIs) are designed to help a large number of chronically ill people become more actively engaged in their health care. Despite the potential to engage more patients in self-managing their health, the use of W-SMIs by patients and their clinicians is low. Using a self-management conceptual model based on…

  7. 78 FR 63973 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... Agency Information Collection Activities; Submission to the Office of Management and Budget for Review and Approval; Comment Request; Student Assistance General Provisions--Subpart K--Cash Management...--Subpart K--Cash Management OMB Control Number: 1845-0038 Type of Review: Revision of an...

  8. 78 FR 64206 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... Agency Information Collection Activities; Submission to the Office of Management and Budget for Review and Approval; Comment Request; Student Assistance General Provisions--Subpart K--Cash Management...--Subpart K--Cash Management. OMB Control Number: 1845-0049. Type of Review: Revision of an...

  9. 78 FR 61346 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-03

    ... Agency Information Collection Activities; Submission to the Office of Management and Budget for Review and Approval; Comment Request; Student Assistance General Provisions--Subpart K--Cash Management...--Subpart K--Cash Management. OMB Control Number: 1845-0106. Type of Review: Extension without change of...

  10. Spatial Databases

    DTIC Science & Technology

    2007-09-19

    astronomy, human anatomy, fluid flow or an electromagnetic field); • biometrics (fingerprints, palm measurements, facial patterns); • engineering...framework. Pr(X | l_i, L_i) can be estimated using kernel functions from the observed values in the training dataset. A more detailed theoretical and...transportation, oil/gas pipelines, and utilities (e.g. water, electricity, telephone). Thus, activity reports, e.g. crime/insurgency reports, may often use

  11. The Impact of Environment and Occupation on the Health and Safety of Active Duty Air Force Members - Database Development

    DTIC Science & Technology

    2014-04-01

    abuse, and physical altercations), high-risk sexual behavior (e.g., unprotected sexual intercourse), and physical health issues, such as high blood... sexual activity questions (section 12 - reproductive) PHA Sexually transmitted disease Sponsor ID (SSN), dependent status, DOB, sponsor pay grade...based on encounters and responses. The requested Health Assessment data consisted of specific information relating to tobacco, alcohol, and sexual

  12. JICST Factual Database JICST DNA Database

    NASA Astrophysics Data System (ADS)

    Shirokizawa, Yoshiko; Abe, Atsushi

    Japan Information Center of Science and Technology (JICST) started the on-line service of its DNA database in October 1988. This database is composed of the EMBL Nucleotide Sequence Library and the Genetic Sequence Data Bank. The authors outline the database system, data items, and search commands. Examples of retrieval sessions are presented.

  13. JOSHUA: Symmetric Active/Active Replication for Highly Available HPC Job and Resource Management

    SciTech Connect

    Uhlemann, Kai; Engelmann, Christian; Scott, Steven L

    2006-01-01

    Most of today's HPC systems employ a single head node for control, which represents a single point of failure as it interrupts an entire HPC system upon failure. Furthermore, it is also a single point of control as it disables an entire HPC system until repair. One of the most important HPC system services running on the head node is job and resource management. If it goes down, all currently running jobs lose the service they report back to and have to be restarted once the head node is up and running again. In this paper, we present a generic approach for providing symmetric active/active replication for highly available HPC job and resource management. The JOSHUA solution provides a virtually synchronous environment for continuous availability without any interruption of service and without any loss of state. Replication is performed externally via the PBS service interface without the need to modify any service code. Test results, as well as a reliability analysis of our proof-of-concept prototype implementation, show that JOSHUA can provide continuous availability with an acceptable performance trade-off.
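
    As a toy illustration of the symmetric active/active idea (not JOSHUA's code, and with no PBS interaction), the sketch below applies one totally ordered stream of job commands to every head-node replica, so all replicas hold identical state and any one of them can fail without loss of service state.

```python
# Toy sketch of symmetric active/active replication of a job queue.
# The replica class, command format, and broadcast stand-in are assumptions.
from typing import Dict, List, Tuple


class HeadNodeReplica:
    def __init__(self, name: str):
        self.name = name
        self.queue: Dict[int, str] = {}   # job id -> job script

    def apply(self, command: Tuple[str, int, str]) -> None:
        op, job_id, payload = command
        if op == "submit":
            self.queue[job_id] = payload
        elif op == "cancel":
            self.queue.pop(job_id, None)


def broadcast(replicas: List[HeadNodeReplica],
              ordered_commands: List[Tuple[str, int, str]]) -> None:
    # Stand-in for a group-communication total-order broadcast.
    for cmd in ordered_commands:
        for replica in replicas:
            replica.apply(cmd)


replicas = [HeadNodeReplica("head-a"), HeadNodeReplica("head-b")]
broadcast(replicas, [("submit", 1, "run_sim.sh"), ("submit", 2, "post.sh"),
                     ("cancel", 1, "")])
assert replicas[0].queue == replicas[1].queue   # replicas stay in lock-step
print(replicas[0].queue)
```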

  14. Chronic pain management in the active-duty military

    NASA Astrophysics Data System (ADS)

    Jamison, David; Cohen, Steven P.

    2012-06-01

    As in the general population, chronic pain is a prevalent and burdensome affliction in active-duty military personnel. Painful conditions in military members can be categorized broadly in terms of whether they arise directly from combat injuries (gunshot, fragmentation wound, blast impact) or whether they result from non-combat injuries (sprains, herniated discs, motor vehicle accidents). Both combat-related and non-combat-related causes of pain can further be classified as either acute or chronic. Here we discuss the state of pain management as it relates to the military population in both deployed and non-deployed settings. The term non-battle injury (NBI) is commonly used to refer to those conditions not directly associated with the combat actions of war. In the history of warfare, NBI have far outstripped battle-related injuries in terms not only of morbidity, but also mortality. It was not until improvements in health care and field medicine were applied in World War I that battle-related deaths finally outnumbered those attributed to disease and pestilence. However, NBI have been the leading cause of morbidity and hospital admission in every major conflict since the Korean War. Pain remains a leading cause of presentation to military medical facilities, both in and out of theater. The absence of pain services is associated with a low return-to-duty rate among the deployed population. The most common pain complaints involve the low-back and neck, and studies have suggested that earlier treatment is associated with more significant improvement and a higher return to duty rate. It is recognized that military medicine is often at the forefront of medical innovation, and that many fields of medicine have reaped benefit from the conduct of war.

  15. An Evaluation of Database Solutions to Spatial Object Association

    SciTech Connect

    Kumar, V S; Kurc, T; Saltz, J; Abdulla, G M; Kohn, S; Matarazzo, C

    2008-06-24

    Object association is a common problem encountered in many applications. Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two datasets based on their positions in a common spatial coordinate system; one of the datasets may correspond to a catalog of objects observed over time in a multi-dimensional domain, while the other dataset may consist of objects observed in a snapshot of the domain at a single time point. Using database management systems to solve the object association problem provides portability across different platforms as well as greater flexibility. Increasing dataset sizes in today's applications, however, have made object association a data/compute-intensive problem that requires targeted optimizations for efficient execution. In this work, we investigate how database-based crossmatch algorithms can be deployed on different database system architectures and evaluate the deployments to understand the impact of architectural choices on crossmatch performance and the associated trade-offs. We investigate the execution of two crossmatch algorithms on (1) a parallel database system with active-disk style processing capabilities, (2) a high-throughput network database (MySQL Cluster), and (3) shared-nothing databases with replication. We have conducted our study in the context of a large-scale astronomy application with real use-case scenarios.
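
    To make the crossmatch formulation concrete, here is a hedged, self-contained sketch that expresses a positional match within a tolerance as a relational join; the table layout, tolerance value, and in-memory SQLite engine are assumptions for illustration and not the architectures evaluated in the paper.

```python
# Hedged illustration of a database crossmatch: match catalog and snapshot
# objects whose positions agree within a tolerance, via a relational join.
import sqlite3

TOL = 0.001  # matching tolerance in the shared coordinate system (assumed)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE catalog  (obj_id INTEGER, x REAL, y REAL);
    CREATE TABLE snapshot (obj_id INTEGER, x REAL, y REAL);
""")
conn.executemany("INSERT INTO catalog VALUES (?,?,?)",
                 [(1, 10.0000, 20.0000), (2, 30.5000, 40.2000)])
conn.executemany("INSERT INTO snapshot VALUES (?,?,?)",
                 [(101, 10.0004, 19.9998), (102, 55.0000, 60.0000)])

matches = conn.execute("""
    SELECT c.obj_id, s.obj_id
    FROM catalog c JOIN snapshot s
      ON s.x BETWEEN c.x - :tol AND c.x + :tol
     AND s.y BETWEEN c.y - :tol AND c.y + :tol
""", {"tol": TOL}).fetchall()

print(matches)   # expected: [(1, 101)]
```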

  16. FLOPROS: an evolving global database of flood protection standards

    NASA Astrophysics Data System (ADS)

    Scussolini, Paolo; Aerts, Jeroen C. J. H.; Jongman, Brenden; Bouwer, Laurens M.; Winsemius, Hessel C.; de Moel, Hans; Ward, Philip J.

    2016-05-01

    With projected changes in climate, population and socioeconomic activity located in flood-prone areas, the global assessment of flood risk is essential to inform climate change policy and disaster risk management. Whilst global flood risk models exist for this purpose, the accuracy of their results is greatly limited by the lack of information on the current standard of protection to floods, with studies either neglecting this aspect or resorting to crude assumptions. Here we present a first global database of FLOod PROtection Standards, FLOPROS, which comprises information in the form of the flood return period associated with protection measures, at different spatial scales. FLOPROS comprises three layers of information, and combines them into one consistent database. The design layer contains empirical information about the actual standard of existing protection already in place; the policy layer contains information on protection standards from policy regulations; and the model layer uses a validated modelling approach to calculate protection standards. The policy layer and the model layer can be considered adequate proxies for actual protection standards included in the design layer, and serve to increase the spatial coverage of the database. Based on this first version of FLOPROS, we suggest a number of strategies to further extend and increase the resolution of the database. Moreover, as the database is intended to be continually updated, while flood protection standards are changing with new interventions, FLOPROS requires input from the flood risk community. We therefore invite researchers and practitioners to contribute information to this evolving database by corresponding to the authors.
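
    As a rough sketch of how the three FLOPROS layers might be merged into a single protection standard per region, the snippet below prefers empirical design data and falls back to the proxy layers. The precedence between the policy and model layers, the function name, and all example figures are assumptions for illustration, not values from the database.

```python
# Hedged sketch of merging the design, policy, and model layers.
# Precedence design > policy > model and all figures are assumptions.
from typing import Dict, Optional


def merged_protection(region: str,
                      design: Dict[str, float],
                      policy: Dict[str, float],
                      model: Dict[str, float]) -> Optional[float]:
    """Return a flood-protection return period (years) for a region,
    preferring empirical design data and falling back to proxy layers."""
    for layer in (design, policy, model):
        if region in layer:
            return layer[region]
    return None   # no information in any layer


design_layer = {"NL-Delta": 10_000.0}       # invented example values
policy_layer = {"UK-Thames": 1_000.0}
model_layer = {"Generic-Basin": 100.0}

for r in ("NL-Delta", "UK-Thames", "Generic-Basin", "Unknown"):
    print(r, merged_protection(r, design_layer, policy_layer, model_layer))
```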

  17. FLOPROS: an evolving global database of flood protection standards

    NASA Astrophysics Data System (ADS)

    Scussolini, P.; Aerts, J. C. J. H.; Jongman, B.; Bouwer, L. M.; Winsemius, H. C.; de Moel, H.; Ward, P. J.

    2015-12-01

    With the projected changes in climate, population and socioeconomic activity located in flood-prone areas, the global assessment of the flood risk is essential to inform climate change policy and disaster risk management. Whilst global flood risk models exist for this purpose, the accuracy of their results is greatly limited by the lack of information on the current standard of protection to floods, with studies either neglecting this aspect or resorting to crude assumptions. Here we present a first global database of FLOod PROtection Standards, FLOPROS, which comprises information in the form of the flood return period associated with protection measures, at different spatial scales. FLOPROS comprises three layers of information, and combines them into one consistent database. The Design layer contains empirical information about the actual standard of existing protection already in place, while the Policy layer and the Model layer are proxies for such protection standards, and serve to increase the spatial coverage of the database. The Policy layer contains information on protection standards from policy regulations; and the Model layer uses a validated modeling approach to calculate protection standards. Based on this first version of FLOPROS, we suggest a number of strategies to further extend and increase the resolution of the database. Moreover, as the database is intended to be continually updated, while flood protection standards are changing with new interventions, FLOPROS requires input from the flood risk community. We therefore invite researchers and practitioners to contribute information to this evolving database by corresponding to the authors.

  18. Databases of the marine metagenomics.

    PubMed

    Mineta, Katsuhiko; Gojobori, Takashi

    2016-02-01

    The metagenomic data obtained from marine environments are highly useful for understanding marine microbial communities. In comparison with the conventional amplicon-based approach to metagenomics, the recent shotgun sequencing-based approach has become a powerful tool that provides an efficient way of grasping the diversity of an entire microbial community at a sampling point in the sea. However, this approach accelerates the accumulation of metagenome data as well as the increase in data complexity. Moreover, when the metagenomic approach is used to monitor temporal changes in marine environments at multiple seawater locations, metagenomic data will accumulate at an enormous speed. Because this situation has started to become a reality at many marine research institutions and stations all over the world, data management and analysis will clearly be confronted by so-called Big Data issues, such as how the database can be constructed in an efficient way and how useful knowledge should be extracted from a vast amount of data. In this review, we summarize all the major databases of marine metagenomes that are currently publicly available, noting that no database is devoted exclusively to marine metagenomes and that only six metagenome databases include marine metagenome data, an unexpectedly small number. We also extend our explanation to what we call reference databases, which will be useful for constructing a marine metagenome database as well as complementing it with important information. Finally, we point out a number of challenges to be conquered in constructing the marine metagenome database.

  19. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but also that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies, and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is examined in detail, and the current national and international efforts in the area are studied. An overview of the research work in the area is also given. Finally, the document presents in detail the most complete set of security guidelines, to our knowledge, for the development and operation of medical database systems.

  20. Prototyping a genetics deductive database

    SciTech Connect

    Hearne, C.; Cui, Zhan; Parsons, S.; Hajnal, S.

    1994-12-31

    We are developing a laboratory notebook system known as the Genetics Deductive Database. Currently our prototype provides storage for biological facts and rules with flexible access via an interactive graphical display. We have introduced a formal basis for the representation and reasoning necessary to order genome map data and handle the uncertainty inherent in biological data. We aim to support laboratory activities by introducing an experiment planner into our prototype. The Genetics Deductive Database is built using new database technology which provides an object-oriented conceptual model, a declarative rule language, and a procedural update language. This combination of features allows the implementation of consistency maintenance, automated reasoning, and data verification.
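
    As a toy illustration of the deductive-database idea (stored facts plus declarative rules evaluated to a fixed point), the sketch below forward-chains one invented linkage rule over invented genetic facts; it is not the prototype's rule language or data model.

```python
# Toy forward-chaining sketch of a deductive database over genetic facts.
# Predicates, facts, and the rule are invented purely for illustration.
from typing import Callable, List, Set, Tuple

Fact = Tuple[str, str, str]   # (predicate, subject, object)

facts: Set[Fact] = {
    ("maps_to", "markerA", "chr3"),
    ("linked", "markerB", "markerA"),
}


def linkage_rule(known: Set[Fact]) -> Set[Fact]:
    """If X is linked to Y and Y maps to a chromosome, infer X maps there too."""
    inferred: Set[Fact] = set()
    for pred1, x, y in known:
        if pred1 != "linked":
            continue
        for pred2, subj, chrom in known:
            if pred2 == "maps_to" and subj == y:
                inferred.add(("maps_to", x, chrom))
    return inferred


def forward_chain(known: Set[Fact],
                  rules: List[Callable[[Set[Fact]], Set[Fact]]]) -> Set[Fact]:
    changed = True
    while changed:                       # iterate until a fixed point is reached
        changed = False
        for rule in rules:
            new = rule(known) - known
            if new:
                known |= new
                changed = True
    return known


print(sorted(forward_chain(set(facts), [linkage_rule])))
```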