Sample records for commercial database systems

  1. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  2. Supporting Social Data Observatory with Customizable Index Structures on HBase - Architecture and Performance

    DTIC Science & Technology

    2013-01-01

    ...commercial NoSQL database system. The results show that IndexedHBase provides a data loading speed that is 6 times faster than Riak, and is... compare it with Riak, a widely adopted commercial NoSQL database system... events. This chapter describes our research towards building an efficient and scalable storage platform for Truthy. Many existing NoSQL databases...

  3. SU-E-T-255: Development of a Michigan Quality Assurance (MQA) Database for Clinical Machine Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, D

    Purpose: A unified database system was developed to allow accumulation, review and analysis of quality assurance (QA) data for measurement, treatment, imaging and simulation equipment in our department. Recording these data in a database allows a unified and structured approach to review and analysis of data gathered using commercial database tools. Methods: A clinical database was developed to track records of quality assurance operations on linear accelerators, a computed tomography (CT) scanner, a high dose rate (HDR) afterloader and imaging systems such as on-board imaging (OBI) and Calypso in our department. The database was developed using a Microsoft Access database and the Visual Basic for Applications (VBA) programming interface. Separate modules were written for accumulation, review and analysis of daily, monthly and annual QA data. All modules were designed to use structured query language (SQL) as the basis of data accumulation and review. The SQL strings are dynamically re-written at run time. The database also features embedded documentation, storage of documents produced during QA activities and the ability to annotate all data within the database. Tests are defined in a set of tables that specify test type, specific value, and schedule. Results: Daily, monthly and annual QA data have been taken in parallel with established procedures to test MQA. The database has been used to aggregate data across machines to examine the consistency of machine parameters and operations within the clinic for several months. Conclusion: The MQA application has been developed as an interface to a commercially available SQL engine (JET 5.0) and a standard database back-end. The MQA system has been used for several months for routine data collection. The system is robust, relatively simple to extend and can be migrated to a commercial SQL server.
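
    A minimal sketch of the run-time SQL rewriting described above, assuming a hypothetical qa_results schema and using Python's sqlite3 as a stand-in for the Access/JET engine the abstract names:

        import sqlite3

        # Illustrative stand-in for the MQA back-end: schema and values are invented.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE qa_results (machine TEXT, test_name TEXT, run_date TEXT, value REAL)")
        conn.execute("INSERT INTO qa_results VALUES ('LINAC1', 'output_cGy', '2014-05-01', 1.002)")

        def qa_review(conn, machine=None, test_name=None, since=None):
            # Compose the WHERE clause at run time from whatever filters the
            # reviewer picked, keeping values parameterized rather than spliced in.
            clauses, params = [], []
            if machine:
                clauses.append("machine = ?"); params.append(machine)
            if test_name:
                clauses.append("test_name = ?"); params.append(test_name)
            if since:
                clauses.append("run_date >= ?"); params.append(since)
            sql = "SELECT run_date, test_name, value FROM qa_results"
            if clauses:
                sql += " WHERE " + " AND ".join(clauses)
            return conn.execute(sql, params).fetchall()

        print(qa_review(conn, machine="LINAC1", since="2014-01-01"))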

  4. TechTracS: NASA's commercial technology management system

    NASA Astrophysics Data System (ADS)

    Barquinero, Kevin; Cannon, Douglas

    1996-03-01

    The Commercial Technology Mission is a primary NASA mission, comparable in importance to those in aeronautics and space. This paper will discuss TechTracS, the NASA Commercial Technology Management System that was put into place in FY 1995 to implement this mission. This system is designed to identify and capture the NASA technologies which have commercial potential into an off-the-shelf database application, and then track the technologies' progress in realizing that commercial potential through collaborations with industry. The management system consists of four stages. The first is to develop an inventory database of the agency's entire technology portfolio and assess it for relevance to the commercial marketplace. Those technologies that are identified as having commercial potential are then actively marketed to appropriate industries; this is the second stage. The third stage is when a NASA-industry partnership is entered into for the purpose of commercializing the technology. The final stage is to track the technology's success or failure in the marketplace. The collection of this information in TechTracS enables metrics evaluation and can accelerate the establishment of direct contacts between a NASA technologist and an industry technologist. This connection is the beginning of the technology commercialization process.

  5. CHSIR Anthropometric Database, CHSIR Truncated Anthropometric Database, and Boundary Manikins

    NASA Technical Reports Server (NTRS)

    Rajulu, Sudhakar

    2011-01-01

    The NASA crew anthropometric dimensions that the Commercial Transportation System (CTS) must accommodate are listed in CCT-REQ-1130 Draft 3.0, along with the specific critical anthropometric dimensions for use in vehicle design (and suit design in the event that a pressure suit is part of the commercial partner's design solution).

  6. Thematic video indexing to support video database retrieval and query processing

    NASA Astrophysics Data System (ADS)

    Khoja, Shakeel A.; Hall, Wendy

    1999-08-01

    This paper presents a novel video database system, which caters for complex and long videos, such as documentaries, educational videos, etc. As compared to relatively structured format videos like CNN news or commercial advertisements, this database system has the capacity to work with long and unstructured videos.

  7. Field Validation of Food Service Listings: A Comparison of Commercial and Online Geographic Information System Databases

    PubMed Central

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-01-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database; however, when matching criteria were more conservative, there were no observed differences in error between the databases. PMID:23066385
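
    The positional-accuracy figures above reduce to computing the distance between each database location and its GPS-measured location and then summarizing the error distribution. A short sketch with invented coordinates (not the study's data):

        import math
        import statistics

        def haversine_m(lat1, lon1, lat2, lon2):
            # Great-circle distance in metres between two lat/lon points.
            r = 6371000.0
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dp = math.radians(lat2 - lat1)
            dl = math.radians(lon2 - lon1)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        # (db_lat, db_lon, gps_lat, gps_lon) per matched food service place; made up.
        matched = [
            (44.2312, -76.4860, 44.2313, -76.4858),
            (44.2255, -76.4951, 44.2259, -76.4949),
            (44.2301, -76.5012, 44.2299, -76.5016),
            (44.2330, -76.4900, 44.2334, -76.4905),
        ]
        errors = sorted(haversine_m(*row) for row in matched)
        q25, q50, q75 = statistics.quantiles(errors, n=4)
        print(f"25th/50th/75th percentile error: {q25:.0f}/{q50:.0f}/{q75:.0f} m")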

  8. Field validation of food service listings: a comparison of commercial and online geographic information system databases.

    PubMed

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-08-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database; however, when matching criteria were more conservative, there were no observed differences in error between the databases.

  9. 48 CFR 204.1103 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) database. (1) On contract award documents, use the contractor's legal or “doing business as” name and physical address information as recorded in the (SAM) database at the time of award. (2) When making a... database; and (ii) The contractor's Data Universal Numbering System (DUNS) number, Commercial and...

  10. 48 CFR 204.1103 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) database. (1) On contract award documents, use the contractor's legal or “doing business as” name and physical address information as recorded in the (SAM) database at the time of award. (2) When making a... database; and (ii) The contractor's Data Universal Numbering System (DUNS) number, Commercial and...

  11. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    DTIC Science & Technology

    2002-01-01

    to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155] and Starburst [67], and commercial products such as... Kemnitz. The POSTGRES Next-Generation Database Management System. Communications of the ACM, 34(10):78–92, 1991. [156] Michael Stonebraker and Dorothy

  12. Data structures and organisation: Special problems in scientific applications

    NASA Astrophysics Data System (ADS)

    Read, Brian J.

    1989-12-01

    In this paper we discuss and offer answers to the following questions: What, really, are the benefits of databases in physics? Are scientific databases essentially different from conventional ones? What are the drawbacks of a commercial database management system for use with scientific data? Do they outweigh the advantages? Do database systems have adequate graphics facilities, or is a separate graphics package necessary? SQL as a standard language has deficiencies, but what are they for scientific data in particular? Indeed, is the relational model appropriate anyway? Or should we turn to object-oriented databases?

  13. Commercial Supersonics Technology Project - Status of Airport Noise

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2016-01-01

    The Commercial Supersonic Technology Project has been developing databases, computational tools, and system models to prepare for a level 1 milestone, the Low Noise Propulsion Tech Challenge, to be delivered Sept 2016. Steps taken to prepare for the final validation test are given, including system analysis, code validation, and risk reduction testing.

  14. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  15. An academic radiology information system (RIS): a review of the commercial RIS systems, and how an individualized academic RIS can be created and utilized.

    PubMed

    Tamm, E P; Kawashima, A; Silverman, P

    2001-06-01

    Current commercial radiology information systems (RIS) are designed for scheduling, billing, charge collection, and report dissemination. Academic institutions have additional requirements for their missions of teaching, research, and clinical care. The newest versions of commercial RIS offer greater flexibility than prior systems. We sent questionnaires to Cerner Corporation, ADAC Health Care Information Systems, IDX Systems, Per-Se Technologies, and Siemens Health Services regarding features of their products. All of the products we surveyed offer user-customizable fields. However, most products did not allow the user to expand their product's data tables. The search capabilities of the products varied. All of the products supported the Health Level 7 (HL-7) interface and the use of structured query language (SQL). All of the products were offered with an SQL editor for creating customized queries and custom reports. All products included capabilities for collecting quality assurance data and for tracking "interesting cases," though they varied in the functionality offered. No product offered dedicated functions for research. Alternatively, radiology departments can create their own client-server Windows-based database systems to supplement the capabilities of commercial systems. Such systems can be developed with "web-enabled" database products like Microsoft Access or Apple FileMaker Pro.

  16. Full-text, Downloading, & Other Issues.

    ERIC Educational Resources Information Center

    Tenopir, Carol

    1983-01-01

    Issues having a possible impact on online search services in libraries are discussed including full text databases, front-end processors which translate user's input into the command language of an appropriate system, downloading to create personal files from commercial databases, and pricing. (EJS)

  17. [A survey of the best bibliographic searching system in occupational medicine and discussion of its implementation].

    PubMed

    Inoue, J

    1991-12-01

    When occupational health personnel, especially occupational physicians, search bibliographies, they usually have to do so by themselves. Also, if a library is not available because of the location of their workplace, they might have to rely on online databases. Although there are many commercial databases in the world, people who seldom use them will have problems with on-line searching, such as the user-computer interface, keywords, and so on. The present study surveyed the best bibliographic searching system in the field of occupational medicine by questionnaire, using DIALOG OnDisc MEDLINE as a commercial database. In order to ascertain the problems involved in determining the best bibliographic searching system, a prototype bibliographic searching system was constructed and then evaluated. Finally, solutions for the problems were discussed. These led to the following conclusions: to construct the best bibliographic searching system at the present time, 1) a concept of micro-to-mainframe links (MML) is needed for the computer hardware network; 2) multi-lingual font standards and an excellent common user-computer interface are needed for the computer software; 3) a short course on database management systems and support for personal information processing of retrieved data are necessary for the practical use of the system.

  18. World Energy Projection System Plus Model Documentation: Commercial Module

    EIA Publications

    2016-01-01

    The Commercial Model of the World Energy Projection System Plus (WEPS+) is an energy demand modeling system of the world commercial end-use sector at a regional level. This report describes the version of the Commercial Model that was used to produce the commercial sector projections published in the International Energy Outlook 2016 (IEO2016). The Commercial Model is one of 13 components of the WEPS+ system. WEPS+ is a modular system, consisting of a number of separate energy models that communicate and work with each other through an integrated system model. The model components are each developed independently, but are designed with well-defined protocols for system communication and interactivity. The WEPS+ modeling system uses a shared database (the “restart” file) that allows all the models to communicate with each other when they are run in sequence over a number of iterations. The overall WEPS+ system uses an iterative solution technique that forces convergence of consumption and supply pressures to solve for an equilibrium price.
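
    The restart-file iteration described above is essentially a fixed-point search for a market-clearing price. A toy sketch of such a convergence loop, with invented linear demand and supply curves standing in for the WEPS+ modules:

        # Invented curves: consumption falls and production rises with price.
        def demand(price):
            return 100.0 - 2.0 * price

        def supply(price):
            return 20.0 + 1.5 * price

        price, step, tol = 10.0, 0.05, 1e-6
        for iteration in range(1000):
            gap = demand(price) - supply(price)  # excess demand this pass
            if abs(gap) < tol:
                break
            price += step * gap  # raise price while demand exceeds supply
        print(f"equilibrium price ~ {price:.3f} after {iteration} iterations")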

  19. Starbase Data Tables: An ASCII Relational Database for Unix

    NASA Astrophysics Data System (ADS)

    Roll, John

    2011-11-01

    Database management is an increasingly important part of astronomical data analysis. Astronomers need easy and convenient ways of storing, editing, filtering, and retrieving data about data. Commercial databases do not provide good solutions for many of the everyday and informal types of database access astronomers need. The Starbase database system with simple data file formatting rules and command line data operators has been created to answer this need. The system includes a complete set of relational and set operators, fast search/index and sorting operators, and many formatting and I/O operators. Special features are included to enhance the usefulness of the database when manipulating astronomical data. The software runs under UNIX, MSDOS and IRAF.
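
    A minimal reader for the kind of ASCII table described above, assuming the general layout of a tab-separated header row, a dashed separator row, and data rows; the 'name' and 'mag' columns are invented:

        import io

        SAMPLE = "name\tmag\n----\t---\nVega\t0.03\nDeneb\t1.25\n"

        def read_table(stream):
            # Header row of column names, a dashed separator row, then data rows.
            header = stream.readline().rstrip("\n").split("\t")
            stream.readline()  # skip the dashed separator
            for line in stream:
                yield dict(zip(header, line.rstrip("\n").split("\t")))

        # A row-filter operation: keep stars brighter than magnitude 1.
        for row in read_table(io.StringIO(SAMPLE)):
            if float(row["mag"]) < 1.0:
                print(row["name"], row["mag"], sep="\t")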

  20. DOE technology information management system database study report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  21. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 70's E.F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and the performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.

  22. A Comparison of Three Commercial Online Vendors.

    ERIC Educational Resources Information Center

    Hoover, Ryan E.

    1979-01-01

    Compares database update currency, number of hits, elapsed time, number of offline prints or online types, offline print turnaround time, vendor rates, total search cost, and discounted search cost based on vendor discount rates for five simple searches run on three major commercial vendors' online systems. (CWM)

  23. Concordance of Commercial Data Sources for Neighborhood-Effects Studies

    PubMed Central

    Schootman, Mario

    2010-01-01

    Growing evidence supports a relationship between neighborhood-level characteristics and important health outcomes. One source of neighborhood data includes commercial databases integrated with geographic information systems to measure availability of certain types of businesses or destinations that may have either favorable or adverse effects on health outcomes; however, the quality of these data sources is generally unknown. This study assessed the concordance of two commercial databases for ascertaining the presence, locations, and characteristics of businesses. Businesses in the St. Louis, Missouri area were selected based on their four-digit Standard Industrial Classification (SIC) codes and classified into 14 business categories. Business listings in the two commercial databases were matched by standardized business name within specified distances. Concordance and coverage measures were calculated using capture–recapture methods for all businesses and by business type, with further stratification by census-tract-level population density, percent below poverty, and racial composition. For matched listings, distance between listings and agreement in four-digit SIC code, sales volume, and employee size were calculated. Overall, the percent agreement was 32% between the databases. Concordance and coverage estimates were lowest for health-care facilities and leisure/entertainment businesses; highest for popular walking destinations, eating places, and alcohol/tobacco establishments; and varied somewhat by population density. The mean distance (SD) between matched listings was 108.2 (179.0) m with varying levels of agreement in four-digit SIC (percent agreement = 84.6%), employee size (weighted kappa = 0.63), and sales volume (weighted kappa = 0.04). Researchers should cautiously interpret findings when using these commercial databases to yield measures of the neighborhood environment. PMID:20480397
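
    The capture-recapture method above estimates the true number of businesses from the two listings and their overlap. A worked sketch using the Lincoln-Petersen estimator with invented counts (not the study's data):

        # n1 = listings in database A, n2 = in database B, m = matched in both.
        n1, n2, m = 420, 380, 210

        n_hat = n1 * n2 / m            # estimated true number of businesses
        coverage_a = n1 / n_hat        # share of the estimated total found in A
        coverage_b = n2 / n_hat
        agreement = m / (n1 + n2 - m)  # matched share of the union of listings

        print(f"estimated total: {n_hat:.0f}")
        print(f"coverage A: {coverage_a:.1%}, coverage B: {coverage_b:.1%}")
        print(f"percent agreement: {agreement:.1%}")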

  24. The elements of a commercial human spaceflight safety reporting system

    NASA Astrophysics Data System (ADS)

    Christensen, Ian

    2017-10-01

    In its report on the SpaceShipTwo accident, the National Transportation Safety Board (NTSB) included in its recommendations that the Federal Aviation Administration (FAA), “in collaboration with the commercial spaceflight industry, continue work to implement a database of lessons learned from commercial space mishap investigations and encourage commercial space industry members to voluntarily submit lessons learned.” In its official response to the NTSB, the FAA supported this recommendation and indicated it has initiated an iterative process to put into place a framework for a cooperative safety data sharing process, including the sharing of lessons learned and trends analysis. Such a framework is an important element of an overall commercial human spaceflight safety system.

  25. XML: James Webb Space Telescope Database Issues, Lessons, and Status

    NASA Technical Reports Server (NTRS)

    Detter, Ryan; Mooney, Michael; Fatig, Curtis

    2003-01-01

    This paper will present the current concept of using extensible Markup Language (XML) as the underlying structure for the James Webb Space Telescope (JWST) database. The purpose of using XML is to provide a JWST database that is independent of any portion of the ground system, yet still compatible with the various systems using a variety of different structures. The testing of the JWST Flight Software (FSW) started in 2002, yet the launch is scheduled for 2011 with a planned 5-year mission and a 5-year follow-on option. The initial database and ground system elements, including the commands, telemetry, and ground system tools, will be used for 19 years, plus post-mission activities. During the Integration and Test (I&T) phases of the JWST development, 24 distinct laboratories, each geographically dispersed, will have local database tools with an XML database. Each of these laboratories' database tools will be used for exporting and importing data both locally and to a central database system, inputting data to the database certification process, and providing various reports. A centralized certified database repository will be maintained by the Space Telescope Science Institute (STScI) in Baltimore, Maryland, USA. One of the challenges for the database is to be flexible enough to allow for the upgrade, addition, or changing of individual items without affecting the entire ground system. Using XML should also allow for altering the import and export formats needed by the various elements, tracking the verification/validation of each database item, allowing many organizations to provide database inputs, and merging the many existing database processes into one central database structure throughout the JWST program. Many National Aeronautics and Space Administration (NASA) projects have attempted to take advantage of open source and commercial technology. Often this causes a greater reliance on the use of Commercial-Off-The-Shelf (COTS) software, which is often limiting. In our review of the database requirements and the COTS software available, only very expensive COTS software would meet 90% of the requirements. Even with the high projected initial cost of COTS, the development and support costs for custom code over the 19-year mission period were forecast to be higher than the total licensing costs. A group did look at reusing existing database tools and formats. If the JWST database were already in a mature state, reuse would make sense; but with the database still needing to handle the addition of different types of command and telemetry structures, define new spacecraft systems, and accept input from and export to systems that have not been defined yet, XML provided the flexibility desired. It remains to be determined whether the XML database will reduce the overall cost for the JWST mission.
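
    A tiny sketch of the XML round-trip idea: a command definition exported by one lab tool and re-imported by another. The element and attribute names are invented, not the JWST schema:

        import xml.etree.ElementTree as ET

        # Hypothetical command definition exported to XML...
        cmd = ET.Element("command", name="HTR_PWR_ON", opcode="0x1A2B")
        arg = ET.SubElement(cmd, "argument", name="heater_id", type="uint8")
        arg.text = "range 0..15"
        xml_text = ET.tostring(cmd, encoding="unicode")
        print(xml_text)

        # ...and re-imported elsewhere without either tool knowing the other.
        parsed = ET.fromstring(xml_text)
        print(parsed.get("name"), "->", [a.get("name") for a in parsed.findall("argument")])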

  26. Quality Attribute-Guided Evaluation of NoSQL Databases: An Experience Report

    DTIC Science & Technology

    2014-10-18

    detailed technical evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study... big data software systems [Agarwal 2011]. Internet-born organizations such as Google and Amazon are at the cutting edge of this revolution... Chang 2008], along with those of numerous other big data innovators, have made a variety of open source and commercial data management technologies

  27. Development of a relational database to capture and merge clinical history with the quantitative results of radionuclide renography.

    PubMed

    Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T

    2012-12-01

    Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and exporting a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We have also developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.

  28. A Computational Chemistry Database for Semiconductor Processing

    NASA Technical Reports Server (NTRS)

    Jaffe, R.; Meyyappan, M.; Arnold, J. O. (Technical Monitor)

    1998-01-01

    The concept of 'virtual reactor' or 'virtual prototyping' has received much attention recently in the semiconductor industry. Commercial codes to simulate thermal CVD and plasma processes have become available to aid in equipment and process design efforts. The virtual prototyping effort would go nowhere if codes do not come with a reliable database of chemical and physical properties of the gases involved in semiconductor processing. Commercial code vendors have no capability to generate such a database and instead leave the task of finding whatever is needed to the user. While individual investigations of interesting chemical systems continue at universities, there has not been any large-scale effort to create a database. In this presentation, we outline our efforts in this area. Our effort focuses on the following five areas: 1. Thermal CVD reaction mechanisms and rate constants. 2. Thermochemical properties. 3. Transport properties. 4. Electron-molecule collision cross sections. 5. Gas-surface interactions.

  29. Design and implementation of an audit trail in compliance with US regulations.

    PubMed

    Jiang, Keyuan; Cao, Xiang

    2011-10-01

    Audit trails have been used widely to ensure quality of study data and have been implemented in computerized clinical trials data systems. Increasingly, there is a need to audit access to study participant identifiable information to provide assurance that study participant privacy is protected and confidentiality is maintained. In the United States, several federal regulations specify how the audit trail function should be implemented. To describe the development and implementation of a comprehensive audit trail system that meets the regulatory requirements of assuring data quality and integrity and protecting participant privacy and that is also easy to implement and maintain. The audit trail system was designed and developed after we examined regulatory requirements, data access methods, prevailing application architecture, and good security practices. Our comprehensive audit trail system was developed and implemented at the database level using a commercially available database management software product. It captures both data access and data changes with the correct user identifier. Documentation of access is initiated automatically in response to either data retrieval or data change at the database level. Currently, our system has been implemented only on one commercial database management system. Although our audit trail algorithm does not allow for logging aggregate operations, aggregation does not reveal sensitive private participant information. Careful consideration must be given to data items selected for monitoring because selection of all data items using our system can dramatically increase the requirements for computer disk space. Evaluating the criticality and sensitivity of individual data items selected can control the storage requirements for clinical trial audit trail records. Our audit trail system is capable of logging data access and data change operations to satisfy regulatory requirements. Our approach is applicable to virtually any data that can be stored in a relational database.
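
    Database-level change auditing of the kind described can be sketched with a trigger, so the log is written no matter which application issues the update. SQLite and the schema below are illustrative; the paper does not name its commercial DBMS:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE participants (id INTEGER PRIMARY KEY, weight_kg REAL);
        CREATE TABLE audit_log (
            ts       TEXT DEFAULT CURRENT_TIMESTAMP,
            row_id   INTEGER,
            col_name TEXT,
            old_val  TEXT,
            new_val  TEXT
        );
        -- Fires on every UPDATE of the monitored column, at the database level.
        CREATE TRIGGER audit_weight AFTER UPDATE OF weight_kg ON participants
        BEGIN
            INSERT INTO audit_log (row_id, col_name, old_val, new_val)
            VALUES (OLD.id, 'weight_kg', OLD.weight_kg, NEW.weight_kg);
        END;
        """)
        conn.execute("INSERT INTO participants VALUES (1, 71.2)")
        conn.execute("UPDATE participants SET weight_kg = 72.0 WHERE id = 1")
        for row in conn.execute("SELECT row_id, col_name, old_val, new_val FROM audit_log"):
            print(row)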

  30. A Data Analysis Expert System For Large Established Distributed Databases

    NASA Astrophysics Data System (ADS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large, isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments that allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America, the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  31. Bar-Code System for a Microbiological Laboratory

    NASA Technical Reports Server (NTRS)

    Law, Jennifer; Kirschner, Larry

    2007-01-01

    A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.

  32. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

    We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach to sustain effective management of technology investments that supports the ability of various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database systems.

  33. Charting a Path to Location Intelligence for STD Control.

    PubMed

    Gerber, Todd M; Du, Ping; Armstrong-Brown, Janelle; McNutt, Louise-Anne; Coles, F Bruce

    2009-01-01

    This article describes the New York State Department of Health's GeoDatabase project, which developed new methods and techniques for designing and building a geocoding and mapping data repository for sexually transmitted disease (STD) control. The GeoDatabase development was supported through the Centers for Disease Control and Prevention's Outcome Assessment through Systems of Integrated Surveillance workgroup. The design and operation of the GeoDatabase relied upon commercial-off-the-shelf tools that other public health programs may also use for disease-control systems. This article provides a blueprint of the structure and software used to build the GeoDatabase and integrate location data from multiple data sources into the everyday activities of STD control programs.

  34. Generating Shifting Workloads to Benchmark Adaptability in Relational Database Systems

    NASA Astrophysics Data System (ADS)

    Rabl, Tilmann; Lang, Andreas; Hackl, Thomas; Sick, Bernhard; Kosch, Harald

    A large body of research concerns the adaptability of database systems. Many commercial systems already contain autonomic processes that adapt configurations as well as data structures and data organization. Yet there is virtually no way to fairly measure the quality of such optimizations. While standard benchmarks have been developed that simulate real-world database applications very precisely, none of them considers variations in workloads produced by human factors. Today’s benchmarks test the performance of database systems by measuring peak performance on homogeneous request streams. Nevertheless, in systems with user interaction, access patterns are constantly shifting. We present a benchmark that simulates a web information system with interaction of large user groups. It is based on the analysis of a real online eLearning management system with 15,000 users. The benchmark considers the temporal dependency of user interaction. Its main focus is to measure the adaptability of a database management system under shifting workloads. We will give details on our design approach, which uses sophisticated pattern analysis and data mining techniques.
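
    The shifting-workload idea can be sketched as a query mix whose weights vary with the simulated hour of day; the query types and weight shapes below are invented, not the benchmark's actual model:

        import math
        import random

        QUERY_TYPES = ["browse_course", "search_forum", "submit_quiz"]

        def mix_at(hour):
            # Browsing peaks by day, quiz submissions cluster in the evening.
            browse = 1.0 + math.sin(math.pi * hour / 12.0)
            search = 0.8
            quiz = 0.2 + (1.5 if 18 <= hour <= 22 else 0.0)
            total = browse + search + quiz
            return [browse / total, search / total, quiz / total]

        random.seed(42)
        for hour in range(0, 24, 6):
            sample = random.choices(QUERY_TYPES, weights=mix_at(hour), k=1000)
            print(hour, {q: sample.count(q) for q in QUERY_TYPES})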

  35. Query by forms: User-oriented relational database retrieving system and its application in analysis of experiment data

    NASA Astrophysics Data System (ADS)

    Skotniczny, Zbigniew

    1989-12-01

    The Query by Forms (QbF) system is a user-oriented interactive tool for querying large relational databases with minimal query-definition cost. The system was worked out under the assumption that the user's time and effort for defining needed queries is the most severe bottleneck. The system may be applied in any Rdb/VMS database system and is recommended for specific information systems of any project where end-user queries cannot be foreseen. The tool is dedicated to specialists of an application domain who have to analyze data maintained in a database from any needed point of view but do not need to know commercial database languages. The paper presents the system developed as a compromise between its functionality and usability. User-system communication via a menu-driven "tree-like" structure of screen-forms, which produces a query definition and execution, is discussed in detail. Output of query results (printed reports and graphics) is also discussed. Finally, the paper shows one application of QbF to the HERA project.

  36. Flight Test Evaluation of Situation Awareness Benefits of Integrated Synthetic Vision System Technology for Commercial Aircraft

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, Jarvis J., III

    2005-01-01

    Research was conducted onboard a Gulfstream G-V aircraft to evaluate integrated Synthetic Vision System concepts during flight tests over a 6-week period at the Wallops Flight Facility and Reno/Tahoe International Airport. The NASA Synthetic Vision System incorporates database integrity monitoring, runway incursion prevention alerting, surface maps, enhanced vision sensors, and advanced pathway guidance and synthetic terrain presentation. The paper details the goals and objectives of the flight test with a focus on the situation awareness benefits of integrating synthetic vision system enabling technologies for commercial aircraft.

  37. 77 FR 21808 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

    ... and open source records and commercial database. EXEMPTIONS CLAIMED FOR THE SYSTEM: The Attorney... notification procedures, the record access procedures, the contesting record procedures, the record source..., confidential sources, and victims of crimes. The offenses and alleged offenses associated with the individuals...

  38. Reference System of DNA and Protein Sequences on CD-ROM

    NASA Astrophysics Data System (ADS)

    Nasu, Hisanori; Ito, Toshiaki

    DNASIS-DBREF31 is a database of DNA and protein sequences in the form of an optical compact disk (CD) ROM, developed and commercialized by Hitachi Software Engineering Co., Ltd. Both nucleic acid base sequences and protein amino acid sequences can be retrieved from a single CD-ROM. Existing databases are offered in the form of on-line services, floppy disks, or magnetic tape, all of which have some problem or other, such as usability or storage capacity. DNASIS-DBREF31 newly adopts a CD-ROM as the database device to realize mass storage and personal use of the database.

  39. Compilation of the data-base of the star catalogue by ADABAS.

    NASA Astrophysics Data System (ADS)

    Ishikawa, T.

    A data-base of the FK4 Star Catalogue is compiled by using the HITAC M-280H in the Computer Center of Tokyo University and a commercial data-base management system (DBMS), ADABAS. The purpose of this attempt is to examine whether ADABAS, which could be regarded as a representative of the currently available DBMSs developed mainly for business and information retrieval purposes, proves itself useful for handling mass numerical data like star catalogue data. It is concluded that the data-base could really be a convenient way of storing and utilizing the star catalogue data.

  40. Update to the Ground-Water Withdrawals Database for the Death Valley Regional Ground-Water Flow System, Nevada and California, 1913-2003

    USGS Publications Warehouse

    Moreo, Michael T.; Justet, Leigh

    2008-01-01

    Ground-water withdrawal estimates from 1913 through 2003 for the Death Valley regional ground-water flow system (DVRFS) are compiled in an electronic database to support a regional, three-dimensional, transient ground-water flow model. This database updates a previously published database that compiled estimates of ground-water withdrawals for 1913-1998. The same methodology is used to construct each database. The primary differences between the two databases are an additional 5 years of ground-water withdrawal data, well locations in the updated database being restricted to the DVRFS model boundary, and application rates that are from 0 to 1.5 feet per year lower than the original estimates. The lower application rates result from revised estimates of crop consumptive use, which are based on updated estimates of potential evapotranspiration. In 2003, about 55,700 acre-feet of ground water was pumped in the DVRFS, of which 69 percent was used for irrigation, 13 percent for domestic supply, and 18 percent for public supply, commercial, and mining activities.

  41. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  42. Region and database management for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing the core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, consisting of PassPort and PeopleSoft software, supports finance, supply, chemical management/Material Safety Data Sheets, and human resources.

  43. Secure Database Management Study.

    DTIC Science & Technology

    1978-12-01

    covers cases involving industrial economics (e.g., industrial spies) and commercial finances (e.g., fraud). Privacy--Protection of data about people... California, Berkeley [STON76a]. The approach to protection taken in INGRES [STON74] has attracted a lot of interest. Queries, in a high-level query... Material Command Support Activity (NMCSA), and another DoD agency, Cullinane Corporation developed a prototype version of the IDS database system on a

  4. "New Space Explosion" and Earth Observing System Capabilities

    NASA Astrophysics Data System (ADS)

    Stensaas, G. L.; Casey, K.; Snyder, G. I.; Christopherson, J.

    2017-12-01

    This presentation will describe recent developments in spaceborne remote sensing, including an introduction to some of the increasing number of new firms entering the market, new systems and successes from established players, industry consolidation, and reactions to these developments from communities of users. The presentation will draw on the results of the Joint Agency Commercial Imagery Evaluation (JACIE) 2017 Civil Commercial Imagery Evaluation Workshop, the use of the US Geological Survey's Requirements, Capabilities and Analysis for Earth Observation (RCA-EO) centralized Earth observing systems database, and how system performance parameters are used with user science application requirements.

  45. An intelligent interactive visual database management system for Space Shuttle closeout image management

    NASA Technical Reports Server (NTRS)

    Ragusa, James M.; Orwig, Gary; Gilliam, Michael; Blacklock, David; Shaykhian, Ali

    1994-01-01

    Status is given of an applications investigation on the potential for using an expert system shell for classification and retrieval of high-resolution, digital, color space shuttle closeout photography. This NASA-funded activity has focused on the use of integrated information technologies to intelligently classify and retrieve still imagery from a large, electronically stored collection. A space shuttle processing problem is identified, a working prototype system is described, and commercial applications are identified. A conclusion reached is that the developed system has distinct advantages over the present manual system and that cost efficiencies will result as the system is implemented. Further, commercial potential exists for this integrated technology.

  46. Standardization of XML Database Exchanges and the James Webb Space Telescope Experience

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Detter, Ryan; Jones, Ron; Fatig, Curtis C.

    2007-01-01

    Personnel from the National Aeronautics and Space Administration (NASA) James Webb Space Telescope (JWST) Project have been working with various standards communities, such as the Object Management Group (OMG) and the Consultative Committee for Space Data Systems (CCSDS), to assist in the definition of a common extensible Markup Language (XML) database exchange format. The CCSDS and OMG standards are intended for the exchange of core command and telemetry information, not for all the database information needed to exercise a NASA space mission. The mission-specific database, containing all the information needed for a space mission, is translated from/to the standard using a translator. The standard is meant to provide a system that encompasses 90% of the information needed for command and telemetry processing. This paper will discuss standardization of the XML database exchange format, the tools used, and the JWST experience, as well as future work with both commercial and government XML standards groups.

  47. Driver acceptance of commercial vehicle operations (CVO) technology in the motor carrier environment. Executive summary, Critical issues relating to acceptance of technology by interstate truck and bus drivers

    DOT National Transportation Integrated Search

    2000-05-01

    The California database incorporated in the Highway Safety Information System (HSIS) is derived from the California TASAS (Traffic Accident Surveillance and Analysis System). The system, maintained by the Traffic Operations Office of Caltrans, is a m...

  48. ARACHNID: A prototype object-oriented database tool for distributed systems

    NASA Technical Reports Server (NTRS)

    Younger, Herbert; Oreilly, John; Frogner, Bjorn

    1994-01-01

    This paper discusses the results of a Phase 2 SBIR project sponsored by NASA and performed by MIMD Systems, Inc. A major objective of this project was to develop specific concepts for improved performance in accessing large databases. An object-oriented and distributed approach was used for the general design, while a geographical decomposition was used as a specific solution. The resulting software framework is called ARACHNID. The Faint Source Catalog developed by NASA was the initial database testbed. This is a database of many gigabytes, where an order of magnitude improvement in query speed is being sought. This database contains faint infrared point sources obtained from telescope measurements of the sky. A geographical decomposition of this database is an attractive approach to dividing it into pieces. Each piece can then be searched on individual processors, with only a weak data linkage between the processors being required. As a further demonstration of the concepts implemented in ARACHNID, a tourist information system is discussed. This version of ARACHNID is the commercial result of the project. It is a distributed, networked database application where speed, maintenance, and reliability are important considerations. This paper focuses on the design concepts and technologies that form the basis for ARACHNID.

  49. Keeping Track Every Step of the Way

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Knowledge Sharing Systems, Inc., a producer of intellectual assets management software systems for the federal government, universities, non-profit laboratories, and private companies, constructed and presently manages the NASA Technology Tracking System, also known as TechTracS. Under contract to Langley Research Center, TechTracS identifies and captures all NASA technologies, manages the patent prosecution process, and then tracks their progress en route to commercialization. The system supports all steps involved in various technology transfer activities, and is considered the premier intellectual asset management system used in the federal government today. NASA TechTracS consists of multiple relational databases and web servers, located at each of the 10 field centers, as well as NASA Headquarters. The system is capable of supporting the following functions: planning commercial technologies; commercialization activities; reporting new technologies and inventions; and processing and tracking intellectual property rights, licensing, partnerships, awards, and success stories. NASA TechTracS is critical to the Agency's ongoing mission to commercialize its revolutionary technologies in a variety of sectors within private industry, both aerospace and non- aerospace.

  50. Role of HIS/RIS DICOM interfaces in the integration of imaging into the Department of Veterans Affairs healthcare enterprise

    NASA Astrophysics Data System (ADS)

    Kuzmak, Peter M.; Dayhoff, Ruth E.

    1998-07-01

    The U.S. Department of Veterans Affairs is integrating imaging into the healthcare enterprise using the Digital Imaging and Communication in Medicine (DICOM) standard protocols. Image management is directly integrated into the VistA Hospital Information System (HIS) software and clinical database. Radiology images are acquired via DICOM and are stored directly in the HIS database. Images can be displayed on low-cost clinician's workstations throughout the medical center. High-resolution, diagnostic-quality, multi-monitor VistA workstations with specialized viewing software can be used for reading radiology images. DICOM has played critical roles in the ability to integrate imaging functionality into the healthcare enterprise. Because of its openness, it allows system components from commercial and non-commercial sources to work together to provide functional, cost-effective solutions (see Figure 1). Two approaches are used to acquire and handle images within the radiology department. At some VA Medical Centers, DICOM is used to interface a commercial Picture Archiving and Communications System (PACS) to the VistA HIS. At other medical centers, DICOM is used to interface the image-producing modalities directly to the image acquisition and display capabilities of VistA itself. Both of these approaches use a small set of DICOM services that has been implemented by VistA to allow patient and study text data to be transmitted to image-producing modalities and the commercial PACS, and to enable images and study data to be transferred back.

  51. A World Wide Web (WWW) server database engine for an organelle database, MitoDat.

    PubMed

    Lemkin, P F; Chipperfield, M; Merril, C; Zullo, S

    1996-03-01

    We describe a simple database search engine, "dbEngine", which may be used to quickly create a searchable database on a World Wide Web (WWW) server. Data may be prepared from spreadsheet programs (such as Excel) or from tables exported from relational database systems. This Common Gateway Interface (CGI-BIN) program is used with a WWW server such as those available commercially, or from the National Center for Supercomputing Applications (NCSA) or CERN. Its capabilities include: (i) searching records by combinations of terms connected with ANDs or ORs; (ii) returning search results as hypertext links to other WWW database servers; (iii) mapping lists of literature reference identifiers to the full references; (iv) creating bidirectional hypertext links between pictures and the database. DbEngine has been used to support the MitoDat database (Mendelian and non-Mendelian inheritance associated with the mitochondrion) on the WWW.
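
    The AND/OR term search can be sketched as a scan over flat records; the records and fields below are invented stand-ins, not MitoDat's contents:

        RECORDS = [
            {"gene": "ATP5A1", "location": "mitochondrion", "keywords": "ATP synthase alpha"},
            {"gene": "MT-CO1", "location": "mitochondrion", "keywords": "cytochrome c oxidase"},
        ]

        def matches(record, terms, mode="AND"):
            # True if the record's text contains the terms, combined with AND or OR.
            text = " ".join(record.values()).lower()
            hits = [t.lower() in text for t in terms]
            return all(hits) if mode == "AND" else any(hits)

        for rec in RECORDS:
            if matches(rec, ["mitochondrion", "oxidase"], mode="AND"):
                print(rec["gene"])  # -> MT-CO1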

  52. DISTRIBUTED CONTROL AND DA FOR ATLAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scudder, D.; et al.

    1999-05-01

    The control system for the Atlas pulsed power generator being built at Los Alamos National Laboratory will utilize a significant level of distributed control. Other principal design characteristics include noise immunity, modularity and use of commercial products wherever possible. The data acquisition system is tightly coordinated with the control system. Both share a common database server and a fiber-optic ethernet communications backbone.

  13. A watershed-based environmental and regulatory data analysis system for the forest products industry

    Treesearch

    John Beebe

    2012-01-01

    A watershed-based data analysis system was created as a tool for forest product companies to better understand potential implications from environmental regulations. Also known as the Receiving Water Database (RWDB), this data system was designed with the purpose of assisting companies that own pulp and paper mills, wood product facilities, and commercial timberlands...

  14. Recent Experience of the Application of a Commercial Data Base Management System (ADABAS) to a Scientific Data Bank (ECDIN).

    ERIC Educational Resources Information Center

    Town, William G.; And Others

    1980-01-01

    Discusses the problems encountered and solutions adopted in application of the ADABAS database management system to the ECDIN (Environmental Chemicals Data and Information Network) data bank. SIMAS, the pilot system, and ADABAS are compared, and ECDIN ADABAS design features are described. Appendices provide additional facts about ADABAS and SIMAS.…

  15. Microvax-based data management and reduction system for the regional planetary image facilities

    NASA Technical Reports Server (NTRS)

    Arvidson, R.; Guinness, E.; Slavney, S.; Weiss, B.

    1987-01-01

    Presented is a progress report for the Regional Planetary Image Facilities (RPIF) prototype image data management and reduction system being jointly implemented by Washington University and the USGS, Flagstaff. The system will consist of a MicroVAX with a high capacity (approx 300 megabyte) disk drive, a compact disk player, an image display buffer, a videodisk player, USGS image processing software, and SYSTEM 1032 - a commercial relational database management package. The USGS, Flagstaff, will transfer their image processing software including radiometric and geometric calibration routines, to the MicroVAX environment. Washington University will have primary responsibility for developing the database management aspects of the system and for integrating the various aspects into a working system.

  16. IMAT graphics manual

    NASA Technical Reports Server (NTRS)

    Stockwell, Alan E.; Cooper, Paul A.

    1991-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.

  17. Molecule database framework: a framework for creating database applications with chemical structure search capability

    PubMed Central

    2013-01-01

    Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • support for multi-component compounds (mixtures); • import and export of SD-files; • optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
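The framework's key claim, that storing and searching chemical structures is abstracted into method calls, can be illustrated independently of Java and the Bingo cartridge. A hedged sketch using Python with RDKit as a stand-in structure engine; the repository class and its methods are invented for illustration:

```python
# Sketch of abstracting substructure search into a method call, in the
# spirit of Molecule Database Framework (which is Java over the Bingo
# PostgreSQL cartridge). RDKit stands in for the cartridge here; the
# class and its methods are invented.
from rdkit import Chem

class MoleculeRepository:
    def __init__(self):
        self._mols = {}                     # id -> RDKit molecule

    def add(self, mol_id, smiles):
        self._mols[mol_id] = Chem.MolFromSmiles(smiles)

    def substructure_search(self, smarts):
        """Return ids of molecules containing the query substructure."""
        query = Chem.MolFromSmarts(smarts)
        return [i for i, m in self._mols.items()
                if m is not None and m.HasSubstructMatch(query)]

repo = MoleculeRepository()
repo.add("ethanol", "CCO")
repo.add("benzene", "c1ccccc1")
repo.add("phenol", "Oc1ccccc1")
print(repo.substructure_search("c1ccccc1"))  # -> ['benzene', 'phenol']
```

In the real framework the search would be delegated to the database cartridge rather than computed in application memory, which is what makes the searches pageable and cacheable.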

  18. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    PubMed

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • support for multi-component compounds (mixtures); • import and export of SD-files; • optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.

  19. Risk management in mental health: applying lessons from commercial aviation.

    PubMed

    Hatcher, Simon

    2010-02-01

    Risk management in mental health focuses on risks in patients and fails to predict rare but catastrophic events such as suicide. Commercial aviation has a similar task in preventing rare but catastrophic accidents. This article describes the systems in place in commercial aviation that allow the industry to prevent disasters and contrasts them with the situation in mental health. In mental health we should learn from commercial aviation by having: national policies to promote patient safety; a national body responsible for implementing this policy, which maintains a database of safety occurrences, sets targets and investigates adverse outcomes; legislation in place which encourages clinicians to report safety occurrences; and a common method and language for investigating safety occurrences.

  20. An optical scan/statistical package for clinical data management in C-L psychiatry.

    PubMed

    Hammer, J S; Strain, J J; Lyerly, M

    1993-03-01

    This paper explores aspects of the need for clinical database management systems that permit ongoing service management, measurement of the quality and appropriateness of care, database-supported administration of consultation liaison (C-L) services, teaching/educational observations, and research. It describes an optical-scan database management system that permits flexible form generation, desktop publishing, and linking of observations in multiple files. This enhanced MICRO-CARES software system--Medical Application Platform (MAP)--permits direct transfer of the data to ASCII and SAS format for mainframe manipulation of the clinical information. The director of a C-L service may now develop his or her own forms, incorporate structured instruments, or develop "branch chains" of essential data to add to the core data set without the effort and expense of reprinting forms or consulting with commercial vendors.

  1. Analysis of benzonatate overdoses among adults and children from 1969-2010 by the United States Food and Drug Administration.

    PubMed

    McLawhorn, Melinda W; Goulding, Margie R; Gill, Rajdeep K; Michele, Theresa M

    2013-01-01

    To augment the December 2010 United States Food and Drug Administration (FDA) Drug Safety Communication on accidental ingestion of benzonatate in children less than 10 years old by summarizing data on emergency department visits, benzonatate exposure, and reports of benzonatate overdoses from several data sources. Retrospective review of adverse-event reports and drug utilization data of benzonatate. The FDA Adverse Event Reporting System (AERS) database (1969-2010), the National Electronic Injury Surveillance System-Cooperative Adverse Drug Event Surveillance Project (NEISS-CADES, 2004-2009), and the IMS commercial data vendor (2004-2009). Any patient who reported an adverse event with benzonatate captured in the AERS or NEISS-CADES database or received a prescription for benzonatate according to the IMS commercial data vendor. Postmarketing adverse events with benzonatate were collected from the AERS database, emergency department visits due to adverse events with benzonatate were collected from the NEISS-CADES database, and outpatient drug utilization data were collected from the IMS commercial data vendor. Of 31 overdose cases involving benzonatate reported in the AERS database, 20 had a fatal outcome, and five of these fatalities occurred from accidental ingestions in children 2 years of age and younger. The NEISS-CADES database captured emergency department visits involving 12 cases of overdose from accidental benzonatate ingestions in children aged 1-3 years. Signs and symptoms of overdose included seizures, cardiac arrest, coma, brain edema or anoxic encephalopathy, apnea, tachycardia, and respiratory arrest and occurred in some patients within 15 minutes of ingestion. Dispensed benzonatate prescriptions increased by approximately 52% from 2004 to 2009. Although benzonatate has a long history of safe use, accumulating cases of fatal overdose, especially in children, prompted the FDA to notify health care professionals about the risks of benzonatate overdose. Pharmacists may have a role in preventing benzonatate overdoses by counseling patients on signs and symptoms of benzonatate overdose, the need for immediate medical care, and safe storage and disposal of benzonatate.

  2. Hand-held computer operating system program for collection of resident experience data.

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data among other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data is transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database is accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.

  3. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  4. 77 FR 40630 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... the United States against terrorist and foreign intelligence threats and to enforce U.S. criminal laws..., informants, sources, bystanders, law enforcement personnel, intelligence personnel, other responders... intelligence operation; individuals who are identified in open source information or commercial databases, or...

  5. An ERP Post-Implementation Review: Planning for the Future by Looking Back

    ERIC Educational Resources Information Center

    Powel, Wayne D.; Barry, Jim

    2005-01-01

    In 1995, Gonzaga University embarked on a project to implement a university-wide information system. The search for an "out-of-the-box" solution began following an attempt to build an integrated data management system in-house. In 1994, Gonzaga decided to look at commercial solutions to its database management problems. With the blessing…

  6. Review of the Composability Problem for System Evaluation

    DTIC Science & Technology

    2004-11-01

    The report reviews the composability of systems built from commercial components and services, including directory services (e.g., the Lightweight Directory Access Protocol (LDAP)), authentication (e.g., Kerberos), databases, and user interfaces; the system examined exemplifies this type of development by its use of commercial components and systems for authentication, access management, and directory services.

  7. HLLV avionics requirements study and electronic filing system database development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This final report provides a summary of achievements and activities performed under Contract NAS8-39215. The contract's objective was to explore a new way of delivering, storing, accessing, and archiving study products and information and to define top level system requirements for Heavy Lift Launch Vehicle (HLLV) avionics that incorporate Vehicle Health Management (VHM). This report includes technical objectives, methods, assumptions, recommendations, sample data, and issues as specified by DPD No. 772, DR-3. The report is organized into two major subsections, one specific to each of the two tasks defined in the Statement of Work: the Index Database Task and the HLLV Avionics Requirements Task. The Index Database Task resulted in the selection and modification of a commercial database software tool to contain the data developed during the HLLV Avionics Requirements Task. All summary information is addressed within each task's section.

  8. Coordinating Council. First Meeting: NASA/RECON database

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A Council of NASA Headquarters, American Institute of Aeronautics and Astronautics (AIAA), and the NASA Scientific and Technical Information (STI) Facility management met (1) to review and discuss issues of NASA concern, and (2) to promote new and better ways to collect and disseminate scientific and technical information. Topics mentioned for study and discussion at subsequent meetings included the pros and cons of transferring the NASA/RECON database to the commercial sector, the quality of the database, and developing ways to increase foreign acquisitions. The input systems at AIAA and the STI Facility were described. Also discussed were the proposed RECON II retrieval system, the transmittal of document orders received by the Facility and sent to AIAA, and the handling of multimedia input by the Departments of Defense and Commerce. A second meeting was scheduled for six weeks later to discuss database quality and international foreign input.

  9. Implementation of an open adoption research data management system for clinical studies.

    PubMed

    Müller, Jan; Heiss, Kirsten Ingmar; Oberhoffer, Renate

    2017-07-06

    Research institutions need to manage multiple studies with individual data sets, processing rules and different permissions. So far, there is no standard technology that provides an easy-to-use environment to create databases and user interfaces for clinical trials or research studies. Therefore various software solutions are in use, from custom software explicitly designed for a specific study, to cost-intensive commercial Clinical Trial Management Systems (CTMS), to very basic approaches with self-designed Microsoft® databases. The technology applied to conduct those studies varies tremendously from study to study, making it difficult to evaluate data across various studies (meta-analysis) and to keep a defined level of quality in database design, data processing, display, and export. Furthermore, the systems being used to collect study data are often operated redundantly to systems used in patient care. As a consequence, data collection in studies is inefficient and data quality may suffer from unsynchronized datasets, non-normalized database scenarios and manually executed data transfers. With OpenCampus Research we implemented an open adoption software (OAS) solution on an open source basis, which provides a standard environment for state-of-the-art research database management at low cost.

  10. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
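The structured event classification and logging described here can be sketched with a small normalized schema. SMDB itself is built on Oracle with JMS publishing; the sketch below uses SQLite from Python's standard library, and every table, column, and message is invented:

```python
# Sketch of a normalized (third-normal-form) event log of the kind
# described for SMDB. SQLite stands in for Oracle; all names invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE event_class (class_id INTEGER PRIMARY KEY,
                              name     TEXT UNIQUE NOT NULL);
    CREATE TABLE event (event_id  INTEGER PRIMARY KEY,
                        class_id  INTEGER NOT NULL REFERENCES event_class,
                        ts        TEXT DEFAULT CURRENT_TIMESTAMP,
                        message   TEXT);
""")

def log_event(class_name, message):
    """Insert an event, creating its class row on first use."""
    db.execute("INSERT OR IGNORE INTO event_class(name) VALUES (?)",
               (class_name,))
    (class_id,) = db.execute(
        "SELECT class_id FROM event_class WHERE name = ?",
        (class_name,)).fetchone()
    db.execute("INSERT INTO event(class_id, message) VALUES (?, ?)",
               (class_id, message))
    # A real system would also publish the event to a message bus here.

log_event("EQUIPMENT_CONFIG", "antenna allocated to track")
log_event("ALARM", "receiver lock lost")
for row in db.execute("""SELECT e.ts, c.name, e.message
                         FROM event e JOIN event_class c USING (class_id)"""):
    print(row)
```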

  11. Managing Requirements-Documents to Data

    NASA Technical Reports Server (NTRS)

    Orr, Kevin; Hudson, Abe

    2017-01-01

    Managing requirements on long-term projects like the International Space Station (ISS) can go through many phases, from initial product development through more than 20 years of operations and sustainment. Over that time, many authorized changes have been made to the requirement set that apply to any new systems visiting the ISS today, such as commercial cargo/crew vehicles or payloads. This presentation explores the benefits of managing requirements in a database while satisfying traditional document needs for contracts and stakeholder/user consumption that are not tied into the database.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saccone, Scott F; Chesler, Elissa J; Bierut, Laura J

    Commercial SNP microarrays now provide comprehensive and affordable coverage of the human genome. However, some diseases have biologically relevant genomic regions that may require additional coverage. Addiction, for example, is thought to be influenced by complex interactions among many relevant genes and pathways. We have assembled a list of 486 biologically relevant genes nominated by a panel of experts on addiction. We then added 424 genes that showed evidence of association with addiction phenotypes through mouse QTL mappings and gene co-expression analysis. We demonstrate that there are a substantial number of SNPs in these genes that are not well represented by commercial SNP platforms. We address this problem by introducing a publicly available SNP database for addiction. The database is annotated using numeric prioritization scores indicating the extent of biological relevance. The scores incorporate a number of factors such as SNP/gene functional properties (including synonymy and promoter regions), data from mouse systems genetics and measures of human/mouse evolutionary conservation. We then used HapMap genotyping data to determine if a SNP is tagged by a commercial microarray through linkage disequilibrium. This combination of biological prioritization scores and LD tagging annotation will enable addiction researchers to supplement commercial SNP microarrays to ensure comprehensive coverage of biologically relevant regions.
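The database's two annotations, a numeric prioritization score and an LD-tagging flag, can be mimicked with a toy computation. All weights, r² values, and SNP identifiers below are invented; the real scores incorporate functional properties, mouse systems genetics, and conservation measures:

```python
# Toy version of the two annotations described above: a biological
# prioritization score per SNP and a flag for whether a commercial
# array tags the SNP through linkage disequilibrium. All weights,
# r2 values, and SNP ids are invented for illustration.
snps = {
    "rs0001": {"in_promoter": True,  "conserved": True,  "best_r2": 0.95},
    "rs0002": {"in_promoter": False, "conserved": True,  "best_r2": 0.40},
    "rs0003": {"in_promoter": True,  "conserved": False, "best_r2": 0.10},
}

WEIGHTS = {"in_promoter": 2.0, "conserved": 1.0}   # invented weights
R2_TAGGED = 0.80                                   # a common tagging cutoff

for rs, info in snps.items():
    score = sum(w for key, w in WEIGHTS.items() if info[key])
    tagged = info["best_r2"] >= R2_TAGGED
    print(f"{rs}: priority={score:.1f} tagged_by_array={tagged}")
# SNPs with a high score but tagged_by_array=False are candidates for
# supplementing the commercial microarray.
```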

  13. Success of HIS DICOM interfaces in the integration of the healthcare enterprise at the Department of Veterans Affairs

    NASA Astrophysics Data System (ADS)

    Kuzmak, Peter M.; Dayhoff, Ruth E.

    1999-07-01

    The US Department of Veterans Affairs (VA) is integrating imaging into the healthcare enterprise using the Digital Imaging and Communication in Medicine (DICOM) standard protocols. Image management is directly integrated into the VistA Hospital Information System (HIS) software and the clinical database. Radiology images are acquired via DICOM, and are stored directly in the HIS database. Images can be displayed on low-cost clinician's workstations throughout the medical center. High-resolution diagnostic quality multi-monitor VistA workstations with specialized viewing software can be used for reading radiology images. Two approaches are used to acquire and handle images within the radiology department. Some sites have a commercial Picture Archiving and Communications System (PACS) interfaced to the VistA HIS, while other sites use the direct image acquisition and integrated diagnostic reading capabilities of VistA itself. A small set of DICOM services has been implemented by VistA to allow patient and study text data to be transmitted to image producing modalities and the commercial PACS, and to enable images and study data to be transferred back. The VistA DICOM capabilities are now used to interface seven different commercial PACS products and over twenty different radiology modalities. The communications capabilities of DICOM and the VA wide area network are being used to support reading of radiology images from remote sites. DICOM has been the cornerstone in the ability to integrate imaging functionality into the Healthcare Enterprise. Because of its openness, it allows the integration of system components from commercial and non-commercial sources to work together to provide functional cost-effective solutions. As DICOM expands to non-radiology devices, integration must occur with the specialty information subsystems that handle orders and reports, their associated DICOM image capture systems, and the computer-based patient record. The model and concepts of the DICOM standard can be extended to these other areas, but some adjustments may be required.

  14. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  15. Telecommunication Networks. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on utilizing the telecommunications capabilities of computers. Network capabilities including electronic mail, bulletin boards, and access to distant databases are briefly explained. Networks useful to the educator, general commercial systems, and local…

  16. 78 FR 18349 - Federal Acquisition Regulation; Information Collection; Commercial Item Acquisitions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Certification Application (ORCA) function of the System for Award Management (SAM) database. Because an offeror..., use of the ORCA function by prospective contractors decreases the number of responses per respondent per year for purposes of this information collection. ORCA was developed to eliminate the...

  17. CICS Region Virtualization for Cost Effective Application Development

    ERIC Educational Resources Information Center

    Khan, Kamal Waris

    2012-01-01

    Mainframe is used for hosting large commercial databases, transaction servers and applications that require a greater degree of reliability, scalability and security. Customer Information Control System (CICS) is a mainframe software framework for implementing transaction services. It is designed for rapid, high-volume online processing. In order…

  18. Expert system application for the loading capability assessment of transmission lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, T.L.; Negnevitsky, M.; Piekutowski, M.

    1995-11-01

    This paper describes the application of an expert system for the evaluation of the short time thermal rating and temperature rise of overhead conductors. The expert system has been developed using a database and Leonardo expert system shell which is gaining popularity among commercial tools for developing expert system applications. The expert system has been found to compare well when evaluated against the site tests. A practical application is given to demonstrate the usefulness of the expert system developed.

  19. Content and Workflow Management for Library Websites: Case Studies

    ERIC Educational Resources Information Center

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  20. Mining and Indexing Graph Databases

    ERIC Educational Resources Information Center

    Yuan, Dayu

    2013-01-01

    Graphs are widely used to model structures and relationships of objects in various scientific and commercial fields. Chemical molecules, proteins, malware system-call dependencies and three-dimensional mechanical parts are all modeled as graphs. In this dissertation, we propose to mine and index those graph data to enable fast and scalable search.…

  1. Turning Access into a web-enabled secure information system for clinical trials.

    PubMed

    Chen, Dongquan; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F

    2009-08-01

    Organizations with limited resources need to conduct clinical studies in a cost-effective but secure way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certificates, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and to concerns over issues such as cost and difficulty of implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate an alternative to a large-scale, comprehensive, and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure for transforming a standalone Access database into a secure, Web-based information system. For data collection and reporting purposes, we centralized several individual databases and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails, and the cost of development and maintenance may hinder its wide application. Nevertheless, the clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized Web system may provide an alternative to a comprehensive clinical trial management system.
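The "secure web server using self-issued digital certificates" step can be sketched with Python's standard library alone. The certificate and key file names are hypothetical and would be produced by a self-issuing step such as the openssl command shown in the comment:

```python
# Sketch of serving content over TLS with a self-issued certificate,
# as in the study's secure web front end. The cert/key file names are
# hypothetical; they could be generated with, e.g.:
#   openssl req -x509 -newkey rsa:2048 -nodes \
#     -keyout key.pem -out cert.pem -days 365
import http.server
import ssl

httpd = http.server.HTTPServer(("0.0.0.0", 8443),
                               http.server.SimpleHTTPRequestHandler)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile="cert.pem", keyfile="key.pem")
httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
print("Serving HTTPS on port 8443 ...")
httpd.serve_forever()
```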

  2. Smartphone home monitoring of ECG

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Moon, Gyu; Landa, Joseph; Nakajima, Hiroshi; Hata, Yutaka

    2012-06-01

    Ambulatory Holter electrocardiography (ECG) monitoring systems have been commercially available for recording heartbeat data and transmitting it over the Internet. However, they enjoy only qualified confidence and thus limited market penetration. Our system targets aging global villagers with growing biomedical wellness (BMW) homecare needs, rather than hospital-related biomedical illness (BMI). It was designed within SWaP-C (Size, Weight, Power, and Cost) constraints using three innovative modules: (i) a Smart Electrode (low-power mixed-signal electronics embedded with modern compressive sensing and nanotechnology to improve the electrodes' contact impedance); (ii) a Learnable Database (adaptive wavelet-transform QRST feature extraction over a Sequential Query Relational database, allowing retrievable aided target recognition for home-care monitoring); and (iii) a Smartphone (touch-screen interface, powerful computation capability, caretaker reporting with GPI, ID, and a patient panic button for a programmable emergency procedure). It can provide a supplementary home screening system for post- or pre-diagnosis care at home, with a built-in database searchable by the time, place, and degree of urgency of events, using in-situ screening.
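The QRST feature extraction feeding the learnable database can be approximated, for illustration only, by a naive R-peak detector. The paper uses adaptive wavelet transforms; the sketch below substitutes a plain threshold-plus-refractory rule over a synthetic signal, with all parameters invented:

```python
# Very simple R-peak detector, standing in for the adaptive-wavelet
# QRST feature extraction described above. Signal, threshold, and
# refractory period are synthetic/invented for illustration.
import numpy as np

fs = 250                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)    # baseline wander
for beat in np.arange(0.5, 10, 0.8):       # ~75 bpm of synthetic beats
    ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))

threshold = 0.6 * ecg.max()
refractory = int(0.25 * fs)                # 250 ms minimum between peaks
peaks, last = [], -refractory
for i in np.flatnonzero(ecg > threshold):  # first crossing of each beat
    if i - last >= refractory:
        peaks.append(i)
        last = i

rr = np.diff(np.array(peaks)) / fs         # R-R intervals in seconds
print(f"{len(peaks)} beats, mean heart rate {60 / rr.mean():.0f} bpm")
```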

  3. An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.

    1993-01-01

    The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. This system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system which consists of an intelligent user-interface, seven modules and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating is very easy due to the modular structure and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.

  4. An intelligent user interface for browsing satellite data catalogs

    NASA Technical Reports Server (NTRS)

    Cromp, Robert F.; Crook, Sharon

    1989-01-01

    A large scale domain-independent spatial data management expert system that serves as a front-end to databases containing spatial data is described. This system is unique for two reasons. First, it uses spatial search techniques to generate a list of all the primary keys that fall within a user's spatial constraints prior to invoking the database management system, thus substantially decreasing the amount of time required to answer a user's query. Second, a domain-independent query expert system uses a domain-specific rule base to preprocess the user's English query, effectively mapping a broad class of queries into a smaller subset that can be handled by a commercial natural language processing system. The methods used by the spatial search module and the query expert system are explained, and the system architecture for the spatial data management expert system is described. The system is applied to data from the International Ultraviolet Explorer (IUE) satellite, and results are given.
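The system's distinguishing step, resolving spatial constraints to a list of primary keys before the database management system is invoked, can be sketched as a two-stage query. The bounding-box scan below stands in for the system's spatial search techniques, and all table, column, and target names are invented:

```python
# Sketch of the two-stage query described above: spatial constraints
# are resolved to primary keys first, then the relational query runs
# only over those keys. Table/column names and data are invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE image (id INTEGER PRIMARY KEY, target TEXT)")
db.executemany("INSERT INTO image VALUES (?, ?)",
               [(1, "NGC 4151"), (2, "HD 93521"), (3, "NGC 1068")])

# In-memory spatial index: id -> (ra, dec) in degrees (invented values).
positions = {1: (182.6, 39.4), 2: (162.1, 37.6), 3: (40.7, -0.0)}

def keys_in_box(ra_min, ra_max, dec_min, dec_max):
    """Stage 1: spatial search outside the DBMS returns primary keys."""
    return [k for k, (ra, dec) in positions.items()
            if ra_min <= ra <= ra_max and dec_min <= dec <= dec_max]

keys = keys_in_box(150.0, 190.0, 30.0, 45.0)          # -> [1, 2]
marks = ",".join("?" * len(keys))
rows = db.execute(
    f"SELECT id, target FROM image WHERE id IN ({marks})", keys).fetchall()
print(rows)   # stage 2: the DBMS only sees the pre-selected keys
```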

  5. Commercial Aircraft Emission Scenario for 2020: Database Development and Analysis

    NASA Technical Reports Server (NTRS)

    Sutkus, Donald J., Jr.; Baughcum, Steven L.; DuBois, Douglas P.; Wey, Chowen C. (Technical Monitor)

    2003-01-01

    This report describes the development of a three-dimensional database of aircraft fuel use and emissions (NO(x), CO, and hydrocarbons) for the commercial aircraft fleet projected to 2020. Global totals of emissions and fuel burn for 2020 are compared to global totals from previous aircraft emission scenario calculations.
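The shape of such a three-dimensional fuel-burn and emissions database can be shown with a short gridding sketch. The flight segments, grid resolution, and NOx emission index below are invented and far coarser than the scenario's actual inventory:

```python
# Sketch of building a 3-D fuel-burn/emissions grid of the kind
# described above. Flight segments, grid resolution, and the NOx
# emission index are invented for illustration.
import numpy as np

# Fuel burned per flight segment: (lat, lon, altitude_km, fuel_kg)
segments = np.array([
    [ 40.6, -73.8, 10.5, 2400.0],
    [ 48.4,  -4.5, 11.0, 2100.0],
    [ 51.5,  -0.5, 10.5, 1900.0],
    [-33.9, 151.2,  9.5, 2600.0],
])

lat_edges = np.arange(-90, 91, 1.0)        # 1 degree latitude bins
lon_edges = np.arange(-180, 181, 1.0)      # 1 degree longitude bins
alt_edges = np.arange(0, 22, 1.0)          # 1 km altitude bins

fuel_grid, _ = np.histogramdd(
    segments[:, :3], bins=(lat_edges, lon_edges, alt_edges),
    weights=segments[:, 3])

EI_NOX = 14.0e-3                           # kg NOx per kg fuel (invented)
nox_grid = fuel_grid * EI_NOX
print("total fuel:", fuel_grid.sum(), "kg;",
      "total NOx:", round(nox_grid.sum(), 1), "kg")
```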

  6. Selecting a Persistent Data Support Environment for Object-Oriented Applications

    DTIC Science & Technology

    1998-03-01

    key features of most object DBMS products is contained in the <DWAS 9{eeds Assessment for Objects from Barry and Associates. The developer should...data structure and behavior in a self- contained module enhances maintainability of the system and promotes reuse of modules for similar domains...considered together, represent a survey of commercial object-oriented database management systems. These references contain detailed information needed

  7. Overcoming barriers to a research-ready national commercial claims database.

    PubMed

    Newman, David; Herrera, Carolina-Nicole; Parente, Stephen T

    2014-11-01

    Billions of dollars have been spent on the goal of making healthcare data available to clinicians and researchers in the hopes of improving healthcare and lowering costs. However, the problems of data governance, distribution, and accessibility remain challenges for the healthcare system to overcome. In this study, we discuss some of the issues around holding, reporting, and distributing data, including the newest "big data" challenge: making the data accessible to researchers and policy makers. This article presents a case study in "big healthcare data" involving the Health Care Cost Institute (HCCI). HCCI is a nonprofit, nonpartisan, independent research institute that serves as a voluntary repository of national commercial healthcare claims data. Governance of large healthcare databases is complicated by the data-holding model and further complicated by issues related to distribution to research teams. For multi-payer healthcare claims databases, the 2 most common models of data holding (mandatory and voluntary) have different data security requirements. Furthermore, data transport and accessibility may require technological investment. HCCI's efforts offer insights from which other data managers and healthcare leaders may benefit when contemplating a data collaborative.

  8. Multidisciplinary analysis of actively controlled large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Young, John W.; Sutter, Thomas R.

    1986-01-01

    The Control of Flexible Structures (COFS) program has supported the development of an analysis capability at the Langley Research Center called the Integrated Multidisciplinary Analysis Tool (IMAT) which provides an efficient data storage and transfer capability among commercial computer codes to aid in the dynamic analysis of actively controlled structures. IMAT is a system of computer programs which transfers Computer-Aided Design (CAD) configurations, structural finite element models, material property and stress information, structural and rigid-body dynamic model information, and linear system matrices for control law formulation among various commercial applications programs through a common database. Although general in its formulation, IMAT was developed specifically to aid in the evaluation of the structures. A description of the IMAT system and results of an application of the system are given.

  9. Managing Data in a GIS Environment

    NASA Technical Reports Server (NTRS)

    Beltran, Maria; Yiasemis, Haris

    1997-01-01

    A Geographic Information System (GIS) is a computer-based system that enables capture, modeling, manipulation, retrieval, analysis and presentation of geographically referenced data. A GIS operates in a dynamic environment of spatial and temporal information. This information is held in a database like any other information system, but performance is more of an issue for a geographic database than a traditional database due to the nature of the data. What distinguishes a GIS from other information systems is the spatial and temporal dimensions of the data and the volume of data (several gigabytes). Most traditional information systems are usually based around tables and textual reports, whereas GIS requires the use of cartographic forms and other visualization techniques. Much of the data can be represented using computer graphics, but a GIS is not a graphics database. A graphical system is concerned with the manipulation and presentation of graphical objects whereas a GIS handles geographic objects that have not only spatial dimensions but also non-visual attribute components. Furthermore, the nature of the data on which a GIS operates makes the traditional relational database approach inadequate for retrieving data and answering queries that reference spatial data. The purpose of this paper is to describe the efficiency issues behind storage and retrieval of data within a GIS database. Section 2 gives a general background on GIS, and describes the issues involved in custom vs. commercial and hybrid vs. integrated geographic information systems. Section 3 describes the efficiency issues concerning the management of data within a GIS environment. The paper ends with a summary of the main concerns of this paper.

  10. First Biomass Conference of the Americas: Energy, environment, agriculture, and industry. Proceedings, Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-10-01

    This conference was designed to provide a national and international forum to support the development of a viable biomass industry. Although papers on research activities and technologies under development that address industry problems comprised part of this conference, an effort was made to focus on scale-up and demonstration projects, technology transfer to end users, and commercial applications of biomass and wastes. The conference was divided into these major subject areas: Resource Base, Power Production, Transportation Fuels, Chemicals and Products, Environmental Issues, Commercializing Biomass Projects, Biomass Energy System Studies, and Biomass in Latin America. The papers in this second volume cover Transportation Fuels, and Chemicals and Products. Transportation Fuels topics include: Biodiesel, Pyrolytic Liquids, Ethanol, Methanol and Ethers, and Commercialization. The Chemicals and Products section includes specific topics in: Research, Technology Transfer, and Commercial Systems. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  11. Geologic Map of the Tucson and Nogales Quadrangles, Arizona (Scale 1:250,000): A Digital Database

    USGS Publications Warehouse

    Peterson, J.A.; Berquist, J.R.; Reynolds, S.J.; Page-Nedell, S. S.; Digital database by Oland, Gustav P.; Hirschberg, Douglas M.

    2001-01-01

    The geologic map of the Tucson-Nogales 1:250,000 scale quadrangle (Peterson and others, 1990) was digitized by U.S. Geological Survey staff and University of Arizona contractors at the Southwest Field Office, Tucson, Arizona, in 2000 for input into a geographic information system (GIS). The database was created for use as a basemap in a decision support system designed by the National Industrial Minerals and Surface Processes project. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. Additionally, point features, such as strike and dip, were not captured from the original paper map and are not included in the database. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  12. Species identification of corynebacteria by cellular fatty acid analysis.

    PubMed

    Van den Velde, Sandra; Lagrou, Katrien; Desmet, Koen; Wauters, Georges; Verhaegen, Jan

    2006-02-01

    We evaluated the usefulness of cellular fatty acid analysis for the identification of corynebacteria. Therefore, 219 well-characterized strains belonging to 21 Corynebacterium species were analyzed with the Sherlock System of MIDI (Newark, DE). Most Corynebacterium species have a qualitatively different fatty acid profile. Corynebacterium coyleae (subgroup 1), Corynebacterium riegelii, Corynebacterium simulans, and Corynebacterium imitans differ only quantitatively. Corynebacterium afermentans afermentans and C. coyleae (subgroup 2) have similar qualitative and quantitative profiles. The commercially available database (CLIN 40, MIDI) identified only one third of the 219 strains correctly at the species level. We created a new database with these 219 strains. This new database was tested with 34 clinical isolates and could identify 29 strains correctly. Strains that remained unidentified were 2 Corynebacterium aurimucosum (not included in our database), 1 C. afermentans afermentans, and 2 Corynebacterium pseudodiphtheriticum. Cellular fatty acid analysis with a self-created database can be used for the identification and differentiation of corynebacteria.
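Identification against a self-created fatty acid library reduces to nearest-profile matching. A toy sketch, with invented composition vectors and an invented similarity cutoff standing in for the Sherlock System's matching:

```python
# Sketch of matching an unknown cellular fatty acid profile against a
# self-created library, as described above. Species vectors and the
# similarity cutoff are invented for illustration.
import numpy as np

# Percent composition over three fatty acids -- invented values.
library = {
    "C. coyleae":              np.array([34.0, 40.0, 26.0]),
    "C. riegelii":             np.array([30.0, 50.0, 20.0]),
    "C. pseudodiphtheriticum": np.array([45.0, 25.0, 30.0]),
}

def identify(profile, cutoff=0.995):
    """Return the best cosine match, or None if below the cutoff."""
    best, best_sim = None, -1.0
    for name, ref in library.items():
        sim = profile @ ref / (np.linalg.norm(profile) * np.linalg.norm(ref))
        if sim > best_sim:
            best, best_sim = name, sim
    return (best, best_sim) if best_sim >= cutoff else (None, best_sim)

unknown = np.array([33.0, 42.0, 25.0])
print(identify(unknown))   # closest to the invented C. coyleae profile
```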

  13. An evaluation of information retrieval accuracy with simulated OCR output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, W.B.; Harding, S.M.; Taghva, K.

    Optical Character Recognition (OCR) is a critical part of many text-based applications. Although some commercial systems use the output from OCR devices to index documents without editing, there is very little quantitative data on the impact of OCR errors on the accuracy of a text retrieval system. Because of the difficulty of constructing test collections to obtain this data, we have carried out evaluation using simulated OCR output on a variety of databases. The results show that high quality OCR devices have little effect on the accuracy of retrieval, but low quality devices used with databases of short documents can result in significant degradation.
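The evaluation strategy, corrupt clean text with a synthetic character-error rate and measure the effect on retrieval, condenses to a few lines. A deliberately crude sketch with an invented corpus and error model:

```python
# Toy version of the evaluation described above: corrupt documents with
# a synthetic character-error rate, then compare simple term-matching
# retrieval on clean vs. corrupted text. Corpus and error model invented.
import random
import string

random.seed(42)

def simulate_ocr(text, char_error_rate):
    """Replace a fraction of letters at random, mimicking OCR errors."""
    chars = list(text)
    for i, c in enumerate(chars):
        if c.isalpha() and random.random() < char_error_rate:
            chars[i] = random.choice(string.ascii_lowercase)
    return "".join(chars)

docs = ["commercial database retrieval accuracy",
        "optical character recognition of short documents",
        "image processing for planetary science"]
query = {"database", "retrieval"}

def score(doc):
    """Count query terms present in the document (crude retrieval model)."""
    return len(query & set(doc.split()))

for rate in (0.0, 0.05, 0.20):          # high- vs. low-quality OCR devices
    noisy = [simulate_ocr(d, rate) for d in docs]
    best = max(range(len(noisy)), key=lambda i: score(noisy[i]))
    print(f"error rate {rate:.2f}: best doc {best}, score {score(noisy[best])}")
```

As the error rate rises, match scores drop toward zero, which is the short-document degradation the paper measures.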

  14. System Study: Emergency Power System 1998–2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-02-01

    This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2013 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant trends were identified in the EPS results.

  15. IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  16. Databases: Beyond the Basics.

    ERIC Educational Resources Information Center

    Whittaker, Robert

    This presented paper offers an elementary description of database characteristics and then provides a survey of databases that may be useful to the teacher and researcher in Slavic and East European languages and literatures. The survey focuses on commercial databases that are available, usable, and needed. Individual databases discussed include:…

  17. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance. Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-09

    This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. These data currently span the period from November 10, 2012 through May 31, 2014 and are anticipated to be extended through November 2014. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  18. Ground Support Software for Spaceborne Instrumentation

    NASA Technical Reports Server (NTRS)

    Anicich, Vincent; Thorpe, Rob; Fletcher, Greg; Waite, Hunter; Xu, Hykua; Walter, Erin; Frick, Kristie; Farris, Greg; Gell, Dave; Furman, Judy

    2004-01-01

    ION is a system of ground support software for the ion and neutral mass spectrometer (INMS) instrument aboard the Cassini spacecraft. By incorporating commercial off-the-shelf database, Web server, and Java application components, ION offers considerably more ground-support-service capability than was available previously. A member of the team that operates the INMS or a scientist who uses the data collected by the INMS can gain access to most of the services provided by ION via a standard point-and-click hyperlink interface generated by almost any Web-browser program running in almost any operating system on almost any computer. Data are stored in one central location in a relational database in a non-proprietary format, are accessible in many combinations and formats, and can be combined with data from other instruments and spacecraft. The use of the Java programming language as a system-interface language offers numerous capabilities for object-oriented programming and for making the database accessible to participants using a variety of computer hardware and software.

  19. Analysis of commercial and public bioactivity databases.

    PubMed

    Tiikkainen, Pekka; Franke, Lutz

    2012-02-27

    Activity data for small molecules are invaluable in chemoinformatics. Various bioactivity databases exist containing detailed information of target proteins and quantitative binding data for small molecules extracted from journals and patents. In the current work, we have merged several public and commercial bioactivity databases into one bioactivity metabase. The molecular presentation, target information, and activity data of the vendor databases were standardized. The main motivation of the work was to create a single relational database which allows fast and simple data retrieval by in-house scientists. Second, we wanted to know the amount of overlap between databases by commercial and public vendors to see whether the former contain data complementing the latter. Third, we quantified the degree of inconsistency between data sources by comparing data points derived from the same scientific article cited by more than one vendor. We found that each data source contains unique data which is due to different scientific articles cited by the vendors. When comparing data derived from the same article we found that inconsistencies between the vendors are common. In conclusion, using databases of different vendors is still useful since the data overlap is not complete. It should be noted that this can be partially explained by the inconsistencies and errors in the source data.
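The first step of such a merge, standardizing each vendor's molecular representations so overlaps can be counted, can be sketched with canonical SMILES. RDKit stands in here for whatever standardization pipeline the authors used, and the two vendor lists are invented:

```python
# Sketch of the merge/overlap analysis described above: standardize
# each vendor's structures to canonical SMILES, then intersect sets.
# RDKit is a stand-in standardizer; the vendor lists are invented.
from rdkit import Chem

vendor_a = ["C(C)O", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]   # "public"
vendor_b = ["OCC", "CN1C=NC2=C1C(=O)N(C)C(=O)N2C"]          # "commercial"

def canonical_set(smiles_list):
    """Map raw SMILES to canonical SMILES, dropping parse failures."""
    out = set()
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        if mol is not None:                 # structure filtration step
            out.add(Chem.MolToSmiles(mol))
    return out

a, b = canonical_set(vendor_a), canonical_set(vendor_b)
print("A only:", len(a - b), "B only:", len(b - a),
      "overlap:", len(a & b))   # 'C(C)O' and 'OCC' are the same ethanol
```

Pairwise counts like these, computed over every pair of sources, yield the overlap matrix the paper describes.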

  20. Methods for Estimating Withdrawal and Return Flow by Census Block for 2005 and 2020 for New Hampshire

    USGS Publications Warehouse

    Hayes, Laura; Horn, Marilee A.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the New Hampshire Department of Environmental Services, estimated the amount of water demand, consumptive use, withdrawal, and return flow for each U.S. Census block in New Hampshire for the years 2005 (current) and 2020. Estimates of domestic, commercial, industrial, irrigation, and other nondomestic water use were derived through the use and innovative integration of several State and Federal databases, and by use of previously developed techniques. The New Hampshire Water Demand database was created as part of this study to store and integrate State of New Hampshire data central to the project. Within the New Hampshire Water Demand database, a lookup table was created to link the State databases and identify water users common to more than one database. The lookup table also allowed identification of withdrawal and return-flow locations of registered and unregistered commercial, industrial, agricultural, and other nondomestic users. Geographic information system data from the State were used in combination with U.S. Census Bureau spatial data to locate and quantify withdrawals and return flow for domestic users in each census block. Analyzing and processing the most recently available data resulted in census-block estimations of 2005 water use. Applying population projections developed by the State to the data sets enabled projection of water use for the year 2020. The results for each census block are stored in the New Hampshire Water Demand database and may be aggregated to larger political areas or watersheds to assess relative hydrologic stress on the basis of current and potential water availability.
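The lookup table linking State databases amounts to record linkage through a shared user identifier. A minimal sketch with SQLite; all table names, keys, and flow values are invented:

```python
# Sketch of the lookup-table linkage described above: one table maps a
# common user id to each source database's native key, so withdrawals
# and return flows can be joined per user. All names/data are invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE withdrawal (wd_key TEXT PRIMARY KEY, mgal_d REAL);
    CREATE TABLE return_flow (rf_key TEXT PRIMARY KEY, mgal_d REAL);
    CREATE TABLE user_lookup (user_id INTEGER, wd_key TEXT, rf_key TEXT);

    INSERT INTO withdrawal VALUES ('WD-17', 1.20), ('WD-22', 0.35);
    INSERT INTO return_flow VALUES ('RF-03', 0.90), ('RF-08', 0.30);
    INSERT INTO user_lookup VALUES (1, 'WD-17', 'RF-03'),
                                   (2, 'WD-22', 'RF-08');
""")

rows = db.execute("""
    SELECT u.user_id, w.mgal_d AS withdrawn, r.mgal_d AS returned,
           w.mgal_d - r.mgal_d AS consumptive_use
    FROM user_lookup u
    JOIN withdrawal  w ON w.wd_key = u.wd_key
    JOIN return_flow r ON r.rf_key = u.rf_key
""").fetchall()
print(rows)   # per-user consumptive use from two linked databases
```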

  1. How ISO/IEC 17799 can be used for base lining information assurance among entities using data mining for defense, homeland security, commercial, and other civilian/commercial domains

    NASA Astrophysics Data System (ADS)

    Perry, William G.

    2006-04-01

    One goal of database mining is to draw unique and valid perspectives from multiple data sources. Insights that are fashioned from closely-held data stores are likely to possess a high degree of reliability. The degree of information assurance comes into question, however, when external databases are accessed, combined and analyzed to form new perspectives. ISO/IEC 17799, Information technology-Security techniques-Code of practice for information security management, can be used to establish a higher level of information assurance among disparate entities using data mining in the defense, homeland security, commercial and other civilian/commercial domains. Organizations that meet ISO/IEC information security standards have identified and assessed risks, threats and vulnerabilities and have taken significant proactive steps to meet their unique security requirements. The ISO standards address twelve domains: risk assessment and treatment, security policy, organization of information security, asset management, human resources security, physical and environmental security, communications and operations management, access control, information systems acquisition, development and maintenance, information security incident management and business continuity management and compliance. Analysts can be relatively confident that if organizations are ISO 17799 compliant, a high degree of information assurance is likely to be a characteristic of the data sets being used. The reverse may be true. Extracting, fusing and drawing conclusions based upon databases with a low degree of information assurance may be wrought with all of the hazards that come from knowingly using bad data to make decisions. Using ISO/IEC 17799 as a baseline for information assurance can help mitigate these risks.

  2. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  3. The Houston Academy of Medicine--Texas Medical Center Library management information system.

    PubMed Central

    Camille, D; Chadha, S; Lyders, R A

    1993-01-01

    A management information system (MIS) provides a means for collecting, reporting, and analyzing data from all segments of an organization. Such systems are common in business but rare in libraries. The Houston Academy of Medicine-Texas Medical Center Library developed an MIS that operates on a system of networked IBM PCs and Paradox, a commercial database software package. The data collected in the system include monthly reports, client profile information, and data collected at the time of service requests. The MIS assists with enforcement of library policies, ensures that correct information is recorded, and provides reports for library managers. It also can be used to help answer a variety of ad hoc questions. Future plans call for the development of an MIS that could be adapted to other libraries' needs, and a decision-support interface that would facilitate access to the data contained in the MIS databases. PMID:8251972

  4. Quantitative assessment of the expanding complementarity between public and commercial databases of bioactive compounds.

    PubMed

    Southan, Christopher; Várkonyi, Péter; Muresan, Sorel

    2009-07-06

Since 2004 public cheminformatic databases and their collective functionality for exploring relationships between compounds, protein sequences, literature and assay data have advanced dramatically. In parallel, commercial sources that extract and curate such relationships from journals and patents have also been expanding. This work updates a previous comparative study of databases chosen because of their bioactive content, availability of downloads and facility to select informative subsets. Where they could be calculated, compounds-extracted-per-journal-article figures were in the range of 12 to 19, but compounds-per-protein counts increased with document numbers. Chemical structure filtration to facilitate standardised comparisons typically reduced source counts by between 5% and 30%. The pair-wise overlaps between 23 databases and subsets were determined, as well as changes between 2006 and 2008. While all compound sets have increased, PubChem has doubled to 14.2 million. The 2008 comparison matrix shows not only overlap but also unique content across all sources. Many of the detailed differences could be attributed to individual strategies for data selection and extraction. While there was a large increase in patent-derived structures entering PubChem since 2006, GVKBIO contains over 0.8 million unique structures from this source. Venn diagrams showed extensive overlap between compounds extracted by independent expert curation from journals by GVKBIO, WOMBAT (both commercial) and BindingDB (public), but each included unique content. In contrast, the approved drug collections from GVKBIO, MDDR (commercial) and DrugBank (public) showed surprisingly low overlap. Aggregating all commercial sources established that while 1 million compounds overlapped with PubChem, 1.2 million did not. On the basis of chemical structure content per se, public sources have covered an increasing proportion of commercial databases over the last two years. However, commercial products included in this study provide links between compounds and information from patents and journals at a larger scale than current public efforts. They also continue to capture a significant proportion of unique content. Our results thus demonstrate not only an encouraging overall expansion of data-supported bioactive chemical space but also that both commercial and public sources are complementary for its exploration.
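
    The pair-wise overlap matrix at the heart of this comparison reduces to set operations once each source is represented by standardized structure identifiers. The sketch below assumes hypothetical sources and identifier sets (standing in for, e.g., InChIKeys); the names and keys are placeholders, not real data.

        from itertools import combinations

        # Each source reduced to a set of standardized structure identifiers.
        sources = {
            "public_A":     {"KEY1", "KEY2", "KEY3", "KEY4"},
            "public_B":     {"KEY3", "KEY4", "KEY5"},
            "commercial_C": {"KEY2", "KEY4", "KEY5", "KEY6"},
        }

        # Pair-wise overlap and unique content, as in a comparison matrix.
        for (name_a, set_a), (name_b, set_b) in combinations(sources.items(), 2):
            shared = len(set_a & set_b)
            only_a = len(set_a - set_b)
            only_b = len(set_b - set_a)
            print(f"{name_a} vs {name_b}: shared={shared}, "
                  f"unique to {name_a}={only_a}, unique to {name_b}={only_b}")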

  5. Sponsored Schools and Commercialized Classrooms: Schoolhouse Commercializing Trends in the 1990's.

    ERIC Educational Resources Information Center

    Molnar, Alex

    This report analyzes commercializing trends in America's schools and classrooms, using data from database searches in seven categories of schoolhouse commercialism in the period 1990-97. The number of citations relating to commercializing activities can provide only a rough approximation of the scope and development of the phenomenon. The number…

  6. System Study: Reactor Core Isolation Cooling 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the reactor core isolation cooling (RCIC) system at 31 U.S. commercial boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant trends were identified in the RCIC results.
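
    The trend evaluation mentioned here amounts to asking whether yearly unreliability estimates drift over time. A minimal sketch of one common approach, an ordinary least-squares slope with a t-statistic, is shown below; the yearly values are invented, and the actual report applies its own statistical methodology.

        import statistics

        years = list(range(1998, 2015))
        # Hypothetical yearly system unreliability estimates.
        unreliability = [0.051, 0.049, 0.053, 0.048, 0.050, 0.052, 0.047,
                         0.049, 0.051, 0.050, 0.048, 0.052, 0.049, 0.051,
                         0.050, 0.049, 0.050]

        # Ordinary least-squares slope and its standard error.
        n = len(years)
        xbar = statistics.mean(years)
        ybar = statistics.mean(unreliability)
        sxx = sum((x - xbar) ** 2 for x in years)
        slope = sum((x - xbar) * (y - ybar)
                    for x, y in zip(years, unreliability)) / sxx
        resid = [y - (ybar + slope * (x - xbar))
                 for x, y in zip(years, unreliability)]
        se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5

        # |t| well below ~2 suggests no statistically significant trend.
        print(f"slope = {slope:.2e} per year, t = {slope / se:.2f}")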

  7. System Study: Auxiliary Feedwater 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the auxiliary feedwater (AFW) system at 69 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the AFW results.

  8. Materials, processes, and environmental engineering network

    NASA Technical Reports Server (NTRS)

    White, Margo M.

    1993-01-01

The Materials, Processes, and Environmental Engineering Network (MPEEN) was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory. It contains information from other NASA centers and outside agencies, and also includes the NASA Environmental Information System (NEIS) and Failure Analysis Information System (FAIS) data. Environmental replacement materials information is a newly developed focus of MPEEN, held in the NEIS, which is accessible through MPEEN. NEIS contains an environmental replacement technology database covering materials identified by the NASA Operational Environment Team (NOET) as hazardous to the environment; for these materials, control or replacement strategies are formed, and usage and performance characteristics are recorded. In addition to addressing environmental concerns, MPEEN contains one of the largest materials databases in the world. Over 600 users access this network on a daily basis. There is information available on failure analysis, metals and nonmetals testing, materials properties, standard and commercial parts, foreign alloy cross-reference, Long Duration Exposure Facility (LDEF) data, and Materials and Processes Selection List data.

  9. Quantification of missing prescriptions in commercial claims databases: results of a cohort study.

    PubMed

    Cepeda, Maria Soledad; Fife, Daniel; Denarié, Michel; Bradford, Dan; Roy, Stephanie; Yuan, Yingli

    2017-04-01

This study aims to quantify the magnitude of missed dispensings in commercial claims databases. A retrospective cohort study was conducted, linking PharMetrics, a commercial claims database, to a prescription database (LRx) that captures pharmacy dispensings independently of payment method, including cash transactions. We included adults with dispensings for opioids, diuretics, antiplatelet medications, or anticoagulants. To determine the degree of capture of dispensings, we calculated the number of subjects with the following: (1) the same number of dispensings in both databases; (2) at least one dispensing, but not all dispensings, missed in PharMetrics; and (3) all dispensings missing in PharMetrics. Similar analyses were conducted using dispensings as the unit of analysis. For an LRx dispensing to count as captured, PharMetrics had to contain a dispensing of the same medication class within ±7 days. A total of 1 426 498 subjects were included. Overall, 68% of subjects had the same number of dispensings in both databases. In 13% of subjects, PharMetrics identified ≥1 dispensing but also missed ≥1 dispensing. In 19% of the subjects, PharMetrics missed all the dispensings. Taking dispensings as the unit of analysis, 25% of the dispensings present in LRx were not captured in PharMetrics. These patterns were similar across all four classes of medications. Of the dispensings missing in PharMetrics, 48% involved a subject who had >1 health insurance plan. Commercial claims databases provide an incomplete picture of all prescriptions dispensed to patients. The lack of capture goes beyond cash transactions and potentially introduces substantial misclassification bias. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
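
    The ±7-day, same-class matching rule is simple to state in code. Below is a hedged sketch with invented subject identifiers, classes, and dates; the study's actual linkage logic is more involved.

        from datetime import date, timedelta

        # Hypothetical dispensing records: (subject, medication class, date).
        lrx = [
            ("subj1", "opioid", date(2015, 3, 1)),
            ("subj1", "opioid", date(2015, 4, 2)),
            ("subj2", "diuretic", date(2015, 5, 10)),
        ]
        claims = [
            ("subj1", "opioid", date(2015, 3, 4)),         # matches 1st LRx row
            ("subj2", "antiplatelet", date(2015, 5, 11)),  # wrong class: no match
        ]

        WINDOW = timedelta(days=7)

        def captured(dispensing, claims_records):
            """An LRx dispensing is captured if the claims database has a
            same-subject, same-class dispensing within +/-7 days."""
            subj, med_class, when = dispensing
            return any(c_subj == subj and c_class == med_class
                       and abs(c_when - when) <= WINDOW
                       for c_subj, c_class, c_when in claims_records)

        n_captured = sum(captured(d, claims) for d in lrx)
        print(f"captured {n_captured} of {len(lrx)} dispensings "
              f"({100 * (1 - n_captured / len(lrx)):.0f}% missed)")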

  10. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
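
    A terrain database integrity monitor ultimately reduces sensed-versus-stored elevation disparities to a test statistic compared against a threshold. The sketch below shows the shape of such a check; the statistic, threshold, and numbers are placeholders rather than the paper's actual formulation.

        # Elevations along the flight track, in meters: one profile sensed
        # (e.g., synthesized from LiDAR/IMU/GPS), one from the stored database.
        elevations_sensed = [312.0, 315.5, 319.2, 325.0, 331.8]
        elevations_stored = [311.4, 316.0, 318.5, 326.1, 330.9]

        disparities = [s - d for s, d in
                       zip(elevations_sensed, elevations_stored)]
        mean_disparity = sum(disparities) / len(disparities)
        mean_abs_disparity = sum(abs(x) for x in disparities) / len(disparities)

        THRESHOLD_M = 5.0  # hypothetical alert limit, not from the paper
        if mean_abs_disparity > THRESHOLD_M:
            print("terrain database integrity alert")
        else:
            print(f"consistent: mean disparity {mean_disparity:+.2f} m, "
                  f"mean |disparity| {mean_abs_disparity:.2f} m")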

  11. International Energy: Subject Thesaurus. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The International Energy Agency: Subject Thesaurus contains the standard vocabulary of indexing terms (descriptors) developed and structured to build and maintain energy information databases. Involved in this cooperative task are (1) the technical staff of the USDOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the International Energy Agency's Energy Technology Data Exchange (ETDE) and (2) the International Atomic Energy Agency's International Nuclear Information System (INIS) staff representing the more than 100 countries and organizations that record and index information for the international nuclear information community. ETDE member countries are also members of INIS. Nuclear information prepared for INIS by ETDE member countries is included in the ETDE Energy Database, which contains the online equivalent of the printed INIS Atomindex. Indexing terminology is therefore cooperatively standardized for use in both information systems. This structured vocabulary reflects the scope of international energy research, development, and technological programs. The terminology of this thesaurus aids in subject searching on commercial systems, such as "Energy Science & Technology" by DIALOG Information Services, "Energy" by STN International, and the "ETDE Energy Database" by SilverPlatter. It is also the thesaurus for the Integrated Technical Information System (ITIS) online databases of the US Department of Energy.

  12. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications and plans), is stored in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meet the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
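
    With requirements and verification records in relational form, a compliance report becomes a join. The sketch below uses SQLite through Python purely as illustration; the schema and records are invented, not taken from the ISWE project or its COTS tool.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE requirement (req_id TEXT PRIMARY KEY, text TEXT);
        CREATE TABLE verification (ver_id TEXT PRIMARY KEY, req_id TEXT,
                                   method TEXT, status TEXT);
        INSERT INTO requirement VALUES
          ('R-001', 'Weld chamber shall hold the specified vacuum'),
          ('R-002', 'Controller shall log all faults');
        INSERT INTO verification VALUES
          ('V-001', 'R-001', 'test', 'complete'),
          ('V-002', 'R-002', 'inspection', 'open');
        """)

        # Compliance report: every requirement with its verification status;
        # a LEFT JOIN keeps requirements that have no verification yet.
        for row in con.execute("""
            SELECT r.req_id, v.method, COALESCE(v.status, 'unverified')
            FROM requirement r LEFT JOIN verification v ON v.req_id = r.req_id
            ORDER BY r.req_id"""):
            print(row)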

  13. Towards a low-cost mobile subcutaneous vein detection solution using near-infrared spectroscopy.

    PubMed

    Juric, Simon; Flis, Vojko; Debevc, Matjaz; Holzinger, Andreas; Zalik, Borut

    2014-01-01

Excessive venipunctures are both time- and resource-consuming events, which cause anxiety, pain, and distress in patients, or can lead to severe harmful injuries. We propose a low-cost mobile health solution for subcutaneous vein detection using near-infrared spectroscopy, along with an assessment of the current state of the art in this field. The first objective of this study was to get a deeper overview of the research topic, through the initial team discussions and a detailed literature review (using both academic and grey literature). The second objective, identifying the commercial systems employing near-infrared spectroscopy, was pursued using the PubMed database. The goal of the third objective was to identify and evaluate (using the IEEE Xplore database) the research efforts in the field of low-cost near-infrared imaging in general, as a basis for the conceptual model of the upcoming prototype. Although the reviewed commercial devices have demonstrated usefulness and value for peripheral vein visualization, other evaluated clinical outcomes are less conclusive. Previous studies regarding low-cost near-infrared systems demonstrated the general feasibility of developing cost-effective vein detection systems; however, their limitations restrict their applicability to clinical practice. Finally, based on the current findings, we outline the future research direction.

  14. Towards a Low-Cost Mobile Subcutaneous Vein Detection Solution Using Near-Infrared Spectroscopy

    PubMed Central

    Flis, Vojko; Debevc, Matjaz; Holzinger, Andreas; Zalik, Borut

    2014-01-01

Excessive venipunctures are both time- and resource-consuming events, which cause anxiety, pain, and distress in patients, or can lead to severe harmful injuries. We propose a low-cost mobile health solution for subcutaneous vein detection using near-infrared spectroscopy, along with an assessment of the current state of the art in this field. The first objective of this study was to get a deeper overview of the research topic, through the initial team discussions and a detailed literature review (using both academic and grey literature). The second objective, identifying the commercial systems employing near-infrared spectroscopy, was pursued using the PubMed database. The goal of the third objective was to identify and evaluate (using the IEEE Xplore database) the research efforts in the field of low-cost near-infrared imaging in general, as a basis for the conceptual model of the upcoming prototype. Although the reviewed commercial devices have demonstrated usefulness and value for peripheral vein visualization, other evaluated clinical outcomes are less conclusive. Previous studies regarding low-cost near-infrared systems demonstrated the general feasibility of developing cost-effective vein detection systems; however, their limitations restrict their applicability to clinical practice. Finally, based on the current findings, we outline the future research direction. PMID:24883388

  15. Features of commercial computer software systems for medical examiners and coroners.

    PubMed

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  16. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

Some characteristics found to be different for scientific and administrative data bases are identified, and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are those that are especially stringent for either scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  17. REPDOSE: A database on repeated dose toxicity studies of commercial chemicals--A multifunctional tool.

    PubMed

    Bitsch, A; Jacobi, S; Melber, C; Wahnschaffe, U; Simetska, N; Mangelsdorf, I

    2006-12-01

A database for repeated dose toxicity data has been developed. Studies were selected for data quality: review documents or risk assessments were used to obtain a pre-screened selection of the available valid data, and chemical structures were kept rather simple so that well-defined chemical categories could be formed. The database consists of three core data sets for each chemical: (1) structural features and physico-chemical data, (2) data on study design, and (3) study results. To allow consistent queries, a high degree of standardization was applied: categories and glossaries were developed for the relevant parameters. At present, the database covers 364 chemicals investigated in 1018 studies, which yielded a total of 6002 specific effects. Standard queries have been developed that allow analysis of the influence of structural features or physico-chemical data on LOELs, target organs, and effects. Furthermore, the database can be used as an expert system. First queries have shown that the database is a very valuable tool.
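
    A standard query of the kind described, relating structural category to target organs and lowest LOELs, is sketched below against an invented schema; REPDOSE's actual tables, fields, and data are assumptions here.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE chemical (chem_id INTEGER PRIMARY KEY, category TEXT);
        CREATE TABLE effect (chem_id INTEGER, target_organ TEXT,
                             loel_mg_kg REAL);
        INSERT INTO chemical VALUES (1, 'chlorobenzenes'),
                                    (2, 'chlorobenzenes'),
                                    (3, 'aliphatic alcohols');
        INSERT INTO effect VALUES (1, 'liver', 50.0), (2, 'liver', 25.0),
                                  (2, 'kidney', 100.0), (3, 'liver', 500.0);
        """)

        # Lowest LOEL and effect count per structural category and organ.
        for row in con.execute("""
            SELECT c.category, e.target_organ,
                   COUNT(*) AS n_effects, MIN(e.loel_mg_kg) AS lowest_loel
            FROM chemical c JOIN effect e ON e.chem_id = c.chem_id
            GROUP BY c.category, e.target_organ
            ORDER BY c.category, lowest_loel"""):
            print(row)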

  18. Normative Databases for Imaging Instrumentation.

    PubMed

    Realini, Tony; Zangwill, Linda M; Flanagan, John G; Garway-Heath, David; Patella, Vincent M; Johnson, Chris A; Artes, Paul H; Gaddie, Ian B; Fingeret, Murray

    2015-08-01

    To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer's database differs in size, eligibility criteria, and ethnic make-up, among other key features. The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments.

  19. Normative Databases for Imaging Instrumentation

    PubMed Central

    Realini, Tony; Zangwill, Linda; Flanagan, John; Garway-Heath, David; Patella, Vincent Michael; Johnson, Chris; Artes, Paul; Ben Gaddie, I.; Fingeret, Murray

    2015-01-01

    Purpose To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. Methods A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Results Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer’s database differs in size, eligibility criteria, and ethnic make-up, among other key features. Conclusions The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments. PMID:25265003

  20. Online bibliographic sources in hydrology

    USGS Publications Warehouse

    Wild, Emily C.; Havener, W. Michael

    2001-01-01

    Traditional commercial bibliographic databases and indexes provide some access to hydrology materials produced by the government; however, these sources do not provide comprehensive coverage of relevant hydrologic publications. This paper discusses bibliographic information available from the federal government and state geological surveys, water resources agencies, and depositories. In addition to information in these databases, the paper describes the scope, styles of citing, subject terminology, and the ways these information sources are currently being searched, formally and informally, by hydrologists. Information available from the federal and state agencies and from the state depositories might be missed by limiting searches to commercially distributed databases.

  1. A collaborative computer auditing system under SOA-based conceptual model

    NASA Astrophysics Data System (ADS)

    Cong, Qiushi; Huang, Zuoming; Hu, Jibing

    2013-03-01

Some of the current challenges of computer auditing are the obstacles to retrieving, converting and translating data from different database schema. During the last few years, many data exchange standards have been under continuous development, such as Extensible Business Reporting Language (XBRL). These XML document standards can be used for data exchange among companies, financial institutions, and audit firms. However, for many companies, it is still expensive and time-consuming to translate and provide XML messages with commercial application packages, because it is complicated and laborious to search and transform data from thousands of tables in the ERP databases. How to transfer transaction documents between audit firms and their client companies in support of continuous auditing or real-time auditing is an important topic. In this paper, a collaborative computer auditing system under an SOA-based conceptual model is proposed. By utilizing the widely used XML document standards and existing data transformation applications developed by different companies and software vendors, we can wrap these applications as commercial web services that can easily be implemented under the forthcoming application environment: service-oriented architecture (SOA). Under SOA environments, the multiagency mechanism will help data assurance services mature and gain popularity over the Internet. Wrapping data transformation components for heterogeneous databases or platforms will create new component markets composed of many software vendors and assurance service companies that provide data assurance services for audit firms, regulators, or third parties.

  2. Real-Time Integrity Monitoring of Stored Geo-Spatial Data Using Forward-Looking Remote Sensing Technology

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt

    2002-01-01

    Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.

  3. JANE, A new information retrieval system for the Radiation Shielding Information Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trubey, D.K.

A new information storage and retrieval system has been developed for the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory to replace mainframe systems that have become obsolete. The database contains citations and abstracts of literature which were selected by RSIC analysts and indexed with terms from a controlled vocabulary. The database, begun in 1963, has been maintained continuously since that time. The new system, called JANE, incorporates automatic indexing techniques and on-line retrieval using the RSIC Data General Eclipse MV/4000 minicomputer. Automatic indexing and retrieval techniques based on fuzzy-set theory allow the presentation of results in order of Retrieval Status Value. The fuzzy-set membership function depends on term frequency in the titles and abstracts and on Term Discrimination Values which indicate the resolving power of the individual terms. These values are determined by the Cover Coefficient method. The use of a commercial database to store and retrieve the indexing information permits rapid retrieval of the stored documents. Comparisons of the new and presently used systems for actual searches of the literature indicate that it is practical to replace the mainframe systems with a minicomputer system similar to the present version of JANE. 18 refs., 10 figs.

  4. First biomass conference of the Americas: Energy, environment, agriculture, and industry. Proceedings, Volume 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-10-01

This conference was designed to provide a national and international forum to support the development of a viable biomass industry. Although papers on research activities and technologies under development that address industry problems comprised part of this conference, an effort was made to focus on scale-up and demonstration projects, technology transfer to end users, and commercial applications of biomass and wastes. The conference was divided into these major subject areas: Resource Base, Power Production, Transportation Fuels, Chemicals and Products, Environmental Issues, Commercializing Biomass Projects, Biomass Energy System Studies, and Biomass in Latin America. The papers in this third volume deal with Environmental Issues, Biomass Energy System Studies, and Biomass in Latin America. Concerning Environmental Issues, the following topics are emphasized: Global Climate Change, Biomass Utilization, Biofuel Test Procedures, and Commercialization of Biomass Products. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  5. SiC: An Agent Based Architecture for Preventing and Detecting Attacks to Ubiquitous Databases

    NASA Astrophysics Data System (ADS)

    Pinzón, Cristian; de Paz, Yanira; Bajo, Javier; Abraham, Ajith; Corchado, Juan M.

One of the main attacks on ubiquitous databases is the structured query language (SQL) injection attack, which causes severe damage both in the commercial aspect and in the user's confidence. This chapter proposes the SiC architecture as a solution to the SQL injection attack problem. This is a hierarchical distributed multiagent architecture, which involves an entirely new approach with respect to existing architectures for the prevention and detection of SQL injections. SiC incorporates a kind of intelligent agent, which integrates a case-based reasoning system. This agent, which is the core of the architecture, allows the application of detection techniques based on anomalies as well as those based on patterns, providing a great degree of autonomy, flexibility, robustness and dynamic scalability. The characteristics of the multiagent system allow the architecture to detect attacks from different types of devices, regardless of physical location. The architecture has been tested on a medical database, guaranteeing safe access from various devices such as PDAs and notebook computers.
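
    For contrast with the anomaly- and pattern-based techniques the SiC agents apply, the toy detector below checks input against a handful of textbook injection signatures. It is a deliberately naive stand-in for illustration, not the chapter's method.

        import re

        # Naive signature list; real detectors (like SiC's agents) go far
        # beyond fixed patterns, e.g., case-based reasoning over query anomalies.
        SUSPICIOUS = [
            re.compile(r"(?i)\bunion\b.*\bselect\b"),           # UNION extraction
            re.compile(r"(?i)\bor\b\s+'?\d+'?\s*=\s*'?\d+'?"),  # tautology OR 1=1
            re.compile(r"--|/\*"),                              # comment truncation
            re.compile(r";\s*\w"),                              # stacked statements
        ]

        def looks_like_injection(user_input: str) -> bool:
            return any(p.search(user_input) for p in SUSPICIOUS)

        for s in ["O'Brien", "x' OR 1=1 --", "1; DROP TABLE patients"]:
            print(repr(s), "->",
                  "suspicious" if looks_like_injection(s) else "ok")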

  6. Transport and Environment Database System (TRENDS): Maritime air pollutant emission modelling

    NASA Astrophysics Data System (ADS)

    Georgakaki, Aliki; Coffey, Robert A.; Lock, Graham; Sorenson, Spencer C.

This paper reports the development of the maritime module within the framework of the Transport and Environment Database System (TRENDS) project. A detailed database has been constructed for the calculation of energy consumption and air pollutant emissions. Based on an in-house database of commercial vessels kept at the Technical University of Denmark, relationships between the fuel consumption and size of different vessels have been developed, taking into account the fleet's age and service speed. The technical assumptions and factors incorporated in the database are presented, including changes from findings reported in Methodologies for Estimating air pollutant Emissions from Transport (MEET). The database operates on statistical data provided by Eurostat, which describe vessel and freight movements from and towards EU 15 major ports. Data are at port to Maritime Coastal Area (MCA) level, so a bottom-up approach is used. A port-to-MCA distance database has also been constructed for the purpose of the study. This was the first attempt to use Eurostat maritime statistics for emission modelling, and the problems encountered are discussed, since the statistical data collection was not undertaken with this purpose in view. Examples of the results obtained by the database are presented. These include detailed air pollutant emission calculations for bulk carriers entering the port of Helsinki, as an example of the database operation, and aggregate results for different types of movements for France. Overall estimates of SOx and NOx emissions caused by shipping traffic between the EU 15 countries are in the area of 1 and 1.5 million tonnes, respectively.
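
    The bottom-up calculation pairs a movement (distance, speed) with a vessel's fuel-consumption relationship and per-tonne emission factors. The sketch below shows that chain; the consumption figure and emission factors are illustrative placeholders, not the values derived from the Technical University of Denmark fleet data.

        def movement_emissions(distance_nm, service_speed_kn, fuel_t_per_day,
                               ef_sox_kg_per_t=54.0, ef_nox_kg_per_t=87.0):
            """Return (SOx, NOx) in kg for one port-to-MCA voyage leg.
            Emission factors are placeholder values per tonne of fuel."""
            hours_at_sea = distance_nm / service_speed_kn
            fuel_burned_t = fuel_t_per_day * hours_at_sea / 24.0
            return (fuel_burned_t * ef_sox_kg_per_t,
                    fuel_burned_t * ef_nox_kg_per_t)

        # Example: a bulk carrier on a 600 nm leg at 14 knots, burning
        # 30 t of fuel per day at service speed (all numbers invented).
        sox_kg, nox_kg = movement_emissions(600, 14, 30)
        print(f"SOx = {sox_kg:.0f} kg, NOx = {nox_kg:.0f} kg")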

  7. QSAR Modeling Using Large-Scale Databases: Case Study for HIV-1 Reverse Transcriptase Inhibitors.

    PubMed

    Tarasova, Olga A; Urusova, Aleksandra F; Filimonov, Dmitry A; Nicklaus, Marc C; Zakharov, Alexey V; Poroikov, Vladimir V

    2015-07-27

Large-scale databases are important sources of training sets for various QSAR modeling approaches. Generally, these databases contain information extracted from different sources. This variety of sources can produce inconsistency in the data, defined as sometimes widely diverging activity results for the same compound against the same target. Because such inconsistency can reduce the accuracy of predictive models built from these data, we address the question of how best to use data from publicly and commercially accessible databases to create accurate and predictive QSAR models. We investigate the suitability of commercially and publicly available databases for QSAR modeling of antiviral activity (HIV-1 reverse transcriptase (RT) inhibition). We present several methods for the creation of modeling (i.e., training and test) sets from two databases, one commercial and one freely available: Thomson Reuters Integrity and ChEMBL. We found that the typical predictivities of QSAR models obtained using these different modeling set compilation methods differ significantly from each other. The best results were obtained using training sets compiled for compounds tested using only one method and material (i.e., a specific type of biological assay). Compound sets aggregated by target only typically yielded poorly predictive models. We discuss the possibility of "mix-and-matching" assay data across aggregating databases such as ChEMBL and Integrity and their current severe limitations for this purpose. One of them is the general lack of complete and semantic/computer-parsable descriptions of assay methodology carried by these databases that would allow one to determine mix-and-matchability of result sets at the assay level.
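
    The winning strategy, restricting a training set to one assay method and discarding compounds with divergent replicate measurements, can be expressed in a few lines. The record layout, method label, and 1-log-unit tolerance below are assumptions for illustration, not the paper's exact protocol.

        from collections import defaultdict

        # Hypothetical activity records: (compound_id, assay_method, pIC50).
        records = [
            ("cmpd1", "enzymatic_RT", 7.1),
            ("cmpd1", "enzymatic_RT", 7.3),
            ("cmpd1", "cell_based",   5.2),   # different method: excluded
            ("cmpd2", "enzymatic_RT", 6.0),
        ]

        by_compound = defaultdict(list)
        for cid, method, activity in records:
            if method == "enzymatic_RT":       # single method and material
                by_compound[cid].append(activity)

        training_set = {}
        for cid, values in by_compound.items():
            if max(values) - min(values) <= 1.0:   # replicates consistent
                training_set[cid] = round(sum(values) / len(values), 2)

        print(training_set)   # {'cmpd1': 7.2, 'cmpd2': 6.0}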

  8. System Study: High-Pressure Coolant Injection 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the high-pressure coolant injection system (HPCI) at 25 U.S. commercial boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPCI results.

  9. System Study: High-Pressure Safety Injection 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the high-pressure safety injection system (HPSI) at 69 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPSI results.

  10. MitBASE : a comprehensive and integrated mitochondrial DNA database. The present status

    PubMed Central

    Attimonelli, M.; Altamura, N.; Benne, R.; Brennicke, A.; Cooper, J. M.; D’Elia, D.; Montalvo, A. de; Pinto, B. de; De Robertis, M.; Golik, P.; Knoop, V.; Lanave, C.; Lazowska, J.; Licciulli, F.; Malladi, B. S.; Memeo, F.; Monnerot, M.; Pasimeni, R.; Pilbout, S.; Schapira, A. H. V.; Sloof, P.; Saccone, C.

    2000-01-01

MitBASE is an integrated and comprehensive database of mitochondrial DNA data which collects, under a single interface, databases for Plant, Vertebrate, Invertebrate, Human, Protist and Fungal mtDNA and a Pilot database on nuclear genes involved in mitochondrial biogenesis in Saccharomyces cerevisiae. MitBASE reports all available information from different organisms and from intraspecies variants and mutants. Data have been drawn from the primary databases and from the literature; value-adding information has been structured, e.g., editing information on protist mtDNA genomes, pathological information for human mtDNA variants, etc. The different databases, some of which are structured using commercial packages (Microsoft Access, File Maker Pro) while others use a flat-file format, have been integrated under ORACLE. Ad hoc retrieval systems have been devised for some of the above listed databases, taking into account their peculiarities. The database is resident at the EBI and is available at the following site: http://www3.ebi.ac.uk/Research/Mitbase/mitbase.pl . The impact of this project is intended for both basic and applied research. The study of mitochondrial genetic diseases and mitochondrial DNA intraspecies diversity are key topics in several biotechnological fields. The database has been funded within the EU Biotechnology programme. PMID:10592207

  11. CyberSecurity Monitoring Tools and Projects: A Compendium of Commercial and Government Tools and Government Research Projects

    DTIC Science & Technology

    2000-08-01

identify changes to the risk levels of business network functions based on proposed modifications. Expert can model networks as well (see special... network from departmental systems to enterprise-wide environments. ACX is scaled with the use of a Policy Model Database (PMDB). The PMDB is a management... Entry dated February 8, 2000. Description: BlackICE Defender is a host-based intrusion detector designed for use on home or small business systems.

  12. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which makes the software application layers transparent to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis development and deployment.

  13. Abstraction of the Relational Model from a Department of Veterans Affairs DHCP Database: Bridging Theory and Working Application

    PubMed Central

    Levy, C.; Beauchamp, C.

    1996-01-01

    This poster describes the methods used and working prototype that was developed from an abstraction of the relational model from the VA's hierarchical DHCP database. Overlaying the relational model on DHCP permits multiple user views of the physical data structure, enhances access to the database by providing a link to commercial (SQL based) software, and supports a conceptual managed care data model based on primary and longitudinal patient care. The goal of this work was to create a relational abstraction of the existing hierarchical database; to construct, using SQL data definition language, user views of the database which reflect the clinical conceptual view of DHCP, and to allow the user to work directly with the logical view of the data using GUI based commercial software of their choosing. The workstation is intended to serve as a platform from which a managed care information model could be implemented and evaluated.

  14. Vertical Motion Simulator Experiment on Stall Recovery Guidance

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan; Lombaerts, Thomas; Stepanyan, Vahram; Kaneshige, John; Shish, Kimberlee; Robinson, Peter; Hardy, Gordon H.

    2017-01-01

    A stall recovery guidance system was designed to help pilots improve their stall recovery performance when the current aircraft state may be unrecognized under various complicating operational factors. Candidate guidance algorithms were connected to the split-cue pitch and roll flight directors that are standard on large transport commercial aircraft. A new thrust guidance algorithm and cue was also developed to help pilots prevent the combination of excessive thrust and nose-up stabilizer trim. The overall system was designed to reinforce the current FAA recommended stall recovery procedure. A general transport aircraft model, similar to a Boeing 757, with an extended aerodynamic database for improved stall dynamics simulation fidelity was integrated into the Vertical Motion Simulator at NASA Ames Research Center. A detailed study of the guidance system was then conducted across four stall scenarios with 30 commercial and 10 research test pilots, and the results are reported.

  15. REBASE--a database for DNA restriction and modification: enzymes, genes and genomes.

    PubMed

    Roberts, Richard J; Vincze, Tamas; Posfai, Janos; Macelis, Dana

    2015-01-01

    REBASE is a comprehensive and fully curated database of information about the components of restriction-modification (RM) systems. It contains fully referenced information about recognition and cleavage sites for both restriction enzymes and methyltransferases as well as commercial availability, methylation sensitivity, crystal and sequence data. All genomes that are completely sequenced are analyzed for RM system components, and with the advent of PacBio sequencing, the recognition sequences of DNA methyltransferases (MTases) are appearing rapidly. Thus, Type I and Type III systems can now be characterized in terms of recognition specificity merely by DNA sequencing. The contents of REBASE may be browsed from the web http://rebase.neb.com and selected compilations can be downloaded by FTP (ftp.neb.com). Monthly updates are also available via email. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Benchmarking Using Basic DBMS Operations

    NASA Astrophysics Data System (ADS)

    Crolotte, Alain; Ghazal, Ahmad

The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used this benchmark to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
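
    A toy version of such a framework, timing one scan, one aggregation, one join, and one index lookup over the same generated data, is sketched below using SQLite as a stand-in engine; XMarq's actual 25 queries and TPC-H-based data model are not reproduced here.

        import sqlite3, time

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE orders (o_id INTEGER PRIMARY KEY, cust_id INTEGER,
                             total REAL);
        CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT);
        """)
        con.executemany("INSERT INTO customer VALUES (?, ?)",
                        [(i, f"cust{i}") for i in range(1000)])
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                        [(i, i % 1000, float(i % 97)) for i in range(100_000)])
        con.execute("CREATE INDEX idx_orders_cust ON orders(cust_id)")

        queries = {
            "scan":      "SELECT COUNT(*) FROM orders WHERE total > 50",
            "aggregate": "SELECT cust_id, SUM(total) FROM orders GROUP BY cust_id",
            "join":      "SELECT COUNT(*) FROM orders JOIN customer USING (cust_id)",
            "index":     "SELECT * FROM orders WHERE cust_id = 42",
        }
        for name, sql in queries.items():
            t0 = time.perf_counter()
            con.execute(sql).fetchall()
            print(f"{name:10s} {1000 * (time.perf_counter() - t0):8.2f} ms")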

  17. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced data base management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL) which is capable of detecting and acting upon complex patterns of events which are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
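
    To make the idea of temporally related event patterns concrete, the sketch below fires an action when a repeated event occurs a given number of times inside a time window, echoing the overheating example above. The monitor API is invented for this illustration and is unrelated to EPL's actual rule-based, CLIPS-hosted syntax.

        from collections import deque

        def make_monitor(event_name, count, window_s, action):
            """Fire `action` when `event_name` occurs `count` times
            within a sliding window of `window_s` seconds."""
            recent = deque()
            def on_event(name, t):
                if name != event_name:
                    return
                recent.append(t)
                while recent and t - recent[0] > window_s:
                    recent.popleft()          # drop events outside the window
                if len(recent) >= count:
                    action(t)
                    recent.clear()
            return on_event

        monitor = make_monitor(
            "overheat", 3, 60,
            lambda t: print(f"pattern matched at t={t}s: notify plant manager"))

        for name, t in [("overheat", 0), ("normal", 10), ("overheat", 20),
                        ("overheat", 45), ("overheat", 200)]:
            monitor(name, t)   # fires once, at t=45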

  18. Evaluating the Potential of Commercial GIS for Accelerator Configuration Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Larrieu; Y.R. Roblin; K. White

    2005-10-10

The Geographic Information System (GIS) is a tool used by industries needing to track information about spatially distributed assets. A water utility, for example, must know not only the precise location of each pipe and pump, but also the respective pressure rating and flow rate of each. In many ways, an accelerator such as CEBAF (Continuous Electron Beam Accelerator Facility) can be viewed as an "electron utility". Whereas the water utility uses pipes and pumps, the "electron utility" uses magnets and RF cavities. At Jefferson Lab we are exploring the possibility of implementing ESRI's ArcGIS as the framework for building an all-encompassing accelerator configuration database that integrates location, configuration, maintenance, and connectivity details of all hardware and software. The possibilities of doing so are intriguing. From the GIS, software such as the model server could always extract the most up-to-date layout information maintained by Survey & Alignment for lattice modeling. The Mechanical Engineering department could use ArcGIS tools to generate CAD drawings of machine segments from the same database. Ultimately, the greatest benefit of the GIS implementation could be to liberate operators and engineers from the limitations of the current system-by-system view of machine configuration and allow a more integrated regional approach. The commercial GIS package provides a rich set of tools for database connectivity, versioning, distributed editing, importing and exporting, and graphical analysis and querying, and therefore obviates the need for much custom development. However, formidable challenges to implementation exist, and these challenges are not only technical and manpower issues but also organizational ones. The GIS approach would crosscut organizational boundaries and require departments, which heretofore have had free rein to manage their own data, to cede some control and agree to a centralized framework.

  19. Head Up Displays. (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The bibliography contains citations concerning the design, fabrication, and applications of head up displays (HUDs). Applications include military aircraft, helicopters, space shuttle, and commercial aircraft. Functions of the display include instrument approach, target tracking, and navigation. The head up display provides for an integrated avionics system with the pilot in the loop. (Contains 50-250 citations and includes a subject term index and title list.)

  20. Head Up Displays. (Latest citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the design, fabrication, and applications of head up displays (HUDs). Applications include military aircraft, helicopters, space shuttle, and commercial aircraft. Functions of the display include instrument approach, target tracking, and navigation. The head up display provides for an integrated avionics system with the pilot in the loop. (Contains 50-250 citations and includes a subject term index and title list.)

  1. Ground-source heat pump case studies and utility programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lienau, P.J.; Boyd, T.L.; Rogers, R.L.

    1995-04-01

Ground-source heat pump systems are one of the promising new energy technologies that have shown a rapid increase in usage over the past ten years in the United States. These systems offer substantial benefits to consumers and utilities in energy (kWh) and demand (kW) savings. The purpose of this study was to determine what existing monitored data were available, mainly from electric utilities, on heat pump performance, energy savings, and demand reduction for residential, school, and commercial building applications. In order to verify the performance, information was collected for 253 case studies, mainly from utilities throughout the United States. The case studies were compiled into a database. The database was organized into general information, system information, ground system information, system performance, and additional information. Information was developed on the status of demand-side management of ground-source heat pump programs for about 60 electric utilities and rural electric cooperatives, covering marketing, incentive programs, barriers to market penetration, number of units installed in service areas, and benefits.

  2. Laptop Computer - Based Facial Recognition System Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. A. Cain; G. B. Singleton

    2001-03-01

The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey, we selected Visionics' FaceIt® software package for evaluation and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000). This test was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses. It was the most appropriate package based on the requirements of this specific application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discretely. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in remote locations. Remote users could perform real-time searches where network connectivity is not available. As images are enrolled at the remote locations, periodic database synchronization is necessary.

  3. Similar compounds searching system by using the gene expression microarray database.

    PubMed

    Toyoshiba, Hiroyoshi; Sawada, Hiroshi; Naeshiro, Ichiro; Horinouchi, Akira

    2009-04-10

    Numbers of microarrays have been examined and several public and commercial databases have been developed. However, it is not easy to compare in-house microarray data with those in a database because of insufficient reproducibility due to differences in experimental conditions. As one approach to using these databases, we developed a similar compounds searching system (SCSS) on a toxicogenomics database. The datasets of 55 compounds administered to rats in the Toxicogenomics Project (TGP) database in Japan were used in this study. Using the fold-change ranking method developed by Lamb et al. [Lamb, J., Crawford, E.D., Peck, D., Modell, J.W., Blat, I.C., Wrobel, M.J., Lerner, J., Brunet, J.P., Subramanian, A., Ross, K.N., Reich, M., Hieronymus, H., Wei, G., Armstrong, S.A., Haggarty, S.J., Clemons, P.A., Wei, R., Carr, S.A., Lander, E.S., Golub, T.R., 2006. The connectivity map: using gene-expression signatures to connect small molecules, genes, and disease. Science 313, 1929-1935] and a criterion called the hit ratio, the system lets us compare in-house microarray data with those in the database. In-house generated data for clofibrate, phenobarbital, and a proprietary compound were tested to evaluate the performance of the SCSS method. Phenobarbital and clofibrate, which were included in the TGP database, scored highest by the SCSS method. Other high-scoring compounds had effects similar to either phenobarbital (a cytochrome P450 inducer) or clofibrate (a peroxisome proliferator). Some of the high-scoring compounds identified using the proprietary-compound-administered rats are known to cause similar toxicological changes in different species. Our results suggest that the SCSS method could be used in drug discovery and development. Moreover, this method may be a powerful tool for understanding the mechanisms by which biological systems respond to various chemical compounds and may also predict adverse effects of new compounds.
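    To make the fold-change ranking concrete, the sketch below ranks genes by log fold change and scores two profiles by the overlap of their top-ranked genes. This is a simplified stand-in for the published hit-ratio criterion, with illustrative names and thresholds; it is not the actual SCSS implementation.

    ```python
    import numpy as np

    def rank_by_fold_change(expr_treated, expr_control):
        """Rank gene indices by log2 fold change, most up-regulated first."""
        fold = np.log2(expr_treated + 1e-9) - np.log2(expr_control + 1e-9)
        return np.argsort(fold)[::-1]

    def hit_ratio(query_ranking, reference_ranking, top_n=100):
        """Fraction of the query's top-n genes also in the reference's top n;
        a simplified stand-in for the hit-ratio criterion described above."""
        overlap = set(query_ranking[:top_n]) & set(reference_ranking[:top_n])
        return len(overlap) / top_n

    # Score an in-house profile against every compound profile in a database:
    # scores = {name: hit_ratio(query_rank, rank) for name, rank in db_ranks.items()}
    ```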

  4. System Study: Emergency Power System 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run-hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates of system unreliability are provided for the entire active period. A highly statistically significant increasing trend was observed for EPS system unreliability for an 8-hour mission, and a statistically significant increasing trend was observed for EPS start-only unreliability.
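    The yearly-estimate trending described above can be illustrated with a simple regression significance test. The sketch below uses entirely hypothetical unreliability values and scipy.stats.linregress; the report's actual trending methodology is not specified here and may differ.

    ```python
    from scipy import stats

    years = list(range(1998, 2015))
    # Hypothetical yearly EPS unreliability estimates (illustrative only)
    unreliability = [0.010, 0.011, 0.010, 0.012, 0.013, 0.012, 0.014, 0.015,
                     0.014, 0.016, 0.017, 0.016, 0.018, 0.019, 0.020, 0.021, 0.022]

    result = stats.linregress(years, unreliability)
    # A small p-value on the slope suggests a statistically significant trend.
    print(f"slope = {result.slope:.5f} per year, p = {result.pvalue:.4g}")
    ```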

  5. SU-D-BRB-02: Combining a Commercial Autoplanning Engine with Database Dose Predictions to Further Improve Plan Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Moore, JA; Hui, X

    Purpose: Database dose predictions and a commercial autoplanning engine both improve treatment plan quality in different but complementary ways. The combination of these planning techniques is hypothesized to further improve plan quality. Methods: Four treatment plans were generated for each of 10 head and neck (HN) and 10 prostate cancer patients: Plan-A, traditional IMRT optimization using clinically relevant default objectives; Plan-B, traditional IMRT optimization using database dose predictions; Plan-C, autoplanning using default objectives; and Plan-D, autoplanning using database dose predictions. One optimization was used for each planning method. Dose distributions were normalized to 95% of the planning target volume (prostate: 8000 cGy; HN: 7000 cGy). Objectives used in plan optimization and analysis were the larynx (25%, 50%, 90%), left and right parotid glands (50%, 85%), spinal cord (0%, 50%), rectum and bladder (0%, 20%, 50%, 80%), and left and right femoral heads (0%, 70%). Results: All objectives except larynx 25% and 50% resulted in statistically significant differences between plans (Friedman's χ² ≥ 11.2; p ≤ 0.011). Maximum dose to the rectum (Plans A-D: 8328, 8395, 8489, 8537 cGy) and bladder (Plans A-D: 8403, 8448, 8527, 8569 cGy) were significantly increased. All other significant differences reflected a decrease in dose. Plans B-D were significantly different from Plan-A for 3, 17, and 19 objectives, respectively. Plans C-D were also significantly different from Plan-B for 8 and 13 objectives, respectively. In one case (cord 50%), Plan-D provided significantly lower dose than Plan-C (p = 0.003). Conclusion: Combining database dose predictions with a commercial autoplanning engine resulted in significant plan quality differences for the greatest number of objectives. This translated to plan quality improvements in most cases, although special care may be needed for maximum-dose constraints. Further evaluation is warranted in a larger cohort across HN, prostate, and other treatment sites. This work is supported by Philips Radiation Oncology Systems.
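    The plan comparisons above rely on Friedman's test for repeated measures across the four plans. A minimal sketch with hypothetical per-patient doses (scipy provides the test directly):

    ```python
    from scipy.stats import friedmanchisquare

    # Hypothetical rectum maximum doses (cGy) for the same 10 patients under Plans A-D
    plan_a = [8300, 8310, 8350, 8280, 8330, 8290, 8340, 8360, 8320, 8300]
    plan_b = [8390, 8400, 8410, 8380, 8395, 8385, 8405, 8420, 8390, 8400]
    plan_c = [8480, 8490, 8500, 8470, 8485, 8495, 8505, 8475, 8490, 8488]
    plan_d = [8530, 8540, 8550, 8520, 8535, 8545, 8538, 8528, 8542, 8537]

    stat, p = friedmanchisquare(plan_a, plan_b, plan_c, plan_d)
    print(f"Friedman chi-squared = {stat:.1f}, p = {p:.4g}")
    ```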

  6. An automated image-collection system for crystallization experiments using SBS standard microplates.

    PubMed

    Brostromer, Erik; Nan, Jie; Su, Xiao Dong

    2007-02-01

    As part of a structural genomics platform in a university laboratory, a low-cost, in-house-developed automated imaging system for SBS microplate experiments has been designed and constructed. The imaging system can scan a 96-well microplate in 2-6 min, depending on the plate layout and scanning options. A web-based crystallization database system has been developed, enabling users to follow their crystallization experiments from a web browser. As the system has been designed and built by students and crystallographers using commercially available parts, this report is intended to serve as a do-it-yourself example for laboratory robotics.

  7. EDULISS: a small-molecule database with data-mining and pharmacophore searching capabilities

    PubMed Central

    Hsin, Kun-Yi; Morgan, Hugh P.; Shave, Steven R.; Hinton, Andrew C.; Taylor, Paul; Walkinshaw, Malcolm D.

    2011-01-01

    We present the relational database EDULISS (EDinburgh University Ligand Selection System), which stores structural, physicochemical and pharmacophoric properties of small molecules. The database comprises a collection of over 4 million commercially available compounds from 28 different suppliers. A user-friendly web-based interface for EDULISS (available at http://eduliss.bch.ed.ac.uk/) has been established, providing a number of data-mining possibilities. For each compound a single 3D conformer is stored along with over 1600 calculated descriptor values (molecular properties). A very efficient method for unique compound recognition, especially for a large-scale database, is demonstrated by making use of small subgroups of the descriptors. Many of the shape and distance descriptors are held as pre-calculated bit strings, permitting fast and efficient similarity and pharmacophore searches which can be used to identify families of related compounds for biological testing. Two ligand-searching applications are given to demonstrate how EDULISS can be used to extract families of molecules with selected structural and biophysical features. PMID:21051336
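    Pre-calculated bit strings support fast screening because comparing two fingerprints reduces to bitwise operations. Below is a minimal sketch of Tanimoto similarity on integer-packed fingerprints, with toy 16-bit values standing in for much longer descriptors; EDULISS's actual scoring scheme is not specified here.

    ```python
    def tanimoto(fp_a: int, fp_b: int) -> float:
        """Tanimoto similarity between two fingerprints packed as Python ints."""
        both = bin(fp_a & fp_b).count("1")    # bits set in both fingerprints
        either = bin(fp_a | fp_b).count("1")  # bits set in either fingerprint
        return both / either if either else 0.0

    # Toy 16-bit pharmacophore fingerprints (real descriptors are far longer)
    query = 0b1011001010001100
    hit   = 0b1011001000001000
    print(tanimoto(query, hit))  # fraction of shared pharmacophoric features
    ```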

  8. Aviation Safety Issues Database

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders, such as the Commercial Aviation Safety Team (CAST), have already used the database. This broader interest was the genesis for making the database publicly accessible and writing this report.

  9. Using a commercial CAD system for simultaneous input to theoretical aerodynamic programs and wind-tunnel model construction

    NASA Technical Reports Server (NTRS)

    Enomoto, F.; Keller, P.

    1984-01-01

    The Computer Aided Design (CAD) system's common geometry database was used to generate input for theoretical programs and numerically controlled (NC) tool paths for wind-tunnel part fabrication. This eliminates the duplication of work in generating separate geometry databases for each type of analysis. Another advantage is that it reduces the uncertainty due to geometric differences when comparing theoretical aerodynamic data with wind-tunnel data. The system was adapted to aerodynamic research by developing programs written in Design Analysis Language (DAL). These programs reduced the amount of time required to construct complex geometries and to generate input for theoretical programs. Certain shortcomings of the Design, Drafting, and Manufacturing (DDM) software limited the effectiveness of these programs and of some of the Calma NC software. The complexity of aircraft configurations suggests that more types of surface and curve geometry should be added to the system. Some of these shortcomings may be eliminated as improved versions of DDM become available.

  10. An Object-Oriented Database Interface for Ada

    DTIC Science & Technology

    1993-12-01

    The thesis examines object-oriented database interfaces for Ada; because there is no single object model, a unique extension for each ODM system may be required. The approach is evaluated through prototypes and through a commercial product known as Classic Ada with persistence, marketed by Software Productivity Solutions, which extends legal Ada constructs with an extra keyword, persistent, so that a user-defined class can be declared persistent.

  11. Materials dispersion and biodynamics project research

    NASA Technical Reports Server (NTRS)

    Lewis, Marian L.

    1992-01-01

    The Materials Dispersion and Biodynamics Project (MDBP) focuses on dispersion and mixing of various biological materials and the dynamics of cell-to-cell communication and intracellular molecular trafficking in microgravity. Research activities encompass biomedical applications, basic cell biology, biotechnology (products from cells), protein crystal development, ecological life support systems (involving algae and bacteria), drug delivery (microencapsulation), biofilm deposition by living organisms, and hardware development to support living cells on Space Station Freedom (SSF). Project goals are to expand the existing microgravity science database through experiments on sounding rockets, the Shuttle, and COMET program orbiters and to evolve, through current database acquisition and feasibility testing, to more mature and larger-scale commercial operations on SSF. Maximized utilization of SSF for these science applications will mean that service companies will have a role in providing equipment for use by a number of different customers. An example of a potential forerunner of such a service for SSF is the Materials Dispersion Apparatus (MDA) 'mini lab' of Instrumentation Technology Associates, Inc. (ITA), in use on the Shuttle for the Commercial MDA ITA Experiments (CMIX) Project. The MDA wells provide the capability for a number of investigators to perform mixing and bioprocessing experiments in space. In the area of human adaptation to microgravity, a significant database has been obtained over the past three decades; some low-g effects are similar to Earth-based disorders (anemia, osteoporosis, neuromuscular diseases, and immune system disorders). As new information targets potential profit-making processes, services, and products from microgravity, commercial space ventures are expected to expand accordingly. Cooperative CCDS research in the above-mentioned areas is essential for maturing SSF biotechnology and ensuring U.S. leadership in space technology. Currently, the MDBP conducts collaborative research with investigators at the Rockefeller University, the National Cancer Institute, and the Universities of California, Arizona, and Alabama at Birmingham. The growing database from these collaborations provides fundamental information applicable to development of cell products, manipulation of immune cell response, bone cell growth and mineralization, and other processes altered by low gravity. Contacts with biotechnology and biopharmaceutical companies are being increased to reach uninformed potential SSF users, provide access through the CCDS to interested users for feasibility studies, and continue active involvement of current participants. We encourage and actively seek participation of private-sector companies and of university and government researchers interested in biopharmaceuticals, hardware development, and fundamental research in microgravity.

  12. High-Performance Secure Database Access Technologies for HEP Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research, where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities, a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are ongoing in the area of database and grid integration research. Most of these are outside the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc., for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture in which secure authorization is pushed into the database engine will eliminate inefficient data transfer bottlenecks. Furthermore, traditionally separated database and security layers present an extra vulnerability, leaving weak clear-text password authorization as the only protection on the database core systems. Due to the legacy limitations of the systems' security models, the allowed passwords often cannot even comply with the DOE password guideline requirements. We see an opportunity for tight integration of the secure authorization layer with the database server engine, resulting in both improved performance and improved security. Phase I focused on the development of a proof-of-concept prototype using Argonne National Laboratory's (ANL) Argonne Tandem-Linac Accelerator System (ATLAS) project as a test scenario. By developing a grid-security-enabled version of the ATLAS project's current relational database solution, MySQL, PIOCON Technologies aims to offer a more efficient solution for secure database access.

  13. Determination of Detection Limits and Quantitation Limits for Compounds in a Database of GC/MS by FUMI Theory

    PubMed Central

    Nakashima, Shinya; Hayashi, Yuzuru

    2016-01-01

    The aim of this paper is to propose a stochastic method for estimating the detection limits (DLs) and quantitation limits (QLs) of compounds registered in the database of a GC/MS system and to validate it experimentally. The approach described in ISO 11843 Part 7 is adopted as the estimation means for DL and QL, and decafluorotriphenylphosphine (DFTPP) tuning and retention-time locking are carried out to adjust the system. Coupled with the data obtained from the system-adjustment experiments, the information stored in the database (noise and signal of chromatograms and calibration curves) is used for the stochastic estimation, dispensing with repeated measurements. For sixty-six pesticides, the DL values obtained by the ISO method were compared with those from the statistical approach, and the correlation between them was excellent, with a correlation coefficient of 0.865. The accuracy of the proposed method was also examined and found satisfactory. The samples used are commercial pesticide mixtures, and uncertainty from sample-preparation processes is not taken into account. PMID:27162706
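    The reported agreement between the two estimation routes boils down to a correlation of paired DL values. A minimal sketch with hypothetical detection limits (the paper's sixty-six real pesticide values are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical detection limits (ng/mL) for a few pesticides, estimated by
    # the ISO 11843-7 repetition method vs. the database-driven stochastic method
    dl_iso        = np.array([0.52, 1.10, 0.75, 2.30, 0.95, 1.60])
    dl_stochastic = np.array([0.55, 1.05, 0.80, 2.10, 1.00, 1.70])

    r = np.corrcoef(dl_iso, dl_stochastic)[0, 1]
    print(f"correlation coefficient r = {r:.3f}")  # the paper reports r = 0.865
    ```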

  14. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing the time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
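    For readers who want to reproduce the DSN-based setup outside ASP, the sketch below queries an Access database through an ODBC Data Source Name from Python using pyodbc. The DSN name, table, and columns are hypothetical; the paper's actual implementation used ASP scripting rather than Python.

    ```python
    import pyodbc

    # Connect through a Data Source Name (DSN) configured in the ODBC manager;
    # 'LibraryDSN' and the 'resources' table are hypothetical examples.
    conn = pyodbc.connect("DSN=LibraryDSN")
    cursor = conn.cursor()

    # Parameterized query; through ODBC the SQL wildcard is '%', one of the
    # Access-vs-SQL quirks the paper alludes to.
    cursor.execute("SELECT title, url FROM resources WHERE subject LIKE ?", ("%aero%",))
    for title, url in cursor.fetchall():
        print(title, url)
    conn.close()
    ```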

  15. A distributed control system for the lower-hybrid current drive system on the Tokamak de Varennes

    NASA Astrophysics Data System (ADS)

    Bagdoo, J.; Guay, J. M.; Chaudron, G.-A.; Decoste, R.; Demers, Y.; Hubbard, A.

    1990-08-01

    An rf current drive system with an output power of 1 MW at 3.7 GHz is under development for the Tokamak de Varennes. The control system is based on an Ethernet local-area network of programmable logic controllers as the front end, personal computers as consoles, and CAMAC-based DSP processors. The DSP processors ensure the PID control of the phase and rf power of each klystron, and the fast protection of high-power rf hardware, all within a 40 μs loop. Slower control and protection, event sequencing, and the run-time database are provided by the programmable logic controllers, which communicate via the LAN with the consoles; the latter run commercial process-control console software. The LAN protocol conforms to the first four ISO/OSI layers, based on the 802.3 standard. Synchronization with the tokamak control system is provided by commercially available CAMAC timing modules which trigger shot-related events and reference waveform generators. A detailed description of each subsystem and a performance evaluation of the system are presented.
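    The per-klystron PID control mentioned above can be sketched as a standard discrete PID update. The gains, units, and loop timing below are illustrative assumptions only, not values from the Tokamak de Varennes system:

    ```python
    class PID:
        """Discrete PID controller; gains and timing are illustrative only."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # One iteration of a 40-microsecond control loop for klystron output power
    pid = PID(kp=0.8, ki=120.0, kd=1e-4, dt=40e-6)
    drive = pid.step(setpoint=1.0, measured=0.93)  # normalized power units
    ```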

  16. A Novel Concept for the Search and Retrieval of the Derwent Markush Resource Database.

    PubMed

    Barth, Andreas; Stengel, Thomas; Litterst, Edwin; Kraut, Hans; Matuszczyk, Henry; Ailer, Franz; Hajkowski, Steve

    2016-05-23

    The representation of and search for generic chemical structures (Markush) remains a continuing challenge. Several research groups have addressed this problem, and over time a limited number of practical solutions have been proposed. Today there are two large commercial providers of Markush databases: Chemical Abstracts Service (CAS) and Thomson Reuters. The Thomson Reuters "Derwent" Markush database is currently offered via the online services Questel and STN and as a data feed for in-house use. The aim of this paper is to briefly review the existing Markush systems (databases plus search engines) and to describe our new approach for the implementation of the Derwent Markush Resource on STN. Our new approach demonstrates the integration of the Derwent Markush Resource database into the existing chemistry-focused STN platform without loss of detail. This provides compatibility with other structure and Markush databases on STN and at the same time makes it possible to deploy the specific features and functions of the Derwent approach. It is shown that the different Markush languages developed by CAS and Derwent can be combined into a single general Markush description. In this concept the generic nodes are grouped together in a unique hierarchy where all chemical elements and fragments can be integrated. As a consequence, both systems are searchable using a single structure query. Moreover, the presented concept could serve as a promising starting point for a common generalized description of Markush structures.

  17. System Study: High-Pressure Core Spray 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the high-pressure core spray (HPCS) system at eight U.S. commercial boiling water reactors. Demand, run-hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates of system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPCS results.

  18. In-Memory Graph Databases for Web-Scale Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Morari, Alessandro; Weaver, Jesse R.

    RDF databases have emerged as one of the most relevant ways of organizing, integrating, and managing exponentially growing, often heterogeneous, and not rigidly structured data for a variety of scientific and commercial fields. In this paper we discuss the solutions integrated in GEMS (Graph database Engine for Multithreaded Systems), a software framework for implementing RDF databases on commodity, distributed-memory high-performance clusters. Unlike the majority of current RDF databases, GEMS has been designed from the ground up to primarily employ graph-based methods, and this is reflected in all layers of its stack. The GEMS framework is composed of a SPARQL-to-C++ compiler, a library of data structures and related methods to access and modify them, and a custom runtime providing lightweight software multithreading, network message aggregation, and a partitioned global address space. We provide an overview of the framework, detailing its components and how they have been designed and customized to address issues of graph methods applied to large-scale datasets on clusters. We discuss in detail the principles that enable automatic translation of queries (expressed in SPARQL, the query language of choice for RDF databases) to graph methods, and identify differences with respect to other RDF databases.
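    The key idea of translating SPARQL to graph methods is that each triple pattern becomes a traversal over the graph's adjacency structure. The Python sketch below illustrates the concept for a single pattern; GEMS itself emits compiled C++ against its own data structures, so this is a conceptual analogy, not GEMS code:

    ```python
    # RDF graph as an adjacency list: subject -> [(predicate, object), ...]
    graph = {
        "ex:alice": [("ex:knows", "ex:bob"), ("ex:worksAt", "ex:pnnl")],
        "ex:bob":   [("ex:knows", "ex:carol")],
    }

    def match_pattern(graph, subject, predicate, obj):
        """Match one SPARQL-like triple pattern; None acts as a variable.
        Conceptually what a SPARQL-to-graph-method translation emits for
        each triple pattern, here in Python rather than generated C++."""
        for s, edges in graph.items():
            if subject is not None and s != subject:
                continue
            for p, o in edges:
                if (predicate is None or p == predicate) and (obj is None or o == obj):
                    yield (s, p, o)

    # SELECT ?who ?someone WHERE { ?who ex:knows ?someone . }
    for s, p, o in match_pattern(graph, None, "ex:knows", None):
        print(s, "knows", o)
    ```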

  19. Prescriber Compliance With Liver Monitoring Guidelines for Pazopanib in the Postapproval Setting: Results From a Distributed Research Network.

    PubMed

    Shantakumar, Sumitra; Nordstrom, Beth L; Hall, Susan A; Djousse, Luc; van Herk-Sukel, Myrthe P P; Fraeman, Kathy H; Gagnon, David R; Chagin, Karen; Nelson, Jeanenne J

    2017-04-20

    Pazopanib received US Food and Drug Administration approval in 2009 for advanced renal cell carcinoma. During clinical development, liver chemistry abnormalities and adverse hepatic events were observed, leading to a boxed warning for hepatotoxicity and detailed label prescriber guidelines for liver monitoring. As part of postapproval regulatory commitments, a cohort study was conducted to assess prescriber compliance with liver monitoring guidelines. Over a 4-year period, a distributed network approach was used across three databases: the US Veterans Affairs (VA) Healthcare System, a US outpatient oncology community-practice database, and the Dutch PHARMO Database Network. Measures of prescriber compliance were designed using the original pazopanib label guidelines for liver monitoring. Results from the VA (n = 288) and oncology (n = 283) databases indicate that prescriber liver chemistry monitoring was less than 100%: 73% to 74% compliance with baseline testing and 37% to 39% compliance with testing every 4 weeks. Compliance was highest near drug initiation and decreased over time. Among patients who should have had weekly testing, compliance was 56% in both databases. The more serious elevations examined, including combinations of liver enzyme elevations meeting the laboratory definition of Hy's law, were infrequent but always led to appropriate discontinuation of pazopanib. Only 4 patients were identified for analysis in the Dutch database; none had recorded baseline testing. In this population-based study, prescriber compliance was reasonable near pazopanib initiation but low during subsequent weeks of treatment. This study provides information from real-world community practice settings and offers feedback to regulators on the effectiveness of label monitoring guidelines.

  20. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMSs). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMSs, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMSs. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values, as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and the selectivities, we use curve-fitting techniques, such as least squares and splines, regressing on these values.
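    Query feedback for selectivity estimation can be illustrated with a least-squares fit, one of the curve-fitting techniques the abstract names. The sketch below regresses observed selectivities on predicate constants using hypothetical feedback values:

    ```python
    import numpy as np

    # Query feedback pairs: predicate constant vs. observed selectivity
    # (hypothetical values accumulated from prior query executions)
    constants     = np.array([10, 20, 35, 50, 70, 90], dtype=float)
    selectivities = np.array([0.02, 0.06, 0.15, 0.28, 0.55, 0.81])

    # Least-squares fit of a low-degree polynomial to the feedback,
    # refining the selectivity estimate used by the query optimizer
    coeffs = np.polyfit(constants, selectivities, deg=2)

    def estimate_selectivity(c):
        return float(np.clip(np.polyval(coeffs, c), 0.0, 1.0))

    print(estimate_selectivity(60.0))  # estimate for an unseen constant
    ```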

  1. Human error analysis of commercial aviation accidents: application of the Human Factors Analysis and Classification system (HFACS).

    PubMed

    Wiegmann, D A; Shappell, S A

    2001-11-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.

  2. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, Peggy L.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which provides transparency of the software application layers to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system's technical design, issues encountered, and the status of Stennis' development and deployment.

  3. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

    IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open-access policy based on the submission of research requests, which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is being implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and, since January 2015, IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats, graphical tools (maps, scatter plots, etc.), standardized metadata (ISO 19115), and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will allow model outputs to be combined with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented via many web services, will improve the functionality of the web interfaces of each data centre.

  4. Survey Software Evaluation

    DTIC Science & Technology

    2009-01-01

    The extracted comparison fragments cover supported databases (Oracle 9i/10g, MySQL, MS SQL Server), supported operating systems (Windows 2003 Server, Windows 2000 Server (32-bit), Mac OS X), supported Web servers (WebStar (Mac OS X), SunOne, Internet Information Services (IIS)), and supported database servers (MS SQL Server, Oracle 9i/10g). Among the challenges of Web-based surveys identified: 1) identifying the best Commercial Off the Shelf (COTS) Web-based survey packages to serve the particular…

  5. Comet: an open-source MS/MS sequence database search tool.

    PubMed

    Eng, Jimmy K; Jahan, Tahmina A; Hoopmann, Michael R

    2013-01-01

    Proteomics research routinely involves identifying peptides and proteins via MS/MS sequence database search. Thus the database search engine is an integral tool in many proteomics research groups. Here, we introduce the Comet search engine to the existing landscape of commercial and open-source database search tools. Comet is open source, freely available, and based on one of the original sequence database search tools that has been widely used for many years.

  6. Review and Comparison of the Search Effectiveness and User Interface of Three Major Online Chemical Databases

    ERIC Educational Resources Information Center

    Bharti, Neelam; Leonard, Michelle; Singh, Shailendra

    2016-01-01

    Online chemical databases are the largest source of chemical information and, therefore, the main resource for retrieving results from published journals, books, patents, conference abstracts, and other relevant sources. Various commercial, as well as free, chemical databases are available. SciFinder, Reaxys, and Web of Science are three major…

  7. Real-time data acquisition of commercial microwave link networks for hydrometeorological applications

    NASA Astrophysics Data System (ADS)

    Chwala, Christian; Keis, Felix; Kunstmann, Harald

    2016-03-01

    The usage of data from commercial microwave link (CML) networks for scientific purposes is becoming increasingly popular, in particular for rain rate estimation. However, data acquisition and availability are still a crucial problem and limit research possibilities. To overcome this issue, we have developed an open-source data acquisition system based on the Simple Network Management Protocol (SNMP). It is able to record transmitted and received signal levels of a large number of CMLs simultaneously, with a temporal resolution of up to 1 s. We operate this system at Ericsson Germany, acquiring data from 450 CMLs with minute-by-minute real-time transfer to our database. Our data acquisition system is not limited to a particular CML hardware model or manufacturer, though. We demonstrate this by running the same system for CMLs of a different manufacturer, operated by an alpine ski resort in Germany. There, the data acquisition is running simultaneously for four CMLs with a temporal resolution of 1 s. We present an overview of our system, describe the details of the necessary SNMP requests, and show results from its operational application.
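    A minimal sketch of the kind of SNMP request such a system issues, using the pysnmp 4.x-style synchronous API; the management address, community string, and OID are placeholders, since real transmit/receive-level OIDs are vendor-specific and come from the radio's MIB:

    ```python
    from pysnmp.hlapi import (
        getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
        ContextData, ObjectType, ObjectIdentity,
    )

    # Poll one value from a CML radio; all identifiers below are hypothetical.
    error_indication, error_status, error_index, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData("public", mpModel=1),       # SNMPv2c
            UdpTransportTarget(("192.0.2.10", 161)),  # CML management address
            ContextData(),
            ObjectType(ObjectIdentity("1.3.6.1.4.1.99999.1.1.0")),
        )
    )
    if error_indication:
        print(error_indication)
    else:
        for oid, value in var_binds:
            print(oid, "=", value)  # e.g., a received signal level reading
    ```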

  8. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge, one compounded by rapid increases in sensor collection volumes from both larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed AGILE (Advanced Geospatial Imagery Library Enterprise) Access, an innovative approach to this problem based on standard off-the-shelf techniques. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. The system is implemented in an accredited, deployable form and incorporates a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system as demonstrated in real-world operational environments.

  9. Gradual cut detection using low-level vision for digital video

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Hyun; Choi, Yeun-Sung; Jang, Ok-bae

    1996-09-01

    Digital video computing and organization is one of the important issues in multimedia systems, signal compression, and databases. Video should be segmented into shots to be used for identification and indexing. This approach requires a suitable method to automatically locate cut points in order to separate shots in a video. Automatic cut detection to isolate shots in a video has received considerable attention due to many practical applications: video databases, browsing, authoring systems, retrieval, and movies. Previous studies are based on a set of difference mechanisms that measure content changes between video frames, but they could not detect gradual special effects such as dissolves, wipes, fade-ins, fade-outs, and structured flashing. In this paper, a new cut detection method for gradual transitions based on computer vision techniques is proposed. Experimental results applied to commercial video are presented and evaluated.
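    As background for the difference mechanisms the paper builds on, the sketch below flags hard cuts by comparing intensity histograms of consecutive frames. It deliberately shows only the baseline measure that gradual-transition methods must extend, and the threshold is an arbitrary assumption:

    ```python
    import numpy as np

    def histogram(frame, bins=64):
        """Grayscale intensity histogram, normalized to sum to 1."""
        h, _ = np.histogram(frame, bins=bins, range=(0, 255))
        return h / h.sum()

    def detect_cuts(frames, hard_thresh=0.5):
        """Flag hard cuts where consecutive histograms differ strongly.
        Dissolves, wipes, and fades change slowly frame-to-frame, which is
        why this simple difference mechanism misses them."""
        cuts = []
        prev = histogram(frames[0])
        for i, frame in enumerate(frames[1:], start=1):
            cur = histogram(frame)
            if 0.5 * np.abs(cur - prev).sum() > hard_thresh:  # total variation
                cuts.append(i)
            prev = cur
        return cuts

    # frames: a sequence of 2-D uint8 arrays decoded from the video
    ```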

  10. The Development of Commercially Available Databases in Europe.

    ERIC Educational Resources Information Center

    Tomberg, Alex

    1979-01-01

    Europe's lag in databanks and online commercial availability is contrasted to its lead in numbers of bibliographic files. Intelligent use of new technologies such as Viewdata and the European Communications Satellite are expected to correct this imbalance. (RAA)

  11. GEOGRAPHIC INFORMATION SYSTEM APPROACH FOR PLAY PORTFOLIOS TO IMPROVE OIL PRODUCTION IN THE ILLINOIS BASIN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beverly Seyler; John Grube

    2004-12-10

    Oil and gas have been commercially produced in Illinois for over 100 years. Existing commercial production is from more than fifty-two named pay horizons in Paleozoic rocks ranging in age from Middle Ordovician to Pennsylvanian, and over 3.2 billion barrels of oil have been produced. Recent calculations indicate that remaining mobile resources in the Illinois Basin may be on the order of several billion barrels. Thus, large quantities of oil, potentially recoverable using current technology, remain in Illinois oil fields despite a century of development. Many opportunities for increased production may have been missed due to complex development histories, multiple stacked pays, and commingled production, which make thorough exploitation of pays and the application of secondary or improved/enhanced recovery strategies difficult. Access to data, and the techniques required to evaluate and manage large amounts of diverse data, are major barriers to increased production of critical reserves in the Illinois Basin. These constraints are being alleviated by the development of a database access system using a Geographic Information System (GIS) approach for evaluation and identification of underdeveloped pays. The Illinois State Geological Survey has developed a methodology that is being used by industry to identify underdeveloped areas (UDAs) in and around petroleum reservoirs in Illinois using a GIS approach. This project utilizes a statewide oil and gas Oracle® database to develop a series of Oil and Gas Base Maps with well-location symbols that are color-coded by producing horizon. Producing horizons are displayed as layers that can be selected separately or in combination and turned on and off. Map views can be customized to serve individual needs, and page-size maps can be printed. A core-analysis database with over 168,000 entries has been compiled and assimilated into the ISGS Enterprise Oracle database, and maps of wells with core data have been generated. Data from over 1,700 Illinois waterflood units and waterflood areas have been entered into an Access® database; the waterflood-area data have also been assimilated into the ISGS Oracle database for mapping and dissemination on the ArcIMS website. Formation depths for the Beech Creek Limestone, Ste. Genevieve Limestone, and New Albany Shale in all of the oil-producing region of Illinois have been calculated and entered into a digital database, and digitally contoured structure maps have been constructed, edited, and added to the ILoil website as map layers. This technology/methodology addresses the long-standing constraints related to information access and data management in Illinois by significantly simplifying the laborious process that industry presently must use to identify underdeveloped pay zones.
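    The horizon-layer selection behind the base maps reduces to filtered queries against a wells table. A minimal sketch using sqlite3 as a stand-in for the statewide Oracle database, with an entirely hypothetical schema and rows:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the statewide Oracle database
    conn.executescript("""
        CREATE TABLE wells (api TEXT, lat REAL, lon REAL, horizon TEXT, status TEXT);
        INSERT INTO wells VALUES
          ('120190001', 38.51, -88.94, 'Ste. Genevieve', 'producing'),
          ('120190002', 38.52, -88.95, 'Aux Vases',      'plugged');
    """)

    # Select one producing-horizon "layer", as the GIS map viewer does when a
    # horizon layer is toggled on; schema and values are illustrative only.
    rows = conn.execute(
        "SELECT api, lat, lon FROM wells WHERE horizon = ? AND status = 'producing'",
        ("Ste. Genevieve",),
    ).fetchall()
    print(rows)
    ```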

  12. Technology and the Modern Library.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    1984-01-01

    Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…

  13. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality, and software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  14. Virtual Reality Therapy for Adults Post-Stroke: A Systematic Review and Meta-Analysis Exploring Virtual Environments and Commercial Games in Therapy

    PubMed Central

    Lohse, Keith R.; Hilderman, Courtney G. E.; Cheung, Katharine L.; Tatla, Sandy; Van der Loos, H. F. Machiel

    2014-01-01

    Background The objective of this analysis was to systematically review the evidence for virtual reality (VR) therapy in an adult post-stroke population in both custom built virtual environments (VE) and commercially available gaming systems (CG). Methods MEDLINE, CINAHL, EMBASE, ERIC, PSYCInfo, DARE, PEDro, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were systematically searched from the earliest available date until April 4, 2013. Controlled trials that compared VR to conventional therapy were included. Population criteria included adults (>18) post-stroke, excluding children, cerebral palsy, and other neurological disorders. Included studies were reported in English. Quality of studies was assessed with the Physiotherapy Evidence Database Scale (PEDro). Results Twenty-six studies met the inclusion criteria. For body function outcomes, there was a significant benefit of VR therapy compared to conventional therapy controls, G = 0.48, 95% CI = [0.27, 0.70], and no significant difference between VE and CG interventions (P = 0.38). For activity outcomes, there was a significant benefit of VR therapy, G = 0.58, 95% CI = [0.32, 0.85], and no significant difference between VE and CG interventions (P = 0.66). For participation outcomes, the overall effect size was G = 0.56, 95% CI = [0.02, 1.10]. All participation outcomes came from VE studies. Discussion VR rehabilitation moderately improves outcomes compared to conventional therapy in adults post-stroke. Current CG interventions have been too few and too small to assess potential benefits of CG. Future research in this area should aim to clearly define conventional therapy, report on participation measures, consider motivational components of therapy, and investigate commercially available systems in larger RCTs. Trial Registration Prospero CRD42013004338 PMID:24681826
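    The pooled effect sizes reported here (Hedges' G) are bias-corrected standardized mean differences. A minimal sketch of the per-study computation, with hypothetical trial-arm statistics; the meta-analytic pooling and confidence intervals involve additional weighting not shown:

    ```python
    import math

    def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
        """Bias-corrected standardized mean difference (Hedges' g)."""
        pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                              / (n_t + n_c - 2))
        d = (mean_t - mean_c) / pooled_sd
        j = 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample correction factor
        return j * d

    # Hypothetical VR vs. conventional-therapy arms of one trial
    print(round(hedges_g(24.0, 6.0, 15, 20.5, 6.5, 14), 2))
    ```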

  15. Virtual reality therapy for adults post-stroke: a systematic review and meta-analysis exploring virtual environments and commercial games in therapy.

    PubMed

    Lohse, Keith R; Hilderman, Courtney G E; Cheung, Katharine L; Tatla, Sandy; Van der Loos, H F Machiel

    2014-01-01

    The objective of this analysis was to systematically review the evidence for virtual reality (VR) therapy in an adult post-stroke population in both custom built virtual environments (VE) and commercially available gaming systems (CG). MEDLINE, CINAHL, EMBASE, ERIC, PSYCInfo, DARE, PEDro, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were systematically searched from the earliest available date until April 4, 2013. Controlled trials that compared VR to conventional therapy were included. Population criteria included adults (>18) post-stroke, excluding children, cerebral palsy, and other neurological disorders. Included studies were reported in English. Quality of studies was assessed with the Physiotherapy Evidence Database Scale (PEDro). Twenty-six studies met the inclusion criteria. For body function outcomes, there was a significant benefit of VR therapy compared to conventional therapy controls, G = 0.48, 95% CI = [0.27, 0.70], and no significant difference between VE and CG interventions (P = 0.38). For activity outcomes, there was a significant benefit of VR therapy, G = 0.58, 95% CI = [0.32, 0.85], and no significant difference between VE and CG interventions (P = 0.66). For participation outcomes, the overall effect size was G = 0.56, 95% CI = [0.02, 1.10]. All participation outcomes came from VE studies. VR rehabilitation moderately improves outcomes compared to conventional therapy in adults post-stroke. Current CG interventions have been too few and too small to assess potential benefits of CG. Future research in this area should aim to clearly define conventional therapy, report on participation measures, consider motivational components of therapy, and investigate commercially available systems in larger RCTs. Prospero CRD42013004338.

  16. Radiology and Enterprise Medical Imaging Extensions (REMIX).

    PubMed

    Erdal, Barbaros S; Prevedello, Luciano M; Qian, Songyue; Demirer, Mutlu; Little, Kevin; Ryu, John; O'Donnell, Thomas; White, Richard D

    2018-02-01

    Radiology and Enterprise Medical Imaging Extensions (REMIX) is a platform originally designed to support the medical imaging-driven clinical and clinical-research operational needs of the Department of Radiology of The Ohio State University Wexner Medical Center. REMIX accommodates the storage and handling of "big imaging data," as needed for large multi-disciplinary cancer-focused programs. The evolving REMIX platform contains an array of integrated tools/software packages for the following: (1) server and storage management; (2) image reconstruction; (3) digital pathology; (4) de-identification; (5) business intelligence; (6) texture analysis; and (7) artificial intelligence. These capabilities, along with documentation and guidance explaining how to interact with a commercial system (e.g., PACS, EHR, commercial database) that currently exists in clinical environments, are to be made freely available.

  17. BRCA Share: A Collection of Clinical BRCA Gene Variants.

    PubMed

    Béroud, Christophe; Letovsky, Stanley I; Braastad, Corey D; Caputo, Sandrine M; Beaudoux, Olivia; Bignon, Yves Jean; Bressac-De Paillerets, Brigitte; Bronner, Myriam; Buell, Crystal M; Collod-Béroud, Gwenaëlle; Coulet, Florence; Derive, Nicolas; Divincenzo, Christina; Elzinga, Christopher D; Garrec, Céline; Houdayer, Claude; Karbassi, Izabela; Lizard, Sarab; Love, Angela; Muller, Danièle; Nagan, Narasimhan; Nery, Camille R; Rai, Ghadi; Revillion, Françoise; Salgado, David; Sévenet, Nicolas; Sinilnikova, Olga; Sobol, Hagay; Stoppa-Lyonnet, Dominique; Toulas, Christine; Trautman, Edwin; Vaur, Dominique; Vilquin, Paul; Weymouth, Katelyn S; Willis, Alecia; Eisenberg, Marcia; Strom, Charles M

    2016-12-01

    As next-generation sequencing increases access to human genetic variation, the challenge of determining the clinical significance of variants becomes ever more acute. Germline variants in the BRCA1 and BRCA2 genes can confer substantial lifetime risk of breast and ovarian cancer, and assessment of variant pathogenicity is a vital part of clinical genetic testing for these genes. A database of clinical observations of BRCA variants is a critical resource in that process. This article describes BRCA Share™, a database created by a unique international alliance of academic centers and commercial testing laboratories. By integrating the content of the Universal Mutation Database generated by the French Unicancer Genetic Group with the testing results of two large commercial laboratories, Quest Diagnostics and Laboratory Corporation of America (LabCorp), BRCA Share™ has assembled one of the largest publicly accessible collections of BRCA variants currently available. Although access is available to academic researchers without charge, commercial participants in the project are required to pay a support fee and contribute their data. The fees fund the ongoing curation effort, as well as planned experiments to functionally characterize variants of uncertain significance. The BRCA Share™ database can therefore be considered a model of successful data sharing between private companies and the academic world.

  18. Interpretive Guidance: What We’ve Learned

    DTIC Science & Technology

    2004-03-10

    The extracted briefing fragments cover the survey's demographics and findings. Application domains included database, Internet/Web/eCommerce, embedded real-time systems, custom software, and commercial, DOD/other-government, and contractor-to-DOD/other-government software. Participation was invited from ~7,000 people; over 4,000 had direct Internet access, and over 3,000 others were notified that the questionnaire was… Reported observations include: activities defined in the PA were performed, but not as formally as required; most things of ISM should be done at level 2; and the "Little A" acquisition process adds very…

  19. Adjacency and Proximity Searching in the Science Citation Index and Google

    DTIC Science & Technology

    2005-01-01

    The study examined adjacency and proximity searching across major database search engines, including commercial S&T database search engines (e.g., Science Citation Index (SCI), Engineering Compendex (EC), PubMed, OVID), Federal agency award database search engines (e.g., NSF, NIH, DOE, EPA, as accessed in Federal R&D Project Summaries), and Web search engines (e.g., Google). Some database search engines allow strict constrained co-occurrence searching as a user option (e.g., OVID, EC), while others do not (e.g., SCI).

  20. The new IAGOS Database Portal

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Fontaine, Alain

    2016-04-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European research infrastructure which aims at the provision of long-term, regular, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Database Portal (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). The new IAGOS Database Portal was released in December 2015. The main improvement is the implementation of interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up, composed of three data centers: the IAGOS database in Toulouse; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the CAMS data center in Jülich (http://join.iek.fz-juelich.de). The CAMS (Copernicus Atmospheric Monitoring Service) project is a prominent user of the IGAS data network. The new portal provides improved and new services such as download in NetCDF or NASA Ames formats, plotting tools (maps, time series, vertical profiles, etc.), and user management. Added-value products are available on the portal: back trajectories, origin of air masses, co-location with satellite data, etc. The link with the CAMS data center, through JOIN (Jülich OWS Interface), allows model outputs to be combined with IAGOS data for intercomparison. Finally, IAGOS metadata have been standardized (ISO 19115) and now provide complete information about data traceability and quality.

  1. Herbal medicine (Hyeolbuchukeo-tang or Xuefu Zhuyu decoction) for treating primary dysmenorrhoea: protocol for a systematic review of randomised controlled trials.

    PubMed

    Jo, Junyoung; Leem, Jungtae; Lee, Jin Moo; Park, Kyoung Sun

    2017-06-15

    Primary dysmenorrhoea is menstrual pain without pelvic pathology and is the most common gynaecological condition in women. Xuefu Zhuyu decoction (XZD), or Hyeolbuchukeo-tang, a traditional herbal formula, has been used as a treatment for primary dysmenorrhoea. The purpose of this study is to assess the current published evidence regarding XZD as a treatment for primary dysmenorrhoea. The following databases will be searched from their inception until April 2017: MEDLINE (via PubMed), Allied and Complementary Medicine Database (AMED), EMBASE, The Cochrane Library, six Korean medical databases (Korean Studies Information Service System, DBPia, Oriental Medicine Advanced Searching Integrated System, Research Information Service System, Korea Med and the Korean Traditional Knowledge Portal), three Chinese medical databases (China National Knowledge Infrastructure (CNKI), Wan Fang Database and Chinese Scientific Journals Database (VIP)) and one Japanese medical database (CiNii). Randomised clinical trials (RCTs) that will be included in this systematic review comprise those that used XZD or modified XZD. The control groups in the RCTs include no treatment, placebo, conventional medication or other treatments. Trials testing XZD as an adjunct to other treatments, and studies where the control group received the same treatment as the intervention group, will also be included. Data extraction and risk of bias assessments will be performed by two independent reviewers. The risk of bias will be assessed with the Cochrane risk of bias tool. All statistical analyses will be conducted using Review Manager software (RevMan V.5.3.0). This systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. The review will benefit patients and practitioners in the fields of traditional and conventional medicine. CRD42016050447. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. 78 FR 773 - Hartford Financial Services Group, Inc., Commercial/Actuarial/Information Delivery Services (IDS...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ...., Commercial/ Actuarial/Information Delivery Services (IDS)/Corporate & Financial Reporting group, Hartford... financial reporting. The group develops databases for creating reports for corporate, regulatory, and... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-81,815] Hartford Financial...

  3. Applied learning-based color tone mapping for face recognition in video surveillance system

    NASA Astrophysics Data System (ADS)

    Yew, Chuu Tian; Suandi, Shahrel Azmin

    2012-04-01

    In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. This technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its statistics match those of the training dataset. It is well known that differences in commercial surveillance camera models and in the signal processing chipsets used by different manufacturers cause the color and intensity of images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using Multi-Class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely the SCface database, this approach is validated and compared with the results of using a holistic approach on grayscale images. The results show that this technique is suitable for improving the color or intensity quality of video surveillance systems for face recognition.

  4. Using a commercial mathematics software package for on-line analysis at the BNL Accelerator Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malone, R.; Wang, X.J.

    By writing both a custom Windows NT(TM) dynamic link library and generic companion server software, the intrinsic functions of MathSoft Mathcad(TM) have been extended with new capabilities that permit direct access to the control system databases of the Brookhaven National Laboratory Accelerator Test Facility. Under this scheme, a Mathcad worksheet executing on a personal computer becomes a client that can both import and export data to a control system server via a network stream socket connection. The result is an alternative, mathematically oriented view of controlling the accelerator interactively.
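
    The client-server arrangement described above (a worksheet process acting as a stream-socket client against a control-system server) can be sketched compactly. Below is a minimal Python sketch, assuming a hypothetical line-oriented protocol; the host, port, and parameter names are illustrative and not the actual ATF interface.

    ```python
    import socket

    def read_parameter(host: str, name: str, port: int = 5000) -> float:
        """Request one control-system value over a network stream socket.

        The wire protocol here (newline-terminated request, ASCII reply)
        is an assumption for illustration; the abstract does not describe
        the actual ATF server protocol.
        """
        with socket.create_connection((host, port), timeout=5.0) as sock:
            sock.sendall(f"GET {name}\n".encode("ascii"))
            reply = sock.makefile().readline()
        return float(reply.strip())

    # Hypothetical usage:
    # gun_voltage = read_parameter("atf-ctrl.example.gov", "GUN:VOLTAGE")
    ```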

  5. Classifying compound mechanism of action for linking whole cell phenotypes to molecular targets

    PubMed Central

    Bourne, Christina R.; Wakeham, Nancy; Bunce, Richard A.; Berlin, K. Darrell; Barrow, William W.

    2013-01-01

    Drug development programs have proven successful when performed at a whole-cell level, thus incorporating solubility and permeability into the primary screen. However, linking those results to the target within the cell has been a major setback. The Phenotype Microarray system, marketed and sold by Biolog, seeks to address this need by assessing the phenotype in combination with a variety of chemicals with known mechanism of action (MOA). We have evaluated this system for usefulness in deducing the MOA of three test compounds. To achieve this, we constructed a database with 21 known antimicrobials, which served as a comparison for grouping our unknown-MOA compounds. Pearson correlation and Ward linkage calculations were used to generate a dendrogram that produced clustering largely by known MOA, although there were exceptions. Of the three unknown compounds, one was definitively placed as an anti-folate. The MOAs of the second and third compounds were not clearly identified, likely due to unique MOAs not represented within the commercial database. The availability of the database generated in this report for S. aureus ATCC 29213 will increase the accessibility of this technique to other investigators. From our analysis, the Phenotype Microarray system can group compounds with clear MOA, but distinction of unique or broadly acting MOAs is at this time less clear. PMID:22434711
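
    The clustering step named in this record (Pearson correlation combined with Ward linkage) can be reproduced with standard tools. A minimal Python sketch using SciPy follows, with random data standing in for real Phenotype Microarray profiles; the preprocessing details are assumptions, not the paper's exact pipeline.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, linkage
    from scipy.spatial.distance import pdist

    # Rows are compounds, columns are phenotype readouts; random values
    # stand in for real Phenotype Microarray measurements.
    rng = np.random.default_rng(0)
    profiles = rng.normal(size=(24, 96))  # e.g., 21 knowns + 3 unknowns
    labels = [f"compound_{i}" for i in range(24)]

    # Pearson correlation distance (1 - r), then Ward linkage, mirroring
    # the calculations named in the abstract.
    dist = pdist(profiles, metric="correlation")
    tree = linkage(dist, method="ward")
    dendrogram(tree, labels=labels, no_plot=True)  # no_plot=False to draw
    ```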

  6. Spatial digital database of the geologic map of Catalina Core Complex and San Pedro Trough, Pima, Pinal, Gila, Graham, and Cochise counties, Arizona

    USGS Publications Warehouse

    Dickinson, William R.; digital database by Hirschberg, Douglas M.; Pitts, G. Stephen; Bolm, Karen S.

    2002-01-01

    The geologic map of Catalina Core Complex and San Pedro Trough by Dickinson (1992) was digitized for input into a geographic information system (GIS) by U.S. Geological Survey staff and contractors in 2000-2001. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a GIS for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Digital base map data (topography, roads, towns, rivers, lakes, and so forth) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files that are provided herein are representations of the database. The map area is located in southern Arizona. This report lists the geologic map units, describes the methods used to convert the geologic map data into a digital format and the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Lorre Moyer (USGS) is greatly appreciated.

  7. Survival in commercially insured multiple sclerosis patients and comparator subjects in the U.S.

    PubMed

    Kaufman, D W; Reshef, S; Golub, H L; Peucker, M; Corwin, M J; Goodin, D S; Knappertz, V; Pleimes, D; Cutter, G

    2014-05-01

    Compare survival in patients with multiple sclerosis (MS) from a U.S. commercial health insurance database with a matched cohort of non-MS subjects. 30,402 MS patients and 89,818 non-MS subjects (comparators) in the OptumInsight Research (OIR) database from 1996 to 2009 were included. An MS diagnosis required at least 3 consecutive months of database reporting, with two or more ICD-9 codes of 340 at least 30 days apart, or the combination of 1 ICD-9-340 code and at least 1 MS disease-modifying treatment (DMT) code. Comparators required the absence of ICD-9-340 and DMT codes throughout database reporting. Up to three comparators were matched to each patient for: age in the year of the first relevant code (index year - at least 3 months of reporting in that year were required); sex; region of residence in the index year. Deaths were ascertained from the National Death Index and the Social Security Administration Death Master File. Subjects not identified as deceased were assumed to be alive through the end of 2009. Annual mortality rates were 899/100,000 among MS patients and 446/100,000 among comparators. Standardized mortality ratios compared to the U.S. population were 1.70 and 0.80, respectively. Kaplan-Meier analysis yielded a median survival from birth that was 6 years lower among MS patients than among comparators. The results show, for the first time in a U.S. population, a survival disadvantage for contemporary MS patients compared to non-MS subjects from the same healthcare system. The 6-year decrement in lifespan parallels a recent report from British Columbia. Copyright © 2013 Elsevier B.V. All rights reserved.
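
    The two headline statistics in this record (annual mortality per 100,000 and the standardized mortality ratio) follow from simple definitions. Below is a minimal Python sketch with illustrative numbers only, not the study's microdata.

    ```python
    def rate_per_100k(deaths: int, person_years: float) -> float:
        """Crude annual mortality rate per 100,000 person-years."""
        return 1e5 * deaths / person_years

    def smr(observed_deaths: int, expected_deaths: float) -> float:
        """Standardized mortality ratio: observed deaths divided by the
        deaths expected when reference-population (e.g., US) age- and
        sex-specific rates are applied to the cohort's person-time."""
        return observed_deaths / expected_deaths

    # Illustrative values only:
    print(rate_per_100k(273, 30_402))  # ~898 per 100,000 person-years
    print(smr(273, 160.6))             # ~1.70
    ```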

  8. Spatial distribution of clinical computer systems in primary care in England in 2016 and implications for primary care electronic medical record databases: a cross-sectional population study.

    PubMed

    Kontopantelis, Evangelos; Stevens, Richard John; Helms, Peter J; Edwards, Duncan; Doran, Tim; Ashcroft, Darren M

    2018-02-28

    UK primary care databases (PCDs) are used by researchers worldwide to inform clinical practice. These databases have been primarily tied to single clinical computer systems, but little is known about the adoption of these systems by primary care practices or their geographical representativeness. We explore the spatial distribution of clinical computing systems and discuss the implications for the longevity and regional representativeness of these resources. Cross-sectional study. English primary care clinical computer systems. 7526 general practices in August 2016. Spatial mapping of family practices in England in 2016 by clinical computer system at two geographical levels, the lower Clinical Commissioning Group (CCG, 209 units) and the higher National Health Service regions (14 units). Data for practices included numbers of doctors, nurses and patients, and area deprivation. Of 7526 practices, Egton Medical Information Systems (EMIS) was used in 4199 (56%), SystmOne in 2552 (34%) and Vision in 636 (9%). Great regional variability was observed for all systems, with EMIS having a stronger presence in the West of England, London and the South; SystmOne in the East and some regions in the South; and Vision in London, the South, Greater Manchester and Birmingham. PCDs based on single clinical computer systems are geographically clustered in England. For example, Clinical Practice Research Datalink and The Health Improvement Network, the most popular primary care databases in terms of research outputs, are based on the Vision clinical computer system, used by <10% of practices and heavily concentrated in three major conurbations and the South. Researchers need to be aware of the analytical challenges posed by clustering, and barriers to accessing alternative PCDs need to be removed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  9. Multigeneration data migration from legacy systems

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Liu, Brent J.; Kho, Hwa T.; Tao, Wenchao; Wang, Cun; McCoy, J. Michael

    2003-05-01

    The migration of image data from different generations of legacy archive systems represents a technical challenge and an incremental cost in transitions to newer generations of PACS. UCLA medical center has elected to completely replace the existing PACS infrastructure, encompassing several generations of legacy systems, with a new commercial system providing enterprise-wide image management and communication. One of the most challenging parts of the project was the migration of large volumes of legacy images into the new system. Planning the migration required the development of specialized software and hardware, and included different phases of data mediation from existing databases to the new PACS database prior to the migration of the image data. The project plan included a detailed analysis of the resources and cost of data migration to optimize the process and minimize the delay of a hybrid operation in which the legacy systems need to remain operational. Our analysis and project planning showed that data migration represents the most critical path in the process of PACS renewal. Careful planning and optimization of the project timeline and allocated resources are critical to minimize the financial impact and the time delays that such migrations can impose on the implementation plan.

  10. 37 CFR 1.105 - Requirements for information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... databases: The existence of any particularly relevant commercial database known to any of the inventors that... improvement, identification of what is being improved. (vii) In use: Identification of any use of the claimed... the use. (viii) Technical information known to applicant. Technical information known to applicant...

  11. DEEP: A Database of Energy Efficiency Performance to Accelerate Energy Retrofitting of Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof

    The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance based on DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP contains energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulated database is part of an ongoing project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to perform retrofit assessments that reduce energy use for small and medium buildings, whose owners typically do not have the resources to conduct a costly building energy audit. DEEP will be migrated into DEnCity - DOE's Energy City, which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.
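
    The "query DEEP for recommended measures" workflow amounts to a lookup against pre-simulated results. Below is a minimal Python sketch of that idea using SQLite; the schema, column names, and values are hypothetical, not the actual DEEP design.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE measure_savings (
        building_type TEXT, vintage TEXT, climate_zone TEXT,
        measure TEXT, kwh_savings REAL)""")
    con.executemany(
        "INSERT INTO measure_savings VALUES (?, ?, ?, ?, ?)",
        [("small_office", "pre-1978", "CZ12", "LED retrofit", 8200.0),
         ("small_office", "pre-1978", "CZ12", "roof insulation", 3100.0)])

    # Rank candidate retrofit measures for one building, as a retrofit
    # toolkit querying a pre-simulated database might.
    rows = con.execute(
        "SELECT measure, kwh_savings FROM measure_savings "
        "WHERE building_type = ? AND climate_zone = ? "
        "ORDER BY kwh_savings DESC",
        ("small_office", "CZ12")).fetchall()
    print(rows)
    ```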

  12. Optical measurements of paintings and the creation of an artwork database for authenticity

    PubMed Central

    Hwang, Seonhee; Song, Hyerin; Cho, Soon-Woo; Kim, Chang Eun; Kim, Chang-Seok; Kim, Kyujung

    2017-01-01

    Paintings have high cultural and commercial value and therefore need to be preserved. Many techniques, including X-ray analysis and optical coherence tomography (OCT), have been applied to analyze the properties of paintings and to protect genuine works from forgery. In this paper, we suggest a simple and accurate optical analysis system for detecting counterfeits, comprising fiber optics reflectance spectroscopy (FORS) and line laser-based topographic analysis. The system is designed to fully cover the whole area of a painting, regardless of its size, for accurate analysis. For additional assessment, line laser-based high-resolution OCT was utilized. For the experiments, experts created forgeries of three different styles of genuine paintings. After measuring the surface properties of the paintings, we observed that the genuine works and the forgeries have distinctive characteristics. The forgeries could be distinguished with up to 76.5% accuracy using the RGB spectra obtained by FORS, and with 100% accuracy by topographic analysis. Repeated runs confirmed the reliability of the system. We verified that the measurement system is worthwhile for the conservation of valuable paintings. To store the surface information of the paintings at micron scale, we created a numerical database. Consequently, we secured databases of three famous Korean paintings for accurate authentication. PMID:28151981

  14. Development and Testing of Carbon-Carbon Nozzle Extensions for Upper Stage Liquid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Valentine, Peter G.; Gradl, Paul R.; Greene, Sandra E.

    2017-01-01

    Carbon-carbon (C-C) composite nozzle extensions are of interest for use on a variety of launch vehicle upper stage engines and in-space propulsion systems. The C-C nozzle extension technology and test capabilities being developed are intended to support National Aeronautics and Space Administration (NASA) and Department of Defense (DOD) requirements, as well as those of the broader Commercial Space industry. For NASA, C-C nozzle extension technology development primarily supports the NASA Space Launch System (SLS) and NASA's Commercial Space partners. Marshall Space Flight Center (MSFC) efforts are aimed at both (a) further developing the technology and databases needed to enable the use of composite nozzle extensions on cryogenic upper stage engines, and (b) developing and demonstrating low-cost capabilities for testing and qualifying composite nozzle extensions. Recent, ongoing, and potential future work supporting NASA, DOD, and Commercial Space needs will be discussed. Information to be presented will include (a) recent and ongoing mechanical, thermal, and hot-fire testing, as well as (b) potential future efforts to further develop and qualify domestic C-C nozzle extension solutions for the various upper stage engines under development.

  15. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance: Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-01

    Thermal and moisture problems in existing basements create a unique challenge because the exterior face of the wall is not easily or inexpensively accessible. This approach addresses thermal and moisture management from the interior face of the wall, without disturbing the exterior soil and landscaping, and has the potential to improve durability, comfort, and indoor air quality. The project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  16. Changes in Exercise Data Management

    NASA Technical Reports Server (NTRS)

    Buxton, R. E.; Kalogera, K. L.; Hanson, A. M.

    2018-01-01

    The suite of exercise hardware aboard the International Space Station (ISS) generates an immense amount of data. The data collected from the treadmill, cycle ergometer, and resistance strength training hardware are basic exercise parameters (time, heart rate, speed, load, etc.). The raw data are post-processed in the laboratory, and more detailed parameters are calculated from each exercise data file. Updates have recently been made to how these valuable data are stored, adding an additional level of data security, increasing data accessibility, and resulting in overall increased efficiency of medical report delivery. Questions regarding exercise performance, or how exercise may influence other variables of crew health, frequently arise within the crew health care community. Inquiries about the health of the exercise hardware often need quick analysis and response to ensure the exercise system remains operable on a continuous basis. Consolidating all of the exercise system data in a single repository enables a quick response to both the medical and engineering communities. A SQL Server database is currently in use and provides a secure location for all of the exercise data from ISS Expedition 1 to the current day. The database has been structured to update derived metrics automatically, making analysis and reporting available within minutes of dropping the inflight data into the database. Commercial tools were evaluated to help aggregate and visualize data from the SQL database. The Tableau software provides a manageable interface, which has improved the laboratory's output time for crew reports by 67%. Expansion of the SQL database to include additional medical requirement metrics, the addition of 'app-like' tools for mobile visualization, and collaborative use of the data system (e.g., by operational support teams, research groups, and International Partners) are currently being explored.
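
    The automatic update of derived metrics on data arrival can be illustrated with a database trigger. Below is a minimal Python sketch using SQLite; the table and column names are invented for illustration and are not the laboratory's actual SQL Server schema.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE treadmill_raw (
        crew_id TEXT, session_date TEXT, distance_m REAL, duration_s REAL);
    CREATE TABLE treadmill_derived (
        crew_id TEXT, session_date TEXT, mean_speed_mps REAL);
    -- Derive mean speed the moment a raw session row lands.
    CREATE TRIGGER derive_speed AFTER INSERT ON treadmill_raw
    BEGIN
        INSERT INTO treadmill_derived
        VALUES (NEW.crew_id, NEW.session_date,
                NEW.distance_m / NEW.duration_s);
    END;
    """)
    con.execute(
        "INSERT INTO treadmill_raw VALUES ('crew1', '2018-01-01', 2400, 720)")
    print(con.execute("SELECT * FROM treadmill_derived").fetchall())
    # [('crew1', '2018-01-01', 3.333...)]
    ```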

  17. Graphic Interfaces and Online Information.

    ERIC Educational Resources Information Center

    Percival, J. Mark

    1990-01-01

    Discusses the growing importance of the use of Graphic User Interfaces (GUIs) with microcomputers and online services. Highlights include the development of graphics interfacing with microcomputers; CD-ROM databases; an evaluation of HyperCard as a potential interface to electronic mail and online commercial databases; and future possibilities.…

  18. The Hidden Dimensions of Databases.

    ERIC Educational Resources Information Center

    Jacso, Peter

    1994-01-01

    Discusses methods of evaluating commercial online databases and provides examples that illustrate their hidden dimensions. Topics addressed include size, including the number of records or the number of titles; the number of years covered; and the frequency of updates. Comparisons of Readers' Guide Abstracts and Magazine Article Summaries are…

  19. Survey of Machine Learning Methods for Database Security

    NASA Astrophysics Data System (ADS)

    Kamra, Ashish; Ber, Elisa

    Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
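
    The anomaly detection idea surveyed here (profiling normal database activity and flagging departures from it) can be shown with a deliberately small example. Below is a minimal Python sketch; real products profile far richer features (tables touched, time of day, result sizes) than command frequency.

    ```python
    from collections import Counter

    class QueryProfileDetector:
        """Toy frequency-based anomaly detector for database commands."""

        def __init__(self, threshold: float = 0.01):
            self.counts = Counter()
            self.total = 0
            self.threshold = threshold

        def train(self, commands):
            """Build the profile of normal activity."""
            for cmd in commands:
                self.counts[cmd] += 1
                self.total += 1

        def is_anomalous(self, cmd: str) -> bool:
            """Flag commands rarely or never seen during profiling."""
            freq = self.counts[cmd] / self.total if self.total else 0.0
            return freq < self.threshold

    detector = QueryProfileDetector()
    detector.train(["SELECT", "SELECT", "UPDATE"] * 100)
    print(detector.is_anomalous("DROP"))  # True: absent from the profile
    ```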

  20. Financing a future for public biological data.

    PubMed

    Ellis, L B; Kalumbi, D

    1999-09-01

    The public web-based biological database infrastructure is a source of both wonder and worry. Users delight in the ever increasing amounts of information available; database administrators and curators worry about long-term financial support. An earlier study of 153 biological databases (Ellis and Kalumbi, Nature Biotechnol., 16, 1323-1324, 1998) determined that near-future (1-5 year) funding for over two-thirds of them was uncertain. More detailed data are required to determine the magnitude of the problem and offer possible solutions. This study examines the finances and use statistics of a few of these organizations in more depth, and reviews several economic models that may help sustain them. Six organizations were studied. Their administrative overhead is fairly low; non-administrative personnel and computer-related costs account for 77% of expenses. One smaller, more specialized US database, in 1997, had 60% of total access from US domains; a majority (56%) of its US accesses came from commercial domains, although only 2% of the 153 databases originally studied received any industrial support. The most popular model used to gain industrial support is asymmetric pricing: preferentially charging the commercial users of a database. At least five biological databases have recently begun using this model. Advertising is another model which may be useful for the more general, more heavily used sites. Microcommerce has promise, especially for databases that do not attract advertisers, but needs further testing. The least income reported for any of the databases studied was $50,000/year; applying this rate to 400 biological databases (a lower limit of the number of such databases, many of which require far larger resources) would mean an annual support need of at least $20 million. Obtaining this level of support is challenging, yet failure to accept the challenge could be catastrophic. lynda@tc.umn.edu

  1. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability features, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has recently been attracting considerable attention. Reported recognition rates of commercialized face recognition systems cannot be accepted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system and by formalizing the performance test process.

  2. Towards the implementation of a spectral database for the detection of biological warfare agents

    NASA Astrophysics Data System (ADS)

    Carestia, M.; Pizzoferrato, R.; Gelfusa, M.; Cenciarelli, O.; D'Amico, F.; Malizia, A.; Scarpellini, D.; Murari, A.; Vega, J.; Gaudio, P.

    2014-10-01

    The deliberate use of biological warfare agents (BWA) and other pathogens can jeopardize the safety of the population, fauna, and flora, and represents a concrete concern from the military and civil perspective. At present, the only commercially available tools for fast warning of a biological attack perform point detection and require active or passive sample collection. The development of a stand-off detection system would be extremely valuable to minimize the risk and the possible consequences of the release of biological aerosols into the atmosphere. Biological samples can be analyzed by means of several optical techniques covering a broad region of the electromagnetic spectrum. Strong evidence indicates that the informative content of fluorescence spectra can provide good preliminary discrimination among such agents, and that it can also be obtained through stand-off measurements. Such a system requires a database and a mathematical method for the discrimination of the spectral signatures. In this work, we collected fluorescence emission spectra of the main BWA simulants to implement a spectral signature database and apply the Universal Multi Event Locator (UMEL) statistical method. Our preliminary analysis, conducted under laboratory conditions with a standard UV lamp source, considers the main experimental setups influencing the fluorescence signature of some of the most commonly used BWA simulants. Our work represents a first step towards the implementation of a spectral database and a laser-based biological stand-off detection and identification technique.

  3. Decision support systems and applications in ophthalmology: literature and commercial review focused on mobile apps.

    PubMed

    de la Torre-Díez, Isabel; Martínez-Pérez, Borja; López-Coronado, Miguel; Díaz, Javier Rodríguez; López, Miguel Maldonado

    2015-01-01

    The growing importance that mobile devices have in daily life has also reached health care and medicine. This is changing the paradigm of health care and increasing the relevance of mHealth, or mobile health, whose main vehicle is the app. This new reality makes it possible for doctors who are not specialists to have easy access to information generated in different corners of the world, making them potential keepers of that knowledge. However, the volume of new daily information exceeds the limits of the human intellect, making Decision Support Systems (DSS) necessary to help doctors diagnose diseases and decide what course of action to take towards these diagnoses. Such systems could improve health care in remote areas and developing countries. All of this is even more important for diseases that are prevalent in primary care and that directly affect people's quality of life, as is the case for ophthalmological problems, where a specialist in ophthalmology is often not involved in first patient care. The goal of this paper is to analyse the state of the art of DSS in ophthalmology, many of which focus on diseases affecting the eye's posterior pole. To achieve the main purpose of this research work, a literature review and an analysis of commercial apps were performed. The databases and systems used were IEEE Xplore, Web of Science (WoS), Scopus, and PubMed. The search was limited to articles published from 2000 onwards. Different Mobile Decision Support Systems (MDSS) in ophthalmology were then analysed in the virtual stores for Android and iOS. A total of 37 articles were selected according to their themes (posterior pole, anterior pole, Electronic Health Records (EHRs), cloud, data mining, algorithms and structures for DSS, and other) from the 600 found in the cited databases. Very few mobile apps were found in the different stores. It can be concluded that almost all existing mobile apps focus on the eye's posterior pole; among them, most are intended for the diagnosis of diabetic retinopathy. The primary market niche of the commercial apps is general physicians.

  4. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
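
    The reported accuracy figures can be read with one common convention for size-estimation studies, in which the bias factor is the geometric mean of the actual-to-predicted ratios and the fluctuation factor measures their geometric spread. The paper does not spell out its formulas, so the Python sketch below is an assumption-laden illustration, not the tool's actual method.

    ```python
    import numpy as np

    def bias_and_fluctuation(predicted, actual):
        """Geometric-mean bias and spread of size-prediction ratios.

        Assumed definitions (not confirmed by the paper):
        bias = geometric mean of actual/predicted;
        fluctuation = geometric mean of max(r, 1/r) for each ratio r.
        """
        r = np.asarray(actual, float) / np.asarray(predicted, float)
        bias = float(np.exp(np.log(r).mean()))
        fluctuation = float(np.exp(np.abs(np.log(r)).mean()))
        return bias, fluctuation

    # Invented example data (lines of source code):
    print(bias_and_fluctuation([1000, 2000, 4000], [1100, 1700, 5200]))
    ```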

  5. Electronic Publishing and Document Delivery; A Case Study of Commercial Information Services on the Internet.

    ERIC Educational Resources Information Center

    Abbott, Anthony

    1992-01-01

    Discusses the electronic publishing activities of Meckler Publishing on the Internet, including a publications catalog, an electronic journal, and tables of contents databases. Broader issues of commercial network publishing are also addressed, including changes in the research process, changes in publishing, bibliographic control,…

  6. Trends in Literacy Software Publication and Marketing: Multicultural Themes.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    This article provides data and discussion of multicultural theme-related issues arising from analysis of a detailed database of commercial software products targeted to reading and literacy education. The database consisted of 1152 titles, representing the offerings of 104 publishers and distributors. Of the titles, 62 were identified as having…

  7. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  8. In silico strategies for the selection of chelating compounds with potential application in metal-promoted neurodegenerative diseases

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rodríguez, Cristina; Rimola, Albert; Alí-Torres, Jorge; Sodupe, Mariona; González-Duarte, Pilar

    2011-01-01

    The development of new strategies to find commercial molecules with promising biochemical features is a main target in the field of biomedicinal chemistry. In this work we present an in silico protocol that allows the identification of commercial compounds with suitable metal-coordinating and pharmacokinetic properties to act as metal-ion chelators in metal-promoted neurodegenerative diseases (MpND). Selection of the chelating ligands is done by combining quantum chemical calculations with searches of commercial compound databases via virtual screening. Starting from different designed molecular frameworks, which mainly constitute the binding site, virtual screening of the databases identifies commercial molecules that enclose such scaffolds and that, through a set of chemical and pharmacokinetic filters, satisfy the drug-like requirements mandatory for dealing with MpND. The quantum mechanical calculations are used to gauge the chelating properties of the selected candidate molecules by determining the structures of their metal complexes and evaluating their stability constants. With the proposed strategy, commercial compounds containing N and S donor atoms in the binding sites and capable of crossing the blood-brain barrier (BBB) have been identified and their chelating properties analyzed.

  9. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experience, theories, typical examples, and other related knowledge used in the pre-processing stage of FEA were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, followed by an integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and the analysis of a machine tool column is presented to demonstrate the validity of the system.

  10. A global organism detection and monitoring system for non-native species

    USGS Publications Warehouse

    Graham, J.; Newman, G.; Jarnevich, C.; Shory, R.; Stohlgren, T.J.

    2007-01-01

    Harmful invasive non-native species are a significant threat to native species and ecosystems, and the costs associated with non-native species in the United States are estimated at over $120 billion/year. While some local or regional databases exist for some taxonomic groups, there are no effective geographic databases designed to detect and monitor all species of non-native plants, animals, and pathogens. We developed a web-based solution called the Global Organism Detection and Monitoring (GODM) system to provide real-time data from a broad spectrum of users on the distribution and abundance of non-native species, including attributes of their habitats, for predictive spatial modeling of current and potential distributions. The four major subsystems of GODM provide dynamic links between the organism data, web pages, spatial data, and modeling capabilities. The core survey database tables for recording invasive species survey data are organized into three categories: "Where, Who & When, and What." Organisms are identified with Taxonomic Serial Numbers from the Integrated Taxonomic Information System. To allow users to immediately see a map of their data combined with other users' data, a custom geographic information system (GIS) Internet solution was required. The GIS solution provides an unprecedented level of flexibility in database access, allowing users to display maps of invasive species distributions or abundances based on various criteria, including taxonomic classification (i.e., phylum or division, order, class, family, genus, species, subspecies, and variety), a specific project, a range of dates, and a range of attributes (percent cover, age, height, sex, weight). This is a significant paradigm shift from "map servers" to true Internet-based GIS solutions. The remainder of the system was created with a mix of commercial products, open source software, and custom software. Custom GIS libraries were created where required for processing large datasets, accessing the operating system, and using existing libraries in C++, R, and other languages to develop the tools needed to track harmful species in space and time. The GODM database and system are crucial for early detection and rapid containment of invasive species. © 2007 Elsevier B.V. All rights reserved.
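
    The three-part organization of the survey tables ("Where, Who & When, and What") maps naturally onto a simple record structure. Below is a minimal Python sketch; the field names and example values are illustrative, not the actual GODM schema.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Where:
        latitude: float
        longitude: float

    @dataclass
    class WhoWhen:
        observer: str
        date_iso: str

    @dataclass
    class What:
        tsn: int              # ITIS Taxonomic Serial Number
        percent_cover: float  # one of several possible attributes

    @dataclass
    class SurveyRecord:
        where: Where
        who_when: WhoWhen
        what: What

    # Hypothetical observation:
    rec = SurveyRecord(Where(40.57, -105.08),
                       WhoWhen("volunteer_17", "2007-06-12"),
                       What(tsn=12345, percent_cover=15.0))
    ```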

  11. Why an Eye Limiting Display Resolution Matters

    NASA Technical Reports Server (NTRS)

    Kato, Kenji Hiroshi

    2013-01-01

    Many factors affect the suitability of an out-the-window simulator visual system. Contrast, brightness, resolution, field of view, update rate, scene content, and a number of other criteria are common factors often used to define requirements for simulator visual systems. For the past 7 years, NASA has worked with the USAF on the Operational Based Vision Assessment (OBVA) program. The purpose of this program has been to provide the USAF School of Aerospace Medicine with a scientific testing laboratory to study human vision and testing standards in an operationally relevant environment. It was determined early in the design that current commercial and military training systems were not well suited to the available budget or the highly research-oriented requirements. During various design review meetings, it was determined that the OBVA requirements were best met by using commercial off-the-shelf equipment to minimize technical risk and costs. In this paper we describe how the simulator specifications were developed to meet the research objectives, along with the resulting architecture and design considerations. In particular, we discuss the image generator architecture and the database developments needed to achieve eye-limiting resolution.

  12. Lessons Learned from Managing a Petabyte

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J

    2005-01-20

    The amount of data collected and stored by the average business doubles each year. Many commercial databases are already approaching hundreds of terabytes, and at this rate, will soon be managing petabytes. More data enables new functionality and capability, but the larger scale reveals new problems and issues hidden in "smaller" terascale environments. This paper presents some of these new problems along with implemented solutions in the framework of a petabyte dataset for a large High Energy Physics experiment. Through experience with two persistence technologies, a commercial database and a file-based approach, we expose format-independent concepts and issues prevalent at this new scale of computing.

  13. Identification of clinical yeasts by Vitek MS system compared with API ID 32 C.

    PubMed

    Durán-Valle, M Teresa; Sanz-Rodríguez, Nuria; Muñoz-Paraíso, Carmen; Almagro-Moltó, María; Gómez-Garcés, José Luis

    2014-05-01

    We performed a clinical evaluation of the Vitek MS matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) system with the commercial database version 2.0 for rapid identification of medically important yeasts as compared with the conventional phenotypic method API ID 32 C. We tested 161 clinical isolates, nine isolates from culture collections and five reference strains. In case of discrepant results or no identification with one or both methods, molecular identification techniques were employed. Concordance between both methods was observed with 160/175 isolates (91.42%) and misidentifications by both systems occurred only when taxa were not included in the respective databases, i.e., one isolate of Candida etchellsii was identified as C. globosa by Vitek MS and two isolates of C. orthopsilosis were identified as C. parapsilosis by API ID 32 C. Vitek MS could not identify nine strains (5.14%) and API ID 32 C did not identify 13 (7.42%). Vitek MS was more reliable than API ID 32 C and reduced the time required for the identification of clinical isolates to only a few minutes.

  14. COINS: A composites information database system

    NASA Technical Reports Server (NTRS)

    Siddiqi, Shahid; Vosteen, Louis F.; Edlow, Ralph; Kwa, Teck-Seng

    1992-01-01

    An automated data abstraction form (ADAF) was developed to collect information on advanced fabrication processes and their related costs. The information will be collected for all components being fabricated as part of the ACT program and included in a COmposites INformation System (COINS) database. The aim of the COINS development effort is to provide future airframe preliminary design and fabrication teams with a tool through which production cost can become a deterministic variable in the design optimization process. The effort was initiated by the Structures Technology Program Office (STPO) of NASA LaRC to implement the recommendations of a working group comprised of representatives from the commercial airframe companies. The principal working group recommendation was to re-institute the collection of composite part fabrication data in a format similar to the DOD/NASA Structural Composites Fabrication Guide. The fabrication information collection form was automated with current user-friendly computer technology. This work-in-progress paper describes the new automated form and the features that make it easy to use by an aircraft structural design-manufacturing team.

  15. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques beyond those used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on the fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
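
    The core idea (fixed-size numeric arrays stored inside the database, with math applied close to the data) can be mimicked outside SQL Server. Below is a minimal Python sketch using SQLite blobs and NumPy; it illustrates the concept only and is not the Array Library's actual interface.

    ```python
    import sqlite3
    import numpy as np

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fields (id INTEGER PRIMARY KEY, density BLOB)")

    # Store a fixed-size float64 array as a binary value.
    grid = np.random.default_rng(1).random(64)
    con.execute("INSERT INTO fields (density) VALUES (?)", (grid.tobytes(),))

    # Retrieve and operate on the array without an intermediate file,
    # e.g., a Fourier transform as the abstract mentions for FFTW.
    blob, = con.execute("SELECT density FROM fields WHERE id = 1").fetchone()
    restored = np.frombuffer(blob, dtype=np.float64)
    spectrum = np.fft.rfft(restored)
    assert np.allclose(restored, grid)
    ```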

  16. The Clear Creek Envirohydrologic Observatory: From Vision Toward Reality

    NASA Astrophysics Data System (ADS)

    Just, C.; Muste, M.; Kruger, A.

    2007-12-01

    As the vision of a fully functional Clear Creek Envirohydrologic Observatory comes closer to reality, the opportunities for significant watershed science advances in the near future become more apparent. As a starting point toward this vision, we focused on creating a working example of cyberinfrastructure in the hydrologic and environmental sciences. The system will integrate a broad range of technologies and ideas: wired and wireless sensors, low-power wireless communication, embedded microcontrollers, commodity cellular networks, the internet, unattended quality assurance, metadata, relational databases, machine-to-machine communication, interfaces to hydrologic and environmental models, feedback, and external inputs. Hardware: An accomplishment to date is in-house developed sensor networking electronics that complement commercially available communications. The first of these networkable sensors are dielectric soil moisture probes that are arrayed and equipped with wireless connectivity for communications. Commercially available data logging and telemetry-enabled systems deployed at the Clear Creek testbed include a Campbell Scientific CR1000 datalogger, a Redwing 100 cellular modem, a YA Series yagi antenna, an NP12 rechargeable battery, and a BP SX20U solar panel. This networking equipment has been coupled with Hach DS5X water quality sondes, DTS-12 turbidity probes, and MicroLAB nutrient analyzers. Software: Our existing data model is an Arc Hydro-based geodatabase customized with applications for extraction and population of the database with third-party data. The following third-party data are acquired automatically and in real time into the Arc Hydro customized database: 1) geophysical data: 10 m DEM and soil grids, soils; 2) land use/land cover data; and 3) eco-hydrological data: radar-based rainfall estimates, stream gage data, streamlines, and water quality data. New processing software for the analysis of Acoustic Doppler Current Profiler (ADCP) measurements has been finalized. The software package provides mean flow field and turbulence characteristics obtained by operating the ADCP at fixed points or using the moving-boat approach. Current Work: The current development work is focused on extracting and populating the Clear Creek database with in-situ measurements acquired and transmitted in real time by sensors deployed in the Clear Creek watershed.

  17. Characteristics and treatment patterns of US commercially insured and Medicaid patients with opioid dependence or abuse.

    PubMed

    Wollschlaeger, Bernd A; Willson, Tina M; Montejano, Leslie B; Ronquest, Naoko A; Nadipelli, Vijay R

    To identify the demographic and clinical characteristics of commercially insured and Medicaid patients with a diagnosis of opioid dependence or abuse and to describe the pharmacological and nonpharmacological treatments received by these patients. This was a retrospective observational study using de-identified administrative claims data. The analysis included commercially insured and Medicaid patient data extracted from the Truven Health MarketScan® Commercial and Medicaid Databases. Patients with a diagnosis of opioid dependence or abuse from 2008 to 2014 (earliest diagnosis = index date) and a minimum of 6 months of pre-index and postindex continuous enrollment in the database. Baseline demographic and clinical characteristics, medication-assisted treatment (MAT), and treatment other than MAT received following diagnosis, and the clinical practice setting in which patients received any opioid dependence-related care were reported. Data from commercially insured (N = 103,768) and Medicaid (N = 50,552) patients were analyzed. Common comorbid conditions included chronic pain (48.6 percent Commercial, 56.8 percent Medicaid), depressive disorder (24.0 percent Commercial, 32.8 percent Medicaid), and other substance abuse disorders (13.3 percent Commercial, 23.7 percent Medicaid). Nearly one third of both Commercial (31.6 percent) and Medicaid (33.6 percent) patients did not have any claims for psychosocial therapy or MAT during the follow-up period. Only 24.3 percent of Commercial patients and 20.4 percent of Medicaid patients had evidence of claims for both MAT and psychosocial treatment anytime following diagnosis. The results suggest that there are opportunities to improve care through comprehensive and coordinated treatment for opioid dependence/abuse. Policies aimed at improving treatment access may be warranted.

  18. Market Assessment of Biomass Gasification and Combustion Technology for Small- and Medium-Scale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, D.; Haase, S.

    2009-07-01

    This report provides a market assessment of gasification and direct combustion technologies that use wood and agricultural resources to generate heat, power, or combined heat and power (CHP) for small- to medium-scale applications. It contains a brief overview of wood and agricultural resources in the U.S.; a description and discussion of gasification and combustion conversion technologies that utilize solid biomass to generate heat, power, and CHP; an assessment of the commercial status of gasification and combustion technologies; a summary of gasification and combustion system economics; a discussion of the market potential for small- to medium-scale gasification and combustion systems; and an inventory of direct combustion system suppliers and gasification technology companies. The report indicates that while direct combustion and close-coupled gasification boiler systems used to generate heat, power, or CHP are commercially available from a number of manufacturers, two-stage gasification systems are largely in development, with a number of technologies currently in demonstration. The report also cites the need for a searchable, comprehensive database of operating combustion and gasification systems that generate heat, power, or CHP built in the U.S., as well as a national assessment of the market potential for the systems.

  19. A USA Commercial Flight Track Database for Upper Tropospheric Aircraft Emission Studies

    NASA Technical Reports Server (NTRS)

    Garber, Donald P.; Minnis, Patrick; Costulis, Kay P.

    2003-01-01

    A new air traffic database over the contiguous United States of America (USA) has been developed from a commercially available real-time product for 2001-2003 for all non-military flights above 25,000 ft. Both individual flight tracks and gridded spatially integrated flight legs are available. On average, approximately 24,000 high-altitude flights were recorded each day. The diurnal cycle of air traffic over the USA is characterized by a broad daytime maximum with a 0130 LT minimum and a mean day-night air traffic ratio of 2.4. Each week, the air traffic typically peaks on Thursday and drops to a low on Saturday, with a range of 18%. Flight density is greatest during late summer and least during winter. The database records the disruption of air traffic after the air traffic shutdown during September 2001. The dataset should be valuable for realistically simulating the atmospheric effects of aircraft in the upper troposphere.

  20. Pattern Recognition-Assisted Infrared Library Searching of the Paint Data Query Database to Enhance Lead Information from Automotive Paint Trace Evidence.

    PubMed

    Lavine, Barry K; White, Collin G; Allen, Matthew D; Weakley, Andrew

    2017-03-01

    Multilayered automotive paint fragments, which are one of the most complex materials encountered in the forensic science laboratory, provide crucial links in criminal investigations and prosecutions. To determine the origin of these paint fragments, forensic automotive paint examiners have turned to the paint data query (PDQ) database, which allows the forensic examiner to compare the layer sequence and color, texture, and composition of the sample to paint systems of the original equipment manufacturer (OEM). However, modern automotive paints have a thin color coat and this layer on a microscopic fragment is often too thin to obtain accurate chemical and topcoat color information. A search engine has been developed for the infrared (IR) spectral libraries of the PDQ database in an effort to improve discrimination capability and permit quantification of discrimination power for OEM automotive paint comparisons. The similarity of IR spectra of the corresponding layers of various records for original finishes in the PDQ database often results in poor discrimination using commercial library search algorithms. A pattern recognition approach employing pre-filters and a cross-correlation library search algorithm that performs both a forward and backward search has been used to significantly improve the discrimination of IR spectra in the PDQ database and thus improve the accuracy of the search. This improvement permits inter-comparison of OEM automotive paint layer systems using the IR spectra alone. Such information can serve to quantify the discrimination power of the original automotive paint encountered in casework and further efforts to succinctly communicate trace evidence to the courts.
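
    The forward/backward cross-correlation search itself belongs to the cited work, but the scoring idea can be sketched in a few lines: normalize two spectra on a common wavenumber grid and take the best correlation over small alignment shifts. This is a generic stand-in, not the PDQ search engine or its pre-filters.

        # Generic correlation-based library scoring for IR spectra (sketch).
        import numpy as np

        def correlation_score(query: np.ndarray, ref: np.ndarray, max_lag: int = 5) -> float:
            """Best normalized cross-correlation over small wavenumber shifts."""
            q = (query - query.mean()) / query.std()
            r = (ref - ref.mean()) / ref.std()
            best = float("-inf")
            for lag in range(-max_lag, max_lag + 1):
                a = q[lag:] if lag >= 0 else q[:lag]
                b = r[:len(r) - lag] if lag >= 0 else r[-lag:]
                best = max(best, float(np.dot(a, b)) / len(a))
            return best

        def search_library(query, library):
            """Rank library entries (name -> spectrum on the same grid) by score."""
            return sorted(((correlation_score(query, s), name)
                           for name, s in library.items()), reverse=True)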

  1. Spatial digital database for the geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho

    USGS Publications Warehouse

    Rember, William C.; Bennett, Earl H.

    2001-01-01

    The paper geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho (Rember and Bennett, 1979) was scanned and initially attributed by Optronics Specialty Co., Inc. (Northridge, CA) and submitted to the U.S. Geological Survey for further attribution and publication of the geospatial digital files. The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. Digital base map data files (topography, roads, towns, rivers, lakes, and others) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (pull250k.gra/.hp/.eps) that are provided in the digital package are representations of the digital database.

  2. Inductive knowledge acquisition experience with commercial tools for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported on building the full system, known as Scotty, including its extension to integrate large databases. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational database management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.

  3. Technology Commercialization Effects on the Conduct of Research in Higher Education

    ERIC Educational Resources Information Center

    Powers, Joshua B.; Campbell, Eric G.

    2011-01-01

    The objective of this study was to investigate the effects of technology commercialization on researcher practice and productivity at U.S. universities. Using data drawn from licensing contract documents and databases of university-industry linkages and faculty research output, the study findings suggest that the common practice of licensing…

  4. External Data and Attribute Hyperlink Programs for Promis*e®

    NASA Technical Reports Server (NTRS)

    Derengowski, Rich; Gruel, Andrew

    2001-01-01

    External Data and Attribute Hyperlink are computer programs that can be added to Promis*e™, which is a commercial software system that automates routine tasks in the design (including drawing schematic diagrams) of electrical control systems. The programs were developed under the Stennis Space Center's (SSC) Dual Use Technology Development Program to provide capabilities for SSC's BMCS configuration management system, which uses Promis*e™. The External Data program enables the storage and management of information in an external database linked to a drawing. Changes can be made either in the database or on the drawing. Information that originates outside Promis*e™ can be stored in custom fields that can be added to the database. Although this information is not available in Promis*e™ printed drawings, it can be associated with symbols in the drawings, and can be retrieved through the drawings when the software is running. The Attribute Hyperlink program enables the addition of hyperlink information as attributes of symbols. This program enables the formation of a direct hyperlink between a schematic diagram and an Internet site or a file on a compact disk, on the user's hard drive, or on another computer on a network to which the user's computer is connected. The user can then obtain information directly related to the part (e.g., maintenance, or troubleshooting information) associated with the hyperlink.

  5. DEEP: Database of Energy Efficiency Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    The Database of Energy Efficiency Performance (DEEP) is a pre-simulated database enabling quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from results of about 10 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models are developed for a comprehensive assessment of building energy performance based on DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six vintages of construction and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory. The pre-simulated database is part of the CEC PIER project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to assess retrofits that reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct costly building energy audits.
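
    The toolkit workflow described above is essentially a parameterized lookup into the pre-simulated results. A minimal sketch of such a query follows; the table and column names are hypothetical, since DEEP's actual schema is not given in this record.

        # Hypothetical lookup against a pre-simulated measure database (sketch).
        import sqlite3

        def top_measures(db_path, building_type, vintage, climate_zone, limit=10):
            con = sqlite3.connect(db_path)
            rows = con.execute(
                """SELECT measure_name, energy_savings_kwh, payback_years
                   FROM measure_results
                   WHERE building_type = ? AND vintage = ? AND climate_zone = ?
                   ORDER BY energy_savings_kwh DESC
                   LIMIT ?""",
                (building_type, vintage, climate_zone, limit),
            ).fetchall()
            con.close()
            return rows

    When the user's criterion is payback rather than savings, the same query would simply order by payback_years ascending.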

  6. The Mayak Worker Dosimetry System (MWDS-2013): Implementation of the Dose Calculations.

    PubMed

    Zhdanov, A; Vostrotin, V; Efimov, A; Birchall, A; Puncher, M

    2016-07-15

    The calculation of internal doses for the Mayak Worker Dosimetry System (MWDS-2013) involved extensive computational resources due to the complexity and sheer number of calculations required. The required output consisted of a set of 1000 hyper-realizations: each hyper-realization consists of a set (one for each worker) of probability distributions of organ doses. This report describes the hardware components and computational approaches required to make the calculation tractable. Together with the software, this system is referred to here as the 'PANDORA system'. It is based on a commercial SQL server database distributed across six workstations. A complete run of the entire Mayak worker cohort entailed a huge number of calculations in PANDORA, and due to the relatively slow speed of writing the data into the SQL server, each run took about 47 days. Quality control was monitored by comparing doses calculated in PANDORA with those in a specially modified version of the commercial software 'IMBA Professional Plus'. Suggestions are also made for increasing calculation and storage efficiency for future dosimetry calculations using PANDORA. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large-scale, highly available systems frequently fall back on "enterprise" class solutions like Oracle RAC or commercial-grade JEE application servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturation, ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions... and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions for visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development, but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  8. UnCover on the Web: search hints and applications in library environments.

    PubMed

    Galpern, N F; Albert, K M

    1997-01-01

    Among the huge maze of resources available on the Internet, UnCoverWeb stands out as a valuable tool for medical libraries. This up-to-date, free-access, multidisciplinary database of periodical references is searched through an easy-to-learn graphical user interface that is a welcome improvement over the telnet version. This article reviews the basic and advanced search techniques for UnCoverWeb, as well as providing information on the document delivery functions and table of contents alerting service called Reveal. UnCover's currency is evaluated and compared with other current awareness resources. System deficiencies are discussed, with the conclusion that although UnCoverWeb lacks the sophisticated features of many commercial database search services, it is nonetheless a useful addition to the repertoire of information sources available in a library.

  9. Design and implementation of a library-based information service in molecular biology and genetics at the University of Pittsburgh

    PubMed Central

    Chattopadhyay, Ansuman; Tannery, Nancy Hrinya; Silverman, Deborah A. L.; Bergen, Phillip; Epstein, Barbara A.

    2006-01-01

    Setting: In summer 2002, the Health Sciences Library System (HSLS) at the University of Pittsburgh initiated an information service in molecular biology and genetics to assist researchers with identifying and utilizing bioinformatics tools. Program Components: This novel information service comprises hands-on training workshops and consultation on the use of bioinformatics tools. The HSLS also provides an electronic portal and networked access to public and commercial molecular biology databases and software packages. Evaluation Mechanisms: Researcher feedback gathered during the first three years of workshops and individual consultation indicate that the information service is meeting user needs. Next Steps/Future Directions: The service's workshop offerings will expand to include emerging bioinformatics topics. A frequently asked questions database is also being developed to reuse advice on complex bioinformatics questions. PMID:16888665

  10. TERMTrial--terminology-based documentation systems for cooperative clinical trials.

    PubMed

    Merzweiler, A; Weber, R; Garde, S; Haux, R; Knaup-Gregori, P

    2005-04-01

    Within cooperative groups of multi-center clinical trials, standardized documentation is a prerequisite for communication and sharing of data. Standardizing documentation systems means standardizing the underlying terminology. The management and consistent application of terminology systems is a difficult and fault-prone task, which should be supported by appropriate software tools. Today, documentation systems for clinical trials are often implemented as so-called remote data entry systems (RDE systems). Although there are many commercial systems that support the development of RDE systems, none offers comprehensive terminological support. Therefore, we developed the software system TERMTrial, which consists of a component for the definition and management of terminology systems for cooperative groups of clinical trials and two components for the terminology-based automatic generation of trial databases and the terminology-based interactive design of electronic case report forms (eCRFs). TERMTrial combines the advantages of remote data entry with comprehensive terminological control.

  11. U.S. Commercial Spent Nuclear Fuel Assembly Characteristics - 1968-2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Jianwei; Peterson, Joshua L.; Gauld, Ian C.

    2016-09-01

    Activities related to management of spent nuclear fuel (SNF) are increasing in the US and many other countries. Over 240,000 SNF assemblies have been discharged from US commercial reactors since the late 1960s. The enrichment and burnup of SNF have changed significantly over the past 40 years, and fuel assembly designs have also evolved. Understanding the general characteristics of SNF helps regulators and other stakeholders form overall strategies towards the final disposal of US SNF. This report documents a survey of all US commercial SNF assemblies in the GC-859 database and provides reference SNF source terms (e.g., nuclide inventories, decay heat, and neutron/photon emission) at various cooling times up to 200 years after fuel discharge. This study reviews the distribution and evolution of fuel parameters of all SNF assemblies discharged over the past 40 years. Assemblies were categorized into three groups based on discharge year, and the median burnups and enrichments of each group were used to establish representative cases. An extended burnup case was created for boiling water reactor (BWR) fuels, and another was created for the pressurized water reactor (PWR) fuels. Two additional cases were developed to represent the eight mixed oxide (MOX) fuel assemblies in the database. Burnup calculations were performed for each representative case. Realistic parameters for fuel design and operations were used to model the SNF and to provide reference fuel characteristics representative of the current inventory. Burnup calculations were performed using the ORIGEN code, which is part of the SCALE nuclear modeling and simulation code system. Results include total activity, decay heat, photon emission, neutron flux, gamma heat, and plutonium content, as well as concentrations for 115 significant nuclides. These quantities are important in the design, regulation, and operations of SNF storage, transportation, and disposal systems.

  12. Impact investigation of reactor fuel operating parameters on reactivity for use in burnup credit applications

    NASA Astrophysics Data System (ADS)

    Sloma, Tanya Noel

    When representing the behavior of commercial spent nuclear fuel (SNF), credit is sought for the reduced reactivity associated with the net depletion of fissile isotopes and the creation of neutron-absorbing isotopes, a process that begins when a commercial nuclear reactor is first operated at power. Burnup credit accounts for the reduced reactivity potential of a fuel assembly and varies with the fuel burnup, cooling time, and the initial enrichment of fissile material in the fuel. With regard to long-term SNF disposal and transportation, tremendous benefits, such as increased capacity, flexibility of design and system operations, and reduced overall costs, provide an incentive to seek burnup credit for criticality safety evaluations. The Nuclear Regulatory Commission issued Interim Staff Guidance 8, Revision 2 in 2002, endorsing burnup credit of actinide composition changes only; credit due to actinides encompasses approximately 30% of existing pressurized water reactor SNF inventory and could potentially be increased to 90% if fission product credit were accepted. However, one significant issue for utilizing full burnup credit, compensating for actinide and fission product composition changes, is establishing a set of depletion parameters that produce an adequately conservative representation of the fuel's isotopic inventory. Depletion parameters can have a significant effect on the isotopic inventory of the fuel, and thus the residual reactivity. This research seeks to quantify the reactivity impact on a system from dominant depletion parameters (i.e., fuel temperature, moderator density, burnable poison rod, burnable poison rod history, and soluble boron concentration). Bounding depletion parameters were developed by statistical evaluation of a database containing reactor operating histories. The database was generated from summary reports of commercial reactor criticality data. Through depletion calculations, utilizing the SCALE 6 code package, several light water reactor assembly designs and in-core locations are analyzed in establishing a combination of depletion parameters that conservatively represent the fuel's isotopic inventory as an initiative to take credit for fuel burnup in criticality safety evaluations for transportation and storage of SNF.

  13. System Engineering Issues for Avionics Survival in the Space Environment

    NASA Technical Reports Server (NTRS)

    Pavelitz, Steven

    1999-01-01

    This paper examines how the system engineering process influences the design of a spacecraft's avionics by considering the space environment. Avionics are susceptible to the thermal, radiation, plasma, and meteoroids/orbital debris environments. The environment definitions for various spacecraft mission orbits (LEO/low inclination, LEO/Polar, MEO, HEO, GTO, GEO, and High Apogee Elliptical) are discussed. NASA models and commercial software used for environment analysis are reviewed. Applicability of technical references, such as NASA TM-4527 "Natural Orbital Environment Guidelines for Use in Aerospace Vehicle Development," is discussed. System engineering references, such as the MSFC System Engineering Handbook, are reviewed to determine how the environments are accounted for in the system engineering process. Tools and databases to assist the system engineer and avionics designer in addressing space environment effects on avionics are described and their usefulness assessed.

  14. Performance model for grid-connected photovoltaic inverters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyson, William Earl; Galbraith, Gary M.; King, David L.

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential and commercial-size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements in operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
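
    For reference, the empirical model this report defines is commonly written as a quadratic in DC power with voltage-dependent coefficients. The sketch below implements that published form; the parameter values a user supplies (Paco, Pdco, Vdco, Pso, C0-C3) would come from the parameter database or manufacturer data mentioned above.

        # Sandia grid-connected inverter model, published form (sketch).
        def inverter_ac_power(p_dc, v_dc, prm):
            """p_dc, v_dc: operating DC power (W) and voltage (V); prm: parameter dict."""
            dv = v_dc - prm["Vdco"]
            A = prm["Pdco"] * (1 + prm["C1"] * dv)  # DC power at which rated AC power is reached
            B = prm["Pso"] * (1 + prm["C2"] * dv)   # DC power required to start inversion
            C = prm["C0"] * (1 + prm["C3"] * dv)    # curvature of the AC-vs-DC power curve
            p_ac = ((prm["Paco"] / (A - B)) - C * (A - B)) * (p_dc - B) + C * (p_dc - B) ** 2
            return min(max(p_ac, 0.0), prm["Paco"])  # clip below zero and above the AC rating

    A quick sanity check on the form: at p_dc = A the expression reduces exactly to Paco, which is what anchors the A coefficient to the rated AC power.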

  15. Biotechnology: Commercialization and Economic Aspects, January 1993-June 1996. Quick Bibliography Series no. QB 96-10.

    ERIC Educational Resources Information Center

    Leonard, Scott A., Comp.; Dobert, Raymond, Comp.

    This bibliography on the commercialization and economic aspects of biotechnology was produced by the National Agricultural Library. It contains 151 citations in English from the AGRICOLA database. The search strategy is included, call numbers are given for each entry, and abstracts are provided for some citations. The bibliography concludes with…

  16. Trends in Solar energy Driven Vertical Ground Source Heat Pump Systems in Sweden - An Analysis Based on the Swedish Well Database

    NASA Astrophysics Data System (ADS)

    Juhlin, K.; Gehlin, S.

    2016-12-01

    Sweden is a world leader in developing and using vertical ground source heat pump (GSHP) technology. GSHP systems extract passively stored solar energy in the ground and the Earth's natural geothermal energy. Geothermal energy has been recognized as a renewable energy source in Sweden since 2007 and is the third largest renewable energy source in the country today. The Geological Survey of Sweden (SGU) is the authority in Sweden that provides open access geological data of rock, soil and groundwater for the public. All wells drilled must be registered in the SGU Well Database, and it is the well driller's duty to submit registration of drilled wells. Both active and passive geothermal energy systems are in use. Large GSHP systems, with at least 20 boreholes, are active geothermal energy systems. Energy is stored in the ground, which allows both comfort heating and cooling to be extracted. Active systems are therefore relevant for larger properties and industrial buildings. Since 1978 more than 600 000 wells (water wells, GSHP boreholes, etc.) have been registered in the Well Database, with around 20 000 new registrations per year. Of these wells an estimated 320 000 are registered as GSHP boreholes. The vast majority of these boreholes are single boreholes for single-family houses. The number of properties with registered vertical borehole GSHP installations amounts to approximately 243 000. Of these sites, between 300 and 350 are large GSHP systems with at least 20 boreholes. While the increase in the number of new registrations for smaller homes and households has slowed down after the rapid development in the 80's and 90's, the larger installations for commercial and industrial buildings have increased in number over the last ten years. This poster uses data from the SGU Well Database to quantify and analyze the trends in vertical GSHP systems reported between 1978 and 2015 in Sweden, with special focus on large systems. From the new aggregated data, conclusions can be drawn about the development of larger vertical GSHP system installations over the years and their geographical distribution in Sweden.
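
    The trend figures quoted here are the kind of aggregate that falls out of a single grouped query over the well registry. A sketch follows, with hypothetical table and column names rather than SGU's actual schema.

        # Hypothetical aggregation over a well registry (sketch).
        import sqlite3

        def gshp_registrations_by_year(db_path):
            con = sqlite3.connect(db_path)
            rows = con.execute(
                """SELECT strftime('%Y', registration_date) AS year,
                          COUNT(*) AS sites,
                          SUM(CASE WHEN borehole_count >= 20 THEN 1 ELSE 0 END) AS large_systems
                   FROM gshp_sites
                   GROUP BY year
                   ORDER BY year"""
            ).fetchall()
            con.close()
            return rows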

  17. Pit-a-Pat: A Smart Electrocardiogram System for Detecting Arrhythmia.

    PubMed

    Park, Juyoung; Lee, Kuyeon; Kang, Kyungtae

    2015-10-01

    Electrocardiogram (ECG) telemonitoring is one of the most promising applications of medical telemetry. However, previous approaches to ECG telemonitoring have largely relied on public databases of ECG results. In this article we propose a smart ECG system called Pit-a-Pat, which extracts features from ECG signals and detects arrhythmia. It is designed to run on an Android™ (Google, Mountain View, CA) device, without requiring modifications to other software. We implemented the Pit-a-Pat system using a commercial ECG device, and the experimental results demonstrate the effectiveness and accuracy of Pit-a-Pat for monitoring the ECG signal and analyzing the cardiac activity of a mobile patient. The proposed system allows monitoring of cardiac activity with automatic analysis, thereby providing a convenient, inexpensive, and ubiquitous adjunct to personal healthcare.

  18. Nonparametric Bayesian Modeling for Automated Database Schema Matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferragut, Erik M; Laska, Jason A

    2015-01-01

    The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
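
    The core comparison, "what is the probability that a single model generated both fields?", can be illustrated with a much simpler conjugate model than the authors' nonparametric ones: a Dirichlet-multinomial Bayes factor over symbol counts. This sketch only shows the structure of the test, not the paper's actual models.

        # Simplified "one model vs. two" Bayes factor over field contents (sketch).
        import math
        from collections import Counter

        def log_marginal(counts, alpha=1.0, k=128):
            """log p(data) under a symmetric Dirichlet(alpha) prior over k symbols."""
            n = sum(counts.values())
            out = math.lgamma(k * alpha) - math.lgamma(k * alpha + n)
            for c in counts.values():
                out += math.lgamma(alpha + c) - math.lgamma(alpha)
            return out

        def same_source_log_bf(field_a, field_b):
            """Positive values favor a single shared model, i.e. a schema match."""
            ca = Counter("".join(field_a))
            cb = Counter("".join(field_b))
            return log_marginal(ca + cb) - (log_marginal(ca) + log_marginal(cb))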

  19. Querying and Computing with BioCyc Databases

    PubMed Central

    Krummenacker, Markus; Paley, Suzanne; Mueller, Lukas; Yan, Thomas; Karp, Peter D.

    2006-01-01

    Summary: We describe multiple methods for accessing and querying the complex and integrated cellular data in the BioCyc family of databases: access through multiple file formats, access through Application Program Interfaces (APIs) for LISP, Perl and Java, and SQL access through the BioWarehouse relational database. Availability: The Pathway Tools software and 20 BioCyc DBs in Tiers 1 and 2 are freely available to academic users; fees apply to some types of commercial use. For download instructions see http://BioCyc.org/download.shtml PMID:15961440

  20. Geologic and structure map of the Choteau 1 degree by 2 degrees Quadrangle, western Montana

    USGS Publications Warehouse

    Mudge, Melville R.; Earhart, Robert L.; Whipple, James W.; Harrison, Jack E.

    1982-01-01

    The geologic and structure map of the Choteau 1 x 2 degree quadrangle (Mudge and others, 1982) was originally converted to a digital format by Jeff Silkwood (U.S. Forest Service) and completed by the U.S. Geological Survey staff and contractor at the Spokane Field Office (WA) in 2000 for input into a geographic information system (GIS). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (e.g., 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (chot250k.gra/.hp/.eps and chot-map.pdf) that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  1. Emerging modalities in dysphagia rehabilitation: neuromuscular electrical stimulation.

    PubMed

    Huckabee, Maggie-Lee; Doeltgen, Sebastian

    2007-10-12

    The aim of this review article is to advise the New Zealand medical community about the application of neuromuscular electrical stimulation (NMES) as a treatment for pharyngeal swallowing impairment (dysphagia). NMES in this field of rehabilitation medicine has quickly emerged as a widely used method overseas but has been accompanied by significant controversy. Basic information is provided about the physiologic background of electrical stimulation. The literature reviewed in this manuscript was derived through a computer-assisted search using the biomedical database Medline to identify all relevant articles published from the inception of the database up to January 2007. The reviewers used the following search strategy: [(deglutition disorders OR dysphagia) AND (neuromuscular electrical stimulation OR NMES)]. In addition, the technique of reference tracing was used, and very recently published studies known to the authors but not yet included in the database systems were included. This review elucidates not only the substantive potential benefit of this treatment, but also potential key concerns for patient safety and long-term outcome. The discussion within the clinical and research communities, especially around the commercially available VitalStim stimulator, is objectively explained.

  2. Incidence and prevalence of idiopathic inflammatory myopathies among commercially insured, Medicare supplemental insured, and Medicaid enrolled populations: an administrative claims analysis

    PubMed Central

    2012-01-01

    Background: Idiopathic inflammatory myopathies (IIMs) are a rare group of autoimmune syndromes characterized by chronic muscle inflammation and muscle weakness with no known cause. Little is known about their incidence and prevalence. This study reports the incidence and prevalence of IIMs among commercially insured and Medicare and Medicaid enrolled populations in the US. Methods: We retrospectively examined medical claims with an IIM diagnosis (ICD-9-CM 710.3 [dermatomyositis (DM)], 710.4 [polymyositis (PM)], 728.81 [interstitial myositis]) in the MarketScan® databases to identify age- and gender-adjusted annual IIM incidence and prevalence for 2004–2008. Sensitivity analysis was performed for evidence of a specialist visit (rheumatologist/neurologist/dermatologist), systemic corticosteroid or immunosuppressant use, or muscle biopsy. Results: We identified 2,990 incident patients between 2004 and 2008 (67% female, 17% Medicaid enrollees, 27% aged ≥65 years). Overall adjusted IIM incidence rates for 2004–2008 for the commercial and Medicare supplemental groups combined were 4.27 cases (95% CI, 4.09-4.44) and, for Medicaid, 5.23 (95% CI 4.74-5.72) per 100,000 person-years (py). Disease sub-type incidence rates per 100,000 py were 1.52 (95% CI 1.42-1.63) and 1.70 (1.42-1.97) for DM, 2.46 (2.33-2.59) and 3.53 (3.13-3.94) for PM, and 0.73 (0.66-0.81) and 0.78 (0.58-0.97) for interstitial myositis for the commercial/Medicare and Medicaid cohorts, respectively. Annual incidence fluctuated over time with the base MarketScan populations. There were 7,155 prevalent patients, with annual prevalence ranging from 20.62 to 25.32 per 100,000 for commercial/Medicare (83% of prevalent cases) and from 15.35 to 32.74 for Medicaid. Conclusions: We found higher IIM incidence than historically reported. Employer turnover, miscoding and misdiagnosing, care-seeking behavior, and fluctuations in database membership over time can influence the results. Further studies are needed to confirm the incidence and prevalence of IIM. PMID:22703603
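
    The headline rates reduce to a simple computation: incident cases divided by person-years at risk, scaled to 100,000. A sketch with a normal-approximation interval follows; the paper's exact interval method is not stated in this abstract, and the denominator in the example is illustrative.

        # Incidence per 100,000 person-years with an approximate 95% CI (sketch).
        import math

        def incidence_per_100k(cases: int, person_years: float):
            rate = cases / person_years * 1e5
            se = math.sqrt(cases) / person_years * 1e5  # Poisson standard error
            return rate, (rate - 1.96 * se, rate + 1.96 * se)

        # Illustration: 2,990 cases over ~70 million person-years gives ~4.27
        # per 100,000 py, the same order as the rates reported above.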

  3. GIS applications for military operations in coastal zones

    USGS Publications Warehouse

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.

    2009-01-01

    In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed - selection of data sources (including high resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels) - is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).

  4. GIS applications for military operations in coastal zones

    NASA Astrophysics Data System (ADS)

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E. L.; Welch, R.

    In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed-selection of data sources (including high resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels)-is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations.

  5. GraphSAW: a web-based system for graphical analysis of drug interactions and side effects using pharmaceutical and molecular data.

    PubMed

    Shoshi, Alban; Hoppe, Tobias; Kormeier, Benjamin; Ogultarhan, Venus; Hofestädt, Ralf

    2015-02-28

    Adverse drug reactions are one of the most common causes of death in industrialized Western countries. Nowadays, empirical data from clinical studies for the approval and monitoring of drugs and molecular databases is available. The integration of database information is a promising method for providing well-based knowledge to avoid adverse drug reactions. This paper presents our web-based decision support system GraphSAW, which analyzes and evaluates drug interactions and side effects based on data from two commercial and two freely available molecular databases. The system is able to analyze single and combined drug-drug interactions, drug-molecule interactions as well as single and cumulative side effects. In addition, it allows exploring associative networks of drugs, molecules, metabolic pathways, and diseases in an intuitive way. The molecular medication analysis combines the capabilities of the aforementioned features. A statistical evaluation of the integrated data and of the top 20 drugs with respect to drug interactions and side effects is performed. The results of the data analysis give an overview of all theoretically possible drug interactions and side effects. The evaluation shows a mismatch between pharmaceutical and molecular databases: the concordance was about 12% for drug interactions and about 9% for drug side effects. An application case with prescription data from 11 patients is presented in order to demonstrate the functionality of the system under real conditions. For each patient, at least two interactions occurred in every medication, and about 8% of total diseases were possibly induced by drug therapy. GraphSAW (http://tunicata.techfak.uni-bielefeld.de/graphsaw/) is meant to be a web-based system for health professionals and researchers. GraphSAW provides comprehensive drug-related knowledge and an improved medication analysis which may support efforts to reduce the risk of medication errors and numerous drastic side effects.
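
    The reported concordance is, in effect, an overlap measure between the interaction sets extracted from the pharmaceutical and the molecular sources; whether the authors used exactly this union-based definition is an assumption of the sketch below.

        # Overlap of drug-drug interaction sets from two sources (sketch).
        def concordance(pairs_a: set, pairs_b: set) -> float:
            """Share of all reported interaction pairs found in both sources."""
            if not (pairs_a or pairs_b):
                return 0.0
            return len(pairs_a & pairs_b) / len(pairs_a | pairs_b)

        # Pairs should be order-independent, e.g. frozenset({"warfarin", "aspirin"}).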

  6. Computerized literature reference system: use of an optical scanner and optical character recognition software.

    PubMed

    Lossef, S V; Schwartz, L H

    1990-09-01

    A computerized reference system for radiology journal articles was developed by using an IBM-compatible personal computer with a hand-held optical scanner and optical character recognition software. This allows direct entry of scanned text from printed material into word processing or database files. Additionally, line diagrams and photographs of radiographs can be incorporated into these files. A text search and retrieval software program enables rapid searching for keywords in scanned documents. The hand scanner and software programs are commercially available, relatively inexpensive, and easily used. This permits construction of a personalized radiology literature file of readily accessible text and images requiring minimal typing or keystroke entry.

  7. Evaluation of MALDI-TOF mass spectrometry for identification of environmental yeasts and development of supplementary database.

    PubMed

    Agustini, Bruna Carla; Silva, Luciano Paulino; Bloch, Carlos; Bonfim, Tania M B; da Silva, Gildo Almeida

    2014-06-01

    Yeast identification using traditional methods, which employ morphological, physiological, and biochemical characteristics, can be considered a hard task, as it requires experienced microbiologists and rigorous control of culture conditions that could otherwise lead to different outcomes. Considering clinical or industrial applications, fast and accurate identification of microorganisms is a growing demand. Hence, molecular biology approaches have been extensively used and, more recently, protein profiling using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has proved to be an even more efficient tool for taxonomic purposes. Nonetheless, concerning mass spectrometry, the data available for the differentiation of yeast species for industrial purposes are limited, and the reference databases commercially available comprise almost exclusively clinical microorganisms. In this context, studies focusing on environmental isolates are required to extend the existing databases. The development of a supplementary database and the assessment of a commercial database for taxonomic identification of environmental yeasts are the aims of this study. We challenged MALDI-TOF MS to create protein profiles for 845 yeast strains isolated from grape must, and 67.7% of the strains were successfully identified according to the previously available manufacturer database. The remaining 32.3% of strains were not identified due to the absence of a reference spectrum. After matching the correct taxon for these strains by using molecular biology approaches, the spectra concerning the missing species were added to a supplementary database. This new library was able to accurately predict unidentified species at first instance by MALDI-TOF MS, proving it is a powerful tool for the identification of environmental yeasts.
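
    The identify-or-extend loop the study describes can be sketched as follows: score the unknown spectrum against every reference profile, report the best hit above a threshold, and otherwise route the strain to molecular identification so its spectrum can be added to the supplementary library. Cosine scoring and the threshold value are generic stand-ins, not the instrument vendor's proprietary scoring.

        # Spectrum matching with a fallback to library extension (sketch).
        import numpy as np

        def identify(spectrum, library, threshold=0.8):
            """library: dict of species name -> reference vector on the same m/z grid."""
            v = spectrum / np.linalg.norm(spectrum)
            best_name, best_score = None, -1.0
            for name, ref in library.items():
                score = float(np.dot(v, ref / np.linalg.norm(ref)))
                if score > best_score:
                    best_name, best_score = name, score
            if best_score >= threshold:
                return best_name, best_score
            return None, best_score  # unidentified: confirm by sequencing, then add to library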

  8. Bioinformatic tools for inferring functional information from plant microarray data: tools for the first steps.

    PubMed

    Page, Grier P; Coulibaly, Issa

    2008-01-01

    Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have tried for a balance between free and commercial systems. We have organized the tools by topics including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).

  9. Introducing a Public Stereoscopic 3D High Dynamic Range (SHDR) Video Database

    NASA Astrophysics Data System (ADS)

    Banitalebi-Dehkordi, Amin

    2017-03-01

    High dynamic range (HDR) displays and cameras are paving their way through the consumer market at a rapid growth rate. Thanks to TV and camera manufacturers, HDR systems are now becoming available commercially to end users. This is taking place only a few years after the blooming of 3D video technologies. MPEG/ITU are also actively working towards the standardization of these technologies. However, preliminary research efforts in these video technologies are hampered by the lack of sufficient experimental data. In this paper, we introduce a Stereoscopic 3D HDR database of videos that is made publicly available to the research community. We explain the procedure taken to capture, calibrate, and post-process the videos. In addition, we provide insights on potential use-cases, challenges, and research opportunities implied by the combination of the higher dynamic range of the HDR aspect and the depth impression of the 3D aspect.

  10. Synthetic vision in the cockpit: 3D systems for general aviation

    NASA Astrophysics Data System (ADS)

    Hansen, Andrew J.; Rybacki, Richard M.; Smith, W. Garth

    2001-08-01

    Synthetic vision has the potential to improve safety in aviation through better pilot situational awareness and enhanced navigational guidance. The technological advances enabling synthetic vision are GPS-based navigation (position and attitude) systems and efficient graphical systems for rendering 3D displays in the cockpit. A benefit for military, commercial, and general aviation platforms alike is the relentless drive to miniaturize computer subsystems. Processors, data storage, graphical and digital signal processing chips, RF circuitry, and bus architectures are keeping pace with or outpacing Moore's Law with the transition to mobile computing and embedded systems. The tandem of fundamental GPS navigation services such as the US FAA's Wide Area and Local Area Augmentation Systems (WAAS) and commercially viable mobile rendering systems puts synthetic vision well within the technological reach of general aviation. Given the appropriate navigational inputs, low-cost and power-efficient graphics solutions are capable of rendering a pilot's out-the-window view into visual databases with photo-specific imagery and geo-specific elevation and feature content. Looking beyond the single airframe, proposed aviation technologies such as ADS-B would provide a communication channel for bringing traffic information on-board and into the cockpit visually via the 3D display for additional pilot awareness. This paper gives a view of current 3D graphics system capability suitable for general aviation and presents a potential road map following the current trends.

  11. Satellite Imagery Assisted Road-Based Visual Navigation System

    NASA Astrophysics Data System (ADS)

    Volkova, A.; Gibbens, P. W.

    2016-06-01

    There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. The autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features from Google Earth* imagery to build a feature database. The same algorithm then detects features in an on-board camera's video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates them with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on the comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery, and another provider can be used.
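
    The database-to-camera correlation step can be illustrated with standard feature matching; the sketch below uses OpenCV's ORB detector and a ratio test as a generic stand-in, since the paper's own detector and matching pipeline are not specified in this abstract.

        # Generic image-feature matching between a camera frame and a
        # georeferenced database image (sketch).
        import cv2

        def match_to_database(frame_gray, db_gray):
            orb = cv2.ORB_create(nfeatures=2000)
            kp_f, des_f = orb.detectAndCompute(frame_gray, None)
            kp_d, des_d = orb.detectAndCompute(db_gray, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
            good = []
            for pair in matcher.knnMatch(des_f, des_d, k=2):
                if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:  # ratio test
                    good.append((kp_f[pair[0].queryIdx].pt, kp_d[pair[0].trainIdx].pt))
            return good  # pixel correspondences for localisation against the database frame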

  12. Machine learning techniques for breast cancer computer aided diagnosis using different image modalities: A systematic review.

    PubMed

    Yassin, Nisreen I R; Omran, Shaimaa; El Houby, Enas M F; Allam, Hemat

    2018-03-01

    The high incidence of breast cancer in women has increased significantly in recent years. Physician experience in diagnosing and detecting breast cancer can be assisted by using some computerized feature extraction and classification algorithms. This paper presents the conduct and results of a systematic review (SR) that aims to investigate the state of the art regarding computer aided diagnosis/detection (CAD) systems for breast cancer. The SR was conducted using a comprehensive selection of scientific databases as reference sources, allowing access to diverse publications in the field. The scientific databases used are Springer Link (SL), Science Direct (SD), IEEE Xplore Digital Library, and PubMed. Inclusion and exclusion criteria were defined and applied to each retrieved work to select those of interest. From 320 studies retrieved, 154 studies were included. However, the scope of this research is limited to scientific and academic works and excludes commercial interests. This survey provides a general analysis of the current status of CAD systems according to the image modalities used and the machine learning based classifiers. Potential research studies have been discussed to create more objective and efficient CAD systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Discrepancy Reporting Management System

    NASA Technical Reports Server (NTRS)

    Cooper, Tonja M.; Lin, James C.; Chatillon, Mark L.

    2004-01-01

    Discrepancy Reporting Management System (DRMS) is a computer program designed for use in the stations of NASA's Deep Space Network (DSN) to help establish the operational history of equipment items; acquire data on the quality of service provided to DSN customers; enable measurement of service performance; provide early insight into the need to improve processes, procedures, and interfaces; and enable the tracing of a data outage to a change in software or hardware. DRMS is a Web-based software system designed to include a distributed database and replication feature to achieve location-specific autonomy while maintaining a consistent high quality of data. DRMS incorporates commercial Web and database software. DRMS collects, processes, replicates, communicates, and manages information on spacecraft data discrepancies, equipment resets, and physical equipment status, and maintains an internal station log. All discrepancy reports (DRs), Master discrepancy reports (MDRs), and Reset data are replicated to a master server at NASA's Jet Propulsion Laboratory; Master DR data are replicated to all the DSN sites; and Station Logs are internal to each of the DSN sites and are not replicated. Data are validated according to several logical mathematical criteria. Queries can be performed on any combination of data.
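
    The replication scheme described above amounts to a small routing table: which record types fan out to the JPL master, which propagate to every site, and which never leave their origin. A sketch follows; the names are illustrative, not DRMS internals.

        # Replication routing rules as described for DRMS (sketch).
        MASTER = "JPL"
        SITES = ["Goldstone", "Madrid", "Canberra"]

        REPLICATION_RULES = {
            "DR": [MASTER],            # discrepancy reports -> master only
            "Reset": [MASTER],         # equipment resets -> master only
            "MDR": [MASTER, *SITES],   # master DRs -> master and all DSN sites
            "StationLog": [],          # station logs stay local, never replicated
        }

        def replication_targets(record_type, origin_site):
            return [t for t in REPLICATION_RULES.get(record_type, []) if t != origin_site]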

  14. The Effect of Impurities on the Processing of Aluminum Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zi-Kui Liu; Shengjun Zhang; Qingyou Han

    2007-04-23

    For this Aluminum Industry of the Future (IOF) project, the effect of impurities on the processing of aluminum alloys was systematically investigated. The work was carried out as a collaborative effort between the Pennsylvania State University and Oak Ridge National Laboratory. Industrial support was provided by ALCOA and ThermoCalc, Inc. The achievements described below were made. A method that combines first-principles calculation and calculation of phase diagrams (CALPHAD) was used to develop the multicomponent database Al-Ca-K-Li-Mg-Na. This method was extensively used in this project for the development of a thermodynamic database. The first-principles approach provided some thermodynamic property data that are not available in the open literature. These calculated results were used in the thermodynamic modeling as experimental data. Some of the thermodynamic property data are difficult, if not impossible, to measure. The method developed and used in this project allows the estimation of these data for thermodynamic database development. The multicomponent database Al-Ca-K-Li-Mg-Na was developed. Elements such as Ca, Li, Na, and K are impurities that strongly affect the formability and corrosion behavior of aluminum alloys. However, these impurity elements are not included in the commercial aluminum alloy database. The process of thermodynamic modeling began with the Al-Na, Ca-Li, Li-Na, K-Na, and Li-K sub-binary systems. Then ternary and higher systems were extrapolated because of the lack of experimental information. Databases for five binary alloy systems and two ternary systems were developed. Along with other existing binary and ternary databases, the full database of the multicomponent Al-Ca-K-Li-Mg-Na system was completed in this project. A methodology for integrating it with commercial or other aluminum alloy databases can be developed. The mechanism of sodium-induced high-temperature embrittlement (HTE) of Al-Mg is now understood. Using the thermodynamic database developed in this project, thermodynamic simulations were carried out to investigate the effect of sodium on the HTE of Al-Mg alloys. The simulation results indicated that the liquid miscibility gap resulting from the dissolved sodium in the molten material plays an important role in HTE. A liquid phase forms from the solid face-centered cubic (fcc) phase (most likely at grain boundaries) during cooling, resulting in the occurrence of HTE. Comparison of the thermodynamic simulation results with experimental measurements on the high-temperature ductility of an Al-5Mg-Na alloy shows that HTE occurs in the temperature range at which the liquid phase exists. Based on this fundamental understanding of the HTE mechanism during processing of aluminum alloy, an HTE-sensitive zone and a hot-rolling safe zone of the Al-Mg-Na alloys are defined as functions of processing temperature and alloy composition. The tendency of HTE was evaluated based on thermodynamic simulations of the fraction of the intergranular sodium-rich liquid phase. Methods of avoiding HTE during rolling/extrusion of Al-Mg-based alloys were suggested. Energy and environmental benefits from the results of this project could occur through a number of avenues: (1) energy benefits accruing from reduced rejection rates of the aluminum sheet and bar, (2) reduced dross formation during the remelting of the aluminum rejects, and (3) reduced CO2 emission related to the energy savings.
The sheet and extruded bar quantities produced in the United States during 2000 were 10,822 and 4,546 million pounds, respectively. It is assumed that 50% of the sheet and 10% of the bar will be affected by implementing the results of this project. With the current process, the rejection rate of sheet and bar is estimated at 5%. Assuming that at least half of the 5% rejection of sheet and bar will be eliminated by using the results of this project and that 4% of the aluminum will be lost through dross (Al2O3) during remelting of the rejects, the full-scale industrial implementation of the project results would lead to energy savings in excess of 6.2 trillion Btu/year and cost savings of $42.7 million by 2020.
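
    As a one-formula illustration of the miscibility-gap mechanism invoked above (a textbook regular-solution model, not the project's actual CALPHAD description of the Al-Mg-Na liquid): the molar Gibbs energy of mixing

        \Delta G_{\mathrm{mix}} = \Omega\, x(1-x) + RT\left[x\ln x + (1-x)\ln(1-x)\right]

    develops a miscibility gap wherever the mixture is locally unstable,

        \frac{\partial^2 \Delta G_{\mathrm{mix}}}{\partial x^2} = -2\Omega + \frac{RT}{x(1-x)} < 0,

    i.e., at temperatures below 2\Omega x(1-x)/R, with a critical point at x = 1/2 and T_c = \Omega/2R for a positive interaction parameter \Omega.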

  15. TRICARE Applied Behavior Analysis (ABA) Benefit

    PubMed Central

    Maglione, Margaret; Kadiyala, Srikanth; Kress, Amii; Hastings, Jaime L.; O'Hanlon, Claire E.

    2017-01-01

Abstract This study compared the Applied Behavior Analysis (ABA) benefit provided by TRICARE as an early intervention for autism spectrum disorder with similar benefits in Medicaid and commercial health insurance plans. The sponsor, the Office of the Under Secretary of Defense for Personnel and Readiness, was particularly interested in how a proposed TRICARE reimbursement rate decrease from $125 per hour to $68 per hour for ABA services performed by a Board Certified Behavior Analyst compared with reimbursement rates (defined as third-party payment to the service provider) in Medicaid and commercial health insurance plans. Information on ABA coverage in state Medicaid programs was collected from Medicaid state waiver databases; subsequently, Medicaid provider reimbursement data were collected from state Medicaid fee schedules. Applied Behavior Analysis provider reimbursement in the commercial health insurance system was estimated using Truven Health MarketScan® data. A weighted mean U.S. reimbursement rate was calculated for several services using cross-state information on the number of children diagnosed with autism spectrum disorder. Locations of potential provider shortages were also identified. Medicaid and commercial insurance reimbursement rates varied considerably across the United States. This project concluded that the proposed $68-per-hour reimbursement rate for services provided by a Board Certified Behavior Analyst was more than 25 percent below the U.S. mean. PMID:28845348
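
    The weighted-mean calculation described above is straightforward to reproduce. A minimal sketch, with hypothetical state rates and diagnosis counts standing in for the study's actual inputs:

      # Weighted mean of state reimbursement rates, weighted by the number of
      # children diagnosed with ASD in each state. All numbers are hypothetical.
      state_rates = {"CA": 95.0, "TX": 60.0, "NY": 110.0}        # $/hour
      asd_children = {"CA": 90_000, "TX": 70_000, "NY": 50_000}  # counts

      total = sum(asd_children.values())
      weighted_mean = sum(state_rates[s] * asd_children[s] for s in state_rates) / total
      print(f"weighted mean U.S. rate: ${weighted_mean:.2f}/hour")

    Weighting by caseload rather than averaging states equally keeps high-population states from being underrepresented in the national figure.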

  16. Spatial digital database for the tectonic map of Southeast Arizona

    USGS Publications Warehouse

    map by Drewes, Harald; digital database by Fields, Robert A.; Hirschberg, Douglas M.; Bolm, Karen S.

    2002-01-01

A spatial database was created for Drewes' (1980) tectonic map of southeast Arizona: this database supersedes Drewes and others (2001, ver. 1.0). Staff and a contractor at the U.S. Geological Survey in Tucson, Arizona, completed an interim digital geologic map database for the east part of the map in 2001, made revisions to the previously released digital data for the west part of the map (Drewes and others, 2001, ver. 1.0), merged data files for the east and west parts, and added additional data not previously captured. Digital base map data files (such as topography, roads, towns, rivers and lakes) are not included: they may be obtained from a variety of commercial and government sources. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Because Drewes' (1980) map sheets include additional text and graphics that were not included in this report, scanned images of his maps (i1109_e.jpg, i1109_w.jpg) are included as a courtesy to the reader. This database should not be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files (i1109_e.pdf and i1109_w.pdf) that are provided herein are representations of the database (see Appendix A). The map area is located in southeastern Arizona (fig. 1). This report describes the map units (from Drewes, 1980), the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Helen Kayser (Information Systems Support, Inc.) is greatly appreciated.

  17. QBIC project: querying images by content, using color, texture, and shape

    NASA Astrophysics Data System (ADS)

    Niblack, Carlton W.; Barber, Ron; Equitz, Will; Flickner, Myron D.; Glasman, Eduardo H.; Petkovic, Dragutin; Yanker, Peter; Faloutsos, Christos; Taubin, Gabriel

    1993-04-01

In the query by image content (QBIC) project we are studying methods to query large on-line image databases using the images' content as the basis of the queries. Examples of the content we use include color, texture, and shape of image objects and regions. Potential applications include medical (`Give me other images that contain a tumor with a texture like this one'), photo-journalism (`Give me images that have blue at the top and red at the bottom'), and many others in art, fashion, cataloging, retailing, and industry. Key issues include derivation and computation of attributes of images and objects that provide useful query functionality, retrieval methods based on similarity as opposed to exact match, query by image example or user-drawn image, the user interfaces, query refinement and navigation, high-dimensional database indexing, and automatic and semi-automatic database population. We currently have a prototype system written in X/Motif and C running on an RS/6000 that allows a variety of queries, and a test database of over 1000 images and 1000 objects populated from commercially available photo clip art images. In this paper we present the main algorithms for color, texture, shape, and sketch query that we use, show example query results, and discuss future directions.
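
    The retrieval-by-similarity idea is easy to make concrete. A minimal sketch, assuming a color-histogram feature and histogram intersection as the similarity measure (a stand-in for QBIC's actual features and distance functions):

      import numpy as np

      def color_histogram(pixels, bins=8):
          """Normalized joint RGB histogram of an (N, 3) array of 0-255 values."""
          hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=[(0, 256)] * 3)
          return hist.ravel() / hist.sum()

      def similarity(h1, h2):
          """Histogram intersection: 1.0 for identical color distributions."""
          return float(np.minimum(h1, h2).sum())

      rng = np.random.default_rng(0)
      database = {f"img{i:03d}": color_histogram(rng.integers(0, 256, (5000, 3)))
                  for i in range(100)}
      query = color_histogram(rng.integers(0, 256, (5000, 3)))

      # Rank by similarity rather than exact match, as in content-based retrieval.
      ranked = sorted(database, key=lambda k: similarity(query, database[k]),
                      reverse=True)
      print("best matches:", ranked[:5])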

  18. An authoritative global database for active submarine hydrothermal vent fields

    NASA Astrophysics Data System (ADS)

    Beaulieu, Stace E.; Baker, Edward T.; German, Christopher R.; Maffei, Andrew

    2013-11-01

    The InterRidge Vents Database is available online as the authoritative reference for locations of active submarine hydrothermal vent fields. Here we describe the revision of the database to an open source content management system and conduct a meta-analysis of the global distribution of known active vent fields. The number of known active vent fields has almost doubled in the past decade (521 as of year 2009), with about half visually confirmed and others inferred active from physical and chemical clues. Although previously known mainly from mid-ocean ridges (MORs), active vent fields at MORs now comprise only half of the total known, with about a quarter each now known at volcanic arcs and back-arc spreading centers. Discoveries in arc and back-arc settings resulted in an increase in known vent fields within exclusive economic zones, consequently reducing the proportion known in high seas to one third. The increase in known vent fields reflects a number of factors, including increased national and commercial interests in seafloor hydrothermal deposits as mineral resources. The purpose of the database now extends beyond academic research and education and into marine policy and management, with at least 18% of known vent fields in areas granted or pending applications for mineral prospecting and 8% in marine protected areas.

  19. Identification of Clinical Coryneform Bacterial Isolates: Comparison of Biochemical Methods and Sequence Analysis of 16S rRNA and rpoB Genes▿

    PubMed Central

    Adderson, Elisabeth E.; Boudreaux, Jan W.; Cummings, Jessica R.; Pounds, Stanley; Wilson, Deborah A.; Procop, Gary W.; Hayden, Randall T.

    2008-01-01

We compared the relative levels of effectiveness of three commercial identification kits and three nucleic acid amplification tests for the identification of coryneform bacteria by testing 50 diverse isolates, including 12 well-characterized control strains and 38 organisms obtained from pediatric oncology patients at our institution. Between 33.3 and 75.0% of control strains were correctly identified to the species level by phenotypic systems or nucleic acid amplification assays. The most sensitive tests were the API Coryne system and amplification and sequencing of the 16S rRNA gene using primers optimized for coryneform bacteria, which correctly identified 9 of 12 control isolates to the species level, and all strains with a high-confidence call were correctly identified. Organisms that were not correctly identified were species not included in the test kit databases, species that did not produce a pattern of reactions included in the kit databases, or species that could not be differentiated among several genospecies based on reaction patterns. Nucleic acid amplification assays had limited abilities to identify some bacteria to the species level, and comparison of sequence homologies was complicated by the inclusion of allele sequences obtained from uncultivated and uncharacterized strains in databases. The utility of rpoB genotyping was limited by the small number of representative gene sequences that are currently available for comparison. The correlation between identifications produced by different classification systems was poor, particularly for clinical isolates. PMID:18160450

  20. Data Access System for Hydrology

    NASA Astrophysics Data System (ADS)

    Whitenack, T.; Zaslavsky, I.; Valentine, D.; Djokic, D.

    2007-12-01

As part of the CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc., Hydrologic Information System), the CUAHSI HIS team has developed the Data Access System for Hydrology, or DASH. DASH is based on commercial off-the-shelf technology and has been developed in conjunction with a commercial partner, ESRI. DASH is a web-based user interface, developed in ASP.NET using ESRI ArcGIS Server 9.2, that provides mapping, querying, and data retrieval over observation and GIS databases and web services. It is the front-end application for the CUAHSI Hydrologic Information System Server. The HIS Server is a software stack that organizes observation databases, geographic data layers, data importing and management tools, and online user interfaces such as the DASH application into a flexible multi-tier application for serving both national-level and locally maintained observation data. The user interface of the DASH web application allows online users to query observation networks by location and attributes, selecting stations in a user-specified area where a particular variable was measured during a given time interval. Once one or more stations and variables are selected, the user can retrieve and download the observation data for further off-line analysis. The DASH application is highly configurable. The mapping interface can be configured to display map services from multiple sources in multiple formats, including ArcGIS Server, ArcIMS, and WMS. The observation network data are configured in an XML file that specifies each network's web service location and its corresponding map layer. Upon initial deployment, two national-level observation networks (USGS NWIS daily values and USGS NWIS instantaneous values) are pre-configured. There is also an optional login page, which can be used to restrict access as well as to provide an alternative to immediate downloads: for large requests, users are notified via email with a link to their data when it is ready.
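
    The station query described above reduces to a combined spatial, attribute, and time filter. A toy sketch of that selection logic (not DASH code; DASH executes the equivalent against observation databases and web services):

      from dataclasses import dataclass
      from datetime import date

      @dataclass
      class Station:
          name: str
          lat: float
          lon: float
          variable: str
          start: date
          end: date

      stations = [
          Station("S1", 32.7, -117.2, "discharge", date(1990, 1, 1), date(2007, 1, 1)),
          Station("S2", 33.1, -116.9, "temperature", date(2000, 1, 1), date(2007, 1, 1)),
      ]

      def query(stations, bbox, variable, t0, t1):
          """Stations inside bbox that measured `variable` overlapping [t0, t1]."""
          lat0, lat1, lon0, lon1 = bbox
          return [s for s in stations
                  if lat0 <= s.lat <= lat1 and lon0 <= s.lon <= lon1
                  and s.variable == variable and s.start <= t1 and s.end >= t0]

      hits = query(stations, (32.0, 34.0, -118.0, -116.0), "discharge",
                   date(1995, 1, 1), date(2005, 1, 1))
      print([s.name for s in hits])  # -> ['S1']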

  1. SU-F-T-231: Improving the Efficiency of a Radiotherapy Peer-Review System for Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, S; Basavatia, A; Garg, M

Purpose: To improve the efficiency of a radiotherapy peer-review system using a commercially available software application for plan quality evaluation and documentation. Methods: A commercial application, FullAccess (Radialogica LLC, Version 1.4.4), was implemented in a Citrix platform for the peer-review process and patient documentation. This application can display images, isodose lines, and dose-volume histograms and create plan reports for the peer-review process. Dose metrics in the report can also be benchmarked for plan quality evaluation. Site-specific templates were generated based on departmental treatment planning policies and procedures for each disease site, which generally follow RTOG protocols as well as published prospective clinical trial data, including both conventional fractionation and hypo-fractionation schema. Once a plan is ready for review, the planner exports the plan to FullAccess, applies the site-specific template, and presents the report for plan review. The plan is still reviewed in the treatment planning system, as that is the legal record. Upon the physician's approval of a plan, the plan is packaged for peer review with the plan report, and dose metrics are saved to the database. Results: The reports show dose metrics of PTVs and critical organs for the plans and also indicate whether or not the metrics are within tolerance. Graphical results with green, yellow, and red lights display whether planning objectives have been met. In addition, benchmarking statistics are collected to see where the current plan falls compared with all historical plans on each metric. All physicians in peer review can easily verify constraints by these reports. Conclusion: We have demonstrated the improvement in a radiotherapy peer-review system, which allows physicians to easily verify planning constraints for different disease sites and fractionation schema, allows for standardization in the clinic to ensure that departmental policies are maintained, and builds a comprehensive database for potential clinical outcome evaluation.
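
    The green/yellow/red evaluation that the templates perform can be sketched as a simple tolerance check. The metric names and limits below are hypothetical placeholders, not the department's actual planning objectives:

      def status(value, goal, hard_limit, lower_is_better=True):
          """Green if the goal is met, yellow if within the hard limit, else red."""
          if not lower_is_better:
              value, goal, hard_limit = -value, -goal, -hard_limit
          if value <= goal:
              return "green"
          return "yellow" if value <= hard_limit else "red"

      # Hypothetical template rows: (metric, planned value, goal, hard limit).
      plan_metrics = [
          ("rectum V70Gy [%]", 12.0, 15.0, 20.0),
          ("bladder V65Gy [%]", 27.0, 25.0, 30.0),
          ("PTV V100% [%]", 94.0, 95.0, 93.0),   # coverage: higher is better
      ]

      for name, value, goal, limit in plan_metrics:
          lower = limit >= goal  # limits above the goal imply "lower is better"
          print(f"{name}: {value} -> {status(value, goal, limit, lower)}")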

  2. Promising More Information

    NASA Technical Reports Server (NTRS)

    2003-01-01

When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing Promise software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promise, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, Promise generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary Promise customers include manufacturing companies, utilities, and other organizations with complex processes to control.

  3. Effectiveness and safety of moxibustion treatment for non-specific lower back pain: protocol for a systematic review.

    PubMed

    Leem, Jungtae; Lee, Seunghoon; Park, Yeoncheol; Seo, Byung-Kwan; Cho, Yeeun; Kang, Jung Won; Lee, Yoon Jae; Ha, In-Hyuk; Lee, Hyun-Jong; Kim, Eun-Jung; Lee, Sanghoon; Nam, Dongwoo

    2017-06-23

Many patients experience acute lower back pain that becomes chronic pain. The proportion of patients using complementary and alternative medicine to treat lower back pain is increasing. Even though several moxibustion clinical trials for lower back pain have been conducted, the effectiveness and safety of moxibustion intervention remain controversial. The purpose of this study protocol for a systematic review is to evaluate the effectiveness and safety of moxibustion treatment for non-specific lower back pain patients. We will conduct an electronic search of several databases from their inception to May 2017, including Embase, PubMed, the Cochrane Central Register of Controlled Trials, Allied and Complementary Medicine Database, Wanfang Database, Chongqing VIP Chinese Science and Technology Periodical Database, China National Knowledge Infrastructure Database, Korean Medical Database, Korean Studies Information Service System, National Discovery for Science Leaders, Oriental Medicine Advanced Searching Integrated System, the Korea Institute of Science and Technology, and KoreaMed. Randomised controlled trials investigating any type of moxibustion treatment will be included. The primary outcomes will be pain intensity and functional status/disability due to lower back pain. The secondary outcomes will be a global measurement of recovery or improvement, work-related outcomes, radiographic improvement of structure, quality of life, and adverse events (presence or absence). Risk ratios or mean differences with a 95% confidence interval will be used to show the effect of moxibustion therapy when it is possible to conduct a meta-analysis. This review will be published in a peer-reviewed journal and will be presented at an international academic conference for dissemination. Our results will provide current evidence of the effectiveness and safety of moxibustion treatment in non-specific lower back pain patients, and thus will be beneficial to patients, practitioners, and policymakers. CRD42016047468 in PROSPERO 2016. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. CD-ROM-aided Databases

    NASA Astrophysics Data System (ADS)

    Masuyama, Keiichi

CD-ROM has rapidly evolved as a new information medium with large capacity. In the U.S., it is predicted to become a two-hundred-billion-yen market within three years, and thus CD-ROM is a strategic target of the database industry. Here in Japan, the movement toward its commercialization has been active since this year. Will the CD-ROM business ever conquer the information market as an on-disk database or electronic publication? Referring to some cases of applications in the U.S., the author reviews the marketability and future trend of this new optical disk medium.

  5. Teleeducation and telepathology for open and distance education.

    PubMed

    Szymas, J

    2000-01-01

Our experience in creating and using a telepathology system and multimedia database for education is described. This program packet currently runs in the Department of Pathology of the University Medical School in Poznan. It is used for self-education, tests, services, and examinations in pathology, i.e., for dental students and for medical students in terms of self-education and individual examination services. The system is implemented on IBM PC-compatible microcomputers and runs on the Netware 5.1 network system. Some modules are available through the Internet. The program packet described here comprises the TELEMIC system for telepathology, ASSISTANT, which is the administrator for the databases, and EXAMINATOR, which is the executive program. The multi-user module allows students to work in several working areas on different, randomly chosen sets of problems simultaneously. The ability to work in exercise mode with image files and questions is an attractive route to self-education. The standard format of the notation files enables the results to be processed with commercial statistics packages in order to estimate the scale of answers and to find correlations between the obtained results. The method of multi-criterion grading excludes unlimited mutual compensation of the criteria, differentiates the importance of particular courses, and introduces quality criteria. The packet is part of the integrated management information system of the department of pathology. Applications for other telepathology systems are presented.

  6. Commercial Database Design vs. Library Terminology Comprehension: Why Do Students Print Abstracts Instead of Full-Text Articles?

    ERIC Educational Resources Information Center

    Imler, Bonnie; Eichelberger, Michelle

    2014-01-01

    When asked to print the full text of an article, many undergraduate college students print the abstract instead of the full text. This study seeks to determine the underlying cause(s) of this confusion. In this quantitative study, participants (n = 40) performed five usability tasks to assess ease of use and usefulness of five commercial library…

  7. Using commercial video games for upper limb stroke rehabilitation: is this the way of the future?

    PubMed

    Pietrzak, Eva; Cotea, Cristina; Pullman, Stephen

    2014-01-01

    The increasing number of people living with poststroke sequelae has stimulated the search for novel ways of providing poststroke rehabilitation without putting additional stress on overburdened health care systems. One of them is the use of commercially available technology and off-the-shelf video games for hemiparetic upper limb rehabilitation. The MEDLINE, EMBASE, and Cochrane Library databases were searched using key word synonyms for stroke, upper limb, and video games. Included studies investigated upper limb stroke rehabilitation using commercially available consoles and video games, reported outcomes that included measures of upper limb functionality, and were published in a peer-reviewed journal written in English. Thirteen studies were identified - 6 published as full articles and 7 as abstracts. Studies were generally small and only 3 were randomized. The gaming systems investigated were the Nintendo Wii (n = 10), EyeToy PlayStation (n = 2), and CyWee Z (n = 1). The Nintendo Wii appears to provide the greatest benefits to patients, with improvements seen in upper extremity function measures such as joint range of motion, hand motor function, grip strength, and dexterity. Three studies indicate that video therapy appears to be safe and that long-term improvements continue at follow-up. At present, the evidence that the use of commercial video games in rehabilitation improves upper limb functionality after stroke is very limited. However, this approach has the potential to provide easily available and affordable stroke rehabilitation therapy in settings where access to therapy is limited by geographical or financial constraints.

  8. U.S. Coast Guard Alternatives for Distributed Data Base Management Systems.

    DTIC Science & Technology

    1982-12-01

relational databases as the result of the papers and theoretical work done by Dr. Edgar F. Codd. But, the number of commercially available relational DBMSs...

  9. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos

    2016-11-15

An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house-developed integrated software with a simple, small, and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database, and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. As a result, near real-time sample classification using the developed software platform is exemplified with a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.
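
    The spectral contrast angle method named above has a compact definition: treat each spectrum as a vector of intensities on a common m/z axis and compute the angle between vectors, assigning an unknown to the library spectrum with the smallest angle. A minimal sketch with made-up spectra:

      import numpy as np

      def contrast_angle(a, b):
          """Spectral contrast angle in radians; 0 means identical spectral shape."""
          cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return float(np.arccos(np.clip(cos, -1.0, 1.0)))

      # Hypothetical library of averaged full-scan spectra, binned to a common axis.
      library = {
          "pen_A": np.array([0.10, 0.80, 0.30, 0.05]),
          "pen_B": np.array([0.70, 0.10, 0.10, 0.40]),
      }
      unknown = np.array([0.12, 0.75, 0.35, 0.04])

      best = min(library, key=lambda k: contrast_angle(unknown, library[k]))
      print("classified as:", best)  # -> pen_A

    Because the angle depends only on spectral shape, not absolute intensity, the comparison is insensitive to overall signal level.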

  11. 32 CFR 518.18 - Judicial actions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... procedures used to search for the requested records, (manual search of records, computer database search, etc... deliberate consideration of the institutional, commercial, and personal privacy interests that could be...

  12. 32 CFR 518.18 - Judicial actions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... procedures used to search for the requested records, (manual search of records, computer database search, etc... deliberate consideration of the institutional, commercial, and personal privacy interests that could be...

  13. 32 CFR 518.18 - Judicial actions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... procedures used to search for the requested records, (manual search of records, computer database search, etc... deliberate consideration of the institutional, commercial, and personal privacy interests that could be...

  14. NASA Scientific Data Purchase Project: From Collection to User

    NASA Technical Reports Server (NTRS)

    Nicholson, Lamar; Policelli, Fritz; Fletcher, Rose

    2002-01-01

NASA's Scientific Data Purchase (SDP) project is currently a $70 million operation managed by the Earth Science Applications Directorate at Stennis Space Center. The SDP project was developed in 1997 to purchase scientific data from commercial sources for distribution to NASA Earth science researchers. Our current data holdings include 8 TB of remote sensing imagery consisting of 18 products from 4 companies. Our anticipated data volume is 60 TB by 2004, and we will be receiving new data products from several additional companies. Our current system capacity is 24 TB, expandable to 89 TB. Operations include tasking of new data collections, archive ordering, shipment verification, data validation, distribution, metrics, finances, customer feedback, and technical support. The program has been included in the Stennis Space Center Commercial Remote Sensing ISO 9001 registration since its inception. Our operational system includes automatic quality control checks on data received (with MatLab analysis); internally developed, custom Web-based interfaces that tie into commercial-off-the-shelf software; and an integrated relational database that links and tracks all data through operations. We've distributed nearly 1500 datasets, and almost 18,000 data files have been downloaded from our public web site; on a 10-point scale, our customer satisfaction index is 8.32 at a 23% response level. More information about the SDP is available on our Web site.

  15. Development of a forestry government agency enterprise GIS system: a disconnected editing approach

    NASA Astrophysics Data System (ADS)

    Zhu, Jin; Barber, Brad L.

    2008-10-01

    The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters since most field offices are operating with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically up-loaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.

  16. Distributed structure-searchable toxicity (DSSTox) public database network: a proposal.

    PubMed

    Richard, Ann M; Williams, ClarLynda R

    2002-01-29

    The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These diverse interests often employ different approaches and have different criteria and use for toxicity assessments, but they share a need for unrestricted access to existing public toxicity data linked with chemical structure information. Currently, there exists no central repository of toxicity information, commercial or public, that adequately meets the data requirements for flexible analogue searching, Structure-Activity Relationship (SAR) model development, or building of chemical relational databases (CRD). The distributed structure-searchable toxicity (DSSTox) public database network is being proposed as a community-supported, web-based effort to address these shared needs of the SAR and toxicology communities. The DSSTox project has the following major elements: (1) to adopt and encourage the use of a common standard file format (structure data file (SDF)) for public toxicity databases that includes chemical structure, text and property information, and that can easily be imported into available CRD applications; (2) to implement a distributed source approach, managed by a DSSTox Central Website, that will enable decentralized, free public access to structure-toxicity data files, and that will effectively link knowledgeable toxicity data sources with potential users of these data from other disciplines (such as chemistry, modeling, and computer science); and (3) to engage public/commercial/academic/industry groups in contributing to and expanding this community-wide, public data sharing and distribution effort. The DSSTox project's overall aims are to effect the closer association of chemical structure information with existing toxicity data, and to promote and facilitate structure-based exploration of these data within a common chemistry-based framework that spans toxicological disciplines.
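
    The SD file (SDF) format that the proposal standardizes on is simple enough to illustrate. Below is a rough reader sketch, assuming the common layout of a molblock followed by ">  <FIELD>" data items and a "$$$$" record terminator; in practice a cheminformatics toolkit such as RDKit would normally be used instead.

      def read_sdf(path):
          """Yield (molblock, properties) for each record in an SDF file.

          Simplified: only the first non-blank line of each data item is kept.
          """
          with open(path) as fh:
              records = fh.read().split("$$$$")
          for block in records:
              if not block.strip():
                  continue
              molblock, props, key = [], {}, None
              for line in block.splitlines():
                  if line.startswith(">"):
                      # e.g. ">  <DSSTox_ID>" -> "DSSTox_ID"
                      key = line[line.find("<") + 1 : line.rfind(">")]
                  elif key is not None and line.strip():
                      props[key] = line.strip()
                      key = None
                  elif key is None and not props:
                      molblock.append(line)   # structure block precedes data items
              yield "\n".join(molblock), props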

  17. A Spacebased Ocean Surface Exchange Data Analysis System

    NASA Technical Reports Server (NTRS)

    Tang, Wenqing; Liu, W. Timothy

    2000-01-01

Emerging technologies have provided unprecedented opportunities to transform information into knowledge and disseminate it in a much faster, cheaper, and more user-friendly mode. We have set up a system to produce and disseminate high-level (gridded) ocean surface wind data from the NASA Scatterometer and European Remote Sensing missions. The data system is being expanded to produce real-time gridded ocean surface winds from an improved sensor, SeaWinds, on the Quikscat Mission. The wind field will be combined with hydrologic parameters from the Tropical Rainfall Measuring Mission to monitor evolving weather systems and natural hazards in real time. It will form the basis for the spacebased Ocean Surface Exchange Data Analysis System (SOSEDAS), which will include the production of ocean surface momentum, heat, and water fluxes needed for interdisciplinary studies of ocean-atmosphere interaction. Various commercial and non-commercial software tools have been compared and selected in terms of their capabilities in database management, remote data access, graphical interfaces, data quality, storage needs, transfer speed, etc. Issues regarding system security and user authentication, distributed data archiving and access, strategies to compress large-volume geophysical and satellite data/images, and ways to increase transfer speed are being addressed. A simple and easy way to access information and derive knowledge from spacebased data of multiple missions is being provided. The evolving 'knowledge system' will provide relevant infrastructure to address Earth System Science, make inroads in educating an informed populace, and illuminate decision and policy making.

  18. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    NASA Technical Reports Server (NTRS)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

    This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1998, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models, and the different tectonic behavior of continental vs. oceanic lithosphere. Several little-known areas of intraplate or passive margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  19. From scores to face templates: a model-based approach.

    PubMed

    Mohanty, Pranab; Sarkar, Sudeep; Kasturi, Rangachar

    2007-12-01

Regeneration of templates from match scores has security and privacy implications related to any biometric authentication system. We propose a novel paradigm to reconstruct face templates from match scores using a linear approach. It proceeds by first modeling the behavior of the given face recognition algorithm by an affine transformation. The goal of the modeling is to approximate the distances computed by a face recognition algorithm between two faces by distances between points, representing these faces, in an affine space. Given this space, templates from an independent image set (break-in) are matched only once with the enrolled template of the targeted subject and match scores are recorded. These scores are then used to embed the targeted subject in the approximating affine (non-orthogonal) space. Given the coordinates of the targeted subject in the affine space, the original template of the targeted subject is reconstructed using the inverse of the affine transformation. We demonstrate our ideas using three fundamentally different face recognition algorithms: Principal Component Analysis (PCA) with Mahalanobis cosine distance measure, Bayesian intra-extrapersonal classifier (BIC), and a feature-based commercial algorithm. To demonstrate the independence of the break-in set from the gallery set, we select face templates from two different databases: the Face Recognition Grand Challenge (FRGC) database and the Facial Recognition Technology (FERET) database. With an operational point set at 1 percent False Acceptance Rate (FAR) and 99 percent True Acceptance Rate (TAR) for 1,196 enrollments (FERET gallery), we show that at most 600 attempts (score computations) are required to achieve a 73 percent chance of breaking in as a randomly chosen target subject for the commercial face recognition system. With a similar operational setup, we achieve a 72 percent and 100 percent chance of breaking in for the Bayesian and PCA based face recognition systems, respectively. With three different levels of score quantization, we achieve 69 percent, 68 percent and 49 percent probability of break-in, indicating the robustness of our proposed scheme to score quantization. We also show that the proposed reconstruction scheme has a 47 percent higher probability of breaking in as a randomly chosen target subject for the commercial system as compared to a hill climbing approach with the same number of attempts. Given that the proposed template reconstruction method uses distinct face templates to reconstruct faces, this work exposes a more severe form of vulnerability than a hill climbing kind of attack where incrementally different versions of the same face are used. Also, the ability of the proposed approach to reconstruct actual face templates of the users increases privacy concerns in biometric systems.
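
    The embedding step at the heart of this attack can be sketched compactly: build an affine space from the pairwise distances among break-in templates (via classical multidimensional scaling), then place the targeted subject in that space using only its distances to the break-in set. The sketch below uses synthetic points in place of real face templates and omits the final inversion back to a template; it illustrates the geometry, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.normal(size=(50, 5))     # stand-ins for break-in templates
      target = rng.normal(size=5)      # stand-in for the targeted subject

      def dists(A, b=None):
          B = A if b is None else b[None, :]
          return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

      # Step 1: classical MDS on break-in pairwise distances -> embedding space.
      D = dists(X)
      n = len(D)
      J = np.eye(n) - np.ones((n, n)) / n
      B = -0.5 * J @ (D ** 2) @ J
      w, V = np.linalg.eigh(B)
      top = np.argsort(w)[::-1][:5]
      Y = V[:, top] * np.sqrt(np.clip(w[top], 0.0, None))

      # Step 2: embed the target from its distances (the "match scores") alone,
      # linearizing ||y - Y_i||^2 = d_i^2 against the i = 0 equation.
      d = dists(X, target).ravel()
      A = 2.0 * (Y - Y[0])
      rhs = (d[0] ** 2 - d ** 2) + (Y ** 2).sum(axis=1) - (Y[0] ** 2).sum()
      y, *_ = np.linalg.lstsq(A, rhs, rcond=None)
      print("max distance error:", np.abs(dists(Y, y).ravel() - d).max())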

  20. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. In addition, USGS data that have been excluded previously from the NGDB because the data predate earliest USGS geochemical databases, or were once excluded for programmatic reasons, have been included in the CCAP Geochemical Database and are planned to be added to the NGDB.

  1. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks and to allow for graphical comparison to model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
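
    For illustration, a small observation series with CF-style metadata of the kind being standardized can be written with the netCDF4 Python package. The variable names and attribute values below are illustrative, not the actual IAGOS metadata profile.

      import numpy as np
      from netCDF4 import Dataset   # requires the netCDF4 package

      with Dataset("iagos_example.nc", "w") as nc:
          nc.Conventions = "CF-1.6"
          nc.title = "Example in situ ozone series along a flight track"

          nc.createDimension("time", 100)
          t = nc.createVariable("time", "f8", ("time",))
          t.standard_name = "time"
          t.units = "seconds since 2014-01-01 00:00:00"

          o3 = nc.createVariable("ozone_mole_fraction", "f4", ("time",))
          o3.standard_name = "mole_fraction_of_ozone_in_air"   # CF standard name
          o3.units = "1e-9"                                    # i.e., ppbv

          t[:] = np.arange(100.0)
          o3[:] = 40.0 + 5.0 * np.random.default_rng(0).standard_normal(100)

    Attaching CF standard names and units in the file itself is what lets portals and comparison tools such as JOIN interpret the data without side channels.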

  2. Aerial image databases for pipeline rights-of-way management

    NASA Astrophysics Data System (ADS)

    Jadkowski, Mark A.

    1996-03-01

    Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with less people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.

  3. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis.

    NASA Astrophysics Data System (ADS)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Martinelli, E.; Di Natale, C.; Poggi, L. A.; Bellecci, C.

    2017-01-01

Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and widespread vehicles of contamination in urban and industrial areas. Academia, industry, and the military are working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present, the most common commercial sensors are based on “point detection” technology, but it is clear that such instruments cannot satisfy the needs of smart cities; the new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology that can identify chemical agents in the atmosphere. In this work, the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the resulting absorption database for subsequent atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary absorption coefficients of some chemical compounds are shown, together with the preliminary PCA analysis.
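
    As a minimal sketch of the PCA step, assume each measured absorption spectrum is a vector of absorption coefficients on a common wavelength grid; projecting the spectra onto their leading principal components then exposes class separation. Synthetic data only; the grid, line shapes, and class profiles below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      grid = np.linspace(9.2, 10.8, 64)   # hypothetical wavelength grid, um
      # Two hypothetical simulant classes with different absorption profiles.
      class_a = np.exp(-((grid - 9.6) / 0.2) ** 2) + 0.02 * rng.standard_normal((20, 64))
      class_b = np.exp(-((grid - 10.3) / 0.3) ** 2) + 0.02 * rng.standard_normal((20, 64))
      spectra = np.vstack([class_a, class_b])

      # PCA via SVD of the mean-centered spectra.
      centered = spectra - spectra.mean(axis=0)
      _, _, Vt = np.linalg.svd(centered, full_matrices=False)
      scores = centered @ Vt[:2].T   # first two principal component scores

      print("class A PC1 mean:", scores[:20, 0].mean())
      print("class B PC1 mean:", scores[20:, 0].mean())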

  4. Event Recording Data Acquisition System and Experiment Data Management System for Neutron Experiments at MLF, J-PARC

    NASA Astrophysics Data System (ADS)

    Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.

Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (~GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition is recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
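
    The payoff of event-mode recording is that binning is deferred to analysis time: because each neutron event carries its own time tag, a histogram for any binning or any time slice can be built after the fact. A small sketch with synthetic time-tagged events:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      tof = rng.normal(5000.0, 300.0, n)   # time of flight per event, us (synthetic)
      wall = rng.uniform(0.0, 3600.0, n)   # wall-clock time tag per event, s

      # Any binning, chosen at analysis time rather than acquisition time:
      spectrum, edges = np.histogram(tof, bins=256, range=(3000.0, 7000.0))

      # Any time slice, e.g. only events from the first ten minutes of the run:
      early, _ = np.histogram(tof[wall < 600.0], bins=256, range=(3000.0, 7000.0))
      print(spectrum.sum(), early.sum())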

  5. Hubble Space Telescope: the new telemetry archiving system

    NASA Astrophysics Data System (ADS)

    Miebach, Manfred P.

    2000-07-01

The Hubble Space Telescope (HST), the first of NASA's Great Observatories, was launched on April 24, 1990. The HST was designed for a minimum fifteen-year mission with on-orbit servicing by the Space Shuttle System planned at approximately three-year intervals. Major changes to the HST ground system have been implemented for the third servicing mission in December 1999. The primary objectives of the ground system re-engineering effort, a project called 'Vision 2000 Control Center System (CCS),' are to reduce both development and operating costs significantly for the remaining years of HST's lifetime. Development costs are reduced by providing a more modern hardware and software architecture and utilizing commercial off the shelf (COTS) products wherever possible. Part of CCS is a Space Telescope Engineering Data Store, the design of which is based on current Data Warehouse technology. The Data Warehouse (Red Brick), as implemented in the CCS Ground System that operates and monitors the Hubble Space Telescope, represents the first use of a commercial Data Warehouse to manage engineering data. The purpose of this data store is to provide a common data source of telemetry data for all HST subsystems. This data store will become the engineering data archive and will provide a queryable database for the user to analyze HST telemetry. The access to the engineering data in the Data Warehouse is platform-independent from an office environment using commercial standards (Unix, Windows98/NT). The latest Internet technology is used to reach the HST engineering community. A WEB-based user interface allows easy access to the data archives. This paper will provide a CCS system overview and will illustrate some of the CCS telemetry capabilities: in particular the use of the new Telemetry Archiving System. Vision 2000 is an ambitious project, but one that is well under way. It will allow the HST program to realize reduced operations costs for the Third Servicing Mission and beyond.

  6. Preliminary Geologic Map of the Topanga 7.5' Quadrangle, Southern California: A Digital Database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, R.H.

    1995-01-01

    INTRODUCTION This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1994). More specific information about the units may be available in the original sources. The content and character of the database and methods of obtaining it are described herein. The geologic map database itself, consisting of three ARC coverages and one base layer, can be obtained over the Internet or by magnetic tape copy as described below. The processes of extracting the geologic map database from the tar file, and importing the ARC export coverages (procedure described herein), will result in the creation of an ARC workspace (directory) called 'topnga.' The database was compiled using ARC/INFO version 7.0.3, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). It is stored in uncompressed ARC export format (ARC/INFO version 7.x) in a compressed UNIX tar (tape archive) file. The tar file was compressed with gzip, and may be uncompressed with gzip, which is available free of charge via the Internet from the gzip Home Page (http://w3.teaser.fr/~jlgailly/gzip). A tar utility is required to extract the database from the tar file. This utility is included in most UNIX systems, and can be obtained free of charge via the Internet from Internet Literacy's Common Internet File Formats Webpage (http://www.matisse.net/files/formats.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView (version 1.0 for Windows 3.1 to 3.11 is available for free from ESRI's web site: http://www.esri.com). This release differs from the original digital database in three ways:
    1. Different base layer - The original digital database included separates clipped out of the Los Angeles 1:100,000 sheet. This release includes a vectorized scan of a scale-stable negative of the Topanga 7.5 minute quadrangle.
    2. Map projection - The files in the original release were in polyconic projection. The projection used in this release is state plane, which allows for the tiling of adjacent quadrangles.
    3. File compression - The files in the original release were compressed with UNIX compression. The files in this release are compressed with gzip.
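
    The unpacking step described above can also be done portably with Python's standard library, whose tarfile module handles gzip-compressed archives directly. The archive file name below is assumed from the report's 'topnga' workspace name.

      import tarfile

      # Extract the gzip-compressed tar archive into the ARC workspace directory.
      with tarfile.open("topnga.tar.gz", "r:gz") as archive:
          archive.extractall("topnga")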

  7. Design and Analysis of a Model Reconfigurable Cyber-Exercise Laboratory (RCEL) for Information Assurance Education

    DTIC Science & Technology

    2004-03-01

with MySQL. This choice was made because MySQL is open source. Any significant database engine such as Oracle or MS-SQL or even MS Access can be used... Figure 6. The DoD vs. Commercial Life Cycle... necessarily be interested in SCADA network security... MySQL (Database server) - This station represents a typical data server for a web page

  8. LEM-CF Premixed Tool Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-01-19

    The purpose of LEM-CF Premixed Tool Kit is to process premixed flame simulation data from the LEM-CF solver (https://fileshare.craft-tech.com/clusters/view/lem-cf) into a large-eddy simulation (LES) subgrid model database. These databases may be used with a user-defined-function (UDF) that is included in the Tool Kit. The subgrid model UDF may be used with the ANSYS FLUENT flow solver or other commercial flow solvers.

  9. Impact of the mass media on calls to the CDC National AIDS Hotline.

    PubMed

    Fan, D P

    1996-06-01

This paper considers new computer methodologies for assessing the impact of different types of public health information. The example used public service announcements (PSAs) and mass media news to predict the volume of attempts to call the CDC National AIDS Hotline from December 1992 through the end of 1993. The analysis relied solely on data from electronic databases. Newspaper stories and television news transcripts were obtained from the NEXIS electronic database and were scored by machine for AIDS coverage. The PSA database was generated by computer monitoring of advertising distributed by the Centers for Disease Control and Prevention (CDC) and by others. The volume of call attempts was collected automatically by the private branch exchange (PBX) of the Hotline telephone system. The call attempts, the PSAs and the news story data were related to each other using both a standard time series method and the statistical model of ideodynamics. The analysis indicated that the only significant explanatory variable for the call attempts was PSAs produced by the CDC. One possible explanation was that these commercials all included the Hotline telephone number while the other information sources did not.
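
    The core statistical step (relating call volume to the information-flow series) can be sketched as an ordinary least squares regression. All series below are synthetic stand-ins, not the study's data, and the sketch does not reproduce the ideodynamic model itself.

      import numpy as np

      rng = np.random.default_rng(4)
      days = 365
      psa = rng.poisson(3.0, days).astype(float)    # CDC PSA airings per day
      news = rng.poisson(5.0, days).astype(float)   # AIDS news stories per day
      # Synthetic calls: only PSAs have an effect, mirroring the reported finding.
      calls = 2000.0 + 150.0 * psa + rng.normal(0.0, 200.0, days)

      X = np.column_stack([np.ones(days), psa, news])
      beta, *_ = np.linalg.lstsq(X, calls, rcond=None)
      print("intercept, PSA effect, news effect:", np.round(beta, 1))

    A clearly positive PSA coefficient alongside a near-zero news coefficient reproduces the qualitative finding reported above.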

  10. STI Handbook: Guidelines for Producing, Using, and Managing Scientific and Technical Information in the Department of the Navy. A Handbook for Navy Scientists and Engineers on the Use of Scientific and Technical Information

    DTIC Science & Technology

    1992-02-01

What Information Should Be Included in the TR Database? What Types of Media Can Be Used to Submit Information to the TR Database? How Is... reports. Contract administration documents. Regulations. Commercially published books. WHAT TYPES OF MEDIA CAN BE USED TO SUBMIT INFORMATION TO THE TR... TOWARD DTIC'S WUIS DATABASE? The WUIS database, used to control and report technical and management data, summarizes ongoing research and technology

  11. Species Identification of Clinical Prevotella Isolates by Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry

    PubMed Central

    Soetens, Oriane; De Bel, Annelies; Echahidi, Fedoua; Vancutsem, Ellen; Vandoorslaer, Kristof; Piérard, Denis

    2012-01-01

    The performance of matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) for species identification of Prevotella was evaluated and compared with 16S rRNA gene sequencing. Using a Bruker database, 62.7% of the 102 clinical isolates were identified to the species level and 73.5% to the genus level. Extension of the commercial database improved these figures to 83.3% and 89.2%, respectively. MALDI-TOF MS identification of Prevotella is reliable but needs a more extensive database. PMID:22301022

  12. Molecular signatures database (MSigDB) 3.0.

    PubMed

    Liberzon, Arthur; Subramanian, Aravind; Pinchback, Reid; Thorvaldsdóttir, Helga; Tamayo, Pablo; Mesirov, Jill P

    2011-06-15

    Well-annotated gene sets representing the universe of the biological processes are critical for meaningful and insightful interpretation of large-scale genomic data. The Molecular Signatures Database (MSigDB) is one of the most widely used repositories of such sets. We report the availability of a new version of the database, MSigDB 3.0, with over 6700 gene sets, a complete revision of the collection of canonical pathways and experimental signatures from publications, enhanced annotations and upgrades to the web site. MSigDB is freely available for non-commercial use at http://www.broadinstitute.org/msigdb.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, D

    Concerns about the long-term viability of SFS as the metadata store for HPSS have been increasing. A concern that Transarc may discontinue support for SFS motivates us to consider alternative means to store HPSS metadata. The obvious alternative is a commercial database. Commercial databases have the necessary characteristics for storage of HPSS metadata records. They are robust and scalable and can easily accommodate the volume of data that must be stored. They provide programming interfaces, transactional semantics and a full set of maintenance and performance enhancement tools. A team was organized within the HPSS project to study and recommend an approach for the replacement of SFS. Members of the team are David Fisher, Jim Minton, Donna Mecozzi, Danny Cook, Bart Parliman and Lynn Jones. We examined several possible solutions to the problem of replacing SFS, and recommended on May 22, 2000, in a report to the HPSS Technical and Executive Committees, to change HPSS into a database application over either Oracle or DB2. We recommended either Oracle or DB2 on the basis of market share and technical suitability. Oracle and DB2 are dominant offerings in the market, and it is in the best interest of HPSS to use a major player's product. Both databases provide a suitable programming interface. Transaction management functions, support for multi-threaded clients and data manipulation languages (DML) are available. These findings were supported in meetings held with technical experts from both companies. In both cases, the evidence indicated that either database would provide the features needed to host HPSS.

  14. Virtual machine provisioning, code management, and data movement design for the Fermilab HEPCloud Facility

    NASA Astrophysics Data System (ADS)

    Timm, S.; Cooper, G.; Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Grassano, D.; Tiradani, A.; Krishnamurthy, R.; Vinayagam, S.; Raicu, I.; Wu, H.; Ren, S.; Noh, S.-Y.

    2017-10-01

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontier-squid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.
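
    The Decision Engine's choice of availability zone and instance type can be caricatured as a cost minimization; the spot-market quotes and the greedy rule below are hypothetical stand-ins for illustration (the prototype also weighed job-interruption risk), not the engine's actual logic.

    ```python
    def choose_placement(offers, cores_needed):
        """Pick the (availability zone, instance type) pair that covers the
        core requirement at the lowest total hourly cost. Each offer is a
        hypothetical quote: (az, instance_type, cores_per_instance, usd_hr)."""
        best = None
        for az, itype, cores, price in offers:
            n = -(-cores_needed // cores)      # ceil division: instances needed
            cost = n * price
            if best is None or cost < best[0]:
                best = (cost, az, itype, n)
        return best

    offers = [("us-east-1a", "c3.2xlarge", 8, 0.42),
              ("us-east-1b", "c3.2xlarge", 8, 0.38),
              ("us-west-2a", "m3.xlarge", 4, 0.25)]
    # Cheapest way to field 58000 simultaneous cores under these toy quotes:
    print(choose_placement(offers, cores_needed=58000))
    ```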

  15. Virtual Machine Provisioning, Code Management, and Data Movement Design for the Fermilab HEPCloud Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timm, S.; Cooper, G.; Fuess, S.

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontier-squid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  16. The AMMA database

    NASA Astrophysics Data System (ADS)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, an intensive use of satellite data and diverse modelling studies. The AMMA database therefore aims to store a large amount and a wide variety of data, and to provide the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains:
    - AMMA field campaign datasets;
    - historical data in West Africa from 1850 (operational networks and previous scientific programmes);
    - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention);
    - model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations, processed in the same way as the satellite products.
    Before accessing the data, every user has to sign the AMMA data and publication policy. This charter covers only the use of data for scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Collaboration between data producers and users, and mention of the AMMA project in any publication, are also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris, and OMP, Toulouse). Users can access the data of both data centres through a single web portal. The website is composed of several modules:
    - Registration: forms to register and to read and sign the data use charter on a first visit;
    - Data access interface: a user-friendly tool for building a data extraction request by selecting criteria such as location, time and parameters; a request can cover local, satellite and model data;
    - Documentation: a catalogue of all the available data and their metadata.
    These tools have been developed using standard and free languages and software: a Linux system with an Apache web server and a Tomcat application server; J2EE tools (JSF and Struts frameworks, Hibernate); the relational database management systems PostgreSQL and MySQL; and an OpenLDAP directory. In order to facilitate access to the data by African scientists, the complete system has been mirrored at the AGRHYMET Regional Centre in Niamey and has been operational there since January 2009. Users can now access metadata and request data through either of two equivalent portals: http://database.amma-international.org or http://amma.agrhymet.ne/amma-data.

  17. MINEs: open access databases of computationally predicted enzyme promiscuity products for untargeted metabolomics.

    PubMed

    Jeffryes, James G; Colastani, Ricardo L; Elbadawi-Sidhu, Mona; Kind, Tobias; Niehaus, Thomas D; Broadbelt, Linda J; Hanson, Andrew D; Fiehn, Oliver; Tyo, Keith E J; Henry, Christopher S

    2015-01-01

    In spite of its great promise, metabolomics has proven difficult to execute in an untargeted and generalizable manner. Liquid chromatography-mass spectrometry (LC-MS) has made it possible to gather data on thousands of cellular metabolites. However, matching metabolites to their spectral features continues to be a bottleneck, meaning that much of the collected information remains uninterpreted and that new metabolites are seldom discovered in untargeted studies. These challenges require new approaches that consider compounds beyond those available in curated biochemistry databases. Here we present Metabolic In silico Network Expansions (MINEs), an extension of known metabolite databases to include molecules that have not been observed, but are likely to occur based on known metabolites and common biochemical reactions. We utilize an algorithm called the Biochemical Network Integrated Computational Explorer (BNICE) and expert-curated reaction rules based on the Enzyme Commission classification system to propose the novel chemical structures and reactions that comprise MINE databases. Starting from the Kyoto Encyclopedia of Genes and Genomes (KEGG) COMPOUND database, the MINE contains over 571,000 compounds, of which 93% are not present in the PubChem database. However, these MINE compounds have on average higher structural similarity to natural products than compounds from KEGG or PubChem. MINE databases were able to propose annotations for 98.6% of a set of 667 MassBank spectra, 14% more than KEGG alone and equivalent to PubChem while returning far fewer candidates per spectrum than PubChem (46 vs. 1715 median candidates). Application of MINEs to LC-MS accurate mass data enabled the identity of an unknown peak to be confidently predicted. MINE databases are freely accessible for non-commercial use via user-friendly web-tools at http://minedatabase.mcs.anl.gov and developer-friendly APIs. MINEs improve metabolomics peak identification as compared to general chemical databases whose results include irrelevant synthetic compounds. Furthermore, MINEs complement and expand on previous in silico generated compound databases that focus on human metabolism. We are actively developing the database; future versions of this resource will incorporate transformation rules for spontaneous chemical reactions and more advanced filtering and prioritization of candidate structures. Graphical abstract: MINE database construction and access methods; construction of a MINE database from the curated source databases is depicted on the left, and the methods for accessing the database on the right.
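
    The annotation step, matching an observed accurate mass against database compounds within a ppm tolerance, looks roughly like the sketch below; the tolerance, masses and compound names are invented for illustration, not values from MINE.

    ```python
    import bisect

    def annotate_peak(observed_mass, db_masses, db_names, ppm=5.0):
        """Return compounds whose exact mass lies within a ppm window of an
        observed LC-MS peak. db_masses must be sorted ascending."""
        delta = observed_mass * ppm / 1e6
        i = bisect.bisect_left(db_masses, observed_mass - delta)
        j = bisect.bisect_right(db_masses, observed_mass + delta)
        return db_names[i:j]

    # Invented three-compound database, sorted by monoisotopic mass:
    masses = [180.0634, 180.0786, 342.1162]
    names = ["hexose-like isomer", "isomer B", "disaccharide-like"]
    print(annotate_peak(180.0640, masses, names, ppm=5.0))
    ```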

  18. The pipeline system for Octave and Matlab (PSOM): a lightweight scripting framework and execution engine for scientific workflows.

    PubMed

    Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P; Zijdenbos, Alex P; Evans, Alan C

    2012-01-01

    The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources.
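
    PSOM itself describes pipelines as Matlab/Octave structures; the Python analog below (all job names and commands hypothetical) illustrates the core idea of deriving execution order from each job's declared input and output files.

    ```python
    # A Python analog of the PSOM idea (the real package uses Matlab/Octave
    # structures): each job records its command plus input and output files,
    # and the engine derives execution order from those file dependencies.
    pipeline = {
        "motion_correct": {"command": "mc run1.nii mc1.nii",
                           "files_in": ["run1.nii"], "files_out": ["mc1.nii"]},
        "smooth":         {"command": "smooth mc1.nii sm1.nii",
                           "files_in": ["mc1.nii"], "files_out": ["sm1.nii"]},
    }

    def execution_order(pipeline):
        """Return a valid job order; a real engine would also dispatch
        independent jobs in parallel and skip jobs whose outputs are current."""
        produced = {f for job in pipeline.values() for f in job["files_out"]}
        needed = {f for job in pipeline.values() for f in job["files_in"]}
        available = needed - produced          # files assumed to pre-exist
        done, order = set(), []
        while len(done) < len(pipeline):
            runnable = [n for n, j in pipeline.items() if n not in done
                        and all(f in available for f in j["files_in"])]
            if not runnable:
                raise ValueError("cyclic or unsatisfiable dependencies")
            for n in runnable:
                done.add(n)
                order.append(n)
                available.update(pipeline[n]["files_out"])
        return order

    print(execution_order(pipeline))   # ['motion_correct', 'smooth']
    ```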

  19. The pipeline system for Octave and Matlab (PSOM): a lightweight scripting framework and execution engine for scientific workflows

    PubMed Central

    Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P.; Zijdenbos, Alex P.; Evans, Alan C.

    2012-01-01

    The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources. PMID:22493575

  20. MINEs: Open access databases of computationally predicted enzyme promiscuity products for untargeted metabolomics

    DOE PAGES

    Jeffryes, James G.; Colastani, Ricardo L.; Elbadawi-Sidhu, Mona; ...

    2015-08-28

    Metabolomics has proven difficult to execute in an untargeted and generalizable manner. Liquid chromatography–mass spectrometry (LC–MS) has made it possible to gather data on thousands of cellular metabolites. However, matching metabolites to their spectral features continues to be a bottleneck, meaning that much of the collected information remains uninterpreted and that new metabolites are seldom discovered in untargeted studies. These challenges require new approaches that consider compounds beyond those available in curated biochemistry databases. Here we present Metabolic In silico Network Expansions (MINEs), an extension of known metabolite databases to include molecules that have not been observed, but are likely to occur based on known metabolites and common biochemical reactions. We utilize an algorithm called the Biochemical Network Integrated Computational Explorer (BNICE) and expert-curated reaction rules based on the Enzyme Commission classification system to propose the novel chemical structures and reactions that comprise MINE databases. Starting from the Kyoto Encyclopedia of Genes and Genomes (KEGG) COMPOUND database, the MINE contains over 571,000 compounds, of which 93% are not present in the PubChem database. However, these MINE compounds have on average higher structural similarity to natural products than compounds from KEGG or PubChem. MINE databases were able to propose annotations for 98.6% of a set of 667 MassBank spectra, 14% more than KEGG alone and equivalent to PubChem while returning far fewer candidates per spectrum than PubChem (46 vs. 1715 median candidates). Application of MINEs to LC–MS accurate mass data enabled the identity of an unknown peak to be confidently predicted. MINE databases are freely accessible for non-commercial use via user-friendly web-tools at http://minedatabase.mcs.anl.gov and developer-friendly APIs. MINEs improve metabolomics peak identification as compared to general chemical databases whose results include irrelevant synthetic compounds. MINEs complement and expand on previous in silico generated compound databases that focus on human metabolism. We are actively developing the database; future versions of this resource will incorporate transformation rules for spontaneous chemical reactions and more advanced filtering and prioritization of candidate structures.

  1. Data Structures for Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahan, Simon

    As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we demonstrated that some of these computations, once thought to require expensive hardware designs and/or complex, special-purpose programming, may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose "latency-tolerant" programming framework. One important application of the ideas underlying this framework is graph database technology supporting social network pattern matching, used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out by the Pacific Northwest National Laboratory, a Department of Energy laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected ahead of the expense of manufacture.

  2. Regional Geologic Map of San Andreas and Related Faults in Carrizo Plain, Temblor, Caliente and La Panza Ranges and Vicinity, California; A Digital Database

    USGS Publications Warehouse

    Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.

    1999-01-01

    This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.

  3. Scale-Up of GRCop: From Laboratory to Rocket Engines

    NASA Technical Reports Server (NTRS)

    Ellis, David L.

    2016-01-01

    GRCop is a high temperature, high thermal conductivity copper-based series of alloys designed primarily for use in regeneratively cooled rocket engine liners. It began with laboratory-level production of a few grams of ribbon produced by chill block melt spinning and has grown to commercial-scale production of large-scale rocket engine liners. Along the way, a variety of methods of consolidating and working the alloy were examined, a database of properties was developed and a variety of commercial and government applications were considered. This talk will briefly address the basic material properties used for selection of compositions to scale up, the methods used to go from simple ribbon to rocket engines, the need to develop a suitable database, and the issues related to getting the alloy into a rocket engine or other application.

  4. Genotyping and interpretation of STR-DNA: Low-template, mixtures and database matches-Twenty years of research and development.

    PubMed

    Gill, Peter; Haned, Hinda; Bleka, Oyvind; Hansson, Oskar; Dørum, Guro; Egeland, Thore

    2015-09-01

    The introduction of Short Tandem Repeat (STR) DNA was a revolution within a revolution that transformed forensic DNA profiling into a tool that could be used, for the first time, to create National DNA databases. This transformation would not have been possible without the concurrent development of fluorescent automated sequencers, combined with the ability to multiplex several loci together. Use of the polymerase chain reaction (PCR) increased the sensitivity of the method to enable the analysis of a handful of cells. The first multiplexes were simple: 'the quad', introduced by the defunct UK Forensic Science Service (FSS) in 1994, rapidly followed by a more discriminating 'six-plex' (Second Generation Multiplex) in 1995 that was used to create the world's first national DNA database. The success of the database rapidly outgrew the functionality of the original system - by the year 2000 a new multiplex of ten loci was introduced to reduce the chance of adventitious matches. The technology was adopted world-wide, albeit with different loci. The political requirement to introduce pan-European databases encouraged standardisation - the development of the European Standard Set (ESS) of markers, comprising twelve loci, is the latest iteration. Although development has been impressive, the methods used to interpret evidence have lagged behind. For example, the theory to interpret complex DNA profiles (low-level mixtures) had been developed fifteen years ago, but only in the past year or so have the concepts started to be widely adopted. A plethora of different models (some commercial and others non-commercial) have appeared. This has led to a confusing 'debate' about the 'best' one to use. The different models available are described along with their advantages and disadvantages. A section discusses the development of national DNA databases, along with details of an associated controversy over estimating the strength of evidence of matches. Current methodology is limited to searches of complete profiles - another example where the interpretation of matches has not kept pace with development of theory. STRs have also transformed the area of Disaster Victim Identification (DVI), which frequently requires kinship analysis. However, genotyping efficiency is complicated by complex, degraded DNA profiles. Finally, there is now a detailed understanding of the causes of stochastic effects that cause DNA profiles to exhibit the phenomena of drop-out and drop-in, along with artefacts such as stutters. The phenomena discussed include: heterozygote balance; stutter; degradation; the effect of decreasing quantities of DNA; the dilution effect. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Development of a 20-locus fluorescent multiplex system as a valuable tool for national DNA database.

    PubMed

    Jiang, Xianhua; Guo, Fei; Jia, Fei; Jin, Ping; Sun, Zhu

    2013-02-01

    The multiplex system allows the detection of 19 autosomal short tandem repeat (STR) loci [including all Combined DNA Index System (CODIS) STR loci as well as D2S1338, D6S1043, D12S391, D19S433, Penta D and Penta E] plus the sex-determining locus Amelogenin in a single reaction, comprising all STR loci in the various commercial kits used in the China national DNA database (NDNAD). Primers are designed so that the amplicons range from 90 base pairs (bp) to 450 bp within a five-dye fluorescent design, with the fifth dye reserved for the internal size standard. With 30 cycles, 125 pg to 2 ng of DNA template gave optimal profiling results, while robust profiles could also be achieved by adjusting the cycle number for DNA template amounts outside that optimal input range. Mixture studies showed that 83% and 87% of minor alleles were detected at 9:1 and 1:9 ratios, respectively. When 4 ng of degraded DNA was digested by 2-min DNase, and when 1 ng of undegraded DNA was added to 400 μM haematin, complete profiles were still observed. Polymerase chain reaction (PCR)-based procedures were examined and optimized, including the concentrations of the primer set, magnesium and Taq polymerase, as well as volume, cycle number and annealing temperature. In addition, the system has been validated with 3000 bloodstain samples and 35 common case samples in line with the Chinese National Standards and Scientific Working Group on DNA Analysis Methods (SWGDAM) guidelines. The total probability of identity (TPI) can reach 8×10⁻²⁴, so the DNA database can grow to 10 million DNA profiles or more, because the expected number of adventitious matches (4×10⁻¹⁰) is far below one and can be neglected. Further, the system also demonstrates good performance on case samples and will be an ideal tool for forensic DNA typing and databasing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
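
    The TPI arithmetic in the abstract can be reproduced as follows; the per-locus values are hypothetical, but the final pairwise-match calculation uses the paper's own figures (a TPI of 8×10⁻²⁴ and a 10-million-profile database).

    ```python
    from functools import reduce

    def total_probability_of_identity(locus_pi):
        """Multiply per-locus probabilities of identity, assuming independent
        loci, as in routine forensic match-probability calculations."""
        return reduce(lambda a, b: a * b, locus_pi, 1.0)

    def expected_matching_pairs(tpi, n):
        """Expected adventitious matches among all n*(n-1)/2 profile pairs."""
        return tpi * n * (n - 1) / 2

    # Hypothetical per-locus PI values for a 19-locus autosomal multiplex:
    print(f"TPI = {total_probability_of_identity([0.06] * 19):.1e}")

    # With the paper's figures: a TPI of 8e-24 and a 10-million-profile
    # database give ~4e-10 expected matches, far below one.
    print(f"expected matches = {expected_matching_pairs(8e-24, 10_000_000):.0e}")
    ```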

  6. Mining biological databases for candidate disease genes

    NASA Astrophysics Data System (ADS)

    Braun, Terry A.; Scheetz, Todd; Webster, Gregg L.; Casavant, Thomas L.

    2001-07-01

    The publicly-funded effort to sequence the complete nucleotide sequence of the human genome, the Human Genome Project (HGP), has currently produced more than 93% of the 3 billion nucleotides of the human genome in a preliminary 'draft' format. In addition, several valuable sources of information have been developed as direct and indirect results of the HGP. These include the sequencing of model organisms (rat, mouse, fly, and others), gene discovery projects (ESTs and full-length), and new technologies such as expression analysis and resources (micro-arrays or gene chips). These resources are invaluable for researchers identifying the functional genes of the genome that transcribe and translate into the transcriptome and proteome, both of which potentially contain orders of magnitude more complexity than the genome itself. Preliminary analyses of these data identified approximately 30,000 - 40,000 human 'genes.' However, the bulk of the effort still remains: to identify the functional and structural elements contained within the transcriptome and proteome, and to associate function in the transcriptome and proteome with genes. A fortuitous consequence of the HGP is the existence of hundreds of databases containing biological information that may contain relevant data pertaining to the identification of disease-causing genes. The task of mining these databases for information on candidate genes is a commercial application of enormous potential. We are developing a system to acquire and mine data from specific databases to aid our efforts to identify disease genes. A high-speed cluster of Linux workstations is used to analyze sequences and perform distributed sequence alignments as part of our data mining and processing. This system has been used to mine GeneMap99 sequences within specific genomic intervals to identify potential candidate disease genes associated with Bardet-Biedl Syndrome (BBS).

  7. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos

    2017-02-15

    An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform, near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.
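
    The spectral contrast angle named in the abstract is simply the angle between two spectra treated as intensity vectors binned on a common m/z grid; a minimal sketch with made-up spectra follows (the binning and the pen names are hypothetical).

    ```python
    import numpy as np

    def spectral_contrast_angle(a, b):
        """Angle between two intensity vectors on a common m/z grid:
        0 for identical spectra, pi/2 for completely dissimilar ones."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos_theta, -1.0, 1.0))

    # Hypothetical binned spectra: an unknown ink vs. two database entries.
    unknown = [0.0, 10.0, 55.0, 3.0]
    pen_a = [0.0, 12.0, 50.0, 2.0]
    pen_b = [40.0, 1.0, 5.0, 30.0]

    # Classify to the database spectrum with the smallest contrast angle.
    print(min(("pen A", spectral_contrast_angle(unknown, pen_a)),
              ("pen B", spectral_contrast_angle(unknown, pen_b)),
              key=lambda t: t[1]))
    ```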

  8. External access to ALICE controls conditions data

    NASA Astrophysics Data System (ADS)

    Jadlovský, J.; Jadlovská, A.; Sarnovský, J.; Jajčišin, Š.; Čopík, M.; Jadlovská, S.; Papcun, P.; Bielek, R.; Čerkala, J.; Kopčík, M.; Chochula, P.; Augustinus, A.

    2014-06-01

    ALICE Controls data produced by the commercial SCADA system WINCCOA is stored in an ORACLE database on the private experiment network. The SCADA system allows for basic access to and processing of the historical data. More advanced analysis requires tools like ROOT and therefore needs a separate access method to the archives. The present scenario expects detector experts to create simple WINCCOA scripts, which retrieve and store data in a form usable for further studies. This relatively simple procedure generates a lot of administrative overhead - users have to request the data, experts are needed to run the scripts, and the results have to be exported outside of the experiment network. The new mechanism profits from a database replica running on the CERN campus network. Access to this database is not restricted, and there is no risk of generating a heavy load affecting the operation of the experiment. The tools presented in this paper allow access to these data. Users can use web-based tools to generate requests, consisting of the data identifiers and the period of time of interest. The administrators maintain full control over the data - an authorization and authentication mechanism helps to assign privileges to selected users and restrict access to certain groups of data. An advanced caching mechanism allows users to profit from the presence of already processed data sets. This feature significantly reduces the time required for debugging, as the retrieval of raw data can last tens of minutes. A highly configurable client allows for information retrieval bypassing the interactive interface. This method is, for example, used by ALICE Offline to extract operational conditions after a run is completed. Last but not least, the software can easily be adapted to any underlying database structure and is therefore not limited to WINCCOA.

  9. Advances in Data Management in Remote Sensing and Climate Modeling

    NASA Astrophysics Data System (ADS)

    Brown, P. G.

    2014-12-01

    Recent commercial interest in "Big Data" information systems has yielded little more than a sense of deja vu among scientists, whose work has always required getting their arms around extremely large databases and writing programs to explore and analyze them. On the flip side, some commercial DBMS startups are building "Big Data" platforms using techniques taken from earth science, astronomy, high energy physics and high performance computing. In this talk, we will introduce one such platform: Paradigm4's SciDB, the first DBMS designed from the ground up to combine the kinds of quality-of-service guarantees made by SQL DBMS platforms—high-level data model, query languages, extensibility, transactions—with the kinds of functionality familiar to scientific users—arrays as structural building blocks, integrated linear algebra, and client language interfaces that minimize the learning curve. We will review how SciDB is used to manage and analyze earth science data by several teams of scientific users.

  10. PR-EDB: Power Reactor Embrittlement Database - Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John; Subramani, Ranjit

    2008-03-01

    The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S. designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. "User-friendly" utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. The PR-EDB Version 3.0 upgrades Version 2.0. The package was developed based on the Microsoft .NET framework technology and uses Microsoft Access for backend data storage, and Microsoft Excel for plotting graphs. This software package is compatible with Windows (98 or higher) and has been built with a highly versatile user interface. PR-EDB Version 3.0 also contains an "Evaluated Residual File" utility for generating the evaluated processed files used for radiation embrittlement study.
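
    The Charpy fitting utility presumably implements something like the common hyperbolic-tangent transition-curve fit; below is a generic sketch with invented surveillance data, not the PR-EDB algorithm itself.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def charpy_tanh(T, A, B, T0, C):
        """Common hyperbolic-tangent form for Charpy transition curves:
        lower shelf = A - B, upper shelf = A + B, mid-transition at T0."""
        return A + B * np.tanh((T - T0) / C)

    # Invented surveillance-capsule data (temperature in degC, energy in J):
    T = np.array([-80, -40, 0, 20, 60, 100, 150], dtype=float)
    E = np.array([5, 12, 40, 70, 110, 130, 135], dtype=float)

    (A, B, T0, C), _ = curve_fit(charpy_tanh, T, E, p0=[70, 65, 20, 40])
    print(f"lower shelf ~ {A - B:.0f} J, upper shelf ~ {A + B:.0f} J, "
          f"mid-transition near {T0:.0f} degC")
    ```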

  11. Physical Science Informatics: Providing Open Science Access to Microheater Array Boiling Experiment Data

    NASA Technical Reports Server (NTRS)

    McQuillen, John; Green, Robert D.; Henrie, Ben; Miller, Teresa; Chiaramonte, Fran

    2014-01-01

    The Physical Science Informatics (PSI) system is the next step in an effort to make NASA-sponsored flight data available to the scientific and engineering community, along with the general public. The experimental data, drawn from six overall disciplines including Combustion Science, Fluid Physics, Complex Fluids, Fundamental Physics, and Materials Science, will present some unique challenges. Besides data in textual or numerical format, large portions of both the raw and analyzed data for many of these experiments are digital images and video, imposing large data storage requirements. In addition, the accessible data will include experiment design and engineering data (including applicable drawings), any analytical or numerical models, publications, reports, and patents, and any commercial products developed as a result of the research. The objectives of this paper include the following: present the preliminary layout (Figure 2) of MABE data within the PSI database; obtain feedback on the layout; and present the procedure for obtaining access to this database.

  12. Word aligned bitmap compression method, data structure, and apparatus

    DOEpatents

    Wu, Kesheng; Shoshani, Arie; Otoo, Ekow

    2004-12-14

    The Word-Aligned Hybrid (WAH) bitmap compression method and data structure is a relatively efficient method for searching and performing logical, counting, and pattern location operations upon large datasets. The technique is comprised of a data structure and methods that are optimized for computational efficiency by using the WAH compression method, which typically takes advantage of the target computing system's native word length. WAH is particularly apropos to infrequently varying databases, including those found in the on-line analytical processing (OLAP) industry, due to the increased computational efficiency of the WAH compressed bitmap index. Some commercial database products already include some version of a bitmap index, which could possibly be replaced by the WAH bitmap compression techniques for potentially increased operation speed, as well as increased efficiencies in constructing compressed bitmaps. Combined together, this technique may be particularly useful for real-time business intelligence. Additional WAH applications may include scientific modeling, such as climate and combustion simulations, to minimize search time for analysis and subsequent data visualization.
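
    A toy rendering of the WAH idea, assuming 32-bit words: runs of identical all-0 or all-1 31-bit groups collapse into fill words, and mixed groups are stored verbatim as literal words. This is a simplification of the patented method, for illustration only.

    ```python
    WORD = 32          # word size assumed for this illustration
    GROUP = WORD - 1   # WAH stores bits in 31-bit groups within 32-bit words

    def wah_encode(bits):
        """Toy WAH encoder: runs of uniform 31-bit groups become fill words
        (MSB set, next bit = fill value, remaining bits = run count); mixed
        groups are stored verbatim as literal words (MSB clear)."""
        bits = list(bits) + [0] * (-len(bits) % GROUP)   # pad to 31-bit groups
        groups = [bits[i:i + GROUP] for i in range(0, len(bits), GROUP)]
        words, run_val, run_len = [], None, 0

        def flush_run():
            nonlocal run_val, run_len
            if run_len:
                words.append((1 << (WORD - 1)) | (run_val << (WORD - 2)) | run_len)
                run_val, run_len = None, 0

        for g in groups:
            if all(b == g[0] for b in g):           # uniform group: extend run
                if g[0] == run_val:
                    run_len += 1
                else:
                    flush_run()
                    run_val, run_len = g[0], 1
            else:                                   # mixed group: literal word
                flush_run()
                lit = 0
                for b in g:
                    lit = (lit << 1) | b
                words.append(lit)
        flush_run()
        return words

    # 62 zero bits collapse into one fill word; the mixed group that follows
    # becomes a single literal word.
    example = [0] * 62 + [1, 0, 1] + [0] * 28
    print([hex(w) for w in wah_encode(example)])   # ['0x80000002', '0x50000000']
    ```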

  13. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility to perform in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
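
    Basic graph matching for a SPARQL-style conjunctive query can be sketched as variable unification across triple patterns; the triples and query below are invented, and the paper's generated parallel code is of course far more sophisticated than this sequential toy.

    ```python
    # Invented RDF-style triples (subject, predicate, object):
    triples = [("alice", "knows", "bob"),
               ("bob", "knows", "carol"),
               ("carol", "worksAt", "lab1")]

    def match(pattern, binding, triples):
        """Yield bindings that extend 'binding' so that one triple pattern
        (variables spelled '?x') matches a stored triple."""
        for t in triples:
            b = dict(binding)
            for p, v in zip(pattern, t):
                if p.startswith("?"):
                    if b.setdefault(p, v) != v:   # conflicting binding
                        break
                elif p != v:                      # constant mismatch
                    break
            else:
                yield b

    def query(patterns, triples):
        """Solve a conjunction of triple patterns (a basic graph pattern)."""
        bindings = [{}]
        for pat in patterns:
            bindings = [b2 for b in bindings for b2 in match(pat, b, triples)]
        return bindings

    # Whom does alice know, and whom do they know that works at lab1?
    print(query([("alice", "knows", "?x"),
                 ("?x", "knows", "?y"),
                 ("?y", "worksAt", "lab1")], triples))
    ```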

  14. What Is New in Clinical Microbiology—Microbial Identification by MALDI-TOF Mass Spectrometry

    PubMed Central

    Murray, Patrick R.

    2012-01-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) offers the possibility of accurate, rapid, inexpensive identification of bacteria, fungi, and mycobacteria isolated in clinical microbiology laboratories. The procedures for preanalytic processing of organisms and analysis by MALDI-TOF MS are technically simple and reproducible, and commercial databases and interpretive algorithms are available for the identification of a wide spectrum of clinically significant organisms. Although only limited work has been reported on the use of this technique to identify molds, perform strain typing, or determine antibiotic susceptibility results, these are fruitful areas of promising research. As experience is gained with MALDI-TOF MS, it is expected that the databases will be expanded to resolve many of the current inadequate identifications (eg, no identification, genus-level identification) and algorithms for potential misidentification will be developed. The current lack of Food and Drug Administration approval of any MALDI-TOF MS system for organism identification limits widespread use in the United States. PMID:22795961

  15. Security and matching of partial fingerprint recognition systems

    NASA Astrophysics Data System (ADS)

    Jea, Tsai-Yang; Chavan, Viraj S.; Govindaraju, Venu; Schneider, John K.

    2004-08-01

    Despite advances in fingerprint identification techniques, matching incomplete or partial fingerprints still poses a difficult challenge. While the introduction of compact silicon chip-based sensors that capture only part of the fingerprint area has made this problem important from a commercial perspective, there is also considerable interest in the topic for processing partial and latent fingerprints obtained at crime scenes. Attempts to match partial fingerprints using alignment techniques based on singular ridge structures fail when the partial print does not include such structures (e.g., core or delta). We present a multi-path fingerprint matching approach that utilizes localized secondary features derived using only the relative information of minutiae. Since the minutia-based fingerprint representation is an ANSI-NIST standard, our approach has the advantage of being directly applicable to already existing databases. We also analyze the vulnerability of partial fingerprint identification systems to brute force attacks. The described matching approach has been tested on FVC2002's DB1 database. The experimental results show that our approach achieves an equal error rate of 1.25% and a total error rate of 1.8% (with FAR at 0.2% and FRR at 1.6%).
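
    Equal error rate, the headline metric reported, is the operating point where the false accept rate (FAR) and false reject rate (FRR) coincide; below is a generic threshold-sweep sketch with synthetic score distributions, not FVC2002 data.

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """Sweep a decision threshold over the observed match scores and
        return the rate at the point where FAR and FRR are closest."""
        best_gap, eer = float("inf"), None
        for t in np.sort(np.concatenate([genuine, impostor])):
            far = np.mean(impostor >= t)    # impostor pairs wrongly accepted
            frr = np.mean(genuine < t)      # genuine pairs wrongly rejected
            gap = abs(far - frr)
            if gap < best_gap:
                best_gap, eer = gap, (far + frr) / 2
        return eer

    # Synthetic score distributions standing in for matcher output:
    rng = np.random.default_rng(0)
    genuine = rng.normal(0.7, 0.1, 1000)    # same-finger comparison scores
    impostor = rng.normal(0.4, 0.1, 1000)   # different-finger comparison scores
    print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")
    ```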

  16. Global Location-Based Access to Web Applications Using Atom-Based Automatic Update

    NASA Astrophysics Data System (ADS)

    Singh, Kulwinder; Park, Dong-Won

    We propose an architecture which enables people to enquire about information available in directory services by voice using regular phones. We implement a Virtual User Agent (VUA) which mediates between the human user and a business directory service. The system enables the user to search for the nearest clinic, gas station by price, motel by price, food/coffee, banks/ATMs etc., and fix an appointment, or automatically establish a call between the user and the business party if the user prefers. The user also has an option to receive appointment confirmation by phone, SMS, or e-mail. The VUA is accessible via a toll-free DID (Direct Inward Dialing) number using a phone by anyone, anywhere, anytime. We use the Euclidean formula for distance measurement, since shorter geodesic distances (on the Earth's surface) correspond to shorter Euclidean distances (measured by a straight line through the Earth). Our proposed architecture uses the Atom XML syndication format for data integration, VoiceXML for creating the voice user interface (VUI) and CCXML for controlling the call components. We also provide an efficient algorithm for parsing the Atom feeds which supply data to the system. Moreover, we describe a cost-effective way of providing global access to the VUA based on Asterisk (an open-source IP-PBX). We also provide some information on how our system can be integrated with GPS for locating the user's coordinates and thereby enhancing the system's response efficiently and spontaneously. Additionally, the system has a mechanism for validating the phone numbers in its database, and it automatically updates numbers and other information, such as the daily price of gas or motel rates, using an Atom-based feed. Currently, commercial directory services (for example, 411) do not have facilities to update their listings automatically, which is why callers often get out-of-date phone numbers or other information. Our system can be integrated very easily with an existing web infrastructure, thereby making the wealth of Web information easily available to the user by phone. This kind of system can be deployed as an extension to 911 and 411 services to share the workload with human operators. This paper presents all the underlying principles, architecture, features, and an example of a real-world deployment of our proposed system. The source code and documentation are available for commercial production.
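
    The Euclidean (chord) distance the system relies on can be computed by converting latitude/longitude to 3-D Cartesian coordinates; the caller and station coordinates below are hypothetical.

    ```python
    import math

    EARTH_RADIUS_KM = 6371.0

    def chord_distance_km(lat1, lon1, lat2, lon2):
        """Straight-line (Euclidean) distance through the Earth between two
        points given in degrees. It is monotone in geodesic distance, so it
        preserves 'nearest business' rankings for a directory search."""
        def to_xyz(lat, lon):
            la, lo = math.radians(lat), math.radians(lon)
            return (EARTH_RADIUS_KM * math.cos(la) * math.cos(lo),
                    EARTH_RADIUS_KM * math.cos(la) * math.sin(lo),
                    EARTH_RADIUS_KM * math.sin(la))
        return math.dist(to_xyz(lat1, lon1), to_xyz(lat2, lon2))

    # Hypothetical caller and gas-station coordinates (degrees):
    caller = (37.77, -122.42)
    stations = {"station A": (37.80, -122.41), "station B": (37.60, -122.38)}
    print(min(stations, key=lambda s: chord_distance_km(*caller, *stations[s])))
    ```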

  17. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, D; Anderson, C; Mayo, C

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response models. Supported by NIH - P01 - CA059827.

  18. North Carolina ITS/CVO : business plan

    DOT National Transportation Integrated Search

    1997-11-01

    The Department of Transportation has led the movement to automate information and capture data at the first generation. Commercial Vehicle Operations (CVO) are specifically targeted with the interface of all databases either complete or under develop...

  19. How commercial and non-commercial swine producers move pigs in Scotland: a detailed descriptive analysis

    PubMed Central

    2014-01-01

    Background: The impact of non-commercial producers on disease spread via livestock movement is related to their level of interaction with other commercial actors within the industry. Although understanding these relationships is crucial in order to identify likely routes of disease incursion and transmission prior to disease detection, there has been little research in this area due to the difficulties of capturing movements of small producers with sufficient resolution. Here, we used the Scottish Livestock Electronic Identification and Traceability (ScotEID) database to describe the movement patterns of different pig production systems which may affect the risk of disease spread within the swine industry. In particular, we focused on the role of small pig producers.

    Results: Between January 2012 and May 2013, 23,169 batches of pigs were recorded moving animals between 2382 known unique premises. Although the majority of movements (61%) were to a slaughterhouse, the non-commercial and the commercial sectors of the Scottish swine industry coexist, with on- and off-movement of animals occurring relatively frequently. For instance, 13% and 4% of non-slaughter movements from professional producers were sent to a non-assured commercial producer or to a small producer, respectively; whereas 43% and 22% of movements from non-assured commercial farms were sent to a professional or a small producer, respectively. We further identified differences between producer types in several animal movement characteristics which are known to increase the risk of disease spread. Particularly, the distance travelled and the use of haulage were found to be significantly different between producers.

    Conclusions: These results showed that commercial producers are not isolated from the non-commercial sector of the Scottish swine industry and may frequently interact, either directly or indirectly. The observed patterns in the frequency of movements, the type of producers involved, the distance travelled and the use of haulage companies provide insights into the structure of the Scottish swine industry, but also highlight different features that may increase the risk of infectious diseases spread in both Scotland and the UK. Such knowledge is critical for developing more robust biosecurity and surveillance plans and better preparing Scotland against incursions of emerging swine diseases. PMID:24965915

  20. How commercial and non-commercial swine producers move pigs in Scotland: a detailed descriptive analysis.

    PubMed

    Porphyre, Thibaud; Boden, Lisa A; Correia-Gomes, Carla; Auty, Harriet K; Gunn, George J; Woolhouse, Mark E J

    2014-06-25

    The impact of non-commercial producers on disease spread via livestock movement is related to their level of interaction with other commercial actors within the industry. Although understanding these relationships is crucial in order to identify likely routes of disease incursion and transmission prior to disease detection, there has been little research in this area due to the difficulties of capturing movements of small producers with sufficient resolution. Here, we used the Scottish Livestock Electronic Identification and Traceability (ScotEID) database to describe the movement patterns of different pig production systems which may affect the risk of disease spread within the swine industry. In particular, we focused on the role of small pig producers. Between January 2012 and May 2013, 23,169 batches of pigs were recorded moving animals between 2382 known unique premises. Although the majority of movements (61%) were to a slaughterhouse, the non-commercial and the commercial sectors of the Scottish swine industry coexist, with on- and off-movement of animals occurring relatively frequently. For instance, 13% and 4% of non-slaughter movements from professional producers were sent to a non-assured commercial producer or to a small producer, respectively; whereas 43% and 22% of movements from non-assured commercial farms were sent to a professional or a small producer, respectively. We further identified differences between producer types in several animal movement characteristics which are known to increase the risk of disease spread. Particularly, the distance travelled and the use of haulage were found to be significantly different between producers. These results showed that commercial producers are not isolated from the non-commercial sector of the Scottish swine industry and may frequently interact, either directly or indirectly. The observed patterns in the frequency of movements, the type of producers involved, the distance travelled and the use of haulage companies provide insights into the structure of the Scottish swine industry, but also highlight different features that may increase the risk of infectious diseases spread in both Scotland and the UK. Such knowledge is critical for developing more robust biosecurity and surveillance plans and better preparing Scotland against incursions of emerging swine diseases.

  1. Evaluation of RDBMS packages for use in astronomy

    NASA Technical Reports Server (NTRS)

    Page, C. G.; Davenhall, A. C.

    1992-01-01

    Tabular data sets arise in many areas of astronomical data analysis, from raw data (such as photon event lists) to final results (such as source catalogs). The Starlink catalog access and reporting package, SCAR, was originally developed to handle IRAS data and it has been the principal relational DBMS in the Starlink software collection for several years. But SCAR has many limitations and is VMS-specific, while Starlink is in transition from VMS to Unix. Rather than attempt a major re-write of SCAR for Unix, it seemed more sensible to see whether any existing database packages are suitable for general astronomical use. The authors first drew up a list of desirable properties for such a system and then used these criteria to evaluate a number of packages, both free ones and those commercially available. It is already clear that most commercial DBMS packages are not very well suited to the requirements; for example, most cannot carry out efficiently even fairly basic operations such as joining two catalogs on an approximate match of celestial positions. This paper reports the results of the evaluation exercise and notes the problems in using a standard DBMS package to process scientific data. In parallel with this the authors have started to develop a simple database engine that can handle tabular data in a range of common formats including simple direct-access files (such as SCAR and Exosat DBMS tables) and FITS tables (both ASCII and binary).
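
    The approximate positional join the authors describe is easy to state in code even though it maps poorly onto standard SQL. Below is a minimal illustrative sketch (not from the paper; the catalog contents, function names, and the 1-arcsecond default radius are invented) of cross-matching two catalogs on angular separation:

      import math

      # Hypothetical sketch: join two source catalogs on an approximate match
      # of celestial positions (RA/Dec in degrees), the operation the authors
      # note most commercial DBMS packages cannot perform efficiently.

      def angular_sep_deg(ra1, dec1, ra2, dec2):
          """Great-circle separation in degrees (haversine formula)."""
          ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
          h = (math.sin((dec2 - dec1) / 2) ** 2
               + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
          return math.degrees(2 * math.asin(math.sqrt(h)))

      def cross_match(cat_a, cat_b, radius_deg=1.0 / 3600):
          """Naive O(n*m) join: pair each source in cat_a with every source
          in cat_b lying within radius_deg (default: 1 arcsecond)."""
          return [(ida, idb)
                  for ida, ra_a, dec_a in cat_a
                  for idb, ra_b, dec_b in cat_b
                  if angular_sep_deg(ra_a, dec_a, ra_b, dec_b) <= radius_deg]

      catalog_a = [("A1", 10.684, 41.269)]
      catalog_b = [("B7", 10.6841, 41.2691), ("B9", 150.0, 2.2)]
      print(cross_match(catalog_a, catalog_b))  # [('A1', 'B7')]

    A production system would index the positions (for example with a spatial tree) rather than scan all pairs; performing this kind of join efficiently is precisely the capability the authors found lacking in most packages they evaluated.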

  2. Prospects for research in haemophilia with real-world data-An analysis of German registry and secondary data.

    PubMed

    Schopohl, D; Bidlingmaier, C; Herzig, D; Klamroth, R; Kurnik, K; Rublee, D; Schramm, W; Schwarzkopf, L; Berger, K

    2018-02-28

    Open questions in haemophilia, such as the effectiveness of innovative therapies, clinical and patient-reported outcomes (PROs), epidemiology and cost, await answers. The aim was to identify the data attributes required and to investigate the availability, appropriateness and accessibility of real-world data (RWD) from German registries and secondary databases to answer the aforementioned questions. Systematic searches were conducted in BIOSIS, EMBASE and MEDLINE to identify non-commercial secondary healthcare databases and registries of patients with haemophilia (PWH). Inclusion of German patients, type of patients, data elements (stratified by use in epidemiology, safety, outcomes and health economics research) and accessibility were investigated by desk research. Screening of 676 hits identified four registries [national PWH (DHR), national/international paediatric (GEPARD, PEDNET), international safety monitoring (EUHASS)] and seven national secondary databases. Access was limited to participants in three registries and to employees in one secondary database. One registry asks for PROs. Limitations of secondary databases originate from the ICD coding system (which captures neither the severity of haemophilia nor the presence of inhibitory antibodies), data protection laws and the need to monitor reliability. Rigorous observational analysis of German haemophilia RWD shows that there is potential to supplement current knowledge and begin to address selected policy goals. To improve the value of existing RWD, the following efforts are proposed: ethical, legal and methodological discussions on data linkage across different sources, formulation of transparent governance rules for data access, redefinition of the ICD coding, standardized collection of outcome data and implementation of incentives for treatment centres to improve data collection. © 2018 John Wiley & Sons Ltd.

  3. Genomic Target Database (GTD): A database of potential targets in human pathogenic bacteria

    PubMed Central

    Barh, Debmalya; Kumar, Anil; Misra, Amarendra Narayana

    2009-01-01

    A Genomic Target Database (GTD) has been developed containing putative genomic drug targets for human bacterial pathogens. The selected pathogens are either drug resistant or vaccines are yet to be developed against them. The drug targets have been identified using subtractive genomics approaches and are subsequently classified into (i) drug targets in pathogen-specific unique metabolic pathways, (ii) drug targets in host-pathogen common metabolic pathways, and (iii) membrane-localized drug targets. HTML code is used to link each target to its various properties and other available public resources. Essential resources and tools for subtractive genomic analysis, sub-cellular localization, and vaccine and drug design are also listed. To the best of the authors' knowledge, no other database presently lists metabolic pathway and membrane-specific genomic drug targets based on subtractive genomics. The targets listed in GTD are a readily available resource for developing drugs and vaccines against the respective pathogen, its subtypes, and other family members. Currently GTD contains 58 drug targets for four pathogens; drug targets for six more pathogens will be listed shortly. Availability: GTD is available at the IIOAB website http://www.iioab.webs.com/GTD.htm. It can also be accessed at http://www.iioabdgd.webs.com. GTD is free for academic research and non-commercial use only; commercial use is strictly prohibited without prior permission from IIOAB. PMID:20011153

  4. Maximum demand charge rates for commercial and industrial electricity tariffs in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLaren, Joyce; Gagnon, Pieter; Zimny-Schmitt, Daniel

    NREL has assembled a list of U.S. retail electricity tariffs and their associated demand charge rates for the Commercial and Industrial sectors. The data were obtained from the Utility Rate Database. Keep the following information in mind when interpreting the data: (1) These data were interpreted and transcribed manually from utility tariff sheets, which are often complex. It is a certainty that these data contain errors, and they should therefore only be used as a reference; actual utility tariff sheets should be consulted if an action requires this type of data. (2) These data contain only tariffs that were entered into the Utility Rate Database. Since not all tariffs are designed in a format that can be entered into the Database, this list is incomplete: it does not contain all tariffs in the United States. (3) These data may have changed since this list was developed. (4) Many of the underlying tariffs have additional restrictions or requirements that are not represented here; for example, they may only be available to the agricultural sector or closed to new customers. (5) If there are multiple demand charge elements in a given tariff, the maximum demand charge is the sum of each of the elements at any point in time. Where tiers were present, the highest rate tier was assumed. The value is a maximum for the year and may be significantly different from demand charge rates at other times in the year. Utility Rate Database: https://openei.org/wiki/Utility_Rate_Database
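
    As an illustration of rule (5), the following minimal sketch (the tariff structure, element names, and rates are invented, not drawn from the NREL data) computes a yearly maximum demand charge by taking the highest tier of each element and summing the elements that can apply at the same time:

      # Hypothetical sketch of rule (5): take the highest rate tier of each
      # demand charge element, then report the maximum over the year of the
      # sum of the elements that apply at the same time.

      tariff = {
          # element -> {period: [tier rates in $/kW]}
          "facility": {"all_year": [8.50, 10.00]},           # applies every month
          "tou_peak": {"summer": [4.25, 6.75], "winter": [2.00]},
      }

      def max_demand_charge(tariff):
          # Highest tier per element and period.
          peak = {e: {p: max(tiers) for p, tiers in periods.items()}
                  for e, periods in tariff.items()}
          # Sum the co-applicable elements per season, then take the maximum.
          seasons = {"summer": ("all_year", "summer"),
                     "winter": ("all_year", "winter")}
          return max(sum(rate for periods in peak.values()
                         for p, rate in periods.items() if p in applicable)
                     for applicable in seasons.values())

      print(max_demand_charge(tariff))  # 10.00 + 6.75 = 16.75 $/kW (summer)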

  5. Ineffectiveness of commercial weight-loss programs for achieving modest but meaningful weight loss: Systematic review and meta-analysis.

    PubMed

    McEvedy, Samantha M; Sullivan-Mort, Gillian; McLean, Siân A; Pascoe, Michaela C; Paxton, Susan J

    2017-10-01

    This study collates existing evidence regarding weight loss among overweight but otherwise healthy adults who use commercial weight-loss programs. Systematic search of 3 databases identified 11 randomized controlled trials and 14 observational studies of commercial meal-replacement, calorie-counting, or pre-packaged meal programs which met inclusion criteria. In meta-analysis using intention-to-treat data, 57 percent of individuals who commenced a commercial weight-loss program lost less than 5 percent of their initial body weight. One in two studies (49%) reported attrition ≥30 percent. A second meta-analysis found that 37 percent of program completers lost less than 5 percent of initial body weight. We conclude that commercial weight-loss programs frequently fail to produce modest but clinically meaningful weight loss, with high rates of attrition suggesting that many consumers find the dietary changes required by these programs unsustainable.
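
    The intention-to-treat convention behind the headline figure can be made concrete with a small sketch (the counts below are invented, not taken from the meta-analysis): dropouts stay in the denominator and are counted as not reaching the 5 percent threshold.

      # Invented-numbers sketch: under intention-to-treat (ITT), everyone who
      # enrolled is in the denominator and dropouts count as not reaching the
      # 5% weight-loss threshold; a completers-only analysis excludes them.

      enrolled = 200
      dropped_out = 60            # 30% attrition, the level half the studies exceeded
      completers_lost_5pct = 88   # completers who lost >= 5% of initial body weight

      itt_success = completers_lost_5pct / enrolled                        # 0.44
      completer_success = completers_lost_5pct / (enrolled - dropped_out)  # ~0.63

      print(f"ITT success rate:       {itt_success:.0%}")
      print(f"Completer success rate: {completer_success:.0%}")

    The gap between the two rates is why the review reports both an intention-to-treat estimate and a separate meta-analysis of program completers.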

  6. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    PubMed

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer science has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for human musculoskeletal system physiology. This simulation technology unites expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models, including prosthetic implants and fracture fixation devices, and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties, and a library of skeletal joint system functional activities and loading conditions is also available, and it can easily be modified, updated, and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort made possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment demonstrate the utility of this unique database and simulation technology. This integrated system will affect medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  7. AST commercial human space flight biomedical data collection

    DOT National Transportation Integrated Search

    2007-02-01

    Recommendations are made for specific biomedical data, equipment, and a database that will increase the knowledge and understanding of how short duration, suborbital space flight missions with brief exposure to microgravity affects the human body. Th...

  8. 78 FR 24805 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Minimum...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Collection; Comment Request; Bank Secrecy Act Suspicious Activity Report Database Proposed Data Fields.'' The...; c. Commercial paper; d. Credit card; e. Debit card; f. Forex transactions; g. Futures/Options on...

  9. Identifying new persistent and bioaccumulative organics among chemicals in commerce.

    PubMed

    Howard, Philip H; Muir, Derek C G

    2010-04-01

    The goal of this study was to identify commercial chemicals that might be persistent and bioaccumulative (P&B) and that were not being considered in current Great Lakes, North American, and Arctic contaminant measurement programs. We combined the Canadian Domestic Substance List (DSL), a list of 3,059 substances of "unknown or variable composition, complex reaction products and biological materials" (UVCBs), and the U.S. Environmental Protection Agency (U.S. EPA) Toxic Substances Control Act (TSCA) Inventory Update Rule (IUR) database for years 1986, 1990, 1994, 1998, 2002, and 2006, yielding a database of 22,263 commercial chemicals. From that list, 610 chemicals were identified using estimates from U.S. EPA EPISuite software and expert judgment. This study has yielded some interesting and probable P&B chemicals that should be considered for further study. Recent studies, following up our initial reports and presentations on this work, have confirmed the presence of many of these chemicals in the environment.

  10. Indexing and retrieving point and region objects

    NASA Astrophysics Data System (ADS)

    Ibrahim, Azzam T.; Fotouhi, Farshad A.

    1996-03-01

    The R-tree and its variants are examples of spatial data structures for paged secondary memory. To process a query, these structures require multiple path traversals. In this paper, we present a new image access method, the SB+-tree, which requires a single path traversal to process a query. The SB+-tree also gives commercial databases an access method for spatial objects without major changes, since most commercial databases already support the B+-tree as an access method for text data. The SB+-tree can be used for zero- and non-zero-size data objects. Non-zero-size objects are approximated by their minimum bounding rectangles (MBRs). The number of SB+-trees generated depends on the number of dimensions of the approximation of the object. The structure supports efficient spatial operations such as region overlap, distance, and direction. In this paper, we experimentally and analytically demonstrate the superiority of the SB+-tree over the R-tree.
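
    The abstract gives only the outline of the SB+-tree, so the sketch below is not the authors' algorithm; it merely illustrates the general idea of answering region-overlap queries through one-dimensional ordered indexes, one per MBR dimension, the way a B+-tree-backed database could. All names and the query semantics shown are assumptions.

      import bisect

      class IntervalIndex:
          """Sorted endpoint list standing in for one B+-tree per dimension."""
          def __init__(self):
              self.intervals = []   # (lo, hi, object_id), kept sorted by lo

          def insert(self, lo, hi, oid):
              bisect.insort(self.intervals, (lo, hi, oid))

          def overlapping(self, qlo, qhi):
              # Sorted by lo, so stop scanning once lo exceeds qhi.
              out = set()
              for lo, hi, oid in self.intervals:
                  if lo > qhi:
                      break
                  if hi >= qlo:
                      out.add(oid)
              return out

      class MBRIndex:
          def __init__(self, dims=2):
              self.dims = [IntervalIndex() for _ in range(dims)]

          def insert(self, mbr, oid):            # mbr = [(lo, hi), ...] per dimension
              for d, (lo, hi) in enumerate(mbr):
                  self.dims[d].insert(lo, hi, oid)

          def regions_overlap(self, query):
              # A candidate must overlap the query range in every dimension.
              sets = [idx.overlapping(lo, hi)
                      for idx, (lo, hi) in zip(self.dims, query)]
              return set.intersection(*sets)

      idx = MBRIndex()
      idx.insert([(0, 2), (0, 2)], "road")
      idx.insert([(5, 9), (5, 9)], "lake")
      print(idx.regions_overlap([(1, 6), (1, 3)]))  # {'road'}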

  11. Locating grey literature on communication disorders.

    PubMed

    Shpilko, Inna

    2005-01-01

    This article provides an overview of selected Web-based resources containing grey literature in the area of communication disorders. It is geared to practitioners, researchers, students, and consumers seeking reliable, freely available scientific information. Grey (or gray) literature has been defined as "that which is produced on all levels of government, academics, business, and industry in print and electronic formats, but which is not controlled by commercial publishers."1 This paper reviews various general reference sources potentially containing grey literature on communication disorders. This review includes identification of the methods specialists in this field use to obtain this valuable, yet often overlooked, literature. Access points and search tools for identifying grey literature on communication disorders are recommended. Commercial databases containing grey literature are not included. Conclusions presented in this article are considered complementary to traditionally published information resources on communication disorders, such as scholarly journals, online databases, etc.

  12. Wearable Health Monitoring Systems

    NASA Technical Reports Server (NTRS)

    Bell, John

    2015-01-01

    The shrinking size and weight of electronic circuitry has given rise to a new generation of smart clothing that enables biological data to be measured and transmitted. As the variation in the number and type of deployable devices and sensors increases, technology must allow their seamless integration so they can be electrically powered, operated, and recharged over a digital pathway. Nyx Illuminated Clothing Company has developed a lightweight health monitoring system that integrates medical sensors, electrodes, electrical connections, circuits, and a power supply into a single wearable assembly. The system is comfortable, bendable in three dimensions, durable, waterproof, and washable. The innovation will allow astronaut health monitoring in a variety of real-time scenarios, with data stored in digital memory for later use in a medical database. Potential commercial uses are numerous, as the technology enables medical personnel to noninvasively monitor patient vital signs in a multitude of health care settings and applications.

  13. High-speed data search

    NASA Technical Reports Server (NTRS)

    Driscoll, James N.

    1994-01-01

    The high-speed data search system developed for KSC incorporates existing and emerging information retrieval technology to help a user intelligently and rapidly locate information found in large textual databases. This technology includes: natural language input; statistical ranking of retrieved information; an artificial intelligence concept called semantics, where 'surface level' knowledge found in text is used to improve the ranking of retrieved information; and relevance feedback, where user judgements about viewed information are used to automatically modify the search for further information. Semantics and relevance feedback are features of the system which are not available commercially. The system also focuses on paragraphs of information to decide relevance, and it can be used (without modification) to intelligently search all kinds of document collections, such as collections of legal documents, medical documents, news stories, patents, and so forth. The purpose of this paper is to demonstrate the usefulness of statistical ranking, our semantic improvement, and relevance feedback.

  14. System Study: Residual Heat Removal 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the residual heat removal (RHR) system in two modes of operation (low-pressure injection in response to a large loss-of-coolant accident and post-trip shutdown cooling) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing trends were identified in the RHR results. A highly statistically significant decreasing trend was observed for the RHR injection mode start-only unreliability. Statistically significant decreasing trends were observed for RHR shutdown cooling mode start-only unreliability and RHR shutdown cooling mode 24-hour unreliability.

  15. A wireless blood pressure monitoring system for personal health management.

    PubMed

    Li, Wun-Jin; Luo, Yuan-Long; Chang, Yao-Shun; Lin, Yuan-Hsiang

    2010-01-01

    In this paper, we developed a wireless blood pressure monitoring system which provides a useful tool for users to measure and manage their daily blood pressure values. This system includes an ARM-based blood pressure monitor with a ZigBee wireless transmission module and a PC-based management unit with a graphical user interface and database. The wireless blood pressure monitor can measure blood pressure and heart rate and then store and forward the measurement data to the management unit over the ZigBee wireless link. On the management unit, users can easily review their past blood pressure variation in a line chart. The accuracy of the blood pressure measurement has been verified against a commercial blood pressure simulator, showing a systolic blood pressure bias of ≤ 1 mmHg and a diastolic blood pressure bias of ≤ 1.4 mmHg.
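
    The verification step reduces to a simple bias computation; the sketch below uses invented readings to show the calculation behind the quoted ≤ 1 mmHg (systolic) and ≤ 1.4 mmHg (diastolic) figures.

      # Invented-numbers sketch: compare monitor readings against reference
      # values from a blood pressure simulator and report the mean bias.

      def mean_bias(measured, reference):
          return sum(m - r for m, r in zip(measured, reference)) / len(measured)

      systolic_measured   = [121, 119, 120, 122]
      systolic_reference  = [120, 120, 120, 120]
      diastolic_measured  = [81, 79, 80, 82]
      diastolic_reference = [80, 80, 80, 80]

      print(f"Systolic bias:  {mean_bias(systolic_measured, systolic_reference):+.2f} mmHg")
      print(f"Diastolic bias: {mean_bias(diastolic_measured, diastolic_reference):+.2f} mmHg")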

  16. Open Geoscience Database

    NASA Astrophysics Data System (ADS)

    Bashev, A.

    2012-04-01

    Currently there is an enormous number of geoscience databases. Unfortunately, the only users of most of them are their creators. There are several reasons for this: incompatibility, the specificity of tasks and objects, and so on. The main obstacles to wide usage of geoscience databases, however, are their complexity for developers and their complication for users. Complex architecture leads to high costs that block public access, while complicated interfaces prevent users from understanding when and how to use the database. Only databases associated with GoogleMaps avoid these drawbacks, but they can hardly be called "geoscience" databases. Nevertheless, an open and simple geoscience database is necessary, at least for educational purposes (see our abstract for ESSI20/EOS12). We developed such a database and a web interface to work with it, and it is now accessible at maps.sch192.ru. In this database, a result is the value of a parameter (of any kind) at a station with a certain position, associated with metadata: the date when the result was obtained; the type of station (lake, soil, etc.); and the contributor who sent the result. Each contributor has their own profile, which allows the reliability of the data to be estimated. Results can be displayed on a GoogleMaps space image as points at their positions, coloured according to the value of the parameter. There are default colour scales, and each registered user can create their own. Results can also be extracted to a *.csv file. For both types of representation, the data can be selected by date, object type, parameter type, area and contributor. Data are uploaded in *.csv format: Name of the station; Latitude (dd.dddddd); Longitude (ddd.dddddd); Station type; Parameter type; Parameter value; Date (yyyy-mm-dd). The contributor is recognised on login. This is the minimal set of features required to connect the value of a parameter with a position and view the results. Any more complicated data treatment can be conducted in other programs after extracting the filtered data into a *.csv file. This makes the database understandable for non-experts. The database employs an open data format (*.csv) and widespread tools: PHP as the programming language, MySQL as the database management system, JavaScript for interaction with GoogleMaps, and jQuery UI for the user interface. The database is multilingual: association tables connect translations with elements of the database. In total, the development required about 150 hours. The database still has several problems. The main one is the reliability of the data: it really needs an expert system for estimating reliability, but elaborating such a system would take more resources than the database itself. The second is the problem of stream selection: how to select the stations that are connected with each other (for example, belong to one water stream) and indicate their sequence. Some problems we have already solved. For example, the problem of "the same station" (sometimes the distance between stations is smaller than the error of their positions): when a new station is added to the database, our application automatically finds existing stations near that place. We also solved the problem of object and parameter types (how to treat "EC" and "electrical conductivity" as the same parameter), again using association tables.
    Currently the interface is in English and Russian. If you would like to see the interface in your language, just contact us; we will send you the list of terms and phrases to translate. The main advantage of the database is that it is totally open: everybody can view and extract the data from the database and use them for non-commercial purposes free of charge, and registered users can contribute to the database without payment. We hope that it will be widely used, first of all for educational purposes, but professional scientists could use it as well.
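
    Because the upload format above is a plain semicolon-separated file, the workflow of filtering and extracting results can be reproduced with a few lines of standard-library code. The sketch below is illustrative only; the sample rows and field names are invented, not taken from the actual database.

      import csv
      import io

      # Minimal sketch: read the semicolon-separated upload format described
      # above and filter by parameter type and date range.

      upload = io.StringIO(
          "Lake Baikal N1;53.558900;108.165000;lake;electrical conductivity;96.5;2011-07-14\n"
          "Moskva R3;55.751200;37.618300;river;pH;7.8;2011-08-02\n"
      )

      fields = ["station", "latitude", "longitude", "station_type",
                "parameter_type", "parameter_value", "date"]

      def select(rows, parameter_type, date_from, date_to):
          for row in rows:
              rec = dict(zip(fields, row))
              # ISO dates compare correctly as plain strings.
              if (rec["parameter_type"] == parameter_type
                      and date_from <= rec["date"] <= date_to):
                  yield rec["station"], float(rec["parameter_value"]), rec["date"]

      reader = csv.reader(upload, delimiter=";")
      for hit in select(reader, "electrical conductivity", "2011-01-01", "2011-12-31"):
          print(hit)   # ('Lake Baikal N1', 96.5, '2011-07-14')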

  17. Industry-university cooperation/research

    NASA Technical Reports Server (NTRS)

    Whitten, Raymond P.

    1991-01-01

    The paper concentrates on the commercial development of space programs through cooperative research between U.S. universities and industry. The origins of the programs are discussed, beginning with the Communication Satellite Act of 1963. The National Space Policy is outlined, and the creation of NASA's Office of Commercial Programs is emphasized, along with its Centers for the Commercial Development of Space. It is noted that the centers are consortia of university, industry, and government involved in commercial-space-technology database development and in the research and testing of potentially valuable products and services. The center titles, locations, and brief descriptions are listed for such areas of research as remote sensing, life sciences, materials processing, space power, space propulsion, materials and space structures, and automation and robotics, along with some results of the programs.

  18. A commercial microbial enhanced oil recovery process: statistical evaluation of a multi-project database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Portwood, J.T.

    1995-12-31

    This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special application techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR.

  19. Upset Simulation and Training Initiatives for U.S. Navy Commercial Derived Aircraft

    NASA Technical Reports Server (NTRS)

    Donaldson, Steven; Priest, James; Cunningham, Kevin; Foster, John V.

    2012-01-01

    Militarized versions of commercial platforms are growing in popularity due to many logistical benefits in the form of commercial off-the-shelf (COTS) parts, established production methods, and commonality for different certifications. Commercial data and best practices are often leveraged to reduce procurement and engineering development costs. While the developmental and cost reduction benefits are clear, these militarized aircraft are routinely operated at significantly different flight conditions and in significantly different manners than routine commercial flight. They are therefore at a higher risk of flight envelope exceedance. This risk may lead to departure from controlled flight and/or aircraft loss [1]. Historically, the risk of departure from controlled flight for military aircraft has been mitigated by piloted simulation training and engineering analysis of typical aircraft response. High-agility military aircraft simulation databases are typically developed to include high angles of attack (AoA) and sideslip due to the dynamic nature of their missions, and have been developed for many tactical configurations over the previous decades. These aircraft simulations allow for a more thorough understanding of the vehicle flight dynamics characteristics at high AoA and sideslip. In recent years, government sponsored research on transport airplane aerodynamic characteristics at high angles of attack has produced a growing understanding of stall/post-stall behavior. This research, along with recent commercial airline training initiatives, has resulted in improved understanding of simulator-based training requirements and simulator model fidelity [2-5]. In addition, in-flight training research over the past decade has produced a database of pilot performance and recurrency metrics [6]. Innovative solutions to aerodynamically model large commercial aircraft for upset conditions such as high AoA, high sideslip, and ballistic damage, as well as the capability to accurately account for scaling factors, are necessary to develop realistic engineering and training simulations. Such simulations should significantly reduce the risk of departure from controlled flight and loss of aircraft, and ease the airworthiness certification process. The characteristics of commercial derivative aircraft are exemplified by the P-8A Multi-mission Maritime Aircraft (MMA), and the largest benefits of initial investigation are likely to be yielded from this platform. The database produced would also be utilized by flight dynamics engineers as a means to further develop and investigate vehicle flight characteristics as mission tactics evolve through the years ahead. This paper describes ongoing efforts by the U.S. Navy to develop a methodology for simulation and training for large commercial-derived transport aircraft at unusual attitudes, typically experienced during an aircraft upset. This methodology will be applied to a representative Navy aircraft (P-8A) and utilized to develop a robust simulation that should accurately represent aircraft response in these extremes. Simulation capabilities would then extend to flight dynamics analysis and simulation, as well as potential training applications. Recent evaluations of integrated academic, ground-based simulation, and in-flight upset training are described, along with important lessons learned specific to military requirements.

  20. Establishment of an international database for genetic variants in esophageal cancer.

    PubMed

    Vihinen, Mauno

    2016-10-01

    The establishment of a database has been suggested in order to collect, organize, and distribute genetic information about esophageal cancer. The World Organization for Specialized Studies on Diseases of the Esophagus and the Human Variome Project will be in charge of a central database of information about esophageal cancer-related variations from publications, databases, and laboratories; in addition to genetic details, clinical parameters will also be included. The aim will be to get all the central players in research, clinical, and commercial laboratories to contribute. The database will follow established recommendations and guidelines. The database will require a team of dedicated curators with different backgrounds. Numerous layers of systematics will be applied to facilitate computational analyses. The data items will be extensively integrated with other information sources. The database will be distributed as open access to ensure exchange of the data with other databases. Variations will be reported in relation to reference sequences on three levels (DNA, RNA, and protein) whenever applicable. In the first phase, the database will concentrate on genetic variations, including both somatic and germline variations for susceptibility genes. Additional types of information can be integrated at a later stage. © 2016 New York Academy of Sciences.

  1. DISTRIBUTED STRUCTURE-SEARCHABLE TOXICITY ...

    EPA Pesticide Factsheets

    The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These diverse interests often employ different approaches and have different criteria and use for toxicity assessments, but they share a need for unrestricted access to existing public toxicity data linked with chemical structure information. Currently, there exists no central repository of toxicity information, commercial or public, that adequately meets the data requirements for flexible analogue searching, SAR model development, or building of chemical relational databases (CRD). The Distributed Structure-Searchable Toxicity (DSSTox) Public Database Network is being proposed as a community-supported, web-based effort to address these shared needs of the SAR and toxicology communities. The DSSTox project has the following major elements: 1) to adopt and encourage the use of a common standard file format (SDF) for public toxicity databases that includes chemical structure, text and property information, and that can easily be imported into available CRD applications; 2) to implement a distributed source approach, managed by a DSSTox Central Website, that will enable decentralized, free public access to structure-toxicity data files, and that will effectively link knowledgeable toxicity data s

  2. Onco-STS: a web-based laboratory information management system for sample and analysis tracking in oncogenomic experiments.

    PubMed

    Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard

    2014-01-01

    Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well as other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These meta-data of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows remote access, submission and updating of the sample data in the database. The web application development framework Grails was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual handling of text records. Onco-STS allows simultaneous remote access to the system, making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.

  3. Validation of food store environment secondary data source and the role of neighborhood deprivation in Appalachia, Kentucky.

    PubMed

    Gustafson, Alison A; Lewis, Sarah; Wilson, Corey; Jilcott-Pitts, Stephanie

    2012-08-22

    Based on the need for better measurement of the retail food environment in rural settings and to examine how deprivation may be unique in rural settings, the aims of this study were: 1) to validate one commercially available data source with direct field observations of food retailers; and 2) to examine the association between modified neighborhood deprivation and the modified retail food environment score (mRFEI). Secondary data were obtained from a commercial database, InfoUSA in 2011, on all retail food outlets for each census tract. In 2011, direct observation identifying all listed food retailers was conducted in 14 counties in Kentucky. Sensitivity and positive predictive values (PPV) were compared. Neighborhood deprivation index was derived from American Community Survey data. Multinomial regression was used to examine associations between neighborhood deprivation and the mRFEI score (indicator of retailers selling healthy foods such as low-fat foods and fruits and vegetables relative to retailers selling more energy dense foods). The sensitivity of the commercial database was high for traditional food retailers (grocery stores, supermarkets, convenience stores), with a range of 0.96-1.00, but lower for non-traditional food retailers: dollar stores (0.20) and Farmer's Markets (0.50). For traditional food outlets, the PPV for smaller non-chain grocery stores was 38%, and for large chain supermarkets it was 87%. Compared to those with no stores in their neighborhoods, those with a supercenter [OR 0.50 (95% CI 0.27, 0.97)] or convenience store [OR 0.67 (95% CI 0.51, 0.89)] in their neighborhood have lower odds of living in a low deprivation neighborhood relative to a high deprivation neighborhood. The secondary commercial database used in this study was insufficient to characterize the rural retail food environment. Our findings suggest that neighborhoods with high neighborhood deprivation are associated with having certain store types that may promote less healthy food options.
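
    For reference, the two validation metrics reported above reduce to simple ratios; the counts in this sketch are invented but chosen to reproduce two of the reported values (sensitivity 0.96 for traditional retailers, PPV 38% for small non-chain grocers).

      # Sensitivity = TP / (TP + FN): share of field-observed stores that the
      # commercial database listed. PPV = TP / (TP + FP): share of database
      # listings confirmed by direct observation.

      def sensitivity(tp, fn):
          return tp / (tp + fn)

      def ppv(tp, fp):
          return tp / (tp + fp)

      print(round(sensitivity(tp=48, fn=2), 2))  # 0.96, e.g. traditional retailers
      print(round(ppv(tp=19, fp=31), 2))         # 0.38, e.g. small non-chain grocers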

  4. Validation of food store environment secondary data source and the role of neighborhood deprivation in Appalachia, Kentucky

    PubMed Central

    2012-01-01

    Background: Based on the need for better measurement of the retail food environment in rural settings and to examine how deprivation may be unique in rural settings, the aims of this study were: 1) to validate one commercially available data source with direct field observations of food retailers; and 2) to examine the association between modified neighborhood deprivation and the modified retail food environment score (mRFEI). Methods: Secondary data were obtained from a commercial database, InfoUSA in 2011, on all retail food outlets for each census tract. In 2011, direct observation identifying all listed food retailers was conducted in 14 counties in Kentucky. Sensitivity and positive predictive values (PPV) were compared. Neighborhood deprivation index was derived from American Community Survey data. Multinomial regression was used to examine associations between neighborhood deprivation and the mRFEI score (indicator of retailers selling healthy foods such as low-fat foods and fruits and vegetables relative to retailers selling more energy dense foods). Results: The sensitivity of the commercial database was high for traditional food retailers (grocery stores, supermarkets, convenience stores), with a range of 0.96-1.00, but lower for non-traditional food retailers: dollar stores (0.20) and Farmer's Markets (0.50). For traditional food outlets, the PPV for smaller non-chain grocery stores was 38%, and for large chain supermarkets it was 87%. Compared to those with no stores in their neighborhoods, those with a supercenter [OR 0.50 (95% CI 0.27, 0.97)] or convenience store [OR 0.67 (95% CI 0.51, 0.89)] in their neighborhood have lower odds of living in a low deprivation neighborhood relative to a high deprivation neighborhood. Conclusion: The secondary commercial database used in this study was insufficient to characterize the rural retail food environment. Our findings suggest that neighborhoods with high neighborhood deprivation are associated with having certain store types that may promote less healthy food options. PMID:22914100

  5. High Voltage EEE Parts for EMA/EHA Applications on Manned Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Griffin, Trent; Young, David

    2011-01-01

    The objective of this paper is an assessment of the high voltage electronic components required for high horsepower electric thrust vector control (TVC) systems in human spaceflight launch critical applications. The scope consists of creating a database of available Grade 1 electrical, electronic and electromechanical (EEE) parts suited to this application, a qualification path for potential non-Grade 1 EEE parts that could be used in these designs, and pathfinder testing to validate aspects of the proposed qualification plan. Advances in the state of the art in high power electric power systems enable high horsepower electric actuators, such as the electromechanical actuator (EMA) and the electro-hydrostatic actuator (EHA), to be used in launch vehicle TVC systems, dramatically reducing weight, complexity and operating costs. Designs typically use high voltage insulated gate bipolar transistors (HV-IGBT). However, no Grade 1 HV-IGBT exists and it is unlikely that market factors alone will produce such high quality parts. Furthermore, the perception of risk, the lack of a qualification methodology, the absence of manned space flight heritage and other barriers impede the adoption of commercial grade parts onto the critical path. The method of approach is to identify high voltage electronic component types and key parameters for parts currently used in high horsepower EMA/EHA applications, to search for higher quality substitutes and custom manufacturers, to create a database for these parts, and then to explore ways to qualify these parts for use in human spaceflight launch critical applications, including grossly derating them and possibly treating hybrid parts as modules. This effort is ongoing, but results thus far include identification of over 60 HV-IGBT from four manufacturers, including some with a high reliability process flow. Voltage ranges for HV-IGBT have been identified, as have the screening tests used to characterize HV-IGBT. BSI BS ISO 21350, "Space systems - Off-the-shelf item utilization," developed from Marshall Work Instruction MWI8060.1, "Off-the-Shelf Hardware Utilization in Flight Hardware Development," was found to provide guidance for including commercial off-the-shelf (COTS) hardware in critical applications.

  6. The Clinical and Economic Burden of Hyperkalemia on Medicare and Commercial Payers.

    PubMed

    Fitch, Kathryn; Woolley, J Michael; Engel, Tyler; Blumen, Helen

    2017-06-01

    Hyperkalemia (serum potassium >5.0 mEq/L) may be caused by reduced kidney function and drugs affecting the renin-angiotensin-aldosterone system and is often present in patients with chronic kidney disease (CKD). To quantify the burden of hyperkalemia in US Medicare fee-for-service and commercially insured populations using real-world claims data, focusing on prevalence, comorbidities, mortality, medical utilization, and cost. A descriptive, retrospective claims data analysis was performed on patients with hyperkalemia using the 2014 Medicare 5% sample and the 2014 Truven Health Analytics MarketScan Commercial Claims and Encounter databases. The starting study samples required patient insurance eligibility during ≥1 month in 2014. The identification of hyperkalemia and other comorbidities required having ≥1 qualifying claim in 2014 with an appropriate International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code in any position. To address the differences between patients with and without hyperkalemia, CKD subsamples were analyzed separately. Mortality rates were calculated in the Medicare sample population only. The claims were grouped into major service categories; the allowed costs reflected all costs incurred by each cohort divided by the total number of member months for that cohort. The prevalence of hyperkalemia in the Medicare and commercially insured samples was 2.3% and 0.09%, respectively. Hyperkalemia was associated with multiple comorbidities, most notably CKD. The prevalence of CKD in the Medicare and the commercially insured members with hyperkalemia was 64.8% and 31.8%, respectively. After adjusting for CKD severity, the annual mortality rate for Medicare patients with CKD and hyperkalemia was 24.9% versus 10.4% in patients with CKD without hyperkalemia. The allowed costs in patients with CKD and hyperkalemia in the Medicare and commercially insured cohorts were more than twice those in patients with CKD without hyperkalemia. Inpatient care accounted for >50% of costs in patients with CKD and hyperkalemia. Hyperkalemia is associated with substantial clinical and economic burden among US commercially insured and Medicare populations.

  7. Comparison of the NCI open database with seven large chemical structural databases.

    PubMed

    Voigt, J H; Bienfait, B; Wang, S; Nicklaus, M C

    2001-01-01

    Eight large chemical databases have been analyzed and compared to each other. Central to this comparison is the open National Cancer Institute (NCI) database, consisting of approximately 250 000 structures. The other databases analyzed are the Available Chemicals Directory ("ACD," from MDL, release 1.99, 3D-version); the ChemACX ("ACX," from CamSoft, Version 4.5); the Maybridge Catalog and the Asinex database (both as distributed by CamSoft as part of ChemInfo 4.5); the Sigma-Aldrich Catalog (CD-ROM, 1999 Version); the World Drug Index ("WDI," Derwent, version 1999.03); and the organic part of the Cambridge Crystallographic Database ("CSD," from Cambridge Crystallographic Data Center, 1999 Version 5.18). The database properties analyzed are internal duplication rates; compounds unique to each database; cumulative occurrence of compounds in an increasing number of databases; overlap of identical compounds between two databases; similarity overlap; diversity; and others. The crystallographic database CSD and the WDI show somewhat less overlap with the other databases than those with each other. In particular the collections of commercial compounds and compilations of vendor catalogs have a substantial degree of overlap among each other. Still, no database is completely a subset of any other, and each appears to have its own niche and thus "raison d'être". The NCI database has by far the highest number of compounds that are unique to it. Approximately 200 000 of the NCI structures were not found in any of the other analyzed databases.

  8. Prices For Common Medical Services Vary Substantially Among The Commercially Insured.

    PubMed

    Newman, David; Parente, Stephen T; Barrette, Eric; Kennedy, Kevin

    2016-05-01

    Using a national multipayer commercial claims database containing allowed amounts, we examined variations in the prices for 242 common medical services in forty-one states and the District of Columbia. Ratios of average state prices to national prices ranged from a low of 0.79 in Florida to a high of 2.64 in Alaska. Two- to threefold variations in prices were identified within some states and Metropolitan Statistical Areas. Project HOPE—The People-to-People Health Foundation, Inc.

  9. Machine Learning and Decision Support in Critical Care

    PubMed Central

    Johnson, Alistair E. W.; Ghassemi, Mohammad M.; Nemati, Shamim; Niehaus, Katherine E.; Clifton, David A.; Clifford, Gari D.

    2016-01-01

    Clinical data management systems typically provide caregiver teams with useful information, derived from large, sometimes highly heterogeneous, data sources that are often changing dynamically. Over the last decade there has been a significant surge in interest in using these data sources, from simply re-using the standard clinical databases for event prediction or decision support, to including dynamic and patient-specific information into clinical monitoring and prediction problems. However, in most cases, commercial clinical databases have been designed to document clinical activity for reporting, liability and billing reasons, rather than for developing new algorithms. With increasing excitement surrounding “secondary use of medical records” and “Big Data” analytics, it is important to understand the limitations of current databases and what needs to change in order to enter an era of “precision medicine.” This review article covers many of the issues involved in the collection and preprocessing of critical care data. The three challenges in critical care are considered: compartmentalization, corruption, and complexity. A range of applications addressing these issues are covered, including the modernization of static acuity scoring; on-line patient tracking; personalized prediction and risk assessment; artifact detection; state estimation; and incorporation of multimodal data sources such as genomic and free text data. PMID:27765959

  10. Development of an Ada programming support environment database SEAD (Software Engineering and Ada Database) administration manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.

  11. Source attribution using FLEXPART and carbon monoxide emission inventories for the IAGOS In-situ Observation database

    NASA Astrophysics Data System (ADS)

    Fontaine, Alain; Sauvage, Bastien; Pétetin, Hervé; Auby, Antoine; Boulanger, Damien; Thouret, Valerie

    2016-04-01

    Since 1994, the IAGOS program (In-Service Aircraft for a Global Observing System, http://www.iagos.org) and its predecessor MOZAIC have produced in-situ measurements of atmospheric composition during more than 46,000 commercial aircraft flights. In order to help analyse these observations and further understand the processes driving their evolution, we developed a modelling tool, SOFT-IO, that quantifies their source/receptor links. We improved the methodology of Stohl et al. (2003), based on the FLEXPART plume dispersion model, to simulate the contributions of anthropogenic and biomass burning emissions from the ECCAD database (http://eccad.aeris-data.fr) to the measured carbon monoxide mixing ratio along each IAGOS flight. Thanks to automated processes, contributions are simulated for the last 20 days before each observation, separating individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database showing the geographical origin and emission type of the pollutants. Using this information, it may be possible to link trends in atmospheric composition to changes in transport pathways and to the evolution of emissions. This tool could be used for statistical validation as well as for inter-comparisons of emission inventories using large amounts of data, as Lagrangian models are able to bring global scale emissions down to a smaller scale, where they can be directly compared to the in-situ observations in the IAGOS database.

  12. Mitochondrial DNA identification of game and harvested freshwater fish species.

    PubMed

    Kyle, C J; Wilson, C C

    2007-02-14

    The use of DNA in forensics has grown rapidly for human applications along with the concomitant development of bioinformatics and demographic databases to help fully realize the potential of this molecular information. Similar techniques are also used routinely in many wildlife cases, such as species identification in food products, poaching and the illegal trade of endangered species. The use of molecular techniques in forensic cases related to wildlife and the development of associated databases has, however, mainly focused on large mammals, with the exception of a few high-profile species. There is a need to develop similar databases for aquatic species for fisheries enforcement, given the large number of exploited and endangered fish species, the intensity of exploitation, and the challenges in identifying species and their derived products. We sequenced a 500 bp fragment of the mitochondrial cytochrome b gene from representative individuals of 26 harvested fish taxa from Ontario, Canada, focusing on species that support major commercial and recreational fisheries. Ontario provides a unique model system for the development of a fish species database, as the province contains an evolutionarily diverse array of freshwater fish families representing more than one third of all freshwater fish in Canada. Inter- and intraspecific sequence comparisons using phylogenetic analysis and a BLAST search algorithm provided rigorous statistical metrics for species identification. This methodology and these data will aid in fisheries enforcement, providing a tool to easily and accurately identify fish species in enforcement investigations that would have otherwise been difficult or impossible to pursue.
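
    The species-assignment step amounts to finding the reference sequence that best matches a query fragment. The sketch below is a toy stand-in for the BLAST/phylogenetic analysis actually used: the sequences are invented short strings, not real cytochrome b fragments, and a simple site-by-site percent identity replaces a proper alignment.

      references = {
          "Sander vitreus (walleye)":    "ATGGCACTAGCCTACATCCTT",
          "Esox lucius (northern pike)": "ATGGCTCTCGCTTACATTCTA",
      }

      def percent_identity(a, b):
          """Site-by-site identity over an equal-length alignment."""
          assert len(a) == len(b)
          return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

      def identify(query):
          """Assign the query to the reference species with the best score."""
          scores = {sp: percent_identity(query, ref) for sp, ref in references.items()}
          best = max(scores, key=scores.get)
          return best, scores[best]

      query = "ATGGCACTAGCTTACATCCTT"
      print(identify(query))  # ('Sander vitreus (walleye)', ~95% identity)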

  13. Clinical Variant Classification: A Comparison of Public Databases and a Commercial Testing Laboratory.

    PubMed

    Gradishar, William; Johnson, KariAnne; Brown, Krystal; Mundt, Erin; Manley, Susan

    2017-07-01

    There is a growing move to consult public databases following receipt of a genetic test result from a clinical laboratory; however, the well-documented limitations of these databases call into question how often clinicians will encounter discordant variant classifications that may introduce uncertainty into patient management. Here, we evaluate discordance in BRCA1 and BRCA2 variant classifications between a single commercial testing laboratory and a public database commonly consulted in clinical practice. BRCA1 and BRCA2 variant classifications were obtained from ClinVar and compared with the classifications from a reference laboratory. Full concordance and discordance were determined for variants whose ClinVar entries were of the same pathogenicity (pathogenic, benign, or uncertain). Variants with conflicting ClinVar classifications were considered partially concordant if ≥1 of the listed classifications agreed with the reference laboratory classification. Four thousand two hundred and fifty unique BRCA1 and BRCA2 variants were available for analysis. Overall, 73.2% of classifications were fully concordant and 12.3% were partially concordant. The remaining 14.5% of variants had discordant classifications, most of which had a definitive classification (pathogenic or benign) from the reference laboratory compared with an uncertain classification in ClinVar (14.0%). Here, we show that discrepant classifications between a public database and single reference laboratory potentially account for 26.7% of variants in BRCA1 and BRCA2. The time and expertise required of clinicians to research these discordant classifications call into question the practicality of checking all test results against a database and suggest that discordant classifications should be interpreted with these limitations in mind. With the increasing use of clinical genetic testing for hereditary cancer risk, accurate variant classification is vital to ensuring appropriate medical management. There is a growing move to consult public databases following receipt of a genetic test result from a clinical laboratory; however, we show that up to 26.7% of variants in BRCA1 and BRCA2 have discordant classifications between ClinVar and a reference laboratory. The findings presented in this paper serve as a note of caution regarding the utility of database consultation. © AlphaMed Press 2017.
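
    The concordance categories used above follow directly from comparing the set of ClinVar classifications for a variant against the single reference laboratory classification. The sketch below shows that logic; the variant entries are invented examples, not records from the study.

      def concordance(reference, clinvar_entries):
          """Return 'full', 'partial', or 'discordant' for one variant."""
          unique = set(clinvar_entries)
          if unique == {reference}:
              return "full"        # every ClinVar entry agrees with the laboratory
          if reference in unique:
              return "partial"     # conflicting entries, at least one agrees
          return "discordant"      # no ClinVar entry agrees

      variants = {
          "variant A": ("pathogenic", ["pathogenic", "pathogenic"]),
          "variant B": ("benign", ["benign", "uncertain"]),
          "variant C": ("pathogenic", ["uncertain"]),
      }

      for name, (lab, clinvar) in variants.items():
          print(name, "->", concordance(lab, clinvar))
      # variant A -> full; variant B -> partial; variant C -> discordant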

  14. Evaluation of contents-based image retrieval methods for a database of logos on drug tablets

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Hardy, Huub; Poortman, Anneke; Bijhold, Jurrien

    2001-02-01

    In this research, an evaluation has been made of different methods for contents-based image retrieval of logos on drug tablets. On a database of 432 illicitly produced tablets (mostly containing MDMA), we compared different retrieval methods. Two of these methods were available from the commercial packages QBIC and Imatch, in which the implementations of the contents-based image retrieval methods are not exactly known. We compared the results for this database with the MPEG-7 shape comparison methods, which are the contour-shape, bounding box and region-based shape methods. In addition, we tested the log polar method that is available from our own research.

  15. The Fragment Network: A Chemistry Recommendation Engine Built Using a Graph Database.

    PubMed

    Hall, Richard J; Murray, Christopher W; Verdonk, Marcel L

    2017-07-27

    The hit validation stage of a fragment-based drug discovery campaign involves probing the SAR around one or more fragment hits. This often requires a search for similar compounds in a corporate collection or from commercial suppliers. The Fragment Network is a graph database that allows a user to efficiently search chemical space around a compound of interest. The result set is chemically intuitive, naturally grouped by substitution pattern and meaningfully sorted according to the number of observations of each transformation in medicinal chemistry databases. This paper describes the algorithms used to construct and search the Fragment Network and provides examples of how it may be used in a drug discovery context.

  16. A multidisciplinary database for global distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, P.J.

    The issue of selenium toxicity in the environment has been documented in the scientific literature for over 50 years. Recent studies reveal a complex connection between selenium and human and animal populations. This article introduces a bibliographic citation database on selenium in the environment developed for global distribution via the Internet by the University of Wyoming Libraries. The database incorporates material from commercial sources, print abstracts, indexes, and U.S. government literature, resulting in a multidisciplinary resource. Relevant disciplines include biology, medicine, veterinary science, botany, chemistry, geology, pollution, aquatic sciences, ecology, and others. It covers the years 1985-1996 for most subject material, with additional years being added as resources permit.

  17. Preliminary Results Obtained in Integrated Safety Analysis of NASA Aviation Safety Program Technologies

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information from January 1, 2001 through March 31, 2001, available in the NASA Aeronautics and Space Database. Contents include 1) Cognitive Task Analysis; 2) RTO Educational Notes; 3) The Capability of Virtual Reality to Meet Military Requirements; 4) Aging Engines, Avionics, Subsystems and Helicopters; 5) RTO Meeting Proceedings; 6) RTO Technical Reports; 7) Low Grazing Angle Clutter...; 8) Verification and Validation Data for Computational Unsteady Aerodynamics; 9) Space Observation Technology; 10) The Human Factor in System Reliability...; 11) Flight Control Design...; 12) Commercial Off-the-Shelf Products in Defense Applications.

  18. Database tomography for commercial application

    NASA Technical Reports Server (NTRS)

    Kostoff, Ronald N.; Eberhart, Henry J.

    1994-01-01

    Database tomography is a method for extracting themes and their relationships from text. The algorithms employed begin with word-frequency and word-proximity analysis and build upon these results. Here, 'database' means any text that can be stored on a computer: medical or police records, patents, journals, papers, and so on. Database tomography features a full-text, user-interactive technique enabling the user to identify areas of interest, establish relationships, and map trends for a deeper understanding of an area of interest. Database tomography concepts and applications have been reported in journals and presented at conferences. One important feature of the database tomography algorithm is that it can be used on a database of any size and will facilitate the user's ability to understand the volume of content therein. While employing the process to identify research opportunities, it became obvious that this promising technology has potential applications for business, science, engineering, law, and academe. Examples include evaluating marketing trends, strategies, relationships and associations. The database tomography process would also be a powerful component in competitive intelligence, national security intelligence and patent analysis. User interest and involvement cannot be overemphasized.
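
    The two starting steps named in the abstract, word frequency and word proximity, are easy to make concrete. The sketch below is illustrative only, not the published database-tomography algorithm: it counts word occurrences and co-occurrences within a sliding window.

        # Word frequency and word proximity (co-occurrence within a window).
        # Illustrative sketch, not the published algorithm.
        from collections import Counter

        def frequencies_and_proximity(text, window=5):
            words = text.lower().split()
            freq = Counter(words)
            prox = Counter()
            for i, w in enumerate(words):
                for other in words[i + 1 : i + window]:
                    prox[tuple(sorted((w, other)))] += 1
            return freq, prox

        freq, prox = frequencies_and_proximity(
            "database tomography extracts themes from text using word frequency "
            "and word proximity analysis of the database text")
        print(freq.most_common(3))  # most frequent words
        print(prox.most_common(3))  # most frequent nearby pairs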

  19. DISTRIBUTED STRUCTURE-SEARCHABLE TOXICITY (DSSTOX) PUBLIC DATABASE NETWORK: A PROPOSAL

    EPA Science Inventory

    The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These dive...

  20. Scheduled Civil Aircraft Emission Inventories for 1999: Database Development and Analysis

    NASA Technical Reports Server (NTRS)

    Sutkus, Donald J., Jr.; Baughcum, Steven L.; DuBois, Douglas P.

    2001-01-01

    This report describes the development of a three-dimensional database of aircraft fuel burn and emissions (NO(x), CO, and hydrocarbons) for the scheduled commercial aircraft fleet for each month of 1999. Global totals of emissions and fuel burn for 1999 are compared to global totals from 1992 and 2015 databases. 1999 fuel burn, departure and distance totals for selected airlines are compared to data reported on DOT Form 41 to evaluate the accuracy of the calculations. DOT Form T-100 data were used to determine typical payloads for freighter aircraft and this information was used to model freighter aircraft more accurately by using more realistic payloads. Differences in the calculation methodology used to create the 1999 fuel burn and emissions database from the methodology used in previous work are described and evaluated.
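
    The database the report describes is a three-dimensional grid. As a hedged sketch of the bookkeeping involved (the grid resolution is taken from the report; the function and field names are hypothetical), flight-segment results can be accumulated into 1-degree latitude by 1-degree longitude by 1-kilometer altitude cells:

        # Accumulate per-segment fuel burn into a lat x lon x altitude grid.
        import numpy as np

        grid = np.zeros((180, 360, 25))  # 1-deg lat, 1-deg lon, 1-km layers

        def deposit(lat, lon, alt_km, fuel_kg):
            """Add one flight segment's fuel burn to its grid cell."""
            i = int(np.clip(lat + 90, 0, 179))   # -90..90   -> row 0..179
            j = int(np.clip(lon + 180, 0, 359))  # -180..180 -> col 0..359
            k = int(np.clip(alt_km, 0, 24))
            grid[i, j, k] += fuel_kg

        deposit(43.6, 1.4, 10.2, 1250.0)  # hypothetical cruise segment
        print(grid.sum())                 # global total, comparable across years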

  1. Transcriptome Analysis and Differential Gene Expression on the Testis of Orange Mud Crab, Scylla olivacea, during Sexual Maturation

    PubMed Central

    Waiho, Khor; Fazhan, Hanafiah; Shahreza, Md Sheriff; Moh, Julia Hwei Zhong; Noorbaiduri, Shaibani; Wong, Li Lian; Sinnasamy, Saranya

    2017-01-01

    Adequate genetic information is essential for sustainable crustacean fisheries and aquaculture management. The commercially important orange mud crab, Scylla olivacea, is prevalent in the Southeast Asian region and is highly sought after. Although it is a suitable aquaculture candidate, full domestication of this species is hampered by the lack of knowledge about the sexual maturation process and the molecular mechanisms behind it, especially in males. To date, no whole-genome data have been reported for S. olivacea, and the transcriptome data published previously on this species focus primarily on females and the role of the central nervous system in reproductive development. De novo transcriptome sequencing of the testes of S. olivacea from immature, maturing and mature stages was performed. A total of approximately 144 million high-quality reads were generated and de novo assembled into 160,569 transcripts with a total length of 142.2 Mb. Approximately 15–23% of the total assembled transcripts were annotated by comparison with public protein sequence databases (i.e. the UniProt, InterPro, Pfam and Drosophila melanogaster protein databases) and categorised with Gene Ontology (GO) terms. A total of 156,181 high-quality single-nucleotide polymorphisms (SNPs) were mined from the transcriptome data of the present study. Transcriptome comparison among the testes of different maturation stages revealed one gene (a beta-crystallin-like gene) with the most significant differential expression: up-regulated in the immature stage and down-regulated in the maturing and mature stages. This was further validated by qRT-PCR. In conclusion, a comprehensive transcriptome of the testis of the orange mud crab across different maturation stages was obtained. This report provides an invaluable resource for enhancing our understanding of this species' genome structure and biology, as expressed and controlled by the gonads. PMID:28135340

  2. Managing operational documentation in the ALICE Detector Control System

    NASA Astrophysics Data System (ADS)

    Lechman, M.; Augustinus, A.; Bond, P.; Chochula, P.; Kurepin, A.; Pinazza, O.; Rosinsky, P.

    2012-12-01

    ALICE (A Large Ion Collider Experiment) is one of the large LHC (Large Hadron Collider) experiments at CERN in Geneva, Switzerland. The experiment is composed of 18 sub-detectors controlled by an integrated Detector Control System (DCS) that is implemented using the commercial SCADA package PVSSII. The DCS includes over 1,200 network devices and over 1,000,000 monitored parameters, as well as numerous custom-made software components prepared by over 100 developers from around the world. This complex system is controlled by a single operator via a central user interface. One of his/her main tasks is recovery from anomalies and errors that may occur during operation. Therefore, clear, complete and easily accessible documentation is essential to guide the shifter through the expert interfaces of the different subsystems. This paper describes the management of the operational documentation in ALICE using a generic repository that is built on a relational database and is integrated with the control system. The experience gained and the conclusions drawn from the project are also presented.

  3. Identification and evaluation of fluvial-dominated deltaic (Class 1 oil) reservoirs in Oklahoma. Yearly technical progress report, January 1--December 31, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mankin, C.J.; Banken, M.K.

    The Oklahoma Geological Survey (OGS), the Geological Information Systems department, and the School of Petroleum and Geological Engineering at the University of Oklahoma are engaged in a five-year program to identify and address Oklahoma's oil recovery opportunities in fluvial-dominated deltaic (FDD) reservoirs. This program includes the systematic and comprehensive collection, evaluation, and distribution of information on all of Oklahoma's FDD oil reservoirs and the recovery technologies that can be applied to those reservoirs with commercial success. Exhaustive literature searches are being conducted for these plays, both through published sources and through unpublished theses from regional universities. A bibliographic database has been developed to record these literature sources and their related plays. Trend maps are being developed to identify the FDD portions of the relevant reservoirs, by accessing current production databases and by compiling the literature results. A reservoir database system has also been developed to record specific reservoir data elements identified through the literature and through public and private data sources. Thus far, the initial demonstration for one play has been completed, and a second is nearly complete. All of the information gathered through these efforts will be transferred to the Oklahoma petroleum industry through a series of publications and workshops. Additionally, plans are being developed, and hardware and software resources are being acquired, in preparation for the opening of a publicly accessible computer users laboratory, one component of the technology transfer program.

  4. The exploration of the exhibition informatization

    NASA Astrophysics Data System (ADS)

    Zhang, Jiankang

    2017-06-01

    The construction and management of exhibition informatization is the main task, and a bottleneck, in the transformation and upgrading of the Chinese exhibition industry. Three key points offer a breakthrough in the construction of Chinese exhibition informatization: adopting service outsourcing to build and maintain the database, adopting advanced chest-card technology to collect various kinds of information, and applying statistical analysis to maintain good customer relations. Success mainly calls for mature suppliers who can provide construction and maintenance of the database, proven technology, attention to data security, advanced chest-card technology, the ability to mine and analyse data, and the ability to improve exhibition services based on the commercial information obtained from that analysis. Several data-security measures should be applied during system development, covering terminal, network, media, storage and application data security. The informatization of this process is based on the chest-card design; at present, several types of chest-card technology exist: bar-code cards, two-dimensional (QR) code cards, magnetic-stripe cards and smart-chip cards. The information obtained from exhibition data helps organizers formulate relevant service strategies, quantify accumulated customer indexes, and improve customer satisfaction and loyalty; it can also support additional services such as commercial trips and VIP ceremonial reception.

  5. The prevalence of Listeria spp. food contamination in Iran: A systematic review and meta-analysis.

    PubMed

    Hamidiyan, Negar; Salehi-Abargouei, Amin; Rezaei, Zeynab; Dehghani-Tafti, Roohollah; Akrami-Mohajeri, Fateme

    2018-05-01

    Listeria monocytogenes can cause circling disease, encephalitis, meningitis, septicemia, and mastitis in dairy cattle. Environmental contamination can introduce Listeria spp. into foods, and consumption of foods containing L. monocytogenes can lead to listeriosis in susceptible people, such as adults with a compromised immune system, pregnant women, and infants. The objective of this study was to determine the prevalence of Listeria spp. and L. monocytogenes in various foods in Iran. We searched PubMed, ScienceDirect, Scopus, Google Scholar, and Iranian local databases, including the Scientific Information Database and Magiran, for relevant studies up to May 2015 using related keywords. In our preliminary search, we retrieved 1344 articles. After removing duplicates and reviewing titles/abstracts, 117 articles were considered, of which 75 had sufficient quality for inclusion in this meta-analysis. The prevalence of Listeria spp. contamination was about 18.3% in poultry, 8.5% in raw meat, 14.6% in ready-to-eat (RTE) foods, 10% in seafood, 7.3% in traditional dairy, 3.2% in commercial dairy, and 0.1% in eggs. L. monocytogenes was most prevalent in RTE foods (9.2%), followed by seafood (5.1%), poultry (5%), traditional dairy (4%), raw meat (2.6%), commercial dairy (1.4%), and eggs (0.2%). The presence of L. monocytogenes, particularly in RTE foods (which are consumed without further heat processing) and under-cooked products, is a potential risk to public health, so contamination should be controlled at all levels of the food chain. Copyright © 2018. Published by Elsevier Ltd.

  6. Mars Colony in situ resource utilization: An integrated architecture and economics model

    NASA Astrophysics Data System (ADS)

    Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff

    2017-09-01

    This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
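
    The economics model couples cost and market uncertainty through Monte Carlo sampling. A minimal sketch of that step follows; every number is a placeholder rather than a value from the paper, and the distributions are assumptions.

        # Monte Carlo net-present-value sketch for a water-mining operation.
        # Placeholder distributions and costs; illustrative only.
        import numpy as np

        rng = np.random.default_rng(0)
        n, years, discount = 10_000, 20, 0.08

        price = rng.lognormal(mean=np.log(5000), sigma=0.4, size=n)      # $/tonne
        output = rng.normal(loc=10_000, scale=2_000, size=n).clip(min=0)  # tonnes/yr
        capex, opex = 2e8, 5e6                                            # $, $/yr

        annuity = sum(1 / (1 + discount) ** t for t in range(1, years + 1))
        npv = (price * output - opex) * annuity - capex
        print(f"P(profitable) = {(npv > 0).mean():.2%}, "
              f"median NPV = ${np.median(npv):,.0f}")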

  7. Sodium monitoring in commercially processed and restaurant foods.

    PubMed

    Ahuja, Jaspreet K C; Pehrsson, Pamela R; Haytowitz, David B; Wasswa-Kintu, Shirley; Nickle, Melissa; Showell, Bethany; Thomas, Robin; Roseland, Janet; Williams, Juhi; Khan, Mona; Nguyen, Quynhanh; Hoy, Kathy; Martin, Carrie; Rhodes, Donna; Moshfegh, Alanna; Gillespie, Cathleen; Gunn, Janelle; Merritt, Robert; Cogswell, Mary

    2015-03-01

    Most sodium in the US diet comes from commercially processed and restaurant foods. Sodium reduction in these foods is key to several recent public health efforts. The objective was to provide an overview of a program led by the USDA, in partnership with other government agencies, to monitor sodium contents in commercially processed and restaurant foods in the United States. We also present comparisons of nutrients generated under the program to older data. We track ∼125 commercially processed and restaurant food items ("sentinel foods") annually using information from food manufacturers and periodically by nationwide sampling and laboratory analyses. In addition, we monitor >1100 other commercially processed and restaurant food items, termed "priority-2 foods" (P2Fs), biennially by using information from food manufacturers. These foods serve as indicators for assessing changes in the sodium content of commercially processed and restaurant foods in the United States. We sampled all sentinel foods nationwide and reviewed all P2Fs in 2010-2013 to determine baseline sodium concentrations. We updated sodium values for 73 sentinel foods and 551 P2Fs in the USDA's National Nutrient Database for Standard Reference (releases 23-26). Sodium values changed by at least 10% for 43 of the sentinel foods; for 31 of these foods, including commonly consumed items such as bread, tomato catsup, and potato chips, the newer sodium values were lower. Changes in the concentrations of related nutrients (total and saturated fat, total sugar, potassium, or dietary fiber) that were recommended by the 2010 Dietary Guidelines for Americans for reduced or increased consumption accompanied sodium reduction. The results of sodium reduction efforts, based on resampling of the sentinel foods or re-review of P2Fs, will become available beginning in 2015. This monitoring program tracks sodium reduction efforts, improves food composition databases, and strengthens national nutrition monitoring. © 2015 American Society for Nutrition.

  8. Sodium monitoring in commercially processed and restaurant foods

    PubMed Central

    Ahuja, Jaspreet KC; Pehrsson, Pamela R; Haytowitz, David B; Wasswa-Kintu, Shirley; Nickle, Melissa; Showell, Bethany; Thomas, Robin; Roseland, Janet; Williams, Juhi; Khan, Mona; Nguyen, Quynhanh; Hoy, Kathy; Martin, Carrie; Rhodes, Donna; Moshfegh, Alanna; Gillespie, Cathleen; Gunn, Janelle; Merritt, Robert; Cogswell, Mary

    2015-01-01

    Background Most sodium in the US diet comes from commercially processed and restaurant foods. Sodium reduction in these foods is key to several recent public health efforts. Objective The objective was to provide an overview of a program led by the USDA, in partnership with other government agencies, to monitor sodium contents in commercially processed and restaurant foods in the United States. We also present comparisons of nutrients generated under the program to older data. Design We track ∼125 commercially processed and restaurant food items (“sentinel foods”) annually using information from food manufacturers and periodically by nationwide sampling and laboratory analyses. In addition, we monitor >1100 other commercially processed and restaurant food items, termed “priority-2 foods” (P2Fs), biennially by using information from food manufacturers. These foods serve as indicators for assessing changes in the sodium content of commercially processed and restaurant foods in the United States. We sampled all sentinel foods nationwide and reviewed all P2Fs in 2010–2013 to determine baseline sodium concentrations. Results We updated sodium values for 73 sentinel foods and 551 P2Fs in the USDA’s National Nutrient Database for Standard Reference (releases 23–26). Sodium values changed by at least 10% for 43 of the sentinel foods; for 31 of these foods, including commonly consumed items such as bread, tomato catsup, and potato chips, the newer sodium values were lower. Changes in the concentrations of related nutrients (total and saturated fat, total sugar, potassium, or dietary fiber) that were recommended by the 2010 Dietary Guidelines for Americans for reduced or increased consumption accompanied sodium reduction. The results of sodium reduction efforts, based on resampling of the sentinel foods or re-review of P2Fs, will become available beginning in 2015. Conclusion This monitoring program tracks sodium reduction efforts, improves food composition databases, and strengthens national nutrition monitoring. PMID:25733648

  9. Patent databases and analytical tools for space technology commercialization (Part 2)

    NASA Astrophysics Data System (ADS)

    Hulsey, William N., III

    2002-07-01

    A shift in the space industry has occurred that requires technology developers to understand the basics of intellectual property law. Global harmonization facilitates this understanding, and internet-based tools enable knowledge of these rights and the facts affecting them.

  10. Business Information Centres: New Resources Are Not Used.

    ERIC Educational Resources Information Center

    Drummond, Janet

    1984-01-01

    Presents findings from survey of Canadian information centers specializing in business, economics, or finance (corporate library, government department library, fee-based service, commercial database, association information center). Questions focused on three broad categories: human resources organization, relative use of different types of…

  11. PDAs and the Library Without a Roof.

    ERIC Educational Resources Information Center

    Foster, Clifton Dale

    1995-01-01

    A project demonstrated the feasibility of accessing library information (online public access catalogs, commercial online databases, Internet) from a distance using handheld personal digital assistants (PDAs) equipped with cellular communication capability. The study is described, and other uses of wireless communications in libraries and…

  12. Alliance Building in the Information and Online Database Industry.

    ERIC Educational Resources Information Center

    Alexander, Johanna Olson

    2001-01-01

    Presents an analysis of information industry alliance formation using environmental scanning methods. Highlights include why libraries and academic institutions should be interested; a literature review; historical context; industry and market structures; commercial and academic models; trends; and implications for information providers,…

  13. Library information services.

    NASA Astrophysics Data System (ADS)

    Michold, U.; Cummins, M.; Watson, J. M.; Holmquist, J.; Shobbrook, R.

    Contents: library catalogs and holdings; indexing and abstract services; preprint services; electronic journals and newsletters; alerting services; commercial databases; informal networking; use of a thesaurus for on-line searching. An extensive list of access pointers for library catalogs and services, electronic newsletters, and publishers and bookshops is enclosed.

  14. 47 CFR 64.5105 - Use of customer proprietary network information without customer approval.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... calls; (ii) Access, either directly or via a third party, a commercially available database that will... permit access to CPNI upon request by the administrator of the TRS Fund, as that term is defined in § 64...

  15. 47 CFR 64.5105 - Use of customer proprietary network information without customer approval.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... calls; (ii) Access, either directly or via a third party, a commercially available database that will... permit access to CPNI upon request by the administrator of the TRS Fund, as that term is defined in § 64...

  16. Rating the Relevance of QUORUM-Selected ASRS Incident Narratives to a "Controlled Flight into Terrain" Accident

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.; Statler, Irving C.

    1998-01-01

    An exploratory study was conducted to identify commercial aviation incidents that are relevant to a "controlled flight into terrain" (CFIT) accident using a NASA-developed text processing method. The QUORUM method was used to rate 67,820 incident narratives, virtually all of the narratives in the Aviation Safety Reporting System (ASRS) database, according to their relevance to two official reports on the crash of American Airlines Flight 965 near Cali, Colombia in December 1995. For comparison with QUORUM's ratings, three experienced ASRS analysts read the reports of the crash and independently rated the relevance of the 100 narratives that were most highly rated by QUORUM, as well as 100 narratives randomly selected from the database. Eighty-four of the 100 QUORUM-selected narratives were rated as relevant to the Cali accident by one or more of the analysts. The relevant incidents involved a variety of factors, including over-reliance on automation, confusion and changes during descent/approach, terrain avoidance, and operations in foreign airspace. In addition, the QUORUM collection of incidents was found to be significantly more relevant than the random collection.
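
    QUORUM's own model is proximity-based; as a stand-in illustration of ranking narratives by relevance to a report, the sketch below uses TF-IDF cosine similarity (a different, simpler technique) with scikit-learn and invented snippets.

        # Rank narratives against a report by TF-IDF cosine similarity.
        # Stand-in technique, not QUORUM; texts are invented.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        narratives = [
            "crew confusion during descent after FMS waypoint change",
            "routine gate return due to paperwork",
            "terrain warning on approach in mountainous foreign airspace",
        ]
        report = "descent confusion, terrain proximity, reliance on automation"

        vec = TfidfVectorizer().fit(narratives + [report])
        scores = cosine_similarity(vec.transform([report]),
                                   vec.transform(narratives))[0]
        for score, text in sorted(zip(scores, narratives), reverse=True):
            print(f"{score:.3f}  {text}")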

  17. Andromeda: a peptide search engine integrated into the MaxQuant environment.

    PubMed

    Cox, Jürgen; Neuhauser, Nadin; Michalski, Annette; Scheltema, Richard A; Olsen, Jesper V; Mann, Matthias

    2011-04-01

    A key step in mass spectrometry (MS)-based proteomics is the identification of peptides in sequence databases by their fragmentation spectra. Here we describe Andromeda, a novel peptide search engine using a probabilistic scoring model. On proteome data, Andromeda performs as well as Mascot, a widely used commercial search engine, as judged by sensitivity and specificity analysis based on target decoy searches. Furthermore, it can handle data with arbitrarily high fragment mass accuracy, is able to assign and score complex patterns of post-translational modifications, such as highly phosphorylated peptides, and accommodates extremely large databases. The algorithms of Andromeda are provided. Andromeda can function independently or as an integrated search engine of the widely used MaxQuant computational proteomics platform and both are freely available at www.maxquant.org. The combination enables analysis of large data sets in a simple analysis workflow on a desktop computer. For searching individual spectra Andromeda is also accessible via a web server. We demonstrate the flexibility of the system by implementing the capability to identify cofragmented peptides, significantly improving the total number of identified peptides.
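
    For a flavour of probabilistic fragment-match scoring (a simplified sketch, not Andromeda's published model), one can score a peptide-spectrum match as the binomial tail probability of k of n theoretical fragment ions matching by chance:

        # Binomial peptide-spectrum match score, simplified for illustration.
        from math import comb, log10

        def match_score(n, k, p=0.04):
            """-10*log10 P(at least k of n fragments match by chance);
            p is an assumed per-fragment chance-match probability."""
            tail = sum(comb(n, j) * p**j * (1 - p) ** (n - j)
                       for j in range(k, n + 1))
            return -10 * log10(max(tail, 1e-300))

        print(match_score(n=20, k=8))  # many matches -> high score
        print(match_score(n=20, k=2))  # few matches  -> low score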

  18. Variations in isoflavone levels in soy foods and soy protein isolates and issues related to isoflavone databases and food labeling.

    PubMed

    Setchell, Kenneth D R; Cole, Sidney J

    2003-07-02

    The reliability of databases on the isoflavone composition of foods designed to estimate dietary intakes is contingent on the assumption that soy foods are consistent in their isoflavone content. To validate this, total and individual isoflavone compositions were determined by HPLC for two different soy protein isolates used in the commercial manufacture of soy foods over a 3-year period (n = 30/isolate) and 85 samples of 40 different brands of soy milks. Total isoflavone concentrations differed markedly between the soy protein isolates, varying by 200-300% over 3 years, whereas the protein content varied by only 3%. Total isoflavone content varied by up to 5-fold among different commercial soy milks and was not consistent between repeat purchases. Whole soybean milks had significantly higher isoflavone levels than those made from soy protein isolates (mean ± SD, 63.6 ± 21.9 mg/L, n = 43, vs 30.2 ± 5.8 mg/L, n = 38, respectively, p < 0.0001), although some isolated soy protein-based milks were similar in content to "whole bean" varieties. The ratio of genistein to daidzein isoflavone forms was higher in isolated soy protein-based versus "whole bean" soy milks (2.72 ± 0.24 vs 1.62 ± 0.47, respectively, p < 0.0001), and the greatest variability in isoflavone content was observed among brands of whole bean soy milks. These studies illustrate large variability in the isoflavone content of isolated soy proteins used in food manufacture and in commercial soy milks and reinforce the need to accurately determine the isoflavone content of foods used in dietary intervention studies while exposing the limitations of food databases for estimating daily isoflavone intakes.

  19. Evaluation of a Computerized Clinical Information System (Micromedex).

    PubMed Central

    Lundsgaarde, H. P.; Moreshead, G. E.

    1991-01-01

    This paper summarizes data collected as part of a project designed to identify and assess the technical and organizational problems associated with the implementation and evaluation of a Computerized Clinical Information System (CCIS), Micromedex, in three U.S. Department of Veterans Affairs Medical Centers (VAMCs). The study began in 1987 as a national effort to implement decision support technologies in the Veterans Administration Decentralized Hospital Computer Program (DHCP). The specific objectives of this project were to (1) examine one particular decision support technology, (2) identify the technical and organizational barriers to the implementation of a CCIS in the VA host environment, (3) assess the possible benefits of this system to VA clinicians in terms of therapeutic decision making, and (4) develop new methods for identifying the clinical utility of a computer program designed to provide clinicians with a new information tool. The project was conducted intermittently over a three-year period at three VA medical centers chosen as implementation and evaluation test sites for Micromedex. Findings from the Kansas City Medical Center in Missouri are presented to illustrate some of the technical problems associated with the implementation of a commercial database program in the DHCP host environment, the organizational factors influencing clinical use of the system, and the methods used to evaluate its use. Data from 4581 provider encounters with the CCIS are summarized. Usage statistics are presented to illustrate the methodological possibilities for assessing the "benefits and burdens" of a computerized information system by using an automated collection of user demographics and program audit trails that allow evaluators to monitor user interactions with different segments of the database. PMID:1807583

  20. Evaluation of a Computerized Clinical Information System (Micromedex).

    PubMed

    Lundsgaarde, H P; Moreshead, G E

    1991-01-01

    This paper summarizes data collected as part of a project designed to identify and assess the technical and organizational problems associated with the implementation and evaluation of a Computerized Clinical Information System (CCIS), Micromedex, in three U.S. Department of Veterans Affairs Medical Centers (VAMCs). The study began in 1987 as a national effort to implement decision support technologies in the Veterans Administration Decentralized Hospital Computer Program (DHCP). The specific objectives of this project were to (1) examine one particular decision support technology, (2) identify the technical and organizational barriers to the implementation of a CCIS in the VA host environment, (3) assess the possible benefits of this system to VA clinicians in terms of therapeutic decision making, and (4) develop new methods for identifying the clinical utility of a computer program designed to provide clinicians with a new information tool. The project was conducted intermittently over a three-year period at three VA medical centers chosen as implementation and evaluation test sites for Micromedex. Findings from the Kansas City Medical Center in Missouri are presented to illustrate some of the technical problems associated with the implementation of a commercial database program in the DHCP host environment, the organizational factors influencing clinical use of the system, and the methods used to evaluate its use. Data from 4581 provider encounters with the CCIS are summarized. Usage statistics are presented to illustrate the methodological possibilities for assessing the "benefits and burdens" of a computerized information system by using an automated collection of user demographics and program audit trails that allow evaluators to monitor user interactions with different segments of the database.

  1. Database on Performance of Neutron Irradiated FeCrAl Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, Kevin G.; Briggs, Samuel A.; Littrell, Ken

    The present report summarizes and discusses the database on radiation tolerance for Generation I, Generation II, and commercial FeCrAl alloys. This database has been built upon mechanical testing and microstructural characterization of selected alloys irradiated within the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) up to doses of 13.8 dpa at temperatures ranging from 200°C to 550°C. The structure and performance of these irradiated alloys were characterized using advanced microstructural characterization techniques and mechanical testing. The primary objective of developing this database is to enhance the rapid development of a mechanistic understanding of the radiation tolerance of FeCrAl alloys, thereby enabling informed decisions on the optimization of composition and microstructure of FeCrAl alloys for application as an accident tolerant fuel (ATF) cladding. This report is structured to provide a brief summary of critical results related to the database on radiation tolerance of FeCrAl alloys.

  2. Sugar composition of French royal jelly for comparison with commercial and artificial sugar samples.

    PubMed

    Daniele, Gaëlle; Casabianca, Hervé

    2012-09-15

    A gas chromatographic method was developed to quantify the major and minor sugars of 400 Royal Jellies (RJs). Their contents were compared in relation to geographical origin and different production methods. A reliable database was established from the analysis of 290 RJs harvested in different French areas, taking into account the diversity of geographical origin, harvesting season, and the forage sources available in the environment corresponding to the natural food of the bees: pollen and nectar. Around 30 RJ samples produced by Italian beekeepers, about 60 from the French market, and around 30 derived from feeding experiments were analysed and compared with our database. Fructose and glucose contents are in the range 2.3-7.8% and 3.4-7.7%, respectively, whatever the RJ's origin. On the contrary, differences in minor sugar composition are observed. Indeed, sucrose and erlose contents in French RJs are less than 1.7% and 0.3%, respectively, whereas they reach 3.9% and 2.0% in some commercial samples and 5.1% and 1.7% in RJs produced from feeding experiments. This study could be used to discriminate different production methods and provide an additional tool for identifying unknown commercial RJs. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Video Compression

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Optivision developed two PC-compatible boards and associated software under a Goddard Space Flight Center Small Business Innovation Research grant for NASA applications in areas such as telerobotics, telesciences and spaceborne experimentation. From this technology, the company used its own funds to develop commercial products, the OPTIVideo MPEG Encoder and Decoder, which are used for realtime video compression and decompression. They are used in commercial applications including interactive video databases and video transmission. The encoder converts video source material to a compressed digital form that can be stored or transmitted, and the decoder decompresses bit streams to provide high quality playback.

  4. ZINC: A Free Tool to Discover Chemistry for Biology

    PubMed Central

    2012-01-01

    ZINC is a free public resource for ligand discovery. The database contains over twenty million commercially available molecules in biologically relevant representations that may be downloaded in popular ready-to-dock formats and subsets. The Web site also enables searches by structure, biological activity, physical property, vendor, catalog number, name, and CAS number. Small custom subsets may be created, edited, shared, docked, downloaded, and conveyed to a vendor for purchase. The database is maintained and curated for a high purchasing success rate and is freely available at zinc.docking.org. PMID:22587354
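
    The structure searches ZINC exposes through its web site can be imitated locally; as a hedged example (assuming the RDKit cheminformatics toolkit, which is not part of ZINC), a substructure query over a few SMILES strings looks like this:

        # Local substructure search over a toy catalogue (RDKit assumed).
        from rdkit import Chem

        catalogue = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]  # SMILES
        query = Chem.MolFromSmarts("c1ccccc1")  # aromatic six-ring

        hits = [s for s in catalogue
                if Chem.MolFromSmiles(s).HasSubstructMatch(query)]
        print(hits)  # phenol and aspirin match; ethanol does not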

  5. Biomedical databases: protecting privacy and promoting research.

    PubMed

    Wylie, Jean E; Mineau, Geraldine P

    2003-03-01

    When combined with medical information, large electronic databases of information that identify individuals provide superlative resources for genetic, epidemiology and other biomedical research. Such research resources increasingly need to balance the protection of privacy and confidentiality with the promotion of research. Models that do not allow the use of such individual-identifying information constrain research; models that involve commercial interests raise concerns about what type of access is acceptable. Researchers, individuals representing the public interest and those developing regulatory guidelines must be involved in an ongoing dialogue to identify practical models.

  6. Order Denying Review -- Hawaiian Commercial and Sugar Company Permit No. HI 89-01

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  7. Baseline information development for energy smart schools -- applied research, field testing and technology integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang; Piette, Mary Ann

    2004-08-05

    The original scope of work was to obtain and analyze existing and emerging data in four states: California, Florida, New York, and Wisconsin. The goal of this data collection was to deliver a baseline database, or recommendations for such a database, that could contain window and daylighting features and energy performance characteristics of Kindergarten through 12th grade (K-12) school buildings (or of classrooms when available). In particular, data analyses were performed on the California Commercial End-Use Survey (CEUS) databases to understand school energy use, features of window glazing, and availability of daylighting in California K-12 schools. The outcomes from this baseline task can be used to assist in establishing a database of school energy performance, assessing applications of existing technologies relevant to window and daylighting design, and identifying future R&D needs. These are in line with the overall project goals as outlined in the proposal. Through the review and analysis of this data, it is clear that there are many compounding factors impacting energy use in K-12 school buildings in the U.S., and that there are various challenges in understanding the impact on K-12 classroom energy use of design features of window glazing and skylights. First, the energy data in the existing CEUS databases provide, at most, aggregated electricity and/or gas usage for building establishments that include other school facilities on top of the classroom spaces. Although the percentage of classroom floor area in schools is often available from the databases, there is no additional information that can be used to quantitatively segregate the EUI for classroom spaces. In order to quantify the EUI for classrooms, sub-metered energy usage by classroom must be obtained. Second, magnitudes of energy use for electric lighting are not attainable from the existing databases, nor are the lighting levels contributed by artificial lighting or daylight. It is impossible to reasonably estimate the lighting energy consumption for classroom areas in the sample of schools studied in this project. Third, there are many other compounding factors that may influence overall classroom energy use, e.g., ventilation, insulation, system efficiency, occupancy, control, schedules, and weather. Fourth, although we examined school EUI grouped by various factors such as climate zones and the window and daylighting design features recorded in the California databases, no statistically significant associations could be identified from the sampled California K-12 schools in the current California CEUS. There are opportunities to expand such analyses by developing and including more powerful CEUS databases in the future. Finally, a list of parameters is recommended for future database development and for future investigation of K-12 classroom energy use, window and skylight design, and possible relations between them. Some of the key parameters include: (1) energy end-use data for lighting systems, classrooms, and schools; (2) building design and operation, including features for windows and daylighting; and (3) other key parameters and information needed to investigate overall energy use, building and system design, operation, and services provided.

  8. Performance appraisal of online MEDLINE access routes.

    PubMed Central

    Walker, C. J.; McKibbon, K. A.; Haynes, R. B.; Johnston, M. E.

    1992-01-01

    OBJECTIVE: To compare the performance and cost of 11 online MEDLINE systems with MEDLINE at Elhill. DESIGN: Comparative study. SYSTEMS: Eleven online daytime systems commercially available in North America offering the MEDLINE database. MEASURES: Number of relevant citations, number of irrelevant citations, proportion of searches producing no relevant citations and cost per relevant citation were analyzed for each system. Relevance and cost for each system were compared with direct searching of MEDLINE through NLM for librarian and clinician search strategies for 18 clinical questions. The citations retrieved by both strategies were pooled and rated for relevance on a 7-point scale. RESULTS: Numbers of relevant and irrelevant citations and cost per relevant citation were higher for clinician searches than librarian searches, reflecting the higher total number of citations retrieved by the clinician approaches. A lower proportion of clinician searches produced no relevant citations than librarian searches. CONCLUSIONS: Eleven daytime MEDLINE systems performed similarly in terms of retrieval and cost within similar searching groups. Clinicians, however, tended to capture larger overall retrievals resulting in higher numbers of relevant and irrelevant citations than librarians. PMID:1482922

  9. Sodium content in US packaged foods 2009

    USDA-ARS?s Scientific Manuscript database

    In 2010, the Institute of Medicine recommended food manufacturers reduce the amount of sodium in their products. Monitoring sodium in packaged foods is necessary to evaluate the impact of these efforts. Using commercially available data from Nielsen and Gladson, we created a database with sales and...

  10. Beat the Clock.

    ERIC Educational Resources Information Center

    Feinberg, Rosa Castro

    1995-01-01

    A school board member guides a tour of what is available for organizing school executives' schedules. Describes time-planner notebooks, computer software, and various academic and commercial online databases. Sidebars list the programs and products mentioned, discuss online services, and describe using a time-management planner in campaigning for…

  11. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    PubMed

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  12. Cost effective nuclear commercial grade dedication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maletz, J.J.; Marston, M.J.

    1991-01-01

    This paper describes a new computerized database method to create/edit/view specification technical data sheets (mini-specifications) for procurement of spare parts for nuclear facility maintenance and to develop information that could support possible future facility life extension efforts. This method may reduce cost when compared with current manual methods. The use of standardized technical data sheets (mini-specifications) for items of the same category improves efficiency. This method can be used for a variety of tasks, including: nuclear safety-related procurement; non-safety-related procurement; commercial grade item procurement/dedication; and evaluation of replacement items. This program will assist the nuclear facility in upgrading its procurement activities consistent with the recent NUMARC Procurement Initiative. Proper utilization of the program will assist the user in assuring that procured items are correct for their applications, provide data to assist in detecting fraudulent materials, minimize human error in withdrawing database information, improve data retrievability, improve traceability, and reduce long-term procurement costs.

  13. Stratospheric emissions effects database development

    NASA Technical Reports Server (NTRS)

    Baughcum, Steven L.; Henderson, Stephen C.; Hertel, Peter S.; Maggiora, Debra R.; Oncina, Carlos A.

    1994-01-01

    This report describes the development of a stratospheric emissions effects database (SEED) of aircraft fuel burn and emissions from projected Year 2015 subsonic aircraft fleets and from projected fleets of high-speed civil transports (HSCT's). This report also describes the development of a similar database of emissions from Year 1990 scheduled commercial passenger airline and air cargo traffic. The objective of this work was to initiate, develop, and maintain an engineering database for use by atmospheric scientists conducting the Atmospheric Effects of Stratospheric Aircraft (AESA) modeling studies. Fuel burn and emissions of nitrogen oxides (NO(x) as NO2), carbon monoxide, and hydrocarbons (as CH4) have been calculated on a 1-degree latitude x 1-degree longitude x 1-kilometer altitude grid and delivered to NASA as electronic files. This report describes the assumptions and methodology for the calculations and summarizes the results of these calculations.

  14. BioPepDB: an integrated data platform for food-derived bioactive peptides.

    PubMed

    Li, Qilin; Zhang, Chao; Chen, Hongjun; Xue, Jitong; Guo, Xiaolei; Liang, Ming; Chen, Ming

    2018-03-12

    Food-derived bioactive peptides play critical roles in regulating most biological processes and have considerable biological, medical and industrial importance. However, a large amount of active-peptide data, including sequence, function, source, commercial product information and references, is poorly integrated. BioPepDB is a searchable database of food-derived bioactive peptides and their related articles, comprising more than four thousand bioactive peptide entries. Moreover, BioPepDB provides modules of prediction and hydrolysis-simulation for discovering novel peptides. It can serve as a reference database to investigate the function of different bioactive peptides. BioPepDB is available at http://bis.zju.edu.cn/biopepdbr/. The web page utilises Apache, PHP5 and MySQL to provide the user interface for accessing the database and predicting novel peptides. The database itself is operated on a specialised server.
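
    The hydrolysis-simulation module is described but not specified; as a toy illustration of the idea (textbook trypsin cleavage rules, which are an assumption and surely simpler than the database's own), a protein can be cut in silico into candidate peptides:

        # Toy in-silico tryptic digestion: cut after K or R, not before P.
        import re

        def tryptic_peptides(protein):
            return [p for p in re.split(r"(?<=[KR])(?!P)", protein) if p]

        print(tryptic_peptides("MKWVTFISLLFLFSSAYSRGVFRRDAHK"))
        # -> ['MK', 'WVTFISLLFLFSSAYSR', 'GVFR', 'R', 'DAHK']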

  15. MEIMAN: Database exploring Medicinal and Edible insects of Manipur

    PubMed Central

    Shantibala, Tourangbam; Lokeshwari, Rajkumari; Thingnam, Gourshyam; Somkuwar, Bharat Gopalrao

    2012-01-01

    We have developed MEIMAN, a unique database on the medicinal and edible insects of Manipur, which comprises 51 insect species collected through extensive surveys and questionnaires over two years. MEIMAN provides integrated access to insect species through a sophisticated web interface with the following capabilities: a) graphical display of seasonality, b) method of preparation, c) form of use (edible or medicinal), d) habitat, e) medicinal uses, f) commercial importance and g) economic status. This database will be useful for scientific validation and updating of traditional wisdom in bioprospecting. It will be useful in analyzing insect biodiversity for the development of untapped resources and their industrialization. Further, these features suit detailed investigation of potential medicinal and edible insects, making MEIMAN a powerful tool for sustainable management. Availability: The database is freely available at www.ibsd.gov.in/meiman PMID:22715305

  16. An affordable wearable video system for emergency response training

    NASA Astrophysics Data System (ADS)

    King-Smith, Deen; Mikkilineni, Aravind; Ebert, David; Collins, Timothy; Delp, Edward J.

    2009-02-01

    Many emergency response units are currently faced with restrictive budgets that prohibit their use of advanced technology-based training solutions. Our work focuses on creating an affordable, mobile, state-of-the-art emergency response training solution through the integration of low-cost, commercially available products. The system we have developed consists of tracking, audio, and video capability, coupled with other sensors that can all be viewed through a unified visualization system. In this paper we focus on the video sub-system which helps provide real time tracking and video feeds from the training environment through a system of wearable and stationary cameras. These two camera systems interface with a management system that handles storage and indexing of the video during and after training exercises. The wearable systems enable the command center to have live video and tracking information for each trainee in the exercise. The stationary camera systems provide a fixed point of reference for viewing action during the exercise and consist of a small Linux based portable computer and mountable camera. The video management system consists of a server and database which work in tandem with a visualization application to provide real-time and after action review capability to the training system.

  17. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers, allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.

  18. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, D.; Thouret, V.

    2016-12-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core and IAGOS-CARIBIC data. The IAGOS Data Portal (http://www.iagos.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles). New added-value products are available through the portal: back trajectories, origin of air masses, and co-location with satellite data. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, allows model outputs to be combined with IAGOS data for intercomparison. The CAMS project is a prominent user of the IGAS data network. During the next year, IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network) and the European ENVRI+ project. Measurement traceability and quality metadata will be made available, and DOIs will be implemented.

  19. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
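
    As a rough illustration of the interval-data pipeline described above, high-frequency readings can be quality-checked and rolled up to monthly values for benchmarking. This is a minimal pandas sketch with an assumed toy schema, not the SEED platform's actual API.

```python
import pandas as pd

# 15-minute interval data as it might arrive from a Green Button feed.
readings = pd.DataFrame({
    "building_id": ["B1"] * 8,
    "timestamp": pd.date_range("2024-01-01", periods=8, freq="15min"),
    "kwh": [1.2, 1.1, 0.9, 1.4, 1.3, 1.0, 1.2, 1.1],
})

# Quality control: drop obviously bad meter values before analysis.
readings = readings[readings["kwh"] >= 0]

# Roll the time series up to monthly totals for portfolio benchmarking.
monthly = (readings.set_index("timestamp")
                   .groupby("building_id")["kwh"]
                   .resample("MS").sum())
print(monthly)
```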

  20. The Clinical Next-Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification.

    PubMed

    Nishio, Shin-Ya; Usami, Shin-Ichi

    2017-03-01

    Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers two approaches to variant pathogenicity classification. The first is a phenotype similarity-based approach: the database allows easy comparison of the detailed phenotype of each patient with the average phenotype associated with the same gene mutation, at the variant or gene level, and makes it possible to quickly browse patients carrying the same gene mutation. The second is a statistical approach to variant pathogenicity classification based on the odds ratio for comparisons between cases and controls for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
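
    A minimal sketch of the odds-ratio comparison the abstract describes, assuming simple carrier counts and a standard continuity correction; the database's exact statistical procedure is not specified in the record.

```python
import math

def odds_ratio(case_carriers, case_total, control_carriers, control_total):
    # 2x2 table with Haldane-Anscombe correction to avoid division by zero.
    a = case_carriers + 0.5
    b = case_total - case_carriers + 0.5
    c = control_carriers + 0.5
    d = control_total - control_carriers + 0.5
    or_ = (a * d) / (b * c)
    # 95% confidence interval on the log-odds scale.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return or_, (or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se))

# e.g., autosomal-recessive families vs. controls for one variant
# (counts are invented for illustration):
print(odds_ratio(12, 200, 3, 1000))
```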

  1. Building a virtual ligand screening pipeline using free software: a survey.

    PubMed

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.

  2. Building a virtual ligand screening pipeline using free software: a survey

    PubMed Central

    2016-01-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. PMID:26094053

  3. Medical oxygen and air travel.

    PubMed

    Lyznicki, J M; Williams, M A; Deitchman, S D; Howe, J P

    2000-08-01

    This report responds to a resolution that asked the American Medical Association (AMA) to take action to improve airport and airline accommodations for passengers requiring medical oxygen. Information for the report was derived from a search of the MEDLINE database and references listed in pertinent articles, as well as through communications with experts in aerospace and emergency medicine. Based on this information, the AMA Council on Scientific Affairs determined that commercial air travel exposes passengers to altitude-related hypoxia and gas expansion, which may cause some passengers to experience significant symptoms and medical complications during flight. Medical guidelines are available to help physicians evaluate and counsel potential passengers who are at increased risk of inflight hypoxemia. Supplemental oxygen may be needed for some passengers to maintain adequate tissue oxygenation and prevent hypoxemic complications. For safety and security reasons, federal regulations prohibit travelers from using their own portable oxygen system onboard commercial aircraft. Many U.S. airlines supply medical oxygen for use during flight but policies and procedures vary. Oxygen-dependent passengers must make additional arrangements for the use of supplemental oxygen in airports. Uniform standards are needed to specify procedures and equipment for the use of medical oxygen in airports and aboard commercial aircraft. Revision of federal regulations should be considered to accommodate oxygen-dependent passengers and permit them to have an uninterrupted source of oxygen from departure to destination.

  4. Land-use and land-cover change in montane mainland southeast Asia.

    PubMed

    Fox, Jefferson; Vogler, John B

    2005-09-01

    This paper summarizes land-cover and land-use change at eight sites in Thailand, Yunnan (China), Vietnam, Cambodia, and Laos over the last 50 years. Project methodology included incorporating information collected from a combination of semiformal, key informant, and formal household interviews with the development of spatial databases based on aerial photographs, satellite images, topographic maps, and GPS data. Results suggest that land use (e.g. swidden cultivation) and land cover (e.g. secondary vegetation) have remained stable and the minor amount of land-use change that has occurred has been a change from swidden to monocultural cash crops. Results suggest that two forces will increasingly determine land-use systems in this region. First, national land tenure policies (the nationalization of forest lands and efforts by central governments to increase control over upland resources) will provide a push factor making it increasingly difficult for farmers to maintain their traditional swidden land-use practices. Second, market pressures (the commercialization of subsistence resources and the substitution of commercial crops for subsistence crops) will provide a pull factor encouraging farmers to engage in new and different forms of commercial agriculture. These results appear to be robust as they come from eight studies conducted over the last decade. But important questions remain in terms of what research protocols are needed, if any, when linking social science data with remotely sensed data for understanding human-environment interactions.

  5. Exploring CD-ROM Encyclopedias.

    ERIC Educational Resources Information Center

    Urrows, Henry; Urrows, Elizabeth

    1989-01-01

    Review of encyclopedias and other databases available on CD-ROM focuses on the International Encyclopedia of Education (IEE). Problems with software and hardware are described, perspectives from the computer industry are presented, the commercial potential of CD-ROMs is discussed, and a list of sources is provided. (six references) (LRW)

  6. Submission of nucleotide sequence clostridium perfringens NetB toxin to genbank database

    USDA-ARS?s Scientific Manuscript database

    Clostridium perfringens can cause gas gangrene and food poisoning in humans and causes several enterotoxemic diseases in animals including avian necrotic enteritis. This disease affects all chicken producing countries worldwide and is a considerable burden on the commercial chicken production indus...

  7. Tackling community concerns about commercialisation and genetic research: a modest interdisciplinary proposal.

    PubMed

    Haddow, Gillian; Laurie, Graeme; Cunningham-Burley, Sarah; Hunter, Kathryn G

    2007-01-01

    In recent years, there has been a rise in the creation of DNA databases promising a range of health benefits to individuals and populations. This development has been accompanied by an interest in, and concern for, the ethical, legal and social aspects of such collections. In terms of policy solutions, much of the focus of these debates has been on issues of consent, confidentiality and research governance. However, there are broader concerns, such as those associated with commercialisation, which cannot be adequately addressed by these foci. In this article, we focus on the health-wealth benefits that DNA databases promise by considering the views of 10 focus groups on Generation Scotland, Scotland's first national genetic database. As in previous studies, our qualitative research on publics' and stakeholders' views of DNA databases shows that the prospect of utilising donated samples and derived information for wealth-related ends (i.e. for private profit), irrespective of whether there is an associated health-related benefit, arouses considerable reaction. While health-wealth benefits are not mutually exclusive ideals, the tendency has been to cast 'public' benefits as exclusively health-related, while 'private' commercial benefits for funders and/or researchers are held out as a necessary pay-off. We argue for a less polarised approach that reconsiders what is meant by 'public benefits' and questions the exclusivity of commercial interests. We believe accommodation can be achieved via the mobilisation of a grass-roots solution known as 'benefit-sharing' or a 'profit pay-off'. We propose a sociologically informed model that has a pragmatic, legal framework, which responds seriously to public concerns.

  8. Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models

    NASA Technical Reports Server (NTRS)

    Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.

    2018-01-01

    The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05 -NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests in the Icing Research Tunnel (IRT) have been conducted to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.

  9. The identification of factors contributing to self-reported anomalies in civil aviation.

    PubMed

    Andrzejczak, Chris; Karwowski, Waldemar; Thompson, William

    2014-01-01

    The main objective of this study was to analyze anomalies voluntarily reported by pilots in the civil aviation sector and to identify factors leading to such anomalies. Experimental data were obtained from the NASA Aviation Safety Reporting System (ASRS) database. These data contained a range of text records spanning 30 years of civilian aviation, both commercial (airline operations) and general aviation (private aircraft). Narrative data as well as categorical data were used. The associations between incident contributing factors and self-reported anomalies were investigated using data mining and correspondence analysis. The results revealed that a broadly defined human factors category and weather conditions were the main contributors to self-reported civil aviation anomalies. New associations between identified factors and reported anomaly conditions were also reported.

  10. Practical Value of Food Pathogen Traceability through Building a Whole-Genome Sequencing Network and Database

    PubMed Central

    Strain, Errol; Melka, David; Bunning, Kelly; Musser, Steven M.; Brown, Eric W.; Timme, Ruth

    2016-01-01

    The FDA has created a United States-based open-source whole-genome sequencing network of state, federal, international, and commercial partners. The GenomeTrakr network represents a first-of-its-kind distributed genomic food shield for characterizing and tracing foodborne outbreak pathogens back to their sources. The GenomeTrakr network is leading investigations of outbreaks of foodborne illnesses and compliance actions with more accurate and rapid recalls of contaminated foods as well as more effective monitoring of preventive controls for food manufacturing environments. An expanded network would serve to provide an international rapid surveillance system for pathogen traceback, which is critical to support an effective public health response to bacterial outbreaks. PMID:27008877

  11. Incorporating Spatial Data into Enterprise Applications

    NASA Astrophysics Data System (ADS)

    Akiki, Pierre; Maalouf, Hoda

    The main goal of this chapter is to discuss the usage of spatial data within enterprise as well as smaller line-of-business applications. In particular, this chapter proposes new methodologies for storing and manipulating vague spatial data and provides methods for visualizing both crisp and vague spatial data. It also provides a comparison between different types of spatial data, mainly 2D crisp and vague spatial data, and their respective fields of application. Additionally, it compares existing commercial relational database management systems, which are the most widely used with enterprise applications, and discusses their deficiencies in terms of spatial data support. A new spatial extension package called Spatial Extensions (SPEX) is provided in this chapter and is tested on a software prototype.

  12. Lake Pontchartrain Basin: bottom sediments and related environmental resources

    USGS Publications Warehouse

    Manheim, Frank T.; Hayes, Laura

    2002-01-01

    Lake Pontchartrain is the largest estuary in southern Louisiana. It is an important recreational, commercial, and environmental resource for New Orleans and southwestern Louisiana. This publication is part of a 5-year cooperative program led by the USGS on the geological framework and sedimentary processes of the Lake Pontchartrain Basin. The presentation is divided into two main parts: (1) Scientific Research and Assessments, and (2) Multimedia Tools and Regional Resources. The scientific sections include historical information on the area; shipboard, field, and remote sensing studies; and a comprehensive sediment database with geological and chemical discussions of the region. The multimedia and resources sections include Geographic Information System (GIS) tools and data, a video demonstrating vibracore sampling techniques in Lake Pontchartrain, and abstracts from four Basics of the Basin symposia.

  13. Brunn: an open source laboratory information system for microplates with a graphical plate layout design process.

    PubMed

    Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S

    2011-05-20

    Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling this are mainly commercial, closed source, expensive, and heavyweight; there is a need for a flexible, lightweight, open system for handling plate design, data validation, and data preparation. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate layout point-and-click interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally produces a plate report with all relevant calculated data, including dose-response curves. Brunn is capable of handling the data from microplate assays. It can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts by starting out from an earlier step in the plate layout design process saves time and cuts down on error sources.
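
    Brunn itself is a Bioclipse (Java) plugin; purely as an illustration of the dose-response step it automates, the sketch below fits an assumed four-parameter logistic model to synthetic assay data to obtain an IC50.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ic50, hill):
    # Four-parameter logistic dose-response curve.
    return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])     # uM, illustrative
response = np.array([98.0, 90.0, 55.0, 15.0, 4.0])   # % viability

params, _ = curve_fit(four_pl, doses, response,
                      p0=[0.0, 100.0, 1.0, 1.0], maxfev=10000)
print(f"IC50 ~ {params[2]:.2f} uM")
```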

  14. Recent Updates on the Systemic and Local Safety of Intranasal Steroids.

    PubMed

    Jang, Tae Young; Kim, Young Hyo

    2016-01-01

    Allergic rhinitis is a global health problem, and its prevalence rate and socioeconomic burden continue to increase. Intranasal steroid (INS) is the first treatment choice in the majority of patients because of its ability to effectively control allergic symptoms. However, patients and clinicians are concerned about the potential adverse effects of prolonged INS use. We performed a review evaluating the systemic and local safety of INS use, searching the MEDLINE, EMBASE, and Cochrane Library databases to identify relevant articles. The systemic bioavailabilities of several commercially available INSs were examined, and their systemic safety was reviewed with a focus on suppression of the hypothalamic-pituitary-adrenal axis and effects on pediatric growth. In addition, local adverse effects, such as epistaxis and nasal septal perforation, were investigated. Finally, the authors propose some techniques to avoid these complications. INSs offer a safe, effective means of treating allergic rhinitis in the short and long term with no or minimal adverse systemic and local effects. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    The report describes the Activity's business planning and execution information management systems: the Project Management Division Database (PMDD), the Product Management Database System (PMDS), and a Special Users Database System. Other systems, including the FMS, were investigated; new Product Managers and Project Managers were added into PMDS and PMDD, and a separate Support division was established.

  16. Acoustic Test Characterization of Melamine Foam for Usage in NASA's Payload Fairing Acoustic Attenuation Systems

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.; McNelis, Mark E.

    2014-01-01

    The external acoustic liftoff levels predicted for NASA's future heavy lift launch vehicles are expected to be significantly higher than the environment created by today's commercial launch vehicles. This creates a need to develop an improved acoustic attenuation system for future NASA payload fairings. NASA Glenn Research Center initiated an acoustic test series to characterize the acoustic performance of melamine foam, with and without various acoustic enhancements. This testing was denoted as NEMFAT, which stands for NESC Enhanced Melamine Foam Acoustic Test, and is the subject of this paper. Both absorption and transmission loss testing of numerous foam configurations were performed at the Riverbank Acoustical Laboratory in July 2013. The NEMFAT test data provides an initial acoustic characterization and database of melamine foam for NASA. Because of its acoustic performance and lighter mass relative to fiberglass blankets, melamine foam is being strongly considered for use in the acoustic attenuation systems of NASA's future launch vehicles.

  17. Advanced driver assistance system: Road sign identification using VIAPIX system and a correlation technique

    NASA Astrophysics Data System (ADS)

    Ouerhani, Y.; Alfalou, A.; Desthieux, M.; Brosseau, C.

    2017-02-01

    We present a three-step approach based on the commercial VIAPIX® module for road traffic sign recognition and identification. First, all objects in a scene having the characteristics of traffic signs are detected. This is followed by a first-level, correlation-based recognition, which consists in comparing each detected object with a set of reference images from a database. Finally, a second level of identification allows us to confirm or correct the previous identification. In this study, we perform a correlation-based analysis by combining and adapting the Vander Lugt correlator with the nonlinear joint transform correlator (JTC). Of particular significance, this approach permits a reliable decision on road traffic sign identification. We further discuss a robust scheme allowing us to track a detected road traffic sign in a video sequence in order to increase the decision performance of our system. This approach can have broad practical applications in the maintenance and rehabilitation of transportation infrastructure, or for driver assistance.
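
    As a simplified sketch of first-level, correlation-based recognition, the code below uses plain FFT cross-correlation between a detected sign patch and reference images; this stands in for the Vander Lugt/nonlinear JTC combination the paper actually adapts.

```python
import numpy as np

def correlation_peak(patch: np.ndarray, reference: np.ndarray) -> float:
    # Zero-mean, unit-norm both images so peaks are comparable across
    # references; assumes patch and reference share the same shape.
    p = patch - patch.mean()
    p /= np.linalg.norm(p) + 1e-12
    r = reference - reference.mean()
    r /= np.linalg.norm(r) + 1e-12
    # Circular cross-correlation via the FFT; its maximum is the score.
    corr = np.fft.ifft2(np.fft.fft2(p) * np.conj(np.fft.fft2(r))).real
    return float(corr.max())

def identify(patch, references):
    # First-level recognition: pick the best-matching reference sign.
    scores = {name: correlation_peak(patch, ref)
              for name, ref in references.items()}
    return max(scores, key=scores.get)
```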

  18. Alaska Geochemical Database (AGDB)-Geochemical data for rock, sediment, soil, mineral, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; Bailey, Elizabeth A.; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.

    2011-01-01

    The Alaska Geochemical Database (AGDB) was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This Microsoft Access database serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed in USGS laboratories or, under contracts, in commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects from 1962 to 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy mineral concentrate samples are included in this database. The AGDB includes historical geochemical data originally archived in the USGS Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s and the USGS PLUTO database used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all USGS geochemical data resulted in a significantly improved Alaska geochemical dataset. USGS data that were not previously in the NGDB because the data predate the earliest USGS geochemical databases, or were once excluded for programmatic reasons, are included here in the AGDB and will be added to the NGDB. The AGDB data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB data provided in the linked database may be updated or changed periodically. The data on the DVD and in the data downloads provided with this report are current as of date of publication.

  19. Performance assessment of EMR systems based on post-relational database.

    PubMed

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all system users to access data, with fast response times, anywhere and at any time. Performance tests of the databases underlying EMR systems were conducted in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the same database, Caché, and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed excellent performance in the real-time EMR system.

  20. The Clinical and Economic Burden of Hyperkalemia on Medicare and Commercial Payers

    PubMed Central

    Fitch, Kathryn; Woolley, J. Michael; Engel, Tyler; Blumen, Helen

    2017-01-01

    Background: Hyperkalemia (serum potassium >5.0 mEq/L) may be caused by reduced kidney function and drugs affecting the renin-angiotensin-aldosterone system and is often present in patients with chronic kidney disease (CKD). Objective: To quantify the burden of hyperkalemia in US Medicare fee-for-service and commercially insured populations using real-world claims data, focusing on prevalence, comorbidities, mortality, medical utilization, and cost. Methods: A descriptive, retrospective claims data analysis was performed on patients with hyperkalemia using the 2014 Medicare 5% sample and the 2014 Truven Health Analytics MarketScan Commercial Claims and Encounter databases. The starting study samples required patient insurance eligibility during ≥1 months in 2014. The identification of hyperkalemia and other comorbidities required having ≥1 qualifying claims in 2014 with an appropriate International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code in any position. To address the differences between patients with and without hyperkalemia, CKD subsamples were analyzed separately. Mortality rates were calculated in the Medicare sample population only. The claims were grouped into major service categories; the allowed costs reflected all costs incurred by each cohort divided by the total number of member months for that cohort. Results: The prevalence of hyperkalemia in the Medicare and commercially insured samples was 2.3% and 0.09%, respectively. Hyperkalemia was associated with multiple comorbidities, most notably CKD. The prevalence of CKD in the Medicare and the commercially insured members with hyperkalemia was 64.8% and 31.8%, respectively. After adjusting for CKD severity, the annual mortality rate for Medicare patients with CKD and hyperkalemia was 24.9% versus 10.4% in patients with CKD without hyperkalemia. The allowed costs in patients with CKD and hyperkalemia in the Medicare and commercially insured cohorts were more than twice those in patients with CKD without hyperkalemia. Inpatient care accounted for >50% of costs in patients with CKD and hyperkalemia. Conclusion: Hyperkalemia is associated with substantial clinical and economic burden among US commercially insured and Medicare populations. PMID:28794824
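
    The cost metric defined in the Methods is allowed cost divided by total member months per cohort. A toy pandas sketch of that per-member-per-month computation follows; the claim records and counts are invented for illustration, not taken from the Medicare 5% sample.

```python
import pandas as pd

claims = pd.DataFrame({
    "cohort": ["CKD+HK", "CKD+HK", "CKD", "CKD"],
    "allowed_cost": [12000.0, 8000.0, 3000.0, 2500.0],
})
member_months = {"CKD+HK": 10, "CKD": 20}

# Cohort-level allowed cost per member per month (PMPM).
pmpm = claims.groupby("cohort")["allowed_cost"].sum() / pd.Series(member_months)
print(pmpm)
```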

  1. Interlaboratory Comparison of Sample Preparation Methods, Database Expansions, and Cutoff Values for Identification of Yeasts by Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry Using a Yeast Test Panel

    PubMed Central

    Vlek, Anneloes; Kolecka, Anna; Khayhan, Kantarawee; Theelen, Bart; Groenewald, Marizeth; Boel, Edwin

    2014-01-01

    An interlaboratory study using matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) to determine the identification of clinically important yeasts (n = 35) was performed at 11 clinical centers, one company, and one reference center using the Bruker Daltonics MALDI Biotyper system. The optimal cutoff for the MALDI-TOF MS score was investigated using receiver operating characteristic (ROC) curve analyses. The percentages of correct identifications were compared for different sample preparation methods and different databases. Logistic regression analysis was performed to analyze the association between the number of spectra in the database and the percentage of strains that were correctly identified. A total of 5,460 MALDI-TOF MS results were obtained. Using all results, the area under the ROC curve was 0.95 (95% confidence interval [CI], 0.94 to 0.96). With a sensitivity of 0.84 and a specificity of 0.97, a cutoff value of 1.7 was considered optimal. The overall percentage of correct identifications (formic acid-ethanol extraction method, score ≥ 1.7) was 61.5% when the commercial Bruker Daltonics database (BDAL) was used, and it increased to 86.8% by using an extended BDAL supplemented with a Centraalbureau voor Schimmelcultures (CBS)-KNAW Fungal Biodiversity Centre in-house database (BDAL+CBS in-house). A greater number of main spectra (MSP) in the database was associated with a higher percentage of correct identifications (odds ratio [OR], 1.10; 95% CI, 1.05 to 1.15; P < 0.01). The results from the direct transfer method ranged from 0% to 82.9% correct identifications, with the results of the top four centers ranging from 71.4% to 82.9% correct identifications. This study supports the use of a cutoff value of 1.7 for the identification of yeasts using MALDI-TOF MS. The inclusion of enough isolates of the same species in the database can enhance the proportion of correctly identified strains. Further optimization of the preparation methods, especially of the direct transfer method, may contribute to improved diagnosis of yeast-related infections. PMID:24920782
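
    A sketch of the ROC-based cutoff selection described above, using synthetic scores and Youden's index as the selection criterion; the study's exact procedure for arriving at the 1.7 cutoff may differ.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# 1 = correct species identification, 0 = incorrect, with synthetic
# MALDI-TOF scores for each group.
labels = np.concatenate([np.ones(200), np.zeros(100)])
scores = np.concatenate([rng.normal(2.1, 0.3, 200),
                         rng.normal(1.4, 0.3, 100)])

fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)  # Youden's J statistic
print(f"AUC = {roc_auc_score(labels, scores):.2f}, "
      f"optimal cutoff ~ {thresholds[best]:.2f}")
```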

  2. NASA Interactive Forms Type Interface - NIFTI

    NASA Technical Reports Server (NTRS)

    Jain, Bobby; Morris, Bill

    2005-01-01

    A flexible database query, update, modify, and delete tool was developed that provides an easy interface to Oracle forms. This tool - the NASA interactive forms type interface, or NIFTI - features on-the-fly forms creation, forms sharing among users, the capability to query the database from user-entered criteria on forms, traversal of query results, an ability to generate tab-delimited reports, viewing and downloading of reports to the user's workstation, and a hypertext-based help system. NIFTI is a very powerful ad hoc query tool that was developed using C++ and X-Windows with a Motif application framework. NIFTI is unique: its capabilities appear in no other known commercial-off-the-shelf (COTS) tool, because NIFTI, which can be launched from the user's desktop, is a simple yet very powerful tool with a highly intuitive, easy-to-use graphical user interface (GUI) that expedites the creation of database query/update forms. NIFTI, therefore, can be used in NASA's International Space Station (ISS) program as well as within government and industry - indeed by all users of the widely disseminated Oracle base. And it will provide significant cost savings in the areas of user training and scalability while advancing the art over current COTS browsers. No COTS browser performs all the functions NIFTI does, and NIFTI is easier to use. NIFTI's cost savings are very significant considering the very large database with which it is used and the large user community with varying data requirements it will support. Its ease of use means that personnel unfamiliar with databases (e.g., managers, supervisors, clerks, and others) can develop their own personal reports. For NASA, a tool such as NIFTI was needed to query, update, modify, and make deletions within the ISS vehicle master database (VMDB), a repository of engineering data that includes an indentured parts list and associated resource data (power, thermal, volume, weight, and the like). Since the VMDB is used both as a collection point for data and as a common repository for engineering, integration, and operations teams, a tool such as NIFTI had to be designed that could expedite the creation of database query/update forms which could then be shared among users.

  3. Smartphone-coupled rhinolaryngoscopy at the point of care

    NASA Astrophysics Data System (ADS)

    Mink, Jonah; Bolton, Frank J.; Sebag, Cathy M.; Peterson, Curtis W.; Assia, Shai; Levitz, David

    2018-02-01

    Rhinolaryngoscopy remains difficult to perform in resource-limited settings due to the high cost of purchasing and maintaining equipment as well as the need for specialists to interpret exam findings. While the lack of expertise can be obviated by adopting telemedicine-based approaches, the capture, storage, and sharing of images/video is not a common native functionality of medical devices. Most rhinolaryngoscopy systems consist of an endoscope that interfaces with the patient's naso/oropharynx, and a tower of modules that record video/images. However, these expensive and bulky modules can be replaced by a smartphone that can fulfill the same functions but at a lower cost. To demonstrate this, a commercially available rhinolaryngoscope was coupled to a smartphone using a 3D-printed adapter. Software developed for other clinical applications was repurposed for ENT use, including an application that controls image and video capture, a HIPAA-compliant image/video storage and transfer cloud database, and customized software features developed to improve practitioner competency. Audio recording capabilities to assess speech pathology were also integrated into the smartphone rhinolaryngoscope system. The illumination module coupled onto the endoscope remained unchanged. The spatial resolution of the rhinolaryngoscope system was defined by the fiber diameter of endoscope fiber bundle, rather than the smartphone camera. The mobile rhinolaryngoscope system was used with appropriate patients by a general practitioner in an office setting. The general practitioner then consulted with an ENT specialist via the HIPAA compliant cloud database and workflow modules on difficult cases. These results suggest the smartphone-based rhinolaryngoscope holds promise for use in low-resource settings.

  4. A study of energy use for ventilation and air-conditioning systems in Hong Kong

    NASA Astrophysics Data System (ADS)

    Yu, Chung Hoi Philip

    Most local modern buildings are high-rise with enclosed structures, and mechanical ventilation and air conditioning (MVAC) systems are installed for thermal comfort. Various types of MVAC systems found in Hong Kong were critically reviewed, with comments on their energy-efficiency characteristics and applications, and the major design considerations were discussed. Besides MVAC, other energy-consuming components in commercial buildings were identified, such as lighting, lifts and escalators, office equipment, and information technology facilities. A practical approach has been adopted throughout this study so that the end results will have pragmatic value to the heating, ventilating and air-conditioning (HVAC) industry in Hong Kong. Indoor Air Quality (IAQ) has become a major issue in commercial buildings worldwide, including Hong Kong. Ventilation rate is a critical element in the design of HVAC systems, a point most evident in railway train compartments, where carbon dioxide levels build up quickly when compartments are crowded during rush hours; a study was therefore carried out based on a simplified model of a train compartment equipped with an MVAC system. Overall Thermal Transfer Value (OTTV) is a single-value parameter for controlling building energy use and is relatively simple to implement legislatively; since 1995 the local government has used it as a first step in responding to worldwide concern about energy conservation and environmental protection. Different methods of OTTV calculation were studied and the computed results compared, giving building designers a clear picture of the advantages and limitations of each method. However, given the limitations of using OTTV as the only parameter for building energy control, some new approaches to total control of building energy use were discussed, which might be considered in future revisions of the building energy codes in Hong Kong. A sample database of 20 existing commercial buildings was established for further analysis of building energy use. Heat gains through building envelopes were reviewed with reference to the underlying theory and the heat transfer equations presented in the literature, and the prevailing methodologies of cooling load estimation and energy calculation were studied. Building energy auditing methods were discussed with reference to local practice as well as international standards and guides, and the common three-stage auditing procedure was outlined: historical data collection/analysis, preliminary site survey, and detailed energy consumption investigation. A typical commercial building was selected for a detailed study of energy use by MVAC systems. (Abstract shortened by UMI.)

  5. Fee-Based Services and the Public Library: An Administrative Perspective.

    ERIC Educational Resources Information Center

    Gaines, Ervin J.; Huttner, Marian A.

    1983-01-01

    This article enumerates factors which created demand for fee-based information service (commercial databases, competition for proprietary information in business world, effectiveness of librarians) and relates experiences at two public libraries. Sources of business, value of advertising, techniques of selling, and hiring and deployment of staff…

  6. Katherine Fleming | NREL

    Science.gov Websites

    Katherine Fleming is a Database and Web Applications Engineer working on database and Web application development in the Commercial Buildings Research group. Before joining NREL, Katherine was pursuing a Ph.D. with a focus on robotics and working as a Web developer on Web accessibility.

  7. Cut Next Winter's Heating Bill Today.

    ERIC Educational Resources Information Center

    Sturgeon, Julie

    1999-01-01

    Presents specific steps that help make schools energy efficient and cut costs. Four basic strategies are suggested that include creating a database of energy usage that can also catch the occasional billing error, investigating less obvious ways of cutting energy use, such as applying cellulose commercial spray as an insulation choice, and…

  8. Development of a Context-Rich Database of ToxCast Assay Annotations (SOT)

    EPA Science Inventory

    Major concerns exist for the large number of environmental chemicals which lack toxicity data. The tens of thousands of commercial substances in need of screening for potential human health effects would cost millions of dollars and several decades to test in traditional animal-b...

  9. Operator Influence of Unexploded Ordnance Sensor Technologies

    DTIC Science & Technology

    2007-03-01

    The report describes software components including Mscomct2.dll (a date/time display ActiveX control), Pnpscr.dll (a Systran SCRAMNet replicated shared-memory device driver), a response value database, and rgm_p2.dll (the Phase 2 shared-memory API and implementation), along with commercial components such as StripM.ocx (a strip chart display ActiveX control).

  10. Verification and Trust: Background Investigations Preceding Faculty Appointment

    ERIC Educational Resources Information Center

    Finkin, Matthew W.; Post, Robert C.; Thomson, Judith J.

    2004-01-01

    Many employers in the United States have responded to the terrorist attacks of September 11, 2001, by initiating or expanding policies requiring background checks of prospective employees. Their ability to perform such checks has been abetted by the growth of computerized databases and of commercial enterprises that facilitate access to personal…

  11. Verification and Trust: Background Investigations Preceding Faculty Appointment

    ERIC Educational Resources Information Center

    Academe, 2004

    2004-01-01

    Many employers in the United States have been initiating or expanding policies requiring background checks of prospective employees. The ability to perform such checks has been abetted by the growth of computerized databases and of commercial enterprises that facilitate access to personal information. Employers now have ready access to public…

  12. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become essential components of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  13. Identification of Crew-Systems Interactions and Decision Related Trends

    NASA Technical Reports Server (NTRS)

    Jones, Sharon Monica; Evans, Joni K.; Reveley, Mary S.; Withrow, Colleen A.; Ancel, Ersin; Barr, Lawrence

    2013-01-01

    NASA Vehicle System Safety Technology (VSST) project management uses systems analysis to identify key issues and maintain a portfolio of research leading to potential solutions to its three identified technical challenges. Statistical data and published safety priority lists from academic, industry and other government agencies were reviewed and analyzed by NASA Aviation Safety Program (AvSP) systems analysis personnel to identify issues and future research needs related to one of VSST's technical challenges, Crew Decision Making (CDM). The data examined in the study were obtained from the National Transportation Safety Board (NTSB) Aviation Accident and Incident Data System, Federal Aviation Administration (FAA) Accident/Incident Data System and the NASA Aviation Safety Reporting System (ASRS). In addition, this report contains the results of a review of safety priority lists, information databases and other documented references pertaining to aviation crew systems issues and future research needs. The specific sources examined were: Commercial Aviation Safety Team (CAST) Safety Enhancements Reserved for Future Implementation (SERFIs), Flight Deck Automation Issues (FDAI) and NTSB Most Wanted List and Open Recommendations. Various automation issues taxonomies and priority lists pertaining to human factors, automation and flight design were combined to create a list of automation issues related to CDM.

  14. Environment/Health/Safety (EHS): Databases

    Science.gov Websites

    Databases and systems listed include the Hazard Documents Database, the Biosafety Authorization System, CATS (Corrective Action Tracking System, for findings 12/2005 to present), the Chemical Management System, Electrical Safety, the Ergonomics Database, Lessons Learned / Best Practices, REMS (Radiation Exposure Monitoring System), and the SJHA Database (Subcontractor Job ...).

  15. Literature review: Use of commercial films as a teaching resource for health sciences students.

    PubMed

    Díaz Membrives, Montserrat; Icart Isern, M Teresa; López Matheu, M Carmen

    2016-01-01

    To analyze some of the characteristics of publications focused on commercial cinema as a learning tool for university students in health sciences degrees. The review was based on a search of three electronic databases: MEDLINE, CINAHL and ERIC. 54 papers were selected and analyzed. Cinema is a commonly used resource; however, there is still a lack of studies demonstrating its usefulness and validity. The analysis is limited by the fact that a large number of the reported experiences have loosely described study designs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian

    2016-01-01

    The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. The purpose of this presentation is to present the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.

  17. Over-the-horizon, connected home/office (OCHO): situation management of environmental, medical, and security conditions at remote premises via broadband wireless access

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2010-04-01

    Broadband wireless access standards, together with advances in the development of commercial sensing and actuator devices, enable the feasibility of a consumer service for a multi-sensor system that monitors the conditions within a residence or office: the environment/infrastructure, patient-occupant health, and physical security. The proposed service is a broadband reimplementation and combination of existing services to allow on-demand reports on and management of the conditions by remote subscribers. The flow of on-demand reports to subscribers and to specialists contracted to mitigate out-of-tolerance conditions is the foreground process. Service subscribers for an over-the-horizon connected home/office (OCHO) monitoring system are the occupant of the premises and agencies, contracted by the service provider, to mitigate or resolve any observed out-of-tolerance condition(s) at the premises. Collectively, these parties are the foreground users of the OCHO system; the implemented wireless standards allow the foreground users to be mobile as they request situation reports on demand from the subsystems on remote conditions that comprise OCHO via wireless devices. An OCHO subscriber, i.e., a foreground user, may select the level of detail found in on-demand reports, i.e., the amount of information displayed in the report of monitored conditions at the premises. This is one context of system operations. While foreground reports are sent only periodically to subscribers, the information generated by the monitored conditions at the premises is continuous and is transferred to a background configuration of servers on which databases reside. These databases are each used, generally, in non-real time, for the assessment and management of situations defined by attributes like those being monitored in the foreground by OCHO. This is the second context of system operations. Context awareness and management of conditions at the premises by a second group of analysts and decision makers who extract information from the OCHO data in the databases form the foundation of the situation management problem.

  18. GMDD: a database of GMO detection methods.

    PubMed

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, or protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information will be released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.

  19. A Relational Database System for Student Use.

    ERIC Educational Resources Information Center

    Fertuck, Len

    1982-01-01

    Describes an APL implementation of a relational database system suitable for use in a teaching environment in which database development and database administration are studied, and discusses the functions of the user and the database administrator. An appendix illustrating system operation and an eight-item reference list are attached. (Author/JL)

  20. A combined Surface Enhanced Raman Spectroscopy (SERS)/UV-vis approach for the investigation of dye content in commercial felt tip pens inks.

    PubMed

    Saviello, Daniela; Trabace, Maddalena; Alyami, Abeer; Mirabile, Antonio; Giorgi, Rodorico; Baglioni, Piero; Iacopino, Daniela

    2018-05-01

    The development of protocols for the protection of the large patrimony of works of art created with felt tip pen media since the 1950s requires detailed knowledge of the main dyes constituting commercial ink mixtures. In this work, Surface Enhanced Raman Scattering (SERS) and UV-vis spectroscopy were used for the first time for the systematic identification of dye composition in commercial felt tip pens. A large selection of pens comprising six colors of five different brands was analyzed. Intense SERS spectra were obtained for all colors, allowing identification of the main dye constituents. Ponceau 4R and Eosin dyes were found to be the main constituents of red and pink colors; Rhodamine and Tartrazine were found in orange and yellow colors; Erioglaucine was found in green and blue colors. UV-vis analysis of the same inks was used to support the SERS findings and to unequivocally assign some uncertain dye identifications, especially for yellow and orange colors. The spectral data of all felt tip pens collected through this work were assembled in a database format. The data obtained through this systematic investigation constitute the basis for the assembly of larger reference databases that will ultimately support the development of conservation protocols for the long-term preservation of modern art collections. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Anaerobic co-digestion of commercial food waste and dairy manure: Characterizing biochemical parameters and synergistic effects.

    PubMed

    Ebner, Jacqueline H; Labatut, Rodrigo A; Lodge, Jeffrey S; Williamson, Anahita A; Trabold, Thomas A

    2016-06-01

    Anaerobic digestion is a promising alternative to landfilling commercial food waste. This study characterized 11 types of commercial food wastes and 12 co-digestion blends. Bio-methane potential, biodegradable fraction, and apparent first-order hydrolysis rate coefficients were reported based upon biochemical methane potential (BMP) assays. Food waste bio-methane potentials ranged from 165 to 496 mL CH4/g VS. Substrates high in lipids or readily degradable carbohydrates showed the highest methane production. The average bio-methane potential observed for co-digested substrates was within −5% to +20% of the VS-weighted average of the individual substrate potentials. Apparent hydrolysis rate coefficients ranged from 0.19 d⁻¹ to 0.65 d⁻¹. Co-digested substrates showed accelerated apparent hydrolysis rates relative to the weighted average of the individual substrate rates. These results provide a database of key bio-digestion parameters to advance modeling and utilization of commercial food waste in anaerobic digestion. Copyright © 2016 Elsevier Ltd. All rights reserved.
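
    Apparent first-order hydrolysis rate coefficients like those reported above are typically estimated by fitting cumulative methane curves. A minimal sketch under the common model B(t) = B0(1 − e^(−kt)) follows; the time series is illustrative, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, b0, k):
    # Cumulative methane under first-order hydrolysis kinetics.
    return b0 * (1.0 - np.exp(-k * t))

days = np.array([0, 2, 5, 10, 15, 20, 30])
ch4 = np.array([0, 110, 220, 320, 370, 395, 410])  # mL CH4/g VS

(b0, k), _ = curve_fit(first_order, days, ch4, p0=[400.0, 0.2])
print(f"B0 ~ {b0:.0f} mL CH4/g VS, k ~ {k:.2f} 1/d")
```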

  2. "Trust is not something you can reclaim easily": patenting in the field of direct-to-consumer genetic testing.

    PubMed

    Sterckx, Sigrid; Cockbain, Julian; Howard, Heidi; Huys, Isabelle; Borry, Pascal

    2013-05-01

    Recently, 23andMe announced that it had obtained its first patent, related to "polymorphisms associated with Parkinson's disease" (US-B-8187811). This announcement immediately sparked controversy in the community of 23andMe users and research participants, especially with regard to issues of transparency and trust. The purpose of this article was to analyze the patent portfolio of this prominent direct-to-consumer genetic testing company and discuss the potential ethical implications of patenting in this field for public participation in Web-based genetic research. We searched the publicly accessible patent database Espacenet as well as the commercially available database Micropatent for published patents and patent applications of 23andMe. Six patent families were identified for 23andMe. These included patent applications related to: genetic comparisons between grandparents and grandchildren, family inheritance, genome sharing, processing data from genotyping chips, gamete donor selection based on genetic calculations, finding relatives in a database, and polymorphisms associated with Parkinson disease. An important lesson to be drawn from this ongoing controversy seems to be that any (private or public) organization involved in research that relies on human participation, whether by providing information, body material, or both, needs to be transparent, not only about its research goals but also about its strategies and policies regarding commercialization.

  3. James Webb Space Telescope XML Database: From the Beginning to Today

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Fatig, Curtis C.

    2005-01-01

    The James Webb Space Telescope (JWST) Project has been defining, developing, and exercising the use of a common eXtensible Markup Language (XML) for the command and telemetry (C&T) database structure. JWST is the first large NASA space mission to use XML for databases. The JWST project started developing the concepts for the C&T database in 2002. The database will need to last at least 20 years, since it will be used beginning with flight software development, continuing through Observatory integration and test (I&T), and through operations. Also, a database tool kit has been provided to the 18 flight software development laboratories located in the United States, Europe, and Canada, allowing the local users to create their own databases. Recently the JWST Project has been working with the Jet Propulsion Laboratory (JPL) and Object Management Group (OMG) XML Telemetry and Command Exchange (XTCE) personnel to provide all the information needed by JWST and JPL for exchanging database information using an XML standard structure. The lack of standardization requires custom ingest scripts for each ground system segment, increasing the cost of the total system. Providing a non-proprietary standard format for the telemetry and command database definition will allow dissimilar systems to communicate without the need for expensive mission-specific database tools and without retesting systems after database translation. The various ground system components that would benefit from a standardized database are the telemetry and command systems, archives, simulators, and trending tools. JWST has successfully exchanged the XML database with the Eclipse, EPOCH, and ASIST ground systems, the Portable Spacecraft Simulator (PSS), a front-end system, and the Integrated Trending and Plotting System (ITPS). This paper discusses how JWST decided to use XML, the barriers to a new concept, experiences utilizing the XML structure, exchanging databases with other users, and issues encountered in creating databases for the C&T system.
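
    The paper does not reproduce the exchange format itself, but the general shape of an XML C&T definition can be sketched. The element and attribute names below are illustrative, loosely modeled on XTCE conventions; they are not JWST's actual schema:

```python
# Sketch: generating a command & telemetry database entry as XML.
# Tag and attribute names are illustrative placeholders, not JWST's schema.
import xml.etree.ElementTree as ET

db = ET.Element("TelemetryDatabase", mission="JWST-example")
param = ET.SubElement(db, "Parameter", name="BATTERY_VOLTAGE")
ET.SubElement(param, "DataType", encoding="unsignedInteger", sizeInBits="16")
cal = ET.SubElement(param, "Calibration", kind="polynomial")
ET.SubElement(cal, "Coefficient", order="0", value="0.0")
ET.SubElement(cal, "Coefficient", order="1", value="0.0125")

# Any ground system that understands the shared schema can ingest this file
# without a mission-specific translation script.
ET.ElementTree(db).write("ct_database.xml", xml_declaration=True, encoding="utf-8")
```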

  4. OrChem - An open source chemistry search engine for Oracle(R).

    PubMed

    Rijnbeek, Mark; Steinbeck, Christoph

    2009-10-22

    Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net.
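
    The abstract does not quote OrChem's SQL entry points, so the snippet below is only a hypothetical shape for issuing a similarity search from Python against an Oracle schema extended with a chemistry cartridge; the function and table names are placeholders, not OrChem's documented API:

```python
# Sketch: calling a similarity search in an Oracle database extended with a
# cheminformatics cartridge. The SQL function and column names are
# hypothetical placeholders, not OrChem's documented interface.
import cx_Oracle  # assumes the Oracle client libraries are installed

conn = cx_Oracle.connect("user/password@//dbhost:1521/orcl")
cur = conn.cursor()
cur.execute(
    """
    SELECT id, similarity_score
      FROM TABLE(chem_similarity_search(:smiles, :cutoff))  -- hypothetical
    """,
    smiles="c1ccccc1C(=O)O", cutoff=0.8,
)
for compound_id, score in cur:
    print(compound_id, score)
```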

  5. Total choline and choline-containing moieties of commercially available pulses.

    PubMed

    Lewis, Erin D; Kosik, Sarah J; Zhao, Yuan-Yuan; Jacobs, René L; Curtis, Jonathan M; Field, Catherine J

    2014-06-01

    Estimating dietary choline intake can be challenging due to foods missing from the current United States Department of Agriculture (USDA) database. The objectives of the study were to quantify the choline-containing moieties and the total choline content of a variety of pulses available in North America, and to use the expanded compositional database to determine the potential contribution of pulses to dietary choline intake. Commonly consumed pulses (n = 32) were analyzed by hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC LC-MS/MS) and compared to the current USDA database. Cooking was found to reduce the relative contribution of free choline and to increase the contribution of phosphatidylcholine to total choline for most pulses (P < 0.05). Using the expanded database to estimate the choline content of recipes using pulses as meat alternatives resulted in a different estimation of choline content per serving (±30%) compared to the USDA database. These results suggest that when pulses are a large part of a meal or diet, accurate food composition data should be used.

  6. A user friendly database for use in ALARA job dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zodiates, A.M.; Willcock, A.

    1995-03-01

    The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant dose rates. It also contains the means to flag job requirements such as temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.
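
    The paper does not publish its table layout, but the core ALARA bookkeeping it describes (job locations, crew sizes, job times, expected dose rates, and requirement flags) can be sketched with a minimal schema and a predicted collective dose query (workers × hours × dose rate); the schema and numbers below are illustrative:

```python
# Sketch: a minimal job-dose table and an ALARA-style collective dose
# estimate. Schema, column names, and values are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE jobs (
        job_id INTEGER PRIMARY KEY,
        location TEXT,
        workers INTEGER,
        hours REAL,
        dose_rate_msv_per_h REAL,  -- expected plant dose rate
        needs_shielding INTEGER    -- flag for temporary shielding
    )""")
conn.executemany(
    "INSERT INTO jobs VALUES (?, ?, ?, ?, ?, ?)",
    [(1, "steam generator", 4, 6.0, 0.05, 1),
     (2, "valve maintenance", 2, 3.0, 0.02, 0)])

# Predicted collective dose per job, highest-dose jobs first.
for row in conn.execute("""
        SELECT job_id, location,
               workers * hours * dose_rate_msv_per_h AS man_msv
          FROM jobs ORDER BY man_msv DESC"""):
    print(row)
```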

  7. Analysis of the NMI01 marker for a population database of cannabis seeds.

    PubMed

    Shirley, Nicholas; Allgeier, Lindsay; Lanier, Tommy; Coyle, Heather Miller

    2013-01-01

    We have analyzed the distribution of genotypes at a single hexanucleotide short tandem repeat (STR) locus in a Cannabis sativa seed database along with seed-packaging information. This STR locus is defined by the polymerase chain reaction amplification primers CS1F and CS1R and is referred to as NMI01 (for National Marijuana Initiative) in our study. The population database consists of seed seizures of two categories: seed samples from packages labeled and unlabeled regarding seed bank source. Of a population database of 93 processed seeds including 12 labeled Cannabis varieties, the observed genotypes generated from single seeds exhibited between one and three peaks (potentially six alleles if in homozygous state). The total number of observed genotypes was 54, making this marker highly specific and highly individualizing, even among seeds of common lineage. Cluster analysis associated many, but not all, of the handwritten labeled seed varieties tested to date, as well as the National Park seizure, with our known reference database containing Mr. Nice Seedbank and Sensi Seeds commercially packaged reference samples. © 2012 American Academy of Forensic Sciences.

  8. Medical overuse in the Iranian healthcare system: a systematic review protocol.

    PubMed

    Arab-Zozani, Morteza; Pezeshki, Mohammad Zakaria; Khodayari-Zarnaq, Rahim; Janati, Ali

    2018-04-17

    Lack of resources is one of the main problems of all healthcare systems. Recent studies have shown that reducing the overuse of medical services plays an important role in reducing healthcare system costs. Overuse of medical services is a major problem in the healthcare system: it threatens the quality of services, can harm patients, and creates excess costs for patients. So far, few studies have been conducted in this regard in Iran. The main objective of this systematic review is to perform an inclusive search for studies that report overuse of medical services in the Iranian healthcare system. An extensive search of the literature will be conducted in six databases including PubMed, Embase, Scopus, Web of Science, Cochrane and Scientific Information Database using a comprehensive search strategy to identify studies on overuse of medical care. The search will be done without time limit until the end of 2017, supplemented by reference tracking, author tracking and expert consultation. The search will be conducted on 1 February 2018. Any study that reports overuse of a service based on a specific standard will be included. Two reviewers will screen the articles based on the title, abstract and full text, and extract data about type of service, clinical area and overuse rate. Quality appraisal will be assessed using the Joanna Briggs Institute checklist. Potential discrepancies will be resolved by consulting a third author. Recommendations will be made to the Iranian MOHME (Ministry of Health and Medical Education) in order to make better evidence-based decisions about medical services in the future. CRD42017075481. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  9. Candida guilliermondii and Other Species of Candida Misidentified as Candida famata: Assessment by Vitek 2, DNA Sequencing Analysis, and Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry in Two Global Antifungal Surveillance Programs

    PubMed Central

    Woosley, Leah N.; Diekema, Daniel J.; Jones, Ronald N.; Pfaller, Michael A.

    2013-01-01

    Candida famata (teleomorph Debaryomyces hansenii) has been described as a medically relevant yeast, and this species has been included in many commercial identification systems that are currently used in clinical laboratories. Among 53 strains collected during the SENTRY and ARTEMIS surveillance programs and previously identified as C. famata (includes all submitted strains with this identification) by a variety of commercial methods (Vitek, MicroScan, API, and AuxaColor), DNA sequencing methods demonstrated that 19 strains were C. guilliermondii, 14 were C. parapsilosis, 5 were C. lusitaniae, 4 were C. albicans, and 3 were C. tropicalis, and five isolates belonged to other Candida species (two C. fermentati and one each C. intermedia, C. pelliculosa, and Pichia fabianni). Additionally, three misidentified C. famata strains were correctly identified as Kodomaea ohmeri, Debaryomyces nepalensis, and Debaryomyces fabryi using intergenic transcribed spacer (ITS) and/or intergenic spacer (IGS) sequencing. The Vitek 2 system identified three isolates with high confidence to be C. famata and another 15 with low confidence between C. famata and C. guilliermondii or C. parapsilosis, displaying only 56.6% agreement with DNA sequencing results. Matrix-assisted laser desorption ionization–time of flight (MALDI-TOF) results displayed 81.1% agreement with DNA sequencing. One strain each of C. metapsilosis, C. fermentati, and C. intermedia demonstrated a low score for identification (<2.0) in the MALDI Biotyper. K. ohmeri, D. nepalensis, and D. fabryi identified by DNA sequencing in this study were not in the current database for the MALDI Biotyper. These results suggest that the occurrence of C. famata in fungal infections is much lower than previously appreciated and that commercial systems do not produce accurate identifications except for the newly introduced MALDI-TOF instruments. PMID:23100350

  10. Heavy vehicle propulsion system materials program: Semiannual progress report, April 1996--September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.R.

    1997-04-01

    The purpose of the Heavy Vehicle Propulsion System Materials Program is the development of materials: ceramics, intermetallics, metal alloys, and metal and ceramic coatings, to support the dieselization of class 1-3 trucks to realize a 35% fuel-economy improvement over current gasoline-fueled trucks and to support commercialization of fuel-flexible LE-55 low-emissions, high-efficiency diesel engines for class 7-8 trucks. The Office of Transportation Technologies, Office of Heavy Vehicle Technologies (OTT OHVT) has an active program to develop the technology for advanced LE-55 diesel engines with 55% efficiency and low emissions levels of 2.0 g/bhp-h NO{sub x} and 0.05 g/bhp-h particulates. The goal is also for the LE-55 engine to run on natural gas with efficiency approaching that of diesel fuel. The LE-55 program is being completed in FY 1997 and, after approximately 10 years of effort, has largely met the program goals of 55% efficiency and low emissions. However, the commercialization of the LE-55 technology requires more durable materials than those that have been used to demonstrate the goals. Heavy Vehicle Propulsion System Materials will, in concert with the heavy duty diesel engine companies, develop the durable materials required to commercialize the LE-55 technologies. OTT OHVT also recognizes a significant opportunity for reduction in petroleum consumption by dieselization of pickup trucks, vans, and sport utility vehicles. Application of the diesel engine to class 1, 2, and 3 trucks is expected to yield a 35% increase in fuel economy per vehicle. The foremost barrier to diesel use in this market is emission control. Once an engine is made certifiable, subsequent challenges will be in cost; noise, vibration, and harshness (NVH); and performance. Separate abstracts have been submitted to the database for contributions to this report.

  11. Pothole Detection System Using a Black-box Camera.

    PubMed

    Jo, Youngtae; Ryu, Seungki

    2015-11-19

    Aging roads and poor road-maintenance systems result in a large number of potholes, and their number increases over time. Potholes jeopardize road safety and transportation efficiency, and they are often a contributing factor in car accidents. To address the problems associated with potholes, their locations and sizes must be determined quickly. Sophisticated road-maintenance strategies can be developed using a pothole database, which requires a pothole-detection system that can collect pothole information at low cost and over a wide area. However, pothole repair has long relied on manual detection. Recent automatic detection systems, such as those based on vibration or laser scanning, are insufficient to detect potholes correctly and inexpensively, owing to the unstable detection of vibration-based methods and the high cost of laser-scanning methods. Thus, in this paper, we introduce a new pothole-detection system using a commercial black-box camera. The proposed system detects potholes over a wide area and at low cost. We have developed a novel pothole-detection algorithm specifically designed to work within the embedded computing environments of black-box cameras. Experimental results show that the proposed system detects potholes accurately and in real time.
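
    The embedded algorithm itself is not detailed in the abstract; the sketch below shows only the generic segment-then-filter pattern such vision-based detectors commonly use (potholes tend to appear darker than the surrounding asphalt), with illustrative thresholds, not the authors' method:

```python
# Sketch: a generic dark-region pothole candidate detector. Thresholds and
# area limits are illustrative assumptions, not the paper's algorithm.
import cv2

frame = cv2.imread("road_frame.jpg")          # one frame from the camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (9, 9), 0)

# Potholes tend to appear darker than the surrounding asphalt.
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    area = cv2.contourArea(c)
    if 500 < area < 50000:                    # reject noise and whole-road blobs
        x, y, w, h = cv2.boundingRect(c)
        print(f"pothole candidate at ({x},{y}) size {w}x{h}, area {area:.0f}")
```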

  12. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; end-user interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  13. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  14. 78 FR 28756 - Defense Federal Acquisition Regulation Supplement: System for Award Management Name Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...

  15. Contribution for an African autosomic STR database (AmpF/STR Identifiler and Powerplex 16 System) and a report on genotypic variations.

    PubMed

    Alves, Cíntia; Gusmão, Leonor; Damasceno, Albertino; Soares, Benilde; Amorim, António

    2004-01-28

    Allele frequencies, together with some parameters of forensic interest, for 17 STRs included in the AmpF/STR Identifiler (CSF1PO, D2S1338, D3S1358, D5S818, D7S820, D8S1179, D13S317, D16S539, D18S51, D19S433, D21S11, FGA, TH01, TPO and VWA) and Powerplex 16 System (CSF1PO, D3S1358, D5S818, D7S820, D8S1179, D13S317, D16S539, D18S51, D21S11, FGA, Penta D, Penta E, TH01, TPO and VWA) were estimated from a sample of 135-144 unrelated individuals from Mozambique. No deviations from Hardy-Weinberg equilibrium were observed with the exception of the FGA locus (using the Bonferroni correction for the number of loci analysed, the departure observed at this locus was not significant). Comparative analyses between our population data and other African databases, namely Promega's African-Americans, AB Applied Biosystems African-Americans and two other population samples from Mozambique and Guiné Bissau, are presented and discussed. Genotype inconsistencies between both commercial kits (for D16S539 and D8S1179) and other genotypic variations (three-banded allele patterns for TPO) are also reported.
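
    The per-locus Hardy-Weinberg check reported above can be illustrated with a chi-square test on genotype counts for a biallelic locus; the counts below are illustrative, not the Mozambique sample data:

```python
# Sketch: a Hardy-Weinberg equilibrium chi-square test for one biallelic
# locus. Genotype counts are illustrative assumptions.
from scipy.stats import chi2

n_aa, n_ab, n_bb = 40, 70, 34            # observed genotype counts
n = n_aa + n_ab + n_bb
p = (2 * n_aa + n_ab) / (2 * n)          # allele A frequency
q = 1 - p

observed = [n_aa, n_ab, n_bb]
expected = [p * p * n, 2 * p * q * n, q * q * n]
stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
p_value = chi2.sf(stat, df=1)            # 3 genotypes - 1 - 1 estimated frequency
print(f"chi2 = {stat:.3f}, p = {p_value:.3f}")
# With 15-17 loci, a Bonferroni-corrected threshold would be 0.05 / n_loci.
```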

  16. Pulmonary nodule detection using a cascaded SVM classifier

    NASA Astrophysics Data System (ADS)

    Bergtholdt, Martin; Wiemker, Rafael; Klinder, Tobias

    2016-03-01

    Automatic detection of lung nodules from chest CT has been researched intensively over the last decades, also resulting in several commercial products. However, solutions are adopted only slowly into daily clinical routine, as many current CAD systems still potentially miss true nodules while at the same time generating too many false positives (FP). While many earlier approaches had to rely on rather few cases for development, larger databases have now become available and can be used for algorithmic development. In this paper, we address the problem of lung nodule detection via a cascaded SVM classifier. The idea is to sequentially perform two classification tasks in order to select, from an extremely large pool of potential candidates, the few most likely ones. As the initial pool is allowed to contain thousands of candidates, very loose criteria can be applied during this pre-selection. In this way, the chances that a true nodule is falsely rejected as a candidate are reduced significantly. The final algorithm is trained and tested on the full LIDC/IDRI database. Comparison is done against two previously published CAD systems. Overall, the algorithm achieved a sensitivity of 0.859 at 2.5 FP/volume, where the other two achieved sensitivity values of 0.321 and 0.625, respectively. On low-dose data sets, only a slight increase in the number of FP/volume was observed, while the sensitivity was not affected.
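
    As a sketch of the cascade idea, assuming generic candidate feature vectors (the paper's features and thresholds are not reproduced here): a cheap first-stage classifier is thresholded very permissively so true nodules are almost never rejected, and the expensive second stage runs only on the survivors.

```python
# Sketch: a two-stage (cascaded) SVM candidate filter on synthetic data.
# Features, thresholds, and class balance are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC, SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))               # candidate feature vectors
y = (rng.random(5000) < 0.02).astype(int)     # ~2% true nodules
X[y == 1] += 1.0                              # give true nodules some signal

stage1 = LinearSVC(C=1.0, dual=False).fit(X, y)
keep = stage1.decision_function(X) > -1.5     # permissive: rarely reject a true nodule
print(f"stage 1 keeps {int(keep.sum())} of {len(X)} candidates")

stage2 = SVC(kernel="rbf", gamma="scale").fit(X[keep], y[keep])
final = stage2.predict(X[keep])               # costly classifier on survivors only
print(f"stage 2 accepts {int(final.sum())} candidates")
```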

  17. Reflecting on the Germanwings Disaster: A Systematic Review of Depression and Suicide in Commercial Airline Pilots.

    PubMed

    Pasha, Terouz; Stokes, Paul R A

    2018-01-01

    The 2015 Germanwings Flight 9525 disaster, in which 150 people were killed after the co-pilot may have intentionally crashed the plane in a suicide attempt, highlights the importance of better understanding the mental health of commercial airline pilots. However, there have been few systematic reviews of mental health in commercial aviation. This systematic review aims to identify the types and prevalence of mental health disorders that commercial airline pilots experience, with a focus on mood disorders and suicide risk. A systematic literature search was performed using the PubMed, EMBASE, and PsycINFO databases. Eligible studies were assessed, and data were extracted and analyzed. Twenty studies were identified. The prevalence of depression experienced by commercial airline pilots in this review ranged from 1.9% to 12.6%. Factors that negatively affected the mental health of pilots included substance abuse, experiencing verbal or sexual abuse, disruption of sleep circadian rhythms, and fatigue. This systematic review identifies that commercial airline pilots may experience depression at least as frequently as the general population. Commercial airline pilots experience occupational stressors, such as disrupted circadian rhythms and fatigue, which may increase the risk of developing mood disorders. Most studies identified in this review were cross-sectional in nature, with substantial limitations. There is a clear need for further, higher-quality longitudinal studies to better understand the mental health of commercial airline pilots.

  18. Valganciclovir Use Among Commercially and Medicaid-insured Infants With Congenital CMV Infection in the United States, 2009-2015.

    PubMed

    Leung, Jessica; Dollard, Sheila C; Grosse, Scott D; Chung, Winnie; Do, ThuyQuynh; Patel, Manisha; Lanzieri, Tatiana M

    2018-03-01

    The aim of this study was to assess the clinical characteristics and trends in valganciclovir use among infants diagnosed with congenital cytomegalovirus (CMV) disease in the United States. We analyzed data from medical claims dated 2009-2015 from the Truven Health MarketScan® Commercial Claims and Encounters and Medicaid databases. We identified infants with a live birth code in the first claim who were continuously enrolled for at least 45 days. Among infants diagnosed with congenital CMV disease, identified by an ICD-9-CM or ICD-10-CM code for congenital CMV infection or CMV disease within 45 days of birth, we assessed data from claims containing codes for any CMV-associated clinical condition within the same period, and data from claims for hearing loss and/or valganciclovir within the first 180 days of life. In the commercial and Medicaid databases, we identified 257 (2.5/10,000) and 445 (3.3/10,000) infants, respectively, diagnosed with congenital CMV disease, among whom 135 (53%) and 282 (63%) had ≥1 CMV-associated condition, 30 (12%) and 32 (7%) had hearing loss, and 41 (16%) and 78 (18%) had a claim for valganciclovir. Among infants with congenital CMV disease who had a claim for valganciclovir, 37 (90%) among commercially insured infants and 68 (87%) among Medicaid-insured infants had ≥1 CMV-associated condition and/or hearing loss. From 2009 to 2015, the percentages with a claim for valganciclovir increased from 0% to 29% among commercially insured infants and from 4% to 37% among Medicaid-insured infants (P < 0.0001). During 2009-2015, there was a strong upward trend in valganciclovir claims among insured infants who were diagnosed with congenital CMV disease, the majority of whom had CMV-associated conditions and/or hearing loss. Published by Elsevier Inc.

  19. LigandBox: A database for 3D structures of chemical compounds

    PubMed Central

    Kawabata, Takeshi; Sugihara, Yusuke; Fukunishi, Yoshifumi; Nakamura, Haruki

    2013-01-01

    A database of the 3D structures of available compounds is essential for virtual screening by molecular docking. We have developed the LigandBox database (http://ligandbox.protein.osaka-u.ac.jp/ligandbox/) containing four million available compounds, collected from the catalogues of 37 commercial suppliers, and approved drugs and biochemical compounds taken from the KEGG_DRUG, KEGG_COMPOUND and PDB databases. Each chemical compound in the database has several 3D conformers with hydrogen atoms and atomic charges, which are ready to be docked into receptors using docking programs. The 3D conformations were generated using our molecular simulation program package, myPresto. Various physical properties, such as aqueous solubility (LogS) and carcinogenicity, have also been calculated to characterize the ADME-Tox properties of the compounds. The Web database provides two services for compound searches: a property/chemical ID search and a chemical structure search. The chemical structure search is performed by a combination of a descriptor search and a maximum common substructure (MCS) search, using our program kcombu. By specifying a query chemical structure, users can find similar compounds among the millions of compounds in the database within a few minutes. Our database is expected to assist a wide range of researchers, in the fields of medical science, chemical biology, and biochemistry, who are seeking to discover active chemical compounds by virtual screening. PMID:27493549
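
    The prescreen-then-refine search pattern described above can be sketched with RDKit fingerprints standing in for the paper's own kcombu program; the compounds and the cutoff below are illustrative:

```python
# Sketch: cheap descriptor prescreen before a costly MCS comparison,
# using RDKit in place of kcombu. Compounds and cutoff are illustrative.
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs

query = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")        # aspirin
library = ["c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCO"]      # toy compound list

qfp = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
for smiles in library:
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    sim = DataStructs.TanimotoSimilarity(qfp, fp)
    if sim > 0.5:                      # only survivors go to the MCS step
        print(smiles, round(sim, 2))
```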

  20. LigandBox: A database for 3D structures of chemical compounds.

    PubMed

    Kawabata, Takeshi; Sugihara, Yusuke; Fukunishi, Yoshifumi; Nakamura, Haruki

    2013-01-01

    A database of the 3D structures of available compounds is essential for virtual screening by molecular docking. We have developed the LigandBox database (http://ligandbox.protein.osaka-u.ac.jp/ligandbox/) containing four million available compounds, collected from the catalogues of 37 commercial suppliers, and approved drugs and biochemical compounds taken from the KEGG_DRUG, KEGG_COMPOUND and PDB databases. Each chemical compound in the database has several 3D conformers with hydrogen atoms and atomic charges, which are ready to be docked into receptors using docking programs. The 3D conformations were generated using our molecular simulation program package, myPresto. Various physical properties, such as aqueous solubility (LogS) and carcinogenicity, have also been calculated to characterize the ADME-Tox properties of the compounds. The Web database provides two services for compound searches: a property/chemical ID search and a chemical structure search. The chemical structure search is performed by a combination of a descriptor search and a maximum common substructure (MCS) search, using our program kcombu. By specifying a query chemical structure, users can find similar compounds among the millions of compounds in the database within a few minutes. Our database is expected to assist a wide range of researchers, in the fields of medical science, chemical biology, and biochemistry, who are seeking to discover active chemical compounds by virtual screening.

  1. Robotic assistants in personal care: A scoping review.

    PubMed

    Bilyea, A; Seth, N; Nesathurai, S; Abdullah, H A

    2017-11-01

    The aim of this study is to present an overview of technological advances in the field of robotics developed for assistance with activities of daily living (ADL), and to identify areas where further research is required. Four databases were searched for articles presenting either a novel design for a personal care robotic system or trial results relating to such systems. Articles presenting nine different robotic personal care systems were examined, six of which had been developed after 2005. These six all have publications relating to their trials. In the majority of trials, patient independence was improved with operation of the robotic device for a specific subset of ADL. A map of the current state of the field of personal care robotics is presented in this study. Areas requiring further research include improving feedback and awareness, as well as refining control methods and pre-programmed behaviors. Developing an affordable, easy-to-use system would help fill the current gap in the commercial market. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  2. Heterogeneous distributed databases: A case study

    NASA Technical Reports Server (NTRS)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX 11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy. Different customer bases are supported by each database. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  3. 48 CFR 212.270 - Major weapon systems as commercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Major weapon systems as... Requirements for the Acquisition of Commercial Items 212.270 Major weapon systems as commercial items. The DoD policy for acquiring major weapon systems as commercial items is in Subpart 234.70. [71 FR 58538, Oct. 4...

  4. 48 CFR 212.270 - Major weapon systems as commercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Major weapon systems as... Requirements for the Acquisition of Commercial Items 212.270 Major weapon systems as commercial items. The DoD policy for acquiring major weapon systems as commercial items is in Subpart 234.70. [71 FR 58538, Oct. 4...

  5. 48 CFR 212.270 - Major weapon systems as commercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Major weapon systems as... Requirements for the Acquisition of Commercial Items 212.270 Major weapon systems as commercial items. The DoD policy for acquiring major weapon systems as commercial items is in Subpart 234.70. [71 FR 58538, Oct. 4...

  6. 48 CFR 212.270 - Major weapon systems as commercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Major weapon systems as... Requirements for the Acquisition of Commercial Items 212.270 Major weapon systems as commercial items. The DoD policy for acquiring major weapon systems as commercial items is in Subpart 234.70. [71 FR 58538, Oct. 4...

  7. 48 CFR 212.270 - Major weapon systems as commercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Major weapon systems as... Requirements for the Acquisition of Commercial Items 212.270 Major weapon systems as commercial items. The DoD policy for acquiring major weapon systems as commercial items is in Subpart 234.70. [71 FR 58538, Oct. 4...

  8. 75 FR 8431 - Carbon Dioxide Fire Suppression Systems on Commercial Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... Fire Suppression Systems on Commercial Vessels; Proposed Rule. Federal Register / Vol. 75, No. 36... 1625-AB44 Carbon Dioxide Fire Suppression Systems on Commercial Vessels AGENCY: Coast Guard, DHS... for fire suppression systems on several classes of commercial vessels. The amendments would clarify...

  9. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system can be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read, study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow-efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
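
    As an example of the automatic score calculation the database performs, a standard Child-Pugh computation is sketched below; the paper does not list its exact implementation, and the cut-offs shown are the conventional clinical ones:

```python
# Sketch: the standard Child-Pugh score (5 components, each scored 1-3;
# class A = 5-6, B = 7-9, C = 10-15). Not the paper's exact code.
def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
    """ascites/encephalopathy take 'none', 'mild', or 'severe'."""
    grade = {"none": 1, "mild": 2, "severe": 3}
    score = 0
    score += 1 if bilirubin_mg_dl < 2 else 2 if bilirubin_mg_dl <= 3 else 3
    score += 1 if albumin_g_dl > 3.5 else 2 if albumin_g_dl >= 2.8 else 3
    score += 1 if inr < 1.7 else 2 if inr <= 2.3 else 3
    score += grade[ascites] + grade[encephalopathy]
    cls = "A" if score <= 6 else "B" if score <= 9 else "C"
    return score, cls

print(child_pugh(1.5, 3.8, 1.2, "none", "none"))   # -> (5, 'A')
```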

  10. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST, to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  11. Development and Operation of a Database Machine for Online Access and Update of a Large Database.

    ERIC Educational Resources Information Center

    Rush, James E.

    1980-01-01

    Reviews the development of a fault tolerant database processor system which replaced OCLC's conventional file system. A general introduction to database management systems and the operating environment is followed by a description of the hardware selection, software processes, and system characteristics. (SW)

  12. 75 FR 18255 - Passenger Facility Charge Database System for Air Carrier Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Facility Charge Database System for Air Carrier Reporting AGENCY: Federal Aviation Administration (FAA... the Passenger Facility Charge (PFC) database system to report PFC quarterly report information. In... developed a national PFC database system in order to more easily track the PFC program on a nationwide basis...

  13. An Improved Database System for Program Assessment

    ERIC Educational Resources Information Center

    Haga, Wayne; Morris, Gerard; Morrell, Joseph S.

    2011-01-01

    This research paper presents a database management system for tracking course assessment data and reporting related outcomes for program assessment. It improves on a database system previously presented by the authors and in use for two years. The database system presented is specific to assessment for ABET (Accreditation Board for Engineering and…

  14. 76 FR 11465 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... separate systems of records: ``FHFA-OIG Audit Files Database,'' ``FHFA-OIG Investigative & Evaluative Files Database,'' ``FHFA-OIG Investigative & Evaluative MIS Database,'' and ``FHFA-OIG Hotline Database.'' These... Audit Files Database. FHFA-OIG-2: FHFA-OIG Investigative & Evaluative Files Database. FHFA-OIG-3: FHFA...

  15. Sodium and sugar in complementary infant and toddler foods sold in the United States.

    PubMed

    Cogswell, Mary E; Gunn, Janelle P; Yuan, Keming; Park, Sohyun; Merritt, Robert

    2015-03-01

    To evaluate the sodium and sugar content of US commercial infant and toddler foods. We used a 2012 nutrient database of 1074 US infant and toddler foods and drinks developed from a commercial database, manufacturer Web sites, and major grocery stores. Products were categorized on the basis of their main ingredients and the US Food and Drug Administration's reference amounts customarily consumed per eating occasion (RACC). Sodium and sugar contents and presence of added sugars were determined. All but 2 of the 657 infant vegetables, dinners, fruits, dry cereals, and ready-to-serve mixed grains and fruits were low sodium (≤140 mg/RACC). The majority of these foods did not contain added sugars; however, 41 of 79 infant mixed grains and fruits contained ≥1 added sugar, and 35 also contained >35% calories from sugar. Seventy-two percent of 72 toddler dinners were high in sodium content (>210 mg/RACC). Toddler dinners contained an average of 2295 mg of sodium per 1000 kcal (212 mg/100 g). Savory infant/toddler snacks (n = 34) contained an average of 1382 mg of sodium per 1000 kcal (486 mg/100 g); one was high in sodium. Thirty-two percent of toddler dinners and the majority of toddler cereal bars/breakfast pastries, fruit, and infant/toddler snacks, desserts, and juices contained ≥1 added sugar. Commercial toddler foods and infant or toddler snacks, desserts, and juice drinks are of potential concern due to sodium or sugar content. Pediatricians should advise parents to look carefully at labels when selecting commercial toddler foods and to limit salty snacks, sweet desserts, and juice drinks. Copyright © 2015 by the American Academy of Pediatrics.
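
    Applying the cut-offs used above (low sodium: ≤140 mg/RACC; high sodium: >210 mg/RACC) is straightforward; the product rows below are illustrative:

```python
# Sketch: classifying products by the sodium cut-offs per RACC quoted in
# the abstract. Product names and values are illustrative.
def sodium_class(mg_per_racc):
    if mg_per_racc <= 140:
        return "low sodium"
    if mg_per_racc > 210:
        return "high sodium"
    return "moderate"

products = [("infant vegetable puree", 20), ("toddler dinner", 310),
            ("savory toddler snack", 180)]
for name, mg in products:
    print(f"{name}: {mg} mg/RACC -> {sodium_class(mg)}")
```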

  16. Evaluation of Head-Worn Display Concepts for Commercial Aircraft Taxi Operations

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.

    2007-01-01

    Previous research has demonstrated that a Head-Up Display (HUD) can be used to enable more capacity and safer aircraft surface operations. This previous research also noted that the HUD exhibited two major limitations which hindered the full potential of the display concept: 1) the monochrome HUD format; and, 2) a limited, fixed field of regard. Full-color Head Worn Displays (HWDs) with very small sizes and weights are emerging to the extent that this technology may be practical for commercial and business aircraft operations. By coupling the HWD with a head tracker, full-color, out-the-window display concepts with an unlimited field-of-regard may be realized to improve efficiency and safety in surface operations. A ground simulation experiment was conducted at NASA Langley to evaluate the efficacy of head-worn display applications which may directly address the limitations of the HUD while retaining all of its advantages in surface operations. The simulation experiment used airline crews to evaluate various displays (HUD, HWD) and display concepts in an operationally realistic environment by using a Chicago, O Hare airport database. The results pertaining to the implications of HWDs for commercial business and transport aviation applications are presented herein. Overall HWD system latency was measured and found to be acceptable, but not necessarily optimal. A few occurrences of simulator sickness were noted while wearing the HWD, but overall there appears to be commercial pilot acceptability and usability to the concept. Many issues were identified which need to be addressed in future research including continued reduction in user encumbrance due to the HWD, and improvement in image alignment, accuracy, and boresighting.

  17. Iodine in food- and dietary supplement–composition databases

    PubMed Central

    Pehrsson, Pamela R; Patterson, Kristine Y; Spungen, Judith H; Wirtz, Mark S; Andrews, Karen W; Dwyer, Johanna T; Swanson, Christine A

    2016-01-01

    The US Food and Drug Administration (FDA) and the Nutrient Data Laboratory (NDL) of the USDA Agricultural Research Service have worked independently on determining the iodine content of foods and dietary supplements and are now harmonizing their efforts. The objective of the current article is to describe the harmonization plan and the results of initial iodine analyses accomplished under that plan. For many years, the FDA’s Total Diet Study (TDS) has measured iodine concentrations in selected foods collected in 4 regions of the country each year. For more than a decade, the NDL has collected and analyzed foods as part of the National Food and Nutrient Analysis Program; iodine analysis is now being added to the program. The NDL recently qualified a commercial laboratory to conduct iodine analysis of foods by an inductively coupled plasma mass spectrometry (ICP-MS) method. Co-analysis of a set of samples by the commercial laboratory using the ICP-MS method and by the FDA laboratory using its standard colorimetric method yielded comparable results. The FDA recently reviewed historical TDS data for trends in the iodine content of selected foods, and the NDL analyzed samples of a limited subset of those foods for iodine. The FDA and the NDL are working to combine their data on iodine in foods and to produce an online database that can be used for estimating iodine intake from foods in the US population. In addition, the NDL continues to analyze dietary supplements for iodine and, in collaboration with the NIH Office of Dietary Supplements, to publish the data online in the Dietary Supplement Ingredient Database. The goal is to provide, through these 2 harmonized databases and the continuing TDS focus on iodine, improved tools for estimating iodine intake in population studies. PMID:27534627

  18. Validation of intellectual disability coding through hospital morbidity records using an intellectual disability population-based database in Western Australia.

    PubMed

    Bourke, Jenny; Wong, Kingsley; Leonard, Helen

    2018-01-23

    To investigate how well intellectual disability (ID) can be ascertained using hospital morbidity data compared with a population-based data source. All children born in 1983-2010 with a hospital admission in the Western Australian Hospital Morbidity Data System (HMDS) were linked with the Western Australian Intellectual Disability Exploring Answers (IDEA) database. The International Classification of Diseases hospital codes consistent with ID were also identified. The characteristics of those children identified with ID through either or both sources were investigated. Of the 488 905 individuals in the study, 10 218 (2.1%) were identified with ID in either IDEA or HMDS with 1435 (14.0%) individuals identified in both databases, 8305 (81.3%) unique to the IDEA database and 478 (4.7%) unique to the HMDS dataset only. Of those unique to the HMDS dataset, about a quarter (n=124) had died before 1 year of age and most of these (75%) before 1 month. Children with ID who were also coded as such in the HMDS data were more likely to be aged under 1 year, female, non-Aboriginal and have a severe level of ID, compared with those not coded in the HMDS data. The sensitivity of using HMDS to identify ID was 14.7%, whereas the specificity was much higher at 99.9%. Hospital morbidity data are not a reliable source for identifying ID within a population, and epidemiological researchers need to take these findings into account in their study design. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
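
    The reported sensitivity and specificity follow directly from the linkage counts given above, treating IDEA as the reference standard:

```python
# Worked sketch of the paper's reported figures from its published counts.
total = 488_905
both = 1_435          # in HMDS and IDEA (true positives)
idea_only = 8_305     # missed by HMDS (false negatives)
hmds_only = 478       # HMDS-coded but not in IDEA (false positives)

true_neg = total - both - idea_only - hmds_only
sensitivity = both / (both + idea_only)
specificity = true_neg / (true_neg + hmds_only)
print(f"sensitivity = {sensitivity:.1%}")   # ~14.7%
print(f"specificity = {specificity:.1%}")   # ~99.9%
```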

  19. Monitoring an alien invasion: DNA barcoding and the identification of lionfish and their prey on coral reefs of the Mexican Caribbean.

    PubMed

    Valdez-Moreno, Martha; Quintal-Lizama, Carolina; Gómez-Lozano, Ricardo; García-Rivas, María Del Carmen

    2012-01-01

    In the Mexican Caribbean, the exotic lionfish Pterois volitans has become a species of great concern because of its predatory habits and rapid expansion onto the Mesoamerican coral reef, the second largest continuous reef system in the world. This is the first report of DNA identification of stomach contents of lionfish using the barcode of life reference database (BOLD). We confirm with barcoding that only Pterois volitans is apparently present in the Mexican Caribbean. We analyzed the stomach contents of 157 specimens of P. volitans from various locations in the region. Based on DNA matches in the Barcode of Life Database (BOLD) and GenBank, we identified fishes from five orders, 14 families, 22 genera and 34 species in the stomach contents. The families with the most species represented were Gobiidae and Apogonidae. Some prey taxa are commercially important species. Seven species were new records for the Mexican Caribbean: Apogon mosavi, Coryphopterus venezuelae, C. thrix, C. tortugae, Lythrypnus minimus, Starksia langi and S. ocellata. DNA matches, as well as the presence of intact lionfish in the stomach contents, indicate some degree of cannibalism, a behavior confirmed in this species for the first time. We obtained 45 distinct crustacean prey sequences, from which only 20 taxa could be identified from the BOLD and GenBank databases. The matches were primarily to Decapoda, but only a single taxon could be identified to the species level, Euphausia americana. This technique proved to be an efficient and useful method, especially since prey species could be identified from partially digested remains. The primary limitation is the lack of comprehensive coverage of potential prey species in the region in the BOLD and GenBank databases, especially among invertebrates.

  20. Monitoring an Alien Invasion: DNA Barcoding and the Identification of Lionfish and Their Prey on Coral Reefs of the Mexican Caribbean

    PubMed Central

    Valdez-Moreno, Martha; Quintal-Lizama, Carolina; Gómez-Lozano, Ricardo; García-Rivas, María del Carmen

    2012-01-01

    Background In the Mexican Caribbean, the exotic lionfish Pterois volitans has become a species of great concern because of its predatory habits and rapid expansion onto the Mesoamerican coral reef, the second largest continuous reef system in the world. This is the first report of DNA identification of stomach contents of lionfish using the barcode of life reference database (BOLD). Methodology/Principal Findings We confirm with barcoding that only Pterois volitans is apparently present in the Mexican Caribbean. We analyzed the stomach contents of 157 specimens of P. volitans from various locations in the region. Based on DNA matches in the Barcode of Life Database (BOLD) and GenBank, we identified fishes from five orders, 14 families, 22 genera and 34 species in the stomach contents. The families with the most species represented were Gobiidae and Apogonidae. Some prey taxa are commercially important species. Seven species were new records for the Mexican Caribbean: Apogon mosavi, Coryphopterus venezuelae, C. thrix, C. tortugae, Lythrypnus minimus, Starksia langi and S. ocellata. DNA matches, as well as the presence of intact lionfish in the stomach contents, indicate some degree of cannibalism, a behavior confirmed in this species for the first time. We obtained 45 distinct crustacean prey sequences, from which only 20 taxa could be identified from the BOLD and GenBank databases. The matches were primarily to Decapoda, but only a single taxon could be identified to the species level, Euphausia americana. Conclusions/Significance This technique proved to be an efficient and useful method, especially since prey species could be identified from partially digested remains. The primary limitation is the lack of comprehensive coverage of potential prey species in the region in the BOLD and GenBank databases, especially among invertebrates. PMID:22675470

  1. The 2010-2015 Prevalence of Eosinophilic Esophagitis in the USA: A Population-Based Study.

    PubMed

    Mansoor, Emad; Cooper, Gregory S

    2016-10-01

    Eosinophilic esophagitis (EoE) is a chronic inflammatory disorder with increasing prevalence. However, epidemiologic data have mostly been acquired from small studies. We sought to describe the epidemiology of EoE in the USA, utilizing a large database. We queried a commercial database (Explorys Inc, Cleveland, OH, USA), an aggregate of electronic health record data from 26 major integrated US healthcare systems from 1999 to July 2015. We identified an aggregated patient cohort of eligible patients with EoE and a history of proton-pump inhibitor use between July 2010 and July 2015, based on Systematized Nomenclature of Medicine-Clinical Terms. We calculated the prevalence of EoE among different patient groups. Of the 30,301,440 individuals in the database, we identified 7840 patients with EoE with an overall prevalence of 25.9/100,000 persons. Prevalence was higher in males than females [odds ratio (OR) 2.00; 95 % CI 1.92-2.10, p < 0.0001], Caucasians versus African-Americans and Asians (OR 2.00; 95 % CI 1.86-2.14, p < 0.0001), and adults (18-65 years) versus elderly (>65 years) and children (<18 years) (OR 1.63; 95 % CI 1.54-1.71, p < 0.0001). Compared with controls (individuals in database without EoE), individuals with EoE were more likely to have other gastrointestinal diagnoses such as dysphagia and at least one allergic condition. In this large study, we found that the estimated prevalence of EoE in the USA is 25.9/100,000, which is at the lower end of prevalence rates reported in the USA and other industrial countries. We confirmed that EoE has a strong association with allergic and gastrointestinal diagnoses.
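
    The headline prevalence follows directly from the counts given above:

```python
# Worked sketch of the reported prevalence from the published counts.
cases = 7_840
population = 30_301_440
prevalence_per_100k = cases / population * 100_000
print(f"{prevalence_per_100k:.1f} per 100,000")   # -> 25.9
```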

  2. Validation of a New Web Application for Identification of Fungi by Use of Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry

    PubMed Central

    Becker, P.; Gabriel, F.; Cassagne, C.; Accoceberry, I.; Gari-Toussaint, M.; Hasseine, L.; De Geyter, D.; Pierard, D.; Surmont, I.; Djenad, F.; Donnadieu, J. L.; Piarroux, M.; Hendrickx, M.; Piarroux, R.

    2017-01-01

    Matrix-assisted laser desorption ionization–time of flight (MALDI-TOF) mass spectrometry has emerged as a reliable technique to identify molds involved in human diseases, including dermatophytes, provided that exhaustive reference databases are available. This study assessed an online identification application based on original algorithms and an extensive in-house reference database comprising 11,851 spectra (938 fungal species and 246 fungal genera). Validation criteria were established using an initial panel of 422 molds, including dermatophytes, previously identified via DNA sequencing (126 species). The application was further assessed using a separate panel of 501 cultured clinical isolates (88 mold taxa including dermatophytes) derived from five hospital laboratories. A total of 438 (87.35%) isolates were correctly identified at the species level, while 26 (5.22%) were assigned to the correct genus but the wrong species and 37 (7.43%) were not identified, since the defined threshold of 20 was not reached. The use of the Bruker Daltonics database included in the MALDI Biotyper software resulted in a much higher rate of unidentified isolates (39.76 and 74.30% using the score thresholds 1.7 and 2.0, respectively). Moreover, the identification delay of the online application remained compatible with real-time online queries (0.15 s per spectrum), and the application was faster than identifications using the MALDI Biotyper software. This is the first study to assess an online identification system based on MALDI-TOF spectrum analysis. We have successfully applied this approach to identify molds, including dermatophytes, for which diversity is insufficiently represented in commercial databases. This free-access application is available to medical mycologists to improve fungal identification. PMID:28637907

  3. Validation of a New Web Application for Identification of Fungi by Use of Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry.

    PubMed

    Normand, A C; Becker, P; Gabriel, F; Cassagne, C; Accoceberry, I; Gari-Toussaint, M; Hasseine, L; De Geyter, D; Pierard, D; Surmont, I; Djenad, F; Donnadieu, J L; Piarroux, M; Ranque, S; Hendrickx, M; Piarroux, R

    2017-09-01

    Matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry has emerged as a reliable technique to identify molds involved in human diseases, including dermatophytes, provided that exhaustive reference databases are available. This study assessed an online identification application based on original algorithms and an extensive in-house reference database comprising 11,851 spectra (938 fungal species and 246 fungal genera). Validation criteria were established using an initial panel of 422 molds, including dermatophytes, previously identified via DNA sequencing (126 species). The application was further assessed using a separate panel of 501 cultured clinical isolates (88 mold taxa including dermatophytes) derived from five hospital laboratories. A total of 438 (87.35%) isolates were correctly identified at the species level, while 26 (5.22%) were assigned to the correct genus but the wrong species and 37 (7.43%) were not identified, since the defined threshold of 20 was not reached. The use of the Bruker Daltonics database included in the MALDI Biotyper software resulted in a much higher rate of unidentified isolates (39.76 and 74.30% using the score thresholds 1.7 and 2.0, respectively). Moreover, the identification delay of the online application remained compatible with real-time online queries (0.15 s per spectrum), and the application was faster than identifications using the MALDI Biotyper software. This is the first study to assess an online identification system based on MALDI-TOF spectrum analysis. We have successfully applied this approach to identify molds, including dermatophytes, for which diversity is insufficiently represented in commercial databases. This free-access application is available to medical mycologists to improve fungal identification. Copyright © 2017 American Society for Microbiology.

  4. Digital Geologic Map of the Rosalia 1:100,000 Quadrangle, Washington and Idaho: A Digital Database for the 1990 S.Z. Waggoner Map

    USGS Publications Warehouse

    Derkey, Pamela D.; Johnson, Bruce R.; Lackaff, Beatrice B.; Derkey, Robert E.

    1998-01-01

    The geologic map of the Rosalia 1:100,000-scale quadrangle was compiled in 1990 by S.Z. Waggoner of the Washington state Division of Geology and Earth Resources. This data was entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The intent was to provide a digital geospatial database for a previously published black and white paper geologic map. This database can be queried in many ways to produce a variety of geologic maps. Digital base map data files are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000) as it has been somewhat generalized to fit the 1:100,000 scale map. The map area is located in eastern Washington and extends across the state border into western Idaho. This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. We wish to thank J. Eric Schuster of the Washington Division of Geology and Earth Resources for providing the original stable-base mylar and the funding for it to be scanned. We also thank Dick Blank and Barry Moring of the U.S. Geological Survey for reviewing the manuscript and digital files, respectively.

  5. Using Commercially available Tools for multi-faceted health assessment: Data Integration Lessons Learned

    PubMed Central

    Wilamowska, Katarzyna; Le, Thai; Demiris, George; Thompson, Hilaire

    2013-01-01

    Health monitoring data collected from multiple available intake devices provide a rich resource to support older adult health and wellness. Though large amounts of data can be collected, there is currently a lack of understanding of how to integrate these various data sources using commercially available products. This article describes an inexpensive approach to integrating data from multiple sources from a recently completed pilot project that assessed older adult wellness, and demonstrates the challenges and benefits of pursuing data integration using commercially available products. The data in this project were sourced from a) electronically captured participant intake surveys, and existing commercial software output for b) vital signs and c) cognitive function. All the software used for data integration in this project was freeware and was chosen because of its ease of comprehension by novice database users. The methods and results of this approach provide a model for researchers with similar data integration needs to replicate this effort easily at low cost. PMID:23728444
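
    A sketch of this kind of low-cost integration follows: three source exports are loaded into SQLite (a freeware DBMS of the sort the article favors for novice database users) and joined into one participant-level view. The file names and field layouts here are invented for illustration, not taken from the study.

    ```python
    import csv, sqlite3

    # Hypothetical CSV exports from the three sources described above.
    SOURCES = {
        "surveys":   ("intake_surveys.csv",   "(pid TEXT, item TEXT, answer TEXT)"),
        "vitals":    ("vitals_device.csv",    "(pid TEXT, ts TEXT, systolic REAL, diastolic REAL)"),
        "cognition": ("cognitive_scores.csv", "(pid TEXT, ts TEXT, score REAL)"),
    }

    conn = sqlite3.connect("wellness.db")            # freeware, novice-friendly
    for table, (path, schema) in SOURCES.items():
        conn.execute(f"CREATE TABLE IF NOT EXISTS {table} {schema}")
        with open(path, newline="") as f:
            rows = [tuple(r.values()) for r in csv.DictReader(f)]
        if rows:
            marks = ",".join("?" * len(rows[0]))
            conn.executemany(f"INSERT INTO {table} VALUES ({marks})", rows)
    conn.commit()

    # One integrated, participant-level view across the sources.
    query = """SELECT v.pid, v.systolic, c.score
               FROM vitals v JOIN cognition c USING (pid)"""
    for row in conn.execute(query):
        print(row)
    ```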

  6. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
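
    The combination the abstract describes, database triggers (the active database) plus functions executing inside the DBMS (in-database processing), can be illustrated from Python with SQLite. This is a minimal sketch of the pattern only; the paper's platform and detection logic are not SQLite-specific, and the `classify` rule below is an invented stand-in. Note how the raw sensor rows never leave the database: the event is detected and recorded inside the DBMS.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE bed_sensor (ts REAL, occupied INTEGER)")
    conn.execute("CREATE TABLE events (ts REAL, kind TEXT)")

    # In-database processing: a Python function registered as a UDF,
    # so classification runs inside the DBMS.
    def classify(prev, curr):
        return "bed_exit" if prev == 1 and curr == 0 else None

    conn.create_function("classify", 2, classify)

    # Active database: a trigger reacts to every insert, comparing the
    # new reading with the previous one and materializing any event.
    conn.execute("""
        CREATE TRIGGER detect_bed_exit AFTER INSERT ON bed_sensor
        BEGIN
            INSERT INTO events
            SELECT NEW.ts, 'bed_exit'
            WHERE classify((SELECT occupied FROM bed_sensor
                            WHERE ts < NEW.ts ORDER BY ts DESC LIMIT 1),
                           NEW.occupied) = 'bed_exit';
        END""")

    conn.execute("INSERT INTO bed_sensor VALUES (1.0, 1)")   # in bed
    conn.execute("INSERT INTO bed_sensor VALUES (2.0, 0)")   # bed exit
    print(conn.execute("SELECT * FROM events").fetchall())   # [(2.0, 'bed_exit')]
    ```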

  7. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    NASA Astrophysics Data System (ADS)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications grows. Most systems still use relational databases (RDBs), but as data volumes increase each year, many systems turn to NoSQL databases to analyze and access big data more quickly. NoSQL emerged as a result of the exponential growth of the Internet and the development of web applications. Because query syntax in a NoSQL database differs from that of an SQL database, migrating would normally require code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged: it provides methods that synchronize the SQL database with the NoSQL database, along with an interface through which applications can run SQL queries. This research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct-access query approach, in which the system allows the application to accept queries while the synchronization process is in progress. Tests showed that the data adapter can synchronize data between the SQL database (MySQL) and the NoSQL database (Apache HBase). The system's memory usage ranged from 40% to 60%, and its processor usage varied from 10% to 90%. In addition, the NoSQL database outperformed the SQL database.
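
    The adapter idea can be sketched as below. This is a hypothetical stand-in, not the authors' implementation: sqlite3 substitutes for MySQL, a plain dict substitutes for Apache HBase, and the `DataAdapter.execute` method name is invented. The point of the sketch is the direct-access behavior, i.e. the application keeps issuing unchanged SQL while a background thread mirrors writes to the NoSQL side.

    ```python
    import queue, sqlite3, threading

    class DataAdapter:
        """Minimal sketch: sqlite3 stands in for MySQL and a dict for
        Apache HBase; `execute` is an invented method name."""

        def __init__(self):
            self.sql = sqlite3.connect(":memory:")
            self.sql.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
            self.nosql = {}                  # row_key -> {family:qualifier: value}
            self.pending = queue.Queue()
            threading.Thread(target=self._sync, daemon=True).start()

        def execute(self, statement, params=()):
            cur = self.sql.execute(statement, params)    # SQL syntax unchanged
            if statement.lstrip().upper().startswith("INSERT"):
                self.pending.put(params)                 # mirror the write later
            return cur

        def _sync(self):
            # Background synchronizer: drains queued writes into the NoSQL
            # store while the application keeps accepting queries.
            while True:
                row_id, name = self.pending.get()
                self.nosql[f"users/{row_id}"] = {"info:name": name}
                self.pending.task_done()

    adapter = DataAdapter()
    adapter.execute("INSERT INTO users VALUES (?, ?)", (1, "Alice"))
    adapter.pending.join()            # wait for the mirror to catch up
    print(adapter.nosql)              # {'users/1': {'info:name': 'Alice'}}
    ```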

  8. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  9. Machine learning for the automatic detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Fisher, Wendy D.

    In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insight into our real-world problems, and we evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, we consistently observe a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising; the best performance is achieved with the Haar wavelets. We achieve up to 97.3% overall accuracy and less than 1.4% false negatives in anomaly detection. In Chapter 4, we investigate two-class and one-class support vector machines (SVMs) for an effective anomaly detection system. We again use the two different EDL data sets from experimental laboratory earth embankments (each having approximately 80% normal and 20% anomalous observations) to ensure our workflow is robust enough to work with multiple data sets and different types of anomalous events (e.g., cracks and piping). We apply Haar wavelet-denoising techniques and extract nine spectral features from decomposed segments of the time series data. The two-class SVM with 10-fold cross-validation achieved over 94% overall accuracy and a 96% F1-score. Our approach provides a means for automatically identifying anomalous events using various machine learning techniques. Detecting internal erosion events in aging EDLs, earlier than is currently possible, can allow more time to prevent or mitigate catastrophic failures. Results show that we can successfully separate normal from anomalous observations in passive seismic data, providing a step towards continuous real-time monitoring of EDL health. Our lightweight non-commercial BSR detection system also shows promise in separating commercial from non-commercial BSR scans without the need for prior geographic location information, extensive time-lapse surveys, or a database of known commercial carriers. (Abstract shortened by ProQuest.)
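
    The Chapter 4 pipeline (Haar wavelet denoising, nine spectral features per segment, then an SVM) can be approximated as follows. This is a sketch on synthetic data only: the soft-thresholding rule, the choice of nine FFT band energies, and the one-class SVM parameters are assumptions, not the dissertation's exact configuration.

    ```python
    import numpy as np
    import pywt                          # PyWavelets
    from sklearn.svm import OneClassSVM

    def haar_denoise(signal, level=3):
        """Soft-threshold Haar wavelet denoising; the universal-threshold
        rule used here is an assumption."""
        coeffs = pywt.wavedec(signal, "haar", level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
        thr = sigma * np.sqrt(2 * np.log(len(signal)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        return pywt.waverec(coeffs, "haar")[: len(signal)]

    def spectral_features(segment, n=9):
        """Nine coarse spectral-band energies per segment (the study's
        exact nine features are not specified in the abstract)."""
        mag = np.abs(np.fft.rfft(segment))
        return np.array([b.mean() for b in np.array_split(mag, n)])

    # Train a one-class SVM on normal segments only, then flag anomalies.
    rng = np.random.default_rng(0)
    normal = [spectral_features(haar_denoise(rng.normal(size=256)))
              for _ in range(80)]
    model = OneClassSVM(nu=0.05, gamma="scale").fit(normal)
    test = spectral_features(haar_denoise(rng.normal(size=256) + 5.0))
    print(model.predict([test]))         # -1 flags a suspected anomaly
    ```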

  10. Subject and authorship of records related to the Organization for Tropical Studies (OTS) in BINABITROP, a comprehensive database about Costa Rican biology.

    PubMed

    Monge-Nájera, Julián; Nielsen-Muñoz, Vanessa; Azofeifa-Mora, Ana Beatriz

    2013-06-01

    BINABITROP is a bibliographical database of more than 38,000 records about the ecosystems and organisms of Costa Rica. In contrast with commercial databases, such as Web of Knowledge and Scopus, which exclude most of the scientific journals published in tropical countries, BINABITROP is a comprehensive record of knowledge on the tropical ecosystems and organisms of Costa Rica. We analyzed its contents for three sites (La Selva, Palo Verde and Las Cruces) and recorded scientific field, taxonomic group and authorship. We found that most records dealt with ecology and systematics, and that most authors published only one article in the study period (1963-2011). Most research was published in four journals: Biotropica, Revista de Biología Tropical/International Journal of Tropical Biology and Conservation, Zootaxa and Brenesia. This may be the first study of such a comprehensive database of tropical biology literature.

  11. SCRIPDB: a portal for easy access to syntheses, chemicals and reactions in patents

    PubMed Central

    Heifets, Abraham; Jurisica, Igor

    2012-01-01

    The patent literature is a rich catalog of biologically relevant chemicals; many public and commercial molecular databases contain the structures disclosed in patent claims. However, patents are an equally rich source of metadata about bioactive molecules, including mechanism of action, disease class, homologous experimental series, structural alternatives, or the synthetic pathways used to produce molecules of interest. Unfortunately, this metadata is discarded when chemical structures are deposited separately in databases. SCRIPDB is a chemical structure database designed to make this metadata accessible. SCRIPDB provides the full original patent text, reactions and relationships described within any individual patent, in addition to the molecular files common to structural databases. We discuss how such information is valuable in medical text mining, chemical image analysis, reaction extraction and in silico pharmaceutical lead optimization. SCRIPDB may be searched by exact chemical structure, substructure or molecular similarity and the results may be restricted to patents describing synthetic routes. SCRIPDB is available at http://dcv.uhnres.utoronto.ca/SCRIPDB. PMID:22067445
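
    The three query modes SCRIPDB supports (exact structure, substructure, and molecular similarity) can be illustrated generically with RDKit, as in the sketch below. The patent numbers and structures are invented, and SCRIPDB's own search implementation may differ; this only shows what each query mode means.

    ```python
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    # Invented patent numbers and structures, for illustration only.
    catalog = {
        "US0000001A1": "CC(=O)Oc1ccccc1C(=O)O",   # an aspirin-like structure
        "US0000002A1": "c1ccccc1",                # benzene
    }
    query = Chem.MolFromSmiles("c1ccccc1")
    qfp = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
    qsmi = Chem.MolToSmiles(query)                # canonical form, exact match

    for patent, smiles in catalog.items():
        mol = Chem.MolFromSmiles(smiles)
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        print(patent,
              "exact" if Chem.MolToSmiles(mol) == qsmi else "",
              "substructure" if mol.HasSubstructMatch(query) else "",
              f"similarity={DataStructs.TanimotoSimilarity(qfp, fp):.2f}")
    ```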

  12. 48 CFR 52.212-4 - Contract Terms and Conditions-Commercial Items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (OMB) prompt payment regulations at 5 CFR part 1315. (h) Patent indemnity. The Contractor shall... foreign patent, trademark or copyright, arising out of the performance of this contract, provided the... payment of any contract for the accuracy and completeness of the data within the SAM database, and for any...

  13. 48 CFR 52.212-4 - Contract Terms and Conditions-Commercial Items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Office of Management and Budget (OMB) prompt payment regulations at 5 CFR part 1315. (h) Patent indemnity..., any United States or foreign patent, trademark or copyright, arising out of the performance of this... the accuracy and completeness of the data within the SAM database, and for any liability resulting...

  14. 48 CFR 52.212-4 - Contract Terms and Conditions-Commercial Items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) and Office of Management and Budget (OMB) prompt payment regulations at 5 CFR part 1315. (h) Patent... infringe, any United States or foreign patent, trademark or copyright, arising out of the performance of... completeness of the data within the CCR database, and for any liability resulting from the Government's...

  15. 47 CFR 27.1231 - Initiating the transition.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Basic Trading Area (BTA). BTAs are based on the Rand McNally 1992 Commercial Atlas & Marketing Guide...; and (C) Specify, if known, the adjacent channel D/U ratio that can be tolerated by any receiver(s) at... database; (F) The bandwidth of each channel or subchannel, the emission type for each channel or subchannel...

  16. Impact of Commercial Search Engines and International Databases on Engineering Teaching and Research

    ERIC Educational Resources Information Center

    Chanson, Hubert

    2007-01-01

    For the last three decades, the engineering higher education and professional environments have been completely transformed by the "electronic/digital information revolution" that has included the introduction of personal computer, the development of email and world wide web, and broadband Internet connections at home. Herein the writer compares…

  17. Airborne Remote Sensing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA imaging technology has provided the basis for a commercial agricultural reconnaissance service. AG-RECON furnishes information from airborne sensors, aerial photographs and satellite and ground databases to farmers, foresters, geologists, etc. This service produces color "maps" of Earth conditions, which enable clients to detect crop color changes or temperature changes that may indicate fire damage or pest stress problems.

  18. Development and integration of an SSR-based molecular identity database into sugarcane breeding program

    USDA-ARS?s Scientific Manuscript database

    Sugarcane breeding is very difficult and it takes 12 to 14 years to develop a new cultivar for commercial production. This is because sugarcane varieties are highly polyploid, inter-specific hybrids with 100 to 130 chromosomes that may vary across geographical areas. Other obstacles/constraints incl...

  19. Scientific Journal Publishing: Yearly Volume and Open Access Availability

    ERIC Educational Resources Information Center

    Bjork, Bo-Christer; Roos, Annikki; Lauri, Mari

    2009-01-01

    Introduction: We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide as well as the share of these articles available openly on the Web either directly or as copies in e-print repositories. Method: We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by…

  20. The Case for Creating a Scholars Portal to the Web: A White Paper.

    ERIC Educational Resources Information Center

    Campbell, Jerry D.

    2001-01-01

    Considers the need for reliable, scholarly access to the Web and suggests that the Association for Research Libraries, in partnership with OCLC and the Library of Congress, develop a so-called scholar's portal. Topics include quality content; enhanced library services; and gateway functions, including access to commercial databases and focused…
